I'm not able to get high-quality video when exporting MP4 files. Does anyone have any tips on how to improve my video export quality? My original footage is pretty crisp, but all my exported videos become pixelated. Thanks!
You'll have to give us some details on your export parameters (frame size, bitrate), but quality is almost solely proportional to bitrate.
You'll have success if you use the export presets, although in my experience you can reduce the preset bitrates by half and still achieve a good outcome, quality-wise. You'll only get pixelation at very low bitrates.
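If it helps to sanity-check what the exporter actually produced, here is a rough sketch (assuming ffprobe from the ffmpeg suite is installed; the file names are placeholders) that prints the frame size and bitrate of the source and the export, so you can see where the quality is going:

```python
import json
import subprocess

def video_stats(path):
    """Return width, height and bitrate (bit/s) of the first video stream via ffprobe."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=width,height,bit_rate",
        "-of", "json", path,
    ]
    info = json.loads(subprocess.run(cmd, capture_output=True, text=True, check=True).stdout)
    s = info["streams"][0]
    bitrate = int(s["bit_rate"]) if "bit_rate" in s else 0
    return int(s["width"]), int(s["height"]), bitrate

# Compare the original footage with the export (placeholder file names)
for label, path in [("source", "original.mp4"), ("export", "export.mp4")]:
    w, h, br = video_stats(path)
    print(f"{label}: {w}x{h} @ {br / 1_000_000:.1f} Mbit/s")
```

If the export's bitrate comes out far below the preset's, that is almost certainly where the pixelation is coming from.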
. . . . You'll have success if you use the export presets, . . . .
I agree totally with this, and would add: do not change any of the settings of the preset - they are already optimised for the video resolution and encoding format.
You will see some recommendations in the forum to increase the bitrate and/or set the GOP structure to I-frames only; however, depending on the video content, this usually only gives you a marginal increase in perceived quality and/or a massive increase in the file size.
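To put numbers on the file-size side of that trade-off: the export size is essentially bitrate times duration. A quick back-of-the-envelope sketch (the bitrates and duration are just example values):

```python
def export_size_mb(video_mbps, audio_kbps, duration_min):
    """Rough export size: (video + audio bitrate) * duration, converted to megabytes."""
    total_bps = video_mbps * 1_000_000 + audio_kbps * 1_000
    return total_bps * duration_min * 60 / 8 / 1_000_000

# For a 10-minute clip, doubling the video bitrate roughly doubles the file size
for mbps in (8, 16, 32):
    size = export_size_mb(mbps, 192, 10)
    print(f"{mbps} Mbit/s video + 192 kbit/s audio, 10 min -> {size:.0f} MB")
```

So doubling the bitrate for a barely visible improvement costs you roughly double the disk space.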
If you give YouTube at least 1440p, i.e. 2560x1440 (16:9) or 1920x1440 (4:3), then you'll invoke the YT VP9 video codec, which is better than H.264.
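If your project itself is only Full HD, one way to take advantage of that outside the editor is to upscale the finished export before uploading. A minimal sketch, assuming ffmpeg is installed and using placeholder file names:

```python
import subprocess

# Upscale a FullHD export to 1440p with a Lanczos filter before uploading,
# so YouTube re-encodes it on its higher-bitrate VP9 ladder rather than H.264.
# (Placeholder file names; -crf 18 keeps the intermediate file close to lossless.)
subprocess.run([
    "ffmpeg", "-i", "export_1080p.mp4",
    "-vf", "scale=2560:1440:flags=lanczos",
    "-c:v", "libx264", "-crf", "18", "-preset", "slow",
    "-c:a", "copy",
    "upload_1440p.mp4",
], check=True)
```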
And make sure you choose "Best" in the Advanced export settings. Why Magix thought it should default its exports to something other than the best you can do is a mystery to me.
. . . . Why Magix thought it should default its exports to something other than the best you can do is a mystery to me . . . .
We are back to the 'perceived quality' argument again, and the effects of varying compression levels applied during a re-render of a video file versus the end file size and the disk storage space needed.
Some people never see a difference, for others it becomes an increasing annoyance, while a third section of the population sees it but is not bothered by it.
My final conclusion on the subject is: if a person reports they see a difference, believe them even if you can't, because it is just as reliant on the graphics components in a system and how they are set up as on the quality of the viewer's eyesight, viewing distance and screen size.
Regarding sharpening. That is something I rarely ever do except in exceptional circumstances. To my eye, sharpening my own material has side effects compared with the original, such as an increase in colour noise in the shadow areas, even when lightly applied. I know others use it and would not go above a 20% level of added sharpening - it is yet another division of opinion between users.
Most people get obsessed with this subject as they progress, but forget the importance of starting with a well calibrated monitor and the setting up of their graphics components in the various graphics card control panels, which can add their own sharpening and colour calibration before you even start editing. While the editing package should take charge of some of those settings for the actual editing process, playback in whatever player you designate to play the file back in is controlled from the graphics card interfaces.
So the question then becomes 'How far down the rabbit hole do you want to go?' along with 'After all that effort, will the person at the other end see what I see?' followed by 'Is it all worth the effort?' 😅😉
. . . . Regarding sharpening. That is something I rarely ever do except in exceptional circumstances . . . .
This is something I always do - I work with 4K UHD video and it is most beneficial when downscaling, e.g. to Full HD, and when effects tend to affect the image sharpness (Blur excepted 😂).
Sharpening should be the very last effect applied to the movie (IMHO never on the individual clips), using the Movie Effects option - I set a value of 30, which adds a subtle amount that is perceivable as a crisper image in the export without overdoing the sharpening.
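For anyone who wants to see what a subtle, whole-movie, applied-last sharpen looks like outside the editor, here is a rough equivalent using ffmpeg's unsharp filter. This is an illustration only - the filter values do not map onto the Movie Effects slider scale, and the file names are placeholders:

```python
import subprocess

# Apply a mild unsharp mask as the final step on the already-downscaled movie.
# unsharp=5:5:0.4 is a 5x5 luma matrix with a gentle amount (roughly "subtle").
subprocess.run([
    "ffmpeg", "-i", "downscaled_fullhd.mp4",
    "-vf", "unsharp=5:5:0.4",
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "sharpened.mp4",
], check=True)
```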
. . . . well calibrated monitor and the setting up of their graphics components in the various graphics card control panels which can add their own sharpening and colour calibration before you even start editing . . . .
This I totally agree with, particularly if you are producing videos professionally or are working with dual monitors: colour calibration should ideally be done, or the monitors at least adjusted so the colours are as close to identical as possible.
If you do not need or want to go to the expense of a colour calibration device - I use an X-Rite monitor colour calibrator - then Windows has some rudimentary colour calibration options for the displays.
To answer Ray's 'rhetorical' questions:
. . . . So the question then becomes 'How far down the rabbit hole do you want to go?' . . . .
As far as the 'rabbit hole' is concerned, this is a user choice.
. . . . will the person at the other end see what I see?' . . . .
Almost definitely not - their monitor/TV/viewing device calibration may be way off and the incident lighting totally different, so they will see the video differently.
. . . . followed by 'Is it all worth the effort? . . . .
IMHO - Yes - your satisfaction that you have produced the best quality video you can for the viewer, without going down another rabbit hole, is important.
In case anyone following these added bits of information is wondering what @johnebaker is referring to when he says:
Sharpening should be the very last effect applied to the movie (IMHO never on the individual clips), using the Movie Effects option - I set a value of 30, which adds a subtle amount that is perceivable as a crisper image in the export without overdoing the sharpening.
John is referring to the following feature, which I'm not even sure most users are familiar with.
. . . . We are back to the 'perceived quality' argument again . . . .
Ray, I'm not arguing about the perceived quality. I can't see why the default encoding setting for any format would not be "Best". If somebody wants to reduce the quality then fine, they can set it via the dropdown, but it seems silly to me to want anything less than "Best" for a given bitrate.
Notice that at the lowest part of the encoding scale it says "Fastest" rather than "Poorest quality"?
In that case, does that not fit with their advertising claim that the program can be used on the lowest-specification computers within the specified component list? Presumably the 'Best' option requires more processing power, or at the very least takes a lot longer to render, than the selected default setting. I'm just playing devil's advocate at this point, so don't take me too seriously. After all, what's in a name?
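For what it's worth, the same speed-versus-quality scale exists in x264's presets, and a quick timing sketch shows why a vendor might not default to the slowest setting. This is an analogy only - Magix does not expose its encoder this way - and it assumes ffmpeg is installed, with placeholder file names:

```python
import subprocess
import time

# At a fixed bitrate, a slower preset spends more CPU time for better quality.
# Compare encode times for a fast and a slow preset on the same source clip.
for preset in ("veryfast", "slow"):
    t0 = time.time()
    subprocess.run([
        "ffmpeg", "-y", "-i", "source.mp4",
        "-c:v", "libx264", "-b:v", "16M", "-preset", preset,
        "-c:a", "copy", f"test_{preset}.mp4",
    ], check=True)
    print(f"preset={preset}: {time.time() - t0:.0f} s")
```

On a modest machine the slow preset can take several times longer for the same bitrate, which is probably exactly the trade-off behind that default.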