14 Mar

Why you should use Avid DNxHD and Apple ProRes



So, what’s so good about ProRes and DNxHD?

What’s good is that they don’t compress the video as much. They trade storage space and bitrate for quality and ease of processing. They typically need ten times as much storage. That’s a lot, but hard drives are tending towards being free, so it hardly matters these days.
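To put a rough number on that trade, here is a back-of-the-envelope comparison. The bitrates below are commonly published figures for a typical AVCHD recording mode and for ProRes 422 HQ at 1080p25; they are assumptions for illustration, not figures from this article:

```python
# Rough storage comparison: a camera long-GOP codec vs an intermediate codec.
# Bitrates are typical published figures (assumptions for illustration).
AVCHD_MBPS = 24        # a common AVCHD recording mode, in Mbit/s
PRORES_HQ_MBPS = 220   # Apple ProRes 422 HQ target at 1080p25, in Mbit/s

def gigabytes_per_hour(mbps):
    """Convert a Mbit/s bitrate into GB stored per hour of footage."""
    return mbps / 8 * 3600 / 1000

print(f"AVCHD:     {gigabytes_per_hour(AVCHD_MBPS):.1f} GB/hour")
print(f"ProRes HQ: {gigabytes_per_hour(PRORES_HQ_MBPS):.1f} GB/hour")
print(f"Ratio:     {PRORES_HQ_MBPS / AVCHD_MBPS:.1f}x")
```

With these example figures the intermediate codec needs a little over nine times the storage, which is where the "ten times" rule of thumb comes from.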

In fact, ProRes and DNxHD (sometimes called “intermediate” or “production” codecs) sit in a kind of “sweet spot” between the humongous data rates of uncompressed video and the egregious processing demands of long-GOP video.

Why not just work with uncompressed?

Because you need as much as six times more storage and data bandwidth than you do with ProRes and DNxHD. Even with disk prices as low as they are, that’s asking a lot. Moreover, you’ll find yourself unable to process as many streams of uncompressed video in real time: you’ll hit a brick wall once your disk system maxes out. ProRes and DNxHD are so processor-friendly that even though your computer has to decompress multiple streams on the fly, you’ll probably find you can edit more simultaneous streams than with uncompressed.
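A quick sketch of where that "six times" comes from. The frame geometry and ProRes target bitrates below are assumptions based on commonly published figures for 10-bit 4:2:2 1080p25:

```python
# Uncompressed 10-bit 4:2:2 1080p25 vs ProRes 422, back-of-the-envelope.
WIDTH, HEIGHT, FPS = 1920, 1080, 25
BITS_PER_PIXEL = 20   # 4:2:2 averages 2 samples per pixel, at 10 bits each

# Raw data rate of the uncompressed stream, in Mbit/s
uncompressed_mbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1_000_000

PRORES_422_MBPS = 147   # ProRes 422 target at 1080p25 (published figure)
PRORES_HQ_MBPS = 220    # ProRes 422 HQ target at 1080p25 (published figure)

print(f"Uncompressed:     {uncompressed_mbps:.0f} Mbit/s")
print(f"vs ProRes 422:    {uncompressed_mbps / PRORES_422_MBPS:.1f}x")
print(f"vs ProRes 422 HQ: {uncompressed_mbps / PRORES_HQ_MBPS:.1f}x")
```

Depending on which ProRes flavour you compare against, the uncompressed stream works out at roughly five to seven times the data rate, consistent with the "as much as six times" figure above.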

But what about quality?

Both ProRes and DNxHD are effectively “visually lossless” at their higher bitrates. That means exactly what it says: you can’t tell the difference between your original video and the same material encoded to ProRes or DNxHD.

So you don’t need to worry about generation loss. In fact, you can usually make many copies in the intermediate codec format without any apparent degradation.

It’s worth stressing that the main reason for ProRes and DNxHD’s increased quality is that they are not long-GOP codecs: each frame is compressed independently, so nothing has to be reconstructed from neighbouring frames.

Once you’ve converted, your edits will fly. These codecs are so efficient that you can play multiple streams with a fraction of the effort you need for a single stream of H.264-based video. As an example, on my MacBook Air, playing any HD long-GOP clip will cause the fan to come on at full power within seconds, but with ProRes, I can play a clip, in much higher quality, with no sign of the fan.







  • When Avid's DNxHD codecs first came out, we ran them up against our uncompressed DS suite and pushed a whole load of material through, comparing the results on our grade-1 HD CRT monitor, and you just couldn't see any losses in the DNxHD content.
    The only real issue I've seen has been when DNx185 content has been laid off to HDCAM (with its horrid subsampling and compression) and then re-captured as DNx185. Artefacts start to appear and quality quickly falls away, depending on the content and the original acquisition codec used.

  • You mentioned in this article that "Transcoding to an intermediate codec is not your only option." But I was wondering how comparable (in image quality) transcoding to an intermediate codec is to capturing in an intermediate codec.

  • If you capture from uncompressed directly to an intermediate codec, that will be better than transcoding from a Long GOP codec like H.264.

  • Recording from the camera using DNxHD or ProRes is very different from converting AVCHD or H.264 to DNxHD or ProRes before editing, and you do not make this difference clear. Obviously, other than demanding significantly greater storage, the former is ideal. (But then uncompressed RAW is even better.)

    No quality is gained by converting AVCHD or H.264 files to DNxHD or ProRes before editing. AVCHD or H.264 must first be decoded to uncompressed data and then compressed to DNxHD or ProRes. Obviously no information that was eliminated during AVCHD or H.264 encoding can be, or is, restored during the decode. And, obviously, compressing to DNxHD or ProRes does not restore already lost information. In fact, information is always discarded during compression. So following your idea results in less picture quality.

    What you are missing is that modern NLEs do not "edit with" the source codec. As each frame is decoded it becomes 4:2:2/4:4:4 uncompressed data. Editing is always performed with uncompressed frames. This happens on the fly. From a buffer of thousands of data packets, long-GOP decoding utilizes multiple packets of information to generate a frame of video. No modern computer has a problem with either reading buffers from disk or calculating an HD video frame from information already stored in memory.

    Moreover, during encoding these packets are shuffled so they are recorded in exactly the correct order to make decoding very fast. As each frame is decoded, the color space is converted to 4:4:4 so it can be combined with 4:4:4 graphics. (Some NLEs may only convert video to 4:2:2.)

    All streams of uncompressed data are appropriately combined to make a data stream that goes only to the display. During export the same process, starting with encoded data from the disk, is followed, except the uncompressed result is compressed to the "export" codec. AT NO POINT IS UNCOMPRESSED DATA RECOMPRESSED TO AVCHD OR H.264. Thus, files are stored on disk as AVCHD or H.264 files that are very compact and which reduce disk read bandwidth.

    You may be thinking of the old days where after combining uncompressed streams the result was encoded back to the project codec which was the same as the source codec. This was called rendering. When iMovie was released, Apple completely eliminated rendering. The GPU did everything in realtime. Only during export was anything written to disk.

    Some NLEs do use rendering so that very complicated FX can be played back in realtime. And some will use rendered files to speed up exports, although by deleting render files you can force them to start from the source files. THESE RENDER FILES NEVER USE A LONG-GOP CODEC.

    Because the move to 4K requires frames 4x bigger than HD, NLEs do support converting long-GOP footage to "proxy" or "optimized" formats during import. While this reduces the computational load during editing, it requires lots of storage and increases the required disk bandwidth. (An alternative approach decodes every other source pixel of the 4K data so the NLE is only working with FHD video.)

  • Hi Steve,

    I don't think I did say that you gain quality if you transcode from H.264 to ProRes. Of course you don't. But it may be useful to you anyway if your source codec is 8-bit, because you can then work in ProRes's 10-bit space. This won't add quality per se, but it leaves more headroom for grading etc.

    And it is always better to work with codecs that are designed to need less processing power; normally that means intra-frame codecs. That said, I'm told that Sony's XAVC is pretty efficient for editing in its long-GOP format, and of course in its intra-frame version. Yes, you can do clever stuff now with long-GOP codecs in editing, but it still takes more processing grunt than ProRes.

    To get ProRes directly out of a camera, you'll have to record from HDMI or SDI to an external recorder.

  • Articles like these always make me happy that I left the Mac/FinalCut platform so many years ago, around the time of FCP 5. Transcoding is so Amiga... But I do appreciate RedSharkNews to no end since they seem able to entertain the concept of Windows machines. Over here in America, it seems like every video writer equates Mac with Computer, although since FCPX, many are now coming out of the jungle like the Japanese soldier in the 60s. Remember lads, all you have to do in Premiere is select "keyboard layout" and choose "Final Cut"... Then you can use AVCHD and video capture is "elegant & intuitive" you know, like the Macs were before August of 1995.

  • "Remember lads, all you have to do in Premiere is select "keyboard layout" and choose "Final Cut"... Then you can use AVCHD and video capture is "elegant & intuitive" you know, like the Macs were before August of 1995."

    I see this typical scenario as the cause of much of the 'loose practice' witnessed in today's digital acquisition and post-production chain: everything from letting the camera run between takes, to a lack of understanding of the benefits derived from working with intra-frame codecs even if that means transcoding, to a total ignorance of timecode, which is still the 'glue' that ties together an efficient post-production workflow. Problems often arise when those skilled in the DSLR/Mac/Premiere/Youtube workflow progress on to a more professional scenario where they are spending other people's money and the stakeholders wish to exercise their right to see that their investment is spent wisely.

    A quick perusal of some high end cinema and broadcast forums like Lift/Gamma/Gain will reinforce the fact that since the dawn of the Mac 'Trashcan', more and more top shelf DaVinci Resolve workstations are being built on Win PC i7 and Xeon platforms. Faster, cheaper and easily re-spec'd as the PC hardware evolves. Perhaps the film school 'blinkers' are slowly coming off...

  • And CineForm - just as good as ProRes and DNxHD, properly cross-platform, but totally overshadowed since GoPro bought them. Good to see that the current version of Premiere Pro CC has brought CineForm back into the mainstream workflow.

  • " When iMovie was released, Apple completely eliminated rendering."

    Hmmm... and I was under the impression that the article was inclined toward the professional side of video. I don't see the point in presenting a consumer editor's capabilities as a competing factor over using DNxHD or ProRes in a multi-layered professional editor.

    Regarding Mr. Marshall's comments:

    I hear you loud and clear. I was one of those "film school" (actually TV Production) instructors. And far from being insulted by your comment, I applaud it. Whereas I tried to teach the virtues you mentioned, it fell on deaf ears. Today the tail is wagging the dog. Students come in and have no respect for professional protocol. Slating and logging have gone the way of punctuality and manners. They want to capture every worthless frame of video (as one enormous clip!) and then edit by exclusion. Maybe I'm just a throwback to the linear days. But I see contemplative forethought in editing by inclusion (from a select group of usable clips, might I add).

    To Mr. Shapton:

    Thank you for your explanation. The myriad of codecs out there can often leave one feeling like they are looking at a food menu with 101 items written in a foreign language. Rather than explaining them all (and likely adding to the confusion) you have distilled it down with a reasonable purpose. For those from the educational side, articles like this can be most helpful. We often worked using antiquated equipment with no incentive to learn/study about things we were not likely to acquire (at least anytime soon). It was often deemed a successful day just to keep things functional for the duration of class. Going home at night and learning something that was not applicable at the moment would have taken time away from praying for another day of equipment functionality.

  • Great article David!
    Keep up the good work ;-)

David Shapton

David is the Editor In Chief of RedShark Publications. He's been a professional columnist and author since 1998, when he started writing for the European Music Technology magazine Sound on Sound. David has worked with professional digital audio and video for the last 25 years.
