Video with 10-bit Channels: Update: Sony Vegas (12) Handles it OK

In my post http://blog.davidesp.com/archives/573 I reported that when I imported 10-bit footage into Sony Vegas 10, even in 32-bit mode, it still appeared to behave as if only 8 bits were being used.

Since then, I redid the test with Sony Vegas 12.  This time I used a 10-bit recording from a CineDeck: a QuickTime (.mov) file containing CineForm 10-bit, captured from the SDI output of a Sony EX3 camera.  In that case, Vegas 12 in 32-bit (Video Levels) mode did correctly make use of the 10 bits, as verified by the banding visible on the vectorscope in dark areas of underexposed footage whose gain (alpha) had been strongly increased.

One thing I noted was the tendency of the 8-bit path to round down.  Consequently, when switching the project to 32-bit mode (so that the 10 bits were used), the dark levels visibly brightened (both on screen and on the vectorscope).  I guess better compatibility would in that case be obtained (in that NLE, or by pre-processing the footage) by first subtracting “4” from the 10-bit levels, since the extra 2 bits represent 2^2 = 4 sub-levels that do not exist in 8-bit format and hence effectively get rounded down (truncated).  Or maybe the offset should be 1 less than this, i.e. subtract 3 (it depends on how the rounding gets done).
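To make the arithmetic concrete, here is a minimal sketch (not from the original test, just an illustration of the round-down behaviour described above): converting 10-bit levels to 8-bit by truncation (dropping the two least-significant bits) and scaling back shows the 10-bit original sitting up to 3 counts above its 8-bit equivalent.

```python
def to_8bit_truncate(v10: int) -> int:
    """Round-down conversion: discard the two least-significant bits."""
    return v10 >> 2          # e.g. 10-bit 7 -> 8-bit 1

def back_to_10bit(v8: int) -> int:
    """Scale an 8-bit level back onto the 10-bit range (low bits become 0)."""
    return v8 << 2           # e.g. 8-bit 1 -> 10-bit 4

for v10 in (4, 5, 6, 7, 8):
    v8 = to_8bit_truncate(v10)
    diff = v10 - back_to_10bit(v8)
    print(f"10-bit {v10:3d} -> 8-bit {v8} -> 10-bit {back_to_10bit(v8):3d}  (brighter by {diff})")
```

The difference ranges from 0 to 3 counts, which is why the suggested offset is about 3 (or 4, depending on where the rounding boundary falls).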

Workarounds would be either to pre-process the 10-bit footage (to subtract the offset) or, less conveniently, to apply a Levels effect that raises the input minimum by that amount.  But either approach would be awkward and may or may not work, depending on Vegas 12 nuances.  Something to be tested!
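As a rough idea of what the pre-processing step might look like, here is a hypothetical sketch (not a tested Vegas workflow; `frame` stands in for one decoded 10-bit plane, and the offset value is the one discussed above):

```python
import numpy as np

OFFSET = 3  # or 4 - see the discussion above; depends on how rounding is done

def subtract_offset(frame: np.ndarray, offset: int = OFFSET) -> np.ndarray:
    """Shift 10-bit levels down by 'offset', clamped to the legal 0..1023 range."""
    shifted = frame.astype(np.int32) - offset
    return np.clip(shifted, 0, 1023).astype(np.uint16)

# Example: a dark 10-bit test patch
frame = np.array([[4, 5, 6, 7]], dtype=np.uint16)
print(subtract_offset(frame))   # -> [[1 2 3 4]]
```

The equivalent Levels-effect workaround would raise the input minimum by the same amount inside the NLE instead of touching the source files.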
