How to Fix Crushed Blacks on Sony a7S and a7S II External Recordings

To say that the Sony a7 series of mirrorless cameras has been popular among videographers is an understatement. A large number of filmmakers have been using these cameras, and some of them make good use of the external recording feature to get even higher resolutions and data rates.

Unfortunately, there has been a pitfall plaguing the cameras for a while now. Of course, I’m talking about the crushed blacks and highlights on external recordings. This article will take a deeper look at why this phenomenon appears and how to resolve it. Or at least, how to work around it. [UPDATE: We have developed a LUT that fixes it.]


Crushed Blacks on External Recordings

It is no wonder that external recording is so popular on the Sony a7S. With the assistance of an external recorder such as the Atomos Shogun, the possible resolution increases from 1080p to 4K, and filmmakers can tap into high-quality codecs such as ProRes or DNxHD (as opposed to Sony’s H.264 derivative, XAVC).

Sadly, when using the S-log2 gamma, crushed blacks and clipped highlights can make a surprise appearance at the editing desk. A good portion of the lows appears cut off, and some highlight information tends to be lost, too. In S-log3 only the blacks are affected (as its video output only goes as high as 94%). The contrast is raised and, in essence, the whole purpose of S-log is defeated. Images that were perfectly fine during shooting end up broken.

This phenomenon has been the main reason that some people have avoided using external recorders with the Sony a7S, a7S II, and a7R II cameras.


What Causes the Crushed Blacks Phenomenon?

Let’s face it: we’re often quick to blame manufacturers when things like this crop up. In this case, however, it turns out the culprit isn’t Sony. After digging deeper into the matter, we discovered that the reason we lose image information on external recordings is that the NLE incorrectly interprets the data it is given: it doesn’t realise that we’re using the higher-bandwidth gamma (S-log2/S-log3). In our case, the NLE in question is Adobe Premiere Pro CC 2015.


Wrong display range settings of the video data cut off image information.

The image above is a waveform representation (captured in DaVinci Resolve) of an S-log2 recording of an OECF chart. In the camera’s S-log2 and S-log3 gamma settings, data from 0 to 1023 is apparently used for each channel. However, when no further metadata is embedded in the recordings, all NLEs, including DaVinci Resolve, automatically interpret footage from external recorders with video levels from 64 to 960. This cuts off information in both the blacks and the highlights. The orange arrows point out where that happens.

When you record in-camera (XAVC, for example), metadata is written directly to the file telling your NLE to use the full 0 – 1023 range (if you’re working in 10-bit). On external recordings, however, this metadata cannot be transferred via the HDMI cable, so either the recorder would have to set that metadata flag (which it apparently doesn’t), or it has to be set manually in your NLE.
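The clipping described above can be sketched numerically. The snippet below is a hypothetical model, not any NLE’s actual code: it shows what happens when full-range 10-bit data (0 – 1023) is interpreted as video levels (64 – 960) and everything outside that window is clamped.

```python
# Hypothetical illustration of video-range interpretation of full-range data.
def interpret_as_video_range(code_value: int) -> float:
    """Map a 10-bit code value to a normalized 0.0-1.0 signal, assuming
    video levels (64-960). Everything below 64 or above 960 is clipped --
    this is the 'crushed blacks / clipped highlights' effect."""
    normalized = (code_value - 64) / (960 - 64)
    return min(max(normalized, 0.0), 1.0)

# A true black of 0 and a deep shadow at 50 both collapse to the same value:
print(interpret_as_video_range(0))     # 0.0
print(interpret_as_video_range(50))    # 0.0 -- shadow detail lost
print(interpret_as_video_range(1000))  # 1.0 -- highlight detail lost
print(interpret_as_video_range(512))   # 0.5 -- mid-range survives
```

Every code value below 64 maps to the same black, which is exactly why shadow detail in S-log recordings disappears.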

The Fix

[UPDATE:] We have developed a custom LUT that fixes this problem either in post or directly on a recorder, with zero rendering time and zero quality loss. Get the LUT by clicking the button below, or read all about it here.


The Workaround

Unfortunately, the NLEs that many of us use (Adobe Premiere Pro CC 2015 and Final Cut Pro X) have no option to manually set the data levels. The NLE will always automatically interpret the footage with levels from 64 to 960 (or 16 to 240 in 8-bit), cutting off information and leading to crushed blacks and clipped highlights, and there is no way to change that yet.

To correctly interpret externally recorded footage you will either have to switch to another NLE that lets you set this manually, or go through DaVinci Resolve, which allows you to re-interpret the footage. Fortunately, DaVinci Resolve is a free download.

[UPDATE]: Thanks to xdcam-user we realized there is a workaround directly in Premiere.

Adobe Premiere Pro CC

To bring back the full range of your external S-log recordings, do the following in Adobe Premiere Pro CC.

After importing your footage to the timeline:

  1. Apply the effect “Fast Color Corrector” to your footage.
  2. In the Effect Controls panel, under “Fast Color Corrector”, change “Output Black” to 16 and “Output White” to 235.

Make sure that the “Fast Color Corrector” is always the first effect in your filter-stack (before Lumetri for example), otherwise you will get the wrong results.

Your footage now uses the full range right within Premiere. Note that other “Levels”-style filters will not bring your levels back.
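Why this trick works can be sketched numerically. The following is an assumed processing model, not Premiere’s actual source code: Premiere decodes the file as video range, expanding 16 – 235 to 0 – 255 while super-blacks and super-whites survive internally as out-of-range values; the Fast Color Corrector’s output levels of 16/235 apply the exact inverse, so the round trip restores the original code values.

```python
# Assumed model: Premiere's video-range decode expands 16-235 to 0-255;
# out-of-range values are preserved internally (32-bit processing).
def premiere_video_range_decode(code: float) -> float:
    return (code - 16) * 255 / 219

# Fast Color Corrector output levels: remap 0-255 into [out_black, out_white].
def fast_color_corrector(value: float, out_black=16, out_white=235) -> float:
    return out_black + value * (out_white - out_black) / 255

# The composition is the identity, so the full-range data comes back intact:
for code in (0, 10, 16, 128, 235, 255):
    restored = fast_color_corrector(premiere_video_range_decode(code))
    assert abs(restored - code) < 1e-9
```

This also makes clear why the effect order matters: the inverse must be applied before any other grading (e.g. Lumetri) touches the out-of-range values.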

DaVinci Resolve

If you’re using an externally recorded clip from an a7-series camera, go to the EDIT tab in DaVinci Resolve and right-click the clip(s) in your Media Pool. From the drop-down, select “Clip Attributes”.


A window will open that lets you choose either “Auto”, “Data Levels” or “Video Levels”. By default this is set to “Auto”, but we want to select “Data Levels 0 – 1023” to properly display our S-log2 / S-log3 footage.

After all of your clips are set to “Data Levels”, drag them to the timeline where you will then be able to color the clips to your liking—with all information in the blacks and highlights. Once this is done, go to the DELIVER tab in DaVinci Resolve to export the clips.

Note that the above tip only applies to external recordings! When handling internally recorded files (XAVC) the levels should always be left to “auto”.

Adobe After Effects CC

[Update: Here’s the fix for After Effects]
In After Effects the fix is a bit trickier. First, open “Project Settings…”, which can be accessed by right-clicking the small menu box on the Project tab.

In the window that opens you have to change your project’s Color Settings.

Click the “Depth” dropdown and select “32 bits per channel (float)”. Then click “Ok”.

Now you have to apply the “Levels” effect to your footage. It can be found under Effects –> Color Correction –> Levels.

In the Effect Controls tab, under “Levels”, you can now set your output levels. Set Output Black to “0.0627” and Output White to “0.9255”. Those are the values that correspond to 16-235 in the 32-bit float space.

Now your footage includes all data levels of S-log.
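The 32-bit float values above are simply 8-bit levels divided by 255. One small caveat: 0.9255 corresponds to 236/255, while level 235 would come out to roughly 0.9216; either way the conversion is straightforward.

```python
# Convert an 8-bit video level to the normalized 0-1 float range used by
# 32-bpc (bits per channel) compositing.
def to_float(level_8bit: int) -> float:
    return level_8bit / 255

print(round(to_float(16), 4))   # 0.0627 -- the Output Black value
print(round(to_float(236), 4))  # 0.9255 -- matches the article's white value
print(round(to_float(235), 4))  # 0.9216 -- level 235 exactly
```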

Besides being a more complicated workaround, the real downside of this method is that it will considerably slow down your rendering speed.

Final Cut Pro X

Unfortunately, Final Cut Pro X seems to handle the video range incorrectly as well, and I have not found a fix. Although the xdcam-user article claims that Final Cut Pro X handles superwhites correctly, this was not the case in my tests. To prepare externally recorded S-log footage for a correct grade in Final Cut Pro X, you will probably have to go through DaVinci Resolve first, or simply switch to Premiere Pro, like I did a while ago. I know many people out there love Final Cut Pro X, so no offence please.

Future Solutions?

Of course, it would be an easy fix if Adobe and Apple would just give us an option to use the full range of levels in Premiere, After Effects and Final Cut Pro, just like in DaVinci Resolve. Apparently some users have been aware of this issue for a while, but so far there has been no direct implementation of such a feature. We can only hope that this article will add some pressure.

If you are affected by this, you can help let the software companies know by leaving a comment below and filling out the feature request forms:

For Adobe Premiere Pro CC:  Adobe feature request form.
For Adobe After Effects CC:  Adobe feature request form.
For Apple Final Cut X: Apple feature request form.



Comments
Adrian Outlaw

Question on the a6300: when you record in 4K and a monitor is attached, Face Registration turns off. Is there a fix for this?

Markus Lubenica

Firstly fix your crushed 4k brainwash and don’t do the manufacturers parrot pinocchio style in your SEO/SEM affiliate marketing blogs. There’s no 4k on any Alpha series camera out there. Not internal, not external. It’s goddamn UHD, so go and figure! :P

Johnnie Behiri

Markus Lubenica. Move on. A bit tiring listening to your 4K/UHD mantra.

 Emile Modesitt

Markus got destroyed!!! Haha. Somebody’s got a serious stick up their butt. And it seems ridiculous that Adobe hasn’t implemented a fix for this problem. With the millions of updates they roll out, you’d think they’d be able to add this feature which has a massive effect on workflow for so many filmmakers.

Markus Lubenica

So, why are you hunting for „likes“ on an SEO brute-force (self-)marketing CMS platform like this, which serves a facebook ranking $ystem? C’mon, nobody runs a blog 24/7/365 for free ;) Coming back to the key facts… sorry, you’re simply technically incorrect. The term “4k” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4k content. In this case, 4k is 4096×2160, exactly four times the previous standard for digital editing and projection (2k, or 2048×1080). 4k refers to the fact that the horizontal pixel count (4096) is roughly four thousand. The 4k standard is not just a resolution, either… it also defines how 4k content is encoded. A DCI 4k stream is compressed using JPEG 2000, can have a bitrate of up to 250Mbps, and employs 12-bit 4:4:4 color depth. Ultra High Definition, or UHD for short, is the next step up from what’s called Full HD, the official name for the display resolution of 1920×1080. UHD quadruples that resolution to 3840×2160. It’s not the same as the 4k resolution mentioned above – and yet almost every TV or monitor you see advertised as 4k is actually UHD. Sure, there are some panels out there that are 4096×2160, which adds up to an aspect ratio of 1.9:1. But the vast majority are 3840×2160, for a 1.78:1 aspect ratio. And the same goes for the affiliate-marketing-hyped term „4k” on all Alpha series cameras. Now, it’s not as if TV manufacturers aren’t aware of the differences between 4k and UHD. But presumably for marketing reasons, they seem to be sticking with 4k in the consumer market, which is simply wrong. So as not to conflict with the DCI’s actual 4k standard, some TV makers seem to be using the phrase “4k UHD,” though some are just using „4k“, which is again wrong. To make matters more confusing, UHD is actually split in two – there’s 3840×2160, and then there’s a big step up, to 7680×4320, which is also called UHD.
It’s reasonable to refer to these two UHD variants as 4k UHD and 8k UHD – but, to be more precise, the 8k UHD spec should probably be renamed QUHD (Quad Ultra HD). The real solution would have been to abandon the 4k moniker entirely and instead use the designation 2160p. Display and broadcast resolutions have always been referred to in terms of horizontal lines, with the letters “i” and “p” referring to interlacing, which skips every other line, and progressive scan, which doesn’t: 576i (PAL), 480i (NTSC), 576p (DVD), 720p, 1080i, 1080p, and so on. Now that there are 4k TVs everywhere, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon the use of 4k in favor of UHD and 2160p. In all honesty, though, it’s too late. 4k has already been bastardized before it’s even really “out”, per se. Consumers don’t have a clue there are industry specs for it. The only reason anyone has heard of H.264 is because Apple introduced the nomenclature to consumers via QuickTime. 4k is already just a marketing term, hyped by blogs for UHD content, which is simply wrong. But we don’t need to have a fundamental debate about things that should be widely clear. Anyway, Sony charges money from not very well informed consumers for a feature of a camera that doesn’t exist. VW gets sued worldwide for their emissions scandal because they claimed features that don’t exist. The main problem isn’t just the manufacturers’ lies. It’s also their affiliate marketing; their $hareholders do the 1:1 pinocchio parrot spec style. Customers have been misled for years, learn technically wrong things, and finally agree to terms that aren’t true. That’s brainwash! Sony is aware of that and would never do it in their (semi-)professional lineup because they would be eaten alive and lose a lot of $$$$$ clients.
It’s just that most consumers don’t know the difference; they grew up and were taught by the profit-bound industry to accept UHD as 4k. See Sony Bravia TVs, see all consumer-line Alpha cameras, see their action sports cameras, see their mobile phones… – but face it, Sony isn’t alone in doing that. Bottom line: selling UHD as 4k is a lie and technically wrong – so better correct this and your backlog.

Gentry Jonathan

wooohooo!!! awesome post man. ‘lot of good stuff there.

Sebastian Wöber

Hi Markus, good to get your perspective on this. Could you cite your sources, please? Mine differ a bit. It seems to me that you are mostly upset about the general consensus to call any format with a resolution of around 4000 horizontal pixels “4K”, regardless of any standardisation that might have happened in the past. Most manufacturers of cameras, televisions and projectors (television and cinema), and even most users I see, are fine referring to these resolutions as “4K”. Personally, I also think it makes sense to call it all 4K, and I’m afraid you won’t be able to convince the whole world to use other terms. I think it’s a bit far-fetched to say “it’s all a lie”. I know there are debates on what the correct terms are, but it’s too complicated for the end user, so we shouldn’t bother them too much with this stuff. I’d recommend simply taking it easy on this topic. 4K certainly isn’t a selling argument against UHD, in my opinion. There are endless attributes of a camera that are much more important. Nobody around here, and certainly not I, wants to mislead or lie; rather, we want to do the exact opposite, so I’m not your enemy. But I’ll stick to the term “4K”, like most people in the cinema/video world seem to have agreed to do. Regarding our magazine’s running costs: yes, we have advertisements on the site from select manufacturers and links to our sponsors, but we recommend only the products we work with ourselves, and we do not recommend many of the ones we test, as you will no doubt see when you read our site. We also work with retailers that sell all the products, and we recommend those retailers because we use them ourselves, in order to retain our objectivity with our content. This is very important to us. So when you call us “a marketing blog”, that is actually “a lie”, as you would put it.


