What is HDR? – The New Technology Explained

The Inter BEE 2015 technology exhibition near Tokyo ended earlier this week, and during my visit two topics were impossible to miss: HDR and 8K! While I intend to dedicate a separate news post to 8K, HDR is the more immediately relevant technology, one that might affect our professional lives soon.


But what is HDR? The new "high dynamic range" technology started to emerge more prominently at last April's NAB show in Las Vegas, and now, half a year later, after looking at the Japanese domestic market, there is no doubt that it will find its way into our homes sooner rather than later.

To grasp the impact of the technology, you simply have to look at video that was shot in HDR and played back on an HDR monitor to appreciate the enhanced clarity and depth. It is almost like looking at a three-dimensional picture without the need for 3D glasses, yet it is something else entirely. If I had to describe HDR in a sentence, I would say the new technology brings us a higher-dynamic-range viewing experience, richer colours and more realistic images than we have been used to. The displays are much brighter in the highlights, giving a more convincing rendition of how light is distributed across an image.

During Inter BEE I had a chance to talk to Ishii-san, who took the time to highlight some of the benefits we should expect from embracing the new technology. According to Mr. Ishii, implementing HDR in video cameras is almost a solved problem, as sensor dynamic range keeps increasing; it is the displays that are lagging behind. Leading companies like Dolby Laboratories, together with other American and Japanese manufacturers, are working to standardize the technology, and hopefully they will be done soon.

If you have already had a chance to see some HDR video demonstrations, please share your experience in the comments and let us know what you think about the emerging technology.

[Video: cinema5D at Inter BEE 2015 – Tilta / Came-TV – watch it on Vimeo]


Francis Rafal November 27, 2015

I went to Eindhoven (Netherlands) this year, just to see Mission: Impossible 5 in the Dolby Cinema in Dolby Vision (Dolby’s HDR technology). Unfortunately, although Mission: Impossible 5 was also graded in Dolby Vision, the theatre didn’t have the right file, so we ended up watching a DCI-P3-finished film screened with an HDR projector. In my opinion, the problem with HDR right now is the lack of content since most films are shot in HDR but then compressed to a lower dynamic range for finishing.

A day later we watched Pixels 3D in Dolby Vision and although I didn’t like the story, some of the pictures were breathtaking in terms of color richness and contrast. So I’m really looking forward to having a lot of HDR movies in the future!

Johnnie Behiri November 27, 2015

Francis Rafal, thanks for sharing!

Mathias Sonnleitner November 27, 2015

new? :D

Eno Popescu November 27, 2015

I say NO to the over-bright Dolby Vision tech! I already struggle to dim the brightness on my TV, and now they want to boost it to several times the intensity. Absolutely not!

Richard Van Den Boogaard November 27, 2015

Aside from some animation films, we've all long forgotten about 3D. I've been a proponent of HDR implementation for years. I'm curious to learn how the technology actually works, because RED had/has an implementation that involves the synced recording of two signals, each at a different shutter speed. A beam-splitter inside the camera recording at +3, 0 and -3 EV, combined with post-processing, would probably yield the best results.
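
Here is a rough sketch of how such a bracketed merge could work, assuming already-linearized frames; the hat-shaped weighting is a common textbook simplification for illustration, not RED's actual HDRx pipeline:

```python
import numpy as np

def merge_brackets(under, mid, over, evs=(-3.0, 0.0, 3.0)):
    """Merge three linear-light exposures into one HDR radiance map.

    `under`, `mid`, `over` are float arrays scaled to [0, 1], already
    demosaicked and linearized; `evs` are their exposure offsets in stops.
    """
    num = np.zeros_like(mid)
    den = np.zeros_like(mid)
    for frame, ev in zip((under, mid, over), evs):
        # Hat-shaped weight: trust mid-tones, ignore near-clipped
        # and near-black pixels.
        w = 1.0 - np.abs(2.0 * frame - 1.0)
        # Divide out the exposure gain to return to scene-referred light.
        num += w * (frame / 2.0 ** ev)
        den += w
    return num / np.maximum(den, 1e-6)
```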

It’s such a great time to be a filmmaker!

Patrick Zadrobilek November 27, 2015

HDR imagery has been used since the late '80s in computer-generated images for texturing environments, and has been used that way for many years. I think if they truly want to implement HDR in cameras and displays, then either you will need sunglasses to sit in front of an HDR display that can reproduce bright lights, or, when the HDR is processed so you can watch it "normally", it won't look real but comical in its shading and colors. In my opinion, HDR is not really usable as an end-user imaging system. It would be better to give displays higher bit depths rather than make them able to display a brighter image to simulate "real" light brightness.

Andreas Prohart November 27, 2015

…old hat if you are a photographer…

Zee Ristic November 27, 2015

Those guys are trying to mess things up again… let's not confuse real HDR with another attempt to sell us (or force on us) "new" tech. Try to imagine the happiness of all the content owners and LCD makers if they convince us that we all need to buy new HDR copies of all the movies we already own, and of course new TVs. That's what we are talking about here.
Instead, HDR is just the way you "pack" a picture into a file and "unpack" it on the screen. Just remember opening a Blackmagic DNG file in Premiere and tweaking shadows/highlights: there is your HDR, on any existing screen on the planet. So for real HDR, they just need a piece of software to put into any existing TV to "do the tweak" for you, and that's it: HDR. Of course the tech should go on, develop, advance… but that's not the issue being discussed here. And yes, the story about HDR here is simplified, but in a nutshell that's it.
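
As a rough sketch of that "rescaling" idea, here is a simplified global tone curve in NumPy (a Reinhard-style operator, illustrative only, not any vendor's actual pipeline):

```python
import numpy as np

def tonemap_to_display(radiance, exposure=1.0, gamma=2.2):
    """Compress a scene-referred HDR radiance map into an 8-bit frame.

    The Reinhard-style curve x / (1 + x) maps [0, inf) into [0, 1),
    pulling highlight detail into range -- a fixed-curve version of
    manually tweaking shadows/highlights.
    """
    x = radiance * exposure
    compressed = x / (1.0 + x)             # global tone curve
    encoded = compressed ** (1.0 / gamma)  # display gamma encoding
    return np.clip(encoded * 255.0, 0.0, 255.0).astype(np.uint8)
```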

Oscar Goldman November 28, 2015

"Just remember opening a Blackmagic DNG file in Premiere and tweaking shadows/highlights: there is your HDR, on any existing screen on the planet."

Is that sarcasm? Because you know that’s absolute bullshit, right?

Zee Ristic November 28, 2015

What's BS? Just load any BM DNG file, tweak exposure, whites, blacks… or curves… and you'll have everything from the deepest shadow detail to every detail in your clouds. Whether BM DNG is really HDR can be debated, but that's not the point here; the point is the principle. You can take a RED file with 19 stops (so they say, if I'm not wrong) and, doing the same process, you'll have all 19 stops on any plastic LCD [this is sarcasm :)] in the world.
The only important thing is that your camera/sensor "sees", let's say, 19 stops and that this is written into your file. From there you can take almost any screen and do a software "rescaling" of that range into the screen's range, and it will all be there on the screen. Actually, what you see in their own demo movie above is exactly that: they show a non-HDR file shot with a camera that has, let's say, a 10-stop range, and it looks bad; then they show the same shot from, let's say, a 15-stop camera and file, and they show it in HDR on the SAME SCREEN. Thanks for reading, and sorry if I've overwritten; I'm just trying to make it clear. More comments are more than welcome.

Oscar Goldman November 28, 2015

You can’t show HDR on a normal monitor, period. That’s what the special Dolby monitors are for.

One good thing to come out of this, hopefully: People will learn what HDR is and stop calling LDR crap “HDR.” That garish fad was bad enough, but then to misuse the term “HDR” made it worse.

Gregor Schulze November 29, 2015

In the last seven years, every three-stop bracketed photo was called "HDR". These images were shown on every monitor and in every printed magazine.
HDR is not yet properly defined, so you can't say "this is LDR… this is HDR". In my opinion you should look at the possible dynamic range of digital filmmaking: everything with more than 13 stops of DR looks more "HDR" than badly exposed DSLR footage.
The photos from my A7s have incredible DR. Compared to my old 5D Mk III, it is HDR (for me). I love S-Log2 for the high DR and the cool soft pictures, but I would like to increase the budget of my projects to work with a RED (or another RAW camera), because they deliver a much better HDR picture.
As long as HDR is not a defined standard, every over-expensive future-cinema showroom can be more HDR than other (cheaper) HDR devices.

Oscar Goldman December 1, 2015

Yeah, actually HDR IS defined. You should spend some time talking to color scientists, or maybe attend SIGGRAPH and familiarize yourself with what HDR is. For one thing, HDR requires a file format that can handle it, like EXR.
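
As a tiny numeric illustration of that storage point, assuming linear scene-referred values (NumPy, illustrative only):

```python
import numpy as np

# Scene-referred values: 0.18 = mid grey, 1.0 = diffuse white,
# 16.0 = a highlight four stops above white.
scene = np.array([0.18, 1.0, 16.0], dtype=np.float32)

# Typical 8-bit LDR storage clips everything above 1.0 -- the
# highlight becomes indistinguishable from white.
ldr = (np.clip(scene, 0.0, 1.0) * 255.0).astype(np.uint8)
print(ldr)                         # [ 45 255 255]

# Half-float storage (the usual EXR channel type) keeps the
# highlight's true magnitude.
print(scene.astype(np.float16))    # [ ~0.18  1.  16. ]
```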

Talk to a CGI-effect technician who takes light probes on film sets, so he can integrate CG elements into live-action scenes. He can explain to you what HDR is and why it’s important. Or do some reading on it. I’m not going to give a tutorial on it here.

Your statements are simply not true. You can absolutely identify LDR, because the person creating it has purposefully REDUCED the dynamic range of the images he’s gathered. By definition REDUCED dynamic range is contrary to HIGH dynamic range. This isn’t up for debate.

"every over-expensive future-cinema showroom can be more HDR than other (cheaper) HDR devices."

No. Devices can have more dynamic range. But that doesn’t make them HDR. By your “logic,” an iPhone 4 is an HDR motion-picture camera because it has greater dynamic range than an SD Flip video camera of 10 years ago. Obviously not true.

Zee Ristic December 2, 2015

When you watch any dynamic range test on ur 8bit screen and see 19 stops range, where is the HDR? Ur looking at it.

Oscar Goldman December 2, 2015

The fact that you think “ur” is a word tells us all we need to know about your “knowledge.”

Then there’s your assertion that 8 bits can display HDR.

You clearly have Internet access, and yet you don’t bother to inform yourself. That’s just lazy.

Bo Lorentzen December 10, 2015

Ahhhhh… Thank you!

Eric Bogan November 30, 2015

The New Technology Explained? I heard no real explanation. And the sample shots, with HDR both off and on, look crappy. HDR and 8K? No, thank you!

Emil H December 1, 2015

An Arri Alexa image already looks great on a Rec. 709 display… If we upgrade the dynamic range of TVs, wouldn't the image look flat? Is it just me, or does anyone else not see the purpose of HDR beyond a wider color gamut? Wouldn't we need cameras with 20 stops of DR to take advantage of these HDR TVs?

Oscar Goldman December 1, 2015

We do need cameras with film-like dynamic range (better than today), so the answer to your last question is essentially yes.

As far as current images looking flat on an HDR display: I would think that their curves could be expanded to fill the greater dynamic range, at the risk of banding in the gradients. Maybe someone with expertise in these displays can weigh in with a more authoritative answer.
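
A rough numeric sketch of that banding risk, reduced to a 1-D gradient (NumPy, illustrative only):

```python
import numpy as np

# An 8-bit gradient has 256 distinct levels between 0.0 and 1.0.
sdr = np.arange(256, dtype=np.float32) / 255.0

# "Expanding the curve" to fill, say, 4x the peak brightness just
# stretches those same 256 levels across a wider range.
expanded = sdr * 4.0

# Adjacent levels are now ~4/255 apart instead of ~1/255, so smooth
# gradients land on coarser steps -- visible as banding.
print(np.diff(expanded)[0] * 255.0)    # ~4.0: four times the original step
```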

Luke P December 3, 2015

I’m surprised that so many commenters here are against HDR. If you guys would like to learn more about color science and HDR, then watch this:


Zee Ristic December 3, 2015

What we are talking about here is some corporations trying to sell quasi-new tech. Adobe's Premiere is twenty years old, and they still don't have a color-against-color effect, or stability… What do they know about the picture!? They have shown themselves to be not competent on this subject (if they hadn't put in Lumetri a few months ago, which is another company's software, it would still be kids' software after 20 years, and they call it Pro!?).
So, on HDR: yes, it should be introduced and standardized, and we need it, but by SMPTE, MPEG… as a free universal codec covering the imaging sensor, file format, playback decoding and projection hardware, not by merchants (Adobe, Dolby… and others). Once SMPTE, MPEG… set the rules, everybody can go and take the "white paper". Otherwise they are trying to push us into buying expensive, half-working tech (like so many times before). Anyway, this whole conversation doesn't make too much sense, because in 2016 we're going to see cell phones shooting HDR 4K, so standards- and tech-wise the problem will resolve itself.