Prolost
Filmmaking with your nerd out.
by Stu Maschwitz
Mac Studio with M1 Ultra and Apple Studio Display, running Cinema 4D and Redshift.

Mac Studio and Studio Display
March 17, 2022

In October of 2021 I got to test a 14″ MacBook Pro with M1 Max processor. It performed so well that I, along with many Mac power-users, questioned whether it could replace my desktop Mac.

Last week, I reported that the answer turned out to be no:

What makes a computer powerful, for my workflow, is not just processing power. It's connectedness, and presence. It's speakers and microphones and ingesting CFAST while rapidly recalling raw files from fifteen years ago. It's the power of side-by-side displays that remind me in the morning of what I was working on the previous day, because nothing has moved and a dozen apps are still running. It's desk space and disk space, and most importantly, head space.

I love having a separate desktop and laptop. My ideal laptop should be nimble, freeing my desktop to be a workstation, intractably entangled in a jumble of peripherals.

My desktops of choice for many years now have been highly-specced iMacs, culminating with Apple's one-hit-wonder iMac Pro, which I spent Mac Pro money on and considered to be the ideal computer for my needs.

Until the very day after I wrote the above, when Apple announced the Mac Studio.

Split Personality

As an iMac fan I never saw any downsides to the all-in-one design. From 2009 to 2017, with each new iMac I bought, everything got better: processors, drive speed, displays. Upgrading all of them at once in a reliable, preconfigured package perfectly met my needs as a power user, not computer-tinkerer.

I was hoping that Apple would put the power of the M1 Max in a big iMac. But I was concerned that the direction they went with the iMac 24″, which is thinner than the original iPhone, would challenge two of the features I love most about my iMac Pro: its quiet cooling and plethora of professional ports.

For Apple to address this, they would have to do two things they've been steadfastly avoiding: build a mini-Mac Pro (or a big Mac Mini?) and release a standalone display for mortal humans. Not only was I unconvinced Apple would do this, I was skeptical I would prefer the result to the convenience of a pro-grade iMac.

So I watched the March 8th event with cautious hope, but little optimism.

Mac Stu(dio)

POV: You are Stu in his studio watching the reveal of the Mac Studio. Spoiler alert: Apple nailed it.

Before revealing what kind of container it would be in, Apple talked at length about "one last chip" in the M1, M1 Pro, and M1 Max lineup: the M1 Ultra. Simply put, it's two M1 Maxes "fused" together, which means it is configurable to double every one of the M1 Max specs, but is still seen by the OS as one multi-core CPU, and a single GPU.

The M1 Max runs cool, but will heat up and potentially even thermally throttle in the 14″ MacBook Pro. Two of them together didn't feel like something that could fit in a paper-thin all-in-one, even with a power cord and a fan.

And then Apple revealed the Mac Studio. Milled from a solid block of aluminum, this mini Mac Pro — or is it a big Mac Mini? — is exactly what I said I didn't want, and the minute I saw it, I knew I'd been wrong. Especially when Apple revealed the exact display I've been begging them to make.
The Studio Display is essentially the same 27″ non-HDR panel found in the 5K iMac, but with a Center Stage camera, barely-there bezels, and aluminum trappings. It had been convenient to upgrade both my display and computer with each new iMac, but there are downsides to using a consumer computer for pro work. It's not fun to add or remove peripherals from a tall display, and it's scary sometimes to know that when you tilt your monitor by a few degrees (or elevate your standing desk), you're yanking on a dozen or more cables. I immediately saw how decoupling the display from the computer could create a better experience for me — one that I simply wasn't open to in the form of a 40-pound behemoth rolling around on $400 wheels, enclosing airspace I would never fill with PCI cards I would never buy.

Profiles in Courage

I tested a maxed-out Mac Studio with M1 Ultra, 64-core GPU, 128GB of integrated memory, and 8TB SSD. This time I was prepared, with Maxon's benchmark tool for the Redshift GPU-based 3D render engine, and the same render-intensive After Effects project I had just profiled on the M1 Max laptop.

While the Mac Studio is configurable with an M1 Max, the M1 Ultra version should be the fastest Mac ever made, containing two complete M1 Max chips. I was curious how close it might come to being twice as fast as the M1 Max MacBook Pro.

Redshift Benchmark

The M1 Ultra rendered the Redshift Benchmark in six minutes, ten seconds. That's not quite twice as fast as the 11:08 from the M1 Max, but significantly faster.

Seconds to render Redshift Benchmark. Shorter bars are better.

In testing my own scenes, with render quality cranked up to unnecessarily high levels (I am me, after all), I got a similar proportion of results.

Seconds to render a super cool Redshift scene incompetently set up by me. Shorter bars are better.

The trajectory of Apple's integrated GPU performance is fascinating to me. On one hand, this is a graphics card-without-a-card with up to 128 GB of VRAM, performing at levels never before seen on a Mac. On the other hand, it's still outpaced by a high-end gaming laptop that unabashedly spins up loud fans and draws power like a Marvel villain. Will Apple's GPUs have terabytes of RAM by the time they outperform an NVIDIA card?

In practice, working with Redshift is way more fun on the big display and cool-to-the-touch keyboard of the Mac Studio. There's less "road feel" than with a laptop — you just push on it as hard as you can, blissfully unaware of how hard it's working. You would not say that about working on the Razer gaming laptop.

After Effects Beta

Where Activity Monitor shows Redshift pegging the M1 Ultra's integrated GPU to 100% for the duration of a Cinema 4D batch render, After Effects, even with M1-optimized multiprocessing in public beta, does not do the same for the 20-core CPU. Compositing is more I/O bound than ray-tracing, but this project uses very little media, and the SSD of the Mac Studio is faster than some RAM, so the issue is really that my chosen project uses a part of After Effects that is notoriously not modern: the 2.5D renderer introduced in, and not substantially updated since, 2001.

My little spaceships rendered faster on the Studio, but we've hit the point of diminishing returns.

Circle of Stone spaceship pass only, rendering in Adobe After Effects. Shorter bars are better. iMac Pro as tested: 3 GHz 10-Core Intel Xeon W, 128 GB 2666 MHz DDR4, Radeon Pro Vega 64 16 GB.
Double, or Trouble?

In my limited testing, with these highly-specific tasks, the M1 Ultra is handily faster than an M1 Max, but not doubly so. There's a soft shoulder to the performance envelope of the M1 line. I plan on running some more tests of course, but my sense is that those of us who make the creative tools that can truly take advantage of the M1 Ultra's power still have work to do to occupy every corner of performance possible.

After Effects beta rendering about seven frames at once of my benchmark project. CPU to spare.

Cool it Now

I didn't just render a few benchmarks on the M1 Ultra, I did my best to keep the little Alumilump rendering 24 hours a day. The design of this big little box is half fan, and I wanted to see and hear it pushed to its limits.

Frickin' lasers.

On the rare occasions that I could do that to my iMac Pro, that quiet cooling would become audible, and a stream of hot air would flow from the back. After an all-night Redshift render, the 14″ MacBook Pro with M1 Max would also get loud enough to call attention to itself in a quiet room, and I recorded the temperature of the housing at 100.8º F (38.2º C). This is hot enough to be uncomfortable, but nothing compared to some Intel MacBook Pros I've owned over the years, one of which would burn my fingertips routinely even when placed on an active cooling stand. There's a spot on the top of my 16″ MacBook Pro (2019, 2.4 GHz 8-core i9) that gets that hot when the machine is essentially idle.

After letting the Mac Studio with M1 Ultra render hundreds of frames with its 64-core GPU pegged the entire time, I placed my hand behind the box and felt the slightest waft of warm-ish air gently emitting. If I pressed my ear to it, I could just barely hear the fans over the other sounds in my studio.

This is why this thing is not built in to the back of a display. It's a quiet, cool beast with ludicrous specs.

It's a workstation.

After 20 straight hours of maxed-out GPU rendering, a wee puff of warm air.

Many Ports in a Storm

It's a workstation because it never breaks a sweat, but it's also a workstation because it is bristling with ports. The SD card slot is on the front, for human use. There are two USB-C ports right next to it (Thunderbolt 4 on the M1 Ultra). You know, for stuff.

Surely you ingest.

There are four Thunderbolt 4 ports on the back, alongside 10-gig ethernet and HDMI. All welcome, but the real joy is from the two USB-A ports. I filled them up fast and still had to resort to some adapters, but I'll be able to swap this one-for-one with my iMac Pro and not miss a Loupedeck or a Streamdeck.

Display's the Thing

The Studio Display lacks true HDR and 120Hz ProMotion, two things almost no one will miss. But for only $300 more than the LG 5K UltraFine Apple had been suggesting one might use with a Mac Mini or MacBook Pro, you get a factory-calibrated display with a beautiful aluminum housing and stand. Strangely, this display also contains a better processor than Apple shipped in the Dev Kit Mac Mini (or currently supplies in an Apple TV), presumably to run that Center Stage camera.

As you should expect from Apple, the Studio Display has great P3 wide-gamut color accuracy out of the box. In my testing, it doesn't seem to exhibit the mild and temporary burn-in that I sometimes notice on my iMac Pro.
It's a very good display, and should be viewed as the essential mate to the Mac Studio, although it will be popular with anyone looking to dock a MacBook Pro as well.

Center Stage

Center Stage is Apple's term for a front-facing camera that pans and zooms to frame people. This works by starting with an ultra-wide camera, and using facial recognition to drive a crop. Simple enough, except that doing just that would look terrible. No one loves how they look on the edges of a fisheye lens.

Apple is doing it the hard way, like they did on the 12.9″ iPad Pro; they are accurately removing the lens distortion and vignetting from the ultra-wide image, then panning and zooming across that rectilinear result with a virtual 3D camera. You can see perspective lines change as if the camera was physically panning. We learn to do this kind of photographically-accurate reframing in visual effects, and typically don't expect commodity hardware to do it in real time for chats with grandma. Unless it's Apple.

Of course, since you're looking at a crop of a heavily-processed image, resolution and clarity are what Apple trades for panning and tilting, and you can sure tell. On a portable iPad this makes a lot of sense. On a desktop, Center Stage may have the side effect of forcing you to clean a wider frustum of your room before that morning Zoom with the team.

Update: Many reviewers have noticed the suspiciously low quality of the camera, and Apple has stated that there's a software issue at fault and a fix coming.

We Don't Talk about Nano

If Mork had an iPhone would he be a Nanu-texter?

When I finally caved and bought a Pro Display XDR, I did not opt for the Nano-texture surface option, a $1,000 add-on designed to cut down on glare. I was already accustomed to positioning my desk to avoid reflections in my brightly-lit studio because of the reflective iMac Pro screen.

In 3D rendering we talk about surfaces being "energy conserving." This is a fancy way of saying that no more light can bounce off a surface than hit it in the first place, and that light can be either specular (like a mirror reflection), or diffuse (like paper), or somewhere in-between — but where it reflects specularly, it cannot reflect diffusely, and vice-versa.

This is why, while you can't see my reflection in the photo above, you can see a slight hint of a shadow on the screen from my window frames. Only diffuse illumination registers shadows (you can't cast a shadow on a mirror). Apple has traded specular reflection for diffuse.

If you could have a slider to blend between Nano and non-Nano.

When Apple started shipping shiny laptop screens (as an option) in the 2000s, it was because of this trade-off: diffuse light washes out a display, where specular reflectivity improves contrast, as long as the reflections aren't overpowering. When a display surface is highly specular, like my iMac Pro screen, manufacturers attempt to cut down on reflections by using anti-reflective coatings. No one had yet combined the idea of an anti-reflective optical coating with a matte display. But that's exactly what Apple has done with the Nano-texture Glass.

Testing the Studio Display with Nano-texture Glass on sunny days in my studio, it seems to achieve the best of both sides of the trade-off. It absorbs the majority of the direct light that falls onto it, and what light it does bounce back, it scatters into unrecognizable (and slightly purple) diffuse reflections.
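As an aside, here is a tiny sketch of the "energy conserving" idea from a few paragraphs up. This is my own toy illustration, not anything from Apple or from this post: the light a surface bounces back can't exceed what hit it, and whatever it reflects specularly isn't available to reflect diffusely.

```python
def reflected_light(incoming, specular_weight):
    """Split incoming light into specular and diffuse parts that together
    never exceed the incoming energy (assuming no absorption)."""
    specular_weight = min(max(specular_weight, 0.0), 1.0)
    specular = incoming * specular_weight
    diffuse = incoming * (1.0 - specular_weight)  # whatever specular didn't claim
    return specular, diffuse

# A glossy screen: mostly specular, so shadows (a diffuse phenomenon) barely register.
print(reflected_light(1.0, 0.9))   # (0.9, 0.1)

# A matte, nano-texture-like surface: mostly diffuse, so it scatters reflections
# into mush but readily picks up shadows from the room.
print(reflected_light(1.0, 0.1))   # (0.1, 0.9)
```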
It seems worth the comparatively-reasonable $300 surcharge if your environment is not well-controlled for light. Just beware that the Nano-texture surface is more difficult to preserve and clean than a standard shiny display.

Studio Display alongside a 4K iMac.

Out Standing

Apple's new hardware is based on ARM.

Apple gotta Apple, and one place they did it with the Studio Display is the $400 option of a tilt- and height-adjustable stand. In my opinion, which is corroborated by ergonomics experts, Apple's iMacs place the display too low on a surface that also supports the keyboard and mouse. I've always put my iMac on a riser, which is not only ergonomic, it's also useful as a place to stash stuff or route cables.

The luxury of the premium stand is not in the extra height it provides (it actually looks kinda E.T.-awkward in tall mode), but rather in the experience of adjusting it. So if you share a computer with a person with different ergonomic preferences, and would therefore be manipulating the display often, it might be something to consider. Otherwise I'd suggest you stack your Mac on a rack.

HDR is the Future

My rendition of how an LED array backlights an HDR display. I guess.

But we don't live in the future, we live in the present. Apple's approach to High Dynamic Range display (from the Pro Display XDR down to the 12.9″ iPad Pro) is a miraculously good implementation of a flawed method: using zoned backlights that are significantly less dense than the pixels of the display. I illustrated this process in my MacBook Pro post, but I didn't go into detail about how and when you can see artifacts associated with it. Let's put it this way: if you want to remain happy with your Apple HDR display, don't seek out videos designed to show where it breaks.

But even with a true per-pixel HDR display, such as my OLED TV, I remain skeptical of the value of really bright stuff in movies and TV. I like the deep, rich black and color accuracy promised by HDR presentation, but the actual High part in HDR still leaves me with mixed feelings. I wrote about this back in 2017 and still feel mostly the same.

I was reminded of this when watching the season two finale of Ted Lasso on Apple TV+. Early in the episode, Sam (Toheeb Jimoh) is in the locker room looking at a piece of paper. Typically my family watches Ted Lasso on a Standard Dynamic Range (SDR) TV, but this time we were watching on my 12.9″ iPad Pro. I was shocked by the creative decision to render the overhead lighting in the locker room at what seemed to be the very maximum brightness supported by the display. It completely distracted from Jimoh's beautiful performance. In fact, it simply made it hard to see, because my eyes had to adjust to the brightness of the background.

A frame from Ted Lasso. Photographed off the HDR display on a MacBook Pro.

That same photo, underexposed by three stops. The ceiling lights are just below clipping now.

Let me try to capture this experience for you. Here's the frame displayed in HDR on my 14″ MacBook Pro. It looks normal enough at exposure, but since the photo is not HDR, you can't see how much brighter the ceiling lights are.
But if I reduce the brightness of the photo by three whole stops, you can see what the true visual impression of this shot is: his face and the white of the paper he's holding are at such wildly different brightness values than the background that the shot is hard to look at — like trying to talk to a friend when they're sitting in front of a bright window.

This is such an odd creative choice (repeated throughout the series) that one might even suspect that Apple is pushing its show creators to grade for HDR effect, to show off their hardware's capability. But HDR mastering does not require your backgrounds to overpower your actor's faces. As I discovered when writing about this in 2017, Roger Deakins made damn sure this didn't happen in Sicario, for example.

A maximum brightness lets you tell the story of "bright" without having to literally be bright. Literal brightness steals focus. pic.twitter.com/vSMh12Fhbd
— Stu Maschwitz (@5tu) April 20, 2017

There are beautiful HDR experiences out there, such as Deakins's own work on Blade Runner 2049. But it's not an automatic win across the board. You shouldn't be watching movies on your computer screen anyway, but you will view your own iPhone-shot photos and videos there, and those are likely to be HDR now as well. So, are you missing out tremendously to not have a High Dynamic Range computer monitor?

No, for two reasons:

First: for home movies, HDR is fun, but not essential.

And B: you get a little of it anyway, thanks to EDR.

EDR is Apple's quietly cool tech for eking HDR visuals out of an SDR display. And for you, dear reader, I have quantified exactly how much HDR your Studio Display's EDR can muster.

sRGB: Hard clip at 1.0 white.

ACES: Rec. 709 ODT.

ACES: Down five stops.

Consider this image. It's an HDR test pattern I made that shows gray bars increasing in brightness, with text over each at exactly one stop hotter than the bar. I'm displaying it here as both a clipped-at-1.0 sRGB image, as well as an ACES Rec. 709 ODT version, the highlight-rolloff of which helps you see the otherwise-too-bright-for-SDR values. There's also a five-stops-underexposed ACES version, just so you can read all the text. The bottom bar really is 32 times as bright as "white."

The clipped sRGB version above is how the test pattern will appear on a pure SDR display. The white bar is the same pure white as the rest of the image, and you can't read the text on it.

And that's how it appears on the Studio Display:

With the brightness all the way up, the HDR test pattern clips at 1.0 white on the non-HDR Studio Display.

But only at maximum brightness. As soon as you reduce the screen brightness by even one tick, Apple's EDR technology kicks in and uses the headroom to show you values brighter than white.

EDR reveals one extra stop of HDR highlight detail.

Hey, that's HDR! Well, kinda. To be true HDR there are more requirements, such as deeper blacks than this uniformly-backlit display can muster (you'll only notice in a pitch-dark room). But it's a little bit of HDR, and that's just fine. And you get it without zoned backlight artifacts, or hey-where-did-my-$6,000-go artifacts.

Let's take a look at what true HDR gets us on the Pro Display XDR:

HDR test pattern on Pro Display XDR. Basically, one more stop of overbrights.

A stop is a lot, twice as much light. But it's also the bare minimum of extra range this chart can reveal. And hey, also notice that the image is not as uniform in color temperature as the Studio Display's.
The shadows are cool and the highlights are warm, where the visible range presents as neutral on the cheaper Studio Display. And yes, you can plainly see that dark border around the edge of the Pro Display XDR in everyday use.

The Pro Display XDR and the HDR displays on the 2021 MacBook Pros and the 12.9″ iPad Pro are good, don't get me wrong. But there are going to be people telling you you're missing out on the HDR party with the Studio Display. I hope this little exploration gives you some perspective on whether or not you agree.

Download the HDR Test Pattern

Enough Mac Prose

There's so much more to say about the Mac Studio, let's do a lightning round:

Is the Mac Studio beautiful? No. Did Apple intentionally make it utilitarian in appearance to head off any form-over-function critiques that turned out to be tragically valid regarding the trashcan Mac Pro? Good question.

Do I love the way this little Meatwad looks? Yes. I thought about hiding it in the cabinet alongside my RAIDs, but then I'd lose access to the front ports. It's a functional object and it looks like one. It's kind of, dare I say it, a truck.

When I first set this thing up and just plugged my old-fashioned USB-A mouse into it without a second thought, I felt joy.

If it's weird that the Studio Display has a better computer in it than many computers, what's even weirder is that the keyboard also has a computer in it, for handling Touch ID. Which is wonderful to finally have on my desktop.

The Studio Display is not just a monitor with a camera. It's a hub, with three USB-C ports and one Thunderbolt 3. It can also deliver enough power over USB to fast-charge a 16″ M1 Max MacBook Pro.

The Studio Display has speakers! And they do a sort of sound-bar attempt at spatial audio. As with the speakers in Apple's recent laptops, they sound better than they should for their size and orientation. Unlike with a laptop, Studio-ness demands real speakers, and there goes my second USB-A port to my DAC.

These are handy.

Configuring a Mac Studio

With the MacBook Pro, I suggested that if you weren't tempted by the maxed-out configuration, or accustomed to spending $4,000+ on a Mac laptop, then maybe you should consider the MacBook Air instead. This was a glib way of saying that the sweet spot for the pro-est laptop Apple's made since the 17″ "lunch tray" MacBook Pro was at the pro-est end. Basically, don't buy a 2WD Jeep.

I do not feel the same about the Mac Studio. The M1 Ultra version is extra in every way: It starts at double the base price of the M1 Max version. It weighs two pounds more, thanks to a copper heat sink. It is configurable to specs that your software almost assuredly cannot fully harness, and your wallet almost certainly will remember.

There's a real range of viable options for configuring a Mac Studio. It starts great and goes to eleven. So here's my buying advice:

First, the easy part: Everyone tempted by the Studio Display should go for it. It will be the perfect partner to a laptop or the Mac Studio. There's nothing else you should get. As I've said before, if you're the rare weirdo who should have ever bought a Pro Display XDR, you know it. Everyone else, here's your perfect Mac display. If you have extra dough to spend, Nano over E.T.-mode.

As for the Mac Studio, it will be hard to gauge your needs for a computer this far ahead of what's come before. But here are a few things to consider:

The SSD upgrades are expensive, but this is your one shot at getting a massive amount of insanely-fast, Apple-warranteed drive space.
If you edit big media files, stills or video, and are accustomed to working off a big external RAID like I am, you may want to consider upgrading the SSD and working locally for truly ludicrous multiple-streams-of-ProRes performance.

As I mentioned above, the front ports are slower in the Max than in the Ultra.

RAM upgrade? I've been living with 128 GB of RAM for years now. It was a luxury in 2017, and I can't imagine going backwards. But RAM hits different in the M1 world, and might not be the be-all-end-all of lazy ADHD computing like it was in the Intel era.

GPU upgrade? I bet you already know the answer to this one based on your workflow. This might be the exclusive domain of folks doing crazy computations on the GPU, like 3D rendering. That's all. There is no other valid use of GPU compute. Stop ruining the planet.

The everything-maxed-out Mac Studio I tested retails for $7,999 USD. Add the display and that bumps to $10,298. Not cheap — also not more than I've ever spent on a Mac. But let's look at another comparison:

A little context on the pricing of #MacStudio. Compare my iMac Pro config from Dec. 2017 with a Mac Studio + Display with max CPU, same RAM, and same SSD. Pre-tax: $9750 for iMac, $8,000 for Studio. #AppleEvent pic.twitter.com/ROYHXhAGOI
— Stu Maschwitz (@5tu) March 9, 2022

If you back off on the display extras and the SSD, the price drops to significantly less than I spent on my iMac Pro four years ago. Everyone's idea of reasonable is different, but Apple has the low end locked up with the M1 Mac Mini and MacBook Air, and now they have a steady progression into the extreme high end, with something for everyone along the way.

And truly, consider the comparisons above. Doubling the chip and the base price does not (yet) double your speed, especially where the GPU is concerned. So the Ultra option really is for those kinds of pros who can turn every second of faster render times into money — but for whatever reasons haven't extended that theory into a willingness to work on Windows.

More Mac Pros

Apple also said there's a Mac Pro coming.

I'm not sure what to do with this information.

I will be very curious to see how Apple differentiates such a beast from the Mac Studio. Time will tell, but Apple is on a tear with their home-spun silicon, and I wouldn't want my satisfaction with the Mac Studio to stop them.

There's also an interesting dynamic here, where as Apple doubles, and doubles-again their chip architecture, the CPU performance is pantsing the industry, but the GPU performance, for the tests that matter to me, is not accelerating upwards fast enough to compete with NVIDIA. There's still a performance gap to close on the GPU front — maybe the Mac Pro is where Apple will tell that story.

They're just making it harder and harder for me to keep my renders nice and slow. But I'll find a way.

Stu Studio

Apple made a Mac just for me, the Mac I didn't know I wanted and never dared ask for. It's the first new model of Macintosh computer in a very long time, and it's a more than worthy addition. As with the recent MacBook Pros, it speaks with design and features to a renewed focus on usability above all else, along with industry-moving performance.

The best thing that happened during a short week of having Mac Studio in my studio was that I got re-excited about doing computationally-intensive creative work. Since my Amiga 1000 days, I've always loved the feeling of my computer dutifully rendering a complex scene overnight.
Coming up with projects to keep this little beast busy stole time from more urgent responsibilities. Because, you see, I couldn't wait to see what I did with it.

The 2021 MacBook Pro alongside the cable-management fail of my iMac Pro

M1 Max MacBook Pro Long-term Report
March 07, 2022

Back in October when I got a chance to use a pre-release 14″ MacBook Pro with M1 Max processor, I openly questioned whether this laptop could replace my venerable iMac Pro. Four months later, I'm back with an update.

A good amount has happened since then, for example:

After Effects: Explores the Cores

Yes, I figured out how to make this render very, very slowly.

After Effects multiprocessing is now a thing, released for Intel and in public beta for Apple silicon. And it's largely good! It doesn't speed up the processing of a single frame, but allows After Effects to render multiple frames at once, both for interactive previews and final renders. Not every project is ideal for this kind of optimization. TANK, for example, is the kind of project that benefits greatly from re-using cached information from one frame to the next.

But I do have a rather heavy After Effects project from Circle of Stone, the one that I talked about at IBC 2019. I used After Effects 3D and expressions to create procedural flying car traffic for a futuristic matte painting shot. Like TANK, I used hundreds of layers to create the final effect. Unlike TANK, this project multi-processes well, and doesn't require any features or plug-ins that aren't yet running on Adobe's M1-optimized After Effects public beta.

Circle of Stone spaceship pass only, rendering in Adobe After Effects. Shorter bars are better. iMac Pro as tested: 3 GHz 10-Core Intel Xeon W, 128 GB 2666 MHz DDR4, Radeon Pro Vega 64 16 GB.

In just four months even Adobe, who were famously late to the Mac Intel transition, have optimized quite a bit of After Effects for Apple's processors. While we're not all the way there (crucially, I am still not able to run the TANK benchmark project from my previous post), the proposition of replacing my desktop machine with this portable powerhouse became even more pressing.

The Display: The Best There is, but Not Perfect

I'm of two minds about the MacBook Pro display and Apple's mini-LED HDR displays in general. On one hand, they look great, have excellent color accuracy out of the box, and compete favorably with both professional HDR reference displays and Good Computer Monitors. On the other hand, if Apple bills them as being good enough for professional HDR video color grading, they have to figure out how to close the gap between zoned-backlighting, which does have artifacts despite all of Apple's cleverness, and true individual-pixel HDR.

The Hardware: As Good as You'd Hope

The laptop is a solid workhorse. Only a few issues have come up during my time with it:

I noticed that the display would frequently seem dim when I opened the laptop after some time. I'd have to tap the F2 key a few times to get it where I'd like it. This seemed to go away after I turned off "Slightly dim the display while on battery" (in the Battery section of System Preferences), but I don't recall ever needing to do this with previous Apple laptops.

You may recall that I didn't post any benchmarks of copying files off an SSD via the built-in card slot. This is because I was getting strange behavior that I wound up reporting to Apple, where the copy operation would hang for a long time before initiating.
This seems to have been cleared up in non-beta macOS updates since I first encountered it. SSD copying is now just fine.

Ugh. My Return key is sticking, just a little. Just enough to be annoying.

The notch turns out to be very much not a problem day-to-day. But having owned the 12.9″ iPad Pro, it's frustrating to have a lesser front-facing camera on this more-expensive, more-pro, more-always-making-you-aware-of-a-camera's-presence device. This is not a huge deal breaker for me though, as I am not a person whose participation in a meeting is enhanced by camera clarity.

Everything else that you'd hope would be great about this hardware, is. The battery life is as promised, engendering iPad-like charge-it-once-every-few-days habits. The HDMI port is hugely useful, the cooling works so well you'll never know it's there, and the function keys are great to have back, alongside Touch ID.

So it's a solid machine, a fast machine, a reliable machine. Is it ready to be my daily desktop driver?

But It's Still a Laptop

Even in light of these findings, I never could fully insert the M1 Max into the role of my venerable iMac Pro.

My iMac Pro is not just a computer, it's a workstation. I use every one of its eight USB ports (four USB-C, four USB-A), plus a hub, and the 10-gig ethernet jack as well. I run color control surfaces and audio DACs, a Stream Deck and multiple external RAIDs, and even a second display.

What makes a computer powerful, for my workflow, is not just processing power. It's connectedness, and presence. It's speakers and microphones and ingesting CFAST while rapidly recalling raw files from fifteen years ago. It's the power of side-by-side displays that remind me in the morning of what I was working on the previous day, because nothing has moved and a dozen apps are still running. It's desk space and disk space, and most importantly, head space.

So while I absolutely adore this laptop, it has not replaced my desktop, which is why I'm hopeful and excited about whatever Apple may have in store for us tomorrow and for the rest of the year.

The M1 Max MacBook Pros
October 25, 2021

Apple opened their October event with a young musician creating an Apple-inspired music track in a dingy garage filled with gear worth tens of thousands of dollars. Some viewers commented on the unrealistic portrayal of a creative professional. But I felt like I was looking in a mirror.

If Apple's target market for the new MacBook Pros with M1 Pro and M1 Max processors is scruffy weirdos in grungy surroundings with suspiciously killer kit, I am dead-center in their cross-hairs. By day I'm an executive software-maker at Maxon, helping create Cinema 4D and the Red Giant tools. By the other half of the day, I'm a filmmaker and a photographer working out of a no-frills loft in beautiful downtown Emeryville, California, home to Pixar and potholes, Bay Bridges and burning trash bins. Like the camera-ready A. G. Cook, I have a Pro Display XDR perched on some reclaimed barn lumber. Unlike him, my work involves nits as much as it does decibels.
Little guy making a play for his turn with the XDR

This won't be an exhaustive review, but I've had a 2021 MacBook Pro for a few days now, and I've been able to do enough with it to weigh in on whether it might have a place in your hipster garage of pro-ness.

Here's what Apple sent me:

14‑inch MacBook Pro - Space Gray
Apple M1 Max with 10-core CPU
32-core GPU
16-core Neural Engine
64GB unified memory
4TB SSD storage

This maxed-out silicon configuration is almost exactly what I would buy for myself, except I might opt for the 2TB SSD. With 2TB the price for the 14″ totals $4,099; for 4TB, add $600 USD.

You can indeed configure these machines well into the $5,000 range, but that is not new for Apple's highest-end laptops.

What is new enough is the melding of the M1 unified memory architecture with a "pro" specification. A laptop with a 64GB GPU and an SSD as fast as RAM from 10 years ago is so weirdly new that it has a lot of would-be garage-pros confused about where their sweet-spot configuration might be.

The Lazy Pro

The last tower-shaped Mac I owned.

I'm not a Mac Pro guy. I just know myself well enough to know I'll be too lazy to pull apart my computer and swap out parts.

When Apple announced the Trash Can, I recommended folks consider spending the same money on two iMacs. My habit at the time was to budget a replacement maxed-out iMac every three or so years rather than spend more on an ostensibly upgradable Mac Pro.

This frequent-iMac-upgrade plan worked great for me for about a decade. With each new machine, everything got better — CPU, GPU, storage, memory, and display.

This culminated with a machine that seemed to suggest Apple agreed that iMacs are ideal for professional work. When Apple released the iMac Pro, I immediately bought two iMacs-worth of it. Four years later, with ten CPU cores and 128 GB of DDR RAM, it's still my solid workhorse. I frequently have a dozen apps open at a time, and it runs 24/7 executing automations and remote and local renders. The iMac Pro is incredibly stable — I might restart it once a month or so. I never hear its fans over the other noises in my studio.

While I've had to work hard to find any apps that will push its CPU to the limit, the same has not been true for GPU. I color graded a 20-minute short in 4k on this machine, and it did eventually get a bit bogged down.

At the four-year mark, the iMac Pro is about ready for a replacement. After the impressive launch of Apple's home-grown M1 processor, I've been thrilled to imagine what the next pro iMac might look like. What I did not expect is that a laptop might beat it to the punch in replacing my trusty desktop powerhouse.

The 14″ MacBook Pro next to the titanium G4 PowerBook

A New Old Design

I gotta be honest, I made some kind of sound when I opened the box. Maybe like a half-gasp, half chuckle. This thing just looks great. You will get mad at yourself for being so happy that Apple has brought back things it willfully removed, like useful ports and keys. But you'll get over that quickly as you bask in the sense that Apple has made this thing just for you, you professional garage weirdo.

Touchy About the Bar

I was open to the Touch Bar when it was first introduced. We jumped on supporting it right away in Slugline. I notoriously love alternative input devices, whether it be color control surfaces or keypads, trackballs or touchscreens. What I came to realize though was that however promising the Touch Bar was, it was never an additional input method.
It came at the expense of function keys — and as boring as function keys are, they are damn useful. The Touch Bar not only failed to be better than the thing it stole real-estate from, it also didn't work reliably, and seemingly struggled to hold even Apple's attention. I never managed to build habits around it, as I am only a part-time laptop user.

I applaud Apple for trying the Touch Bar, and feel bittersweet to see it gone. I hope that Apple won't stop experimenting with ideas like this, and listening to their users about the results.

From "Courage" to Humility

That uncharacteristic willingness to admit that a grand experiment did not pay out is perhaps the single most dominating vibe of these computers. Apple is not known for graciously admitting a mistake, yet here we have laptops that so resoundingly repudiate their design assertions of the last half-decade that it's hard for us pros to not feel at least seen, if not downright vindicated.

The SD card slot is back. Its removal was a bet on a wireless future — a bet that no working photographer would take today (I once watched a Sony employee struggle for a good portion of an hour to link a Sony camera to a Sony phone). SD cards were useful in 2016 when they were obliterated from Apple's laptops, but not exclusively so. Back then, bigger DSLRs mostly shot to bigger and faster CF cards. But in 2021, SD card speeds have earned the smaller storage sticks a place in even my Sony a7RIV, where every shutter click results in a roughly 62MB file.

The HDMI port speaks for itself, but it's the return of MagSafe that feels like the most profound reversal of course. There can be no sensible explanation for why it was removed now that we've seen how it can so perfectly coexist with USB-C charging.

And then there's the keyboard. Inverted-T arrow keys, of course. Function keys are not only back, they're full-height. That's a statement, as is the black surround. Apple is visually emphasizing what's both new and old, in a way that seemingly pays homage to the titanium G4 PowerBook, the Mac that set the course that Apple's laptop designs have been sailing on for 20 years.

Why do I still have my TiBook?

Of course, it also ensures that folks at the coffee shop will know you have the new one. The combined package does that thing Apple excels at, where a new design makes your existing device instantly feel old and clunky.

Welcome to Mac-y Notch

Both the 14″ and 16″ MacBook Pro models feature a notch at the top of the screen where the FaceTime camera and display-related sensors reside. On paper it seems like this will take getting used to, but in practice it's quite easy to forget about — except in Cinema 4D, which already has a hard time fitting all of its menus on smaller screens.

Two key takeaways for the notch: First, it's pure bonus. The screen below the notch is the typical Mac laptop 16:10 aspect, and the notch area adds 74 pixels more for the (now slightly taller) Monterey menu bar to straddle it. So you're not losing one notch's-worth of screen, you're gaining the two "ears."

Second: Finally the bizarre menu bar transparency Apple added a few macOS revisions ago makes sense. Choose a dark wallpaper image, leave Reduce Transparency off (for me, this is a change from my usual Mac setup), and enjoy a dark-mode-esque menu bar into which the notch all but disappears, even when the system appearance is light.

The Display

Part of the reason that the best Macs have built-in displays is that Apple makes the best displays.
I wrote about the Pro Display XDR and Apple's EDR technology at the end of last year, and since then Apple has miniaturized their mini-LED backlit displays to fit in an iPad Pro and now these laptops.

Mini LED is not as pure an HDR delivery method as, say, OLED, where every pixel is individually addressed. As with the 12.9″ iPad Pro, you can spot the characteristic blooming artifacts around starfields and bright titles against black. But only if you're in a pitch-black room and looking very closely.

Think about it this way — Apple touts 10,000 LEDs, which sounds like a lot. But that roughly measures out to a grid of, say, 132 × 76 LEDs. And Apple has not claimed that the illumination zones are single-LED in size. So the HDR-ness of these displays is far lower resolution than an Apple Watch Series 3 screen.

A fanciful rendition showing how the LED backlight array is coupled with a traditional LCD panel.

It shouldn't work as well as it does, but for the most part, the display simply looks great — and then when you throw some real HDR imagery up and those 1,600 peak nits kick in, it transforms into a shockingly gorgeous thing.

But, crucially, not just gorgeous — your absolute best bet for the correct gorgeous, under a variety of viewing conditions.

The TLDR of a future @prolost post is basically: If you want to do color management right, buy a recent Mac with a built-in display — and that's it. I'm sure it won't be controversial in the least.
— Stu Maschwitz (@5tu) March 6, 2021

I'm not quite ready to open those floodgates today, but I'll reiterate the sentiment: Apple's displays are calibrated, profiled, accurate, and consistent, at a commodity level. The same display can show color-accurate HDR right next to color-accurate SDR.

Which makes it all the more frustrating that Apple doesn't make a sensible monitor to attach to these new computers.

How do you demonstrate an HDR display? Obviously you take an HDR photo, display it in HDR on the HDR display, and then take another HDR photo of that. Then you make an animated GIF of exposures to show how the HDR highlights compare to the white of this page. Obviously.

My hope is that the Pro Display XDR is like the 2011 Tesla Roadster — an open ploy for the money-is-no-object customer to fund the development of more broad-market options. If a 12.9″ iPad Pro can sell for $1099, then surely Apple could sell a larger version of that display, sans computer guts, for less than $6,000?

The 2021 MacBook Pro displays are miraculous. But these laptops desperately want to be connected to one or more additional displays.

Because they are taking a real shot at replacing your desktop.

Max Out the M1 Max in your Macs?

It was interesting to listen to the ATP guys struggle to figure out their ideal configurations. Spoiler alert: none of their decisions will make any sense to you.

But it's a real challenge, because the speeds and feeds stats we're so accustomed to are now intermingled and out-of-scale. What is the importance of RAM on these new integrated systems? I've been doing genuine production work on my M1 Air, and it only has 16 GB of shared memory. The new architecture makes our old assumptions obsolete.

Typically a big reason I splurge on RAM is for the playback cache in Adobe After Effects. But Apple's latest SSDs could be as fast as the RAM in your last computer, so when After Effects reverts to its disk cache, you may not notice.

I think I have an answer to how you should configure these new computers. You're going to hate it.
But first...

Some Performance Numbers

This is not my thing. I'm not a fastidious hardware tester, and I haven't had much time with this MacBook Pro.

Worst of all, being me, I ran the most extensive tests using Adobe After Effects, which is not M1 optimized, and has never been famous for using multiple cores well.

Prepare to hate this part. Hit me up on Twitter and tell me what paces you'd like to see me run this thing through.

After Effects: Ignores the Cores

The first test project I chose was 300 frames from my film TANK. This is a deep and complex After Effects project with hundreds of comps, tens of thousands of layers, and a rat's nest of complex expressions.

I was delighted to see that my trusty iMac Pro turned in per-frame averages around 27 seconds. This is at least twice as fast as when I actually rendered the film, an improvement that has to be almost exclusively due to the improved JavaScript expressions engine. Good job, After Effects team!

I chose After Effects for my testing because it matters to me, but After Effects is not an ideal tool for measuring a computer's raw power. While cooking my TANK frames, the ten cores of the iMac Pro hovered at about 15–20% utilization. This lack of parallel processing ability in my most-used creative app is a big part of why I chose the iMac Pro configuration I did — the 10-core was the best choice for single-thread speed.

After Effects leisurely rendering on my iMac Pro

Now that we've established that my CPU test is utterly ridiculous, let's double down and compare it with After Effects running on a computer that the After Effects team has probably never seen, under emulation. That's right, at the time of this writing, After Effects runs under Rosetta 2 on M1 Macs.

So how fast did the pretend computer running on a real computer render 300 frames of my short film?

Minutes to render 300 frames of TANK. Shorter bars are better.

Almost exactly the same.

Let that sink in: After Effects is about as fast running in emulation on the M1 Max as it is on a $10,000 desktop computer from four years ago.

Oh, and there's, ahem, one more thing.

The MacBook Pro was not plugged in for this test.

In a crazy stroke of fate, my entire block had a power outage the first full day I had with the M1 Max. So my first test render was sans juice.

Once the power was restored, I launched the same render again, now with the MacBook Pro powered via USB-C. And then I got a reminder of why I hate doing these tests.

Why was the render slower under AC power? I don't know for sure, but I suspect it's the common "new device syndrome" that plagues tech reviewers. When a Mac (or iPhone or iPad) is newly set-up, it has a lot of housekeeping to do: syncing account data and photos, checking for updates, and various other background tasks which, under Monterey, could include scanning every single photo in your library for text to OCR.

My guess is that some of these tasks had been paused under battery power, and resumed once I plugged-in, stealing a few cycles from After Effects.

A few days later I re-rendered the sequence and got results identical to the battery test. But that's still very notable: After Effects did not push the M1 Max hard enough to engage any kind of power throttling. In fact, the laptop never even got noticeably warm during these renders.

After Effects running on the M1 Max.

I rounded out my testing with my Intel 16″ MacBook Pro (2.4 GHz 8-Core Intel Core i9) and my M1 MacBook Air (16GB).
The 16″ got right in there with the iMac and M1 Max, and got hella loud and hot in the process. The M1 Air quietly and coolly chugged along to provide the only result not within a margin of error with the others.

Minutes to render 300 frames of TANK. Shorter bars are better.

Update 2021-10-28

Adobe has released a multiprocessing update to After Effects, as well as a public beta of an M1-native version. More tests will be needed, but for now there are specifics about this beast of a project file I chose as my benchmark that complicate that.

Adobe Premiere: More Testing Needed

I loaded some 4k ProRes footage into Premiere and layered a few Magic Bullet Colorista corrections on top, including a key and an animated mask. Then I added Magic Bullet Renoiser for grain, and a light pinch of Mojo. I set the preview resolution to full, and pressed play. Silky smooth, even at full-screen.

Unlike After Effects, Premiere is M1-native, as are the Magic Bullet effects. So this is a reasonable test, and a very promising one.

There's a lot more to test here. I plan on de-archiving the 20-minute short film I graded in Premiere a few years ago, the one that started to bog down my iMac Pro. A big difference there is that the footage is XAVC, which Premiere has to work a lot harder to decode. In my limited testing, Premiere did not love heavily inter-frame compressed footage on the M1 Max any more than it does on my Intel Macs.

Macsin and relaxin'

Cinema 4D and Full Metal Redshift

Again, I have only scratched the surface here, but, well, just watch:

That's Redshift, Maxon's GPU render engine, pegging the integrated graphics to render an IPR session (that's Interactive Photorealistic Rendering) in Cinema 4D. I've never seen this kind of performance from a Mac.

Oh, and this was recorded while the MacBook Pro was on battery power.

Update: Some Real Numbers

This morning after posting I ran some proper Redshift benchmarks, pitting the M1 Max against a Razer gaming laptop with a GeForce RTX 3080. This is a $4,000 laptop (that is currently unavailable, like all things NVIDIA), the power brick of which feels like it weighs as much as any Mac laptop.

The M1 Max completed the Redshift render in 11 minutes, 13 seconds. The Razer took less than half that at 4:29.

But on battery, the numbers are much closer. Again, the Mac weirdly sped up without AC, coming in at 11:08, where the Razer slowed to 9:52.

Seconds to render Redshift Benchmark. Shorter bars are better.

The NVIDIA card with its RTX module specifically designed for ray-tracing certainly has the edge — as long as you're plugged-in. Also worth remembering: Redshift for Mac is fresh out of beta and still in active development.

The Razer is a damn fine laptop with a touch OLED display and a GPU you can't buy because bitcoin? Comes with free lead brick.

Issues

Not everything was seamless with my little 14″ powerhouse. It came with macOS Monterey 12.0.1, which seems very Big Sur-like in compatibility with apps I use, and has some lovely features such as Focus modes that sync with your iOS/iPadOS 15 devices. But Shortcuts, which I was excited about, is still quite buggy. The TV app failed to play purchased movies from my iTunes library, which is a bummer when you're trying to test an HDR display.
I couldn't get Sidecar to work with my iPad Pro running iPadOS 15.1 public beta.

Adobe Premiere blazed through a ProRes encode but maxed out the CPU and then crashed on an H.265 encode.

And as port-y as the new ports are, all my three-button mice are still USB-A. So I'm still rockin' dongles.

The Biggest Issue

...is that, like I said above, this machine wants to be a desktop. Now, me and A. G. Cook can plug ours into our Pro Display XDRs and make that happen, but unless HDR color grading is a well-paying gig for you, the XDR makes no sense.

Presumably Apple will make my dream M1 MaxiMaxMax iMac soon enough, but I'll still want to dock my laptop to an Apple-quality display. I currently do that with the LG UltraFine 4k, a black plastic nothing that doesn't support EDR.

It's time for Apple to make a display for normal garage weirdos.

How to Configure your 2021 MacBook Pro

I am in a weird position here in that I already have two of the very few Intel Macs that are marginally speedier than these new laptops at some tasks. I also own an M1 MacBook Air, which snuck up on me and stole my heart. It's light, it feels fast, and it punches miles above its weight. It's the cheapest Mac I've mentioned here by a healthy margin.

So my advice is this: Go big or go Air. Either max out your M1 Max, or don't bother with these machines. These MacBook Pros exist to compete at the very highest end of laptop performance, so don't buy one that's not racetrack-ready.

Spending the extra money to max out my iMac Pro has kept it useful for at least a year longer than my usual iMac cycle. In that year, a lot has happened. So spending as much as you can now on a computer might buy you extra time, during which I can almost guarantee you, Apple will release something that makes these machines look old and clunky.

On the other hand, the M1 MacBook Air is just an insane amount of computer in an affordable, sleek package. If you've never used one, you will be shocked at how fast it feels. And if you've never spent five grand on a Mac laptop before, I'm not clear on why you'd start now.

I bet you hate this advice. Here's why I feel good giving it: You will ignore it. Because you are a pro. You are in your garage with your warped priorities and your cool gear and you know exactly what you want to do with this computer. Like me and my silly After Effects project file, you may never peak those performance bars. But you'll love knowing that you could.

Towering Over

These days it doesn't require much patience to have your laziness vindicated. Shortly after the release of the new Mac Pro, Apple made available the Afterburner card. This $2000 add-on offers hardware-accelerated ProRes encoding and decoding. Now that same power is essentially built-in to the iPhone 13, and these new MacBook Pros.

Apple is going hard with their in-house processors, and with these pro laptops, I think they are showing their vision for the future: integrated, enclosed, and efficient. Just like the best Macs have always been.

I would not be surprised if the days of the towering Mac full of PCI Express card slots are over. Apple has demonstrated that they can scale the M1 to be competitive with big, expensive power-hungry laptops with dedicated GPUs. We knew they were competing with Intel. Now I think it's clear they intend to go toe-to-toe with NVIDIA as well, on the desktop as well as in our backpacks.

Time will tell. But for now, I have to decide if this laptop is going to be my new desktop.
Tags: Apple, Magic Bullet, Cinema 4D, Redshift

Linear Light, Gamma, and ACES
June 22, 2021

Imagine a digital 50% gray card. In 0–255 RGB values, it's 127, 127, 127. On the RGB parade scope, the card is a perfect plateau at 50%.

Now imagine increasing the exposure of this scene by one stop. "Stops" of light are an exponential scale, meaning that subtracting one stop is cutting the light in half, and adding one stop is twice as much light. The light in our image is expressed in RGB pixel values, so let's double the simulated light in this scene by doubling the brightness of the pixels.

Predictably, the 50% region has doubled to 100%. The perfectly-white regions are now overexposed to 200%, which looks the same as 100% in this non-HDR view. Our idealized pure-black patches remain unchanged.

But anyone who has moved a camera out of Auto mode knows that overexposing by one stop does not slam middle-gray into pure white. And anyone who has shopped for physical camera charts knows that you don't buy "50% gray" cards. A middle-gray card at a camera store is an 18% gray card. So what's up?

Yes, We're Back to This Again

Back in 2009 (yikes) I tried to draw to a close my long history of writing about linear light and how it affects 3D rendering and compositing. But a funny thing has happened since then — along with many formerly niche Prolost subjects such as large sensors, 24 fps, and cinematic color, the topic of color management has become, and I can't believe I'm writing this, popular?

That is thanks largely to ACES, the Academy Color Encoding System aspiring to become the industry standard for managing color for motion picture and television production. ACES builds on the ideas of performing certain kinds of creative work in a realistic model of light, and adds an output rendering that is so creatively friendly that a new generation of 3D artists have seized on it as a key part of generating realistic and/or pleasing imagery.

The other reason I'm back to this is that, in looking back at my numerous posts on color, gamma, and linear floating-point, they reflect a process of discovery, exploration, and advocacy — but they don't coalesce into one convenient archive of information. Much of my unabashed championing of working in linear light was in the form of my tutorial series on eLin, which has long been taken down as eLin itself is now blessedly approaching a decade and a half of obsolescence.

This post is an attempt to consolidate, summarize, and modernize the Prolost take on film color management. Buckle up, it's a long one.

Middle Management

An 18% gray card appears "middle gray" to our eyes because we humans do not perceive light linearly. Human vision has a "gamma" of sorts — a boosting curve that pumps up our perception of darkness and compresses highlights. I've heard this explained as a survival adaptation — it's easier to see a predator or prey in the dark if we boost up the midtones on our monkey goggles.

Raw light values without gamma.

An approximation of the roughly 2.5 gamma of human eyesight.

It's complicated, but the non-linearity of our vision closely matches a few historical imaging methods, such as the densities of dyes on a piece of film, and the voltages in a CRT.
So by a combination of happy coincidence and clever design, images that “look right” to our eye on modern displays have a gamma that aligns with the way our brains transform light into pictures. For the purposes of this discussion, you don’t need to deeply understand all that (exhibit A: your dear author). All I want you to take away from this section is: linear images, where pixel math aligns well with real-world light phenomena, don’t look “right.” An 18% gray card looks middle-gray both to our eyes and on our devices because of a shared, complementary nonlinearity. Our eyesight has a gamma, and so do the images.

Why do we Gamma?
This convenient alignment actually makes it counter-intuitive to imagine working with real-world light values. If a 50%-bright thing on the display looks 50% of the way between black and white to our eyes, where’s the problem? The problem comes when we want to model the real-world behavior of light. In VFX, we do this in 3D rendering of course, but also in compositing. That obviously-wrong one-stop-over-is-blown-completely-out gray card example at the top? We call that “working in display-referred space,” and it’s how a lot of computer graphics were created in the early days. It wasn’t right, and it often didn’t look right.

Light Wins
In the mid-nineties I was part of a commercial shoot so ambitious that the post house sent their technical wizard/color scientist to the set. We were shooting on 35mm film, of course, and had an elaborate post session planned that was, if you can believe it, to be handled largely using a video switcher, not anything digital. Our animation crew was preparing to dangle some props in front of a greenscreen, and we asked him what we should do for the strings. Use fishing line? Paint them green? We were not anticipating having the ability to digitally paint out the strings (the Flame was just in beta back then!), so our decision here mattered a lot. He suggested matte-black thread. “With the smallest amount of motion, the strings will disappear against the exposure of the greenscreen.”

I and my fellow art school graduates were dubious. Surely black would be highly visible against bright green? We shook off our skepticism and took his advice, and of course he was right. But I didn’t quite understand why. In my mind, a black string would stand out against a green background — and even if it was motion blurred, it would still be a very visible black blur.

Model Hanging from Black Thread: Recreated in sRGB gamma.
Simulated Motion Blur: Performed in non-gamma-managed sRGB. How I erroneously imagined a real-world blur would look.

The simulated blur above is what I thought the film would record, because I thought light and dark things were all equally weighted in the motion-blur soup. I was thinking that light mixed in units that matched my perception. But the linear quality of light means that bright things occupy more of the number-space of the simple math we use to blur and layer digital images. So light “wins.” Here’s the same simulated model shot with simple sRGB gamma management:

Gamma-managed Composite: Layers composited in linear light.
Gamma-managed Blur: The simulated motion blur is performed in linear gamma.
For Comparison: Here’s the non-gamma-managed blur. Note how much darker the strings are.

In this example, the jet, the strings, and the background are converted from video gamma to linear using an sRGB curve, making them appear darker. Then the blur is performed. An inverse sRGB curve is applied to the result, brightening it back up.
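If you want to poke at this yourself, here is a minimal sketch of that convert, blur, convert-back pipeline. It uses a pure gamma 2.2 in place of the exact sRGB curve, a crude 1D box blur standing in for motion blur, and made-up pixel values for a bright greenscreen and a dark thread; none of this is the actual compositing code behind the images above.

```python
import numpy as np

GAMMA = 2.2  # close stand-in for the sRGB curve

def to_linear(v):
    """Display-referred (gamma-encoded) values -> linear light."""
    return np.power(v, GAMMA)

def to_display(v):
    """Linear light -> display-referred values."""
    return np.power(v, 1.0 / GAMMA)

def box_blur(v, radius=2):
    """Crude 1D box blur, standing in for motion blur along one axis."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(v, kernel, mode="same")

# A bright greenscreen (0.8) with a dark string (0.05) across it,
# as display-referred pixel values.
strip = np.full(21, 0.8)
strip[10] = 0.05

naive = box_blur(strip)                           # blurred in display space
managed = to_display(box_blur(to_linear(strip)))  # blurred in linear light

print(naive[10], managed[10])
# The gamma-managed result is noticeably brighter where the string was:
# the bright background "wins" over the dark thread, as it does on film.
```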
The pixels that aren’t mixed or blurred look identical (they “round trip,” as we say), but the blurred areas of the image now reflect the real-world phenomenon of light’s predominance over dark.

Another real-world example from my own history of discovery: In 2003 I snapped this photo of possibly the greatest movie poster ever, but accidentally shot a second exposure as I moved the camera, capturing some streaky motion blur. Of course I tried blurring the sharp photo to match the streaks in the blurry one, but performing the blur in the native sRGB gamma of the camera JPEG resulted in muddy blur thanks to the perceptual mixing. But wrapping the synthetic blur in that sRGB → linear → and back pipeline makes it a near perfect match.

A photo I took for some reason. This movie rules.
The blurry accidental exposure: Straight out of the camera.
Blurring the original to match the blurry shot: In linear light, with an sRGB wrapper. Compare this one with the true blur.
Blurring the original to match the blurry shot: In native sRGB gamma.

There’s one more experiment you can easily perform yourself to see light winning through your viewfinder: print a fine checkerboard, and photograph it both in and out of focus.

Photographed checkerboard.
Photographed checkerboard: Focus set to deep FG. Note where the defocused checkerboard appears in the waveform.
Synthetic Defocus in Display Space: Note the dark appearance.
Synthetic blur in linear gamma: Compare with the photographed blur.
Simplified example: A CG checkerboard.
Display-space blur: Converges to 50%.
Gamma 1.0 blur: Converges to a brighter value, just like the photo.

Blurred in display space, the checkerboard converges to a logical 50%. But in linear light, the checkerboard smudges out to something brighter than 50% (0.5 ^ (1/2.2) = 73%), just like the real photographed sample. The history of my advocacy for a linear workflow has been full of examples like this. Motion blur, defocus blurs, simple compositing operations, 3D lighting and shading, combining 3D render passes or live-action exposures, even anti-aliasing of text, all look better, more organic, and more realistic when performed in gamma 1.0.

Linear Light & HDR are BFFs
In both the real world and in gamma-managed image processing, light overpowers dark. And so far we haven’t even broached the subject of HDR. When you add the ability to process pixel values greater than 1.0, light has even more opportunity to “win,” clobbering other elements in the mix.

Linear SDR Composite: This composite is gamma-managed, but the sky is capped at 1.0.
Linear HDR Composite: The same gamma processing, but including the HDR sky values. This example was discovered in the ashes of Pompeii.

Back to that Gray Card
To create a linear-light version of that gray card example, 2005-style, we apply an sRGB-to-linear conversion to the textures in the scene. We then perform the exposure calculations as above, but this math is now happening on linear-light pixels. The final step is to convert the results back to sRGB, using a linear-to-sRGB lookup. Without that lookup, the linear images look too dark on our display, like the deer example above. With the sRGB lookup, the textures round-trip perfectly. 50% gray is still 50% gray.
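To put rough numbers on both of those observations, here is the same arithmetic as a tiny script, again using a pure gamma 2.2 stand-in for the sRGB curve; the exact figures shift slightly with the real piecewise curve, and the values are only illustrative.

```python
GAMMA = 2.2  # stand-in for the sRGB curve

to_linear = lambda v: v ** GAMMA
to_display = lambda v: v ** (1.0 / GAMMA)

# Checkerboard: equal parts 0.0 and 1.0.
# Blurred in display space it converges to 0.5;
# blurred in linear light it converges to 0.5 in linear values,
# which displays as roughly 0.73 -- brighter, like the photograph.
print(to_display(0.5))            # ~0.73

# Gray card: 50% display gray, pushed one stop (x2) in linear light,
# then brought back through the display curve.
mid = to_linear(0.5)              # ~0.22 in linear light
print(to_display(mid * 2.0))      # ~0.69 -- brighter, but nowhere near clipped
```

That one-stop result is why the "+1 Stop" frame below brightens convincingly instead of slamming into white.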
But the defocused background looks better, because highlights are “winning” in the boke calculations, just like real light does. And when we start to increase exposure, we get a much more plausible sequence of increasingly-bright images:

For Comparison, the sRGB-native Example Above: This scene was designed and rendered in native sRGB gamma.
50% Gray Card in Linear Light, with sRGB Lookup: More realistic boke and sampling. Note the sRGB curve in the scope, from the gray ramp at the bottom of the frame.
+1 Stop: This closely matches most expectations of what one stop of overexposure looks like.
+2 Stops: Even overexposed by two stops, there is still detail in the gray card.
+3 Stops: At three stops of overexposure, even this model blows out.
+4 Stops: Very blown-out.
+5 Stops: Why are we even looking at this?
+6 Stops: Or, what did future Sarah Connor see?

Plausible — but maybe not the most pleasing. The sRGB curve is basically just a gamma curve, with a little straight-line portion at the base. If you had a camera that actually used this tone curve to map its linear sensor data to a JPEG, you would not love the results. They would appear flat and be prone to color artifacts as colors clipped. Here’s a real-world example for comparison — in-camera JPEGs shot with a Canon 5D Mark III:

An Actual Photograph: Shot with a Canon 5D Mark III in Standard Picture Profile, JPEG, sRGB color space.
+1 Stop
+2 Stops
+3 Stops: Still holding some detail in the gray card.
+4 Stops: Only now is 18% gray overexposed.
+5 Stops

The DSLR, even in sRGB JPEG mode, holds detail in the gray card at 3+ stops of overexposure in this case. This is because when Canon says these JPEGs are “sRGB,” that defines their correct profile for display, but not necessarily their exact encoding. The encoding profile — the color adjustments and tone curve used to convert the linear raw sensor data to a viewable image — may be based on the sRGB curve, but it has some subjectivity baked into it; likely a little bit of s-curve contrast, and some highlight rolloff. And that’s with the “Standard” Picture Profile, sRGB, and JPEG — likely the least dynamic range this camera would ever present. A raw file, log video, or even a less-contrasty profile could offer a significantly gentler highlight treatment.

If you work in linear light, you’re doing things right — but if you want your results to look pleasing and/or photographed, an sRGB lookup alone is not good enough.

sRGB and Gamma Visualized
Before we skewer the sRGB “gamma” as a view transform, let’s examine what it actually is. First, some terminology. Strictly speaking, gamma is a power function. A gamma of 2.2 is the same as raising the pixel value, on a 0.0–1.0 scale, to the power of 1/2.2. But the term gamma has been broadened by some to include any kind of 1D tone curve applied to, or characteristic of, an image. Life is easier with this relaxed definition, so that’s how I use it. Gamma Management is the term I use for a workflow that uses 1D lookups/conversions between formats. Magic Bullet Looks 5 and Supercomp 1.5 use Gamma Management rather than full color management.

You can absolutely gamma-manage your workflow using the pure gamma 2.2 and its inverse. But if your imagery is sRGB, it’s slightly more accurate to use the sRGB curve. The sRGB tone curve is a very close match to a pure gamma 2.2, but it has a little kink at the bottom to solve an old problem. A pure gamma curve has either an infinite or a zero slope at its base (depending on which direction you apply it), i.e. as the values in the image approach zero, the curve becomes either vertical or perfectly flat.
This means that calculations on the darkest pixels in your image could be inaccurate, and those inaccuracies could compound through multiple steps of linearization and de-linearization. sRGB has a steep, but not infinitely steep, linear slope at the very bottom, and then the rest of the curve uses a gamma of 2.4 squished to fit in the remaining range. The clever result is that the curve is smooth at the transition and robust through multiple generations of processing, even if the processing is not done in floating-point. It’s easy to see how similar the gamma 2.2 and sRGB curves are by graphing them:

Pure Gamma 2.2 Curve: Infinite slope at 0,0.
sRGB Curve: Not quite as steep at the base.
Gamma 2.2 & sRGB Compared: Very similar, just different enough.

Tripping on Round Tripping
While the pure gamma curve and the sRGB curve are similar, two values for which they are identical are zero and 1.0. That’s fine, although there’s nothing special about 1.0 in either curve in the sense that the power function extends naturally through 1.0 and operates equally well on “overbrights,” or HDR values greater than one. What is significant about these curves and their 0.0–1.0 range is that they round-trip cleanly, as I mentioned above. If you linearize with the inverse of these curves, do your thing, and then de-linearize, the pixels that didn’t get blended go right back to their original values. This is convenient, and for some motion-graphic applications, essential. However, it’s the reason working linear is not enough.

Rendering a White Thing
Here’s a simple rendering to show what I mean. The first image is rendered using a simple Blinn-Phong shader in display-referred space, just like I used to do on my Amiga. The second is that same scene but with simple sRGB gamma management.

Display-space
Linear with sRGB lookup

While the linear-workflow image above looks “better” within the limitations of this intentionally simple example, it doesn’t solve the clipping from the gamma-space version, in part because of this prioritization of round-tripping white. No object is really “white” in the sense of reflecting 100% of the light that hits it. But we often work with synthetic images that have pure white in them (such as logos or text), and of course we expect those values to remain pure white even after round-tripping through an sRGB or gamma 2.2 linear workflow. But at the same time, we expect our cameras to have that gentle roll-off. We expect a white object to photograph not as pure white, but as some reasonable white-ish shade that is not blown-out. In fact, from modern cameras, we expect enough dynamic range to capture a sun-lit shiny white car, for example, and shadow detail on a person’s face.

An unused take from Circle of Stone, directed by Mark Andrews and shot by me. As an experienced cinematographer, I would approach challenging lighting situations like this — with the bright white car surfaces and deep shadow detail — by pointing the camera and praying.

There’s a lot of detail in this shot, and a lot of challenging exposure. We can actually inspect the exposure values, because this shot was captured in log. This also means we can accurately convert it into linear-light values, and then render it with a simple sRGB curve:

Log Original: The only clipping is on the specular kicks.
sRGB: Image converted from log to linear-light, then to video via a simple sRGB lookup.

Why would we do such a thing? The results, as you can see, are terrible.
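As a reference aside: whenever I say “the sRGB curve” in these examples, I mean the standard piecewise sRGB transfer function, with its short straight-line segment at the bottom and a 2.4-exponent power section scaled to meet it. Here it is as code, using the published constants from the sRGB spec; this is just the textbook curve, not anything specific to the tools discussed here.

```python
def srgb_encode(linear):
    """Linear light -> sRGB-encoded value."""
    if linear <= 0.0031308:
        return 12.92 * linear                       # the straight-line "kink"
    return 1.055 * (linear ** (1.0 / 2.4)) - 0.055  # gamma-2.4 section

def srgb_decode(encoded):
    """sRGB-encoded value -> linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# Round trip: untouched pixels come back where they started.
print(srgb_encode(srgb_decode(0.5)))   # ~0.5
print(srgb_encode(srgb_decode(1.0)))   # 1.0
```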
When you pass scene values to a simple sRGB lookup, with no other “display prep,” as cinematographer Steve Yedlin calls it, you get ugly results. Low dynamic range, clipped highlights, and posterized colors near areas of overexposure. In fact, this synthetic example reminds me of early digital cameras that lacked the dynamic range to create a proper highlight rolloff. The overexposed waves in this Nikon CoolPix 995 photo from 2003 have the same harsh transition to white through posterized cyan as the sRGB-converted car above:

I paid $1,000 for this camera in the year 2000. This photo is 2048 x 1536. So, 1K for 2K in Y2K.

Rendering to linear scene values and then converting them to sRGB with the stock curve is ugly. If a modern camera did this, we’d laugh it back to 2003. But this linear-to-sRGB (or gamma 2.2) final lookup is exactly how a lot of artists have been doing things “right” for years. We learn that we should work linear, so we dutifully convert our textures to gamma 1.0 and render to EXR. But if we use nothing more than the sRGB curve as our final lookup, we are treating our beautiful 3D rendered scenes as if shooting them with a first-generation digital point-and-shoot. The industry’s standardization on this kind of simplistic linear workflow has left an aesthetic gap demanding to be filled.

Roll Out the Roll-off
When I was designing Magic Bullet Looks, and later Magic Bullet Colorista, I was aware of these issues. Magic Bullet Looks has always done its processing in linear floating-point values, which meant that it was possible to both manage and create HDR values, even back when no camera could generate them. One thing we came up with to help render bright scenes in a more pleasing, film-like way was the Shoulder tool in Magic Bullet Looks.

Shoulder: The tool icon shows you the curve it applied to the image.
Shoulder in Action: On the left side of the split-screen comparison.

Like many tools in Looks, Shoulder shows you exactly what it’s doing — in this case, smoothly mapping the brightest values in an image to asymptotically approach a maximum. The Highlight Rolloff control in Colorista V packs the same process into a single slider. Let’s take a clear look at the effect Colorista’s Highlight Rolloff has on our example:

Log Original
Subjective conversion to video: You’ll never guess what method.
Naive sRGB Lookup
sRGB Lookup with Colorista V Highlight Rolloff = 100%

And on our simple rendered ball:

sRGB: The same sRGB lookup as above.
Highlight Rolloff: The same sRGB lookup followed by Colorista V Highlight Rolloff = 100%.

Highlight Rolloff is a nice, easy way to add a film-like “shoulder” to your HDR imagery. If you are using the gamma 2.2 “linear workflow” option in Cinema 4D, adding the (now built-in) Magic Bullet Looks Shoulder tool to your rendering is an easy way to create more pleasing highlights without radically changing the look of your renders. It’s the first step in upgrading our virtual cameras to match the expectations we’ve come to have of our real ones. But can Highlight Rolloff alone solve all our rendering issues? No. And the easiest way to show that is by rendering a blue thing.

Rendering a Blue Thing
Here’s that ball again, now textured blue.
sRGB: Linear render with sRGB lookup.
Highlight Rolloff: The same sRGB lookup followed by Colorista V Highlight Rolloff = 100%.

Again, you can see the failings of the sRGB version (clipping and posterizing of highlights) are addressed, if not fully eliminated, by the Highlight Rolloff. But what if we change the color of the light?

sRGB
Highlight Rolloff
WTF: This does not look good.

The very red light seems unable to illuminate the not-quite-pure blue of the billiard ball, instead tinting it a weird green. If that feels wrong to you, but you can’t quite figure out why, let’s look at a real photo of a blue thing lit with red light:

The illuminated portions are purple, not green.

Highlight Rolloff, you are awesome, but you are not enough. The aesthetic shortcomings of sRGB view lookups are now joined by this bogus color rendering. There’s both an artistic and technical void here to be filled — and you guessed it, ACES is what’s come along to do so.

ACES: Come for the Technical, Stay (or Don’t) for the Subjective Aesthetic
What, exactly, is ACES? For the purposes of this article, here’s what I want you to know:

ACES is a color management system. ACES has tools and technology for converting images among various color profiles. It is specifically designed for the motion picture industry.

ACES is a color space. Well, two. ACES defines two color gamuts, AP0 and AP1. AP1 is the “working” gamut, and like AdobeRGB and ProPhotoRGB, it is a wide-gamut color space, encompassing more colors than sRGB.

ACES is a set of color profiles for popular cameras. ACES ships with profiles for Canon, Sony, ARRI, Red, and more. This means it’s trivial to match the output from various cameras.

ACES is an evolving set of final lookups for presentation. For that final conversion from the linear-light, wide-gamut working space of AP1, ACES offers a handful of Output Display Transforms, or ODTs. The ones designed for SDR video output have built-in highlight rolloff, a subtle contrast curve, and special handling for bright, saturated colors.

ACES is a gentle prescription for a workflow. ACES ships with color profiles designed to support the phases of a motion picture project: ACEScg is the linear, AP1 color space for 3D rendering and compositing. ACEScc is a log color space that also uses AP1 primaries. It is designed to be a universal space for color grading. ACES2065-1 is intended to be a universal mastering color space for sharing and archiving finished projects. This is where that AP0 gamut comes into play — it encompasses every color visible to the human eye.

The Technical
ACES CG is a linear-gamma working space of course, so it’s ideal for rendering and compositing. But that it is also a carefully-chosen wide-gamut color space is an equally important part of its design. Working in a wider-gamut space is one way to combat the green ball problem above.

The Subjective
Once you choose to work in a wide gamut, you then have to figure out how to map that image back to various output formats. As we have established, the simple sRGB transform (and its cousin, Rec. 709) is not good enough. The ACES team performed numerous tests and evaluations in designing their output transforms — and then revised the results several times. And they are still working on it. The look of these transforms is both studied and subjective, and while many people love the look, others have criticisms (especially around rendering of saturated colors). Remember above where I said that a simplistic linear workflow had left an aesthetic gap to be filled?
Well, these Output Display Transforms (ODTs) are the primary way that ACES has stepped up to fill it. This explains why folks are so enthusiastic about the results it gives them, even if it is an ongoing field of development. One of the most exuberant advocates of ACES for 3D rendering is Chad Ashley of Greyscalegorilla. Here’s a typical before/after example from one of his excellent tutorials:

Image courtesy Greyscalegorilla. Watch the tutorial.

That is a pretty solid mic-drop of a comparison there. You can see how the ACES example has both the pleasing push of contrast we associate with film, as well as the smooth, languorous highlight rolloff. Colors are somehow both rich and restrained. The render looks real, but more importantly, it looks photographed. Let’s do the same comparison with our gray card example from the top of the article:

To be clear, what makes the right side of the split an ACES render is a combination of transforming the textures into ACEScg linear, and then applying the ACES Rec. 709 ODT as a final view/encode transform. And while it looks fine, the contrast and highlight rolloff do make for an overall darker image. This is probably a much more realistic portrayal of the scene. The pure white patches on the card, which are far “whiter” than any real-world surface (fresh snow is about 85% reflective), render as light gray, and our 50% gray is coming in at 43%.

The “gamma,” or tone curve, of the ACES Rec. 709 ODT shown in magenta. It’s easy to see how it is darker overall than the sRGB curve (cyan).
Boosting ACES Rec. 709 by 0.36 EV causes 50% output to match sRGB. Note how similar the Colorista Highlight Rolloff variant of sRGB is to that boosted ACES curve.

To compensate for this, it looks like Chad Ashley rendered his scene a little brighter. The non-ACES version looks overexposed. Let’s boost the scene exposure so the gray card matches the sRGB example:

With gray matched, we get a better overall comparison. The contrast and soft highlights look nice. It’s a more photographed-looking version of our idealized scene. What it is not, however, is a safely round-tripped version of our texture maps. Where the sRGB linear workflow mapped black back to black, 50% back to 50%, and impossible white right back to 1.0, this more realistic portrayal reminds us more of the real-world photography of the white cars. We see the bright white things as “white,” even though they are no longer pegging 255 on our displays.

What about our motion blur example? Here ACES has let us down. By rendering the linearized image with the photographic contrast and highlight compression of the ODT, we’ve lost our seamless round-tripping. Our results are dark and dull. Because we knew what we expected our texture to look like at the end of the pipeline, the pleasing, subjective look of the ODT was not the right choice for this example.

This is meaningful for motion graphics, color grading, and compositing workflows. If “working in ACES” means changing the look of every pixel before you’ve even started to get creative, that’s going to surprise and dismay many artists. For example, if Chad was trying to render his realistic vase in front of a client-supplied background plate, the same post-processing that he loved on his CG would mute out the photographed background. Oh heck, let’s look at that:

The sRGB rendering on the left, composited over this SDR iPhone video, has the typical sRGB artifacts covered above: clipped highlights and posterized colors near white.
While the ACES rendering on the right solves these issues, it applies that same highlight compression to the SDR background, making it look dingy and dull. If we want 3D rendered scenes to look photographed, do we have to let go of round tripping?

Oh Inverted Display Transform

Every ACES conversion requires at least an input color profile and an output.

ACES has a solution for this too. You’ll remember that ACEScg is our working space for rendering and compositing. It therefore is also our texture map color space, so in the example above, I’ve converted the billiard ball texture map and the SDR background plate from sRGB into ACEScg. I did this using the Open Color IO effect in After Effects, setting sRGB as the input, and ACEScg as the output. But critically, ACES also allows for using the contrasty, soft-highlights Output Display Transform as the “from” in this conversion. In other words, you can invert the output transform for images you want to cleanly round trip.

Using the Output Rec. 709 profile as the input, AKA inverting the ODT.

Given how complex the ACES Rec. 709 ODT is, I’m impressed that this is even possible. It’s a straightforward process to invert a 1D lookup, but the ACES ODT is a complex, 3D conversion, with special handling for saturated highlights. Inverting all this not only allows for round-tripping, it also has the interesting side effect of plausibly surmising HDR values from an SDR image. Think about it this way: The photographed examples we’ve been discussing all have some kind of “shoulder” baked in. Inverting the shoulder-y ACES Rec. 709 ODT effectively un-shoulders photographed images, putting their compressed highlights back into a reasonable estimation of what scene values might have generated them.

Believe it or not, we used to have exactly this functionality in Magic Bullet Looks 1.0. We had Highlight Rolloff in the Output tab, and its inverse, “Highlight Roll-on,” in the Input tab! People were confused by this, so we ultimately removed it, but now we’ve replaced it with ubiquitous Input and Output tools.

sRGB to Linear: Ball and BG both processed with sRGB-to-linear conversion. Render post-processed with linear to sRGB.
ACES Rec. 709 ODT: BG and ball texture converted from sRGB to ACES CG using the inverse sRGB curve.
ACES Rec. 709 ODT: Ball texture unchanged from previous, but now the BG has been processed into ACES CG using the inverse Rec. 709 ODT. This same conversion was also used for the reflection map derived from the plate.
Defocus 1: That same setup with a defocus performed on the linearized plate. Note how realistic the boke is on the highlights, because of the aggressive presumption of HDR values.
Defocus 2: Focusing on the BG rather than the ball continues to demonstrate the realism of this setup. Note the way the now-HDR clouds shine through the blurred edge of the ball.

Inverted ODT is not for Texture Maps
The inverted ODT allows us to round-trip video through ACES, but since it does so by creating HDR values, it’s not appropriate for texture maps representing diffuse reflectivity. This is a big stumbling block for many artists dipping their toes into ACES. Their texture maps suddenly appear dark and dim, like the sRGB background above. Step through the images below for a simulated example:

sRGB to Linear: In this sample product shot, the rendered box matches the reference on the left because the gamma management uses simple sRGB curves, meaning 1.0 white round-trips. But the highlight in the back clips harshly.
ACES: The ACES version looks more realistic, and the highlight rolls off nicely. But the texture now appears dark.
Inverse ODT Textures: If the artist tries to solve this by using the inverse ODT on the textures, the result is a self-illuminated box. A better solution would be akin to the real-world answer: brighten up the lights.

Crank Up Those Lights
You might have noticed something in the floating billiard ball example above: The ACES ODT so aggressively addressed the clipped highlights from the sRGB example that the resultant render appears a bit flat compared to the plate, which has lots of poppy highlights from the low sun. When you invert the Rec. 709 ODT, the complement to the rolloff curve causes 1.0 white to map to a very bright linear-light value: about 16.3 on a scale of zero to one. That sounds aggressive, but it represents about 6.5 stops of overexposure on an 18% gray card (0.18 × 2^6.5 = 16.3) — more dynamic range than the 5D JPEG example above, but right in line with the Sony a7SII log example with the white cars. Another way of looking at it: It’s not a stretch to presume that the clouds in the iPhone plate are 6–7 stops brighter than the gray side of the dented car.

Artists working with a simple sRGB or gamma 2.2 “linear workflow” have been inadvertently training themselves to use conservative light values, because of the lack of highlight compression modeling high-end film or digital recording. If you lit your scene too bright, you’d get ugly highlights. But real scenes have big, broad dynamic ranges — which is part of why they’re so hard to photograph. The virtual “sun” light that’s illuminating the rendered ball is set to 300% brightness, but the HDR values that light creates in the render get compressed down so much that I now want to push it more. Here’s the same scene with the light at 1,000% brightness.

Key Light 300%
Key Light 1,000%
Key Light 1,000%: Scene underexposed by six stops.

If you’re not used to it, setting a light’s brightness to 1,000% feels strange — but in this example, that results in reflectance values of around 10.0, right in line with the HDR-ified highlights in the linearized background plate — as you can see in the underexposed version. Astute readers will note that if inverting the ODT results in white being mapped to 16.3, then an ACEScg linear value of 16.3 is the darkest value that will be mapped to pure white in Rec. 709 — i.e. you need ACEScg scene values of greater than 16.3 to clip on SDR output.

Rendering to an ACES ODT encourages artists to create higher-dynamic-range scenes, with brighter lights and more aggressive reflections. When you use brighter lights in a modern global-illumination render, you get more pronounced secondary bounces, for a more realistic overall appearance. ACES encourages artists to create CG scenes that better show off the power of modern CG pipelines, and, quite simply, look better, because they better model how real light works. Even if that light is red, and the object is blue.

Back to Blue
Remember our blue billiard ball that went green when hit with a red light? ACES wants to help us with that too.

I’m not really this organized.

That’s sRGB with Highlight Rolloff on the left, and ACES on the right. Look how closely this matches the photographed example of a blue object under red light. Our sRGB render failed in this case because of its limited color gamut. The saturated blue of our number two ball was near the edge of sRGB’s range of available colors.
When we hit it with a strong red light, the results were out of gamut, so the closest approximation was returned.

ACES addresses this with the wider gamut of its AP1 color space. When you convert a texture to ACES CG, you are both linearizing the gamma and also assigning new color primaries. Visually, this results in a reduction in apparent saturation when viewing the raw pixels, so it’s easy to see how a once-saturated blue color is no longer dangerously near the edge of the available range.

Adobe ProPhoto RGB has an even wider gamut than ACEScg.

But AP1 is not tremendously larger than sRGB, especially at the blue corner. A common-use color space that does offer more range there is Adobe’s ProPhoto RGB. Just for fun, I tried rendering my blue ball in ProPhoto, with a hacked-together view LUT made from a 1D approximation of the ACES Rec. 709 ODT. As you can see, our red light can make the ball even more purple within the extra-wide gamut.

sRGB
ACES CG: ACES CG rendering, ACES Rec. 709 ODT.
Adobe ProPhoto: ProPhoto rendering, with a 1D approximation of the ACES Rec. 709 ODT applied before gamut-only conversion to sRGB.

Sidebar: Adobe Camera Raw, and by extension Lightroom, reportedly does its processing in linear ProPhoto RGB, with an implicit s-curve for contrast and highlight rolloff. I’ve always admired Lightroom’s color rendering, and it seems it might be for the same reasons that folks like ACES.

So, if more gamut is better, why is ACES AP1 so conservative compared to other pre-existing color spaces? Why create yet another standard? At this point I have to explicitly call out this amazing page by Chris Brejon on ACES — specifically this section, where he has collected links and quotes about the decision-making behind the design of ACES AP1. The TL;DR is that an oversized gamut, especially one that includes colors not visible to the human eye (that’s the part of the ProPhoto triangle that extends outside the CIE kidney-shape), can result in render artifacts like negative numbers and funky colors. He cites this thread on the ACES Central forums, where Jim Houston also points out that the primaries were chosen to line up with colorists’ expectations of where R, G, and B are on a color control surface.

ACES AP1 is a pragmatic color space designed for real-world use — a well-vetted blend of technical and artistic considerations. Nothing it does is expressly new (Adobe Camera Raw has been around since 2002), but ACES as a package is a practical standard for the film industry that, I will say once more, has risen to popularity largely because of gaps in mainstream workflows. Is it perfect for every use-case? No. Is it a boon to the film industry and the digital art community? Absolutely.

Color Grading in ACES
This topic is most certainly worth its own post, if not a series of posts, but here’s the short version: The same advantages and possible gotchas I’ve covered with rendering and compositing in ACES also apply to color correction. Since ACES has color profiles for many popular cameras, it’s easy to unify footage from a variety of sources into one common color space for grading. The ACEScc log color space is, in my experience, a creatively-friendly color space for grading. There’s also a tweaked version of it called ACEScct. The T is for “toe,” so this is the profile favored by Quentin Tarantino. As with rendering, the ACES Output Transforms either jibe with your creative intent or don’t, especially around the presentation of overexposed, saturated colors.
However, there are lots of ways to customize them. Using the inverse Rec. 709 ODT to grade consumer video as if it was shot log is pretty darn cool. Check it out:

This short video demonstrates how ACES can elevate the basic color corrections on a video file.

As I mentioned above, Magic Bullet Looks 5 has what we at Maxon/Red Giant call “Color Handling” rather than full color management — where we adjust gamma, but not the primaries. Why not full color management? The simple answer is that color management can be as confusing and off-putting as it can be helpful. The deeper answer is that, by using 1D LUTs, we can ensure perfect round-tripping. Which gets us to the biggest ACES gotcha of all:

The Inverse Rec. 709 ODT Workflow Does Not Round Trip Perfectly
In ACES 1.0.3 under OpenColorIO 1.x, the magical inverse Rec. 709 workflow does not cleanly round trip all colors. Some highly-saturated colors get stomped on in the process.

A cleaner version of the same issue in Resolve.

The wonderful Open Color IO After Effects plug-in from friend of Prolost (and eLin co-creator) Brendan Bolles uses LUT approximations for some transforms (because Open Color IO 1.x does), so there’s bound to be some quantization. But even in Resolve, where ACES transforms are done in native CTL/DCTL code, these problems persist. ACES 2.0 and OpenColorIO 2 may address these issues. So we’ve been conservative about fully adopting ACES within Magic Bullet, even as we’ve aimed for compatibility with it.

The same is true with Supercomp, although it’s relatively easy to composite in ACES with Supercomp even without native support. Just use the OCIO effect to convert your layers to ACEScg, and tag them as Linear in Supercomp. Don’t forget to set Supercomp’s output gamma to Linear as well. Then add an Adjustment Layer above the Supercomp layer with another OCIO effect converting ACEScg to Rec. 709, or the ODT of your choice.

Supercomp in ACES: All the advantages of color correcting in ACES apply to VFX compositing as well.

The inverse ODT limitation could be an issue for folks working with SDR video sources. Most interesting, though, is that the aggressive HDR-ification of video highlights (remember, values that were 1.0 in an sRGB conversion will be 16.3 in ACEScg) feeds directly into Supercomp’s floating-point rendering, making Light Wraps, glows, and other effects respond more intensely to highlights than you might be accustomed to. This can either be wonderful or unwieldy depending on the source material.

sRGB: Supercomp’s Light Wrap, using sRGB gamma management.
ACES: Supercomp in ACES. The HDR-ified background plays well with the Light Wrap around Kong’s head, but the clipped water reflections are so super bright that the Light Wrap gets overloaded.

The white highlights on the water overpower the Light Wrap effect, but then check out this example:

Text over a BG: In Supercomp.
sRGB: Supercomp’s beautiful wrapping Optical Glow applied to the BG. With sRGB gamma management, the highlights catch the glow just fine.
ACES: ACES color management creates super-bright, super-saturated colors from the neon sign, which Optical Glow (same settings) does delightful things with. The ODT lends a natural falloff to the glow.

Experimenting with ACES in After Effects
Once you get the OCIO plug-in and the ACES profiles installed on your system, After Effects is a good place to experiment with ACES.
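If you’d rather sanity-check a conversion outside of a host app, the OpenColorIO Python bindings can apply the same kinds of transforms. This is a rough sketch, assuming the OCIO 2.x Python module is installed and an ACES 1.x config is on disk; the color space names below are the ones I would expect from the ACES 1.0.3 config, so check them against your config’s own listing, and the path is a placeholder.

```python
import PyOpenColorIO as OCIO  # assumes OpenColorIO 2.x Python bindings

# Point this at your ACES config (or set the $OCIO environment variable
# and call OCIO.GetCurrentConfig() instead). The path is hypothetical.
config = OCIO.Config.CreateFromFile("/path/to/aces_1.0.3/config.ocio")

def convert(rgb, src, dst):
    """Apply one OCIO color space conversion to a single RGB triple."""
    proc = config.getProcessor(src, dst).getDefaultCPUProcessor()
    return proc.applyRGB(list(rgb))

# Texture maps: plain sRGB curve in, ACEScg out.
print(convert((0.5, 0.5, 0.5), "Utility - sRGB - Texture", "ACES - ACEScg"))

# Video plates: invert the Rec. 709 ODT so highlights become HDR values.
# Expect encoded 1.0 to come back as a linear value in the neighborhood of 16.
print(convert((1.0, 1.0, 1.0), "Output - Rec.709", "ACES - ACEScg"))
```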
After Effects even ships with an ACEScg ICC profile, which you can use as a project working space, and/or with the Color Profile Converter effect. I find this handy for converting HDR sources from sRGB to ACEScg, because Adobe’s ICC method does not clip, where the OCIO LUT-based operators sometimes do.

Get the ACES Presets

Things to Be Aware Of

Use sRGB to ACEScg for Textures
Or whatever the appropriate source color space is. The point here is, texture maps shouldn’t try to represent more than 100% reflectivity.

Carefully Use Rec. 709 to ACEScg for Video Plates
Just beware of the aggressive reconstruction of near-white pixels into HDR values, and the potential for saturated colors to get truncated.

Procedural Color Management is Better than Baking Conversions into Files
If you must bake out your ACEScg texture maps, remember that 8 bpc is not enough to store a linear-light, wide-gamut image. Use 16-bit TIFF or EXR. A proper ACES color management solution includes managing the user-chosen colors for things like untextured objects and light sources. In my examples above, I had to rig up systems of converting my light colors into the color space I was rendering to, for proper apples-to-apples comparisons.

Don’t Try to Do ACES with LUTs
You can’t really emulate an ACES workflow using LUTs. Most LUTs are not designed to map HDR input, for example. It’s possible, but there are lots of gotchas. Native processing is better.

This Post References ACES 1.0.3
OCIO 2 is already released, as is ACES 1.3, and ACES 2.0 is in development.

Coming in for a Landing
Remember the model-on-strings example way at the top? While the sRGB version showed how a linear workflow could emulate the real-world devouring of the strings on film, it only partially obscured the strings. My recollection was that our post-advisor was more resoundingly correct about the bright green background hiding the strings. By recreating the scene in ACES, I am finally, all these years later, able to simulate the way our black thread photographed in front of that greenscreen.

I wish I could go back in time and tell 2004 Stu that this blog would survive long enough for color management to become cool. That, to me, is the most surprising thing about ACES — that it has captured the interest of technical artists and non-technical alike. ACES takes the concept of doing things “right” in linear light, and extends it to doing things beautifully. It’s transformed color gamuts and tone curves from the broccoli side-dish to an ice cream dessert.

ACES is not perfect for every use-case, but it is purpose-built for film and video work. Today, if you choose not to use ACES, it’s probably either because you haven’t tried it yet, or you already have your own complex, bespoke color pipeline. To me, ACES is most significant as a common color language that I can use in my creative work and in my tool building. Expect to see more of ACES in Red Giant and Maxon tools.

Resources
If you enjoyed this post, I can’t imagine you haven’t also pored over Chris Brejon’s entire glorious chapter on ACES. The highest compliment I can pay it is that I essentially rewrote it for this post. ACES Central is the home of the canonical discussions on ACES, where you can be confused and intimidated right from the source. There are many ACES tutorials online, and not all of them are good.
But this one provides a compelling demo of ACES’ out-of-the-box ability to match different cameras into one unified color space for grading. This article by Ben Baily is also quite good. Here’s a brand-new tutorial on using ACES in Redshift. And of course, you can grab my ACES presets for After Effects here: Prolost ACES Presets for After Effects

And you know what? I have a feeling I’m still not done writing about this stuff.

Tags: Magic Bullet, Color, Resolve, Image Nerdery, Red Giant, Photography, Motion Graphics, Adobe After Effects, ACES, HDR, Film, Lightroom, Maxon, Tutorials
Comment

Was it worth buying a Pro Display XDR just for this joke? Yes.

Apple’s “EDR” Brings High Dynamic Range to Non-HDR Displays
December 04, 2020

Apple caused quite a stir with the announcement of their Pro Display XDR, a High Dynamic Range display that occupies a convoluted space in the market. It seeks to be both a Very Nice Computer Display, and a reference HDR video monitor — but by most measures it’s far too expensive to be the former, and not quite up to the rarified specs of the latter. Confusingly, it also outperforms some HDR displays costing considerably more, by some metrics. It’s currently the only display-only device Apple makes, and it’s simultaneously ludicrously expensive and a too-good-to-be-true deal. Suffice it to say that while there may be very few people for whom the Pro Display XDR is the unquestionably right choice, they know who they are, and they don’t need the internet’s advice about it.

When I watched the announcement of this display, I was curious how Apple would handle an HDR video monitor that was also tasked with the mundane duty of displaying your email and a web browser. Was Apple planning on rendering the 255-255-255 [1] “white” of Google’s home page at one brightness level, and the HDR overbrights from a video clip at a much brighter level, right next to each other, on the same display?

HDR In Situ
The answer is a resounding “yes,” and the effect is both impressive and a bit unnerving. Below is a photo of a Pro Display XDR casually presenting the Finder thumbnail of an HLG clip I shot on my Sony a7RIV. The sky is radically brighter than the “white” pixels above it. The screenshot of the thumbnail reveals where HDR values are clipped at “white.” The photo, of course, cannot capture how startling this is in person. It’s like strolling through an art gallery and stumbling onto a painting with its own backlight.

It’s one thing to see HDR video on an HDR TV, where the entire image appears simply brighter and richer. It’s another thing to see this kind of imagery presented in the long-familiar context of a computer screen full of folder icons and file names. It’s probably the right way to handle HDR in an SDR world, but it’s strange and new, and possibly unique to Apple.

It also seems to be an important part of Apple’s ongoing display strategy. The company, long known for shipping high-quality, color-accurate displays, is not reserving this HDR-in-an-SDR-world experience for folks shelling out $6,000 for a computer monitor. It’s also part of how all iPhone OLED displays are defined as HDR. Here you can see HDR video shot with an iPhone 12 Pro Max rendering the sky out my studio window as brighter than the “white” background of the Photos app:

The HDR display on the iPhone 12 Pro Max is wild.
It happily shows brighter-than-white pixels alongside the UI “white.” pic.twitter.com/3nYhF5WuVH — Stu Maschwitz (@5tu) November 15, 2020

Apple is commoditizing and normalizing HDR on their most popular platform, both in capture and display. And they’re not doing it just by making iPhone screens brighter. They’re making the right pixels the right brightness. It’s an impressive technical feat made all the more admirable by how natural it feels in practice.

Elderly Displays Rejoice
So Apple has a method of showing HDR and SDR content together on the same screen. It works on every display Apple bills as “HDR,” even though the phones are performing the stunt using a different underlying technology than the 32″ Mac display. The XDR uses “local dimming” to light up an array of LEDs brighter behind the HDR pixels, as needed. The OLED displays drive each pixel to the desired brightness individually. Apple groups all this under one umbrella they call EDR, or Extended Dynamic Range. And even as they tout EDR as a selling point of their professional display and flagship iPhones, Apple has also quietly extended it to older Macs that were never advertised as being HDR-capable.

From Apple’s developer documentation:

Some Macs can process pixel data with a wider range of pixel values and send those extended values to the display. In macOS, the ability to process larger pixel values and display them is referred to as extended dynamic range (EDR). When you configure a Metal layer to support extended values, you can provide pixel values — and therefore brightness levels — that exceed the normal SDR range in order to display HDR content.

“Some Macs” is not limited to those connected to a Pro Display XDR. It includes my 2019 16″ MacBook Pro, and even my three-year-old iMac Pro. I think the limiting factors may be P3-gamut displays on Macs running Catalina or later. If you have such a Mac, you can try this at home. Two HDR clips, one shot in Dolby BT.2020 with an iPhone 12 Pro Max, the other shot in HLG with a Sony a7RIV. In both the Finder and QuickTime Player on Catalina, the highlights in these clips should be visibly brighter than UI “white.”

Download Sample HDR Clips

If the Pro Display XDR is like finding a gallery painting with its own backlight, seeing HDR pixels popping off a Mac display you’ve known to be SDR for three years is like discovering a painting that’s been hanging in your house forever suddenly has a backlight button. Here’s another HDR clip happily blasting its whiter-than-white values on my non-HDR iMac Pro screen. So add a third method of displaying EDR content to Apple’s roster: On these non-HDR displays, Apple has remapped “white” to something less than 255-255-255, leaving headroom for HDR values, should they be called for. The operating system is complicit in this trickery, so the Digital Color Meter eyedropper shows “white” as 255, as do screenshots. [2] With Catalina, Apple quietly changed what “white” means for millions of Macs, and none of us noticed.

Think of it this way: This EDR display philosophy is so important to Apple that they are willing to spend battery life on it. When you map “white” down to gray, you have to drive the LED backlight brighter for the same perceived screen brightness, using more power. Apple has your laptop doing this all the time, on the off chance that some HDR pixels come along to occupy that headroom (or not: see update below).
It’s a huge flex, and a strong sign of Apple’s commitment to an HDR future. It also means that when you adjust your display brightness, macOS is commensurately adjusting the amount of headroom available for overbrights. Max out the brightness slowly and you can watch the HDR values gradually get pinched against the UI “white” as the headroom shrinks, until HDR white and Google white meet. Conversely, the lower your display brightness, the more headroom there is for EDR, although there will never be as much as on the Pro Display XDR, or even an iPhone 12 Pro.

This variance in EDR capabilities across Apple devices is where a defining feature of HDR display comes into play. HDR standards like Dolby Vision were designed to accommodate screens of varying maximum brightness. The content is display independent, and when macOS goes to render it, it first asks the display how bright it can go, and then builds a bespoke lookup for that output brightness, correctly displaying the content within the available range. Apple handling this all for developers is a new kind of leg-up in the fraught field of color management. It’s actually trivial to display HDR content correctly on a semi-recent Mac. Apps such as DaVinci Resolve, Affinity Photo, and of course Final Cut Pro already do it, and you can expect to see it in Red Giant and Maxon tools as well. Apple sells one very expensive, very capable device for displaying HDR — and literally millions of iPhones, iPads, and Macs that are also pretty darn good at it.

It’s an HDR World
I’ve been critical of HDR as a creative tool. My North Star is the look of film, with its glorious highlight rolloff. Trying to sell me “brighter” was like screaming at me in the front row of a Metallica concert that the sound could go louder. In the words of the great Roger Deakins:

If you create a balance of light and dark on set you expect that balance to be maintained throughout the process. I personally resent being told my work looks ‘better’ with brighter whites and more saturation.

That was him talking about Sicario, five years ago. Since then, he’s shot a number of films, including Blade Runner 2049, which, damnit, takes artful creative advantage of HDR exhibition. For the most part, like Sicario, it intentionally occupies a narrow band of the available dynamic range. But at key parts of the story, certain colors eke outside of that self-imposed SDR container, to great effect. In a very emotional scene, brilliant pinks and purples explode off the screen — colors that not only had been absent from the film before that moment, but seemed altogether outside the spectrum of the story’s palette. Such a moment would not be possible without HDR.

Or would it? While there are certainly colors that digital projection can uniquely display, in many ways digital cinematography is still chasing the tremendous dynamic range and color fidelity of celluloid film. Properly projected film [3] is, by any measure, HDR. So maybe I should warm up to digital’s latest efforts to live up to this legacy.

How much more orange could it be? The answer is none. None more orange.

Hug, Don’t Reject
HDR presentation joins the rich catalog of film techniques that can have a profound effect on audiences so long as they are not overused; alongside extreme close-ups, aggressive surround mixes, handheld camera work, fart jokes, and, well, just about every filmmaking tool from color to sound. I suppose I knew this reality was coming.
I guess I just wasn’t expecting to see it on my laptop screen, glaring at me between my email and my grocery list.

Update 2020-12-05
There’s good evidence that I’m wrong about EDR being speculatively on all the time on non-HDR Apple displays. Michael Fortin writes:

“When the [EDR] preview first appears on screen, it is rendered normally, in SDR. It then progressively becomes brighter over the span of one or two seconds. Brighter than the surrounding white. This appears to be the EDR system firing up: slowly cranking up the display brightness at the same time as it darkens the standard white point for everything but the video. Those two operations are done in tandem so well that you don’t perceive any change on screen other than the video becoming brighter.”

I noted this transition animation on Twitter, but it’s that “so well” part that tricked me — it is very hard to see UI white change at all during this EDR ramp-up, which feels impossible. More evidence: The screenshot borders I mention here are only EDR-white when there are other EDR pixels on the screen. The seamlessness of this transition completely fooled me, and serves as a spectacular testament to how dialed-in Apple’s software and hardware are on matters of color.

On the XDR, the HDR values appear instantly. On the older Macs, they fade into view in a few choppy steps, as seen here. pic.twitter.com/6WvG26Yz0k — Stu Maschwitz (@5tu) December 2, 2020

Here’s a page with some info. As I went to screenshot part of it on my iMac, I noticed that the screenshot crop displayed in EDR (visually brighter than the 255 white page). How meta. https://t.co/XNN8EDazgq pic.twitter.com/hR1G8N72t4 — Stu Maschwitz (@5tu) December 3, 2020

Update 2021-06-11
For WWDC 2021, Apple has posted an excellent video on EDR.

[1] I don’t love using this simplistic 8-bit nomenclature for RGB color values, but in this context I’m doing so to invoke an old-school familiarity (echoed by the Digital Color Meter app). Between high-bit-depth P3 displays, display color management, software calibration, and True Tone, it’s probably been a very long time since “white” in a Mac UI was truly 255-255-255.
[2] If EDR reminds you of eLin, congrats on being old. And cool.
[3] Something that, sadly, few people have ever seen.

Tags: Apple, HDR, Color, Image Nerdery
4 Comments