I don’t know about you, but I remember when I first started seeing people use SLRs for video. It was back when Ikegami, Sony, Panasonic, and JVC were all settling on various flavors of Full HD and weren’t interested in digital cinema per se. They were leaving that to the old-line camera makers, who were designing specialty cameras for the new generation of Star Wars movies (remember Jar Jar Binks?). Anyways, those top-of-the-line digital cinema jobbies cost $125,000, and only George Lucas was willing to spend that much to get effing 2K resolution (big whoopity do-dah-dey).
Within a few years, all the still camera makers were shooting well past 2K in an arms race to have the highest res, largest frame size, etc. Nikon vs. Canon: it seemed like Nikon got there first, but within months Canon stole the show. What nobody tells you, though, is how much people had to effing “mod” their cams to make them digital cinema rigs. Meaning, they had to run the camera’s HDMI output into an external monitor-recorder like the Atomos Ninja (https://www.atomos.com/ninja/), ’cuz internally Sony and Canon used proprietary video formats and you had “NO” control over any setting for digital cinema. Atomos changed that dynamic by saying, “Camera maker, give me your sensor’s feed, I will do the rest,” and that’s the way it worked. Atomos could record and save out to ProRes or Avid formats all day long. Any frame size/frame rate you wanted. Digital cinema finally came down in price. At the end of the day, though, those cameras were STILL digital still cameras.
And that’s what brings me to this video right here: https://www.youtube.com/watch?v=X1u-9YqrIJc
’Cuz the thing is… I noticed that all the digital still videographers (even the old mainline Sony/Ikegami/Panasonic/JVC guys), when they adopted SLRs, always used two cameras. That suddenly became a thing. Always two cameras, when in the old days it was one Betacam SP, shoulder-mounted or on a tripod. SLR videographers were buying those add-on rails for their tripods so they could mount two heads on one tripod and run two cameras. Again, why two? Why did that become a thing when still cams became digital cinema cams?
It was because of the HEAT! OMG, the heat. Those kick-ass, high-res CMOS sensors were not like our old CCD friends from years ago. And the SLR makers weren’t old-line video camera makers either (although Canon “did” have some video camera knowledge it could have leveraged). So I say Canon, of all the manufacturers, should have known better once SLRs were being used for digital cinema. Every old-line video camera maker knew one thing from the “tube” era: Saticon and Vidicon tubes run hot, and they used to have water coolers built into those giant monster TV studio cams back in the ’50s, ’60s, and ’70s (before CCD took hold in the ’80s with Betacam SP). So why did this happen? My guess is that, aside from Canon, these SLRs are made by old still-photo manufacturers with no video heat-management experience. So the poor digital cinematographers all said, “Well, let’s work around the thermal overload on these cams like a bucket brigade putting out a fire.” Use cam #1 until it overheats, then use cam #2 while cam #1 cools off. All of this just to record at 4K instead of George Lucas’s measly 2K for Jar Jar Binks.
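The bucket-brigade rotation can be sketched in a few lines of Python. This is just an illustration of the workflow, not anything from the video: the 20-minute record limit matches the rough figure people cite for these cams, and the cooldown time is a purely assumed number.

```python
# Sketch of the two-camera "bucket brigade" rotation.
# RECORD_LIMIT_MIN is the rough per-camera record time before thermal
# cutoff; COOLDOWN_MIN is an ASSUMED illustrative cooldown, not a spec.
RECORD_LIMIT_MIN = 20
COOLDOWN_MIN = 15

# Two bodies give continuous coverage only if a camera cools down
# within the other camera's record window.
assert COOLDOWN_MIN <= RECORD_LIMIT_MIN

def coverage_schedule(total_minutes):
    """Return (start, end, camera) segments for a shoot of the given
    length, swapping cameras each time one hits its thermal limit."""
    schedule = []
    t, cam = 0, 1
    while t < total_minutes:
        end = min(t + RECORD_LIMIT_MIN, total_minutes)
        schedule.append((t, end, cam))
        t = end
        cam = 2 if cam == 1 else 1  # hand off to the cooled body
    return schedule

for start, end, cam in coverage_schedule(60):
    print(f"{start:3d}-{end:3d} min: camera #{cam}")
```

For a one-hour shoot this prints three 20-minute segments alternating 1, 2, 1: exactly the dance people were doing by hand with two $4K bodies.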
Which brings us to ALL the mods the British guy in the YouTube video (DIY Perks) made. In about 20 minutes, that guy shows WHY everyone bought two still cams to do the work of a single, solitary older-generation VIDEO camera. It proves Canon doesn’t care, Nikon doesn’t care. They charge you $4,000 for that Canon R5 with an 8K full-frame sensor and gorgeous dynamic range. But then they don’t cool it for shit. So now you have to buy two of these hunks of metal and run them for 20 minutes each. Stupid, stupid, stupid.
And that British guy in the video demonstrates how easy it would be for Canon to add adequate cooling to their $4K cam so you could get access to all its neato-keen features (stabilization on the sensor, dynamic range, full frame, 8K!).