Tech Reviews: Blackmagic’s URSA Mini Pro 12K, Lenovo’s ThinkPad P16 Gen 2 and Nvidia’s RTX 5000

Blackmagic Design’s URSA Mini Pro 12K

Blackmagic Design first released its URSA Mini Pro 12K camera back in the summer of 2020, making a giant leap from 4.6K up to 12K. This past summer, the company delivered its new URSA Mini Pro 12K OLPF. As expected, there are a great many similarities and a few differences. So, here’s my recap:

The URSA 12K has a CMOS sensor the size of a Super 35 film negative. This is great because not only does it deliver a lot of resolution (up to 12,288 x 6,480), but it also means that lenses designed for cinema cameras behave the way they were meant to. A 35mm lens on the Blackmagic will have basically the same field of view as it would on an early Panaflex Millennium or an Arriflex 435. Also, when changing your acquisition resolution, the frame doesn’t crop in, so your FOV doesn’t change. The exception is when shooting at higher frame rates in 4K and 6K; then the sensor crops to a Super 16 window. Speaking of higher frame rates, the URSA records 12K at 60fps, 4K/6K/8K at 120fps and 4K at 240fps. For my two cents, 4K is plenty — but there are some benefits to shooting 12K, such as pulling greenscreen mattes and then mushing comp edges organically when scaling down to 4K for delivery.
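
That Super 16 crop works out to roughly a 2x punch-in. Here’s a quick sketch using commonly quoted film-gate widths as approximations (so treat the result as ballpark, not a spec):

```python
# Effective field-of-view change when the sensor crops from a Super 35
# gate down to a Super 16 window at high frame rates.
# Gate widths are commonly quoted approximations, not exact sensor specs.

SUPER_35_WIDTH_MM = 24.89
SUPER_16_WIDTH_MM = 12.52

def effective_focal_length(lens_mm: float) -> float:
    """Focal length that would give the same horizontal FOV on full Super 35."""
    crop_factor = SUPER_35_WIDTH_MM / SUPER_16_WIDTH_MM
    return lens_mm * crop_factor

print(round(effective_focal_length(35), 1))  # that 35mm now frames like a ~69.6mm
```

So your wide lens suddenly isn’t so wide, which is worth knowing before you commit to a high-frame-rate shot.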

The chip does not use a Bayer pattern and shoots only in the proprietary Blackmagic RAW (BRAW) codec. This might sound inconvenient for those used to ProRes and others, but there are good reasons. The codec is really compact; I converted BRAW to ProRes and found the ProRes to be two to three times the size. BRAW has 14 stops of dynamic range and pretty reasonable file sizes, even at lower compression ratios. Plus, the codec is designed to work natively within Resolve — so there is a synergistic partnership between the acquired footage and the software using it. But fear not: There is a BRAW plugin so that Avid Media Composer, Adobe Premiere/After Effects and Sony Vegas can work with the footage. Nuke Studio/NukeX also supports BRAW as of Nuke 13.0 — but tread lightly, because as of this writing, there is a bug on the Foundry side affecting BRAW files coming from the OLPF model. I’m sure it will be worked out by the time this review comes out.
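
To get a feel for why compression ratio matters at 12K, here’s a back-of-the-envelope storage estimator. The 12-bit depth and 12:1 ratio are illustrative assumptions for the math, not Blackmagic’s published data rates:

```python
# Rough storage estimator for compressed raw video.
# Bit depth and compression ratio here are illustrative assumptions.

def gb_per_minute(width, height, fps, bits_per_pixel, ratio):
    """Approximate storage for one minute of footage, in gigabytes."""
    frame_bytes = width * height * bits_per_pixel / 8
    return frame_bytes * fps * 60 / ratio / 1e9

# 12K (12,288 x 6,480) at 60fps, assuming 12 bits/pixel at 12:1 compression
print(round(gb_per_minute(12288, 6480, 60, 12, 12), 1))  # ~35.8 GB per minute
```

Run the same numbers at a gentler ratio and you can see how quickly a card fills up on set.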

What makes the OLPF flavor of the camera better than the original 12K is literally the OLPF (Optical Low Pass Filter). This filter, which is matched to the frequency of the sensor, comes in handy for Virtual Production. High-frequency LED matrices in video walls will frequently cause moiré (large-scale interference) patterns, especially with sharp glass and high-resolution sensors. The OLPF reduces the banding you might pick up when shooting in LED volumes. Additionally, the OLPF uses updated IR filtering that plusses up red color response.

I have a few minor quibbles: The body is a bit heavy, so it’s not really a run-and-gun type of camera. The LED display is great, but I would love it if it had a broader range of motion so a director could view the display without moving the camera operator. The camera doesn’t have an HDMI port to plug into an additional monitor (which would negate the previous note) without setting up an SDI to HDMI conversion. (Or maybe I’m just lazy!)

At $6,385, it’s an investment, but still reasonable compared to other cinema cameras. However, if you aren’t shooting on LED stages you could get by with the older model, which you can find from $4,000 to $5,000.

Website: blackmagicdesign.com
Price: $6,385

Lenovo’s ThinkPad P16 Gen 2

Reviewing Lenovo products is somewhat new to me, but you’ve got to start somewhere!

When working with Lenovo, we specifically spec’d out a ThinkPad P16 Gen 2 that wasn’t going to have a Gen III Hemi with NOS. You can certainly throw a lot into this little chassis, but we wanted to try to be more conservative and put together something that would be in most people’s price range.

What we put together was an AMD Ryzen 5 PRO 7540U processor with an integrated AMD Radeon 740M card. (We were losing the RTX option, though, because we wanted to pair the AMD products and have them complement each other.) We had 32GB of RAM, which is enough for most things I might be doing. However, it’s unfortunately soldered in, so there was no upgrading it. The internal drive is 1TB (remember when a terabyte didn’t even exist as a drive?). And the display is “just” 1,920 x 1,200 (non-touch) — although you can display more on external monitors through the USB-C and USB4 ports. This particular setup is priced at about $1,200, which is pretty modest.

The chassis has a small footprint and is surprisingly lightweight, which is great for throwing in the backpack and heading to, say, the coffee shop, or wherever you take your work. Having three USB-3 ports, one USB-C and one USB4 feels like plenty (although I’d probably carry around a hub with me). The USB4 is your power port connected to a small 65W power supply. It can also act as a charging port for your phone.

I’m running some conversion and rendering software that is basically pinning the CPU, and while the fans have spun up, they aren’t terribly loud. The bottom of the laptop remains tolerable to the touch. Setting it on your lap might not be the best idea, though, because you’d be blocking the airflow — and you just might start to feel the heat.

The speeds in Photoshop, After Effects, Premiere and Resolve are totally acceptable, even if I am not benefitting from having an RTX card inside (which is an option if you go with an Intel processor). The display is crisp, and at 16 inches, I don’t see a ton of benefit from getting up to UHD. I’d opt to go out to an external monitor. However, the sound is a bit tinny on the little speakers, so I would most likely choose to listen on headphones.

One nit to pick has to do with keyboard configuration. I’ve heard a lot of people complain about having the number keypad on their laptop because it shifts the center of your keyboard over; this was not my problem. It’s the switching of the Function and Control keys that gets to me. The muscle memory is so ingrained that putting those keys in different places makes copying and pasting a mental chore. So, if you can change anything, Lenovo, please put those keys back where they should be!

Website: lenovo.com

 

Nvidia’s RTX 5000

This past summer, Nvidia came out with its Ada generation of RTX cards. I got to play with the RTX 5000 — not to be confused with the RTX A5000. (Please don’t get me started on these naming conventions!) The card is the mid-tier model of this generation but still has a ton of power and 32GB of RAM, which is certainly no slouch.

The Ada Lovelace architecture (named after mathematician and writer Augusta Ada King, Countess of Lovelace, who is credited with laying the groundwork for modern computing) doubles the speed of the ray-tracing cores from the previous gen (which is most important to my rendering mind). The Tensor Cores, which drive the AI power, have a four-times increase in inference performance. And the CUDA cores double single-precision floating-point throughput.

However, it’s not all about raw computing. The architecture is set up with an optimized AV1 stack to accelerate video transcoding, streaming, videoconferencing, AR, VR and AI vision. It can host more parallel video streams and speed up JPEG decoders for computer vision.

This might sound like a lot of technical gobbledygook, especially for those who just want to know, “How much faster is it going to make my workflow and renders?” But I’m specifically trying to separate these RTX cards from consumer-based gaming cards. The end goal is different: The cards we are talking about are enterprise-level graphics cards. Yes, they are great at ray-tracing and making pretty pictures. But they are also meant for AI training, computer vision and simulations.

In benchmark comparisons between last year’s RTX A6000 with 48GB of RAM and this year’s RTX 5000 with 32GB of RAM, they are remarkably similar. Outside of load-time, the ray-trace benchmarks are really close. This is an indication that newer chips are doing their job and performing the same functions faster with less power. And quite literally too: The RTX A6000 max power consumption is 300W, while the RTX 5000 is 250W.

The price of the RTX 5000 hovers around $4,200 to $5,400 (about what you could find the RTX A6000 for), which is substantially more than GeForce cards, but you have to ask yourself what your end goal is. Additionally, you need to assess how much math you are going to throw at the card. In order to take advantage of GPU rendering, the scene has to be loaded into RAM, and if the majority of your work is going to be greater than your available RAM, then you need more RAM, which means a bigger card — or more cards.
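
That fits-in-VRAM question can be roughed out ahead of time. The per-triangle size and overhead factor below are illustrative assumptions, not numbers from any particular renderer:

```python
# Back-of-the-envelope check of whether a scene fits in GPU memory.
# bytes_per_tri and overhead are illustrative assumptions, not renderer specs.

def scene_fits(tri_count, texture_gb, vram_gb, bytes_per_tri=100, overhead=1.3):
    """True if the estimated scene footprint fits in the card's memory."""
    geometry_gb = tri_count * bytes_per_tri / 1e9
    needed_gb = (geometry_gb + texture_gb) * overhead
    return needed_gb <= vram_gb

# 150M triangles plus 18GB of textures on a 32GB card: too big
print(scene_fits(150_000_000, 18, 32))  # False
```

If the answer comes back False often enough, that’s your cue for the bigger card, or for more of them.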

The RTX 5000 Ada is a strong addition to the Nvidia lineup. It’ll definitely power through most things you throw at it. And if you are into AI stuff and calculations that dig into the Tensor Cores, you’ll want to check out these kinds of cards.

Website: nvidia.com
Price: $4,000-$6,800

 


Todd Sheridan Perry is an award-winning VFX supervisor and digital artist. You can reach him at todd@teaspoonvfx.com.
