
January 19, 2014

Hong Kong in 4K on a Galaxy Note 3???

I recently visited Hong Kong on a business trip, and actually ended up with a rare free day to do something. Unfortunately, my DSLR is in for repair right now, and the only camera I had with me was the one on my phone – the Samsung Galaxy Note 3.

The Note 3 impressed me on its spec sheet by being able to shoot full-motion video at 3840×2160, or UHD resolution (sometimes also called 4K). However, I was pretty skeptical that this tiny camera could really be useful at that resolution – the tiny form factor and small sensor didn’t seem capable of it. Still, since it was the only camera with me, I figured it was worth a test. I decided to shoot everything in UHD, and then deliver an edit in both UHD and 1920×1080, taking advantage of the higher-resolution source material by panning and zooming around.

I shot in a wide variety of lighting conditions, both during the day and at night, taking full advantage of my free day, and a little bit of extra time the day afterwards.

Here’s the cut, posted on YouTube in full UHD resolution:

HK4K: The Galaxy Note 3 Edit

Honestly, I was pleasantly surprised by what I could accomplish with this little camera. It’s probably too noisy and compressed to be of much use to anyone who really needs to master in 4K, but I enjoyed the latitude in reframing shots for 1080p. There are a couple of quirks that would keep me from using this in real, paid production work, but it would definitely be a camera I’d set up if I needed extra coverage of an event.

 

The camera DESPERATELY needs some kind of stabilization. Rolling shutter is bad on this sensor, and the H.264 recording codec doesn’t like a huge amount of motion. I used two tricks to work around this. First, I found a small cell phone clamp at a local electronics store for less than $10.

Tripod mount and mini-tripod for mobile phones.

This one even came with a small tripod. The second technique I used was to press the phone onto the glass of a window. It worked great on several shots on board the trams. You only get whatever the glass is facing, with no other framing options, but the image is rock-solid stable.

Auto-iris cannot be turned off on the phone, so I had to fix the resulting exposure shifts a few times in Premiere Pro with animated Brightness/Contrast filters.

There are also moments where horizontal lines will appear in the frame, almost like the codec just couldn’t handle the content. These were rare, but visible in a few shots.

The most annoying problem, which I couldn’t fix, and which you’ll see in the final video, is what I call the “focus pop” or autofocus “snapback.” About every 30 seconds, you’ll see a moment where the entire picture seems to go “pop.” It seems like the lens just gets tired of holding the same focus for too long and, for lack of a better phrase, it “blinks.” This is a real pain – I’m hoping Samsung releases a firmware fix, or some Android developer takes a look at it. The only solution I found was to edit around it as much as possible. I left it in a few shots to show people what I’m talking about, and it kind of added to the futuristic feel of the video.

Trying to play the clips back in QuickTime was painful – the MP4 files the phone generated wouldn’t play back smoothly in the operating system’s own players. However, the Media Browser in Premiere Pro performed well with the shots, as did the hover-scrub thumbnails in the project bin.

(As an aside – set yourself up a “junk” Premiere Pro project on the desktop, and use it ONLY for media browsing. Makes life so much easier.) 

The footage drops directly into Premiere Pro without any need for transcoding. I found that setting playback to half resolution was perfect for my 2011 MacBook Pro. Premiere Pro uses a pretty straightforward way of balancing quality against performance: in the Source and Program monitors, there’s a menu for playback resolution – Full, 1/2, 1/4, etc. Set it as high as it’ll go without dropping frames, and you’ve balanced playback performance for your hardware.

In Premiere Pro, one thing you’ll notice right away is that the frame rate of the clips doesn’t always conform to 29.97 fps. The majority of my clips actually came in at 30.02 fps, and some had other frame rates. I tried using the default setting at first, but ran into trouble with some of the speed changes later on. Don’t use the trick of dragging and dropping clips onto the New Sequence button – the wonky frame rate will create a timeline with a time base of 10.00 frames per second. For best results, I selected all the clips and used Modify Clip -> Interpret Footage to set them all to 29.97 fps. This can be done once and forgotten about – it’s not a rendered function. It didn’t affect the visible playback rate of the clips, which makes me think there’s something odd about the default rate in the clips – maybe Premiere Pro isn’t detecting it properly, or the metadata in the clip is written wrong.
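If you’re curious what rates your own clips report before you start interpreting footage, you can ask ffprobe. Here’s a minimal sketch – my own script, nothing built into Premiere Pro – that assumes FFmpeg’s ffprobe is installed and that the HK4K_footage folder name is just a stand-in:

```python
# Sketch: report the frame rate each clip claims, via ffprobe.
# Assumes FFmpeg's ffprobe is installed and on your PATH; the
# HK4K_footage folder name is just a stand-in.
import subprocess
from fractions import Fraction
from pathlib import Path

def clip_frame_rate(path):
    """Average frame rate of the first video stream, e.g. 30000/1001 -> 29.97."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=avg_frame_rate",
         "-of", "default=noprint_wrappers=1:nokey=1", str(path)],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return float(Fraction(out))

for clip in sorted(Path("HK4K_footage").glob("*.mp4")):
    print(f"{clip.name}: {clip_frame_rate(clip):.3f} fps")
```

On my Note 3 footage, a script like this would have flagged the 30.02 fps oddballs before they caused speed-change headaches.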

Rather than deal with the nonstandard frame rate, I created a custom sequence preset at 29.97fps at 3840×2160 resolution, and dropped all the clips into that.

I used the Warp Stabilizer on multiple handheld shots where I was trying for slow, smooth motion. Don’t even try it on really shaky stuff – the wobble and the blurring are too much even for the Warp Stabilizer. But holding a high shot and trying to be steady worked really well with it. You won’t be able to tell those weren’t tripod shots.

I had to get artistic with the sunset – I’m still not happy with it, but I was only able to shoot it once. No second chances. The camera shoots really wide angle, and the time-lapse I did lacked any focal point for the eye, so I gave up trying to deliver that shot in 4K. I added a cut and zoomed in 200% to get a decent framing. Even at 1080p, that shot is soft and artifacted. I added some noise to balance out the blockiness, but it’s still visible.

I did some minor color work using the new “Direct Link to SpeedGrade” function. I will say that my 2011 17″ MBP kept up admirably until this point. SpeedGrade was great, but bringing the project back to Premiere forced me to render some sections of the timeline before it would play back. The Lumetri effect that SpeedGrade adds is heavy, and my 3-year-old GPU wasn’t up to the task. (Would love to try this on a Retina MacBook. Hint hint to my boss! :-) )

Oh, one important note on rendering previews – In Premiere Pro CC 7.2 and higher, you can now edit the Sequence Settings! I was able to change the Preview settings to QuickTime, ProRes, 3840×2160, and get full 4k previews for the rendered sections of my sequence.

All in all, this was a fun project to play around with. I hope that someone fixes the lines and the “focus pop” in the camera – here’s hoping that it’s just a firmware issue or a camera app issue. If those two problems are addressed, this will make an excellent pocket “2nd coverage camera” for 1080p shooters, with lots of room to reframe what you get.

 

April 24, 2013

NAB 2013 Sneak Peeks

Wow. Where to begin? NAB this year was one of the best shows I’ve attended in a long time. Attendance was up, there were great crowds at the Adobe booth, and the reactions to the sneak peeks were very, very positive.

There are a lot of different videos showcasing what we showed at NAB here: http://tv.adobe.com/show/adobe-at-nab-2013/

 

Premiere Pro is adding so many new features that some of my favorites have been overlooked:

1. It will now be possible for 3rd-party effects to be GPU-accelerated. Yep, for the first time, 3rd-party effects can take full advantage of the Mercury Engine’s real-time performance. The engineering group is working with plug-in makers now to show them how it’s done. Can’t wait to see what comes from that.

2. Smart Rendering is now possible for many new formats. ProRes? Yup! DNxHD? Yup! Plus many more – including some added flavors of QuickTime. As soon as I have a full list of formats, I’ll post it. This is going to speed up a lot of renders by as much as 10 times, and will make final outputs that use the “Use Previews” function render quicker, too.

Those are just two examples of the multitude of new features on the way – keep your eyes open for more.

 

December 9, 2012

Avoiding RAM Starvation: Getting Optimum Performance in Premiere Pro

Something I wanted to share for all you “build-it-yourself” users. Recently, I helped a customer build out a really beefy system – 16 physical cores, plus hyperthreading, 24 GB of RAM, Quadro 5000, etc.

The system wasn’t rendering well at all. Bringing up the Task Manager showed each processor only hitting about 19–20%. My MacBook Pro was actually handling the same tasks MUCH faster.

This was a classic case of processor RAM starvation. With Hyperthreading turned on, the system was presenting 32 logical processors, and there wasn’t enough RAM to feed them all! Some processors had to wait for RAM to free up, and the processors that finished their calculations had to wait for THOSE processors to catch up. It’s a really bad state to be in. With multiple CPUs, everything has to happen in parallel, so when some threads take longer to finish, everything comes to a screeching halt.

I turned off Hyperthreading, and suddenly the system started to just FLY – all the CPUs were being utilized effectively and roughly equally. Render times were 10–20x faster.

I can’t stress enough the need to “balance” the system to get proper performance. There’s never a danger of having too much RAM, but too many processors is not necessarily a good thing!

You can check this on your system – using the stock effects, when you render previews or render your output files, you should see all the CPU cores being utilized. They won’t all be used exactly the same amount, but for common tasks they should be roughly equal.

Also, the BARE MINIMUM amount of RAM I recommend for Premiere Pro is 1 GB per core. If your budget can afford it, 2 GB per core is pretty optimal for a Premiere Pro system. 3 GB per core isn’t required, but isn’t a bad thing. If you are trying to decide between 4, 8, 12, or 16 cores, let the amount of RAM be your guide – look at the cost of 2 GB per core, and pick the CPU accordingly.
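If you want to sanity-check your own machine against that rule of thumb, here’s a minimal sketch using Python and the third-party psutil library (an assumption on my part – any system monitor will show you the same numbers):

```python
# Sketch: check the RAM-per-core balance on your machine, then watch
# per-core utilization. Assumes the third-party psutil package is
# installed (pip install psutil).
import psutil

cores = psutil.cpu_count(logical=True)          # logical (hyperthreaded) cores
ram_gb = psutil.virtual_memory().total / 2**30

print(f"{cores} logical cores, {ram_gb:.1f} GB RAM "
      f"-> {ram_gb / cores:.2f} GB per core")
if ram_gb / cores < 1.0:
    print("Below the 1 GB/core bare minimum - consider disabling Hyperthreading.")
elif ram_gb / cores < 2.0:
    print("Workable, but 2 GB/core is the sweet spot.")

# Kick off a render, then run this: the cores should sit at roughly
# equal utilization. A few pegged cores next to idle ones is a red flag.
print(psutil.cpu_percent(interval=1.0, percpu=True))
```

Run that last line while a render is going; if some cores are pegged while others sit near idle, you may be starving them.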

UPDATE: Some of the feedback I’m getting on Twitter suggests people think this means Premiere Pro needs extreme amounts of RAM. No – that’s not it at all. RAM needs to be balanced with the number of cores. The days of just “getting the best CPU” are past. Modern processors are actually multiple CPUs on a single chip, and each one needs to allocate its own chunk of RAM to operate at peak efficiency.

On a dual-core processor, 4 GB of RAM is a reasonable amount to have, and 6–8 GB would be pushing into that “it ain’t a bad thing” category. A 4-core processor runs great on 8 GB of RAM, which is what I have in my MacBook Pro. RAM is really cheap nowadays – I think I just paid about US$40 for 8 GB on my son’s computer, and 16 GB is less than $80 right now for a desktop system. Remember, it’s about balance, people…

SECOND UPDATE: If you’re an old classic-auto tinkerer, like I used to be, think of it this way – the CPU is like the engine block, and the cores are the number of cylinders. Each cylinder needs fuel and air delivered to it. RAM is like the carburetor – it provides what the cylinders need. But you have to match the right carburetor to the size of the engine. A wimpy carburetor on a V8 is a disaster – low horsepower, and because the V8 is heavier, it’ll be outperformed by a properly tuned 4-cylinder engine.

Clear as mud? :-)

December 2, 2012

Premiere Pro and QuickTime and Nikon, OH MY!

This post is going to get a little techy and geeky – I want to take a minute and explain the relationship between Premiere Pro and QuickTime. I feel it’s important to understand it, so that you’ll also understand why it’s sometimes necessary to change file extensions on some .MOV files in order to get them to play properly in Premiere Pro. This mostly seems to affect Nikon owners, but can be a workaround for certain other types of cameras as well.

Premiere Pro actually has its own built-in system for decoding files, and Adobe works with the camera manufacturers and codec owners to ensure that the majority of cameras and codecs are supported directly.

For certain codecs, like H.264, there are a number of possible wrappers – an H.264 stream can come in a QuickTime .MOV file, an .AVI file, or an .MP4 file.

In the case of a QuickTime .MOV file, Premiere Pro will generally let QuickTime handle the decoding of the file, unless there’s metadata in the file header that suggests otherwise. If there’s nothing in the header, it just hands off the file to QuickTime, and the performance is reliant on QuickTime for decode and playback. This is required for a number of codecs, since many QuickTime codecs only exist inside the QuickTime framework (ProRes, for example), and the performance can be very good with QuickTime files. However, that’s not the case for every codec. For example, decoding H.264 files with QuickTime can sometimes cause less-than-ideal performance in Premiere Pro. Some of the QuickTime codecs are really optimized for viewing and playback, rather than editing.

In the case of Canon DSLR files, there’s something in the file header. Premiere Pro can recognize that the clips came from a Canon camera, and bypass QuickTime. This enables Premiere Pro to have smooth playback of DSLR files, and get better dynamic range from the clips. Premiere will use its own built-in decoder, which is optimized for editing, and respects the extended color used by the Canon cameras.

For this reason, it’s sometimes necessary to force Premiere Pro to bypass QuickTime for a certain set of files. I tend to see this the most with certain types of Nikon DSLR cameras. For whatever reason, Premiere Pro cannot detect what camera these .MOV files come from, and it just hands off the decoding of the files to QuickTime, usually with less-than-stellar results.

For this reason, when I see a problem with a .MOV file performing badly within Premiere Pro, I first determine the codec used. If it’s some type of MPEG/H.264 derivative, I rename the file extension to .MPG manually in the Finder or Windows Explorer. This forces Premiere Pro to use its built-in MPEG decoders on the file, and will usually help playback and performance a great deal.

If you run into this problem, and deduce it’s from an H.264 file in a .MOV wrapper, you can use Adobe Bridge to batch-rename files very quickly, and without re-encoding anything. All Bridge does is change the 3-letter extension of the existing files, so it can plough through hundreds of files in minutes.

In Bridge, select all the files you wish to rename, and go to Tools – Batch Rename. Then set up the Batch Rename tool to keep the filenames intact and simply substitute the .MPG extension for .MOV.
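If you’d rather script it, here’s a rough Python equivalent of that workflow – entirely my own sketch, not an Adobe tool. It assumes FFmpeg’s ffprobe is installed, and “nikon_footage” is just a stand-in folder name. Nothing is re-encoded; only the extension changes:

```python
# Sketch: batch-rename H.264 .MOV files to .MPG so Premiere Pro uses its
# built-in MPEG decoders. Nothing is re-encoded - only the extension
# changes. Assumes FFmpeg's ffprobe is installed; "nikon_footage" is a
# stand-in folder name.
import subprocess
from pathlib import Path

def video_codec(path):
    """Ask ffprobe which codec is in the first video stream."""
    return subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", str(path)],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

for mov in sorted(Path("nikon_footage").glob("*.MOV")):
    if video_codec(mov) == "h264":
        mov.rename(mov.with_suffix(".MPG"))
        print(f"renamed {mov.name} -> {mov.stem}.MPG")
```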

 

October 9, 2012

On CS6 and CinemaDNG

There’s been a big resurgence of interest in the CinemaDNG format because of the new Blackmagic camera. Let me take a moment to explain where Adobe stands on supporting it.

First off, go read Todd Kopriva’s most excellent blog posting here: http://blogs.adobe.com/toddkopriva/2012/09/cinemadng-in-after-effects-cs6-and-elsewhere.html

Let me give you my take on what’s happened with Premiere Pro support.

There was never “official” support for CinemaDNG in past versions of Premiere Pro. There was an experimental plug-in for Premiere Pro CS5 and CS5.5 that was up on the Labs site. CinemaDNG is a HEAVY format to edit directly, and the Premiere Pro engineering group was never really happy with the performance of the experimental plug-in. And, during the CS5/5.5 time frame, there wasn’t a huge interest in the format – only a tiny handful of cameras supported it.

During the CS6 development period, it was decided to not use engineering resources to make a new plug-in for CinemaDNG. The number of downloads of the CS5 and CS5.5 plug-in didn’t justify it. The existing plug-in for After Effects actually shipped with AE, and that support was continued. Also, SpeedGrade was added to CS6, and the SpeedGrade software was actually the first program ever to support CinemaDNG.

The week Adobe announced Premiere Pro CS6 at NAB, Blackmagic Design introduced their new camera. They implemented CinemaDNG using the open documentation that’s freely available – it’s an open standard that Adobe gave to the community. Blackmagic kept this camera a closely guarded secret, and really surprised the industry – including the Premiere Pro engineering group at Adobe! :-)

So, where does this leave CinemaDNG support in Premiere Pro? Well, today, there isn’t support for CinemaDNG in Premiere Pro, but Adobe Engineering is listening to what people request. Want to see it in a future version? Submit a feature request here: http://www.adobe.com/go/wish

If you currently own Production Premium CS6, and have the sample files of CinemaDNG footage from the Blackmagic camera, make sure your copy of SpeedGrade has the latest updates and try them out there – SpeedGrade works directly on the RAW files, and the playback performance is VERY impressive. Jon Barrie recently posted a quick side-by-side comparison of a grade done with the BMC footage, and SpeedGrade handles playback of the footage in real time, even with masks, primaries, secondaries, etc., all stacked together. Check out the demo video here: https://www.youtube.com/watch?feature=player_embedded&v=akLQQ1h10WY

So, there’s a lot of misinformation floating out there that somehow “Adobe killed CinemaDNG.” That’s far from the truth! If you see someone saying that, refer ‘em back here! :-)

 

October 1, 2012

Adobe Anywhere for Video

I’ve been quiet about a new technology coming from Adobe, called Anywhere for Video, not because I didn’t have anything to say. Rather, I’ve been trying to keep the excitement to myself until the time was right. Every time I get to play with the technology, I end up giggling hysterically, since my brain keeps trying to tell me what I’m doing shouldn’t be possible.

If you haven’t heard of Adobe Anywhere yet, start by watching this short-but-informative video: http://tv.adobe.com/watch/adobe-anywhere/introducing-adobe-anywhere-for-video/

For a more detailed technical understanding, read this post by John Montgomery from FXGuide: http://www.fxguide.com/featured/new-tech-adobe-anywhere/

I first got the opportunity to work with a VERY EARLY version of this technology back in Feb/March of this year. Keep in mind that this was an early “proof-of-concept” version, so I need to stress that what I played with may not represent the final product. It didn’t even have a name at that time. But what I got to touch was mind-blowing. I sat down at Premiere Pro and began to edit. This was XDCAM HD422 (50 Mb/s) footage. JKL playback in the Source monitor was super-smooth. Inserting clips onto the timeline was super-smooth. Adding effects and transitions between clips worked just like I expected they would. The kicker? The footage was on a server over 1,000 miles away. Quality during playback was nearly indistinguishable from the original media, and if I paused on a frame and blew it up full-screen, it WAS the original frame.

There are very few technologies that make me cackle maniacally, but Anywhere did it. Many, many times.

At NAB 2012, we did the first public-facing demonstration of this early collaborative editing technology. I edited onstage in Las Vegas, and then handed off what I was working on to Dan, working up in Seattle. The footage we were editing was on a server in San Jose, California. And passing an edit back and forth took mere seconds.

I’ve since shown the technology around Asia-Pacific, and it gets the same reactions that I experienced – this is the way remote editing should be, and the way collaboration should be. Anyone who has had to download massive files, or waited around for an overnight delivery, can relate to the power of Adobe Anywhere.

Anywhere also has fun implications across shorter distances. Working with massive numbers of layers in a multicam edit? Anywhere sends a single “stream” to your local machine, eliminating traditional bandwidth concerns across a facility network. Need 30 students working on the same source media? There’s no need to copy it to 30 workstations when Anywhere can serve up the footage without massive Fibre Channel installations.

While Adobe Anywhere has now been officially unveiled, it’s not going to be available until sometime in 2013. The most up-to-date information on pricing or availability will be at the Anywhere site here: http://success.adobe.com/microsites/adobeanywhere.html

It’s gonna be big. :-)

August 5, 2011

ProRes Workflow in Premiere: Advanced Options

I’ve already seen some great questions out there regarding my last tutorial. There are a couple of advanced options that I skipped over in order to get the basics out there for everyone.

Question: What about using an AJA or BMD card with these ProRes Presets? I thought I had to use manufacturer-specific presets to get a reference video output.

Answer: Not so! To make a preset that takes advantage of your monitoring hardware, you need to click on the Playback Settings button in the Sequence Settings panel:

Inside the Playback settings, you can choose your display device under Realtime Playback here:

(I’d love to show you a screen grab of this, but since my desktop computer is in a cargo container halfway across the Pacific Ocean, you’ll have to trust me.)

This setting is saved as part of the timeline preset, and can also be turned on later by selecting a sequence, going to Sequence – Sequence Settings, and clicking the Playback Settings button found there.

 

Question: I’ve heard that Premiere Pro only uses 8-bpc color. How does this affect my 10-bit ProRes files?

Answer: Premiere Pro can actually work in 32-bpc floating point color, which would be the preferred mode for anyone working with 10-bit source media. In order to use this higher color bit depth when rendering preview files, you need to turn it on here in the Sequence Settings:

 

This setting can also be changed on any existing timeline sequence by selecting the sequence, and going to Sequence – Sequence Settings.

If you are doing precise color work, you may also want to limit yourself to the effects that have the “32” icon next to them. These are the effects that process in full 32-bpc, floating-point color.

 

Question: Okay, now that you’ve explained what Maximum Bit Depth does, what about Maximum Render Quality?

Answer: That affects how sharply Premiere Pro scales clips. For example, if you work with 1080p media, but put it into a 720p timeline, and resize/reframe, then you are scaling the clips in size, and would definitely see better quality with this turned on. The only downside is that it increases the render time. It’s also a setting that you can turn off and on later.
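Adobe doesn’t document Maximum Render Quality as any one specific algorithm, so treat this as a loose analogy only: you can see how much the choice of resampling filter matters with any image library. A sketch using Python and Pillow (both assumptions on my part, as is the frame.png filename):

```python
# Rough analogy only: the difference between a fast resampling filter and
# a high-quality one when scaling a frame. Assumes Pillow is installed
# (pip install Pillow); "frame.png" is a stand-in for an exported frame.
from PIL import Image

frame = Image.open("frame.png")                 # e.g. a 1920x1080 source frame
target = (1280, 720)                            # scaling down to 720p

fast = frame.resize(target, Image.NEAREST)      # quick, prone to aliasing
good = frame.resize(target, Image.LANCZOS)      # slower, cleaner edges

fast.save("frame_fast.png")
good.save("frame_quality.png")
```

Compare the two outputs on a detailed frame and you’ll see the same kind of trade-off Maximum Render Quality is making: cleaner scaling at the cost of extra processing time.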

 

 

August 4, 2011

A ProRes workflow end-to-end

With the radical change going on right now in the world of Final Cut Pro, I’ve had some FCP7 users ask me about maintaining an end-to-end ProRes workflow in Premiere Pro. There are questions whether it’s even possible. Well, I’m here to show you it IS possible, and how to make it go.

What do I mean by an “end-to-end ProRes workflow”? This means ingesting ProRes clips, dropping them right to the timeline, rendering previews when necessary to a new ProRes file, and outputting back to a ProRes master. While Premiere Pro works great with a wide variety of native camera formats, there are times when this workflow is a good idea. For example, using an AJA KiPro for capture, shooting with the ARRI Alexa, or working with ProRes media from an FCP timeline.

This particular workflow only works on a Mac system that has the ProRes encoder installed. There are a couple of ways to get this component, but unfortunately, none of them are free. Most people using this workflow probably already have Final Cut Pro 6 or 7 installed, so you won’t have to worry. If you’re equipping a new Mac, you can buy Motion 5 for under US$50 from the App Store, which will also install the necessary codecs.

For Windows users, unfortunately, there is no ProRes encoder component available. That doesn’t mean you can’t use ProRes files – QuickTime for Windows does include the decoder. It just means that if you render preview files in the timeline, you’ll need to use another codec, and you won’t be able to output back to ProRes. Until a ProRes encoder is released for Windows, that’s sadly going to be the case. So, technically, it won’t be a “full” ProRes workflow, but you’ll still get great results. On the bright side, Windows users have more options for NVIDIA cards – a worthwhile investment, since GPU acceleration ELIMINATES the need to render previews in most cases anyway.

What makes this possible is the flexibility of Premiere Pro to input and output in pretty much any format the system has access to. Unfortunately, since Premiere doesn’t ship with ProRes encoding components, this’ll take a bit of time to set up. But once it’s set up, using it is really easy.

Setting Up Timeline Presets:

You’ll first need to set up some timeline presets that use ProRes as the Preview File format. It’s a good idea to create as many as necessary for the different resolutions and frame rates you’ll be working with. For this tutorial, I’m going to show you how to make a 1080p/24 timeline preset.

Open up Premiere Pro, and set up a “dummy” project. We just need to have a blank project open to access some of the settings in Premiere. In this picture, I’m using a project called “Untitled” that I use for stuff like this.

My universal "Untitled" New Project.



In the New Sequence panel, ignore all the existing presets! Most people assume incorrectly that these presets are the only formats that Premiere Pro can work with. I’m going to take you into the “guts” of how a Premiere Pro timeline is set up. Find the Settings Tab near the top:

 

Find the Settings Tab



 

Custom Sequence Settings panel - where the magic happens...



This is where the real power and flexibility of Premiere Pro lies – Premiere can essentially edit any format or file type that it can decode, and this includes working with QuickTime files.

What you’ll want to do here is start by making a timeline preset for ProRes 422 at a resolution of 1920×1080, 23.976 fps. There are a lot of settings in here, so let me list them:

Editing Mode: Custom

Timebase: 23.976 frames/second

Frame Size: 1920 horizontal, 1080 vertical (should show 16:9 aspect)

Pixel Aspect Ratio: 1.0 (square pixels)

Fields: No Fields (Progressive Scan)

Display Format: 24fps Timecode

Audio: 48000 Hz

Now, up until this point, you’ll notice that nothing is format-specific. All we are doing is setting up the size and frame rate all our media will conform to in the timeline. That’s how Premiere operates – in general, it is format-agnostic, meaning that you can mix and match ANY format on ANY timeline. The main settings for any timeline are just resolution/frame rate settings, period.

The bottom half of the panel is where formats start to play a role:

Video Previews



The Video Previews setting only affects things when you render the timeline. When you are playing back unaltered video clips on the timeline, it has no effect. If you are using GPU-accelerated effects on your clips, again, this preview file format has no effect. But for people using non-accelerated effects, or working on a system without GPU acceleration, you probably will want to render the red-bar portions of your timeline.

Set the Preview File Format to QuickTime (Desktop) and set the Codec to Apple ProRes 422. Also, make sure the Width and Height match the other timeline settings.  Now, STOP! BEFORE you hit the OK button, locate the Save Preset button:

Save your new Preset!



 

To make this easy, you’ll want to be as descriptive as possible in saving your preset. I recommend using a naming convention, and WRITE IT DOWN as you make these. That way, all of your ProRes timeline presets will have easy-to-understand, logical names. I’m going to call this one “ProRes 422 1080p24.”  If you need some additional descriptive help, make whatever notes you like in the Description field. This information will be visible each time you select the preset.

Once you have saved your preset, Premiere Pro will take you back to the Sequence Presets panel, and you should see your shiny new preset appear at the bottom, in the Custom folder:

Your shiny new ProRes 422 1080p24 preset!



 

Now that you understand the steps to create your first ProRes preset, you’ll want to repeat these steps again for each type of ProRes format, size and resolution you typically work with. Go back to the Settings tab at the top, and modify the settings again to make another preset. Then save and name the second new ProRes preset.

Back to the Settings Tab. Wash, Rinse, Repeat.



You may want ProRes 422 (HQ) presets, 1280×720 presets, or frame rates other than 23.976fps. This is up to you, and totally dependent on what type of ProRes clips you are working with. On my system, these are the presets I’ve created:

Just a sample of potential ProRes presets you can create.



 

Setting Up Output Presets:

Just like the Timeline Presets, we will need to set up some Export Setting Presets for ProRes as well. To do this, we need a timeline with at least one clip in it so that we can access the Export Settings panel.

Go ahead and choose one of your ProRes Timeline presets so that the full Premiere Pro interface opens up. Import a clip, any clip, and drop it onto the timeline. If you have no clips on this system, you can just create a Countdown Leader file by choosing File-New-Universal Counting Leader. Drop it onto the timeline.

Now, with the timeline selected, go to File-Export-Media.

Export Settings Dialogue Box



 

In the upper right of the panel, choose Format: QuickTime. Then click on the Preset button, and look at the puny list of QuickTime presets that Premiere Pro ships with. I’ve had several people assume from this list that Premiere Pro can only export DV-format QuickTime files! NOT SO!!

Is this all QuickTime can do? OF COURSE NOT.



To access other QuickTime formats and flavors, including ProRes, we need to create additional QuickTime Presets. These are one-time setups – in the future, we can just choose the preset and output without additional setup.

To get started, head down to this part of the Output Settings screen, and click on the Video tab:

Where the Output Magic happens...



We are going to make a matching Output Preset for our earlier ProRes 422 1080p24 Timeline Preset.

Change the Video Codec to Apple ProRes 422.

Change the Width to 1920.

Change the Height to 1080.

Change the Frame Rate to 23.976.

Change the Field Type to Progressive.

Change the Aspect to Square Pixels (1.0).

Now switch to the Audio Tab:

Audio Settings Tab



Change the Sample Type from 16-bit to 24-bit. This will match most source ProRes files, but if you know that your source media uses a different audio bit depth, use that instead.

Double-check your settings in the Video Tab one more time, and if everything looks good, save your preset by clicking here:

Click to save your Output Preset



Again, make sure to give your preset a descriptive name. I’m calling mine “ProRes 422 1080p24 (24-bit Stereo).”

Now, when it’s time to output, I can output a ProRes master that matches my source footage, my preview files, and my Timeline Settings.

Oh, one last tip for longtime FCP users – I’ve heard from FCP users that they are used to ProRes outputs taking less time. That’s probably because, by default, FCP uses the preview files, and just copies the frames into the output file. To make Premiere Pro mimic this behavior, you need to check this box:

Check this box to use your ProRes Preview files.



 

Because a lot of native file formats are extremely lossy, Premiere, by default, doesn’t use the previews for final output. It prefers to re-render the effects in the timeline from scratch to get the maximum quality. But, with an end-to-end ProRes workflow, that’s not really necessary. So, using the preview files will speed up the output when going back to the same ProRes format.

You’ll want to make a number of different Output Presets following these steps – one for each format of source material. Again, I’ve created output presets that match the same timeline presets:

My ProRes output Presets



 

Whew! Okay, now the hard part is done! In actual use, now you can open up Premiere Pro any time, choose a ProRes timeline, and start editing. Previews will automatically be in ProRes format, and when you choose to output your timeline, you can output to the same ProRes format by choosing QuickTime, and then choosing the appropriate preset from your list of ProRes presets. End-to-End Workflow!

 

June 3, 2010

Understanding Color Processing: 8-bit, 10-bit, 32-bit, and more

Recently, I’ve been getting a lot of questions about the new icons in the Premiere Pro Effects panel, in particular, the “32-Bit” icon seen here:
32BitIcon.png

People have asked how these effects relate to the 64-bit Mercury Engine – are they limited in some way? The answer is no: these icons mean that these effects use 32-bit floating-point color, the gold standard of color processing.

Trying to understand video color precision is, well, a confusing task. There are so many different terms floating around – 8-bit and 10-bit color are used to describe cameras, while software talks about 8 bits per channel, 16 bits per channel, and 32-bit-per-channel “floating point” color. What does it all mean? And, for the colorist, how does Premiere Pro handle color? If these are burning questions in your mind, then read on.

When your camera processes the light coming in the lens into data, it has to assign a number to each of the colors being recorded. Each pixel gets its own set of numbers. Typically, a low number means very little of that color – a pixel with an RGB value of 0,0,0 would be completely black.

If 0,0,0 represents black, then what represents white? Well, that depends on what we call the bit depth. The higher the bit depth, the bigger each number can get.

Let’s look at one color – blue. In an 8-bit world, blue is represented by a number that can be between 0 and 255. If I had a knob to adjust the value of blue, it would look like this:

8BitBlueKnob.png

Pretend that this knob makes a “click” every time you raise or lower the value, and there are 256 distinct “clicks” on the knob. This means that there are 256 “steps” between the brightest, most saturated blue, and no blue at all. A “middle-of-the-road” value of blue would be around 128 on this scale. Adjustments have to be made in whole “clicks” – there is no value of “127.5” in 8-bit color precision.

Now, let’s look at 10-bit blue. A knob to adjust blue on a 10-bit device might look like this:

10BitBlueKnob.png

Wow! The knob now goes to 1023! This doesn’t mean that 10-bit blue is more saturated – it means that there are more steps to get to the maximum saturated shade of blue. A 10-bit value of 1023 is potentially the same color as the 8-bit value of 255. If you look at the two knobs in the pictures, you’ll see that the middle points of the knobs are 128 and 512, and these values also represent the same color. There are just a LOT more subtle shades of blue selectable on the 10-bit knob. Again, there are no intermediate steps, no decimal values. There are 1024 distinct “clicks” on the knob.

Just for giggles, here’s what a blue control knob would look like for a 12-bit device:

12BitBlueKnob.png

Starting to see the pattern? The higher the color bit depth, the higher the color precision. A higher color bit depth means more variety, more choices on how much color can be used for each pixel.
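In code, the pattern is just powers of two. Here’s a quick sketch in Python – note that the exact value-mapping convention between bit depths varies from device to device (some bit-shift, some rescale), so this plain linear rescale is illustrative only:

```python
# Sketch: how many "clicks" each bit depth gives you per channel, plus a
# plain linear rescale between depths. Real devices differ on the exact
# mapping convention (some bit-shift, some rescale), so treat the
# remapped values as illustrative.
def steps(bits):
    """Number of distinct values per channel at a given bit depth."""
    return 2 ** bits

def rescale(value, from_bits, to_bits):
    """Map a value to its relative position at another bit depth."""
    return round(value * (steps(to_bits) - 1) / (steps(from_bits) - 1))

for bits in (8, 10, 12):
    print(f"{bits}-bit: {steps(bits)} steps, max value {steps(bits) - 1}")

print(rescale(255, 8, 10))   # 1023 - the same fully saturated blue
print(rescale(128, 8, 10))   # 514  - within a click or two of the knob's 512
```

Depending on the convention, the remapped values land within a click or two of each other – which is why I say the two knob midpoints are “potentially the same color.”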

Each pixel has more than just one color value – each pixel usually has three numbers assigned to it, either RGB or something called YUV, which I’m not going to explain here. All of these values share the same bit depth – if a camera records in an 8-bit format, each value for each pixel is an 8-bit number.

Now, 8-bit, 10-bit, and 12-bit color are the industry standards for recording color in a device. The vast majority of cameras use 8 bits per channel. If your camera doesn’t mention the color bit depth, it’s using 8 bits per channel. Higher-end cameras use 10-bit, and they make a big deal about “10-bit precision” in their literature. Only a select few cameras use 12 bits, like the RED ONE digital cinema camera.

Software like After Effects and Premiere Pro processes color images using color precisions of 8 bits, 16 bits, and a special color bit depth called 32-bit floating point. You’ve probably seen these color modes in After Effects, and you’ve seen the new “32” icons on some of the effects in Premiere Pro CS5.

8-bit processing actually works the same as 8-bit on the camera – each color for each pixel is stored as a value of 0–255. Adjustments move in whole-number steps. So, for example, if I had a blue value of 128 and wanted to make a small adjustment, I could change the value to 127 or 129.

To enable more steps, there’s 16-bit color. 16-bit color is used by After Effects and Photoshop, but isn’t in Premiere Pro CS5. This works the same way, except each channel has 32,768 steps to choose from. Any time you drop an 8-bit source into a project using 16-bit color, the 8-bit values are remapped to their relative positions in the new color space. Zero stays zero, and 255 becomes 32768. The midpoint value of 128 in my last example would be mapped to 16384. That’s a whole lot more steps to work with – I can make much more subtle adjustments to the amount of blue in the image. 16-bit color also requires whole “clicks” – you still can’t use a decimal value, like 16384.5, to define a color value.

Both 8-bit and 16-bit color still suffer from limits when you hit the top or bottom end of the range. If I brighten and darken an image, I can push pixel values to the maximum. If colors are pushed too far, the values run into a “wall” – bright areas in 8-bit precision can turn into a big undefined blob. In 16-bit, there’s more latitude in the middle, but the extra precision doesn’t help if the values go over the top or bottom end of the scale.

Here’s an example image where a brighten filter has been applied in an 8 or 16-bit image, and a darken filter has been applied to the actor’s head:

8BitWizard.png

If you look at the darkened area on the upper right part of the head, it’s just a big bright blob. All the detail is gone from the bright parts of the image, because all the values were maxed out, and the darken filter is just reducing those values to uniform shades of gray. The brighten filter brought all the pixels up to 255,255,255 (8-bit) or 32768,32768,32768 (16-bit), and the darken filter is reducing all three values by the same amount.

32-bit color gets around this by mapping the colors differently, leaving room for over-bright and under-dark values. Instead of pinning 0 to the very bottom of the scale and 255 to the very top, the standard black-to-white range is placed in the middle of a much larger range. There are many steps available in either direction beyond the old, mapped “maximum” values.

Here’s the same image as shown above, but the processing was done in a 32-bit sequence:

32BitWizard.png

As you can see, the Darken filter is bringing back the detail, where the head and the light meet, because 32-bit floating point color can store the differences in the pixels, even when the values are pushed above 100% white.

When you see a 32-bit value mapped into a number, it’s expressed with a decimal. The standard range of colors is mapped to values between 0.0000 and 1.0000, so 0 in 8-bit mode is 0.0000 in 32-bit, and 255 in 8-bit is mapped to 1.0000. The middle of the regular range is 0.5000. Thanks to the decimal place, there are still many thousands of steps to make adjustments, but now there’s also the potential to go “out of range” and create values that aren’t visible.
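Here’s the wizard example from above as a toy Python calculation – a big simplification of what real compositing math does, but the clamping behavior is the whole point. Brighten two neighboring highlight pixels past the top of the range, then darken them again:

```python
# Toy demo: why 32-bit float keeps highlight detail that 8-bit throws away.
# Two neighboring "pixels" near the top of the range:
pixels = [240, 250]

# 8-bit pipeline: brighten 1.5x, clamp at 255, then darken 0.5x.
bright_8 = [min(255, round(p * 1.5)) for p in pixels]   # both clamp to 255
dark_8 = [round(p * 0.5) for p in bright_8]
print(dark_8)                        # [128, 128] - the detail is gone

# Float pipeline: map 0-255 onto 0.0-1.0 and don't clamp until output.
bright_f = [(p / 255) * 1.5 for p in pixels]            # ~1.41 and ~1.47
dark_f = [v * 0.5 for v in bright_f]                    # back inside 0.0-1.0
print([round(v * 255) for v in dark_f])                 # [180, 188] - detail kept
```

In the 8-bit path, both pixels slam into the 255 wall and come back identical; in the float path, the over-range values survive, and the darken filter brings the detail back.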

Using the same knob graphic, as above, you’d have to think of 32-bit floating point as a smooth-turning knob, with an LED readout above it, like this:

FloatingPointBlue.png

With 32-bit float color, you have a near-infinite amount of values, and you can store over-bright and under-dark values as you manipulate and play with color values.

(BTW – the reason it’s called “floating point” is because the position of the decimal point can change as needed. The maximum number is not 9.9999 – it’s 99999. Decimal precision gets lost at the higher ends of the scale, but the ends of the scale are almost never used.)
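You can actually see that “floating” behavior with a couple of lines of Python – this sketch assumes the NumPy library is installed:

```python
# Sketch: the gap between adjacent representable 32-bit float values grows
# with magnitude - very fine near the 0.0-1.0 working range, coarse far
# outside it. Assumes NumPy is installed (pip install numpy).
import numpy as np

for value in (1.0, 100.0, 100000.0):
    print(value, np.spacing(np.float32(value)))
# near 1.0 the step is about 1.2e-07; near 100000 it's about 0.0078
```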

Chris Meyer has a great tutorial that describes 32-bit float color here:

http://www.lynda.com/home/Player.aspx?lpk4=30903

Okay, now how does this relate to Premiere Pro? Some of Premiere Pro’s effects are full 32-bit floating point effects and have the ability to work in this high color precision. There’s a little secret to making this happen, however. Since 32-bit float color is more memory intensive, you need to turn on a small check box in the Sequence settings:

MaximumBitDepth.png

This “Maximum Bit Depth” check box enables your timeline sequence to work in 32-bit floating point color if those effects are used on the timeline. Keep in mind that this does increase the RAM used by Premiere Pro, so it’s recommended for higher-end systems.

If you have existing sequences, right-click on the sequence in the bin, and choose Sequence Settings to change this value. You can change it any time.

Most file formats are 8-bit formats – rendering back to a DV file or a QuickTime file means that the color precision needs to be crunched back to 8-bit. If a file format does support a higher color precision (DPX and AVC-Intra P2 are two formats that support 10-bit precision), then there will be a “Maximum Bit Depth” selection in the Export Settings dialog box as well. Here’s an example of the Max Bit Depth check box for AVC-Intra output:

AVCIntraBitDepth.png

Some formats, like DPX, have built-in presets for output that include this extra color precision:

DPXMaxBitDepthPresets.png

Steve Hoeg, one of the Premiere Pro engineers, provided some examples of how Premiere Pro will handle color precision in different scenarios:

1. A DV file with a blur and a color corrector exported to DV without the max bit depth flag. We will import the 8-bit DV file, apply the blur to get an 8-bit frame, apply the color corrector to the 8-bit frame to get another 8-bit frame, then write DV at 8-bit.

2. A DV file with a blur and a color corrector exported to DV with the max bit depth flag. We will import the 8-bit DV file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DV at 8-bit. The color corrector working on the 32-bit blurred frame will be higher quality than the previous example. (A toy simulation of the difference between these first two scenarios follows this list.)

3. A DV file with a blur and a color corrector exported to DPX with the max bit depth flag. We will import the 8-bit DV file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DPX at 10-bit. This will be higher quality still, because the final output format supports greater precision.

4. A DPX file with a blur and a color corrector exported to DPX without the max bit depth flag. We will clamp the 10-bit DPX file to 8 bits, apply the blur to get an 8-bit frame, apply the color corrector to the 8-bit frame to get another 8-bit frame, then write 10-bit DPX from 8-bit data.

5. A DPX file with a blur and a color corrector exported to DPX with the max bit depth flag. We will import the 10-bit DPX file, apply the blur to get a 32-bit frame, apply the color corrector to the 32-bit frame to get another 32-bit frame, then write DPX at 10-bit. This will retain full precision through the whole pipeline.

6. A title with a gradient and a blur on an 8-bit monitor. This will display in 8-bit, and may show banding.

7. A title with a gradient and a blur on a 10-bit monitor (with hardware acceleration enabled). This will render the blur in 32-bit, then display at 10-bit. The gradient should be smooth.
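To make scenarios 1 and 2 concrete, here’s a toy simulation – my own sketch, not Adobe’s actual pipeline math – of the same two operations run with an 8-bit frame after every step versus a full-precision frame quantized only at output:

```python
# Toy simulation of scenarios 1 and 2: quantizing back to 8-bit between
# every effect accumulates rounding error that a 32-bit intermediate
# avoids. (Not Adobe's actual math - just the principle. The two effects
# here are arbitrary stand-ins for a blur and a color corrector.)
def quantize(v):
    """Crunch a value back to a whole 8-bit number."""
    return min(255, max(0, round(v)))

def darken(v):
    return v * 0.21

def boost(v):
    return v * 4.6

source = 113                              # an arbitrary 8-bit source value

# Scenario 1: an 8-bit frame is produced after every effect.
eight_bit_path = quantize(boost(quantize(darken(source))))

# Scenario 2: full precision between effects, quantized only at output.
float_path = quantize(boost(darken(source)))

print(eight_bit_path, float_path)         # 110 vs 109
```

One code value off is trivial for a single pixel, but spread across a smooth gradient, that accumulated rounding is exactly the banding the 32-bit path avoids.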

There are other examples, which I hope to highlight in my next blog entry.

April 29, 2010

We’re with Cocoa.

It’s been a whirlwind month, with road trips, press briefings, and more, as we’ve ramped up for CS5. And guess what? It’s available now!

With all the craziness in the press right now over other issues, I just wanted to call out our support in this cycle for the Mac. I’ve read a lot of Tweets and blog posts about what Adobe “should” do or “will probably do” with Mac development of our Creative tools, considering this chilly climate. Let me tell you, right here, right now: it just ain’t gonna happen. We’re not gonna hurt our customers like that. Adobe will continue to support our Mac-based creative customers. With CS3, we took on the task of porting over our Mac code base to Intel-native Mac versions of the apps, and brought video applications like Premiere Pro and Encore back to the Mac platform.

With CS5, we’ve gone further:

  • A new, Native, 64-bit Cocoa version of Photoshop.
  • A new, Native, 64-bit Cocoa version of After Effects.
  • A new, Native, 64-bit version of Premiere Pro – at this time the only Pro NLE to be Cocoa-based on the Mac.
  • A new, Native, 64-bit version of Adobe Media Encoder.

I just wanted to take a moment to thank our Mac users. Don’t worry – we’re going to continue to support you guys ‘n gals in the creative space. Now start downloading CS5, and have some fun this weekend! :-)
