A while ago, I wrote a how-to on making a 3D LUT (look-up table) in Nuke to accurately calibrate your colour, which is important because Nuke doesn’t support ICC (Colorsync) colour profiles automatically. Since V-Ray also supports .cube 3D LUTs, the same process can be used to create an accurate colour profile for the V-Ray Framebuffer, so that you can see your images without them being blasted out by a wide-gamut display. The settings are in the VFB’s Correction Controls:
But this .cube profile can also be used as a filter to see your render colour-graded within Nuke or the V-Ray VFB. The process is similar to the calibration method linked above but here you’re generating it from a filtered image from Photoshop and using it in Nuke…
…or the V-Ray Framebuffer:
But this method of “filter and generate” takes some added care: you have to make sure to turn off localized adjustments like vignetting, blur/sharpen and grain, so that only the colour treatment of the Nuke CMS image is affected. The process is much the same with AE, if you want to simulate a grading sample you have output from there. Just export a lossless conversion of the sRGB Nuke CMS image with the grading applied and then build the 3D LUT in Nuke. From there, you just use a Vectorfield node to apply the 3D LUT you’ve made:
If you’re here, just wanting a sample .cube LUT to play with, here’s the Kodak 5229 Vision2 Expression 500T 3D LUT.
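If you'd rather roll your own test file, the .cube format itself is just plain text: a LUT_3D_SIZE line followed by size³ rows of RGB values, with red varying fastest. Here's a minimal sketch (my own, not from any of the scripts above) that writes an identity LUT, which changes nothing and is handy for sanity-checking that your Vectorfield or VFB pipeline is wired up before you load a real grade:

```python
# Write a minimal identity 3D LUT in the plain-text .cube format.
# An identity LUT maps every colour to itself -- load it and the image
# should pass through unchanged if the LUT pipeline is hooked up right.

def write_identity_cube(path, size=17):
    with open(path, "w") as f:
        f.write("TITLE \"identity\"\n")
        f.write("LUT_3D_SIZE %d\n" % size)
        # .cube convention: red varies fastest, then green, then blue
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    f.write("%.6f %.6f %.6f\n" % (r / (size - 1.0),
                                                  g / (size - 1.0),
                                                  b / (size - 1.0)))

write_identity_cube("/tmp/identity.cube")
```

Point a Vectorfield node (or the VFB's Correction Controls) at the result and nothing should shift.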
Now, to deal with the elephant in the room: what happens if we want both a filter and a calibration? Well, the VFB does have support for both an ICC profile (your hardware-calibrated monitor profile) and a .cube file but currently the ICC correction doesn’t seem to work for me. In Nuke, the solution is much easier: use two Vectorfield adjustments and keep the calibration .cube outside of your Write nodes, just next to the viewer:
That ensures that the filter is applied in your Write but you’re not baking your monitor profile fix into an image that is fine otherwise.
I wanted to automate the process of making these .cube files since I plan to make a lot from Photoshop treatments and other video filters that I like. Nuke is pretty amazing and lets you use the app as a headless workhorse via Python in command line mode (nuke -t script.py), so I built a shell script that uses Nuke in Python command mode and then made an Automator script that makes the process even easier:
The Python script is here if you want it.
Finally, since I’m going to have a ton of .cube files that don’t have previews, the last step was to speed up the process of trying out these LUTs on a working image. So I took my Python script as a source and had a shell script read each .cube file I’ve made and apply it to a variable image:
If you want the Python script to apply the LUT file like above, here it is. The syntax in the Terminal is “/Applications/Nuke6.2v1-32/Nuke6.2v1.app/Contents/MacOS/Nuke6.2v1 -t /path/to/nukeLUTapply.py /path/to/inputimage /path/to/lut.cube /path/to/output”. If you have a symlink to the Nuke binary in your $PATH, that’s:
nuke -t /Users/beige/Documents/nuke_scripts/LUT_apply/nukeLUTapply.py /Users/beige/Desktop/succession_hands26.exr /Volumes/HOME_RAID/STOCK/cube_3D_luts/colour_finesse_kodak/Kodak_5247_Vision_200T.cube /Users/beige/Desktop/out.tif
The script to batch apply them is a bit of a mishmash made for my folders:
masterlist=`find /Volumes/HOME_RAID/STOCK/cube_3D_luts/ | grep '\.cube$'`
for file in $masterlist
do
filenameOut=`basename $file .cube`
sed "s%cubeListHere%$file%g" /Users/beige/Documents/nuke_scripts/LUT_apply/nukeLUTtester_template.py > /tmp/nuketempfile.py
nuke -t /tmp/nuketempfile.py $1
done
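Because the loop splits the unquoted file list on whitespace, paths with spaces will break it. One way around that is to build the command from Python and pass the arguments as a list, so the shell never re-parses them. This is a sketch of the same batch idea, not the actual script; the NUKE, LUT_DIR, and SCRIPT paths are placeholders for your own setup:

```python
# Batch-apply .cube LUTs by driving a headless Nuke once per file.
# Passing arguments as a list (not a shell string) means paths with
# spaces survive intact. The three paths below are placeholders.
import os
import subprocess  # used when you uncomment the call below

NUKE = "/Applications/Nuke6.2v1-32/Nuke6.2v1.app/Contents/MacOS/Nuke6.2v1"  # placeholder
LUT_DIR = "/Volumes/HOME_RAID/STOCK/cube_3D_luts"                           # placeholder
SCRIPT = "/Users/beige/Documents/nuke_scripts/LUT_apply/nukeLUTapply.py"    # placeholder

def build_commands(lut_dir, image_in, out_dir):
    """One headless-Nuke command per .cube file found under lut_dir."""
    cmds = []
    for root, _, files in os.walk(lut_dir):
        for name in sorted(files):
            if name.endswith(".cube"):
                cube = os.path.join(root, name)
                out = os.path.join(out_dir, name[:-len(".cube")] + ".tif")
                cmds.append([NUKE, "-t", SCRIPT, image_in, cube, out])
    return cmds

for cmd in build_commands(LUT_DIR, "/Users/beige/Desktop/in.exr", "/tmp"):
    print(cmd)                 # dry run: show what would be executed
    # subprocess.call(cmd)     # uncomment to actually run Nuke
```

The argument order matches the nukeLUTapply.py syntax above: input image, .cube file, output.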
That’s the beauty (I use that term loosely here) of a Unix system: my crap Python skills combined with some decent shell scripting knowledge means that I can build a system that works without having to rely solely on Python. Pipe the Python to a filter and, while it may not be pretty, it’s pretty good for a guy who only learned how to declare a variable two years ago.
3 notes | Permalink
I don’t use FCheck or the View command for anything, so I thought I’d put the latter to good use with an Automator utility to reveal the file in the Finder. It’s the World’s Simplest Automator Application but it’s pretty useful if you need to copy or otherwise work with the file beyond just opening it in an editor:
It’s dead simple to make yourself with Automator but here it is to download if you want it. You’ll need Maya 2012 and up for this one though since there was a bug in 2011 that didn’t let you select .app bundles as valid applications.
Update: I made a similar Automator service for use with the clipboard path to files and apps like Nuke:
Also shown in that video is my Open Clipboard path service which launches the file referenced in the clipboard.
2 notes | Permalink
I just had a Nuke scene script go bad for whatever reason and, since I’m in the process of rebuilding my Time Machine backups, I was screwed until I realized that Nuke files are plain text (this is what makes their contents searchable with Spotlight in OS X). So I opened the script file in BBEdit, pasted the scene nodes into a new Nuke clipboard, and then pasted that combined clipboard into Nuke.
I left out the main scene info so I’m guessing that’s where the problem was.
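Since a .nk script is plain text, you can also pull useful data out of it without opening Nuke at all. Here's a quick sketch of the idea: scan the script for lines that set a file knob. The regex is my own guess at the common one-knob-per-line layout, not a real parser:

```python
# Nuke .nk scripts are plain text, so the file paths a scene references
# can be scraped with a simple line scan. This only catches "file" knobs
# written one per line, which covers the common case.
import re

FILE_KNOB = re.compile(r"^\s*file\s+(.+?)\s*$")

def referenced_files(nk_text):
    paths = []
    for line in nk_text.splitlines():
        m = FILE_KNOB.match(line)
        if m:
            paths.append(m.group(1))
    return paths

scene = """Read {
 inputs 0
 file /Volumes/HOME_RAID/plates/shot01.%04d.exr
}
Write {
 file /Users/beige/Desktop/out.tif
}"""

print(referenced_files(scene))
# -> ['/Volumes/HOME_RAID/plates/shot01.%04d.exr', '/Users/beige/Desktop/out.tif']
```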
3D Illustration for Review/Tendances magazines.
2 notes | Permalink
Nuke is one of those programs that doesn’t currently support ICC colour profiles, so it doesn’t automatically compensate for your display the way that Photoshop or a browser like Safari or Firefox does. You may see “sRGB” in the Viewer menu but, without this compensation, it’s never going to match the actual sRGB rendering that Photoshop and others do correctly. This is especially problematic for wide-gamut displays and, although I have two sRGB NEC 2490WUXi Spectraview screens hardware-calibrated with the Eye-One Display 2, I’m not immune to these colour shifts either. At times it was tolerable and I would make final tweaks in Photoshop but, as I’m using Nuke increasingly in colour-intensive print work, it became obvious I needed to sort this out to work comfortably. Reds are especially affected by this lack of calibration:
Nuke at the left and Photoshop at the right. Everything’s displayed in sRGB on my calibrated sRGB-ish monitor but still there is oversaturation in the reds within Nuke.
So here is the quick process used to generate a 100% accurate 3D LUT monitor profile (not a generic “wide gamut” profile) for Nuke to use as the destination rendering intent for your comps. This workflow requires a hardware calibrator because that’s the only way to get an accurate ICC monitor profile, which is used in the process within Photoshop. The best way to explain this is with a video and some comments:
My deer in Nuke and Photoshop after applying the calibrated 3D LUT:
Any differences you think you might see are due to the scaling within the Nuke screenshot. I’ve sampled the output and it’s the same between them. If you missed it in the video, it’s key that you don’t apply the LUT node to your actual output, just your Viewer. So make sure that Write nodes are before the Vectorfield node.
With this one monitor sorted out, this probably has a lot of you wondering “what if you’re using two or more displays?” Well, that’s why I bought these two NEC screens with the Spectraview software/hardware capabilities. It lets you calibrate one monitor and use the white point from that one as the target for the second, so calibration is perfectly synced. That way, you don’t have to worry about involving another Nuke profile when you drag the Viewer over to another screen. I highly recommend the NEC Spectraviews if you need accurate colour across multiple displays. If you’re looking for further reading on colour management, Practical Color Management: Eddie Tapp on Digital Photography is very good.
Update: If you are getting banding in the calibrated display, change the colorspace in/out to sRGB.
3 notes | Permalink
I have a habit of using underscores instead of spaces in folder names, so I forget that many people don’t do this and that it can cause problems for shell scripts. So I wrote a fix to my Nuke Scene Collector to address this. The only limitation is that, in order for the relinked copy of your Nuke scene file to work, you need to collect to a folder that has no spaces (/Volumes/external_HD/path/collected/ for example). Grab the latest version here and, if you don’t know what this script does, check it out:
I wrote a shell script for OS X and Linux that collects Nuke files and the latest version makes a relinked duplicate of your scene:
Sorry, I can’t make a Windows version since this just passes the Nuke file through a lot of UNIX filters like sed; it’s not a binary.
Finished 3D illustration for REVIEW magazine. Software used: Maya, V-Ray, ZBrush, Nuke, and Photoshop.
Nuke doesn’t use all cores in my hyper-threaded Nehalem Xeon Mac Pro 2009 by default, so I have to use a TCL command to force it to use all cores. If you want to check how many cores Nuke is using at any time, hit x and enter this TCL command:
By default, this is what’s shown on my max 16-thread Mac Pro:
If you hit the x key and enter:
set threads 16
It will then use all 16 threads of the hyper-threaded machine:
Update: For machines with more than 12 cores and hyper-threading (24 threads), this is not recommended; it’s slower than using the default, in my experience.
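If you're not sure what number to give "set threads", what you want is the logical processor count: physical cores times two on a hyper-threaded machine. A quick way to check it from Python, outside of Nuke:

```python
# Report the logical processor count -- the ceiling for the value
# you'd pass to Nuke's "set threads N" TCL command.
import multiprocessing

print(multiprocessing.cpu_count())
```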
I’m feeling pretty happy with my basic shell scripting skills. I just managed to make a script that reads Nuke files and collects all the used files to a folder for archiving/transport. All OBJ ReadGeo nodes and %04d numerical input stuff collect as well.
Here’s the script to download:
Usage: drag the script into the Terminal window (or copy it to a bin directory and run it like a shell app) and drag in a Nuke scene script. It uses a simple shell script to read the Nuke file and parse out the Read file paths and all %04d-type sequences, collecting every node’s files, OBJ geometry included. It doesn’t delete anything. The only problem I know of with the script is that it doesn’t like directories with spaces or illegal characters in the name. It does a check to make sure that no duplicate copies will clobber each other, so if you have render passes that have the same name (images/diffusepass/render.exr and images/specularpass/render.exr, for example), the script tells you that a conflict occurred and nothing is done. Enjoy!
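The %04d handling comes down to turning printf-style frame padding in a Read path into something a shell can expand to every frame on disk. This is my own re-creation of that idea in Python, not the collector script itself:

```python
# Convert printf-style frame padding (%04d, %d) in a Read node path into
# a shell glob, so a whole image sequence can be matched and collected.
# A sketch of the idea behind the collector, not the actual script.
import re

def frame_pattern_to_glob(path):
    # %04d -> [0-9][0-9][0-9][0-9]; bare %d -> [0-9]*
    def repl(m):
        width = m.group(1)
        if width:
            return "[0-9]" * int(width)
        return "[0-9]*"
    return re.sub(r"%0?(\d*)d", repl, path)

print(frame_pattern_to_glob("/renders/beauty/render.%04d.exr"))
# -> /renders/beauty/render.[0-9][0-9][0-9][0-9].exr
```

The resulting pattern can be handed straight to a glob or an unquoted cp in the shell to pick up every frame of the sequence.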