Recently published articles by Dave G:
I’ve been playing with a lot of different hair workflows lately and found that ZBrush’s Fibermesh is good for making shaped, long hair, which can be difficult to do with plug-ins like Shave and a Haircut or Yeti. Many people use hand-drawn guide curves with a Maya hair system, and there is a good tutorial for that here. But I am using straight OBJ output from ZBrush’s Fibermesh and converting it to renderable curves with this Python script for Maya. You want to do this because faceted polygons face in one direction and give you an unrealistic reflection, while a renderable curve more accurately imitates a round hair. So, here is a quick rundown of how to get your polygonal Fibermesh into Maya for use with V-Ray:
The nice thing about that script is that it lets you pick a threshold: if you use a ton of hairs in ZBrush, you can set the conversion threshold to 0.5 so you only make half the curves. If you have too few curves even at 1.0, try the hair-system workflow linked above to fill in the spaces between the guide curves instead. You’ll want a much lower threshold for that, though, since it is really slow to generate with a lot of curves. I had to use that workflow in my example above. Here is the 1:1 Fibermesh conversion:
It’s too sparse (and my hair is too thick).
And here is a 0.1-ratio curve conversion from the Fibermesh, with a hair system applied and 28 hairs per clump set in the hairSystemShape1 settings:
Still needs work, but obviously it’s a lot closer already. Thanks to Roman Lapaev for the script link.
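For the curious, the conversion threshold is essentially a keep ratio. Here is a minimal pure-Python sketch of the idea — the function name is mine, not the actual script’s code:

```python
import random

def sample_curves(curves, keep_ratio, seed=0):
    """Keep roughly `keep_ratio` of the input fibers.

    Mimics the conversion-threshold idea: a 0.5 ratio converts
    about half of the Fibermesh fibers to renderable curves,
    and 1.0 converts all of them.
    """
    rng = random.Random(seed)  # seeded so the selection is repeatable
    return [c for c in curves if rng.random() < keep_ratio]

fibers = [f"fiber_{i}" for i in range(10000)]
kept = sample_curves(fibers, 0.5)  # roughly 5000 fibers survive
```

Seeding the generator means re-running the conversion keeps the same subset, which matters if you ever need to redo the export.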
So I’m working on this piece with a guy floating in the water and thought I’d share the workflow I used to make the simple water ripple around his body. Within ZBrush, have your water plane and object meeting, then take a screenshot from above with perspective off. Open the screenshot in Photoshop and crop it to a square, since you’ll be making an alpha map for your brush. Then use the object’s outline as a selection: offset the selection, stroke it, and blur it, and you now have an alpha map for ZBrush:
My water cube is too small but it’s just for a still, so I can clone out that wonky bit at the top. Also, water has an index of refraction of 1.333, but I used 1.6 to exaggerate the refractive effect since I’m not going for realism. As you can see in the video, I like to use Smart Objects for my Photoshop layers: the blur filter can always be changed later if I want more or less of the effect, and I can scale the stroked layers non-destructively.
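If you’d rather skip the Photoshop steps, the offset–stroke–blur result is roughly a soft ring around the silhouette, which you can also generate procedurally. A minimal pure-Python sketch for a circular silhouette (the function is hypothetical, stdlib only):

```python
import math

def ring_alpha(size=64, radius=20.0, width=3.0):
    """Grayscale ring alpha: brightest at `radius` pixels from the
    center, with a Gaussian falloff of `width` pixels -- roughly
    what offset + stroke + blur produces in Photoshop.

    Returns a size x size list of floats in [0, 1].
    """
    cx = cy = (size - 1) / 2.0
    img = []
    for y in range(size):
        row = []
        for x in range(size):
            d = math.hypot(x - cx, y - cy)  # distance from center
            row.append(math.exp(-((d - radius) ** 2) / (2 * width ** 2)))
        img.append(row)
    return img

alpha = ring_alpha()  # peak brightness sits on the ring, center stays black
```

For a real silhouette you’d compute the distance to the outline instead of to a single center point, but the falloff idea is the same.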
Just did a few fixes and small additions to V-Ray Tuner. Tumblr is all bunged up, so read the changelog on the Creative Crash page.
There is a dearth of workflow information around about using V-Ray with Shave and a Haircut (or Yeti), so I contacted Vlado at Chaos Group about how to use the V-Ray Material Hair 3 material with Shave and a Haircut. In typical amazing Chaos Group support fashion, I got an answer when he woke up the next day. To apply a V-Ray material, you need to apply it to the “shavedisplay” node – select it in the Outliner:
Then, in a viewport, right-click, select Assign New Material and pick V-Ray Material Hair 3. You can then change the hair colour settings, but the material won’t render in place of the default Shave material until you disable “Override Geom Shader” in the material section of the shape node for the selected fur:
Then you’re good to go:
I’ll be updating V-Ray Tuner’s linearize-colour script to work with V-Ray Material Hair shortly. I’ll keep you posted.
V-Ray for Maya has some pretty handy (but slightly hidden) tools that can be added from the Create From V-Ray Plug-in menu, and one of the best is the Color Correction node. It’s basically a grading dialog for textures: it can do quite a bit, and it can save you trips to Photoshop for slight tweaks to hue, contrast, etc.:
The only thing to keep in mind is that the brightness control is a gain slider, not a gamma-based correction, so avoid using it to adjust mid-tones. Also, you can try the other Color Correct plug-in for yourself, but it’s less full-featured than the Color Correction plug-in. Why they are so similarly named is a question for the sages.
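To see why gain is the wrong tool for mid-tones, compare it with a gamma curve on a couple of pixel values. A quick pure-Python illustration (not V-Ray code, and the function names are mine):

```python
def gain(v, amount):
    """Multiplicative brightness: scales every value equally,
    so highlights clip before the midtones move very far."""
    return min(v * amount, 1.0)

def gamma(v, g):
    """Power-curve brightness: lifts the midtones while leaving
    black (0.0) and white (1.0) exactly where they were."""
    return v ** (1.0 / g)

# A midtone and a highlight, brightened both ways:
mid, high = 0.5, 0.95
# gain 1.2:  midtone 0.5 -> 0.6, but the 0.95 highlight clips to 1.0
# gamma 1.5: midtone 0.5 -> ~0.63, while white stays exactly 1.0
```

That clipping behaviour is why the gain slider is fine for overall exposure-style tweaks but a poor stand-in for a mid-tone adjustment.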
V-Ray Tuner 3.4.1 posted for download. There’s not too much new here, but there’s an important fix for the Optimize button script in Maya 2014. Maya 2014 has a bug where “memory -phy” returns the wrong amount of RAM, so, if you’re running 2014, the script asks you to enter your memory amount in GB. Otherwise, the only other new addition is that I’ve added Point Light support to the per-light render script.
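For the curious, here is a sketch of what a scripted fallback could look like on a POSIX host, querying physical RAM directly via sysconf instead of Maya’s broken “memory -phy”. This is my own workaround sketch, not what V-Ray Tuner does (it simply asks you for the number), and it assumes a Linux/macOS system where these sysconf names exist:

```python
import os

def physical_ram_gb():
    """Physical RAM in GB via POSIX sysconf: page size times the
    number of physical pages. A sketch of a fallback for Maya
    2014's broken `memory -phy` query."""
    page_size = os.sysconf("SC_PAGE_SIZE")    # bytes per memory page
    page_count = os.sysconf("SC_PHYS_PAGES")  # total physical pages
    return (page_size * page_count) / (1024 ** 3)
```

On Windows you’d need a different route (e.g. ctypes against GlobalMemoryStatusEx), which is probably why asking the user is the simpler fix.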
This is not a huge update to V-Ray Tuner, but it brings a few fixes and features that affect various things. Here’s the 3.4 changelog:
I am pretty happy with the latest V-Ray Tuner 3.3 update. It adds a number of features that I now use constantly, and I think they’ll be pretty popular. Here’s a breakdown of what’s new in 3.3:
It’s not shown in that video but the release script now has a palette with an option to keep the original swapped-out mats after conversion.
As always, let me know if there’s anything you want added or tweaked in V-Ray Tuner.
If you do compositing in Nuke, you know it can be annoying to get camera data in and out of a 3D package like Maya or Max. Imported camera data quickly gets outdated when new frames are rendered, which forces you to jump back and forth between your 3D app and Nuke. Fortunately, yesterday I discovered a script for Nuke that makes it dead simple to generate camera data from the camera metadata embedded in V-Ray’s EXR-rendered images. After installing the Python script as a menu item, just select a V-Ray EXR render and the script will make a camera node that you can plug into a 3D scene template. It’s amazing:
It even works with animations, meaning you never have to jump back to your 3D app to get new camera data – just kick out a new camera node for the newer EXR frames.
So, here is how you install it. Grab the Python code from the Pastebin site here and paste it into a new plain-text file in your Nuke scripts folder. On OS X and Linux, that’s ~/.nuke/ (where ~/ is your home folder). OS X users can make short work of it by running this in the Terminal after copying the code:
pbpaste > ~/.nuke/createExrCamVray.py
pbpaste is the command-line version of a pasteboard paste in OS X (pbcopy does the copy if you want to pipe to the clipboard). Next, add the script to your Nuke menu bar by appending these lines to the end of Nuke’s menu.py file (for me, it’s located in /Applications/Nuke6.3v4/Nuke6.3v4.app/Contents/MacOS/plugins/menu.py):
m = menubar.addMenu("V-Ray")
m.addCommand("Create Camera from EXR", "createExrCamVray.createExrCamVray()")
The code is wrapping on my blog, so it’s better to grab it from this text file. Once you relaunch Nuke, you’ll be set up.
For a while now, I’ve been mixing V-Ray skies with HDR environments, and I was doing it in Nuke because I didn’t realize it was doable in Maya – that is, until Vlado at Chaos Group suggested using a Blend Colors utility. I should have thought of it myself, but sometimes you forget just how good V-Ray integration is within Maya and assume something like that won’t work. But it does, so here is a quick rundown of how to mix the V-Ray sun/sky environment with an HDR environment light image:
The only trick to this workflow, as shown in the video above, is to disconnect one of the sun/sky environment connections in the Render Settings dialog and then create a file node for the HDR. This ensures that V-Ray makes the proper VRayPlaceEnvTex spherical UV projection; you could do it manually, but it’s easier to let V-Ray do it automatically when the file is connected to the environment slot. Then make a Blend Colors utility, drop the file node and the sky into its two slots, and MMB-drag the Blend Colors utility into the Environment slots. If you would rather have the lights in your IBL added to your sky, use a +/- Average utility so the IBL pixels are added to the sky instead of linearly blended: make two Input 3D connections (the 3D in this context is not XYZ, it’s RGB) and connect the sky and file nodes to the inputs in the Hypergraph or Node Editor:
Then you will have a bright sky and added IBL highlights:
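If you want to see the difference between the two utilities in numbers, here is what each does to a single pixel. The helper names are mine — Blend Colors linearly interpolates, while +/- Average in Sum mode adds:

```python
def blend(sky, ibl, blender=0.5):
    """Blend Colors: linear interpolation -- the result can never
    be brighter than the brighter of the two inputs, so HDR light
    sources get dimmed."""
    return tuple(s * (1 - blender) + i * blender for s, i in zip(sky, ibl))

def add(sky, ibl):
    """+/- Average in Sum mode: the IBL is added on top of the
    sky, so bright light sources keep their full intensity."""
    return tuple(s + i for s, i in zip(sky, ibl))

sky_pixel = (0.2, 0.4, 0.9)  # a blue sky value
ibl_pixel = (4.0, 3.5, 3.0)  # an HDR light source, well above 1.0
```

With a 0.5 blend, the 4.0 IBL highlight drops to around 2.1; with the sum, it lands at 4.2 on top of the sky, which is why the additive setup keeps the IBL highlights bright.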