Fun With Shader Math: Procedural Polar Coordinates, with Triplanar Mixed In

For Coronado, I found myself in a situation where I needed to convert Blender's readily available Generated or Object-derived UV coordinates into polar coordinates, so that a texture would seem to radiate out from a central point rather than be applied along rectilinear XYZ object axes. In particular, I needed this for Coro's saucer.

Pretty certain that this sort of mapping conversion was not only possible, but a relatively (relatively!) easy math problem, I set out to do it. This first required reading up on the math behind converting between cartesian and polar coordinates, and a whole lot of thrashing against those equations before I realized that the direction that seemed to me (and still does, tbh...) to be the sensible way to run the equation was the opposite of what I actually wanted.

What follows is how I did this as it pertains to Coro. I make no guarantee that this will work for any given model without particular tweaking, but the approach behind it is definitely sound and reproducible.

STEP 0: CHOOSE GENERATED OR OBJECT-BASED COORDINATES

This all presupposes you're using either Generated (i.e. UVW space relative to the object's bounding box) or Object (i.e. UVW space relative to the world distance of a surface from a reference object) texture coordinates. I'm sure there are cases where you might want to do this with existing unwrapped UVs, but that wasn't what I was after.

Of important note: Generated vs. Object will give you vastly different mapping scales. Generated always uses normalized coordinates of 0 to 1 across each axis of the bounding box of the object to which the material is applied. Object always uses the offset of each shaded point from the origin of a chosen reference object. When dealing with starships, this is often hundreds of meters, so your scaling will be very different!
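
If it helps to see that difference as math, here's a minimal Python sketch (not Blender code) of the two mappings, assuming an axis-aligned case and ignoring the reference object's rotation and scale (Blender actually transforms the point into the reference object's full local space):

    def generated_coords(p, bbox_min, bbox_max):
        """Generated: position normalized to 0..1 across the object's bounding box."""
        return tuple((p[i] - bbox_min[i]) / (bbox_max[i] - bbox_min[i]) for i in range(3))

    def object_coords(p, ref_origin):
        """Object: offset from the reference object's origin, in scene units (meters)."""
        return tuple(p[i] - ref_origin[i] for i in range(3))

    # A point at the rim of a 251 m saucer gives Object coords on the order
    # of 125, while its Generated coords stay within 0..1.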

For my purposes, I used Object with the object being an invisible bounding box object that all my other objects were already parented to. An Empty would work just as well; mine just happened to be an object.

The following all presupposes Object, which is the approach I recommend. However, it does mean that if you break apart your object at some point (e.g. detachable saucer, destruction scene, whatever), your material will start moving all over the place as the computed distances all change! It may be worth baking out the results of this process before attempting that sort of thing. This write-up will not cover that process (because I haven't done it myself yet!).

STEP 1: DETERMINE DIMENSIONS

You need to determine a few dimensional values.

First, you need to know where you want your polar coordinate origin to be relative to the normally-calculated center point for the material. This is where Object space really helps, because you can just use scene distances. With Generated, you're suddenly in the realm of calculating percentage along a given object and it becomes a pain. In the case of Coronado, the saucer center is 79m forward of the parent object's origin, so I created an Input node with a value of 79 and named it Polar Origin.

Second, you need to determine the overall size of the disc you want your polar coordinates to convert into. In my case, this is the width of Coro's saucer, or 251m, so I created another Input node with a value of 251 and called it Polar Map Width. You could potentially script both of these nodes to fetch calculated values from Blender directly, but I didn't need that level of precision.
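
If you did want to script them, a minimal bpy sketch might look like the following -- the "Hull Material" and "Saucer" names are placeholders for your own scene, and the node type is Blender's Value node (found under Add > Input):

    import bpy

    # Placeholder names -- swap in your own material and object.
    mat = bpy.data.materials["Hull Material"]
    nodes = mat.node_tree.nodes
    saucer = bpy.data.objects["Saucer"]

    origin = nodes.new("ShaderNodeValue")
    origin.label = "Polar Origin"
    origin.outputs[0].default_value = 79.0  # meters forward of the parent origin

    width = nodes.new("ShaderNodeValue")
    width.label = "Polar Map Width"
    # dimensions gives the object's world-space bounding box size
    width.outputs[0].default_value = saucer.dimensions.x  # ~251 m in Coro's case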

STEP 2: POLAR PREPROCESSING GROUP

Before we can do our polar conversion, we need to prepare the Object coordinates for that conversion.
  • Take the Object vector output of your Texture Coordinate node and plug it into a Mapping node set to Point coordinates. Call this node Remapped Polar Origin.
  • Take the Polar Origin input and plug it into the appropriate channel of a Combine XYZ node and call it Polar Origin Offset Vector. In my case, I wanted to adjust the polar origin forward (i.e. +Y) of the object center, so I plugged it into the Y channel. This node has now created a vector output for us that we can plug into other vector input channels. Plug the Polar Origin Offset Vector into Remapped Polar Origin's Location input.
  • Create another Combine XYZ node and call it Polar Map Width Vector. Plug the Polar Map Width into the Z coordinate value.
  • Create a Math node and set it to Divide. Call it Polar Map Radius. Plug the Polar Map Width node into its first value and set the second value to 2: we want to cut the overall width in half to turn it into a radius.
  • Then, plug the output of this Math: Divide node into the X and Y channels of the Polar Map Width Vector node. This is highly dependent on your object: in my case, I basically wanted the X and Y axes to become my polar axes, because the saucer viewed from above/below is a circle; the Z axis was its own thing (and will govern "depth" of our procedural textures later).
  • Create another Mapping node and plug the vector output of the Remapped Polar Origin node into this one. Call this node Remapped Polar Scale. Plug the Polar Map Width Vector into this node's Scale input. Why not plug it into the previous Mapping node and cut down on the number of mapping nodes? Because then you have to do more math, frankly. If you transform the scale and the location at the same time, you have to make sure your location accounts for the changes in scale, which means you'd have to premultiply your position and scale vectors together and all sorts of nonsense. It's much simpler to just use the two chained mapping nodes.
  • Create another Input node and call it Polar Rescaler. Give it a value of 16000. You'll definitely want to tweak this later, and it will likely also be object- and texture-dependent, but this basically helps us account for the difference in procedural scale between triplanar cartesian coordinates and the resulting polar coordinates.
  • Plug Polar Rescaler into all three inputs of a Combine XYZ node to turn it into a vector. Call it Polar Rescaler Vector.
  • Create a Vector Math node and plug the output of Remapped Polar Scale into the first input and the output of Polar Rescaler Vector into the second input. Set it to Divide -- this will have the effect of enlarging the resulting procedural scale. Again, you'll likely want to tweak the value to taste.

At this point, I'd recommend selecting the following nodes and turning them into a single node group: Polar Origin Offset Vector, the Divide node feeding into Polar Map Width Vector, Polar Map Width Vector itself, Polar Rescaler Vector, Remapped Polar Origin, Remapped Polar Scale, and the Vector Math node at the end of the chain. You should have as inputs the incoming Vector from your object texture coordinates, your Polar Origin value node, your Polar Map Width value node, and your Polar Rescaler node. You should have a single vector as an output. I called this group Rect2Polar Preprocessing.
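
To see the whole chain at once, here's a Python sketch of what the group computes, assuming the Mapping node's Point type applies scale before location (out = in * scale + location) and that nothing is rotated; signs and axes will depend on your own model:

    def polar_preprocess(v, polar_origin=79.0, polar_map_width=251.0, polar_rescaler=16000.0):
        x, y, z = v
        # Remapped Polar Origin: shift the coordinates so the saucer center
        # lands at (0, 0) in XY. Axis and sign depend on your model; Coro's
        # offset is along Y.
        y = y + polar_origin
        # Polar Map Width Vector: the radius on X and Y, the full width on Z.
        radius = polar_map_width / 2.0
        # Remapped Polar Scale: per-axis multiply by that width vector.
        x, y, z = x * radius, y * radius, z * polar_map_width
        # Final Vector Math: Divide by the Polar Rescaler to bring the values
        # into a useful range for procedural textures (tweak to taste).
        return (x / polar_rescaler, y / polar_rescaler, z / polar_rescaler)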

[Image: 02-polar_preprocessing_group.jpg]

STEP 3: POLAR COORDINATE CONVERSION

This is where the real magic, and the real math, happens. We need, essentially, three pieces of information for any given coordinate: what is the corresponding polar radius (r), what is the corresponding polar angle (theta), and are we in the "top half" (Y >= 0) or "bottom half" (Y < 0) of the cartesian grid. Again, your object may vary here, if you're not doing a top-down circle in the XY plane.

The equation for r is sqrt(x^2+y^2). The equation for theta is +/- arccos(x/r). We're going to create node networks to compute these.
  • Create a Separate XYZ node and plug the output of the above group node into it. This gives us separate X, Y, and Z channels to mess with.

First, we'll compute r:
  • Plug the X output into a Math: Power node, with the exponent set to 2. This is x^2.
  • Plug the Y output into a Math: Power node, with the exponent set to 2. This is y^2.
  • Plug the output of these two nodes into a Math: Add node. This is x^2+y^2.
  • Plug the output of that node into a Math: Square Root node. This is r.
  • Create a frame around the Power nodes, the Add node, and the Square Root node, and call it "r computation" or similar.

Next, we're going to create "masks" for Y >= 0 and Y < 0.
  • Plug the Y output from the Separate XYZ node into a Math: Greater Than node, with a threshold of 0.
  • Also plug the Y output into the first Value input of a Math: Compare node, with the other Value and Epsilon set to 0. This is the "or equal to" part of "greater than or equal to".
  • Also also plug the Y output into a Math: Less Than node, with a threshold of 0.
  • Plug the output of the Greater Than and Compare nodes into a Math: Add node.
  • Create a frame around the Greater Than, Compare, and Add nodes and call it "Mask Y >= 0"
  • Create a frame around the Less Than node and call it "Mask Y < 0"

Next, the big money: theta.
  • Plug the X output from the Separate XYZ node into a Math: Divide node. Plug the output of the "r" node into the second Value input.
  • Plug the output of this Divide node into a Math: Arccosine node.
  • Plug the output of the Arccosine node into a Math: Multiply node, with the other value being -1. This gives us the "or negative" part of our "positive or negative" sign in the theta equation.
  • Plug the output of the Arccosine node into another Multiply node, with the other value being the output of the Mask Y >= 0 frame's Add node.
  • Plug the output of the Math * -1 node into another Multiply node, with the other value being the output of the Mask Y < 0 frame's Less Than node.
  • Plug the output of these latter two Multiply nodes into a Math: Add node. We have just created theta.
  • Select the Divide node, the Arccosine node, the three Multiply nodes, and the Add node and put it in a frame called "Theta".

And finally:
  • Create a Combine XYZ node and plug Theta into the X channel, "r" into the Y channel, and connect Z from the original Separate XYZ node into the Z channel.

Congratulations: you've just created polar coordinates!

You may now want to group all of the above into a single group node consisting of: the Separate XYZ node, the "r" frame, the "theta" frame, the two Mask Y frames, and the Combine XYZ node. You should have a single Vector input and single Vector output. I called this group "Rect2Polar".
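
And here's the math of that group as a Python sketch -- the r = 0 guard is mine; the node version quietly survives the exact center point because Blender's Divide node returns 0 when dividing by zero:

    import math

    def rect2polar(v):
        """Mirror of the node group: (x, y, z) in, (theta, r, z) out."""
        x, y, z = v
        r = math.sqrt(x**2 + y**2)
        # Clamp guards against tiny float error pushing x/r past 1.
        base = math.acos(max(-1.0, min(1.0, x / r))) if r > 0 else 0.0
        # The two masks select the sign: +arccos when y >= 0, -arccos when
        # y < 0, recovering the full range of angles around the circle.
        theta = base if y >= 0 else -base
        return (theta, r, z)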

[Image: 03-polar_coordinates.jpg]

STEP 4: TRIPLANAR MIXING

If you just have a single object that you want polar coordinates on, you're pretty much done. Odds are, though, you want some parts of your object to have polar coordinates and some to have, say, triplanar mapping. In my case, I had a saucer where I wanted the polar coordinates, and then I wanted triplanar coordinates on the rest of the ship. There are several ways to achieve this, and in my case I actually combined a couple of them.

STEP 4a: UV THRESHOLD MASKING

For one thing, I had organized my actual unwrapped UVs such that all of the saucer "stuff" (relevant to this, anyway) existed in a single UDIM span: UDIM 1002, which occupies the UV range (2,0) to (3,1). That meant I could use whether or not the UVs of the object the material is applied to fall within that range as a "mask":
  • Create a Texture Coordinate node.
  • Plug the UV output into a Separate XYZ node.
  • Create four Value nodes: U Low Threshold, U High Threshold, V Low Threshold, and V High Threshold. Set the values as appropriate (in my case, 2, 3, 0, and 1, per above).
  • Plug the X output -- really the "U" output -- into both a Math: Greater Than node and a Math: Less Than node. The Greater/Less Than node returns true (1) for any value where the input is over/under the threshold and false (0) otherwise. It just so happens that 1 and 0 might also be called "white" and "black" -- exactly what we want as a mask!
  • Plug the U Low Threshold value into the second input of the Greater Than node and the U High Threshold into the second input of the Less Than node.
  • Multiply these two nodes together, so that we get only the true (1/white) values where they overlap; everything else goes to 0.
  • Repeat the above with the Y output -- really, the "V" output -- to create a V mask.
  • Multiply the U mask and V mask values together, further cutting down on their "true" space so that it's only in the UV range specified above.
  • For fun, you can also plug the output of this into the second input of a Math: Subtract node, with the first input set to 1, to create the "inverse" mask.

You may want to group the above: select the Texture Coordinate node, the Separate XYZ node, the U Mask nodes, the V Mask nodes, the final Multiply node, and the optional final Subtract node. You want as inputs each of the U/V High/Low Thresholds and as outputs both the Mask (from the Multiply node) and the Inverse Mask (from the optional Subtract node). I called this group "UV Threshold Mask".
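
For reference, the same logic as a Python sketch (not the node group itself, just its math):

    def uv_threshold_mask(u, v, u_lo=2.0, u_hi=3.0, v_lo=0.0, v_hi=1.0):
        """1.0 inside the UDIM range, 0.0 outside, plus the inverse."""
        u_mask = float(u > u_lo) * float(u < u_hi)
        v_mask = float(v > v_lo) * float(v < v_hi)
        mask = u_mask * v_mask
        return mask, 1.0 - mask  # (Mask, Inverse Mask)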

[Image: 04-uv_threshold_mask.jpg]

STEP 4b: COMBINING A MASKING TEXTURE

The above wasn't quite enough, however: I also wanted to mask out areas of the saucer map where it reverted to triplanar: namely, the upper shuttlebay and Specter bay housing. To do this, I had to succumb to using an actual white/black image mask painted to correspond to these areas on the unwrapped UVs. There's no special magic to this; it's just a white/black mask image that happens to line up with the correct UV space.

To combine with the above, simply take the output of the mask Image Texture and Add it -- with Clamp enabled -- to the UV Threshold Mask group's Inverse Mask output. I called this node Triplanar Mask.
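
In the same shorthand, where painted stands in for the value sampled from that black/white image at the shading point:

    def triplanar_mask(inverse_mask, painted):
        """Clamped Add of the UV inverse mask and the painted image mask."""
        # Both inputs are already 0..1, so only the upper clamp matters.
        return min(1.0, inverse_mask + painted)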

STEP 5: CREATING THE COMBINED TEXTURE COORDINATES

To make use of this combo triplanar + polar setup, you now need to mix all your vectors together properly.
  • Create a Mapping node and call it Triplanar Vectoring.
  • Plug the original Texture Coordinate node pointing to your Object (from step 0) into the input Vector of the Triplanar Vectoring node. You should also already have this vector input going to your Rect2Polar Preprocessing group, which feeds into your Rect2Polar group.
  • Create two Vector Math: Multiply nodes and a Vector Math: Add node.
  • Into the first Vector Math: Multiply node, plug in your Rect2Polar vector as the first input.
  • Create a Math: Subtract node, with Clamp enabled, and set the first value to 1. Plug the Triplanar Mask into the second value. Call this "Polar Mask". (Yes, this is somewhat redundant with the actual Mask output from the UV Threshold Mask node and it could probably be done more efficiently!)
  • Plug the Polar Mask node into the second Vector input of the first Vector Math: Multiply node.
  • Plug the Triplanar Vectoring node into the first input of the second Vector Math: Multiply node.
  • Plug the Triplanar Mask node into the second input of the second Vector Math: Multiply node.
  • Plug the two Vector Math: Multiply nodes into the Vector Math: Add node.

Congratulations, you now have a masked triplanar + polar mapping vector you can plug into anything your heart desires!
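
Boiled down, this whole step is just a linear blend between the two coordinate sets. A sketch in the same shorthand:

    def mixed_coords(polar_vec, triplanar_vec, tri_mask):
        """Blend the two coordinate sets using the Triplanar Mask."""
        polar_w = min(1.0, max(0.0, 1.0 - tri_mask))  # the clamped "Polar Mask"
        return tuple(p * polar_w + t * tri_mask
                     for p, t in zip(polar_vec, triplanar_vec))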

[Image: 05-triplanar_polar_combo.jpg]

STEP 6: GO WILD!

As you can see from the above, however, my actual setup gets even more complex, in part because I'm dealing not just with one set of coordinates, but another ever-so-slightly offset set of coordinates, which I use to create ultrafine panel lines, as discussed here.
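
I won't rehash that discussion here, but the gist of a dual-offset setup like this, sketched in the same shorthand (the differencing and the 0.002 offset are illustrative, not necessarily the exact node math):

    def panel_lines(voronoi, coords, offset=0.002):
        """Sample the same pattern at two nearly identical coordinates:
        the difference is ~0 inside a cell and jumps wherever the two
        samples land in different Voronoi cells, giving thin lines."""
        a = voronoi(coords)
        b = voronoi(tuple(c + offset for c in coords))
        return abs(a - b)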

[Image: 06-dual_voronoi_group.jpg]

...and then repeat that four times, each with slightly rejiggered mapping, all mixed together, filtered through color ramps and more...

[Image: 06-offset_voronoi_setup.jpg]

There's all sorts of crazy stuff you can do with it. I leave all of that...to your imagination!