Friday, October 27, 2017

Photogrammetry Workflow by Unity

A comprehensive document about photogrammetry. I think we are witnessing the beginning of a new era of computer graphics, because photogrammetry is a way to capture surface and color at the same time. Previously we could digitize only the surface and had to collect color information separately. The future of the process, I think, is to capture information about the surface and the material (meaning more than just RGB values).

Unity Photogrammetry Workflow

By the way, I tested Nuke as a photogrammetry tool and it was impressive.

Wednesday, October 4, 2017

Industry news - Maya 2018 and Nuke 11

Okay, this is old news by now, but I think it is worth mentioning a couple of things. Nuke and Maya now seem to announce their latest releases at almost the same time. I can't say there is huge development going on. I miss the feeling when a software release hits me and I say "waaoooo amazing! I can't wait to try the new features".


Maya 2018

It's a little bit strange that there is no official announcement video; Maya 2018 was released very quietly. Maybe Autodesk knows there is no reason for hype. So here is what I found:



The devil lies in the details. I guess they were tired of creating fancy feature videos, but if we go through the user guide we can find a lot of useful stuff, especially in connection with modelling. Still no breakthrough with Bifröst, though. There is no sign of a general procedural workflow inside Maya. The (new) render setup system still seems not to work with references, so it is kind of worthless.

Nuke 11

And the Nuke announcement is also a bit woozy (or something like that, and the music is shitty as well).


So there is a new Lens Distortion node, the Precomp node has a new name, LiveGroup, and there is background rendering. What else? I guess the VFX Reference Platform is interesting for big studios and pipeline TDs, but not for artists.
I guess the main problem with Nuke is that it is already a complete piece of software. It is probably the best choice for film compositing (are there competitors? Maybe Fusion?). There is no obvious direction in which to develop Nuke, because it would overlap with other Foundry software like Modo and Mari. From a business point of view there is no reason to develop, for example, a better modelling toolkit for Nuke to improve the photogrammetry, re-modelling, projecting and texturing workflow; there is Modo and Mari for that. But for an artist it is always better to use one piece of software for related tasks, saving "tons of time" on exporting and importing, naming, versioning and converting files.



Friday, June 2, 2017

Making of Adam - The VFX

You have certainly seen this short film:



What you may not have seen is this article about the making of the VFX. It is quite a comprehensive write-up (I wish I could write something like this):

Adam – VFX in the real-time short film


Saturday, May 20, 2017

Maya Tricks - Maya Pivot

If you want to know everything about pivot points in Maya, you should check this video:




Maya Tricks - Measure rgba value range

Let's say we have a file texture and we want to adjust its color. We have plenty of options for that. I think the most sophisticated is the Color Remap, which is similar to Photoshop's or GIMP's Curves tool or Nuke's ColorLookup node, although Maya's version is less user friendly in my opinion.


Maya has an option to build a node graph that recolors the picture based on its luminance. We can create the node graph simply by clicking the Color Remap button under the file node's Effects rollout.


The node graph is going to look like this.


The ramp node has a default black-to-white gradient. We can change the black to blue, for example, and it will turn the darkest colors of the picture blue. A picture like this:


 with a ramp like this:


is going to result in a picture like the one we can see below:



The simplest and most obvious way to change colors is to use the file node's built-in Color Balance options.


These options have limited capabilities of course. We can plug textures into Color Offset or Color Gain, for example. Color Gain multiplies the texture file's color, and Color Offset adds to or subtracts from the input color. Maya also has utility nodes for the basic mathematical operations: multiplyDivide and plusMinusAverage.
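To illustrate the idea, here is a minimal Python (maya.cmds) sketch that rebuilds the Color Gain / Color Offset behaviour with those two utility nodes. The node names and the gain/offset values are just examples:

    import maya.cmds as cmds

    # example file texture; any 2D texture node would do
    file_node = cmds.shadingNode('file', asTexture=True, name='myTexture_file')

    # Color Gain is a multiplication: outColor * gain
    mult = cmds.shadingNode('multiplyDivide', asUtility=True, name='gain_mult')
    cmds.connectAttr(file_node + '.outColor', mult + '.input1')
    cmds.setAttr(mult + '.input2', 2.0, 2.0, 2.0, type='double3')

    # Color Offset is an addition: (outColor * gain) + offset
    add = cmds.shadingNode('plusMinusAverage', asUtility=True, name='offset_add')
    cmds.connectAttr(mult + '.output', add + '.input3D[0]')
    cmds.setAttr(add + '.input3D[1]', 0.1, 0.1, 0.1, type='double3')

    # offset_add.output3D now carries the adjusted color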

But when we adjust the color of the original texture, especially when we use Color Gain (multiplication) or Color Offset (addition), we modify the color range of the picture. As far as I know Maya has no tool to measure the color range of a picture. It would be useful when we use the modified map to emit fluid, for example, and especially when we use animated maps. We have the setRange node, but there should be an option to copy the current range values into its oldMin and oldMax fields.
There might be a Python (PyMEL or API) trick to achieve that. Let me know if there is a method to query the color range of a (texture) node's output.
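One workaround I can think of is to brute-force sample the texture with colorAtPoint and copy the measured range into a setRange node. A minimal sketch (maya.cmds), assuming the texture is a file node named myTexture_file (a hypothetical name); since it samples a fixed UV grid, the result is only an approximation:

    import maya.cmds as cmds

    # sample the texture on a 64x64 UV grid; returns a flat [r, g, b, r, g, b, ...] list
    samples = cmds.colorAtPoint('myTexture_file', output='RGB',
                                samplesU=64, samplesV=64)
    reds, greens, blues = samples[0::3], samples[1::3], samples[2::3]

    old_min = (min(reds), min(greens), min(blues))
    old_max = (max(reds), max(greens), max(blues))

    # copy the measured range into a setRange node
    # (min/max, the target range, still have to be set by hand)
    set_range = cmds.shadingNode('setRange', asUtility=True, name='measured_setRange')
    cmds.setAttr(set_range + '.oldMin', old_min[0], old_min[1], old_min[2], type='double3')
    cmds.setAttr(set_range + '.oldMax', old_max[0], old_max[1], old_max[2], type='double3')

For animated maps the sampling would have to be repeated (or scripted) per frame.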

I found an easy way to visualize the color range of a picture using the SOuP plug-in. It provides the textureToArray node. With that we can sample a texture and plug the result into SOuP's peak deformer, so basically we can create a displace modifier. Unfortunately the (new) textureDeformer has limited options for achieving that effect, and heightField has the same weakness.

To display our color range with SOuP nodes we can use a polyPlane as the base geometry. We have to set up the following network:


It is important to check the accurateSampling option on the textureToArray node. That ensures the whole texture network is sampled.


In this case I sample the luminance, using the luminance node, and the output is the alpha channel. So the peak deformer uses the luminance values to offset the vertices.
If we apply the color as a surfaceShader to the plane, the result will look like this.




Of course the amount of the peak is adjustable on the peak deformer.



This setup helps to visualize the RGBA values, or in this case the luminance values. Since we use color management in Maya, we can sample the real color values (not only 0-1) in the Maya viewport (or anywhere else). So this setup gives us a hint about which areas have the highest or lowest values.


We can get further if we query the list of Y values of the vertices and then get the color at the lowest and highest ones with the colorAtPoint MEL (or Python) command.
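A minimal sketch of that idea (maya.cmds), assuming the plane is pPlane1 and the sampled texture is myTexture_file (both hypothetical names):

    import maya.cmds as cmds

    plane = 'pPlane1'
    tex = 'myTexture_file'

    # world-space Y (height) of every vertex of the displaced plane
    count = cmds.polyEvaluate(plane, vertex=True)
    heights = [cmds.pointPosition('%s.vtx[%d]' % (plane, i), world=True)[1]
               for i in range(count)]

    for label, idx in (('lowest', heights.index(min(heights))),
                       ('highest', heights.index(max(heights)))):
        vtx = '%s.vtx[%d]' % (plane, idx)
        # convert the vertex to its UV (take the first one), then read the texture color there
        uv = cmds.ls(cmds.polyListComponentConversion(vtx, fromVertex=True, toUV=True),
                     flatten=True)[0]
        u, v = cmds.polyEditUV(uv, query=True)
        color = cmds.colorAtPoint(tex, output='RGB', coordU=u, coordV=v)
        print('%s vertex %d -> color %s' % (label, idx, color))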

Wednesday, April 26, 2017

Blender 2.8 Project - Eevee Roadmap

I can't wait to try Blender's new viewport render engines, especially the so-called Eevee. It stands for Extra Easy Virtual Environment Engine and basically means we can render good quality in real time. Of course it depends on our graphics card.
I really don't understand why we should use Cycles to achieve a cartoon-style look, as is covered in this video. I guess the reason is that the current viewport OpenGL render is quite limited. But raytracing is not an alternative for me if we are talking about cartoon style; the new Eevee engine could be.

Check this out here

This video also covers the topic


Thursday, March 16, 2017

The benefits of using Blender - Bug reporting and solving

On a sunny morning at the beginning of March I found that if I hit OpenGL render active viewport in the Video Sequence Editor, Blender crashed. I tried it in different ways and realized that the reason was that the scene had a Grease Pencil drawing. It was a bug. I was kind of a newbie with Blender, but I had the internet, so I searched for how to report bugs and found this article on BlenderNation:
Reporting Bugs in Blender 

I just followed the instructions. I had to register on developer.blender.org. I reported the bug on the 1st of March (2017), within a day I got the confirmation, and on the same day there was a new daily build which did not produce the crash. Pretty amazing, isn't it? Which commercial software company can compete with this?
Btw, the latest daily build can be found here: https://builder.blender.org/download

Friday, March 10, 2017

Fabric Engine - GPU - MPC

It is a presentation from 2015, but it is still shocking how revolutionary the tools are that MPC developed "hand-in-hand" with Fabric Engine.


The background story is here.

Wednesday, February 15, 2017

Naming Convention 3. - Files and Folders 3.

Continuing from the previous chapter, we have to further analyze the structure of a VFX production.

201401_OBP/
    010_in/
    020_preProd/
    030_assetProd/
    040_shotProd/
    050_out/

Just to mention it: there could also be a folder hierarchy based on the kind of workflows, like edit, paint, composition, or hybrid ones like 2D and 3D. I do not recommend that, because of the redundancy and the time we would have to spend jumping, for example, from 2D to 3D or from paint to composition, since we usually work on one shot at a time. There is an exception:

Edit
Editing as a workflow is an exception, because this is the workflow that rules them all, so it should be at the top level of the structure. A VFX studio is usually not responsible for the editing, but it is very important to be able to watch the VFX shots in context. So there has to be editorial information: pre-cuts or rough-cuts, assembled sequences, etc. It is a huge topic, so I might discuss it later. The conclusion is the following:

201401_OBP/
    010_in/
    020_preProd/
    030_assetProd/
    040_shotProd/
    050_edit/
    060_out/

In
Before we get to the fun part, it is time to consider what I tend to call the secondary stuff: documents, storyboards, audio, references, etc. Where should we put them? For the in folder the definition was: things which were not created in-house during the project. Being a VFX studio, it is most likely that all these types of things should be placed under the in folder. For a small production I do not recommend using sub-folders (flat vs. deep structure; flat is the winner, I think), but in this case I use sub-folders to demonstrate it.

201401_OBP/
    010_in/
        010_script-storyboard/
        020_documents/
        030_references/
    020_preProd/
    030_assetProd/
    040_shotProd/
    050_edit/
    060_out/
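If you like scripts, a minimal Python sketch to bootstrap this skeleton could look like the following (the folder names simply mirror the example above; adjust them to your own convention):

    import os

    PROJECT = '201401_OBP'
    FOLDERS = [
        '010_in/010_script-storyboard',
        '010_in/020_documents',
        '010_in/030_references',
        '020_preProd',
        '030_assetProd',
        '040_shotProd',
        '050_edit',
        '060_out',
    ]

    for folder in FOLDERS:
        # exist_ok makes the script safe to re-run on an existing project
        os.makedirs(os.path.join(PROJECT, folder), exist_ok=True)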

Another very important rule for the in folder: never (never, never) reference files from here. If you see somebody referencing an image from here in a Nuke script, for example, you should fire him or her, or ask your boss to do it. And of course you should be promoted in recognition of that.




Thursday, January 12, 2017

Switch to Blender - Forget the Maya (or 3DS Max) input preset

So much stuff is going on. I have a lot of topics I want to write about: naming conventions, pipeline stuff, etc. But right now the most important thing for me is that I have made the first big step toward Blender. Really :)

The first and very important thing I had to do to get on the right track was to use the default Blender interaction preset instead of the Maya one.


By doing that I got a feeling for what Blender really is, instead of missing certain shortcuts I used in Maya and getting the feeling that Maya works better for me. Of course, using the default Blender interaction I have to learn tons of new things, but the more I learn about Blender, the less I feel that Maya is better. So it is the beginning of a long journey :)

Later I saw a Blender Conference presentation by Tangent Animation, in which Jeff Bell said exactly what I experienced about the Blender vs. Maya interaction... so I know I'm really on the right track.