[Maya] NURBS and Polygons

Demonstrating different techniques for creating objects in Maya.

Intersections and booleans are easier with NURBS, as they use splines to generate shapes.

The spheres below were created via NURBS intersections and the “trim” tool.

You can create complicated shapes with NURBS intersections and then convert them to polygons if needed.

Below shows spheres created in various ways, then converted into polygons with different tessellation methods.

To demonstrate the different wireframes, I merged a wireframe node with my shader.

Cheese

Using boolean operations on a cube to create a block of cheese

I added some rigid bodies and effects to make it a bit dynamic. Not particularly realistic but interesting.

[Nuke] Colour Correct

Colour correcting a merge

Before

Grade

Colour Correct – changing the values of each hue based on the data from the background image.

Using the yellows of the horizon for the highlights, and the blues as the shadows.

Colour correct with a ramp mask

Toe node to lift the black levels

Before unpremultiplying the alpha

Final comp

Final plate

I also added the shadow of the plane in the water, using a merge and a duplicate of the plane’s Read node. I tried to match the direction of the shadows, but I’m not sure its position is correct.

[Nuke] Rotoscoping

Roto practice with channels
Demonstrating Rotoscoping

Through use of the “roto” node in Nuke, I made various curves tracing out the moving man.

Separate components of the man are roto’d individually, adjusting the curves every few frames to ensure that the roto curve matches the man’s movement in the frame.
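Conceptually, a roto shape becomes a matte by testing, pixel by pixel, whether the pixel centre lies inside the closed curve. A minimal even-odd point-in-polygon sketch in plain Python (a toy illustration, not Nuke’s actual implementation, which handles animated Bézier shapes, feathering and anti-aliasing):

```python
def inside(point, polygon):
    """Even-odd rule: count how many polygon edges a ray cast to the
    right of the point crosses; an odd count means the point is inside."""
    x, y = point
    crossings = 0
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses that line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                crossings += 1
    return crossings % 2 == 1

# A unit-square shape: pixels inside are matte-white, outside matte-black
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(inside((0.5, 0.5), square))  # True
print(inside((1.5, 0.5), square))  # False
```

Rasterising this test over every pixel yields a hard black-and-white matte; a real roto node also softens and feathers the edge.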

[Nuke] Colour Correcting

After learning about concepts including the rule of thirds when shooting, I recorded a juggling sequence.

I used 3 light sources in an attempt to recreate studio style lighting. One was a desk lamp used as a key light for the subject.

Another desklamp as a backlight.

And a flood light, shone against a mattress as a fill light – in the hopes of creating a nice soft fill.

Before

After

Using nuke I proceeded to colour correct the footage.

Firstly I applied a grade, followed by a colour correct with a ramp mask (to mimic the light source).
I then adjusted the exposure and hue-corrected the footage.
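Conceptually, a grade is a per-channel transfer function over lift, gain and gamma. A simplified plain-Python sketch (an illustrative formula, not Nuke’s exact Grade node, which also exposes blackpoint, whitepoint, multiply and offset controls):

```python
def grade(value, lift=0.0, gain=1.0, gamma=1.0):
    """Simplified lift/gamma/gain: lift raises the blacks, gain scales
    the whites, and gamma bends the midtones."""
    v = value * (gain - lift) + lift
    return max(v, 0.0) ** (1.0 / gamma)

print(grade(0.0, lift=0.1))   # 0.1 -> blacks are lifted
print(grade(1.0, gain=0.8))   # 0.8 -> whites are pulled down
```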

It is then important to “clamp” the footage after all of the adjustments, to ensure its values stay within the legal range: the blackest blacks and whitest whites must not overshoot 0 and 1.
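In plain Python (an illustration, not Nuke’s implementation), the clamp step amounts to:

```python
def clamp(value, lo=0.0, hi=1.0):
    """Constrain a channel value to the displayable [lo, hi] range."""
    return max(lo, min(hi, value))

# After heavy grading, channel values can overshoot the legal range:
pixel = [1.37, 0.5, -0.02]            # R, G, B after colour correction
clamped = [clamp(c) for c in pixel]
print(clamped)  # [1.0, 0.5, 0.0]
```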

Ramped mask on colour correct.

[Maya] The Room Continued – Chair

Process refined using techniques used for the table.

Cracked plank

Using the multi-cut tool to create geometry for the cracked plank
UV mapped.

It was difficult to understand what the inside of the crack would look like and how the UVs should behave, particularly in terms of the concentric rings inside the wood. I researched pictures of wood cracks to ensure realism.

Sculpted geo AFTER ensuring the UVs are correct!

Using the bend tool and the soft select brush to make the crack a bit more realistic.

Using the bend tool to distort the back support of the chair.

A unique bolt was needed for the crack, as the inside of the crack reveals the bolt which has “cracked” the plank. Created with reflected geometry and extrusion.

Two variations of the chair, one with the crack and one without. I reused a lot of geometry, but used Duplicate Special with a -1 scale in the x direction to create semi-unique mirrored components.
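Duplicate Special with a -1 x scale is just a mirror across the YZ plane. The effect on vertex positions can be sketched in plain Python (an illustration, not Maya’s API):

```python
def mirror_x(vertices):
    """Mirror geometry across the YZ plane by negating each x coordinate."""
    return [(-x, y, z) for (x, y, z) in vertices]

# Three vertices of a chair component, mirrored to build its opposite side
original = [(1.0, 0.0, 0.5), (1.2, 2.0, 0.5), (0.8, 2.0, -0.5)]
print(mirror_x(original))  # [(-1.0, 0.0, 0.5), (-1.2, 2.0, 0.5), (-0.8, 2.0, -0.5)]
```

Note that a negative scale also reverses the face winding order, so the normals of the mirrored copy usually need flipping (or the transform freezing) afterwards.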

Rendered chair

Table and chairs composed and arranged.

[Maya] The Room™ Continued…

Table continued…

Learnt that reusing some geometry is useful for speed. However, when geometry is reused and then re-scaled, the UVs become distorted, so it is often necessary to recreate the geometry and map UVs onto it afresh.

Bolt

Created sphere and removed unnecessary faces.
Applying materials and textures to bolt to create a metal like effect.
Hypershade with “Metalness” and “Specular Roughness” nodes.

“Metalness” tells the shader where the material is metal-like and where it is not, i.e. where it is rusted.

Placing bolts onto table
Placed bolt

Sculpted geometry on the legs to make it look like the bolt has been “hammered” into the leg.
Table with bolts rendered.

[Maya] The Room™

Room development.

Plank

Creating the plank and UV mapping it. Separated the UVs for the edges and adjusted the texture in Photoshop.
Using sculpt tool to sculpt geometry. Making it look more organic.
Creating room and placing objects within – using materials.
Creating window and light portal

Creating sink

Improving table

UV mapping
Sculpting geo
Texture mapped with different texture for ends – including displacement and bump mapping.

Used techniques and tools such as the edge loop tool, the sculpting tool, extrusion, point snapping, changing the pivot point, snapping the pivot point to an edge, and the rotate tool with snapping.

[Maya]

During this session we went over the basics of Maya: learning the environment, workspace and tools, and how the program works.

We learnt that Maya is essentially a programming language with a GUI, and how it provides a tool set that allows for the realisation of visuals. The fact that it’s set up in this way can mean that it is very buggy and can often crash!

We learnt the basics of manipulating the environment, hotkeys, the move tool and object manipulation, plus extruding and how to create edge loops.

We went over the channel box and its inputs. This is crucial when creating topology, as it defines how the object is first realised in the scene and can save a lot of work later on.

We went over how to apply shaders to objects, how Arnold’s basic materials work, and how to apply them to objects.

The following image shows the various ways in which something as simple as the globe can be mapped onto a 2D space. UV mapping is the reverse of this: taking a 2D image and, as best as possible, getting it onto a 3D object.

We went over some basic UV mapping: the process of laying a 2D texture onto a 3D object, an essential and fundamental skill in modelling.

Applied shaders and bump mapping to a Buddha, lit with an HDRI. An HDRI captures many samples of light to create realistic ray-traced lighting, which, interacting with a shader, creates an incredibly realistic image, especially with the Arnold renderer.

Christmas Assignment

WEEK 2 Assignment

Runner

The role of Runner is often the first job that someone will have in the VFX industry. A runner will support all the other members of the team by acting as a conduit between colleagues and departments. Their day will be a mixture of errands, passing on messages, picking up and delivering materials, office admin, organising diaries and meetings and generally supporting in whichever way possible. Whilst doing this they have the chance to watch and learn from their experienced colleagues.

Prep artist

The prep artist prepares footage of backgrounds so that it is ready for effects to be added eg figures, crowds, creatures, explosions, etc. They use VFX software to remove any marks or dust from the shots (or ‘plates’) and also remove any live-action support equipment such as harnesses or microphones that may have been used in capturing the live footage. Once this is done the compositors can layer on the desired foreground effects. Below is an image of artist Anton Egorov at work.

https://www.premiumbeat.com/blog/how-an-average-vfx-pipeline-works/

Roto artist

The roto artist takes parts of live-action shots by drawing round them and cutting them out, to use with different images: live-action, CGI or a combination. The required cut-outs are called ‘mattes’ and can be integrated with different backgrounds and effects in multiple ways. The image below shows stages of the process from a scene in A Scanner Darkly (2006).

WEEK 3 Assignment

Modelling Artist

The modelling artist creates 3D computerised imagery, usually from ideas given to them by a concept artist or from other reference materials such as photos. They will use the concept artist’s brief and drawings, paintings or 3D models to create a digital 3D version of an image such as a person, animal, plant or vehicle. The initial image is a ‘wireframe’ or ‘mesh’ of lines which is then worked on with digital sculpting brushes and graphics pens. The digital model can then be further manipulated with other effects such as animation, texturization and light/shading. Here is an image of polygonal modelling which uses polygon meshes. 

Animator

The animator brings digital figures ‘to life’, giving them characteristics and human or animal emotions and movements. VFX animators use computer-generated rigs (a digital ‘skeleton’ which enables realistic movement) to help with lifelike expressions such as fear or laughter, or to show how the body works, for example how an elderly person might stoop.

https://en.wikipedia.org/wiki/Visual_effects

Texture artist

The texture artist works with the 3D digital images which start off as a plain (normally grey) shape, to further add to the realism by creating shadows, reflections, scratches etc. They need to research how surfaces look and behave under various conditions such as different light, weather conditions, ageing, wear and tear. They also create realistic-looking materials such as fur, cloth, animal markings, wood and metals.

https://en.wikipedia.org/wiki/Visual_effects

https://www.foundry.com/insights/film-tv/texture-artist-mari

Environment artist

The environment artist makes the digital surroundings in which the actors, either real or created, move and interact. These surroundings could be anything that exists or can be imagined, such as solar systems, oceans, forests, cities or supernatural and surreal environments, such as the ecosystem of an imagined planet. They create backdrops for action in places where it would be impossible or dangerous to film in reality, such as inside a volcano, at the top of a mountain or at the bottom of the sea. Some examples of the main softwares used by environment artists to model and sculpt are Maya, Zbrush, 3D Studio Max, Mudbox and Blender.

https://80.lv/articles/the-stages-of-environment-art-in-gamedev/

WEEK 4 Assignment

Layout artist

The Layout Artist recreates the sets, objects and camera motion of a shot in a 3D software environment to match the live-action background plate images. They set up the virtual camera for each scene in a production, address the composition of each shot to establish camera angles and direction, zooming in and out etc and overall lighting requirements. They need to ensure that the correct scale is applied to all objects.

https://www.artstation.com/artwork/8ekb5m

Matchmove artist

The matchmove artist ensures that digitally created scenes match with actual footage from live shots in a convincing way, addressing position, scale, orientation, and motion. The live backgrounds are recreated digitally to completely mirror the camera on the set in every way, including lens distortion; this requires taking detailed measurements of size, shape, positioning, and properties of all objects and also the background elements such as walls and stairs.

Lighting artist

The lighting artist’s role is to further enhance the realism of digitally produced scenes, giving them depth and believability, to achieve the same effects as a Director of Photography would in a live shot. The computer-generated lighting is manipulated to create different moods by adjusting the colour, saturation and positioning, and the lighting artist will try to match the appearance of CG objects to that of the real elements.

https://o_baillargeon.artstation.com/projects/198O2

CG Supervisor (computer graphics supervisor)

The CG supervisor manages the artists who create the imagery and decides the work schedule and the order of all the elements of the process. They have a managerial role and are responsible for the standards of the work and keeping to deadlines. They will also establish the elements of the work that will need to be researched by software developers.

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/computer-generated/computer-graphics-cg-supervisor/

Look development artist

The look development artist ensures that the overall world and characters of a story look consistent and a part of a cohesive whole.  They work closely with concept artists to decide how a particular CG element such as a creature would look in different settings such as in the rain or at night. They also liaise with lighting and texturing artists and technical directors to achieve the different looks, taking into account the various stages of texturing, lighting and rendering.

http://www.xuanprada.com/blog/2014/8/18/colorway-for-look-development-in-vfx

WEEK 5 Assignment

VFX supervisor

The VFX supervisor has ultimate responsibility for all the VFX produced by their organisation and will oversee a whole VFX project. They manage the VFX pipeline and the relationship with the director or producer of a film or TV programme. They retain responsibility during the post-production process.

https://www.studiodaily.com/2012/02/vfx-supervisor-erik-nash-on-real-steel/

Data capture technician

The data capture technician will visit a set to gather all the information about the live filming for a project, to pass on to the VFX teams. They will photograph the set and the camera positions, and gather data about the lenses and filters. This information will be uploaded and shared with the VFX team constantly throughout filming, so that the artists and technicians can match the scenes and action and build on the effects. 

https://leica-geosystems.com/case-studies/reality-capture/blockbuster-scanning-using-lidar-for-joker-and-john-wick

Compositor 

The compositor brings together all the digital elements of a project and combines them with the live footage and paintings to produce the final combined image, ensuring that the components blend together seamlessly so that the viewer sees a cohesive overall scene.

http://www.maacanimationkasba.com/maac-courses/advance-compositing/

Compositing supervisor

The compositing supervisor manages the department that puts together all the different elements of the VFX shots. They check the compositors’ work for quality and continuity.

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/compositing/compositor-visual-effects-vfx/

Week 6 Assignment

FX TD (Effects technical director)

The FX Technical Director creates realistic particle and fluid effects for live-action film and television, such as fire, smoke, water, air debris, snow, clouds, storms, steam, etc. They will often work on the live set to ensure the footage is filmed in the best way for special effects to be added in the post-production process and create code for customised tools required for the production.

https://beverlyboy.com/film-crew-positions/what-is-a-visual-effects-director/

TD (Assistant technical director)

The TD assists the FX Technical Director to ensure the smooth operation of the tools, software and workflows used by the VFX team. They support the pipeline, troubleshooting issues with the workflow tools, and can create code for small-scale issues encountered by the VFX artists.

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/technical/assistant-technical-director-td/

Software developer 

The software developer designs and implements the components and applications needed for a VFX project, and creates the systems for technical directors to utilise and adapt to the needs of their VFX artists. They design new digital tools to operate within existing software systems, so that they can be re-used in later projects.

https://blog.frame.io/2020/02/17/vfx-workflow-best-practices/

Pipeline TD (Pipeline technical director)

The pipeline TD is a coordinating role, ensuring that a VFX project progresses efficiently and identifying and fixing problems as they arise. They ensure each VFX department has the software tools they need to complete the project on time and to the required standards.

Week 7 Assignment

Rigging TD (rigging technical director)

The rigging TD creates the bone structure or ‘skeleton’ of 3D digital models, and programs them to move as a real person or creature would. The rig is used by the VFX animators as the starting point for creating their characters.

https://www.hatrigs.com/

Creature TD (creature technical director)

A Creature TD is responsible for building, rigging and simulating the internal anatomy of a creature. They build software that can be used to create life-like looking fur, feathers, scales and skin. They also liaise with pipeline TDs to incorporate these tools into the VFX pipeline.

https://www.williamgabriele.com/single-post/creatures-in-vfx-skin-binding-and-skin-simulation

VFX Producer

The VFX producer manages all aspects of a project, ie writing bids, planning and scheduling of resources, client management, keeping track of scope/budget changes and ensuring the project comes in on time and in budget, whilst maintaining the highest quality of work.

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/production-management/vfx-producer/

Production Manager

The Production manager supports the VFX producer and carries out the plans and decisions made, with detailed project scheduling and budget management.

They line-manage the production coordinator, and often draw up contracts and hire artists. They liaise with all contributors to the pipeline and also with the producer of the live filming.

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/production-management/production-manager-visual-effects-vfx/

Weeks 8-10 Assignment

VFX editor

The VFX editor is the conduit between the live-action team and the VFX team. They can be employed by the film/TV studio or the VFX company. They liaise constantly with both to ensure that the live-action and the VFX can be integrated as efficiently as possible. VFX editors are not usually responsible for the actual making or editing of visual effects, but they are responsible for the tracking, organisation and communication of shots. If employed by the film/TV studio they will usually be on set during filming, to check that the footage will be suitable for VFX to be added; if employed by the VFX company they will liaise with their client counterparts and ensure that the VFX studio artists have all they need to create their work.

https://gprsstudio.com/index.php/2021/10/25/vfx-editor/

Data input/output (I/O) technician

The Data I/O Technician manages all the content created during the VFX project, delivering it to clients, backing up files and troubleshooting technical issues related to drives or data transfer or the SAN (Storage Area Network). They must ensure that all security protocols are followed (such as file encryption) and the data deliverables are accurate.

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/production-management/data-input-output-i-o-technician/

Concept artist

The concept artist works at the beginning stage of the visual effects pipeline to create compelling artwork for creatures, characters and environments that will inspire all the other artists and technicians involved in the project. They start with a brief, which could be a script or an idea in the mind of the filmmaker, and will be the first artists to express the ideas onto the page and/or screen, in 2D, 3D or both. The concept artists’ work will be constantly referred to by the other artists to ensure that they maintain consistency of style and all have the same shared vision.

https://medium.com/@thefocus/5-insights-into-concept-art-in-vfx-85e56ef5d0fb

Previs Artist (Previsualisation artist)

The previs artist works with storyboards from a concept artist to plan the overall look of a film, ie visualising the scenes before fully creating them. They use 3D animatics to make lower-quality rough draft versions of the moving image sequences, planning shots, gauging the scale, timing and positioning of characters, which then enables the production team to organise the scenes in a consistent manner. This ultimately saves time and money for both the film-makers and the VFX team, as changes further along the pipeline would be more costly.

Once a film is in production, previs artists help the other VFX artists to maintain a consistent style in their work. They are usually also skilled at compositing and editing.

https://postperspective.com/tag/previs/

Week 11 Assignment

Having researched all the VFX roles, I feel I would like to be a compositor, because I like to have a holistic view of the whole development of the look of a film, particularly grading, colour correcting and adding CG elements to a scene.

 References

https://axisstudiosgroup.com/careers/jobs/vfx-editor/

https://www.entertainmentcareers.net/deluxe-entertainment/data-i-o-technician/job/337462/

https://www.thefocus.com/updates/

https://www.premiumbeat.com/blog/how-an-average-vfx-pipeline-works/

https://axisstudiosgroup.com/careers/jobs/vfx-producer/

https://www.creativeheads.net/job/16777/creature-td-in-wellington

https://sciencebehindpixar.org/pipeline/rigging

http://vfxvancouver.com/job/waterproof-studios-vancouver-2-rigging-td/

https://jobs.lever.co/scanlinevfx/0d0eb938-609f-40a2-bd44-c2890ec89466

https://www.solihull.ac.uk/course/assistant-technical-director-visual-effects-apprenticeship/

https://www.cgspectrum.com/career-pathways/fx-technical-director

https://disneyanimation.com/process/look-development/

https://www.worldofleveldesign.com/categories/game_environments_design/software-for-game-environment-artist.php

https://www.cgspectrum.com/blog/what-is-3d-texturing

https://www.masterclass.com/articles/how-visual-effects-work-in-film#3-types-of-visual-effects

 The Screenskills ‘map’ of roles in VFX:

https://www.screenskills.com/media/4381/vfx-cmyk-2021-master-inclusive-web.pdf

 The Screenskills explanation of VFX departments:

https://www.screenskills.com/job-profiles/browse/visual-effects-vfx/

Emergent Technology – Ray Tracing

Developments in real-time ray tracing techniques are opening up new possibilities in the VFX industry and revolutionising content creation.

Hardware manufacturers are creating devices that enable new capabilities in graphics, which can be utilised in interactive gaming, TV and VR, and animation.

Ray tracing is exactly what it sounds like: the process of calculating how light interacts with objects and materials in computer-generated imagery. It is particularly important where there is movement, because every single frame that a scene moves, light reaches the viewer’s eye differently.
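The per-ray core of this can be illustrated with the simplest case, intersecting a ray with a sphere (a toy plain-Python sketch, not production renderer code):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.
    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic)."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None           # hits behind the origin don't count

# Camera at the origin looking down -z at a unit sphere centred at (0, 0, -5)
print(ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

A renderer repeats this per pixel against every surface (via acceleration structures), then spawns further rays for shadows, reflections and refractions.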

(garagefarm.net, n.d.)

Previously, renders could only be completed on huge render farms. Now there is much less waiting time, which means design iteration is faster; higher-quality lighting, reflections and shadows can be achieved, and even more realistic ambient occlusion and scattering are possible.

One example of this technology is the Turing Architecture GPU created by leading manufacturer Nvidia. This “speeds up bounding volume hierarchy traversal and ray-triangle intersection testing”
(Meister et al., 2021)

This means that the multiprocessors can improve shading speed and execution without needing instruction slots for each ray cast.
In other words, the heavy lifting of ray tracing can be picked up by graphics hardware usually needed for real-time modelling; traditionally, lighting was handled by the CPU due to the complexity of the calculations involved.
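In software, the ray-triangle test that RT cores accelerate is typically the Möller–Trumbore algorithm. A plain-Python sketch (a toy version for illustration, not the hardware implementation):

```python
def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore: distance t along the ray to the triangle, or None."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def dot(a, b): return sum(a[i] * b[i] for i in range(3))
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(dirn, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None                    # ray is parallel to the triangle plane
    t_vec = sub(orig, v0)
    u = dot(t_vec, p) / det
    if u < 0 or u > 1:
        return None                    # outside the triangle (barycentric u)
    q = cross(t_vec, e1)
    v = dot(dirn, q) / det
    if v < 0 or u + v > 1:
        return None                    # outside the triangle (barycentric v)
    t = dot(e2, q) / det
    return t if t > eps else None      # hits behind the origin don't count

# Ray fired straight down -z through a triangle lying in the z = -2 plane
hit = ray_triangle((0.2, 0.2, 0), (0, 0, -1), (0, 0, -2), (1, 0, -2), (0, 1, -2))
print(hit)  # 2.0
```

RT cores run this kind of test, plus the BVH traversal that decides which triangles to test, in fixed-function hardware instead of shader instructions.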

Turing’s tensor cores enable users to improve AI denoising, so they can create better images more quickly. The viewer experience is much better, in that they don’t notice the effects and ‘believability’ increases. This means that the ray tracing of light can be said to happen in “real time”, as it does in the real world.

Here is a demo by Nvidia displaying the ray tracing capabilities of their GPUs.

The rendering process now just needs a single GPU with enough memory, and even huge, intricate scenes can be handled without long waiting times.

Uses:

- Enhancing gaming experiences (e.g. reflective surfaces are more realistic)

- Animation pipelines: creators can work on light geometry in real time

- Creating believable digital humans in film

High-fidelity ray tracing creates fantastic realism, but traditionally has been very time-intensive. When rendering things that have no interactive dynamic component, such as a film or cinematic, this is not a problem. However, in VR and video games there is always a dynamic component, so advances that make the process more realistic in real time are very exciting.
(garagefarm.net, n.d.)

A further development of this is its combination with another new technology in the world of VFX: (LED Wall) Virtual Production, the process of using LED screens as backdrops on film sets instead of green screens. This means having real-time graphics rendered interactively with actors on a set. In combination with technologies such as Unreal Engine, realistic scenes that change as needed can be used as backdrops behind actors, made possible by the developments in ray tracing capabilities.
(fxguide, 2020)

References

garagefarm.net. (n.d.). How ray tracing impressively elevated the effects of 3D rendering. [online] Available at: https://garagefarm.net/blog/how-ray-tracing-has-elevated-the-already-impressive-effects-of-3d-rendering [Accessed 27 Jan. 2022].

Meister, D., Ogaki, S., Benthin, C., Doyle, M., Guthe, M. and Bittner, J. (2021). A Survey on Bounding Volume Hierarchies for Ray Tracing (State of The Art Report). Computer Graphics Forum, 40(2). [online] Available at: https://meistdan.github.io/publications/bvh_star/paper.pdf.

fxguide. (2020). Art of (LED Wall) Virtual Production Sets, Part Two: “How you make one.” [online] Available at: https://www.fxguide.com/fxfeatured/art-of-led-wall-virtual-production-sets-part-two-how-you-make-one/.