May 21, 2021, by Ernesto Pacheco
A 3D Model Mimicking a Physical Model
CannonDesign’s Ernesto Pacheco tests a Substance in Omniverse prototype.
The LAC + USC Restorative Care Village project is an innovative approach to providing supportive care to underserved and vulnerable communities in Los Angeles County. Its goal is to manage the interrelated challenges faced by people with serious medical, mental health, and addiction issues who may be homeless as a result.
This is one of our most recent modular construction projects: it leverages prefabricated materials to expedite construction while reducing cost. We partnered with ModularDesign+ on this endeavor.
A 3D render that looks like a physical model
Artwork by Ernesto Pacheco
Prior to COVID-19, we used to build at least one physical model every month either for project pursuits, client presentations, or internal design meetings.
I work from our St. Louis office in Missouri, where we are very lucky to have a fully equipped model shop. There, we have built models the size of dining tables with topography details a few feet tall. We document these models with the use of photography and videography for later use or to archive the work, as we tend to trash the physical models after they have served their purpose.
In recent years, we have introduced 3D printing into the mix, which has helped expedite build times while reducing some of the costs involved in this type of deliverable, not to mention cutting our use of harmful non-recyclable plastics.
Lately, we have been looking for ways to reduce the number of physical design options we build with the aid of digital tools. Our physical scale models have started to include hybrid workflows in which AR enhances the physical with the digital. We are now working on incorporating MR into our process as well.
This project served as the perfect opportunity to test the idea of a photo-realistic real-time virtual scale model as a complementary deliverable to its analog counterpart.
The NVIDIA Omniverse context: multi-tool in real-time
Creating a render that would look like our models was a stress test case for Substance in Omniverse’s open beta. Would the USD workflow and collaborative environment work?
My goal for this exploration, with the Omniverse Create open beta and Substance materials, was to get as close as possible to the look of the photographs we usually take of physical scale models for documentation purposes.
Do not get me wrong, we love the craft that goes into using your hands to put together these amazing miniature pieces of art. We also enjoy the break it gives us to stand up from our desks and stop looking at the screen. However, during busy months this can quickly become overwhelming. Having the option to push production digitally to a high level of quality is just amazing!
Having access to photographic tools within Omniverse Create makes it easy for a designer to transition their analog knowledge to a virtual camera. This means we can expect fewer requests to pause design work and to plan for and produce 2D drawings just for physical model purposes.
It all began in SketchUp
The original 3D models for this project were put together using SketchUp. The designers working on the deliverables for this job are very comfortable with this DCC app, as it allows them to quickly iterate design options, thus fast-tracking the ideation process.
SketchUp also takes a relatively low-poly approach to 3D modeling, which frees up resources for terrain modeling and high-end material creation in other software. This makes it easy to strike a balance when aiming for photo-realistic output.
I started by examining the SketchUp model provided by the design team. Most of the work involved deleting unnecessary geometry, like the site information included in the file, as it did not have enough polygon definition.
Then, I made sure materials were assigned correctly while deleting unused information.
Finally, I used groups to separate geometry to allow for better placement in Omniverse Create if needed.
Using the Omniverse Connector in SketchUp is straightforward. I created a new folder in Omniverse Drive to hold all the USD files I would create, then simply ran the Omniverse Connector and saved the file to the correct location. At the time of this project, the Live Sync feature was not yet available for SketchUp.
Terrain: Autodesk Infraworks & 3ds Max
For the terrain, I decided to use Autodesk Infraworks.
The reasoning was to capture as much detail and information as possible with the least effort, while making sure I stressed Omniverse Create’s real-time rendering as much as possible with all the high-polygon data.
We use Infraworks at CannonDesign for all our projects that require fast and accurate site information. Infraworks does a great job with the terrain, but most importantly, it lays down 3D roads and site work that can be edited if needed. For existing buildings, it generates volumes extruded to average heights from the overall building footprints.
The 3D data was exported in two separate .FBX files, one for the building’s massing and one for the site, which we then imported into 3ds Max for processing.
Often, we must optimize and clean up the detailed context information from Infraworks to work with our project. 3ds Max offers several tools to help with this type of task. I usually like to convert the mesh into an Editable Poly and use soft selections to massage the terrain. I also love the ProOptimizer, as well as the new Retopology tools in 3ds Max 2021. These features really help expedite what used to be a very tedious process.
Exporting the terrain scene to Omniverse
My plan was to export the terrain model using the 3ds Max Connector for Omniverse Create set to Live Link, then further modify the terrain around the building site model in real time using the Edit Poly modifier’s Soft Selection tools.
This was necessary because the Infraworks model did not account for the new construction at the project buildings’ location. It also let me test this type of live collaboration workflow with Omniverse.
Making wire trees
We often add some type of indication of landscaping elements to our physical models. I really like to use metal wire for this, so I decided to quickly put together a couple of metal wire trees in 3ds Max.
This was done with splines that were then converted to an Editable Mesh.
This particular asset was exported as a single item using the 3ds Max Connector, then duplicated and transformed as needed in Omniverse Create. I used the Live Link feature in the 3ds Max Connector for Omniverse to tweak the thickness of the wire with the Shell modifier in 3ds Max, in real time, before committing to a final look.
Assembling and first look in Omniverse
Best practice for Omniverse Create suggests loading USD assets as Sublayers into a new Omniverse Create Stage. This ensures a fully non-destructive workflow for all assets while maintaining one “main” file, or Stage, dedicated to work in Omniverse Create that includes assets from different sources.
Once assets are loaded as sublayers, the Root Layer (or Authoring Layer) may need its World Axis set to Z under Properties > Layer Metadata to ensure correct 3D world orientation.
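In USD’s human-readable .usda form, the root of such a stage is little more than a metadata block listing the sublayers. The file names below are hypothetical stand-ins for the project’s actual assets:

```usda
#usda 1.0
(
    upAxis = "Z"
    subLayers = [
        @./buildings.usd@,
        @./terrain.usd@,
        @./wire_trees.usd@
    ]
)
```

Because each sublayer remains a separate file on the Omniverse Drive, edits made in SketchUp or 3ds Max flow into the Stage without ever being baked into it.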
The new Stage includes a Default Light, a DistantLight that is great for simulating direct sunlight. However, for this project I added a DomeLight to take advantage of the Omniverse Create HDRI library. This light helped create more realistic reflections, colors, and shadows.
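As a rough sketch, a DomeLight with an HDRI texture looks like this in .usda. The prim name, file name, and intensity here are illustrative, and attribute spellings vary slightly between UsdLux versions:

```usda
def DomeLight "EnvDome"
{
    asset inputs:texture:file = @./studio_environment.hdr@
    float inputs:intensity = 1000
}
```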
The next step was to create a camera to help dial in the first pass of lighting and tone mapping. The process was simple: navigate the viewport to frame the view I wanted, then go to Perspective > Camera > Create Camera from View.
For the camera setup, I wanted to target the look one can achieve with an SLR/DSLR camera, so I tweaked the focal length to match that of a prime lens. Then I switched the render engine to RTX Path Tracing, which ensures better results.
After that, I worked on the first pass of tone mapping settings which can be found under RTX Settings>Post Processing>Tone Mapping.
Finally, I enabled the Depth of Field Camera Overrides settings to dial in the bokeh effect. Having a good understanding of how to expose for photography was very helpful and made the whole experience fun and second nature.
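For reference, the handful of UsdGeomCamera attributes behind this photographic look can be sketched in .usda roughly as follows. The values are illustrative, not the ones used for the final images:

```usda
def Camera "ScaleModelCam"
{
    float focalLength = 85      # prime-lens look
    float fStop = 2.8           # wide aperture for a shallow depth of field
    float focusDistance = 90    # focus plane on the model, in scene units
}
```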
Adding the materials
The next pass on lighting was then approached from the material-creation side. Early on, I had decided to use Substance by Adobe for all material creation on this project.
Four years ago, I introduced the PBR workflow for material creation at CannonDesign for all our visualization work. Back then, we were looking for a solution to standardize this type of work across all our DCC apps. I had been following the Substance team for a few years, so I was very confident it could be the answer to our needs. Since then, we have successfully adopted the Substance suite into our visualization workflows.
Back to our project: I teamed up with Substance’s David Larsson to stress test a Substance Designer Connector prototype for Omniverse Create. Working with David was a great experience; he was able to incorporate new features based on our conversations.
Substance 3D Assets’ vast library of PBR materials offered an almost limitless number of options to pick from and use within Omniverse! It truly made me feel like I had superpowers. Iterating quickly between different materials allowed me to bounce ideas off co-workers more efficiently, all while having a ton of fun.
I spent a good amount of time trying different materials, tweaking their settings and UV mapping until I was happy with the results. I have never smiled more while working on a visualization project. Even mistakes with the correct placement of materials offered opportunities and aha! moments.
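Under the hood, swapping a material on a mesh in USD is just a matter of retargeting a material binding, which is part of why this kind of rapid iteration stays non-destructive. A minimal, hypothetical example in .usda:

```usda
def Mesh "RoofPanel"
{
    # Point the binding at a different Material prim to try another look
    rel material:binding = </World/Looks/CorrugatedMetal>
}
```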
Exporting the final images is very simple. Under the Rendering menu there is a Movie Capture option, where you can set the number of Path Trace samples per pixel and the final resolution, among other settings. I disabled Denoising under RTX Settings > Path Tracing (make sure the renderer is set to Path-Traced).
Denoising is great for real-time presentations, but I found that it softens the details of the materials. I used Photoshop to finalize tone mapping, color correction, and minimal post-processing.
Artwork by Ernesto Pacheco
Artwork by Ernesto Pacheco
See the full workflow in this video: