3D Software Engineer

MICHAEL
VITAZKO

Havenly Inc.

3D Software Engineer

March 2023 - December 2023 - Seattle, WA (Remote)

Havenly is an interior design service and furniture store. They match customers with designers to help them redesign rooms in their house. Customers either measure or scan their room layout and upload the information to create a room shell. Designers then work with customers to understand their interests and budget, furnish a digital twin of the room within Havenly’s Creator, and fill it with versions of items sold through Havenly. Once the designer is done, they submit a few camera angles of the finished room for photorealistic rendering.

At Havenly I was a 3D Software Engineer responsible for maintaining and improving Creator, their internal interior design tool. Creator was a Unity WebGL app rendering into a React canvas: most of the 3D interaction happened in Unity, whereas most of the connections to the rest of Havenly’s systems went through the React page. I worked back and forth on both sides and on how they communicate.

Shortly after I joined in March 2023, Havenly’s main bank, Silicon Valley Bank, collapsed. While this didn’t have an immediate impact, it meant the likelihood of future investment was low, so reducing costs was my main focus during my time there.

One of the biggest things I did to reduce costs was to help implement modular assets. Previously, any time a designer wanted to use a piece of furniture we didn’t already have a model of, we would have an internal 3D modeler make it or contract an outside company to do so.

To reduce this, we switched to modular assets where possible. Many of the custom items sold by Havenly’s brands have dozens of fabric options, so working with an internal artist, I helped develop a system of tagging different parts of a model for different types of materials. Each model and material got its own asset bundle. I then worked with our cloud engineer to create a mapping of products to models and materials that could be fetched and assembled at runtime.
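
Conceptually, the runtime lookup worked something like the sketch below. All names, SKUs, and bundle paths here are invented for illustration; the real system lived in Unity and cloud infrastructure.

```python
# Hypothetical sketch of the product -> asset-bundle mapping.
# Each product resolves to one model bundle plus a material bundle
# for every tagged part of that model.

PRODUCT_CATALOG = {
    "sofa-landry": {
        "model": "bundles/models/sofa_landry",
        "materials": {  # tagged part -> material bundle
            "upholstery": "bundles/materials/fabric_navy_weave",
            "legs": "bundles/materials/wood_walnut",
        },
    },
}

def bundles_for_product(sku: str) -> list[str]:
    """Return every asset bundle needed to assemble a product at runtime."""
    entry = PRODUCT_CATALOG[sku]
    return [entry["model"], *entry["materials"].values()]

print(bundles_for_product("sofa-landry"))
```

The payoff is that a sofa with thirty fabric options needs one model bundle and thirty small material bundles, rather than thirty full models.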

With the fast pace of production we were seeing more regressions. As an early warning, I created an automated test render that ran whenever a new merge request went in and built correctly. Each run dropped a thumbnail of a specific room into Slack so we could tell if something had changed. We could have gone further, having the scene perform more actions before the render to exercise more features, or subtracting the latest render from the last known-good one to automatically detect visual regressions, but those improvements were postponed indefinitely.
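
The postponed subtraction idea is simple enough to sketch. Here the renders stand in as flat lists of 0-255 grayscale pixel values; a real version would load the two PNGs and compare channel by channel.

```python
# Toy sketch of image-diff regression detection: compare a new render
# against the last known-good one, pixel by pixel, and flag a regression
# if any pixel moved more than a small noise threshold.

def has_visual_regression(known_good, latest, threshold=8):
    """True if any pixel differs by more than `threshold`."""
    return any(abs(a - b) > threshold for a, b in zip(known_good, latest))

baseline = [120, 121, 119, 200]
rerender = [120, 122, 119, 200]   # tiny noise: fine
broken   = [120, 122, 40, 200]    # big change: regression

print(has_visual_regression(baseline, rerender))  # False
print(has_visual_regression(baseline, broken))    # True
```

A small threshold matters because lighting and compression introduce harmless per-pixel noise between otherwise identical renders.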

 

Insight Global at Meta Reality Labs

Product Design Prototyper

July 2021 - January 2022 - Seattle, WA (Remote)

At Meta Reality Labs I was a product design prototyper on the communications team for Project Nazare, Meta’s upcoming AR glasses. My job was to build experiences on existing hardware to help designers and user experience researchers figure out best practices for new methods of communication.

 

PlutoVR

Software Engineer

September 2019 - April 2021 - Seattle, WA

PlutoVR was a Seattle-based startup trying to envision the future of communication by pushing the limits of today’s tech. Their main product was a SteamVR overlay that let you have spatial conversations and share your point of view with friends, all while running on top of any other XR experience.

During my time at Pluto I worked on most parts of the code base, but there are a few highlights that stand out.

Pluto hosted daily standup in the app. To make connecting more convenient, especially for those out of the office, we made an iOS app that could call in. Since we were using WebRTC this mostly just worked, but there were some considerations for how we rendered the call to VR users. To them, the caller appeared as a flat plane floating in space. Viewing the plane from oblique angles made it hard to see, and if the phone jostled a bit it looked like the caller’s whole head was dodging and weaving like Rocky Balboa. To solve this we implemented gravity-aligned video and depth rendering.

For gravity alignment, we took the video plane and rolled it in the direction opposite the rotation the phone was experiencing. This let the caller appear to remain still while their frame rotated around them instead.
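
A minimal sketch of that counter-roll, assuming the phone reports a gravity vector in its own device frame (the exact sensor API and sign conventions here are assumptions, not Pluto’s implementation):

```python
import math

# Sketch of gravity-aligned video: estimate the phone's roll from the
# gravity vector's x/y components in the device frame, then apply the
# opposite roll to the remote video plane so the frame rotates instead
# of the caller's head.

def roll_from_gravity(gx: float, gy: float) -> float:
    """Device roll in radians; 0 when the phone is held upright (gy ~ -g)."""
    return math.atan2(gx, -gy)

def plane_roll(gx: float, gy: float) -> float:
    """Counter-roll to apply to the video plane to cancel the phone's tilt."""
    return -roll_from_gravity(gx, gy)

# Phone rolled 30 degrees: gravity swings by 30 degrees in the device frame.
print(round(math.degrees(plane_roll(-4.9, -8.487)), 1))  # ~30.0
```

The effect is that a jostled phone tilts its own video frame while the face inside stays level for the VR viewer.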

For depth rendering, we combined the color and depth data for each frame into one larger image. The depth camera ran at a lower frame rate, so we simply sent the most recent depth frame alongside each color image. On the receiving end we split them back up, created a procedural mesh with a quad for each pixel, and extruded each quad based on the value of its corresponding depth pixel. If the depth value was near zero (very far away), we clipped the pixel so as to remove the background and show only the head. There was still some artifacting, some of it probably caused by the depth data being altered by video compression. Other issues, like glasses and eyes sometimes not showing up, happened because they reflect IR light differently than visible light. Overall, though, the result was an improvement in VR.
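
The extrusion-plus-clipping step can be sketched like this; the depth frame is a tiny hand-written grid here, and the scale and clip values are invented placeholders rather than the real tuning.

```python
# Toy sketch of depth extrusion: one quad per pixel, pushed out along z by
# its depth value, with near-zero depth (background) clipped away entirely.

def extrude_pixels(depth_rows, scale=0.01, clip_below=0.05):
    """Return (x, y, z) quad centers for pixels that survive background clipping."""
    quads = []
    for y, row in enumerate(depth_rows):
        for x, d in enumerate(row):
            if d < clip_below:   # depth ~0 meant "very far away": drop it
                continue
            quads.append((x, y, d * scale))
    return quads

depth = [[0.0, 0.8],
         [0.9, 0.0]]
print(extrude_pixels(depth))  # only the two foreground quads survive
```

A real implementation builds actual quad geometry and samples color from the matching video pixel, but the clip-then-extrude logic is the core of it.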

Pluto also contributed to OpenXR, the Khronos Group standard for the XR ecosystem. Pluto advocated for a multi-app ecosystem where separate developers could independently create applications that coexist in the same space.

One of the ways I prototyped this was with WebXR and Metachromium. Metachromium is a version of Chrome where each tab is a WebXR overlay. The tabs themselves were invisible but otherwise worked as you’d expect. I mostly stuck to three.js to render objects and scenes, but most normal web dev workflows should work. And just like Chrome tabs, you could open as many as you wanted at runtime, though rendering could get messy since each tab was a separate layer.

Unfortunately, as of March 2024, PlutoVR has shut down.

 

Valence Group Inc.

AR/VR Developer

November 2018 - August 2019, Bellevue, WA

At Valence, I was the lead developer on ULA Anywhere AR, an educational mobile app designed to show off the scale and capabilities of United Launch Alliance’s upcoming Vulcan Centaur rocket at the 35th Space Symposium.

The project started in late November and needed to be completed before the symposium on April 8th. Given the short timeline, we initially chose Unity with Vuforia for the ability to build once and deploy to both Android and iOS. This worked well at small scales but struggled to maintain stable tracking when looking up at the full 67.4-meter-tall rocket, especially on Android. To solve this we switched to the separate ARKit and ARCore plugins and relied heavily on preprocessor directives to split up platform-specific code.

Another interesting challenge was accurately depicting orbital launch capabilities. Low Earth orbit requires much less energy to reach than geosynchronous orbit. Luckily, the space industry uses a common format called the two-line element set (TLE) to predict orbits over time. It was invented in the late 60s and early 70s for use with punch cards, but it’s still effective to this day. I was able to integrate an orbital tracking library and use historical TLEs from previous ULA launches to accurately plot the different orbits at scale.
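
TLEs are a fixed-column punch-card-era format, which makes them easy to pull apart by hand. A small sketch, using the well-known ISS example TLE (the real app handed the lines to an orbital tracking library rather than parsing them itself):

```python
# Sketch of extracting orbital elements from line 2 of a TLE.
# Fields live at fixed columns per the TLE specification.

ISS_LINE2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"

def parse_tle_line2(line2: str) -> dict:
    """Pull a few elements out of TLE line 2 by column position."""
    return {
        "inclination_deg": float(line2[8:16]),
        "raan_deg": float(line2[17:25]),
        "eccentricity": float("0." + line2[26:33]),  # decimal point is implied
        "mean_motion_rev_per_day": float(line2[52:63]),
    }

elements = parse_tle_line2(ISS_LINE2)
print(elements["inclination_deg"])          # 51.6416
print(elements["mean_motion_rev_per_day"])  # 15.72125391
```

Mean motion (revolutions per day) is what distinguishes a low Earth orbit (around 15-16) from a geosynchronous one (about 1), which is exactly the contrast the app needed to visualize.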

We successfully made our deadline, and ULA Anywhere AR was available at the 35th Space Symposium on Android and iOS.

After the success of ULA Anywhere AR, I took on another project for Valence, this time a virtual factory tour for Purina, the pet food company.

The idea was to capture 360° video of the factory and add training voiceover and informational cards to help onboard new employees. Purina would load the tour onto an Oculus Quest and ship it to the employee, who would use it as part of their orientation before coming on site.

The problem was the timeline: the project started in April of 2019, the Quest released on May 21, and the initial app review was to take place in June. So we first built a playable demo on the Oculus Rift with placeholder 3D videos while the real video and audio were recorded. Then it would just be a matter of porting to the Quest and swapping assets.

The approach mostly worked, but we ran into two issues once we finally got a Quest. The first was our scene transitions, which used a white sphere that faded its alpha channel in and out between videos. On the Quest this flat out didn’t work, and we were stuck with a white screen. This initially panicked me, but simply rewriting the effect directly in the 3D video shader solved it.

The other issue was file size. Our test video files were small, but the full tour took up far too much space, and Android APKs have a 4 GB size limit. Streaming was out of the question because we couldn’t guarantee the employees’ internet. Instead, we trimmed many of the videos down to the bare minimum needed for a full loop. For mostly static scenes we held just one frame but preserved the full background audio as a separate track to keep the space feeling somewhat alive.

We successfully delivered the first build in June with some additional support into July and early August.

Valence Group is now known as Kopius, and you can find out more about these projects on their website.

https://kopiustech.com/case-studies/augmented-reality-application-ula/ 

https://kopiustech.com/case-studies/virtual-reality-nestle-purina/

 

Level11

Unity Developer

April 2017 – April 2018 / August 2018 - November 2018, Seattle, WA

During my time at Level11 I was consulting for Carnival, the cruise ship company, on their Ocean Medallion project. At sea, inside a large metal structure, a phone’s GPS and magnetometer both have trouble orienting themselves. The Medallion is a wearable IoT device that, in combination with numerous BLE beacons at fixed points around the ship, allows for more accurate positioning.
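
I can’t speak to the Medallion’s actual positioning internals, but the general idea behind locating a device from fixed beacons can be illustrated with textbook 2D trilateration; every value below is invented.

```python
import math

# Toy 2D trilateration: given three beacons at known positions and the
# measured distance to each, subtracting the circle equations pairwise
# yields two linear equations we can solve directly for (x, y).

def trilaterate(beacons, dists):
    """Solve for (x, y) from three beacon positions and distances."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Beacons at three corners of a deck; the guest actually stands at (3, 4).
pos = trilaterate([(0, 0), (10, 0), (0, 10)],
                  [5.0, math.hypot(7, 4), math.hypot(3, 6)])
print(pos)  # ~(3.0, 4.0)
```

Real systems work from noisy signal-strength readings and many more beacons, so they filter and fit rather than solve exactly, but the geometry is the same.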

Along with the companion app, the Medallion lets guests easily navigate the ship to find each other and meet up, and lets waitstaff bring refreshments to them anywhere on board. Additionally, the wearables were connected to a profile set up by the guest ahead of time, making onboarding and offboarding much quicker.

Initially I worked as a front-end Unity developer on their iOS app for the communications team, where we developed ShipMates, the onboard friends list and chat. Cell service is unreliable out at sea, so ShipMates used the ship’s internal network for stable communication.

Once ShipMates was relatively feature complete, I moved to the navigation team. OceanCompass acted much like Google Maps, showing your nearby ShipMates and points of interest, with the ability to provide 3D routes to them.

I developed much of the UI/UX for both 2D and 3D deck interaction. Most of the Unity code base was shared between the iOS mobile app and the Linux 4K touchscreen portals around the ship. One main feature I was in charge of was routing: I wrote a custom spline mesh generator that showed an animated path to your destination across decks.
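
The heart of a spline path tool is evaluating a smooth curve through waypoints. As a stand-in for the mesh generator (which also swept geometry along the curve and animated it), here is a standard Catmull-Rom segment evaluator; the waypoints are invented.

```python
# Catmull-Rom spline evaluation: a smooth curve that passes through its
# control points, which makes it a natural fit for routing waypoints.

def catmull_rom(p0, p1, p2, p3, t):
    """Point on the Catmull-Rom segment between p1 and p2 at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * (2 * b
               + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

waypoints = [(0, 0), (1, 0), (2, 1), (3, 1)]
print(catmull_rom(*waypoints, 0.0))  # (1.0, 0.0): the curve hits p1 exactly
print(catmull_rom(*waypoints, 0.5))  # (1.5, 0.5): midway between p1 and p2
```

Sampling t at small steps yields the points a mesh generator would skin with quads, and animating a cutoff along t gives the moving-path effect.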

We successfully shipped OceanCompass in 2018 on the Regal Princess, and on 10 more ships in the Princess line over the following months.

Level11 has since merged into Launch Consulting. You can find out more at their website: https://www.launchconsulting.com/case-studies/carnival

 
 

Side Projects

Coming Soon!

Outside my professional work, I’ve dabbled in game development, 3D modeling, and rendering.

Contact

Email: Vitazkomm@gmail.com