Process

The exhibition space was captured with a Matterport 3D scanner, which outputs 360° footage, lidar data (.e57), and photogrammetry. Additionally, a mobile scanner was used for some of the smaller elements in the space. While processing the data, the team found inconsistencies in the model representations; closing holes and cleaning up the scans in Blender yielded better results. After optimizing the scans and decimating the meshes, the team was able to produce web-ready models for display.
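As a rough sketch of the decimation step, a helper like the one below (hypothetical names, not from the project's codebase) can turn a triangle budget into the kind of ratio a decimate modifier expects, assuming a collapse-style decimator where 1.0 keeps the full mesh:

```javascript
// Hypothetical helper: given a scanned mesh's triangle count and a
// web-friendly budget, compute the ratio to feed a collapse-style
// decimate modifier (1.0 keeps every triangle).
function decimationRatio(triangleCount, budget) {
  if (triangleCount <= budget) return 1.0; // already under budget, keep as-is
  return budget / triangleCount;
}

// A 2M-triangle scan targeted at a 150k-triangle web budget:
decimationRatio(2_000_000, 150_000); // → 0.075
```

The same ratio can be applied per model, or weighted so hero objects keep more detail than background geometry.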
Spatial Web Experience

Building a spatial web experience required experimenting with three.js camera navigation. The team considered several input options: scroll-based camera dolly-in/dolly-out, third-person movement, and point-and-click navigation. After testing the different approaches, the team chose third-person character navigation because it was the most accessible to users. The carousel below shows recorded research outcomes for the camera movements the team tested.
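To illustrate one of the options above, the scroll-based dolly can be reduced to a small piece of math: each wheel event nudges the camera distance along its view axis, clamped to a min/max range. This is a minimal sketch with hypothetical names, not the project's actual controller:

```javascript
// Scroll-based dolly sketch: positive deltaY (scroll down) dollies out,
// negative dollies in; the result is clamped so the camera can neither
// clip into the model nor drift far away from it.
function dollyDistance(current, deltaY, { speed = 0.01, min = 2, max = 50 } = {}) {
  const next = current + deltaY * speed;
  return Math.min(max, Math.max(min, next));
}

// In a three.js scene this distance would then be applied along the
// camera's forward vector each frame, e.g. via camera.position.setLength().
dollyDistance(10, 120);  // scroll down dollies out a little
dollyDistance(3, -500);  // scroll up past the limit → clamped to 2
```

The third-person and point-and-click options need more machinery (character controller, navmesh or ground raycasts), which is part of why they were prototyped and compared rather than reasoned about on paper.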



Another research focus for this project was optimizing the experience. Starting with mesh decimation, user-input redesign, and raycasting optimization, the team dived deep into making the project more accessible to users. Most of the optimization issues came from the 3D models: the scans had very high polygon counts, which caused stuttering and long loading times without optimization. To tackle this, I researched the best optimization techniques for the web and published an article on Medium.
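One common shape for the raycasting optimization mentioned above is to stop re-testing intersections on every pointer event and instead throttle the expensive call. The wrapper below is a generic sketch (the `now` parameter is injectable only to make it testable), not the project's actual code:

```javascript
// Throttle sketch: wrap an expensive function so it runs at most once
// per interval; extra calls inside the window are simply dropped.
function throttle(fn, intervalMs, now = Date.now) {
  let last = -Infinity; // timestamp of the last call that actually ran
  return (...args) => {
    const t = now();
    if (t - last < intervalMs) return; // still inside the window, skip
    last = t;
    return fn(...args);
  };
}

// Assumed three.js usage: wrap the intersection test so pointermove
// events trigger it at most ~10 times per second.
// const pick = throttle(() => raycaster.intersectObjects(scene.children), 100);
// renderer.domElement.addEventListener('pointermove', pick);
```

The same idea applies to other per-frame work, such as visibility checks or UI updates driven by camera movement.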

Project Showcase
I had a lot of fun working on this project and learned a great deal about web design and 3D programming. Structural Color Gallery officially published this project on their website; everyone can access it through the link below.
Visit Gallery