The first prototype was a physical one, using shoes, a paintbrush and palette, a shopping cart, and an eraser in a Wizard of Oz style to simulate basic interactions. Testing revolved around three user interactions: grabbing the shoes and moving them around the environment to examine them from all angles, customising the shoes, and finally either adding a shoe to the shopping basket or discarding it.
Testing Methods:
A range of testing methods was employed throughout the project: A/B testing compared the paintbrush and paint roller interactions in later iterations of the prototype, task efficiency was measured via time-on-task, cognitive walkthroughs were conducted, and think-aloud protocols combined with post-test interviews provided qualitative insights.
Testing Results:
Users found the physical manipulation of shoes intuitive, and the paintbrush interaction felt more natural than expected. Shopping cart placement needed refinement, and the eraser tool caused confusion.
Impact on Next Iteration:
For the next iteration I digitised the prototype, simplified the customisation workflow, and removed the eraser tool, while emphasising natural manipulation gestures.
The second prototype was digital, of higher fidelity, and focused on testing the shoe customisation process. Users grabbed a sphere and moved it into contact with coloured cubes, causing the shoe to change colour; different parts of the shoe could be recoloured in this way depending on which colour cube was selected.
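The sphere-and-cube mechanic can be sketched as a Unity trigger script. This is a hypothetical reconstruction, not the project's actual code: the class name, the "ColourCube" tag, and the `selectedShoePart` field are all assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the Prototype 2 mechanic: when the grabbed sphere
// touches a colour cube, the currently selected shoe part takes on that
// cube's colour. Assumes the sphere has a trigger collider and a Rigidbody,
// and that colour cubes are tagged "ColourCube".
public class ColourSphere : MonoBehaviour
{
    // Renderer of the shoe part currently selected for recolouring
    // (assumed to be assigned in the Inspector or by a selection script).
    public Renderer selectedShoePart;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("ColourCube") && selectedShoePart != null)
        {
            // Copy the cube's material colour onto the selected shoe part.
            Color picked = other.GetComponent<Renderer>().material.color;
            selectedShoePart.material.color = picked;
        }
    }
}
```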
Building on the Prototype 1 findings, this iteration moved the customisation process into a digital environment.
Testing Results: The colour cube interaction received a mixed reception; users noted that sphere-based selection was less intuitive than the physical paintbrush from Prototype 1. Users wanted more immersive customisation options, though digital manipulation retained the intuitiveness of the physical prototype.
Impact on Next Iteration: I returned to the paintbrush concept, but this time in digital form, added an alternative customisation tool, enhanced immersion through realistic assets, and improved the gesture recognition system.
Prototype 3 was the final prototype of the project, and had a higher fidelity than Prototype 2. Building on previous versions, I introduced two high-fidelity customisation tools for A/B testing, a paintbrush and a paint roller, along with an improved shopping basket system and more refined gesture-based interactions. Visual assets were also enhanced to create a more immersive environment.
Testing Results: Using time-on-task and A/B testing, I concluded that the paintbrush was 10.91% more efficient than the roller option. Users rated immersion highly, describing the experience as "fun" and "exciting." The shopping basket integration received positive feedback, and users expressed interest in a try-on functionality.
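One way such an efficiency figure can be derived is as the relative time saving between the two tools. The mean task times below are invented examples chosen only to reproduce the reported 10.91%; the actual measured times are not given in this write-up.

```csharp
using System;

// Hypothetical illustration of the time-on-task comparison. Both mean
// times are invented for demonstration; only the formula is the point.
class EfficiencyComparison
{
    static void Main()
    {
        double brushMeanSeconds = 49.0;   // hypothetical mean time, paintbrush
        double rollerMeanSeconds = 55.0;  // hypothetical mean time, roller

        // Relative time saving of the brush compared with the roller.
        double improvement =
            (rollerMeanSeconds - brushMeanSeconds) / rollerMeanSeconds * 100.0;

        Console.WriteLine($"Paintbrush is {improvement:F2}% more efficient");
        // prints "Paintbrush is 10.91% more efficient"
    }
}
```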
Impact on Next Iteration: Future development would add a virtual try-on feature, refine the paintbrush mechanics, remove the paint roller, enhance environmental integration, and implement markerless tracking for foot detection.
The video below is a Meta Quest recording of Prototype 3 in action. (Feel free to hit 2x speed!)
The prototype was developed using Unity and C#, integrating mixed reality tools to support gesture-based interactions and real-time customisation. Spatial awareness features enabled accurate product placement and manipulation within the environment. Meta Building Blocks were used for hand tracking, gesture recognition, and the Passthrough API, enhancing immersion by anchoring objects in the user's physical space. I also wrote C# scripts to handle user interactions and tie the individual components together.
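The paintbrush interaction from Prototype 3 could be structured along similar lines to the colour sphere. Again, this is a hedged sketch rather than the project's actual code: the class name, the "ShoePart" tag, and the assumption that grabbing is handled elsewhere by Meta Building Blocks components are all mine.

```csharp
using UnityEngine;

// Hypothetical sketch of the Prototype 3 paintbrush: the brush tip
// recolours whichever shoe part it touches. Assumes the brush tip has a
// trigger collider and a Rigidbody, shoe parts are tagged "ShoePart",
// and hand tracking/grabbing is provided by Meta Building Blocks.
public class Paintbrush : MonoBehaviour
{
    // Colour currently loaded onto the brush, e.g. after dipping it in
    // the palette (palette logic not shown).
    public Color brushColour = Color.red;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("ShoePart"))
        {
            var partRenderer = other.GetComponent<Renderer>();
            if (partRenderer != null)
            {
                partRenderer.material.color = brushColour;
            }
        }
    }
}
```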
This project highlights the transformative potential of mixed reality in retail: the final prototype offered users an intuitive and engaging way to customise and purchase products in a virtual environment.