Virtual Reality Immersive Shopping

VR development & iterative prototyping project
User Testing
Iterative Prototyping
Unity
C#
Physical Prototyping
Meta Quest
The Challenge
How can VR be leveraged to create novel and sustainable online shopping experiences? 

This project explores the development of a virtual reality retail platform for a shoe retailer, designed to create an immersive, gesture-based shoe customisation experience. Over the course of the design process I undertook three iterative prototyping phases, refining the user experience after each round of testing based on feedback and results.
My Role
  • Responsible for the entire design and development process (solo project).
  • Conducted user testing and incorporated feedback to refine the experience.
  • Implemented mixed reality interactions and customisation tools utilising C#, Unity and Meta Quest through multiple prototype iterations.
Skip to:
Prototype 1 - Physical Testing · Prototype 2 - Digital Interaction · Prototype 3 - Refined Digital Experience · Technical Implementation · Impact & Outcomes · Reflection

Prototype 1: Physical Testing

The first prototype was physical, consisting of shoes, a paintbrush and palette, a shopping cart and an eraser. Testing revolved around three user interactions: grabbing the shoes and moving them around the environment to examine them from all angles, customising the shoes, and finally either adding a shoe to the shopping basket or discarding it.

The initial prototype used physical objects in a Wizard of Oz testing style to simulate basic interactions:

Shoe customisation storyboard
Physical prototype user test set up
Physical prototype artefacts

Testing Methods:
A range of testing methods were employed throughout the project. A/B testing compared the paintbrush and paint roller interactions (in later iterations of the prototype), task efficiency was measured, cognitive walkthroughs were conducted, and think-aloud protocols combined with post-testing interviews provided qualitative insights.

Testing Results:
Users found physical manipulation of shoes intuitive, and paintbrush interaction was more natural than expected. Shopping cart placement needed refinement, and the eraser tool caused confusion.

Impact on Next Iteration:
I focused on digitising the prototype, simplified the customisation workflow, and removed the eraser tool while emphasising natural manipulation gestures.

Prototype 2: Digital Interaction

The second prototype was a higher-fidelity digital build focused on testing the shoe customisation process. Users grabbed a sphere and moved it into contact with coloured cubes, causing the shoe to change colour; different parts of the shoe could be recoloured depending on which coloured cube was touched.
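The sphere-and-cube interaction can be sketched as a simple Unity trigger script. This is a minimal illustration, not the project's actual code: the class name, the `ColourCube` tag, and the Inspector-assigned shoe renderer are all assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the Prototype 2 interaction: attached to the
// grabbable sphere (which has a trigger collider and a Rigidbody).
// Touching a colour cube copies that cube's colour onto the shoe part.
public class ColourPicker : MonoBehaviour
{
    [SerializeField] private Renderer targetShoePart; // assigned in the Inspector

    private void OnTriggerEnter(Collider other)
    {
        // Only react to objects tagged as colour cubes (assumed tag name).
        if (!other.CompareTag("ColourCube")) return;

        // Read the cube's material colour and apply it to the shoe part.
        Color picked = other.GetComponent<Renderer>().material.color;
        targetShoePart.material.color = picked;
    }
}
```

Each coloured cube only needs a collider, the tag, and a material in the desired colour, which keeps the prototype quick to iterate on.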

Building on Prototype 1 findings, this iteration introduced digital customisation with:

VR environment initial state: 
concept sketch
VR environment initial state: execution
Customisation in action:
concept sketch
Customisation in action: execution
VR user testing session


Testing Results:
Colour cube interaction had mixed reception, and users noted the sphere-based selection was less intuitive than the physical paintbrush (from Prototype 1). Users desired more immersive customisation options, though digital manipulation retained the intuitiveness of the physical prototype.

Impact on Next Iteration:
I returned to the paintbrush concept, this time in digital form, added an alternative customisation tool, enhanced immersion through realistic assets, and improved the gesture recognition system.

Prototype 3: Refined Digital Experience

Prototype 3 was the final prototype of the project, built at a higher fidelity than Prototype 2. Building on the previous versions, I introduced two high-fidelity customisation tools for A/B testing, a paintbrush and a paint roller, along with an improved shopping basket system and more refined gesture-based interactions. Visual assets were also enhanced to create a more immersive environment.
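Because the paintbrush and paint roller differ only in their 3D model and collider, both A/B variants can share one painting behaviour. The sketch below is an assumption about how this could be structured in Unity; the class name, tag names, and palette-dipping mechanic are illustrative, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical shared script for both A/B tools (paintbrush and paint
// roller). Attached to the tool's tip, which carries a trigger collider.
public class PaintTool : MonoBehaviour
{
    // The tool's current paint colour, picked up from the palette.
    private Color currentColour = Color.white;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("PaletteWell"))
        {
            // Dipping the tool in a palette well loads that well's colour.
            currentColour = other.GetComponent<Renderer>().material.color;
        }
        else if (other.CompareTag("ShoePart"))
        {
            // Stroking a shoe part recolours it with the loaded paint.
            other.GetComponent<Renderer>().material.color = currentColour;
        }
    }
}
```

Sharing one script keeps the A/B comparison fair: any measured time-on-task difference comes from the tool's form factor and handling, not from differing interaction logic.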

Prototype 3 environment: concept sketch
Prototype 3 environment: execution
Prototype 3 checkout: concept sketch
Prototype 3 checkout: execution


Testing Results:
Using time-on-task and A/B testing, I concluded that the paintbrush was 10.91% more efficient than the roller option. Users rated immersion highly, describing the experience as "fun" and "exciting." The shopping basket integration received positive feedback, and users expressed interest in a try-on functionality.

Impact on Next Iteration: Future development would focus on adding a virtual try-on feature, refining the paintbrush mechanics, removing the paint roller, enhancing environmental integration, and implementing markerless tracking for foot detection.


The video below is a Meta Quest recording of Prototype 3 in action. (Feel free to hit 2x speed!)

Prototype 3 video demonstration

Technical Implementation

The prototype was developed using Unity and C#, integrating mixed reality tools to support gesture-based interactions and real-time customisation. Spatial awareness features enabled accurate product placement and manipulation within the environment. Meta Building Blocks were utilised for hand tracking, gesture recognition, and passthrough API implementation, enhancing immersion by anchoring objects in the user’s physical space. I also wrote C# scripts which handled user interactions and ensured seamless technology integration.
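As one hedged example of the interaction scripts described above, a pinch-to-grab behaviour can be built on the Meta XR SDK's `OVRHand` hand-tracking component. The script below is a sketch under that assumption; the class name and field wiring are illustrative, and the project's actual scripts may differ.

```csharp
using UnityEngine;

// Hypothetical sketch: pinch-to-grab using Meta's hand tracking.
// Attached to a hand anchor; the shoe reference is assigned when the
// hand comes close enough to an interactable shoe (logic not shown).
public class PinchGrabber : MonoBehaviour
{
    [SerializeField] private OVRHand hand;       // Meta XR SDK hand-tracking component
    [SerializeField] private Transform heldShoe; // shoe currently within grab range

    private void Update()
    {
        if (heldShoe == null) return;

        // OVRHand reports per-finger pinch state from the tracked hand.
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        if (pinching)
        {
            // While the pinch is held, the shoe follows the hand anchor,
            // letting the user examine it from all angles.
            heldShoe.SetPositionAndRotation(transform.position, transform.rotation);
        }
    }
}
```

Combined with the passthrough API, the grabbed object appears anchored in the user's physical space rather than a fully virtual scene, which is what gives the mixed reality setup its sense of presence.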

Impact & Outcomes

This project highlights the transformative potential of mixed reality in retail, offering users an intuitive and engaging way to customise and purchase products in a virtual environment.

Reflection
Throughout this project, I developed and refined key skills in designing immersive experiences and leveraging virtual reality for retail innovation.
One challenge I encountered was optimising interaction accuracy and intuitiveness. While the final prototype successfully improved usability, reflecting on the process I recognise the need for deeper exploration of gesture calibration and alternative input methods. This has been a valuable learning experience that will inform my future work in immersive design, and has sparked an interest in the capabilities and potential of virtual reality design.