
Vodafone Network Sensing API
Project type: UX/UI, exhibition stand, video editing
Date: 3 March 2025
Location: MWC 2025, Barcelona
For Vodafone’s debut of its innovative Network Sensing API at MWC 2025, I led the UX/UI design for an immersive demo that simulated the API’s real-time object detection capabilities. Because the technology wasn’t yet ready for public release, my role was to design a complete, believable end-to-end experience that introduced attendees to the future potential of the API — supported by motion and development teams who helped bring the simulation to life.
The brief posed a unique challenge:
Show the world a technology that isn’t live yet — and make it feel real.
Vodafone’s Network Sensing API has the potential to detect objects like cars, people, boats, and bikes simply through the mobile network. But because the API was still under development, we had to create a believable simulation that demonstrated what the system would feel like once fully functional.
The installation featured a large, physically built map with a transparent LED screen suspended in front of it, creating an augmented-reality-style layer. As objects moved across the map, the screen appeared to “detect” them, displaying their paths and contextual data.
I designed the complete UX/UI system from scratch, mapping every state change, loading sequence, detection moment, and transition. Because nothing was actually sensed in real time, the experience needed to be carefully choreographed to feel smooth, intelligent, and believable.
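To give a sense of how that kind of choreography can be structured, here is a minimal sketch of one way to drive each on-screen object from a pre-authored cue list instead of live sensing. This is an illustration only, not the production build: every name, state, and timing below is hypothetical.

```typescript
// Hypothetical sketch: names, states, and timings are invented,
// not taken from the actual Vodafone build.

type DetectionState = "hidden" | "scanning" | "detected" | "tracking";

interface Cue {
  at: number;             // milliseconds from scene start
  state: DetectionState;  // state the UI enters at this moment
}

class ChoreographedObject {
  private state: DetectionState = "hidden";
  private next = 0; // index of the next cue still waiting to fire

  constructor(
    public readonly id: string,
    private readonly cues: Cue[], // pre-authored, sorted by `at`
    private readonly onChange: (id: string, s: DetectionState) => void,
  ) {}

  // Called once per frame with the elapsed scene time. Because the cues
  // are scripted rather than sensed, every run of the demo is identical.
  update(elapsedMs: number): void {
    while (this.next < this.cues.length && elapsedMs >= this.cues[this.next].at) {
      const cue = this.cues[this.next++];
      if (cue.state !== this.state) {
        this.state = cue.state;
        this.onChange(this.id, this.state); // kick off the matching UI animation
      }
    }
  }
}

// Example: a car that is "detected" two seconds in, then tracked.
const car = new ChoreographedObject("car-01", [
  { at: 0, state: "scanning" },
  { at: 2000, state: "detected" },
  { at: 2600, state: "tracking" },
], (id, s) => console.log(`${id} -> ${s}`));

car.update(2100); // logs "car-01 -> scanning", then "car-01 -> detected"
```

The appeal of a scripted cue list for a trade-show piece is repeatability: the demo behaves identically on every loop, which is exactly what made the simulation read as a polished, working system.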
A key part of the project was creating brand-consistent assets for a technology that had no precedent. Vodafone had no UI components or iconography for network sensing — so I developed a new set of visual elements and icons that aligned with their brand guidelines while introducing a design language suitable for machine detection. These icons have since been adopted into Vodafone’s global asset pack.
Once the system’s logic, tone, and visual language were established, I worked closely with our motion designer to choreograph the behaviour of each object — how a car slowed at a junction, how a boat glided across water, how a person followed a path — and paired each with responsive UI animations. I then collaborated with developers to sync the timing, logic triggers, and transitions, ensuring the entire experience felt cohesive and “live,” even though it was a controlled simulation.
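As a rough illustration of that syncing problem, the sketch below shows one common approach: drive every UI trigger from the motion render's own clock, so the overlay and the animation can never drift apart. It is a hypothetical sketch, not the production code, and it assumes the motion render plays as a looping video in a browser context.

```typescript
// Hypothetical sketch: assumes the motion render is a looping <video>
// element and the overlay UI runs in the same browser context.

interface Trigger {
  at: number;        // seconds into the loop
  fire: () => void;  // e.g. start a detection callout animation
  fired?: boolean;
}

function attachTriggers(video: HTMLVideoElement, triggers: Trigger[]): void {
  let lastTime = 0;
  const tick = () => {
    const t = video.currentTime;
    if (t < lastTime) {
      // The video looped back to the start: re-arm every trigger.
      triggers.forEach((trig) => (trig.fired = false));
    }
    for (const trig of triggers) {
      if (!trig.fired && t >= trig.at) {
        trig.fired = true;
        trig.fire();
      }
    }
    lastTime = t;
    requestAnimationFrame(tick); // poll on every display frame
  };
  requestAnimationFrame(tick);
}

// Example usage (showCallout is a hypothetical overlay function):
// attachTriggers(document.querySelector("video")!, [
//   { at: 4.0, fire: () => showCallout("car-01") },  // car slows at junction
//   { at: 9.5, fire: () => showCallout("boat-02") }, // boat crosses the water
// ]);
```

Keying everything to a single clock is what keeps a simulated system feeling "live": the detection moments land on the same frame of motion on every pass through the loop.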