Hegias - Building Imagination

UI/UX Design
Project Overview
HEGIAS turns static building models into immersive, browser-based VR tours. Construction and real estate teams can walk through a space, share feedback in real time, and make decisions together.
My Role
At HEGIAS, I worked on different parts of the product, from the VR headset UI to the web-based CMS and a pipeline that connects BIM files to VR.
Here are some key features I worked on:
  • Issue Management in VR – letting users mark problems or leave comments directly in the virtual space.
  • VR Showroom Navigation – guiding users through spaces in a smooth, intuitive way so they never feel lost.
  • BIM Info Viewer – making it easy to check the technical details of objects while inside the model.

User-Centred Design Thinking: How I Solve Problems

Issue management in VR

Problem
Teams could explore the BIM model in VR, but as soon as they noticed an issue, they had to take off the headset, open a laptop, and write it down somewhere else. That break in flow meant:

• Context was lost—screenshots and coordinates weren’t tied to the comment.
• Notes lived in emails, PDFs, or sticky notes, so issues slipped through the cracks.
• Decisions stalled because designers, engineers, and site crews never saw the same information at the same time.

In short, the lack of an in-headset issue tracker turned a powerful VR review into a slow, error-prone feedback loop.
Goals
Allow anyone inside the VR model to flag, describe, and share an issue in under 60 seconds, using just one controller. The flow includes all key BIMcollab fields (title, status, priority, assignee, labels) and a familiar tablet-style menu, so feedback stays in context and flows straight into the coordination workflow.
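As a rough illustration of how such an issue can stay tied to its context, here is a minimal, hypothetical TypeScript model of the capture payload. The field names and types are assumptions, not the actual HEGIAS or BIMcollab schema; only the fields listed above (title, status, priority, assignee, labels) plus the screenshot and coordinates mentioned earlier come from the case study.

```typescript
// Hypothetical payload for an issue captured inside the headset.
// Only the BIMcollab-style fields (title, status, priority, assignee, labels)
// and the screenshot/coordinate context come from the case study;
// everything else is illustrative.
interface VrIssue {
  title: string;                                   // short description entered or dictated in VR
  status: "open" | "in-progress" | "resolved";     // simplified status set
  priority: "low" | "normal" | "high" | "critical";
  assignee?: string;                               // person responsible for the fix
  labels: string[];                                // e.g. ["HVAC", "clash"]
  comment?: string;                                // free text or transcribed voice note
  screenshot?: Blob;                               // frame grabbed from the headset view
  position: { x: number; y: number; z: number };   // world coordinates of the marker
  createdAt: string;                               // ISO timestamp
}

// Example: a high-priority issue flagged with one controller.
const ductClash: VrIssue = {
  title: "Duct clashes with sprinkler line",
  status: "open",
  priority: "high",
  labels: ["HVAC", "clash"],
  position: { x: 12.4, y: 3.1, z: -7.8 },
  createdAt: new Date().toISOString(),
};
```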
Design Process
  • Competitive scan – Reviewed VR/AR and BIM tools (Revizto, Navisworks, Resolve, etc.) to see how they handle issue tracking, and where they fall short in VR.
  • Rapid prototyping – Sketched flows in FigJam and built hi-fi mockups in Figma to test controller reach, font size, and photo timing.
  • Usability testing – Ran two rounds with 10 mixed-role users. Iterated on button sizing and field defaults, and added a voice-note shortcut for gloved users.
  • Dev handoff & QA – Delivered design specs and prefab guidelines, and worked closely with Unity developers to ensure proper data sync with BIMcollab in staging.
Impact
  • 40-second capture – Logging an issue now takes about 40 seconds instead of a two-minute laptop detour.
Other Projects
As part of the HEGIAS CMS redesign, I worked on key flows that connect VR projects to BIMcollab—including device pairing, user management, and account settings. I also designed an in-VR tutorial to help users get started smoothly. My focus was on clarity, consistency, and turning complex actions into intuitive experiences for both admins and guests.

Client project: Kirchner Museum

Problem
Visitors enjoyed the life-size VR gallery, but many got disoriented or missed artworks—and some felt “stuck” inside the model.
From doodle to headset ✏️ → 🥽
Early storyboard sketches on a 360° VR grid helped me map sightlines and UI placement before I even opened Figma. The final navigation cues you see in the headset still follow those original pencil lines.
Key features I designed
Check out the live version →

Client project: Richner

Problem
Richner’s showrooms have limited floor space, so customers struggle to picture all the possible layouts, finishes, and room sizes for a new bathroom. Even in VR, early prototypes hid the controls in sub-menus, so shoppers kept asking staff, “How do I change the vanity?”
Goal
Give customers a quick, self-serve way to tweak room size, style, lighting, and objects, whether they’re wearing a headset in the store or clicking through the web demo, so they can make decisions on the spot. 💫
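As a sketch of what self-serve tweaking could mean in data terms, the hypothetical TypeScript shape below models the configurator state shared by the headset and the web demo. All names and option values are illustrative assumptions, not the actual Richner or HEGIAS implementation.

```typescript
// Hypothetical configurator state shared by the in-store headset and the web demo.
// Names and option values are examples, not the shipped data model.
interface RoomConfig {
  width: number;                            // room width in metres
  depth: number;                            // room depth in metres
  style: "classic" | "modern" | "minimal";
  lighting: "warm" | "neutral" | "daylight";
  objects: string[];                        // ids of placed items, e.g. ["vanity-01", "shower-01"]
}

// Swapping an object (e.g. the vanity) returns a new state so both the VR view
// and the web viewer can re-render from the same source of truth.
function swapObject(cfg: RoomConfig, oldId: string, newId: string): RoomConfig {
  return { ...cfg, objects: cfg.objects.map(id => (id === oldId ? newId : id)) };
}

// Example: the shopper replaces the vanity straight from the toolbar.
const before: RoomConfig = {
  width: 2.4,
  depth: 3.0,
  style: "modern",
  lighting: "warm",
  objects: ["vanity-01", "shower-01"],
};
const after = swapObject(before, "vanity-01", "vanity-02");
```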
Key features I designed
Validation · Usability Testing
We conducted a think-aloud session at the Hegias office, where the VR headset view was mirrored to a laptop and wall screens—allowing the team to observe user interactions and gather live feedback.
What we focused on:
  • Can first-time users find the toolbar and change room size without instructions?
  • Do web viewers understand what the person in VR is doing?
  • Are the hover and gaze states clear enough to show what’s interactive?
Impact
Check out the live version →
My first day at Hegias 🌿💻✨