Oracle Aconex · 2019

Fixing punchlists by going to the job site

To design a better experience for site supervisors, I didn't start with a whiteboard. I went to an actual Multiplex construction site and spent a day watching how work really gets done.

Ethnographic Research · UX Design · Mobile + Web · Prototyping · Usability Testing
My Role
Senior Product Experience Designer
Platform
iOS, Android, Web
Company
Oracle Aconex
Duration
[X months] · 2019
Status
Shipped ✓
[ Hero image — construction site or app screenshot ]

Aconex Field — punchlists on mobile and web, on a Multiplex construction site in Melbourne.

Aconex is the platform that runs construction projects

Used by construction and engineering teams across 70+ countries, Aconex manages documents, workflows, and field activity on some of the world's biggest infrastructure projects. Punchlists — the industry term for defect tracking and sign-off — were a core feature, but one that had grown unwieldy over time.

Site supervisors were doing their rounds with printed lists, clipboards, and phone photos. The app existed but wasn't being used on site. That was the signal.

"Nobody uses it on site. They do the walk, take photos on their phone, then come back to the office and enter everything manually."

— Site supervisor, Multiplex Melbourne

We were designing for a version of the job that didn't exist

[Placeholder — describe the gap between what the product assumed and what actually happens on a construction site. Bright sun, gloves, hard hats, no time to type, spotty signal.]

[Describe what "doing a punchlist" actually means physically — walking floors, photographing defects, assigning trades, signing off items — and why the existing UI failed each of those moments.]

[ Before: existing UI screenshot ]

The existing punchlist UI — designed for desktop, used on mobile.

[ Field observation photo ]

On-site observation, Multiplex Melbourne. Supervisors with clipboards.

I went to the site

Rather than relying on stakeholder interviews alone, I arranged an ethnographic field visit to a Multiplex construction site in Melbourne. I shadowed a site supervisor through their full inspection round — from the morning briefing to the final sign-off walk.

01

Ethnographic field visit

[What you observed. How long. Who you followed. What surprised you.]

02

Stakeholder interviews

[Who you spoke to — project managers, site supervisors, subcontractors. Key themes that emerged.]

03

Competitive analysis

[What other tools were people using. What they liked about them. Where Aconex had ground to make up.]

04

Synthesis & insight mapping

[How you translated field notes into design principles. Key insights that changed the direction.]

[ Research synthesis — affinity map, insight board, or field notes ]

Research synthesis from field visit and stakeholder interviews.

What the site taught us

[X]
[Key metric from research — e.g. avg time to log a defect manually]
[X]%
[Another stat — e.g. supervisors who reverted to clipboard after onboarding]
[X]
[Third insight stat — e.g. pain points identified across all sessions]

[Expand on the 2-3 most important design insights that came from the research. What changed your assumptions? What did you see that you couldn't have guessed from the office?]

Designing for gloves, sun, and no signal

[Walk through the design decisions. How did the insights translate into specific UI choices? Larger tap targets, offline-first, camera-first flow, etc.]

Exploration & iteration

[Describe the ideation phase — sketches, wireframes, early concepts that were explored and discarded.]

[ Wireframes or early concept sketches ]

Early wireframes exploring the camera-first inspection flow.

The solution

[Describe what you landed on. What is the core interaction model? What makes it different from before?]

[ Screen 1 ]

Quick-add from camera

[ Screen 2 ]

Offline-ready item list

[ Screen 3 ]

One-tap sign-off

[Your key design principle or the "aha" moment that unlocked the right solution.]

Validated with real users, back on site

[How you tested. Prototype fidelity. Who participated. Where sessions were run — office or back on site?]

[Key findings from testing. What passed. What needed iteration. What changed as a result.]

[ Usability testing session or prototype screenshot ]

[Testing setup description.]

Shipped — and actually used on site

[Describe what shipped. The final product. Any phased rollout or pilot before full release.]

[X]%
[Outcome metric — e.g. increase in mobile adoption]
[X]x
[Second outcome metric]
[X]
[Third outcome or qualitative win]

[Qualitative outcomes — what changed for users. Any quotes from supervisors after the redesign? Stakeholder reactions?]

What I'd do differently

[Honest reflection. What you'd change knowing what you know now. Shows maturity and self-awareness.]
