Omnicell ADC

Problem 01 — Nobody Owned the Seam · Omnicell · Jacqueline Smith
Omnicell · 2020–2024 · Problem 01 of 03
Org Design · Hardware-Software Integration · Field Testing

Four teams. One device. Nobody owned the seam.

Hardware decisions were being made that would permanently constrain what the interface could do — and nobody had asked what any of it meant for the software experience. This is the story of how I made myself the person who owned that gap.

Omnicell Automated Dispensing Cabinet — medication storage on every hospital floor
Who
4 teams, 1 device, no shared kickoff
Industrial design firm, electrical engineering, and two scrum teams — none looking at the full experience together.
Risk
Hardware decisions with permanent UI consequences
Screen size, orientation, input method — every hardware choice constrained what the software could do.
Stakes
10-year hardware lifecycle
Every software decision I made would live inside this device for up to a decade. Getting the seam wrong wasn't recoverable.

The brief wasn't wrong. Nobody who wrote it had been in the room.

What I walked into

When I joined Omnicell I was brought in to lead SaaS UX. A few months later I was reorged and ended up owning the entire Point of Care product line — including the next-generation Automated Dispensing Cabinet redesign already in motion.

Four teams were working on one device with no shared view of the full experience. Industrial design was an outside firm. Electrical engineering was the bridge between hardware and software. Two scrum teams split pharmacy and nursing workflows. By the time I was in the room, hardware decisions had already been made with UI implications nobody had mapped.

The tracks were being laid while the train was moving. Multiple form factor concepts in play simultaneously — different screen sizes, orientations, surface materials, input methods. None of it decided. All of it consequential for the software.

The core problem
Screen size determines information density. Orientation determines reading distance. Input method determines whether you design for a mouse, a finger, a gloved finger, or a scanner. Every one of those choices is a design decision — it just wasn't being treated as one.
What was missing
No single team owned what happened between hardware and software — the moment the drawer opened, the moment the light came on, the moment the system registered the action. That seam is where the experience breaks down.
My move
I became the person asking the questions nobody had asked. Not because it was in my job description — because if the screen fails in the environment, the interface never had a chance.
Hardware form factor concept A
Hardware form factor concept B
Hardware form factor concept C

Multiple hardware form factor concepts in play simultaneously — each one a permanent constraint on what the software could do.

If the screen fails in the environment, the interface never had a chance.
— The principle that drove every hardware-software decision

I don't design from a desk.

Site visits · Omnicell · 2020–2024

Redesigning hardware with a decade-long lifecycle isn't a desk job. I went on-site, gowned up, and spent time watching how people actually used this hardware in live hospital pharmacies and nursing stations.

What you see here tells the story before I say a word: a tethered scanner, a keyboard next to the screen, someone reaching up into a cabinet with both hands occupied. Everything observed in those visits shaped every interface decision that followed.

Environmental conditions weren't edge cases — they were the spec. Overhead fluorescent lighting. Noise. Physical display positioning. Gloved hands. Users who are never fully stopped when they interact with the screen.

Site visit — legacy cabinet in use at hospital pharmacy

On-site contextual inquiry in live hospital pharmacies. Environmental conditions — lighting, noise, display position — shaped every interface decision.

Not a research deliverable. An alignment tool.

Service blueprint · Built without being asked

I built a service blueprint that nobody asked for. Three swim lanes — user actions, hardware, and software — mapped across every workflow the device was part of.

The device had one job: meds in, meds out. A drawer opens, someone puts something in or takes something out, and the system tracks it. Simple mental model. But when you laid the blueprint out, no single team owned that full loop. The hardware team owned the drawer. The software team owned the screen. Nobody owned what happened in between.

That's the seam. The blueprint made it visible for the first time. Every team was suddenly looking at the same device doing the same job — not their piece of it, all of it. Once you can see the seam, you can own it. That's what the blueprint was for. Not documentation. Alignment.

Three swim lanes
User actions, hardware, and software — mapped together for the first time. The template was applied to every workflow across every user type.
The insight it revealed
No single team owned the moment between hardware and software. Once the blueprint made that visible, every team could see — and fix — the seam.
Service blueprint — three swim lanes across user actions, hardware, and software

Three swim lanes. One template. Applied to every workflow across every user type.
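
To make the structure concrete, here is a minimal sketch of how one blueprint row could be modeled in code. It is an illustration only; none of these names or strings come from the actual artifact.

```typescript
// A hypothetical model of one blueprint step, not the actual artifact.
// Each step records all three swim lanes plus an explicit owner for the
// hardware-to-software handoff. All names and strings are illustrative.
interface BlueprintStep {
  userAction: string;       // what the nurse or pharmacist does
  hardwareEvent: string;    // what the cabinet does: drawer, lock, bin light
  softwareEvent: string;    // what the system records or displays
  seamOwner: string | null; // team accountable for the handoff, if any
}

// The seam is wherever a handoff has no accountable owner.
const findSeams = (steps: BlueprintStep[]): BlueprintStep[] =>
  steps.filter((step) => step.seamOwner === null);

const removeWorkflow: BlueprintStep[] = [
  {
    userAction: "Nurse confirms the remove on screen",
    hardwareEvent: "Drawer unlocks; bin light turns on",
    softwareEvent: "System registers the removal",
    seamOwner: null, // the gap the blueprint made visible
  },
];

console.log(findSeams(removeWorkflow).length); // 1 unowned handoff
```

Applied across every workflow, a check like this turns "nobody owns the seam" from a feeling into a list.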

The environment had opinions. We had to listen to them.

Physical environment testing · Screen sizes, glare, viewing angles, touch targets

The blueprint told us what questions to ask. The environment told us what the answers had to be. We tested screen sizes and orientations mounted on actual cabinets — not in a lab, on the hardware itself.

We looked at glare from overhead fluorescent lighting. We tested viewing angles because the person reading this screen is rarely standing directly in front of it. We did smudge tests. We looked at touch target sizes for people who weren't always able to stop moving to interact with the screen.

The environment had opinions and we had to listen to them. Our UI didn't always hold up under those conditions, and those failures drove everything that came next.

Physical testing setup — screen mounted on actual cabinet

A misread screen in this environment isn't a UX failure — it's a safety incident.

01
Screen size + orientation
Tested multiple sizes and orientations mounted on actual cabinets. Screen size determines information density; orientation determines reading distance. Neither was a software decision alone.
02
Glare + lighting conditions
Direct overhead fluorescent lighting washes out dark UI. The dark theme vision made sense on a monitor in a design review. It didn't survive the environment — and that finding changed the design system strategy entirely.
03
Viewing angles + smudge behavior
Users are rarely standing directly in front of the screen. Off-angle legibility and smudge resistance under repeated gloved-hand use became real constraints, not edge cases.
04
Touch target size for moving users
Users weren't always able to stop moving to interact with the screen. Standard web touch targets weren't sufficient. This drove the kiosk-specific component layer built on top of the existing design system.

Unified doesn't mean identical.

Extending Greenlight — the existing design system — for a new physical context

The constraint was an existing design system — Greenlight — built for web and optimized for someone sitting at a desk with a mouse. The response wasn't to abandon it. It was to extend it.

Same tokens. Same components. But a kiosk-specific layer built on top. The environment is a design input, not a constraint to solve after the fact — and the design system had to encode that understanding by default, so any screen built on top of it inherited those constraints automatically.

Responsive web design wasn't the answer here. Responsive web design assumes a browser. This was a kiosk bolted to a wall in a medication room, operated by someone in gloves who has somewhere else to be in thirty seconds.

Design system — kiosk type styles
Design system — classic vs kiosk component comparison

One design system. Extended — not replaced — for every surface.

01
Readable at 6 feet
1.5x type scale. Shoulder-height reading distance. No squinting at the cabinet.
02
PII minimized by default
Patient information surfaced only when clinically necessary. A hospital floor isn't a private environment.
03
Fewer screen interactions
Scanner advances the workflow. The screen confirms. Less touching, more restocking.
04
Built for gloved hands
Larger touch targets. Increased spacing. Designed for the environment, not the ideal.
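
As a sketch of what "extended, not replaced" can look like at the token level: the 1.5x type scale comes from the findings above, but the token names and every other value here are hypothetical, not Greenlight's actual definitions.

```typescript
// Hypothetical token sketch of "extend, don't replace." Only the 1.5x
// type scale comes from the findings above; everything else is illustrative.
const baseTokens = {
  fontSizeBody: 16,   // px, desk viewing distance, mouse input
  touchTargetMin: 44, // px, standard finger target
  spacingUnit: 8,     // px
};

// Kiosk layer: the same token contract, re-derived for shoulder-height
// reading distance and gloved-hand input from a user in motion.
const kioskTokens: typeof baseTokens = {
  ...baseTokens,
  fontSizeBody: baseTokens.fontSizeBody * 1.5, // legible at ~6 feet
  touchTargetMin: 64,                          // gloved finger, moving user
  spacingUnit: 12,                             // wider gaps between targets
};
```

The shape is the point: any component written against the token contract inherits the kiosk constraints automatically the moment it ships on the cabinet.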

Same screen. Three years of learning.

The Remove screen — a controlled substance interaction — evolved across three fundamentally different paradigms

Remove screen V1 — 2022 wireframe
Version 01 · 2022
Wireframe based on prior solution
Square screen. Physical keyboard. No hierarchy — everything weighted equally. Required reading or muscle memory.
Remove screen V2 — 2023
Version 02 · 2023
Built from clinical requirements
Horizontal screen. Every clinically relevant piece included — but no hierarchy. Under pressure, nurses couldn't find what mattered.
Remove screen V3 — 2024 final
Version 03 · 2024
Designed for 6 feet away, bending and reaching
Hierarchy above everything. QTY dominates. Details on demand. Legible at a glance without reading.

The Remove screen is a controlled substance interaction. The stakes are not abstract. The legacy screen had no hierarchy: everything the same size, the same visual weight, built around what the database needed to record rather than what the nurse needed to do.

V3 is organized around a single principle: glanceability under pressure. QTY dominates because a nurse pulling meds at 2am reads the number before anything else. Waste requirement surfaces inline, never behind a modal. Bin location is mapped spatially. Alert state surfaces without interrupting the workflow.

Whiteboard concepting session for V3 Remove screen

V3 started on a whiteboard, with a question nobody had answered cleanly: what does a nurse actually need to see when pulling a medication at 2am under pressure?

What I'd do differently.

Every reflection here is honest: these aren't lessons learned in retrospect; they're things I was advocating for in real time that ran into organizational resistance.

On cross-functional integration
I should have been in the room with hardware and industrial design from day one. By the time I was working with electrical engineering, permanent decisions had already been made. That wasn't a personal failure — it was a symptom of how the org was structured. But knowing what I know now, I'd have pushed harder to be at that kickoff regardless of whether it was in my job description.
On the dark theme discovery
We had long hypothesized that dark UI would fail under direct overhead fluorescent lighting, and we kept advocating that position internally. We should have run physical environment testing earlier and put documented results in front of stakeholders sooner. Showing the data would have made the argument easier and saved time spent designing toward a constraint the environment was always going to override.