# A different kind of user

There's a type of user design writing almost never talks about seriously. Not the frustrated user, not the power user, not the one we're trying to convert. The user who genuinely doesn't care about the product and structurally never will.

I've spent several years designing for compliance and fintech companies, and this user is everywhere in those industries. AML analysts, underwriters, compliance officers – and further out, warehouse supervisors, factory floor operators. Smart, experienced people who are very good at their jobs, and whose jobs have nothing to do with software. The software is just something they're stuck with.

I'll focus mostly on compliance because that's where I've worked, but the pattern repeats. Working with these people changed how I think about design more than anything else has.

## They didn't choose this

In B2B software there's always a gap between who buys the product and who actually uses it. That gap matters more than most design teams acknowledge.

Companies buying RegTech solutions aren't doing it because they're enthusiastic about compliance tooling. They're doing it because they operate in a regulated jurisdiction and the alternative is fines, sanctions, or worse. It's closer to buying insurance than buying software – a necessary cost, not an investment in something they're excited about. Once the box is checked, interest largely stops there.

The people actually using the product – the analysts, the risk officers, the operations teams – inherit a tool they had no say in choosing. When you update it, redesign a flow, or improve something that wasn't working, their reaction is different from what you'd get with a consumer product. There's no baseline enthusiasm to cushion the friction of change. Every UI update is just more cognitive load on top of a job that already demands a lot. The reaction may look like ingratitude, but it's a rational response to something asking for attention that's already committed elsewhere.
This also explains why these users can be extremely specific about their needs in ways that are hard to anticipate without actually watching them work. The conditions of use and the conditions of design are often nothing alike. A clean, minimal notification – perfectly reasonable by any standard design metric – is useless for a warehouse worker scanning items with a handheld terminal who is never looking directly at the monitor. What works there is saturated color, heavy type, signals legible in peripheral vision. Things that would look wrong in a design review but are correct in context.

This isn't an edge case. It's the normal condition of design in operational environments.

## A gap worth naming

The problem isn't just context or physical environment. It's about how expertise changes the relationship between a person and a screen.

A compliance analyst reviewing a suspicious transaction alert has done this hundreds of times. She already knows what she's looking for. She's scanning for the one or two signals that either confirm or break the pattern. Most of the time she's done before she's finished reading the screen.

In contrast, general design practice assumes users are figuring something out – reading, comparing, deciding. The interface supports that process.

There's actually quite a bit of research on how experienced people make decisions under pressure, and it's worth looking at before getting into what this means for design.

## How expert decisions actually happen

Gary Klein spent years studying decision-making in high-pressure environments – firefighters, military commanders, intensive care teams – and what he found was pretty counterintuitive.

We tend to assume decisions happen through careful evaluation – weigh the options, pick the best one. That's also how most software is designed to support decisions. Klein found that experienced operators almost never worked this way. Instead they recognized situations.
They matched what they were seeing to a pattern built from years of experience, generated one plausible course of action, ran a quick mental simulation, and acted. The whole process happened in seconds, often without conscious deliberation. He called this Recognition-Primed Decision Making.

The practical implication for interface design is that the expert has often already made the decision before they've finished reading your interface. What they need from the screen isn't a structured process to work through – it's fast confirmation that their read is correct, or a clear signal that something doesn't fit.

Klein also found something that matters for how we do research in these contexts. If you ask an expert to describe how they made a decision, they give you a rational, sequential account that didn't actually happen. The real process is largely non-verbal and hard to introspect on, so people reconstruct a cleaner version when asked to explain themselves. Standard user interviews in expert domains produce a plausible fiction rather than useful data. To understand what these users actually need, you have to observe them working, not ask them to narrate their thinking afterward.

Mica Endsley's work on situation awareness adds another layer. She identified three levels at which operators build their understanding of a situation: perceiving that something exists, understanding what it means in context, and projecting what's likely to happen next. Most enterprise interfaces address only the first – they surface data and stop there. The second and third levels are where expertise actually operates.

## What this means for the interface

A few things follow from all of this.

### Surface the thing that doesn't fit

Standard dashboard logic is about completeness – show everything, let the user find what matters. For expert users this is usually the wrong starting point. They're not constructing a picture from scratch.
The anomaly is what matters: the single data point that disrupts the pattern they've already formed, the transaction that doesn't match a customer's established behavior, the number that's slightly off.

In compliance case review, visual hierarchy should serve pattern confirmation rather than comprehensive display. Normal should recede. Anomalous should be immediately visible – not buried in a consistent visual treatment that makes everything look equally important.

### Match the interface to the user's mental model, not the system's logic

One of the most persistent problems in complex B2B products is that internal system architecture bleeds into the interface. Things get named and organized according to how the system works rather than how the user thinks about their domain. Users end up having to learn two things – their actual job, and the logic of the software – when they should only need one.

Compliance teams come in with their own risk frameworks already figured out – spreadsheets, internal documents, years of thinking about what matters for their specific business. Then they open a compliance tool and have to translate all of that into the product's internal logic. Sometimes it works fine. Sometimes they have to abandon their mental model entirely and start from scratch with the tool's. That second scenario is what good interface design should be working against.

### The documentation problem is structural, not a UX failure

Almost every case management system in compliance serves two audiences simultaneously: the analyst doing the work, and the regulator or auditor who later needs to verify it was done correctly. These two audiences need fundamentally different things from the same interface, and that tension doesn't go away no matter how well you design.
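One way to picture the two-audience problem is a single closing action that feeds both readers at once: the analyst supplies a quick disposition, and the system assembles the structured record the auditor needs. The sketch below is purely illustrative – the names (`closeCase`, `AnalystInput`, `AuditRecord`, `routineSteps`) are hypothetical, not from any real case management system – and it shows one mitigation this essay discusses: pre-populating documentation for routine outcomes while forcing an explicit rationale for non-routine ones.

```typescript
// Hypothetical sketch: one closing action serves two audiences.
// All names here are illustrative, not a real system's API.

type Disposition = "no_action" | "escalate" | "file_sar";

interface AnalystInput {
  caseId: string;
  disposition: Disposition;
  note?: string; // free text; required only for non-routine outcomes
}

interface AuditRecord {
  caseId: string;
  disposition: Disposition;
  reviewSteps: string[]; // structured trail the regulator reads
  note: string;
  closedAt: string;
}

// Contextual defaults: routine closes get pre-populated review steps
// so the analyst isn't forced to re-type boilerplate every time.
const routineSteps = [
  "alert reviewed against customer profile",
  "transaction pattern consistent with history",
];

function closeCase(input: AnalystInput, now: Date = new Date()): AuditRecord {
  const routine = input.disposition === "no_action";
  if (!routine && !input.note) {
    // Non-routine outcomes still demand an explicit analyst rationale.
    throw new Error("escalations and filings require an analyst note");
  }
  return {
    caseId: input.caseId,
    disposition: input.disposition,
    reviewSteps: routine ? [...routineSteps] : [],
    note: input.note ?? "closed as routine: no deviation from established pattern",
    closedAt: now.toISOString(),
  };
}
```

The design choice here is that the fast path (routine close) costs the analyst almost nothing, while the slow path deliberately keeps its friction – which is exactly the sense in which the tension is managed rather than resolved.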
The analyst needs to move fast, act on pattern recognition, and close cases without unnecessary friction. The regulator needs a structured record demonstrating that a proper review process was followed. In practice these requirements pull in opposite directions, and it's not always obvious which one a given design decision is serving.

You can reduce the cost of documentation – smarter defaults, contextual pre-population, fewer required fields for routine cases. But it's worth being honest: you can't resolve this tension completely, only manage it.

### Sometimes the environment you're designing for is not the environment you're designing in

There's a case study from NLMK, a large Russian steelmaker, that I keep coming back to. Their design team had built clean, modern interfaces – light theme, thin fonts, subtle colors. Reasonable choices, following all the design standards. But they kept getting extremely negative feedback from factory workers.

When they finally visited the plant, they found that factory halls are dark, monitors are older models, and workers read numbers at a glance while doing something else entirely. The interface that looked great in the office was nearly unreadable on the floor.

They rebuilt the whole system around the actual conditions – dark themes, heavy monospaced type, large unambiguous indicators. Not things that would look good in a portfolio, but things that actually worked.

The compliance context is less dramatic. But an analyst switching between tabs trying to find fraud patterns under time pressure is also in a different headspace from a design review.

## Unlearning delight

Design culture has a rich vocabulary around engagement – onboarding flows, micro-interactions, moments of delight. These things matter in consumer contexts, where the product has to earn continued attention.
In operational contexts, most of it is beside the point. The compliance analyst doesn't want to be delighted. She wants to close her cases and move on.

What actually helps here is genuine domain knowledge – a real understanding of what the work demands and what expertise looks like in practice. Sitting next to the analyst, understanding the job before touching the interface.

The goal isn't a product people enjoy using. It's a product that gets out of the way fast enough that people can focus on what they're actually there to do. Less thinking about the user's relationship with the product, more thinking about the user's relationship with their actual job.