“The day they took away the paper towels”
(Or: How my need to create hand washing puns led to me writing a blog post about using established schemas in UX design)
Here at Dubit Towers, like in many other offices across the UK, the principal method for drying hands was the humble paper towel. It remained this way for a long time; very little changed, besides moving the bin to under the paper towel dispenser (genius). Drying your hands on paper towels became ingrained into the muscle memory of every hand washing Dubiteer.
...Until last month. Last month we replaced the paper towels. In their place, installed closer to the sinks and gleaming resplendently, we now have an extremely powerful, cyclone-style hand dryer. The kind that makes the skin on your hands ripple in an effect reminiscent of the briefly popular 80s TV show “Manimal”. (Yes, I am showing my age.)
No more paper towels would be stocked. The dispenser now sits on the wall, bereft. High-velocity warm air is the preferred method for water removal, although a few Dubiteers can be seen wandering around with wet hand marks proudly emblazoned on the pocket areas of their jeans and trousers in an act of cheeky defiance.
The dryer itself is a technological wonder - a tornado in miniature. It should’ve been an overnight success, having everything in its favour: the process is simpler, it generates no waste, and its proximity to the sink demands only two steps instead of the previous four. Taylor would be proud. It should be a better user experience. We should be celebrating?
[Narrator: They’re not celebrating]
“It’s hard to teach an old dog new tricks”, and it would seem it’s even harder to forget how to dry your hands.
It’s now an awkward process. We fight an old habit which has been reinforced daily, month after month. The task of drying leads to a briefly embarrassing moment of confusion as your body pulls toward the empty paper towel dispenser. You realise the error, stepping away foolishly, veering back in the direction of the hand dryer, perhaps adding a little flourish in order to “style it out”. You totally meant to do that. Woe betide you if anyone else is also at the sink, as what then ensues is a shared knowing look, a shrug, raised eyebrows and maybe a mumbled exchange about how “you’ll never get used to this…”. Your companion, complicit in your discomfort, laughs nervously, glad they didn’t commit this social faux pas in front of an audience.
And so it goes on - rinse and repeat. Unlearning something is hard. It turns out you can’t just wash your hands of it.
What on earth has drying hands got to do with designing apps and games? The office hand drying routine became embedded in our behaviour patterns. Even though a more optimal user experience was introduced (the hand dryer), we experience behavioural friction in switching from an established schema to a new one. We had to unlearn a long-established behaviour. The awkwardness of realising you’re doing the wrong thing creates a bad user experience, which negates the joy that the hand dryer’s simplicity of use gives us.
We’ll get over it, we’ll learn the new system - we don’t have an alternative (apart from wet pockets) - but the friction is there. Once this new behaviour becomes the norm, we’ll forget the awkwardness and wonder what the issue was. But this experience gives me a good opportunity to discuss established schemas and learned behaviours.
What is an established schema?
An established schema is a learned behaviour, or system of behaviours, that has become ingrained to the point that changing it (even for the better) creates friction for the person experiencing the unexpected. It’s the brain’s equivalent of muscle memory - a combination of habit, experience and expectation. Replacing the schema with a new one requires substantial benefits that outweigh the initial friction a user encounters from the change.
Once you start looking for them, established schemas are everywhere in design:
The QWERTY keyboard. Technology has moved on, but despite many attempts to change it, the QWERTY keyboard reigns supreme as the method for inputting text on a variety of devices. Although sources differ on the original reason for the layout, it’s safe to assume it’s not optimal for devices where pecking with a finger or typing with two thumbs have become the methods of interacting. Enhancements such as predictive text and Swype-style keyboards help bend the QWERTY schema to fit the user’s requirements and improve the UX, but we’re still tied to a system developed in the 1860s.
The Save icon. Floppy disks are long gone, but the icon still serves as the de facto symbol for “save”.
Media playback icons. YouTube, Netflix, Spotify and other streaming media providers still use icons for audio and video developed for physical devices - VHS players, Walkmans and the like.
The top right close button. Windows has conditioned its users to look to the top right to close anything. This schema is slowly changing as more people use mobile devices instead of desktops or laptops. These new devices have introduced different control systems and schemas, which may ultimately create new expectations in users.
Why do established schemas matter?
- They are hard to break
- They are useful to build upon
- They can be unique to, and conflict between, different user groups
- And sometimes the reason that bad UX may actually be good UX
Let’s elaborate on that last one: how can bad UX be good UX? Take the QWERTY keyboard as an example. It’s the standard, and despite many naysayers, it’s really hard to move people away from a standard. In this particular case we could let people choose other keyboard designs, giving the choice back to the user to optimise the UX according to their own preferred behaviours. But we don’t always have a choice; most of us have grown up with QWERTY keyboards, and this illustrates the power of schemas. It’s not the best design, but it’s one users are familiar and comfortable with. You’d need a very good reason to change it.
Using established schemas to help improve the user experience
Users come to you preprogrammed with a wealth of learned behaviours and expectations. As a designer you must recognise them, and use them to your advantage.
At the most basic level, we deduce the icons and symbols our users will recognise and use these to help inform our UI design. Using symbols they know will help the user, even if the original meaning is no longer relevant (e.g. the floppy disk). Of course, this is just digital design best practice. We encourage designers to go wider, to consider a user’s real-world behaviours and schemas. For example, looking at how kids play with toys can inform how we design games and allows us to encourage those kids to mimic real-world behaviours digitally, opening up new UX opportunities that we may not normally explore.
As an example, we have been building construction games for preschool children. Using recognisable connectable building bricks in a game can encourage a child to experiment with multitouch at a younger age than we’d normally expect. In the real world they pick up two bricks and combine them. On the device, allowing them to drag bricks around encourages them to mimic real-world play, and we observe them quickly using multitouch to try to combine bricks. The real-world affordances, applied to the digital space, naturally lead to a deeper multitouch experience.
Once the user realises they can do this, they will try to combine other elements in the game. At this point we’ve used an established real-world behaviour to enable a desired user experience within the app. Conversely, if we don’t allow the child to use multitouch to combine bricks, we will immediately disappoint, as we fail to deliver on the expected outcome. A good UX will utilise schemas to reinforce behaviours, but it must also accommodate the expected outcomes.
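To make the brick-combining idea concrete, here’s a minimal sketch of the logic in TypeScript. All names here are hypothetical (this isn’t Dubit’s actual implementation, and it assumes a simple 2D coordinate model rather than any particular game engine): each active touch drags one brick, and two simultaneously dragged bricks snap together when they overlap - the same proximity cue a child uses in physical play.

```typescript
// Hypothetical sketch of multitouch "combine bricks" logic.
// Assumes a flat 2D coordinate model; not tied to any real engine.

interface Brick {
  id: string;
  x: number; // top-left position in screen units
  y: number;
  size: number; // bricks are square, for simplicity
  combinedWith?: string; // id of the brick it has snapped to, if any
}

// Each active touch drags exactly one brick: just update its position.
function dragBrick(brick: Brick, toX: number, toY: number): void {
  brick.x = toX;
  brick.y = toY;
}

// Two bricks overlap if their bounding boxes intersect.
function overlaps(a: Brick, b: Brick): boolean {
  return (
    a.x < b.x + b.size &&
    b.x < a.x + a.size &&
    a.y < b.y + b.size &&
    b.y < a.y + a.size
  );
}

// After every drag update, check whether any two free bricks should
// snap together - delivering the outcome the child expects from
// real-world play.
function tryCombine(bricks: Brick[]): void {
  for (const a of bricks) {
    for (const b of bricks) {
      if (a.id < b.id && !a.combinedWith && !b.combinedWith && overlaps(a, b)) {
        a.combinedWith = b.id;
        b.combinedWith = a.id;
      }
    }
  }
}
```

The point isn’t the geometry - it’s that each touch independently moves a brick, and combining is triggered by the same proximity cue as in physical play, so the digital behaviour delivers on the real-world expectation.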
Traps and pitfalls
We need to be careful not to fall into the trap of generalising a schema, or translating a schema from one medium to another, without considering why that schema exists in the first place. The danger is to fixate on similarities of outcome and intent and then assume the procedural systems map across mediums. Skeuomorphic design is an area where it’s easy to fall into these traps: designs where the user experience is impacted by a UI that intends to improve the UX by being recognisable but doesn’t leverage the advantages of the new medium.
E.g. turning a physical page in a real book becomes turning a digital page in an ebook, complete with animation, which although delightful when first experienced quickly becomes cumbersome in the long term. Often these evolve into a compromise, such as quickly swiping through pages.
Let’s take an example project:
Need: Sell groceries in an app.
Existing schema: People walk around supermarkets and browse visually for food in the real world.
Proposed solution based on established schema: A first-person shopping experience - the user controls an avatar that can walk around a 3D supermarket, looking for items on the shelves and selecting what they want to buy.
Let’s unpack this. We’re suggesting we build a first person shopping experience that is displayed on a device and controlled by user input. This schema doesn’t translate for a number of reasons.
Environment - Do the two environments or mediums translate, or are they different enough for the schema to become invalid? We are trying to replicate a fully immersive real-world 3D environment on a 2D plane: the device screen.
Controls - It’s relatively easy to look and move about in the real world, but replacing these with controls for a simulated experience immediately changes the schema beyond the familiar.
Friction points - The existing real-world schema isn’t perfect; we should distil the good from the bad. Given the opportunity, how would people optimise the schema? We shouldn’t replicate the friction points. For example, people don’t like pushing a trolley, queueing to buy things, or spending time on shopping that they could be using for something else.
Competing existing schemas - Does this problem already have a schema that is more apt for the medium/environment we’re basing our solution in? Can we find comparable schemas that solve a similar problem and would be better suited?
The two situations are so different in execution that the real-world schema isn’t useful to us. We can actually improve the shopping experience by leveraging the advantages of the platform, allowing users to quickly search, find and browse many more items than they would have time to in a real-world situation, without moving from the comfort of the armchair. We can make shopping easier, and still maintain the fun aspect of shopping (foraging). This is why Amazon isn’t a first-person 3D shopping experience.
Why change an established schema?
Changing an established schema is hard: it will lead to friction points in the user experience and therefore needs to be approached carefully. But there are situations where it may be beneficial to do so.
- Cross platform support - you may be expanding the platforms your application runs on but not have the luxury to create different interfaces for each. In this situation you may need to find a “least worst” compromise that breaks an established schema for one or more of your user bases. You’ll have to decide on whether it’s better to upset everyone a little bit or one particular user group a lot, which brings us to...
- Widening a user base - Your user base may be changing and growing. Analytics and research will give you the opportunity to improve your UX, but this may have a negative impact on your existing core users. There’s always some uproar when Facebook or Twitter updates its UI/UX; this is because any change to an established schema causes some friction, even if in the long term those changes are for the best.
- Taking advantage of new technology - New technology allows us to approach UX differently and can lead to substantial improvements (touch screens, for example). However, your users will also approach this new technology with existing expectations. The balance to achieve is to use those expectations to teach new behaviours that improve the overall UX.
Users all have expectations when approaching anything new. They look for clues and parallels with things they already know and use these as a means to solve problems and explore situations. We can utilise these schemas to help minimise friction and diagnose issues with our UX/UI but in order to do so we need to understand our users and the expectations they have. Changing these behaviours is hard, so designs must be evaluated with these expectations as a factor.
- Look for and use existing schemas if they are still useful and relevant and we know why we are using them
- When designing we can take inspiration from real world schemas and apply them appropriately
- We must be careful not to take schemas too literally - e.g. the supermarket example
- We shouldn't be afraid to challenge or change existing schemas if we need to
Can we help?
Dubit have 18 years of kids and family insight. We create user journeys that delight and inspire, developed through experience, iterative testing in our PlayLab, and informed aesthetic decisions. You can use this expertise across current or future products. Learn more