Understanding our audiences
It’s one thing to say your organisation is user-centric. It’s another to build the working practices that ingrain user centricity and data-driven decisions, and make it part of the culture. When you do, the payoff is in good digital performance and satisfied users.
When you read Shelter’s working principles, you see this belief reflected in several ways: being user-centric by default, following a test-and-learn approach, making evidence-based recommendations and decisions, building through user participation, even co-producing with end users.
‘User research is all about people. If you can understand people’s environments, the emotions they are feeling, the reasons they are here and what they are doing you will be much more equipped to help them reach their goals.’
Louise Roddan, Senior User Researcher, Shelter
The search for knowledge and insight about our users is seen through all phases of our digital lifecycle:
Creating products and services
The modern web user has little time, many distractions, and growing expectations for websites and content. They’re also complex and unique, with endless distinct needs.
The user researcher has therefore become one of the most pivotal roles in any digital organisation. User researchers work to identify user behaviours, needs and motivations through observation techniques, task analysis and other feedback methods and tools, and they keep users' lived experience at the forefront of their team colleagues' minds.
What user researchers discover becomes the core of how we design content, services and products. Research findings are crystallised into clear user stories, which become the reference point for content designers, UX designers, developers, testers and others.
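A user story distils a research finding into the familiar 'as a… I need… so that…' pattern. The example below is purely illustrative, not taken from Shelter's research:

```
As someone at risk of losing my home,
I need to understand my rights as a tenant,
so that I can decide what to do next.
```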
'When researching, focus on users who have problems using existing services or getting the right outcome for them. This will help you create a simpler, clearer, faster service that more people can use.'
The GOV.UK service manual
Being data driven means test and learn
We encourage low-risk, low-cost testing based on an incomplete data set or a hypothesis, with the potential to scale using new evidence as we go. We do this not because we’re a charity with smaller budgets (full-blown research can be expensive), but because it’s the best way to establish a continued, focused understanding of users.
A key stance within the user research process is to challenge all assumptions and guesses. Evidence is the only currency when designing for people.
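A low-cost test of a hypothesis can be as simple as comparing two versions of a page with a small sample and checking whether the difference is likely to be real. The sketch below is one generic way to do that (a two-proportion z-test); the figures and function names are hypothetical, not Shelter data or tooling:

```python
# Minimal sketch of a low-cost test-and-learn check: compare the conversion
# rate of an existing page (A) against a variant (B).
# All figures here are hypothetical, for illustration only.
from math import erf, sqrt


def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal approximation
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value


# Hypothetical small-scale trial: ~500 visitors per version.
z, p = two_proportion_z(conv_a=40, n_a=480, conv_b=62, n_b=520)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A small p-value would justify scaling the variant up and gathering more evidence; a large one means the hypothesis isn't yet supported, at very little cost.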
Tools of the trade
We use a range of methods and tools for learning:
| | User behaviour | User attitudes |
|---|---|---|
| Why? (Qualitative) | | |
| How often? (Quantitative) | | |
We also draw from reliable external sources: data and analysis reflecting trends in new technology and consumer behaviour. Read more about how we measure success.
Operating a live product or service
The keen focus on user needs doesn't stop at launch. A live product is a perfect way to build our understanding: testing and learning, optimising, iterating and improving based on observed use.
Qualitative tools like feedback pop-ups, surveys and usability tests can tell us what people think of the live product, and we can run live-product observational research to augment it.
Quantitatively, our digital analytics team manages the Digital Report Centre. For any Shelter product or service, from web information pages to chat to Shop pages and more, the report centre gives all Shelter staff data on demand: filterable reports covering a vast range of metrics.
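A filterable report can be pictured as rows of metrics that any staff member slices by product or measure. The sketch below is illustrative only; the field names and figures are hypothetical, not the Digital Report Centre's actual schema:

```python
# Illustrative only: a filterable metrics report sketched in plain Python.
# Field names and figures are hypothetical.
from dataclasses import dataclass


@dataclass
class MetricRow:
    product: str
    metric: str
    month: str
    value: float


rows = [
    MetricRow("advice-pages", "page_views", "2024-01", 120_000),
    MetricRow("advice-pages", "page_views", "2024-02", 134_500),
    MetricRow("shop", "conversion_rate", "2024-01", 0.031),
]


def filter_report(rows, product=None, metric=None):
    """Return rows matching the given product and/or metric filters."""
    return [
        r for r in rows
        if (product is None or r.product == product)
        and (metric is None or r.metric == metric)
    ]


for r in filter_report(rows, product="advice-pages"):
    print(r.month, r.value)
```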
All of this lets us establish a clear picture of a product’s or service’s performance against goals, and its user satisfaction. With that knowledge we can improve the product to further meet user needs, and use it to help our Digital Leadership Group (DLG) in guiding Shelter’s digital direction.
Future vision and business planning
The DLG uses product and service performance data to inform what we want to do next - whether to continue a product or service, develop it further, or decommission it and possibly replace it with something new.
We’ve been developing robust and transparent data-driven reporting to measure our progress against the organisation’s strategic aims. The reporting includes data reflecting business efficiency and customer satisfaction.
Related
See how we measure the success of our products and services
A powerful network of practitioners: Read about our Communities of Practice
Learn the language with our digital glossary
Contact us about the digital framework
Have a question or comment? Found a bug? Or maybe you’d like to contribute to the framework? Use our contact form to get in touch.