How many research techniques to use for a project?


How do you determine the number of research techniques needed for a project? For example, will a competitive analysis be enough, or should you include more, such as personas?


Great question!

In general, just doing a competitive analysis won’t be enough. And personas will be helpful for understanding your research (the ‘analysis’ phase) but won’t actually dig out any useful findings of their own.

A couple of aspects will help you determine which research is required, and how much of it is needed.

  • [B]What applicable research data is available already?[/B] Does the team or business have other data it has collected in the past? Has someone else done that same research already?
  • [B]Is the data still accurate and useful?[/B] How do you know? You want to bust any incorrect assumptions early.
  • After looking at all the available inputs and considering the requirements of the project, [B]where are the gaps in your knowledge?[/B] What important things do you know you don’t know?
  • [B]What are the risky areas for the project?[/B] What are the things you must really do your homework on? Is the organisation banking on getting something particular right?
  • Knowing the range of research techniques available, [B]which of these is most easily going to give you useful information?[/B] Which will address gaps in knowledge and give a solid understanding of any risks? (More on this below, if you don’t know the range of techniques/methods).
  • Does the research have a high reliance on discovering unfamiliar behaviours? Or is it more important that the research checks previous or existing assumptions?
  • [B]How much experience do the researchers have?[/B] In this particular field of research, on this topic? There will be more and larger gaps for less experienced research teams.
  • [B]Do at least some face-to-face research with users anyway.[/B] I’ve never seen a project where research failed to help either for personal context or project discovery.

You’ll need to get familiar with the range of techniques:

A good rule of thumb is to do at least three research activities. This allows you to triangulate your data and start seeing important patterns and reinforcement coming in across the different methods. I would make at least one of them ‘quantitative’ (looking explicitly at statistical and numerical inputs that show HOW MUCH things happen) and one ‘qualitative’ (exploring the underlying reasons/opinions/motivations for WHY things happen). I would also conduct the qualitative one face-to-face with the users.

If you don’t have a lot of experience yet, and no-one else is dictating the way you do things (and perhaps even if they are), I would recommend that these three research activities are:

  • Contextual enquiry or site visit - go to the place where the users of the product/service are and watch them do their work. Ask them questions and follow your nose.
  • Site analytics (for existing sites) or industry research (new projects) - dig out the numbers that are important to the project (I mean looking at things like ‘search term frequency’, ‘exit pages’, etc, not the usual pageviews and other vanity metrics).
  • Competitive/opportunity analysis - set a benchmark so you know what standards to exceed, and where the best opportunities are.
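If it helps make the analytics hunt concrete, here’s a minimal sketch of tallying on-site search-term frequency. The log entries are made up for illustration and don’t reflect any particular analytics tool’s export format:

```python
from collections import Counter

# Hypothetical export of on-site search queries (made-up data,
# standing in for your analytics tool's "search terms" report).
search_log = [
    "opening hours", "refund policy", "opening hours",
    "contact", "refund policy", "opening hours",
]

# Tally how often each term is searched - a quick quantitative input
# that shows HOW MUCH users look for things your navigation may be hiding.
term_frequency = Counter(search_log)

for term, count in term_frequency.most_common(3):
    print(f"{term}: {count}")
```

The same counting idea applies to exit pages or any other event you can export as a list.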

I would start with about 6 tasks for each: 6 contextual enquiry participants, 6 things you’re hunting for in the analytics, 6 competitor reviews.

As you do the research, you’ll come to a point where additional tasks with the same research focus aren’t really turning up new information (a bit like the Pareto Principle 80:20 rule: going further and harder, you may only get an extra 20% of findings for an extra 80% of effort). It’s good to stay light and nimble, and keep your energy and focus on the things that are most useful at that point of the project. That’s a good time to re-assess where your gaps are, or to progress your thinking/design work before heading into additional rounds of testing or research.

Is that helpful? Let me know what other questions you have. =)


Further to what Luke is saying, you may also be constrained by budget, in which case it’s important to look at the key aims of the project to prioritise activities.


Ah yes, budget. How could I forget that! Thanks Andrew, it’s a good point.


I have a question regarding competitor analysis. I should be using the website/app and looking at what they do right and what can be improved, correct? And then writing it down in a research presentation of some sort, talking about what I saw in each website/app?

My other question is about user interviews. Should I look for specific questions to ask, or does it vary depending on the project? Also, how do I conduct one? Do I ask them the questions and just write down what they say? How would a client know if it was a real interview? Just curious.


A competitor analysis is basically a comparison between your existing (or proposed) app/design and a couple of its biggest competitors, often tallied up in a spreadsheet. It’s a bit like a feature-based SWOT analysis of other people’s sites, and it gives you a good benchmark of the strengths/weaknesses of competing products/services, and a list of things you’ll need to address if you want to be competitive with them.

I would use it to get a ballpark idea of desirable features to consider, which may indicate the expectations of the market. I can then test these assumptions, or look for ways to improve on them.

I run it using a spreadsheet, but when I need to highlight findings with the rest of the team I might shine a projector onto a whiteboard, so I can diagram some things onto screenshots and show the comparisons visually. A research presentation would be similar, but I would try and keep it conversational. Too much of a brain dump and people struggle to stay engaged.
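To show the shape of that spreadsheet tally, here’s a minimal sketch of a feature-comparison matrix. The product names and features are invented for illustration, not a recommended checklist:

```python
# Hypothetical feature matrix for a competitor analysis.
features = ["Search", "Guest checkout", "Saved favourites", "Live chat"]

matrix = {
    "Our site":     {"Search": True,  "Guest checkout": False, "Saved favourites": True,  "Live chat": False},
    "Competitor A": {"Search": True,  "Guest checkout": True,  "Saved favourites": False, "Live chat": True},
    "Competitor B": {"Search": True,  "Guest checkout": True,  "Saved favourites": True,  "Live chat": False},
}

# Tally each product's feature coverage, like a totals row in the spreadsheet.
totals = {name: sum(feats.values()) for name, feats in matrix.items()}

# Gaps: features at least one competitor offers that our site doesn't -
# candidates to address (or to test whether users actually want them).
gaps = [f for f in features
        if not matrix["Our site"][f]
        and any(matrix[c][f] for c in matrix if c != "Our site")]

print(totals)
print(gaps)
```

In practice a plain spreadsheet works fine; the point is the structure: rows of features, columns of competitors, and a list of gaps that becomes your benchmark.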

The Usability BOK has a good how-to and list of resources to help:

There is no magical list of questions that are applicable to every project—they will need to be relevant to the research topic. Yes, you should probably prepare questions in advance so the session isn’t a waste of time. You may also want to ask unplanned questions during the session if you come across something interesting. Typical activities/questions will be related to who/what/when/where/how/why the person uses the product/service and anything else relevant to the research, such as:

  • Ask some warm up questions to develop rapport and put them at ease - ask about their job, what they like, etc. Get to know them as a person.
  • Ask them to show you how they do things.
  • Try and understand what they do before/after using the product/service, not just during it.
  • Ask them why they did things that way, to try and understand their awareness of certain items, and any workarounds they’ve managed to find themselves.
  • Simply watch how they perform a task (without influencing or correcting them). Use something like Silverback to record the screen.
  • Include questions that will give insights to behaviour and motivations affecting design problems, but don’t directly ask them how they would design something (that’s more suited to a collaborative sketching session, which has some important differences).

Gerry Gaffney wrote a great rundown for us on how to run a site visit: and @Cam_Rogers has some brilliant tips on running the interview itself:


Great information, thanks!


You’re welcome! Let us know how you go.


Research takes a lot of time. You can follow this path for your research:

  • decide on your research methods
  • measure
  • explain
  • predict
  • generalise
  • test a hypothesis
  • explore meaning
  • interpret
  • understand behaviour