Interested in a simple guide to using the Unmatched Count Technique? Or the Randomised Response Technique?
These and many other tools and techniques are explained here with examples.
Guiding principles for evaluating the impacts of conservation interventions on human well-being
Emily Woodhouse, Emiel de Lange, E.J. Milner-Gulland With: Emilie Beauchamp, Heather Gurd, Katherine Homewood, Charlotte Mathiassen, Ben Palmer Fry, Dilys Roe, Helen Schneider, Rebecca Singleton.
Conservationists are increasingly seeing the importance of carrying out social impact evaluation to ensure accountability and to learn what works for both biodiversity and human wellbeing.
A single toolkit or blueprint method cannot fit the diversity of intervention types and evaluation questions, and conservationists are faced with an array of decisions about the most appropriate methods and research designs to use.
This guidance aims to demystify the process of social impact evaluation and support practitioners in navigating through these methodological decisions, taking into account: the questions the evaluation aims to answer; the characteristics of the intervention; and the organisational capacity and resources available.
It takes practitioners through the key steps in an evaluation:
1) thinking through the aims of the evaluation;
2) defining relevant wellbeing outcomes and indicators;
3) designing the evaluation to link outcomes to the intervention;
4) collecting data, including applying methods to account for bias, social dynamics and ethical considerations.
The guidance provides a range of real-life case studies and ideas for appropriate methods and tools.
Amy Hinsley and Ana Nuno
Tuesday, 18th December 2018
A new conservation toolbox
It is widely accepted that many conservation challenges are directly related to human behaviour. Whether it is the over-collection of a rare orchid by harvesters in Southeast Asia, or the decisions by collectors in Europe to buy and smuggle these orchids home, understanding the extent and nature of these behaviours is essential to addressing the threats they might cause. This has led conservation researchers and practitioners to start looking outside of their discipline, to find methods and approaches from across the social sciences to improve our understanding of these complex issues.
Whilst this interdisciplinarity is a positive move for conservation, it is important that we treat these ‘new’ methods carefully and understand their limitations. If we don’t, there is a risk that a toolbox full of exciting methods that sound great on a funding application may in fact not improve what we do, or in extreme cases may even make it worse.
With this in mind, a group of conservation social scientists, led by researchers at the Universities of Oxford and Exeter, decided to look in depth at one of these ‘new’ methods, to provide recommendations on when and how it should be used, and when it shouldn’t. The paper, freely available in the journal Methods in Ecology and Evolution, looks at the Unmatched Count Technique (UCT, also called the list experiment), which is increasingly being used in conservation to ask questions about ‘sensitive’ topics.
A research assistant carrying out a UCT survey about the use of Traditional Medicine products containing bear bile in China. Photo: Chen Haochun.
What is the Unmatched Count Technique?
The team reviewed all peer-reviewed studies that had used UCT and, combining this with insights from our own experiences of using it, developed a set of guidelines. We found that, since UCT was first developed in 1979, it has been used in more than 50 countries and across several disciplines.
The method asks questions in an indirect way that allows the respondent to remain protected and anonymous, meaning that it should produce more truthful answers. So far it has mainly been used to investigate topics that people might be tempted to hide their association with, including illegal behaviours (e.g. stealing), but also topics that somebody might be embarrassed to admit openly to a researcher, such as socially undesirable views (e.g. racist views) or very personal matters (e.g. being HIV positive). It can also be used to find out how many people really support or participate in socially desirable behaviours that might be exaggerated to impress people, such as recycling or turning out to vote.
Impressed by the potential of the method, conservationists started using the UCT in 2013, and it has been growing in popularity ever since, with five peer-reviewed conservation studies using it in 2017 alone.
How does it work?
One of the biggest draws of the UCT is that it looks so easy – UCT questions consist of a short list of items, and respondents are asked to report how many are true for them. These lists can also include drawings to make them more appealing and easier to understand, especially where literacy levels are low.
A random 50% of respondents are shown a list of only non-sensitive items (shaded blue in this example about international illegal orchid trade), whilst the other half see a list with an additional sensitive item (shaded yellow in this example).
The idea is that once everybody has given their answer, the difference in the mean answer between the control and the treatment group will reveal the proportion of people who said yes to the sensitive statement, in this case, what percentage of the people surveyed are orchid smugglers.
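As a minimal sketch, the difference-in-means logic can be simulated in a few lines of Python. The item probabilities, true prevalence, and sample size below are invented purely for illustration and are not figures from the paper:

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical setup: three non-sensitive control items and one sensitive
# item, with an assumed true prevalence of 10% for the sensitive behaviour.
ITEM_PROBS = (0.6, 0.3, 0.5)   # chance each non-sensitive item is true
TRUE_PREVALENCE = 0.10
N_PER_GROUP = 500              # UCTs typically need large samples

def count_true_items():
    """Number of non-sensitive items (0-3) true for one respondent."""
    return sum(random.random() < p for p in ITEM_PROBS)

# Control group answers with the non-sensitive list only.
control = [count_true_items() for _ in range(N_PER_GROUP)]

# Treatment group sees the same list plus the sensitive item.
treatment = [count_true_items() + (random.random() < TRUE_PREVALENCE)
             for _ in range(N_PER_GROUP)]

# The estimate is simply the difference between the two group means.
estimate = sum(treatment) / N_PER_GROUP - sum(control) / N_PER_GROUP
print(f"Estimated prevalence of sensitive behaviour: {estimate:.3f}")
```

Because each respondent only ever reports a count, no individual answer reveals whether the sensitive item is true for them; the cost is statistical noise, so the estimate will bounce around the true value unless both groups are large.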
How can you do it well?
While it may seem an easy exercise to draw up some lists and ask a few people to answer the question, it is really not that simple, and there are several important steps in designing a good UCT:
- Choose your ‘sensitive’ item carefully
- Spend time carefully designing your ‘non-sensitive’ control lists
- Pilot and test your control lists, your sensitive item, and your whole UCT, ideally on people who represent the sample you will be working with
- Decide what else to ask in your survey (this will depend on your research question)
- Carefully plan how you will implement your UCT – will it be face-to-face or online? Will you use pictures or written lists?
Skipping these steps can leave a UCT far less effective than it could be. This matters especially because UCTs need large sample sizes, so a UCT that is unnecessary, or not done well, can waste scarce conservation funds and resources.
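To illustrate why sample sizes need to be large (again with invented item probabilities and prevalence, not values from the paper): the estimate is a difference between two group means, so its standard error combines the noise from both lists and shrinks only with the square root of the group size.

```python
import math

# Hypothetical values for illustration only.
control_item_probs = (0.6, 0.3, 0.5)  # non-sensitive items
prevalence = 0.10                     # assumed sensitive behaviour rate

# Variance of one respondent's count, treating items as independent.
var_control = sum(p * (1 - p) for p in control_item_probs)
var_treatment = var_control + prevalence * (1 - prevalence)

# Standard error of the difference between the two group means.
for n_per_group in (100, 500, 2000):
    se = math.sqrt(var_control / n_per_group + var_treatment / n_per_group)
    print(f"n = {n_per_group:>4} per group -> standard error ~ {se:.3f}")
```

Under these assumptions, even with 500 respondents per group a 95% confidence interval spans roughly ±0.11 – wide relative to a 10% prevalence – which is why a badly designed or unnecessary UCT can consume resources without yielding a usable answer.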
Should UCT be used for every project?
A UCT takes work to get right, and it is not suitable for all problems. Several factors must be considered before deciding to use the method. These include the type of question you are asking, how you plan to ask it, and how many people you want to ask (which indirectly depends on how much time and money you have for the project).
Social science methods from outside conservation are useful, and we should keep trying to widen the range of techniques available to improve how we do conservation. However, we also have to accept that, alongside the benefits of a new method, comes a responsibility to investigate its potential limitations and to put in the work required to use it well.