
Monitoring & Evaluation

Showing What You've Achieved

Monitoring and evaluation are not exercises that set out to praise a project, but honest appraisals of what has been learned and what could be improved in future iterations of the work. Effective monitoring allows you to talk in concrete detail about what you have done and achieved. Effective evaluation allows you to talk about what you have learned, broadening your project's impact and benefit.

For some inspiration, see the AHRC-funded FailSpace project, which explores ways the cultural sector can better acknowledge and learn from failure and offers some great insights into project design.

Being able to effectively analyse and share information about your activity and its impact is important for all creative projects. It is absolutely essential for arts-based mental health activity, where an evidence-based approach to clinical outcomes underpins any understanding of the value of the work, informing funding decisions and buy-in from health and other sector partners.

Rigorous approaches to monitoring and evaluation are the norm in the health sector, and when working in arts-in-health we need to meet these standards, using our creative skills to integrate the process and animate the results. A case study, for example, is essentially the art of telling a clear and moving true story about one participant's experience. Best practice, both for meeting the needs of your partners and for ensuring that monitoring and evaluation works meaningfully and unintrusively to collect the best data on your project, is to plan these processes in at an early stage of preparation and to look for ways to integrate them throughout the project plan.

What are we evaluating against?

Every project should have a clearly designed set of:

Outcomes - the change you are trying to make happen for each participant
Outputs - the deliverable ingredients of your activity

Your Outcomes and Outputs will define the project-specific evaluation you undertake.

Outputs are relatively straightforward to measure, giving a quantitative account of the activities you have undertaken. They allow you to evaluate the process of your project: whether it has gone to plan, and where plans have changed.

Measuring Outcomes requires more creative thinking: you need to consider how you can measure specifically against the changes you are hoping to see. This measures the impact of your project, which should relate directly to your overall aim for the work.
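As a simple illustration, Outputs can be tracked as counts against the targets in your project plan. The sketch below (in Python) uses invented output names and targets, so substitute your own deliverables; it simply compares what was delivered against what was planned.

```python
# A minimal sketch of tracking Outputs against plan.
# Output names and targets are invented examples, not from any real project.

planned_outputs = {
    "workshops delivered": 10,
    "participants engaged": 40,
    "artworks produced": 25,
}

actual_outputs = {
    "workshops delivered": 9,   # e.g. one session cancelled
    "participants engaged": 46,
    "artworks produced": 31,
}

for output, target in planned_outputs.items():
    actual = actual_outputs.get(output, 0)
    status = "on plan" if actual >= target else "below plan"
    print(f"{output}: {actual} of {target} ({status})")
```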

You may frame these measures as questions asked of the participants in an age-appropriate way: direct questions about what has been learned can work with older children, while younger children may respond more effectively to Smiley Face scales or discussions facilitated using toys or puppets.
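Where younger children respond using a Smiley Face scale, their answers can still be turned into simple scores for analysis alongside other data. The sketch below assumes a five-point scale, which is an illustrative choice rather than a requirement; adapt the mapping to whatever scale you actually use.

```python
# A minimal sketch of converting Smiley Face responses into numeric scores.
# The five-point mapping is an assumption for illustration only.

SMILEY_SCORES = {
    "very sad": 1,
    "sad": 2,
    "neutral": 3,
    "happy": 4,
    "very happy": 5,
}

responses = ["happy", "very happy", "neutral", "happy"]  # invented session data
scores = [SMILEY_SCORES[r] for r in responses]
print(f"Average score: {sum(scores) / len(scores):.1f} out of 5")
```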

You may also be able to record them through case studies that illustrate the changes through a personal (anonymised) account of one or more of your participants' experiences.

The Cultural Learning Evidence Champions Handbook produced by the RSA is a great place to learn much more about best practice in monitoring and evaluation, and can be downloaded for free here: https://www.thersa.org/reports/evidence-handbook


Evaluating against Mental Health Clinical Outcomes

In order to demonstrate the clinical outcomes of creative projects, you should ideally use clinically recognised, or licensed, measures.

All measures should be used at the beginning and end of a project, ideally with the same participants, though you may still be able to identify trends in larger groups even if some participants have changed. A clinically significant time period is around 12 weeks, and this is the recommended minimum between uses of the Outcomes Stars described below, though other measures can be used at shorter intervals if necessary.
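As an illustration of how beginning-and-end scores can be compared, the sketch below matches participants who completed the measure at both points and reports individual change, then gives a group average for when membership has shifted. Participant IDs and scores are invented.

```python
# A minimal sketch of comparing scores collected at the start and end of a project.
# All participant IDs and scores are invented for illustration.

pre = {"P01": 42, "P02": 38, "P03": 51, "P04": 45}
post = {"P01": 49, "P02": 44, "P03": 50, "P05": 47}  # P04 left, P05 joined

# Individual change for participants who completed both measures
for pid in sorted(set(pre) & set(post)):
    print(f"{pid}: {pre[pid]} -> {post[pid]} (change {post[pid] - pre[pid]:+d})")

# Group-level trend, useful where some participants have changed
mean_pre = sum(pre.values()) / len(pre)
mean_post = sum(post.values()) / len(post)
print(f"Group mean: {mean_pre:.1f} -> {mean_post:.1f}")
```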

For evaluating mental health outcomes, three licensed and clinically recognised measures are recommended:

The Warwick-Edinburgh Mental Wellbeing Scale (WEMWBS)

This was developed by researchers to ‘enable the measuring of mental wellbeing in the general population and the evaluation of projects, programmes and policies which aim to improve mental wellbeing. The 14-item scale WEMWBS has 5 response categories, summed to provide a single score.’ (Professor Sarah Stewart-Brown, Warwick University).
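Because the 14 item responses are simply summed to give the single score, the calculation itself is straightforward. The sketch below (with invented responses) shows the sum; always follow the official scoring guidance supplied when you register to use the scale.

```python
# A minimal sketch of producing a single WEMWBS score by summing the 14 item
# responses, each recorded on a 1-5 scale as described above.
# Defer to the official scoring guidance that comes with registration.

def wemwbs_total(item_responses):
    """Sum 14 item responses, each expected to be between 1 and 5."""
    if len(item_responses) != 14:
        raise ValueError("WEMWBS expects responses to all 14 items")
    if any(not 1 <= r <= 5 for r in item_responses):
        raise ValueError("Each response should be between 1 and 5")
    return sum(item_responses)

example = [4, 3, 5, 4, 3, 4, 4, 5, 3, 4, 4, 3, 4, 4]  # invented responses
print(wemwbs_total(example))  # prints 54; totals fall between 14 and 70
```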

WEMWBS has been shown to be ‘responsive to change’ (meaning: it can detect improvement or deterioration in many different situations) at both group and individual level.

A shorter, seven-question version is also fully validated and may be more suitable and flexible for work with younger participants.

You need to register (which is free for non-profit and statutory groups) in order to officially use the scale.

Personal Wellbeing Score

If you are measuring wellbeing rather than mental health specifically, this is a strong alternative, often referenced in social prescribing.

This questionnaire adapts the four measures collected in the Office for National Statistics Measuring National Wellbeing Programme, often described as the ONS4. These measures ask people to evaluate how satisfied they are with their life overall, whether they feel they have meaning and purpose, and about their emotions during a particular period. The PWS records participants' assessments of each of these aspects of their lives.
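As an illustration of what the underlying data looks like, the sketch below records the four ONS4-style items (life satisfaction, feeling the things you do are worthwhile, happiness yesterday, anxiety yesterday) on the 0 to 10 scale the ONS questions use, at the start and end of a project. Note that the anxiety item runs the other way (lower is better), so it is shown separately rather than combined; check the official PWS guidance for its scoring rules.

```python
# A minimal sketch of recording the four ONS4-style items behind the
# Personal Wellbeing Score. The 0-10 scale follows the ONS questions;
# defer to the official PWS guidance for scoring and interpretation.

ONS4_ITEMS = ["life_satisfaction", "worthwhile", "happiness", "anxiety"]

def record_pws(scores):
    """scores: dict mapping each ONS4 item to a 0-10 response."""
    for item in ONS4_ITEMS:
        if not 0 <= scores[item] <= 10:
            raise ValueError(f"{item}: responses are on a 0-10 scale")
    return scores

before = record_pws({"life_satisfaction": 5, "worthwhile": 6, "happiness": 4, "anxiety": 7})
after = record_pws({"life_satisfaction": 7, "worthwhile": 7, "happiness": 6, "anxiety": 4})

for item in ONS4_ITEMS:
    note = "(lower is better)" if item == "anxiety" else ""
    print(f"{item}: {before[item]} -> {after[item]} {note}")
```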

Please note: both questionnaire-based measures are relatively light-touch processes that can be incorporated into a session. They run the risk of being rushed through by children and young people, though, so it is worth allocating time and space to talk them through and give weight to the CYP's thinking and answers in order to get the best quality data.

Outcomes Stars

These are an excellent framework for deeper conversations with participants to measure outcomes in a range of contexts. There are over 30 different versions of the Star, each tailored to a specific sector, with several licensed to measure mental health and wellbeing outcomes.

Each version consists of a set of scales presented in a friendly and accessible Star shape, covering the key outcome areas that are relevant in that sector. Underpinning these scales is a five stage Journey of Change – an explicit model of the steps people go through when making sustainable change in their lives.

The Journey of Change means the Star does not purely measure the severity of a problem. The five stages and numbered scales (either 1 to 10 or 1 to 5) measure the relationship a person has with the different areas of their life – how motivated and supported they are in moving forward and in sustaining a better situation.
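As a simple illustration of what the readings themselves look like, the sketch below records a participant's Star at the start and end of a project and the movement along each scale. The area names and the 1 to 5 scale are placeholders only; use the scales, materials and guidance supplied with the licensed Star version you are working with.

```python
# A minimal sketch of recording Star readings at two points and looking at
# movement along each scale. Area names and the 1-5 scale are placeholders.

star_start = {"area A": 2, "area B": 3, "area C": 1, "area D": 2}
star_end = {"area A": 3, "area B": 3, "area C": 3, "area D": 4}

for area, start_reading in star_start.items():
    end_reading = star_end[area]
    print(f"{area}: {start_reading} -> {end_reading} "
          f"(moved {end_reading - start_reading:+d} along the Journey of Change)")
```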

Outcomes Stars are licensed, and this requires some up-front investment to build into your project planning. Details of T&Cs and more information on all of the Stars can be found at https://www.outcomesstar.org.uk/

The My Mind Star, used in many of the commissioned projects with young people supported by Arun Inspires, can be explored here: https://www.outcomesstar.org.uk/using-the-star/see-the-stars/my-mind-star/

NB: the deeper conversations necessary to complete the Outcomes Stars make them an excellent tool for facilitating good key working. As such, they are best used in the context of longer-term relationships with children and young people, such as in youth work or where your project is offered to an ongoing group, so that issues arising can be supported and signposted within the system and the Journey of Change conversations at their heart can support longer-term interventions with the participants. The data collected by Outcomes Stars for projects of 12 weeks or more is, however, very high quality because it is underpinned by these conversations. It is a resource-heavy evaluation method that requires dedicated time and space, probably bookending project sessions rather than being incorporated into them.

Museum Sector Approaches

The UCL Museum Wellbeing Measures Toolkit is a set of measurement scales, trialled across the UK, used to assess levels of wellbeing arising from participation in museum and gallery activities.
https://www.ucl.ac.uk/culture/projects/ucl-museum-wellbeing-measures 


Case Studies

All of the measures outlined above will generate data, but the data alone will only paint part of the picture. It is important to connect the information you find to practical experiences and personal stories. This is where Case Studies can complete the picture and support a more in-the-round understanding of what you have done and the change it has created.

Case Studies provide personal testimonials that can bring to life your data-driven account of a project's impact. They should be used carefully, with a focus on selecting case studies that illustrate the specific Outcomes you are evaluating against. Case studies of children and young people must also be anonymised so that specific individuals cannot be identified from the material you share, e.g. by not using their full name and not being specific about health issues or life experiences that could be identifying.
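If you hold participant details digitally, one simple way to keep shared case study material anonymised is to refer to participants only by a code or pseudonym and keep the key to real names separately and securely. The sketch below is a minimal illustration of that approach; the names and fields are invented.

```python
# A minimal sketch of pseudonymising case study material. The mapping of real
# names to codes should be stored securely and never shared with the case study.

import secrets

name_key = {}  # real name -> code; keep this separate from anything you share

def pseudonym(full_name):
    """Return a stable, non-identifying code for a participant."""
    if full_name not in name_key:
        name_key[full_name] = f"Participant {secrets.token_hex(2).upper()}"
    return name_key[full_name]

case_study = {
    "participant": pseudonym("Example Name"),   # invented name
    "age_group": "11-14",                       # broad band, not an exact age
    "summary": "Grew in confidence over the project and began sharing work with the group.",
}
print(case_study)
```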

An excellent summary of approaching Case Studies in arts evaluation, produced by Creative and Credible, a knowledge exchange project funded by the Economic and Social Research Council (ESRC) in 2015, can be downloaded here: http://creativeandcredible.co.uk/case-studies/

Remember to leave time and space in your project for thorough evaluation work, which should happen throughout. The best evaluations not only evidence and illustrate the impact of your project, supporting your partnerships and making the work more fundable in the future; they also support improvements to the model that allow you to run this and other projects even better next time. It is valuable material, and worth doing right.

The Making It Better toolkit has been produced by Artswork through the Arun Inspires programme, a three-year cultural development programme for children, young people and the organisations supporting them in the District of Arun. As a result, place-based data is focused on Arun; however, many of the resources collected here are national or can be adapted to other areas.

We hope you find the Toolkit useful. This is a living resource, so if you have suggestions about material that should be added or content that could be improved, we would like to hear your ideas. Please email us at macince@artswork.org.uk

Data used across this toolkit was collected in March 2021 and was correct at the time this site was created. Artswork have worked to ensure all resources gathered here are relevant and appropriate, but we cannot be held responsible for the content of external sites. If you have questions about any of this content, or encounter any broken links, please contact info@artswork.org.uk
