Communicating effectively about your organisation’s work and achievements: resources and advice

As a funder, we are at the receiving end of organisations’ efforts to communicate their work and achievements. We review many grant applications, sit through hours of presentations and carefully study reporting and learning documents.

From this experience, we thought it might be helpful to share some of the common mistakes that we see, along with some resources that can help organisations effectively share the amazing work they are doing.

We first look at issues around reporting on numbers, and then at narratives/stories as a data source and important communication tool. This is a long post, so feel free to scan through and skip to sections that are meaningful to you right now, and to come back to others when they become relevant.

Reporting on the numbers

When we are considering the effectiveness of social programmes, the numbers generally answer two main questions: (1) Did x happen, and (2) To what extent? Everything else is a variation on these questions. For example: Did it happen more for girls than for boys? Did it happen more here than there? How often did it happen? Is it still happening?

Numbers alone are not enough to understand the complicated social dynamics of improving people’s lives. They can be limited in terms of explaining why something happened and they do not describe how it happened or what it means to the people involved.

However, they are a crucial piece of the puzzle. Quantitative measures – even simple ones – are vital for understanding change over time, comparing different approaches to solving similar problems and gauging projects’ effectiveness.

Here are a few common mistakes we have seen organisations make when reporting on numbers:

Divorcing percentages from underlying data

Report: 90% of learners accessed tertiary after completing a project.

What is the problem? Does that mean 9 out of 10 participants or 270 out of 300?

Percentages are most appropriate when looking at numbers significantly larger than 100. They can sometimes help us make sense of complex numbers – for example, is 73 out of 89 a better “success rate” than 56 out of 70? (It is – just barely!). But they should never be provided without the underlying numbers.

This isn’t just about pedantic insistence on counting things. Most projects work with individuals, human beings with stories and goals. While percentages can help us make sense of complex numbers, those individual lives matter and should be treated that way in reporting.

Calculating the percentage increase/decline incorrectly

Report: The pass rate for a standardised test has increased by 26% over the year (from 26% to 52%).

What is the problem? An increase of 26 percentage points, from a baseline of 26%, is not a 26% increase. It’s a 100% increase – it’s twice as many points.

Percent change is calculated as:

(new number – old number) ÷ old number × 100

In this case, that means:

(52 – 26) ÷ 26 × 100 = 26 ÷ 26 × 100 = 1 × 100 = 100% increase.
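For organisations that calculate these figures in a spreadsheet or script, the formula above can be sketched as a small Python function (the pass-rate figures are the ones from the example in the text):

```python
def percent_change(old: float, new: float) -> float:
    """Percent change from old to new: (new - old) / old * 100."""
    if old == 0:
        raise ValueError("percent change is undefined for a zero baseline")
    return (new - old) / old * 100

# The pass-rate example: 26% -> 52% is a 100% increase, not a 26% one.
print(percent_change(26, 52))  # 100.0
```

Note that the function returns a negative number for a decline, which is often exactly the signal a report needs to flag.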

Averaging percentages

Report: A (fictional) project that established three school libraries wants to examine what percent of learners are using the libraries. The raw data is below.

|          | # of learners | # of learners who checked out ≥1 book | % of learners who checked out books |
|----------|---------------|----------------------------------------|-------------------------------------|
| School A | 100           | 50                                     | 50%                                 |
| School B | 600           | 210                                    | 35%                                 |
| School C | 60            | 39                                     | 65%                                 |

Their conclusion: 50% of learners are actively using the libraries because 50% of learners check out books (calculated as follows):

50% + 35% + 65% = 150%.

150% divided by 3 schools = 50%

What is the problem? Calculating the average of the three percentages to come up with the percentage for all three schools skews the results, because the schools are different sizes.

If we add a TOTAL row to our table, and use the actual numbers from all three schools, we’ll get the real answer:

|          | # of learners | # of learners who checked out ≥1 book | % of learners who checked out books |
|----------|---------------|----------------------------------------|-------------------------------------|
| School A | 100           | 50                                     | 50%                                 |
| School B | 600           | 210                                    | 35%                                 |
| School C | 60            | 39                                     | 65%                                 |
| TOTAL    | 760           | 299                                    | 39%                                 |

299 ÷ 760 = 0.39 = 39%
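The difference between the two calculations can be seen in a few lines of Python, using the hypothetical school-library figures above:

```python
# Hypothetical school-library data: site -> (learners, learners who borrowed).
schools = {
    "School A": (100, 50),
    "School B": (600, 210),
    "School C": (60, 39),
}

# Wrong: average the three percentages, ignoring the schools' sizes.
naive = sum(borrowed / learners * 100
            for learners, borrowed in schools.values()) / len(schools)

# Right: pool the raw counts across all schools, then take the percentage.
total_learners = sum(learners for learners, _ in schools.values())
total_borrowed = sum(borrowed for _, borrowed in schools.values())
pooled = total_borrowed / total_learners * 100

print(round(naive))   # 50
print(round(pooled))  # 39
```

The naive average over-weights the small schools; because School B is six times the size of School A, its 35% should count for far more in the overall figure.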

Numbers acquire meaning through comparison

Report: In the past quarter our resource centre was visited by 300 young people.

What is the problem? What does it mean that 300 young people visited the centre? Did it increase or decline? Are 300 visitors high or low when compared to the number of youth who are visiting other similar centres?

It is good to have a baseline (and to report on that), but since the purpose of social programming is to bring about change, numbers acquire meaning through comparison either over time or against a baseline or a control group (or even against other participant groups in the programme).

We recently created a web resource to assist organisations to plan for monitoring and evaluation in a step-by-step way. Step 5 deals with issues around indicators and measurement, and you might find our video presentation describing the basic concepts around indicators helpful. (You can also access the text for this presentation here – click on the text icon for Indicators: Basic Concepts).

Burying the numbers

We often receive reports where key numbers are scattered throughout a long narrative. It would, however, be helpful if these numbers were summarised in a way that gives the reader the big picture quickly.

For example, a report on a youth programme that operates in three areas may include several paragraphs about each location. For each site, it shares how many young people participated, how many accessed tertiary education and how many secured jobs – but it does not synthesize or analyse these numbers in a way that can reveal broader trends.

We regularly need to report to our board on the number of people our partners are reaching, and we also use the reports of our partners in our advocacy efforts. When information is scattered, it is harder to draw useful lessons from the work and introduces room for error. It also suggests that the organisation may not be engaging meaningfully with its data or using it to strengthen programme strategy and implementation.

Including a table or spreadsheet with a high-level summary of quantitative indicators helps the reader understand your work and allows them to move on to more interesting and complex questions. For example, the table below quickly shows us what a hypothetical youth programme has done in 2013 and 2014. Pulling the data together helps us ask important questions about implementation, such as:

  • Why did the North West site shrink to less than half its previous size in 2014?
  • Why are Limpopo site participants consistently less successful at accessing tertiary?
  • Why did far more Mpumalanga site participants access tertiary in 2014 than in 2013 – what changed?

| Site       | Total participants (2013) | Accessed tertiary (2013) | Accessed work (2013) | Total participants (2014) | Accessed tertiary (2014) | Accessed work (2014) |
|------------|---------------------------|---------------------------|----------------------|---------------------------|---------------------------|----------------------|
| North West | 45                        | 30                        | 6                    | 20                        | 12                        | 8                    |
| Limpopo    | 32                        | 8                         | 18                   | 40                        | 12                        | 25                   |
| Mpumalanga | 70                        | 40                        | 15                   | 75                        | 63                        | 10                   |
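One quick way to surface the trends behind questions like these is to turn raw counts into per-site rates. A minimal Python sketch, using the hypothetical figures from the table above:

```python
# Hypothetical data: site -> {year: (total participants, accessed tertiary,
# accessed work)}, taken from the summary table.
data = {
    "North West": {2013: (45, 30, 6), 2014: (20, 12, 8)},
    "Limpopo":    {2013: (32, 8, 18), 2014: (40, 12, 25)},
    "Mpumalanga": {2013: (70, 40, 15), 2014: (75, 63, 10)},
}

# Percentage of participants per site who accessed tertiary, by year.
rates = {
    site: {year: round(tertiary / total * 100)
           for year, (total, tertiary, _) in years.items()}
    for site, years in data.items()
}

for site, by_year in rates.items():
    print(site, by_year)
```

Printed side by side, the rates make the implementation questions jump out: Limpopo lags both years (25% and 30%), while Mpumalanga leaps from 57% to 84%.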

It’s also useful to ask yourself: What is your elevator pitch? If you had to summarise your work in a one-page document, what would you share? If you had to convince a busy executive to fund your work, how quickly and clearly could you paint a picture in her mind?

Correlation does not equal causation

Correlation shows us that there is a relationship between things that happen or change together. However, when things change together, it does not necessarily mean that one is causing the other. For example, when sales of sunglasses increase, so do sales of ice cream – but warm weather causes both.

This doesn’t mean correlation isn’t worth reporting – and in many instances, this is all that projects will be able to show. However, organisations should be cautious: proclaiming that any positive change was unequivocally due to one intervention ignores the myriad factors that impact project performance, learning outcomes and individual people’s lives.

Our monitoring and evaluation web resources deal with issues around indicators and measurement – our video presentation provides more information about determining causation/attribution. (You can also access text on this matter here – click on the text icon for Important Concepts to Understand About Measurement).

The power of a story

Visual and narrative storytelling are enormously powerful ways to give beneficiaries a voice, and helpful for illustrating an organisation’s work in its context. Not all stories are created equal, however, and they also have their limitations.

Two things stand out to us:

One or two stories do not automatically prove that your organisation is having an impact

We are fairly often presented with one or two ‘success’ stories, which are meant to convince us that an organisation is effective (which would mean it is achieving the impact it is aiming for). However, impact is proven through evaluation, which requires narrative (qualitative) and quantitative data to be systematically collected to form a body of evidence.

Typically, you would collect qualitative data until there is a saturation point in terms of the themes emerging (you might be looking for specific themes). The way in which you collect qualitative data also needs to comply with criteria for sound research methodology.

On our monitoring and evaluation website, we provide an example of how qualitative data is used in evaluations and methods used to collect this data. (Have a look here – click on the text icons for Using Qualitative Information in Evaluation and Information and Data Collection Methodologies.)

We also published a learning brief describing the Impact Story Tool, a helpful instrument and methodology that can be used to work rigorously with stories.

The quality of the story

If one or two success stories are not enough to “prove” impact, it is clear that the main purpose for NGOs telling these singular stories is to illustrate context, capture the imagination and give a voice to the people they work with. This means we can take NGO storytelling up a notch from the formulaic ‘sad story – enter organisation x – life is so much better now’ narrative that we hear so often, and that is typically meant to convince funders of programmatic effectiveness.

Good stories are true to the complexity of the situations they are trying to describe. They don’t unnecessarily centre on the extremes of the emotional spectrum (very sad or unnaturally happy). Instead, they reflect the range of emotions that colour the human experience. Combined with good storytelling techniques, they create compelling and honest communication.

Have a look at this short movie made by international NGO Room to Read. It tells the story of Suma, an Indian girl participating in Room to Read’s Girls’ Education programme. After spending six years of her childhood as an indentured servant, Suma was rescued and enrolled in school for the very first time.

There is obviously a big budget behind this movie, but what makes it powerful is that it never veers from giving Suma a voice. This movie is not about Room to Read and their work. It is about Suma and her life experience – in which Room to Read has offered her an opportunity to gain new and important experiences. The lasting impression is not one of Room to Read as a wonderful and effective programme, but a deep sense of the potential of people and their worthiness of being offered the right opportunities.

(Some more inspiring storytelling: Dr Hawa Abdi tells the story of how she came to provide safety and healthcare to 90,000+ internally displaced persons at her camp in Somalia. Click here to watch.)

International NGO Witness trains and supports people around the world to use video – mostly using cell phone technology – to expose human rights abuses and fight to secure these rights. Their resource page offers online training and a variety of tools to help people effectively use video to tell stories. Of course, a well-written story can also be very effective although good visuals (pictures or video) help people visualise circumstances that they are unfamiliar with.

American NGO, StoryCorps, is creating a digital archive of people’s stories and has developed a great (!) app that helps ordinary people interview others, record and upload their stories to the online archive.  This tool might be useful to many NGOs looking to collect stories while simultaneously contributing to the global archive of stories – a novel idea on its own.

Lastly, in terms of platforms to share your stories, we think the following is quite helpful:

Presentations: We have written about presentations before – read our article here. It summarises fatal errors people make, as well as things that make presentations excellent. It also gives access to a resource by Andy Goodman: Why bad presentations happen to good causes.

Social media: Social media helps organisations reach and engage with a much wider audience. The Nonprofit Network website is a resource centre that assists South African NGOs with social media, websites and e-newsletters. It features free tutorials on how to optimally use websites, Facebook, Twitter, YouTube, LinkedIn and other platforms to enhance good work, and also features inspiring case studies of local organisations using social media and the web effectively.

If you know of other resources that would be helpful for NGOs to communicate their work and achievements, or have inspiring examples to share, please let us (and them) know in the comment space below.
