Holding up the mirror – evaluating ourselves as funders


Funders, both government and philanthropic, often have a strong focus on measuring the outcomes, impact and effectiveness of the initiatives they fund. Which is great – but is that focus equally directed at measuring the outcomes, impact and effectiveness of ourselves as funders?

Often the answer is no – and there are some good reasons for this. You can argue that the impact a funder makes is the sum total of the impact of every initiative that funder supports. However, it’s not easy to definitively understand the impact of even a single initiative, let alone the combined impact of, say, 30 funded initiatives. And even if measuring this is possible, how does it compare to the potential impact had different funding choices been made? It’s tricky stuff.

At Todd Foundation we have made several attempts to understand our impact as a funder. For a while we were excited by Results Based Accountability (RBA), but it fell by the wayside – partly for the reasons above, and partly because of the difficulty of getting definitive baseline data about any of the issues we hoped to contribute to.

We are now taking a more pragmatic approach and trying instead to measure our effectiveness as a funder, a subtly different thing. After much discussion within our team and with other funders both in New Zealand and the US, we’ve boiled this down to five key questions:

  1. How well do we understand the communities we serve?
  2. How well do we meet the needs of our stakeholders (eg applicants, grantees, donors, partner organisations)?
  3. How well are we run?
  4. How well do we serve our communities beyond giving money?
  5. How well do we understand the impact of our grants?

This fits quite well into the RBA framework – essentially it is their second key question, “how well did we do it?”. It’s far from perfect, but it is a useful start and is completely achievable from evidence we can easily gather in the normal course of our work. More importantly, it provides clear indicators of where we need to improve.

Still, we’d be keen to do this better. Any ideas?


3 Comments

  1. It would be easy to answer these questions with just a “very well” kind of answer and depending on who you are asking that might be all you get. I’d be more inclined to ask
    1. What do we do that shows we understand the communities we serve?
    2. What do we do that shows we meet the needs of our stakeholders (eg applicants, grantees, donors, partner organisations)?
    3. How do we demonstrate that we are well run?
    4. What do we do that shows we serve our communities beyond giving money?
    5. What do we do that shows we understand the impact of our grants?
    6. What can we do better/more effectively/more of… and what do we need to achieve that?
    That said I think the Todd Foundation do “do things better” and understand the communities you serve very well!

  2. Thanks Niamh – good point!

    Actually we do sort of do this – the questions above are the top level of a framework that includes actions, indicators and results. For example, for “How well do we understand the communities we serve?” we track how much research we read, how many conferences, community seminars and grantee events we attend, how culturally competent we are and various other indicators of learning – and, more importantly, how these influence the way we work.

    But I like your wording much better – good input for the next version and much appreciated.

  3. Hi Kate. I think Niamh’s reframe of the questions is helpful in a way s/he may or may not have intended. “What?” focuses on outcomes rather than processes.

    As a bonus, outcomes lend themselves to being described in stories as well as statistics. Personally, I would urge you always to have as many stories as you possibly can to explain your impact on the world.

    All said, “Well done” to the Todd Foundation for exemplary leadership.
