Consultant Matt Healey provides a recap on the first delivery of the 1-day Introduction to Social Network Analysis. If you want to be notified of future offerings sign up to our mailing list.
Yesterday, on the 21st of February 2018, Dan Healy and I delivered a 1-day workshop on Social Network Analysis (SNA). Despite its name, SNA has little to do with Facebook or Twitter (though it can); rather, SNA is about the collection, visualisation and analysis of network data.
Now that the 2017 AES conference has wrapped up I thought I’d capture some of my reflections in this short post. In part, I’ve been thinking about what some of the themes from the conference mean for us (evaluators) going forward. In particular, some of these themes highlight to me a shift in the mindset of evaluators attending the conference. I’ve tried to synthesise these and I welcome comments or reflections from you (the reader).
Earlier today I read through Jo Farmer’s reflections from the conference, and a key point that stuck with me was that of empowerment through evaluation. This resonated strongly with me in how I’ve begun to see the use of language and, importantly, what that language promises to those around us when we’re undertaking evaluations.
This promise was something that I tried to pick up on through my own talk on approaches to design and how evaluation can draw on and reflect back on these. I think that as ‘design’ becomes more prevalent through government programs and policies evaluators will be required to understand these approaches and, importantly, be responsible for calling out when these terms are applied inappropriately – a conversation is not the same as co-design!
On the point of design, there is clearly an appetite among us (evaluators) for learning more about design. But more than just design, we’re hungry to learn about more – more about technological opportunities, more about techniques, more about how we can do better. Maybe the best summary is that there’s an appetite to spend more time learning from and with each other, rather than just ‘hearing’ from each other (i.e. the standard presentation approach).
This provides a powerful motivator for those of us involved in organising the 2018 AES conference – we need to respond to this appetite and provide spaces for people to learn from each other. A space for new techniques, mindsets and opportunities, where we can discuss and collaborate on things that are of mutual interest.
This was particularly prevalent in (but not exclusive to) conversations that I had with ‘newer’ evaluators or attendees (newer in terms of time in the industry, rather than age). This group is an important segment of the AES and one that I feel is sometimes forgotten in the grand scheme of things. As a colleague of mine said, sometimes there’s this feeling of being an ‘imposter’ when you're the 'new kid on the block', and I think the AES has an opportunity to better support and empower this group of its members.
This dovetails with my final reflection rather well – what I would call an increasing interest in vulnerability.
On the last day of the conference (actually, as I was literally heading out the door to the airport), I had a conversation with someone who commented on the effectiveness of talks that possessed a degree of vulnerability. I had to agree – for those talks that I saw, it made the presenter more engaging, and the content tended to be more relatable.
In comparison to the 2016 (Perth) and 2015 (Melbourne) conferences, I feel that many of the talks this year possessed this trait. To my mind, this partially reflects a shift in evaluators’ desire for this type of content – examples where things have been hard, where we’ve struggled, made mistakes or just haven’t been the ‘perfect’ evaluator. Perhaps this is an expansion of the ‘appetite’ theme for a different way of going about things.
I know in my early days I would have loved the chance to talk with others about the learnings they had from tough situations (or just mistakes that were made). I feel like this is an area of opportunity for the AES for the 2018 conference – to create this safe space for evaluators to share these vulnerabilities (and lessons). While the evaluation consulting world is one of competition (many of the presenters work at companies or firms who we bid against on a regular basis), I think we have to acknowledge that there’s a lot we can learn from each other.
It is important to recognise that we’re all on the same side, and I think that as a practice and a Society we can only improve if we explore our failures as well as our successes.
Hope to see you at the 2018 AES Conference in Launceston, Tasmania.
Following her attendance at the Private Land Conservation Conference in Melbourne last month, Bec reflects on the experience and what it means for monitoring and evaluation.
Last month I attended the Private Land Conservation Conference here in Melbourne. Coordinated by the Australian Land Conservation Alliance, the conference brought together a couple of hundred environmental, land management and conservation professionals from Australia and overseas.
Matt introduces a new concept he's been reflecting on - the problem onion.
It's been a couple of months since I attended the 2016 Australasian Evaluation Society (AES) Conference with my colleague, Dan Healy (no relation!). We presented three sessions between us and, overall, I think they went pretty well.
I find that it can always be valuable to reflect on these types of experiences to see what you can learn and, hopefully, improve on. This post is not about the sessions we presented, but rather a reflection across the whole conference on something I have affectionately titled the 'problem onion'.
Rebecca Denniss, a Researcher at FPC, reflects on her time in research, evaluation and design since joining the team.
It’s been one year since I stepped into the world of evaluation and joined First Person Consulting. I’ve been mentored by the team and had the opportunity to work on a range of interesting projects, on topics spanning natural resource management, climate change adaptation and energy efficiency to financial literacy, innovation networking and public health.
Consultant and FPC co-founder Matt Healey reflects on the value and power of design.
“Everything we evaluate is designed. Every evaluation we conduct is designed. Every report, graph, or figure we present is designed. In our profession, design and evaluation are woven together to support the same purpose—making the world a better place.” John Gargani, American Evaluation Association President.
To me, this is a wonderful insight and perfectly captures my feeling on the future of evaluation.