Welcome cocktail reception, Tuesday 18 September 6:00–8:00pm, open to all delegates! Venue: Penny Royal
 
Monday, September 17
 

9:00am AEST

Codesign and evaluation for social innovation (Penny Hagen)
This workshop explores how co-design and evaluative practice are being brought together to support new and developmental ways of working across national and local government, NGOs, business, frontline workers and community members.

Speakers
Penny Hagen

Co-design Lead, Auckland Co-design Lab
Penny assists organisations and teams to apply co-design and design-led approaches to the design and implementation of strategy, programs, policies and services. She specialises in projects with social outcomes and supports capability building with teams and organisations wanting...


Monday September 17, 2018 9:00am - 5:00pm AEST
Chancellor 6

9:00am AEST

Developing Monitoring and Evaluation Frameworks (Anne Markiewicz)
The workshop provides participants with useful, step-by-step practical guidance for developing a Monitoring and Evaluation Framework, supported by relevant background and theory. It presents a clear and staged conceptual model, discusses design and implementation issues and considers any barriers or impediments, with strategies for addressing these. Participants will learn the format and approach for developing a Monitoring and Evaluation Framework and the range of techniques and skills involved in its design and implementation, and will develop an appreciation of the parameters of the tasks involved and how to approach them.

Speakers
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development...


Monday September 17, 2018 9:00am - 5:00pm AEST
Tamar Room, Clarion Hotel City Park Grand

9:00am AEST

Making it stick - Creating an evaluation report for impact and use (Samantha Abbato)
This workshop is designed for professionals who commission, write or use evaluation reports. It will benefit beginners and those new to evaluation, as well as those at an intermediate level.
Participants will learn how to maximise report reach, engagement and use through applying strategies based on current psychological and communication principles and theory. Innovative reporting techniques using photo, film, story and online tools will be introduced.

Speakers
Samantha Abbato

Director, Visual Insights People
Dr Samantha Abbato is an evaluation consultant and Director of Visual Insights. Sam has a passion for maximising evaluation use through effective communication and evaluation skill building using a pictures and stories approach and increasing the academic rigour of evidence. Sam’s...


Monday September 17, 2018 9:00am - 5:00pm AEST
Chancellor 4

9:00am AEST

The basics of systems thinking and its application to systems evaluation (Ralph Renger)
The purpose of the workshop is to help evaluators understand the difference between using systems thinking and systems theory to evaluate programs versus modern-day systems.



Speakers
Ralph Renger

University of North Dakota; Center for Rural Health
Dr Renger spent the first twenty years of his evaluation practice advancing theory-driven methods for improving program evaluation. Several publications arose from his work, including the ATM approach to logic modeling, using source documentation to reconstruct program theory, and...


Monday September 17, 2018 9:00am - 5:00pm AEST
Chancellor 3

9:00am AEST

Valuing social outcomes to demonstrate impact (Taimur Siddiqi)
This interactive workshop will focus specifically on how to undertake an SROI analysis to value outcomes as part of ongoing monitoring and evaluation (M&E) activities and using M&E data. It will also encourage participants to consider the benefits and challenges of valuing outcomes. It will be based on peer learning, with a series of cooperative learning exercises and opportunities for group discussion. Participants will be asked to bring their own examples and provided with take home templates and resources to assist them with their analyses.

Speakers
Taimur Siddiqi

The Incus Group
Taimur is an experienced evaluation and impact measurement professional who is Managing Director of The Incus Group and a current member of the AES Pathways Committee. In his consulting role, he has completed numerous evaluation and impact measurement projects, working with a range...


Monday September 17, 2018 9:00am - 5:00pm AEST
Chancellor 5
 
Tuesday, September 18
 

9:00am AEST

Interview Skills: listening to understand (Jade Maloney and Kerry Hart)
This applied workshop is a practical forum to learn the fundamentals of good interviewing through practice. It aligns to AES professional learning competency 4 ‘research methods and systematic inquiry’. It is designed for people who need to collect qualitative data from clients or stakeholders for evaluation, but have limited prior experience doing so. The workshop covers what is needed to make a good interview – from asking the right questions through preparation and interviewer competencies to debriefing. At the end of the session, participants should feel equipped to conduct interviews with a range of stakeholders for evaluation.

Speakers
Kerry Hart

Senior Consultant, ARTD Consultants
Senior Consultant Kerry Hart has been interviewing with ARTD for over 15 years, using her specialist skills in face-to-face and telephone interviewing and small group processes to collect meaningful data to answer key evaluation questions. Kerry has collected meaningful data from...
Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...


Tuesday September 18, 2018 9:00am - 12:30pm AEST
Chancellor 4
  Preconference workshop
  • Modality: Paid half-day workshop
  • About: Senior Consultant Kerry Hart has been interviewing with ARTD for over 15 years, using her specialist skills in face-to-face and telephone interviewing and small group processes to collect meaningful data to answer key evaluation questions. Kerry has collected meaningful data from interviews and focus groups with senior executives in government agencies, frontline staff in non-government organisations, people with disability, children with cancer and their siblings, parents of children with disability, seniors, people from culturally and linguistically diverse backgrounds and people from Aboriginal communities. She brings a strong understanding of adapting approaches to different contexts.

9:00am AEST

Understanding evaluation contexts (Sarah Mason)
This half-day workshop is designed to introduce participants to the Framework for Situation Awareness in Program Evaluation: a research-based tool designed to support evaluators in defining and understanding the contexts in which they work. Through a combination of mini-lectures and evaluation scenarios, participants will gain practical experience interpreting real-world evaluation situations, along with feedback based on the real-world outcomes of these scenarios.

Speakers
Sarah Mason

Sarah Mason is a Research Fellow based at the Centre for Program Evaluation, The University of Melbourne, where she has taught classes in quantitative methods and mixed methods for evaluation. Over the past 15 years, Sarah has conducted research and evaluation projects across a wide...


Tuesday September 18, 2018 9:00am - 12:30pm AEST
Chancellor 3

9:00am AEST

Behaviour architects: a game that applies behavioural insights to improve policy solutions (Karol Olejniczak)
The workshop is designed in the form of a game with case studies that provide participants with engaging yet research-based learning experiences. All levels from beginners to advanced are welcomed, and the workshop is aimed at those interested in how to evaluate and improve public policy, as well as anyone wanting to experience a game-based workshop!

Speakers
Karol Olejniczak

Assistant Professor, University of Warsaw, Centre for European Regional and Local Studies
Karol Olejniczak is an Assistant Professor of public policy at EUROREG - University of Warsaw, Poland, and a visiting scholar at The George Washington University, Washington D.C. He is also a co-founder of policy research company Evaluation for Government Organization (EGO s.c.). His...


Tuesday September 18, 2018 9:00am - 5:00pm AEST
Chancellor 5

9:00am AEST

Conflict resolution skills: A toolbox for evaluators (Ruth Pitt)
The workshop is suitable for evaluators of any level of experience who have little or no training in conflict resolution skills. Participants will be asked to reflect on their experiences from evaluation projects, but those with limited evaluation experience will be able to draw on experience in other contexts. The workshop covers skills relevant to managing evaluations, and will support evaluators to develop competency in a number of professional competency standards from Domain 5 (Project management) and Domain 6 (Interpersonal skills).

Speakers
Ruth Pitt

Assistant Director, Evaluation Unit, Australian Government Department of Social Services
Ruth Pitt is Assistant Director of the Evaluation Unit at the Australian Government Department of Social Services. Her evaluation experience includes diverse roles in consulting, government and not-for-profit organisations, in Australia and overseas. Her qualifications include a Master...


Tuesday September 18, 2018 9:00am - 5:00pm AEST
Tamar Room, Clarion Hotel City Park Grand

9:00am AEST

Principles-Focused Evaluation for Transformation (Michael Quinn Patton & Kate McKegg)
Presented by Michael Quinn Patton and Kate McKegg, the workshop is based on Patton’s latest book, Principles-Focused Evaluation: The GUIDE (2018).

Principles-focused evaluation for transformations is a special application – and an especially relevant one. Transformations are not simply projects and programs; they can involve major, complex, and rapid systems changes. Given the complex, uncertain, multidimensional, and multi-level (macro, meso, micro) nature of transformative change, principles-focused global systems change initiatives are appropriately evaluated with principles-focused Blue Marble (whole Earth) evaluation. The implications of this will be presented, discussed, and applied.

Speakers
Kate McKegg

Director, The Knowledge Institute
Kate has worked in the fields of evaluation, evaluation capacity building, research, policy and public sector management since the late 1980s. She has developed specialist skills in developmental evaluation, programme evaluation, evaluation capacity building, strategy, policy, research...
Michael Quinn Patton

Independent Evaluation Consultant
Michael is delivering his keynote by video link, supported by expert facilitator Kate McKegg. Michael is an independent evaluation consultant based in Minnesota, USA. He is former President of the American Evaluation Association (AEA) and author of eight major evaluation books including...


Tuesday September 18, 2018 9:00am - 5:00pm AEST
Chancellor 6

1:30pm AEST

From Data to Learning - How to run an effective Reflection Workshop (Byron Pakula)
The purpose of this workshop is for evaluation practitioners and project managers to be trained in the facilitation approach of Reflection Workshops. Building on the facilitated processes of summit workshops used in evaluations, the reflection workshops are designed to build consensus among project staff (and often the donors) related to the context, activities, outcomes, impacts, cross-cutting issues, and management responses.

This workshop builds a foundational evaluation skill and is designed for beginner and intermediate practitioners, including M&E advisors and project managers. There are no prerequisites, though knowledge of summit workshops, results charts and/or evidence matrices is an advantage.

Speakers
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...


Tuesday September 18, 2018 1:30pm - 5:00pm AEST
Chancellor 3

1:30pm AEST

Questionnaire design: asking the right questions (Andrew Hawkins and Jasper Odgers)
This applied workshop is a practical forum to learn the fundamentals of good survey design through practice. It aligns to AES professional learning competency 4 ‘research methods and systematic inquiry’.

It is designed for people who need to collect standardised satisfaction and outcomes data from clients or stakeholders as part of their professional practice, but have no previous experience designing questionnaires. It is also suitable for funders of evaluation and research who want to understand what constitutes good practice when asked to review survey instruments as part of managing the contract for an evaluation.

Speakers
Andrew Hawkins

Partner, ARTD Consultants
Andrew works as a trusted advisor and strategic evaluator for public policy professionals, generating insight and evidence for decision-making. Andrew has worked for a wide range of Australian and NSW public sector agencies and not-for-profits on the design, monitoring and evaluation...
Jasper Odgers

Manager, ARTD Consultants
Jasper has been studying and working in quantitative research and data analysis for the past eight years. He manages online surveys, quantitative data analysis and data visualisation for all of ARTD’s reporting. He has recently managed several stakeholder surveys for NSW Government...


Tuesday September 18, 2018 1:30pm - 5:00pm AEST
Chancellor 4
 
Wednesday, September 19
 

9:00am AEST

Opening plenary: Welcome to Country; Michael Quinn Patton "Getting Real about Transformational Change: The Blue Marble Evaluation Perspective"
Welcome to Country: Aunty Nola Hooper
Opening address: Dr Lyn Alderman, AES President

Michael Quinn Patton (Independent Evaluation Consultant, Minnesota, USA)


Michael will be appearing by video link, with Kate McKegg facilitating.

Last year Andy Rowe presented climate change and sustainability as deep global challenges of the ‘Anthropocene’ – the geological age characterised by humans’ influence on the planet. He argued that “Every aspect of human activity needs to change if we and other life forms are to have a sustainable future.” That is a vision of transformation. But designing and evaluating genuinely transformational initiatives is different from designing and evaluating projects and programs. At international conferences on transformation, I’ve witnessed the challenges framed as complex, multidimensional, multi-layered, cross-silo, and dynamic – followed by traditional project and evaluation presentations that were anything but transformational. My premise: autonomous and isolated projects and programs do not lead to global systems transformation.

This presentation will set out a theory of global transformational change and the Blue Marble (whole Earth) evaluation implications of that theory.

Chairs
Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: evaluation capacity building; design thinking and innovation; evaluative practice in education, particularly in schools; public sector...

Speakers
Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has deep disciplinary knowledge and experience in program evaluation, evaluation frameworks...
Aunty Nola Hooper

Aunty Nola is a proud Tasmanian Aboriginal woman who is well-respected within the community for her leadership, strength and her passion for supporting and strengthening Tasmanian Aboriginal culture through shell necklace making, water carriers and mutton birding. Aunty Nola also...
Kate McKegg

Director, The Knowledge Institute
Kate has worked in the fields of evaluation, evaluation capacity building, research, policy and public sector management since the late 1980s. She has developed specialist skills in developmental evaluation, programme evaluation, evaluation capacity building, strategy, policy, research...
Michael Quinn Patton

Independent Evaluation Consultant
Michael is delivering his keynote by video link, supported by expert facilitator Kate McKegg. Michael is an independent evaluation consultant based in Minnesota, USA. He is former President of the American Evaluation Association (AEA) and author of eight major evaluation books including...


Wednesday September 19, 2018 9:00am - 10:30am AEST
Conference centre

11:00am AEST

Integrated Care Maturity Model
Simone Cheung (Deloitte Access Economics), James Linden (NSW Ministry of Health), Amy Hogan (Deloitte Access Economics)

In 2014, the NSW Government invested $180 million over six years under the Integrated Care Strategy. This funding was invested in innovative and locally led models of integrated care, as well as a range of system enablers, to explore how health care might be delivered to communities differently and in a more integrated way.

In May 2017, the funding body for the Integrated Care Strategy commissioned a state-wide formative evaluation of the strategy. The evaluation was at a program level but was informed by a high-level analysis of the 20 projects delivered by the Local Health Districts (LHDs) under the strategy. The complexity of the evaluation lay in the fact that each LHD had commenced implementation at a different point in time, and the models of care and target cohorts also differed across LHDs.

The evaluation approach relied heavily on qualitative data because the strategy was early in its implementation. The data sources included consultations with all Integrated Care sites, stakeholder surveys, past evaluation findings, linked hospital data sets and patient reported measures for a selection of LHDs. 

A maturity model, presented as a radar diagram, was developed to enable a comparison of LHDs against five dimensions of integrated care: program and service innovation, patient-centred care and empowerment, digital health and analytics, models of care, and partnerships. The maturity model was designed to be aspirational and to encourage growth, such that the highest ratings of maturity represented leading-practice stages of integrated care.

The maturity model provides a visual representation of maturity across the LHDs and highlights where there are strengths and/or gaps across the dimensions of integrated care. This enabled a state-wide evaluation of progress and facilitated the sharing of learnings on what works in NSW. 

Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development...

Speakers
Amy Hogan

Manager, Deloitte Access Economics
Amy Hogan is a Manager in the Health Economics and Social Policy team at Deloitte Access Economics and has wide-ranging experience involving economic analysis, evaluation, strategy and policy reform. Amy has six years’ experience in public sector consulting and is particularly passionate...
James Linden

Director, Funding and Policy Reform, NSW Health
James is a Public Health Specialist with experience across the UK NHS, NGO and Australian health settings. Previous roles have provided valuable exposure to the issues and enablers that support integration of care. James is passionate about policy reform that supports 'whole of system...


Wednesday September 19, 2018 11:00am - 11:30am AEST
Chancellor 3

11:00am AEST

In the deep end? Evaluation 101 for new evaluators
Charlie Tulloch (Policy Performance Pty Ltd)

Ask any evaluator how they ended up in this field, and most will say that they fell into it. Right in the deep end. This can be overwhelming, with theoretical, methodological, logistical and ethical challenges to consider. This presentation will provide an introductory overview of the evaluation field, adapted from evaluation capability building materials prepared and delivered within a large professional services firm. It will explore various definitions of evaluation, outline the rationale for undertaking evaluations, describe the role of evaluation across the government policy cycle, detail the most suitable types of evaluation, and step through practical considerations relating to planning, conducting and reporting on evaluation findings. It will draw on the AES Evaluators’ Professional Learning Competency Framework to identify the skills that new evaluators should seek to build as they develop. By the end of this session, those attending the conference to learn the basics will have a better understanding of their development path, and of the contribution they can make to extending their own practice and building personal capital.

Chairs
Dan Borg

Independent consultant
Dan is an evaluator with a background in environmental science and a PhD from the University of Melbourne. Dan has experience conducting evaluations in natural resource management, emergency management and health, in the public service and not-for-profit sectors. Dan is passionate...

Speakers
Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.



Wednesday September 19, 2018 11:00am - 12:00pm AEST
Conference centre

11:00am AEST

Thinking local and global: Tasmanian lessons in pursuit of Transformational Systems Change
Catherine Manley (Miles Morgan Australia), Ebeny Wood (Beacon Foundation), Anna Powell (Beacon Foundation)

During 2017, a team in Perth was seeking out a case study subject for a forthcoming publication on Australian skills development at the local level and came across a live example of transformational systems change in action, right here in Tasmania.
The Beacon Foundation's Collective ed. is a work-in-progress example of systems change design and practice, and a demonstration of the willingness and commitment of Tasmanian community, education, industry, and government. Currently working with six Tasmanian secondary schools, Collective ed. is designed to help schools try and test new ideas and new ways of helping young people complete Year 12.
This special panel brings together practice observers, designers, and evaluators, as well as school leadership associated with the Collective ed. project. The session is designed to stimulate discussion of, and engagement with, the panel's perspectives and to explore answers to valuable conference questions from both a local and global standpoint:
What are we learning about collaborating with unlikely partners and operating at the systems level?
How is evaluation practice adapting to work at the system level?


Chairs
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...

Speakers
Catherine Manley

Principal, Miles Morgan Australia
Catherine found her home in evaluation while completing her master's degree and learning from Sandra Mathison at the University of British Columbia in Vancouver. She now works in areas of evaluation and strategic research across Australia within areas of social policy relating to...
Anna Powell

Collective ed. State Backbone Lead, Beacon Foundation, Collective ed.
Anna is driven by a purpose to address the causes of inequality in Australia. With over 15 years of experience in building networks and coalitions for social change, Anna is currently the Collective ed. State Backbone Lead, working with a network of leaders and organisations across...
Ebeny Wood

Collective ed. Director, Beacon Foundation, Collective ed.
Ebeny came to Beacon Foundation from the University of Tasmania, where she was undertaking her doctorate on secondary student school disengagement. Her honours work was also in the field of education, with a focus on social change and schooling. Ebeny has been with Beacon since October...


Wednesday September 19, 2018 11:00am - 12:00pm AEST
Chancellor 6

11:00am AEST

Big data, big possibilities, big challenges: Lessons from using experimental designs in evaluation of system-level educational reforms
Duncan Rintoul (NSW Department of Education), Ben Barnes (NSW Department of Education), Ian Watkins (NSW Department of Education)

For many evaluators, quasi-experimental designs fall at the first set of hurdles, due to the absence of readily available data sets and the difficulties associated with identifying appropriate comparison/control groups. At the NSW Centre for Educational Statistics and Evaluation (CESE), we have been fortunate to clear these first hurdles on occasion, only to then hit the second set: the technical challenges of working with big data.
 
This paper is a chance for participants to get their hands dirty... or at the very least to hear the stories of people with dirty hands. The presenters are senior practitioners - the Director of CESE's evaluation unit and the Principal Data Analyst responsible for statistical modelling. The paper will lift the lid on this important (but uncommon) aspect of evaluation practice: the models they build; the data management challenges they face; the internal political challenges they face; the statistical methods that bear more - or less - fruit; and how they translate 'heavy quant' back into actionable insights for policy and program management. 

Through a set of case studies, the presenters will draw out practical lessons and tips for making these designs work - including what the team has needed in terms of skillsets, models, software, datasets, mindsets and other complementary elements of evaluation design that sit alongside the quant.

Chairs
Keryn Hassall

Aptavit
I'm interested in how people think - about evaluation, about policies and programs, about their work, and how people think about other people. I have two primary areas of professional focus: (1) organisational capability and practice change - using organisation theory and practical...

Speakers
Ben Barnes

Director Evaluation, NSW Department of Education
I began in consultancy, and made the move to the public sector in 2012. I am now Director of Evaluation at the Centre for Education Statistics and Evaluation in the NSW Department of Education. We have a team of over 30 internal evaluators, data analysts and evaluation capacity builders...
Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: evaluation capacity building; design thinking and innovation; evaluative practice in education, particularly in schools; public sector...
Ian Watkins

Principal Data Analyst, NSW Department of Education
Ian has a background in psychology and statistics. He studied and taught at the University of New South Wales before moving to the public sector. He is now the Principal Data Analyst at the Centre for Education Statistics and Evaluation in the NSW Department of Education where he...


Wednesday September 19, 2018 11:00am - 12:00pm AEST
Chancellor 5

11:00am AEST

The STrengthening Evaluation Practices and Strategies (STEPS) in Indigenous settings in Australia and New Zealand Project: Next "steps" in the journey.
Amohia Boulton (Whakauae Research for Maori Health and Development), Gill Potaka-Osborne (Whakauae Research for Maori Health and Development, NZ), Lynley Cvitanovic (Whakauae Research for Maori Health and Development, NZ), Sharon Clarke (Women's and Children's Health Network, Government of South Australia, AU), Lisa Warner (YWCA Adelaide), Jenni Judd (Central Queensland University), Margaret Cargo (University of Canberra)

The STEPS project has coalesced as a discrete piece of work over several years. Its genesis lies in the desire of a group of Indigenous and non-Indigenous evaluators in NZ and Australia to improve evaluation undertaken in Indigenous settings. Mixed-method concept mapping methodology was used to brainstorm practices and strategies to support culturally safe evaluation; 106 strategies were consolidated and sorted into conceptually meaningful groups; each strategy was rated on relative importance and achievability. Approximately 400 participants were involved in this work.

Concept maps for each country were developed using multi-dimensional scaling and hierarchical cluster analyses. The 12-cluster Australia map reflects three thematic regions: (1) An Evaluation Approach that Honours Community; (2) Core Heart of the Evaluation; (3) Cultural Integrity of the Evaluation. The 11-cluster New Zealand map reflects four regions: (1) Authentic Evaluation Practice; (2) Building Māori Evaluation Expertise; (3) Integrity in Māori Evaluation; (4) Putting Community First. Both maps highlight the importance of cultural integrity in evaluation. Differences include the distinctiveness of the Respecting Language Protocols concept in the Australia map, with language being embedded within the concept of Knowing Yourself as an Evaluator in a Māori Evaluation Context in the NZ map.

The ratings on importance and achievability highlight that all concepts are important, though differences exist between countries in perceived achievability. In both countries the concepts of Evaluator Qualities and Evaluator Integrity were rated as very important and as most achievable. We will present an overview of the concept maps and highlight importance and achievability ratings. Participants will be invited to discuss how resources can best be harnessed to 'grow' evaluation that works for Indigenous communities.

Chairs
Liz Smith

Partner, Litmus
I am a co-founder of Litmus, a specialist private sector evaluation company based in Wellington, New Zealand, and active across Australasia. My evaluation journey started more than 20 years ago as a nurse at the John Radcliffe Hospital, Oxford, when assessing the effects of a new nursing...

Speakers
Margaret Cargo

Associate Professor, University of Canberra
Originally from the northwest coast of Canada, Margaret migrated with her family to Australia in 2007. She is currently based at the Centre for Research and Action in Public Health, Health Research Institute, University of Canberra. Her expertise is in the implementation evaluation...
Sharon Clarke

Senior Project Officer, Aboriginal Well Women’s Screening Program, SA Government
On my mother's side my language group is Wergaia and on my father's side I am Gunditjmara from Victoria. Sharon Clarke is a Senior Project Officer and works in the area of Aboriginal Women's Health within South Australia. She has 45 years' experience working in the health domain, public...
Lynley Cvitanovic

Researcher, Whakauae Research Services Ltd
Born and brought up in Whanganui (Aotearoa New Zealand), I am fifth generation Pākehā of Croatian, English and Irish descent. I joined Whakauae in 2008, as a researcher and evaluator, after spending 25 years in service delivery and middle management roles in the public health (health...
Jenni Judd

Professor of Health Promotion, Central Queensland University
Indigenous Health and Education, Health Promotion, Capacity Building, Emerging Infectious Diseases, Rural and remote Health, research capacity building, Evaluation
Gill Potaka-Osborne

Researcher, Whakauae Research Services
Ko Aotea te waka, ko Ruapehu te maunga, ko Whanganui te awa, ko Ātihaunui-ā-Pāpārangi te iwi, ko Ngāti Tuera, Ngāti Pamoana, Ngāti Pareraukawa ngā hapū, ko Pungarehu, ko Parikino, ko Koriniti, ko Ngātokowaru Marae ngā marae. Ko Gill Potaka-Osborne au. In 2005, I began employment...
Lisa Warner

Coordinator Aboriginal Women's Leadership, YWCA Australia
Lisa Warner is a descendant of the Anangu Yankunytjatjara/Pitjantjatjara people. Lisa is employed at the YWCA Australia as a program coordinator of the Aboriginal Women’s Leadership Program, working alongside Aboriginal women in communities providing inspiring leadership development...


Wednesday September 19, 2018 11:00am - 12:00pm AEST
Chancellor 4

11:30am AEST

Beyond 'reach': Rethinking the evaluation of digital government
Tanja Porter (ACIL Allen Consulting)

It's been almost a decade since the Australian Government announced that social media would revolutionise how citizens engage with government and would lead to a raft of improvements in policy making and service delivery. Today, social media features in most government interactions with citizens - from managing expectations about hospital waiting times, to taking reports on pot holes that need fixing, or consulting on tax policy reform. 

How do we evaluate the impact of government activity that involves social media? Commonly we use the data generated by social media (hits, likes, shares, etc.) and draw conclusions about outcome and impact based on these measures of popularity and 'reach'. Social media measurement tools and dashboards make it increasingly easy to do so.

Drawing from case studies of social media in the development of the National Disability Insurance Scheme (NDIS) and the 'one punch' laws in NSW, my research shows that this data disguises the complexity of citizen-government interactions on social media. Evaluations relying on social media data alone are blind to context and power relations and can result in inaccurate appraisals of an activity's outcome or impact. 

By introducing the concept of 'deliberative systems' and the emerging techniques of 'digital ethnography', and applying them to the same two case studies, this presentation will show how evaluators can achieve far richer and more nuanced insights into citizen-government interactions through social media.

Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development...

Speakers
Tanja Porter

Senior Consultant, ACIL Allen Consulting
I love to get my teeth into policy reviews, program evaluations and doing research, consultation and analysis. For many years I managed public investigations and integrity processes in the Commonwealth Government. I've recently examined (via a PhD at ANU) what political theory can...


Wednesday September 19, 2018 11:30am - 12:00pm AEST
Chancellor 3

12:00pm AEST

Embracing the "Fish out of Water" – a novice evaluator's experience introducing reflective practice to influence systems transformation
Sophie McEniry (Bendigo Health)

"Systems thinking" is a loaded, hyped, and often misused term. Join a novice evaluator in her journey of navigating and creating a culture of reflection, experimentation and action in her small team. Listen to a story about the impact a couple of words and a kind act can have in supporting aspiring evaluators and systems change. Learn how introducing a "reflective practice" process has changed outcomes for practitioners and influenced systems engagement, transition and transformation.

Chairs
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...

Speakers
Sophie McEniry

Health Promotion Officer, Bendigo Health
Sophie is an emerging evaluator (Registered Nurse / Midwife) with recent experience evaluating a complexity/collective impact/systems thinking initiative addressing obesity in regional Victoria. Novice experience with developmental evaluation and shared measurement. Competent...


Wednesday September 19, 2018 12:00pm - 12:05pm AEST
Chancellor 6

12:00pm AEST

Transforming research organisations via monitoring, evaluation and learning: how can we evaluate our own work?
Larelle McMillan (CSIRO), Samantha Stone-Jovicich (CSIRO), Toni White (AgResearch New Zealand), Helen Percy (AgResearch New Zealand), Lan Chen (AgResearch New Zealand)

The potential for monitoring, evaluation and learning (MEL) to enhance innovation and impact is receiving increasing attention in practice and research. AgResearch NZ and CSIRO Agriculture & Food are working with our biophysical researchers to transform our organisations to achieve increased innovation and impact through embedded monitoring, evaluation and learning at the project and programme level. Lessons from CSIRO and AgResearch NZ have shown that it is difficult to systematically show the value of MEL in practice, with many researchers and managers asking "Is it worth the effort and resources?". While there are an increasing number of case studies and anecdotes pointing towards the role of MEL in helping deliver social, economic and environmental impacts, there is limited evidence, collated through systematic and rigorous methods, to substantiate this. In this paper we present an evaluation framework we developed drawing on insights from complexity science (the Cynefin framework) and reflective practices (the 'what, so what, now what' evaluation inquiries). The aim of the framework is to enable our organisations to gather empirical evidence and to track our MEL processes and outcomes in ways that enable organisational learning, inform research strategies and actions, and allow comparative analyses. We share insights from piloting this framework and provide reflections on how it can support researchers and science organisations to transform the way impact and innovation are framed and delivered.

Chairs
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development...

Speakers
Larelle McMillan

Research Impact Broker, CSIRO
Larelle's role in CSIRO Agriculture & Food is Research Impact Broker – she has particular agency in building relationships and partnerships for research impact. Her mix of project management, partnership brokering, writing and communication skills, complemented by her growing...
Helen Percy

Science Impact Leader - Adoption and Practice Change, AgResearch Ltd
Helen Percy is a Science Impact Leader – Adoption and Practice Change at AgResearch. She is currently leading a cross-organisation programme implementing the recommendations of AgResearch’s Adoption and Practice Change Roadmap, which includes developing organisational capability...
Toni White

Evaluator & Social Researcher, Plant and Food Research; ImpactAhead
• Strong background in biological sciences • Focus on bringing evaluation into the sciences in NZ • Loves working in the ECB space • Keen on learning and using new methodologies • Passionate about the social sciences • Co-facilitates the Waikato/BOP branch of ANZEA • Loves cats and...


Wednesday September 19, 2018 12:00pm - 12:30pm AEST
Chancellor 3

12:00pm AEST

Size Matters: Quantifying the Size of the Challenge Through Big Data, Analytics and Evaluative Thinking
Rico Namay (Ministry of Education, NZ)

Although still at an early stage in terms of unleashing the full extent of what they can offer to policy-setting, big data and analytics, combined with evaluative thinking, are a potent mix that could have a real and substantial impact on the way policy is set and on how research and evaluations are conducted in the future.

This presentation shows how the transformative power of linked government data and analytics, combined with the ability to ask the right questions, helps:
  • evaluate policy options,
  • make value-for-money assessments,
  • target participants for intervention programs and
  • set specific and measurable goals for organisations, school clusters in particular.

Reflections on some lessons gleaned from the application of big data and analytics follow the examples.

Chairs
Keryn Hassall

Aptavit
I'm interested in how people think - about evaluation, about policies and programs, about their work, and how people think about other people. I have two primary areas of professional focus: (1) organisational capability and practice change - using organisation theory and practical...

Speakers
Rico Namay

Principal Analyst, Ministry of Education, New Zealand
Analytics, research and evaluation enthusiast; music and film fan; food lover; and not necessarily in that order. Although Rico has no degrees in music, film or food, he holds degrees in Mathematics and Applied Mathematics and has done studies in Statistics, Econometrics and Optimisation. Rico...


Wednesday September 19, 2018 12:00pm - 12:30pm AEST
Chancellor 5

12:00pm AEST

Using co-design to give voice to Aboriginal people in the design of a culturally appropriate infant maternal health service
Sue Leahy (ARTD Consultants)

Traditional consultation approaches typically start with a service or program model in mind and ask for people's views, often ending up with a solution that largely reproduces the status quo. Co-design presents a valuable method for disrupting traditional power dynamics. Using creative techniques, co-design processes help to build a safe space in which participants can explore difference and find commonalities that cross normal boundaries and relationships.

This paper describes the steps in a successful co-design process to develop a new maternal and child health (MCH) service model to ensure Aboriginal families have access to culturally responsive and high quality MCH services. Twenty key stakeholders with expertise in working with Aboriginal families or delivering MCH services were drawn from across the state—half Aboriginal and half non-Aboriginal. They participated in a three-phase co-design process that explored in depth the needs and experiences of Aboriginal families, generated new service ideas to respond to these needs and then refined service features for implementation. Through a series of workshops stakeholders produced a flexible and tailored service model firmly centred on the needs of Aboriginal families.

Chairs
Liz Smith

Partner, Litmus
I am a co-founder of Litmus, a specialist private sector evaluation company based in Wellington, New Zealand, and active across Australasia. My evaluation journey started more than 20 years ago as a nurse at the John Radcliffe Hospital, Oxford, when assessing the effects of a new nursing...

Speakers
Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and Managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program...
Amanda Reeves

A/Manager, Performance and Evaluation Division, Department of Education and Training
Amanda is an experienced evaluation practitioner and policy analyst at the Department of Education Victoria. Amanda has led evaluation projects in a variety of roles in government, the not-for-profit sector and as an external consultant in education, youth mental health and industry...


Wednesday September 19, 2018 12:00pm - 12:30pm AEST
Chancellor 4

12:05pm AEST

Economic evaluation of justice support: Transforming life pathways for people with intellectual disability
Ruth McCausland (UNSW), Rebecca Reeve (UNSW)

Young people with intellectual disability from backgrounds of disadvantage often become 'managed' by the criminal justice system in the absence of holistic support in the community. This is extraordinarily costly in human and economic terms. This presentation reports on an economic evaluation of a program run by the Intellectual Disability Rights Service in NSW that demonstrated how the provision of appropriate support and services at a critical intervention point can transform the lives of individuals with intellectual disability in the criminal justice system, work towards more equitable legal outcomes and also result in cost savings to government.

Chairs
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...

Speakers
Ruth McCausland

Senior Research Fellow, School of Social Sciences, UNSW
Dr Ruth McCausland is Director of Research and Evaluation for the Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW, and Senior Research Fellow in the School of Social Sciences, UNSW. Her research focuses on women, young people, people with disabilities and...
Rebecca Reeve

Senior Research Fellow, Intellectual Disability Behaviour Support (IDBS), UNSW Australia
Dr Rebecca Reeve is a Senior Research Fellow in the Intellectual Disability Behaviour Support (IDBS) team at UNSW. She is also a Senior Research and Advocacy Officer at The Smith Family. Rebecca’s academic research as an econometrician focuses on vulnerable Australian populations...


Wednesday September 19, 2018 12:05pm - 12:10pm AEST
Chancellor 6

12:10pm AEST

Joining the dots: evaluation and strategy
Joanna Farmer (beyondblue)

Evaluation is often seen as something that occurs at the micro, program level, while strategy happens up at the macro, organisational level. Increasingly, though, organisations are thinking about how they can measure the performance of their strategy, and the programs that contribute towards it, to drive long-term strategic goals. The presenter reflects on her experience developing organisational strategy from an evaluation background, highlighting the key stages of developing organisational strategy and how evaluative thinking can be used to improve goal-setting, implementation and monitoring at all levels of an organisation.

Chairs
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...

Speakers
Joanna Farmer

Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Masters of Evaluation at Melbourne Uni.


Wednesday September 19, 2018 12:10pm - 12:15pm AEST
Chancellor 6

12:15pm AEST

Using systems theory to explore the impacts and outcomes of a research and evaluation capacity building partnership
Rochelle Tobin (Curtin University), Jonathan Hallett (Curtin University), Roanna Lobo (Curtin University), Bruce Maycock (Curtin University)

The Sexual Health and Blood-borne Virus Applied Research and Evaluation Network (SiREN) takes a partnership approach to building the research and evaluation capacity of organisations working to address sexual health and blood-borne virus issues in Western Australia. Despite the potential of partnership approaches, like SiREN, to improve public health practice, there is limited understanding of how they work and the kinds of outcomes they can achieve. This presentation will describe the application of systems theory to understand how, and in what ways, the SiREN model has influenced research and evaluation practices.

Chairs
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...

Speakers
Rochelle Tobin

PhD candidate
I am a PhD candidate investigating SiREN's (Sexual Health and Blood-borne Virus Research and Evaluation Network) influence on research and evaluation practices in the Western Australian sexual health and blood-borne virus sector. I also support SiREN's knowledge translation activities...


Wednesday September 19, 2018 12:15pm - 12:20pm AEST
Chancellor 6

12:20pm AEST

Designing a transformative evaluation framework
Sarah Stamp (Queensland Family and Child Commission), Nerida Leal (Queensland Family and Child Commission), Rhian Stack (Queensland Family and Child Commission), Bianca Reveruzzi (Queensland Family and Child Commission), Katrina Middlin (Queensland Family and Child Commission), Jessica Eggleton (Queensland Family and Child Commission)

The Queensland child protection system is undergoing a 10-year reform program to transform the system. This evaluand requires an equally transformative evaluation framework. Evaluations scheduled at three time points require different approaches to defining and measuring success, given the varied purposes of the evaluations, the maturity of the reform program and the data available for each evaluation, which were unknown at program commencement. AES delegates will have the opportunity to hear how we designed a flexible, transformative evaluation framework, with subsequent evaluation plans defining success measures, allowing evaluation planning to occur early while ensuring the evaluations remain appropriate.

Chairs
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for the government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...

Speakers
Sarah Stamp

Principal Advisor, Evaluation, Queensland Family and Child Commission
Sarah is an evaluation practitioner with public and private sector experience across a range of health and human services areas. Sarah developed this Ignite presentation in collaboration with her colleagues at the QFCC.


Wednesday September 19, 2018 12:20pm - 12:25pm AEST
Chancellor 6

1:30pm AEST

Theories on and of: A systematic analysis of evaluation's domains of knowledge
Ghislain Arbour (Centre for Program Evaluation)

The evaluation discipline deals with a diverse body of knowledge. Some theories, concepts and other models are about investigating the value of things, and some are about how evaluation partners engage in the evaluation process. Others are about how people should conduct evaluations, and we also have ideas about how we communicate the results from such evaluations. We even developed theories about events that happen after the evaluation is done, concerning decision-making and other types of use.

But what is evaluation knowledge, really? What defines it? What delineates it from other disciplines? What is the role of other disciplines in developing evaluation knowledge? Can all contributions relevant for evaluation qualify as evaluative knowledge?

This paper is an attempt at answering the aforementioned questions. In so doing, it proposes a systematic framework to organise evaluation's various domains of knowledge. The framework is driven by a fundamental distinction between theories on evaluation and theories of evaluation. The former are the theories and concepts from various disciplines that are applied to the social object of evaluation to explain, among other things, the administrative, political and sociological nature of evaluation. The latter are the theories and concepts that explain the determination of value.

Speakers
Ghislain Arbour

Senior Lecturer, The University of Melbourne
Doctor Ghislain Arbour is a Senior Lecturer at the University of Melbourne, where he coordinates the Master of Evaluation. Research and consultancy: a primary research interest of Dr Arbour is the clarification of necessary concepts in the analysis and judgement of the performance of...


Wednesday September 19, 2018 1:30pm - 2:00pm AEST
Chancellor 4

1:30pm AEST

The Promise Design-thinking and Implementation Science holds for Social Impact Evaluation: Views from Practitioners and Evaluators
Ruth Aston (University of Melbourne), Rachel Aston (ARTD Consultants), Timoci O'Connor (University of Melbourne), Robbie Francis (The Lucy Foundation / University of Otago, NZ)

In the last decade, the prevalence of complex evaluands (multi-site, multi-input, multi-output and multi-outcome) aiming to achieve social change has grown exponentially. However, the expansion and development of approaches to measuring the impact of these evaluands have not kept pace. A multi-year research project conducted by the authors investigated measures for evaluating the impact of complex social change initiatives, and found that intervention design and implementation are proxy indicators for intervention impact. This short paper presentation will draw on the key findings of the research project 'creating measures for social change' and will present challenges and promising approaches in social impact evaluation, including the role of technology, co-design and implementation science.
Evaluators and social change practitioners from the University of Otago National Centre for Peace and Conflict Studies, The Lucy Foundation and ARTD Consultants will present critical practical considerations for applying the findings of the research, drawing on evaluations in public health, social enterprise and family violence. The ways in which inclusive and accessible information about design and implementation could support adaptive monitoring and evaluation needs in challenging social change contexts will also be reflected on.


Chairs
avatar for Stefan Kmit

Stefan Kmit

A/Manager, Research and Evaluation, Department for Child Protection

Speakers
avatar for Ruth Aston

Ruth Aston

Lecturer, University of Melbourne
Dr Ruth Aston is a Lecturer at the Centre for Program Evaluation and Honorary Research Fellow at the Centre for Adolescent Health at Murdoch Children's Research Institute. Ruth has a background in public health, including working on a three-year national project investigating the... Read More →
avatar for Rachel Aston

Rachel Aston

Manager, ARTD Consultants
Rachel is an experienced social researcher and evaluator at ARTD Consultants. She brings eight years’ experience conducting research and evaluation for government, NGOs and in the higher education sector. Rachel has a high level of expertise in qualitative and mixed-methods research... Read More →
RF

Robbie Francis

Director, The Lucy Foundation
Robbie Francis is a young woman who has packed a lot into 29 years. Having lived with a physical disability since birth, she has worked in the disability sector for over a decade as a support worker, documentary maker, human rights intern, researcher, consultant and as an advisor... Read More →
avatar for Timoci O’Connor

Timoci O’Connor

Lecturer, Centre for Program Evaluation
Timoci has over ten years of experience in conducting research and evaluation projects in the public health, education, international development and community sectors. He holds a Master of Public Health and is currently doing his PhD exploring the nature of feedback in community-based... Read More →


Wednesday September 19, 2018 1:30pm - 2:00pm AEST
Chancellor 6

1:30pm AEST

Outcomes, dashboards and cupcakes
Jenny Riley (Navigating Outcomes Pty Ltd)

Outcomes-based performance management is heading our way. We know it, and at Windana we are getting ready. But we wanted an outcomes measurement framework that works for us and our clients - one that is meaningful, robust and proportionate. We wanted it not to be a tick-box, top-down administrative burden but something that could add value to our work, and perhaps even drive it.

With this in mind, Windana embarked on an outcomes measurement journey in April 2017. Our ambition was to introduce real-time outcome measurement into our 35-bed therapeutic community in Maryknoll, Victoria. We took the time to build our team's skills and knowledge about outcomes versus outputs, and our residents participated in a 'theory of change' workshop, allowing us to identify our short-, medium- and long-term outcomes.
The consultants worked with us to recommend validated tools to collect data aligned with our intended outcomes. We envisioned a dashboard through which this data could be fed back to our clients and staff in real time. We launched our dashboards on 4 December (this is where the cupcakes come in) and have been collecting and using the data to support our work in Maryknoll.

We will present feedback from staff and clients six months into using our live dashboards: Was it worth it? Is it adding value to our work? What are we learning?

In concluding this paper, we will share the process, including what worked well and what we could have done better. We will share our recommendations for setting up outcome measurement in other therapeutic communities and programs in the AOD sector, and our 'what next' thinking about shared measurement across the sector and opportunities for data linkage.

Chairs
avatar for Eleanor Williams

Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data... Read More →

Speakers
avatar for Clare Davies

Clare Davies

Executive Director Rehabilitation Services, Windana Drug and Alcohol Recovery Inc
avatar for Jenny Riley

Jenny Riley

Founder and Director, Navigating Outcomes
I help organisations and collaborations access and learn from their data so they can improve social outcomes for individuals, families and communities. My passion is to bring together digital solutions (cloud, mobile, big data analytics, social media) and the social sector. I develop... Read More →


Wednesday September 19, 2018 1:30pm - 2:00pm AEST
Chancellor 5

1:30pm AEST

What we wish we'd known: The experiences of new and emerging evaluators
Rebecca Denniss (First Person Consulting), Matthew Healey (First Person Consulting), Liz Smith (Litmus / AES), Amy Gullickson (University of Melbourne, Centre for Program Evaluation), Nerida Buckley (Sustainability Victoria), Sally Hartmanis (Deloitte Access Economics)

The beauty of evaluation as a discipline and a professional practice is that it involves diverse skills, capabilities, mindsets and approaches that can be applied across diverse contexts, cultures, landscapes and sectors. While this presents opportunities, it can also be overwhelming.

New and emerging evaluators often get told what they need and what they should be doing - so, instead, we're asking them for their perspective. For this panel session, we have brought together a collection of movers and shakers with a range of experiences, including:
  • a young up-start who started up his own evaluation firm
  • new and emerging evaluators from the government and non-government sectors
  • an evaluation educator who challenges and inspires evaluators across all stages of their careers
  • an experienced evaluator and senior AES member who describes herself as a 'disrupter'.

If you are a new or emerging evaluator, this is your chance to ask questions, seek mentoring and advice, share experiences and, most importantly, tell us what you need to transform your career.
If you're an experienced evaluator, it's your chance to meet some of the region's brightest new evaluators—and talk to them about all the things you wish you'd known in the early stages of your career!

Facilitated by new and emerging evaluators for new and emerging evaluators, this panel session will involve discussion about:
  • capabilities, mindsets, approaches and skills
  • learning from failures and f**k ups
  • mentoring and support
  • professional pathways

After hearing a bit about the stories of each of the panellists, the majority of this session will be dedicated to questions and answers, and facilitated audience discussion.

Chairs
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Speakers
avatar for Nerida Buckley

Nerida Buckley

Strategy & Planning Business Partner, Sustainability Victoria
Nerida is an emerging evaluator with experience in the sustainability and natural resource management sectors. Nerida enjoys working in complex environments and across disciplines to deliver new insights and inform stakeholder-driven program and policy design. Having previously worked... Read More →
avatar for Rebecca Denniss

Rebecca Denniss

Consultant, First Person Consulting
As an emerging evaluator and Consultant at First Person Consulting, Rebecca evaluates a range of programs and policies in the natural resource management, climate change, sustainability and social sectors. She brings experience in social and environmental policy and research to her... Read More →
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →
avatar for Sally Hartmanis

Sally Hartmanis

Senior Analyst, Health Economics & Social Policy, Deloitte Access Economics
Sally is a Senior Analyst in the Health Economics & Social Policy team at Deloitte Access Economics. She is concurrently completing a Master of Public Health, specialising in Health Economics, at Monash University part-time and also holds a Bachelor of Biomedicine, Genetics major... Read More →
avatar for Matt Healey

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... Read More →
avatar for Liz Smith

Liz Smith

Partner, Litmus
I am a co-founder of Litmus, a specialist private sector evaluation company based in Wellington, New Zealand, and active across Australasia. My evaluation journey started more than 20 years ago as a nurse in the John Radcliffe Hospital, Oxford, when assessing the effects of a new nursing... Read More →


Wednesday September 19, 2018 1:30pm - 2:30pm AEST
Chancellor 3

1:30pm AEST

New words, old approaches: Weaving together foundational principles for contributing to transformation through evaluation
Robyn Bailey (Allen + Clarke), Emma Walke (University of Sydney), Roxanne Bainbridge (Central Queensland University)

Do new terms such as 'co-design' signal substantively new or different approaches to evaluation? Or are they repackaging old concepts - concepts fundamentally important for ensuring the self-determination of Indigenous peoples? Do 'co' approaches - co-design, co-operative inquiry, co-production, co-creation - inherently address issues such as power and control over decision-making and resources, or can they further entrench current inequities?

We contend that it is not evaluation approaches in and of themselves that contribute to better outcomes for Indigenous peoples. Rather, it is the application of principles and practices that consciously address inequities in power, the diversity of voices, values and knowledge, and the benefits arising from such evaluation projects.

We have started to build a principles-based framework to guide our practice during both the co-design and the evaluation of a substantive program aimed at improving outcomes for Aboriginal and Torres Strait Islander peoples. This framework attempts to weave together principles emphasised by Aboriginal and Western forms of inquiry - differing ways of knowing, doing and being.

We invite you to a yarning circle to talk about foundational principles and practices which respect 'all learn, all teach' processes and practices. We would like to explore whether there are evaluation approaches that are inherently more culturally safe and transformative, whether it is the way in which we apply our craft that is key to realising better outcomes for Indigenous and ultimately all peoples, or whether it is something else. The knowledge generated in the session will be shared back with participants, using both visual and written mediums.

Chairs
avatar for Keryn Clark

Keryn Clark

DMEL & Research Consultant

Speakers
avatar for Robyn Bailey

Robyn Bailey

Senior Associate, Evaluation + Research, Allen and Clarke
Hello! I am a Pakeha (European) New Zealander, currently working in both Australia and New Zealand. Evaluation, and contributing to better outcomes for people and communities through evaluation, is my passion. Along with colleagues, I work with both government agencies and NGO providers... Read More →
avatar for Roxanne Bainbridge

Roxanne Bainbridge

Director Centre for Indigenous Health Equity Research, Central Queensland University
I am a Gungarri/Kunja Aboriginal researcher from South Western Queensland in Australia and Professorial Research Fellow at Central Queensland University where I am Director for the Centre for Indigenous Health Equity Research. My current priority evaluation is embedded in a partnership... Read More →
avatar for Emma Walke

Emma Walke

Academic Lead Aboriginal Health - Co-Lead Community Engagement, University Centre for Rural Health
I'm a Bundjalung woman, my family from Cabbage Tree Island/Ballina area Northern NSW. I have a role that works with Medical and Allied health Students when away from their home base universities to understand and work better with Aboriginal and or Torres Strait Islander peoples. I... Read More →


Wednesday September 19, 2018 1:30pm - 2:30pm AEST
Conference centre

2:00pm AEST

The offerings and challenges of transdisciplinarity for evaluation
Keren Winterford (Institute for Sustainable Futures, University of Technology Sydney)

This paper explores the offerings, but also the challenges, of applying the theory and practice of transdisciplinary research - increasingly employed in academic research - to the realm of evaluation. This way of working responds to the recognition of 'wicked' problems and complexity, and to the view that solutions for the future will not come from single disciplines alone. As Einstein said, "we cannot solve our problems with the same thinking that created them."

Transdisciplinarity offers an approach through which to ask different types of questions, to different types of actors, in order to create new types of transformative knowledge for improved program design and implementation. 

The paper describes aspects of transdisciplinary research - purposive, holistic, participatory, experimental, action-focused and dynamic - and situates these within practice examples of evaluation. The paper highlights the importance of situating evaluator expertise alongside other sets of knowledge and exploring the underlying world views that inform policy and program interventions. This type of practice is increasingly in line with how projects and programs operate.

Transdisciplinarity offers a set of thinking and practice that situates the evaluator together with other sets of knowledge. This includes equally valuing and integrating different knowledge and perspectives; working outside traditional definitions and crossing disciplinary boundaries; and adapting and transforming to find connections and meaning.

The paper tests the practice of transdisciplinary research against the expectations of evaluation practice and highlights the challenges of working through such an approach, including the uncertainty of bringing multiple actors together in a process of co-design and co-production, the use of different languages, and the dominance of singular frameworks. Despite these challenges, the paper concludes that transdisciplinarity provides a useful means of guiding evaluation theory and practice, and of helping evaluators contribute to addressing societal problems, shaping discourse and strengthening policy and programming objectives.

Speakers
avatar for Keren Winterford

Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years of work experience working in the international development sector, in multiple capacities with Managing Contractors, NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous... Read More →


Wednesday September 19, 2018 2:00pm - 2:30pm AEST
Chancellor 4

2:00pm AEST

Whose outcome is it anyway? Using matrices to serve many masters.
Linda Leonard, Nolan Stephenson (WA Department of Primary Industries and Regional Development)

Was the project outcome met? How many times have we heard that phrase as we try to justify the outcome of a project for further funding? In general terms, the assumption is that the outcome of a project will meet a particular stakeholder need and address the issue. This is not always the case. We find ourselves in situations where the outcome meets the needs of one type of stakeholder, but many others have vested interests. Through one lens the project is deemed successful; through another it has failed. How then do we meet the needs of various audiences while staying true to a project outcome?

This presentation looks at the transformation from a single-outcome approach to project delivery to an approach that meets the expectations of a range of vested interests. A quote from Homer states, "if you serve too many masters you will suffer". We approach with caution, aware of the complex pathways that may form in transitioning to an end point. Audiences facing complex environments driven by political and budgetary constraints will be interested in gaining insights into how multi-level logical thinking can meet the needs of a range of parties.

Experience from this particular program has shown that the use of matrices offers insight, awareness and decision-support thinking to a wider audience. The presentation explores how one approach, rubrics, can be used to provide a decision-support framework that enables stakeholders to understand levels of success from varying points of view. Using the theory behind rubrics allows for the development of measurement standards, decision-making and the validation of priorities for a variety of needs. The methodology allows for a transformation away from linear thinking to one that reflects multi-criteria consideration of stakeholders who want buy-in on the result.
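As a concrete illustration of the matrix idea, here is a minimal sketch of a rubric that rates the same project outcome from several stakeholder perspectives. It is hypothetical only: the stakeholders, criteria and level descriptors are invented for this sketch, not drawn from the WA program's actual rubrics.

```python
# Illustrative sketch only: a rubric that rates one project outcome from
# several stakeholder perspectives. All names and descriptors below are
# hypothetical placeholders, not the program's actual rubric.
LEVELS = ["poor", "adequate", "good", "excellent"]

# Each stakeholder defines what success looks like at each level.
rubric = {
    "grower":    {"criterion": "yield improvement",
                  "descriptors": ["<2%", "2-5%", "5-10%", ">10%"]},
    "funder":    {"criterion": "benefit-cost ratio",
                  "descriptors": ["<1.0", "1.0-1.5", "1.5-2.5", ">2.5"]},
    "community": {"criterion": "regional jobs supported",
                  "descriptors": ["none", "1-5", "6-20", ">20"]},
}

# Evidence gathered for the project, judged against each descriptor set.
judgements = {"grower": "good", "funder": "adequate", "community": "excellent"}

for stakeholder, level in judgements.items():
    entry = rubric[stakeholder]
    print(f"{stakeholder}: {entry['criterion']} rated {level} "
          f"(level {LEVELS.index(level) + 1} of {len(LEVELS)})")
```

Reading down the ratings makes the presenters' point visible: the same project can sit at different levels of success depending on whose criterion is applied.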

Chairs
avatar for Stefan Kmit

Stefan Kmit

A/Manager, Research and Evaluation, Department for Child Protection

Speakers
avatar for Nolan Stephenson

Nolan Stephenson

Principal Policy Officer Evaluation, Department of Primary Industries and Regional Development
I use evaluative thinking, methodologies and promote a culture of evaluation to create an environment where evaluation is accepted as a holistic approach to inform and guide policy, investment and project implementation. In my current role, I evaluate regional development initiatives... Read More →


Wednesday September 19, 2018 2:00pm - 2:30pm AEST
Chancellor 6

2:00pm AEST

New evaluation techniques for the transformation of Melbourne. Time-and-place targeting technology and the decline of the 300-page evaluation report
David Spicer (Colmar Brunton), Kirstin Couper (Colmar Brunton)

The impact of disruptions initiated by the transformation of transport infrastructure is a hot topic in Melbourne. Improvements to crucial arterial roads and public transport corridors mean there is a lot for Melburnians to consider when planning a journey. We will share the results from an evaluation of the impact of twelve infrastructure disruptions, each covering different locations, time periods and transport modes.

Historically, there has been concern that traditional lagging indicators from online and phone surveying could not capture accurate or timely recall of travel experience. We overcame this limitation using 'geo-targeted sampling' as part of a suite of methodologies: targeted surveys on mobile devices, using GPS data to identify individuals who had been present at a specific location at a specific time.

There was no traditional 'Evaluation Report' for this study, nor did we use static 'scorecards' or similar devices across the twelve disruptions. Instead, we shaped the way that policy-makers and planners could interrogate the data relevant to their area by providing a series of interactive online dashboards. The dashboards enabled the dissemination of findings and created a space where a broad range of stakeholders could test their own hypotheses - stakeholders who may not have been able to answer their own research questions using traditional, static report or scorecard materials. This did not de-value the role of the evaluator, who was always on hand to aid with the interpretation and translation of data into insights. Rather, it empowered clients and their stakeholders to take control of their own data. We will demonstrate the dashboard outputs in our presentation.
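To make the geo-targeting mechanics concrete, below is a minimal sketch of how a sample might be selected from opted-in GPS pings. The site coordinates, time window, field names and radius are illustrative assumptions, not details of the actual methodology.

```python
# Minimal sketch of geo-targeted sample selection: find device IDs whose
# opted-in GPS pings fall within a radius of a disruption site during the
# disruption window. All values and field names are illustrative only.
import math
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

SITE = (-37.8183, 144.9671)                              # hypothetical disruption site
WINDOW = (datetime(2018, 5, 7, 6), datetime(2018, 5, 7, 10))  # hypothetical window
RADIUS_KM = 0.5

def eligible_devices(pings):
    """pings: iterable of (device_id, timestamp, lat, lon) tuples."""
    hits = set()
    for device_id, ts, lat, lon in pings:
        in_window = WINDOW[0] <= ts <= WINDOW[1]
        if in_window and haversine_km(lat, lon, *SITE) <= RADIUS_KM:
            hits.add(device_id)   # invite this device to the survey
    return hits
```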

Chairs
avatar for Eleanor Williams

Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data... Read More →

Speakers
avatar for David Spicer

David Spicer

Research Director, Colmar Brunton


Wednesday September 19, 2018 2:00pm - 2:30pm AEST
Chancellor 5

2:30pm AEST

Maximising the Value Add of a Strategic Evaluation Function in an International Non-Government Organisation (NGO)
Sarah Leslie (The Fred Hollows Foundation), Peta Leeman (The Fred Hollows Foundation)

In this presentation, we will share our experiences as internal evaluators in an international NGO trying to develop a strategic evaluation function. This will include:
  • developing the evaluation policy, defining strategic evaluations and developing guidance on how to do these;
  • how many strategic evaluations we’ve done to date and a more in-depth profile of a couple;
  • our learnings on how to effectively structure, commission and manage these evaluations and support learning from the evaluation;
  • how we’ve tried to apply our learnings in more recent evaluations and the implications for the NGO’s overall monitoring and evaluation system.

Chairs
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Speakers
PL

Peta Leeman

Senior Monitoring and Evaluation Adviser, The Fred Hollows Foundation
avatar for Sarah Leslie

Sarah Leslie

Monitoring and Evaluation Advisor, The Fred Hollows Foundation
I support staff across the Foundation to design and implement monitoring and evaluation plans and use the information they collect to inform their work.


Wednesday September 19, 2018 2:30pm - 2:35pm AEST
Chancellor 3

2:30pm AEST

Synthesising Kirkpatrick's four-levels
Francesca Demetriou (Lirata Consulting)

Donald Kirkpatrick published the four-level model for evaluating training programs in his 1994 book Evaluating Training Programs: The Four Levels. His objective was to "provide a simple, practical four-level approach for evaluating training programs" (Kirkpatrick, 2006). Since then, the framework has been applied extensively in evaluating training and development programs.

What is clear in choosing to utilise this framework is that the evaluation takes on a specific values frame: that these four levels (Reaction, Learning, Behaviour, Results) are the appropriate criteria by which to judge the program. What is not clear is the relative importance of each of the four levels, and therefore how evaluators should synthesise findings about the four levels to reach an overall judgement about the performance of a program.

Synthesis is an important step in evaluation, but there is a tendency for this step to occur in ways that are neither rigorous nor explicitly justified, leaving hidden assumptions behind the conclusions provided in evaluations.

In the context of the Kirkpatrick model, for a training program to be considered "good", how well does each of the four levels need to perform for the program to be judged adequate, good or excellent? Can a training program where participants learn a lot, but the results for the organisation are limited, be considered "good"?

Using Jane Davidson's (2005) guidance on evaluation synthesis, and a literature review on the use and critiques of the Kirkpatrick model, this paper considers the assumptions behind the model to provide guidance for determining the relative importance of the four levels.
An example in practice is presented, where a synthesis methodology is developed for a Tasmanian community leadership program evaluation that uses the Kirkpatrick model.
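To see why the relative importance of the levels matters, consider a minimal synthesis sketch. The weights, cut-offs and "bar" rule below are invented for illustration, in the spirit of Davidson's caution that purely numeric weighting hides value judgements; they are not the weighting derived in the paper.

```python
# Illustrative synthesis across Kirkpatrick's four levels. The weights and
# the "bar" rule are hypothetical choices that show the mechanics; they are
# not the weighting the paper derives.
LEVEL_SCORES = {"reaction": 4, "learning": 4, "behaviour": 3, "results": 2}  # 1-5 ratings
WEIGHTS = {"reaction": 0.1, "learning": 0.2, "behaviour": 0.3, "results": 0.4}

# Numeric weight-and-sum alone hides value judgements, so pair it with
# explicit rules ("bars") that a program must clear.
weighted = sum(LEVEL_SCORES[k] * WEIGHTS[k] for k in WEIGHTS)

# Example bar: whatever its weighted score, a program cannot be rated
# "good" overall if Results falls below 3 ("adequate").
if LEVEL_SCORES["results"] < 3:
    overall = "adequate at best: organisational results below the bar"
else:
    overall = "good" if weighted >= 3.5 else "adequate"

print(f"weighted score = {weighted:.2f}; overall judgement: {overall}")
```

With these example numbers the program scores 2.90 but is still capped at "adequate at best", showing how a bar rule can overrule the weighted average and answer the abstract's question about strong learning with weak results.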

Speakers
avatar for Francesca Demetriou

Francesca Demetriou

Evaluator, Lirata
I’m an early career evaluator with a background in social research. I have worked with NFPs, NGOs and government across a range of different sectors, including health, housing and homelessness, education, employment, and refugee settlement services. I’m especially interested in... Read More →


Wednesday September 19, 2018 2:30pm - 3:00pm AEST
Chancellor 4

2:30pm AEST

Co-creating an evaluation of an innovative collective impact project: the Katherine Individual Support Program
Jenne Roberts (Menzies School of Health Research), Eslyn Fletcher (Katherine Regional Aboriginal Health and Related Services (KRAHRS)), Graham Castine (Kalano Aboriginal Corporation), Darrell Brock (Wurli Wurlinjang Aboriginal Health Service), Simon Quilty (Katherine District Hospital)

A Consortium of Aboriginal service providers has united with a small 60-bed hospital to ensure that homeless, frequent attenders of the emergency department are not turned out onto the streets after receiving treatment. They had a compelling idea - that their combined efforts could transform the service system and improve wellbeing - and they didn't want to wait until they had exhausted their pilot funding to find out if it worked. So, these social innovators chose to work with a developmental evaluator. Together, they use evaluation to improve design and implementation, strengthen their collective impact and transform into a cohesive, person-centred network of services.

This presentation will outline the magic that results from combining Indigenous concepts of wellbeing, developmental evaluation and the lived experience of participants to co-create knowledge, solve complex service system gaps as they are identified, and increase access to social and health services. The partners co-create methods to ensure participants receive culturally appropriate, collaborative case management, primary health care and timely access to services.

This presentation will illustrate the complexities of bringing stakeholders together to: 1) generate a shared workplan and common set of indicators of positive impact; 2) identify the principles and values that underpin the co-creation and collective impact approach adopted by the Consortium; and 3) reflect on the value of their combined efforts to support the 500-plus people who present frequently to the Emergency Department. The presentation will explore some of the problems encountered in evaluating collective impact and how they are being tackled and overcome. The Consortium members and frontline service providers (from several agencies) will speak candidly (in person and in a video presentation) about how they have been able to open an innovation process to ongoing, collective scrutiny.

Chairs
avatar for Stefan Kmit

Stefan Kmit

A/Manager, Research and Evaluation, Department for Child Protection

Speakers
avatar for Jenne Roberts

Jenne Roberts

Evaluation Manager, Menzies School of Health Research
Jenne is an international health evaluator, working in Indigenous health in Australia and international public health, mostly in South East Asia. Jenne is interested in identifying the efforts and interventions that spark positive social and health impact, and engaging intended beneficiaries... Read More →


Wednesday September 19, 2018 2:30pm - 3:00pm AEST
Chancellor 6

2:30pm AEST

Leveraging publicly available longitudinal and transactional data sources to create comparison groups in quasi-experimental and natural experimental evaluation scenarios.
Gerard Atkinson (ARTD Consultants)

One of the challenges faced by evaluators is how to effectively determine the impacts of a program when a control group is not readily available. Sometimes the design of the program makes such groups impossible or unethical to create (e.g. mandatory or selective participation), or constraints on resources and scope make such investigations infeasible.

These challenges have led to the development of quasi-experimental and natural experimental approaches to evaluation. In parallel with the adoption of these techniques, the shift to policies of "open government" has enabled greater public access to data. Many of these data sets capture transformations in society over time, or provide records of how people have interacted with government and public services. In the right situations, these data can be used to augment impact evaluations by creating comparison groups for analysis.

This presentation looks at a variety of publicly available data sources, ranging from large-scale longitudinal studies such as HILDA to geographic data such as the Geocoded National Address File (G-NAF) and transactional data such as public transport journeys. These data sets can be used to enhance the robustness of quasi-experimental and natural experimental evaluations. Through example data sets and case studies, we consider the challenges of identifying and preparing such data, the privacy and ethical implications, and the value that such data can add to the evaluation process.
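As one hedged example of how public data can stand in for a missing control group, the sketch below builds a matched comparison group from a public longitudinal extract using nearest-neighbour propensity score matching. The file names and columns are placeholders, and this is a generic technique offered for illustration rather than the presenter's specific method.

```python
# Illustrative sketch: building a matched comparison group from a public
# longitudinal dataset via nearest-neighbour propensity score matching.
# File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

covariates = ["age", "income", "remoteness_score"]  # assumed shared numeric covariates

participants = pd.read_csv("program_participants.csv")       # treated group
population = pd.read_csv("public_longitudinal_extract.csv")  # e.g. a HILDA-style extract

combined = pd.concat([participants.assign(treated=1),
                      population.assign(treated=0)], ignore_index=True)

# Estimate propensity scores: the probability of being a program
# participant given the observed covariates.
model = LogisticRegression(max_iter=1000).fit(combined[covariates], combined["treated"])
combined["pscore"] = model.predict_proba(combined[covariates])[:, 1]

treated = combined[combined["treated"] == 1]
control_pool = combined[combined["treated"] == 0]

# For each participant, find the closest non-participant on propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(control_pool[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
comparison_group = control_pool.iloc[idx.ravel()]
```

In practice the matching would also need checks on covariate balance and the privacy and ethical safeguards the abstract raises; the sketch only shows the core construction step.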

Chairs
avatar for Eleanor Williams

Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data... Read More →

Speakers
avatar for Gerard Atkinson

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: program and policy evaluation; workshop and community facilitation; business analytics and data visualisation; market and social research; financial and operational modelling; non-profit, government and business strategy... Read More →


Wednesday September 19, 2018 2:30pm - 3:00pm AEST
Chancellor 5

2:35pm AEST

Evaluative study to assist a transformation of the Indigenous affairs system
Kevin Dolman (Kevin J Dolman Consulting)

The research aim of my PhD thesis is to identify the systemic problems that have been hindering the efficiency and effectiveness of the Indigenous affairs system. It involved a detailed case study of the Council of Australian Governments' Indigenous Whole-of-Government Trials Project, undertaken from 2002 to 2007. Under this project, the Commonwealth, State and Territory Governments agreed to experimentally plan and deliver services to eight trial regions around Australia with substantial Indigenous populations. The application of two public administration practices was required: 1) whole-of-government coordination; and 2) a partnership with the Indigenous communities. The theory was that this approach would produce better socioeconomic outcomes from public expenditure.

I investigated how well the trials succeeded in implementing this approach. I found there was very little success in applying the two practices across the eight trial sites and, consequently, few positive socioeconomic outcomes from the overall project. In seeking to understand the reasons for this disappointing result, I appraised the quality of the project's public administration by analysing the policy development, implementation and evaluation stages against recognised best-practice standards. The research revealed a pattern of relatively poor-quality public administration across all three stages of the project.

Chairs
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Speakers
avatar for Kevin Dolman

Kevin Dolman

Principal Analyst, Indigenous Evaluation Consultant
Dr Kevin J. Dolman is an Eastern Arrernte man who has worked for thirty years in the government, private and community sectors across a spectrum of Indigenous social, economic and cultural policy. He is a consultant in evaluation and policy fidelity with a special interest in how... Read More →


Wednesday September 19, 2018 2:35pm - 2:40pm AEST
Chancellor 3

2:40pm AEST

Strengthening program impact on systems and building evaluation into systems
Jade Maloney (ARTD), Katherine Rich (ARTD)

To address 'wicked' social problems, there's a need for programs to recognise their potential to impact on systems, and for ongoing learning to be built in. So what can we evaluators do to amplify the impact of an evaluation project on systems? We'll share three takes from our work with Fair Trading:
  • Evaluators can assist program managers to build systems thinking into design - incorporating ways of addressing systems issues into their logic rather than treating them as external factors or barriers.
  • Evaluators can build capacity for evaluative thinking among program staff in every evaluation project.
  • And, when the context is right, they can also create a transferable monitoring and evaluation framework that organisations can continue to use when the project ends.

Chairs
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Speakers
avatar for Jade Maloney

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
avatar for Katherine Rich

Katherine Rich

Manager, ARTD
Katherine is a Manager at ARTD Consultants. She manages evaluation and research projects across a range of policy sectors, including industry, disability, transport and energy and environment. She previously held a senior position in the research division of one of Australia's top... Read More →


Wednesday September 19, 2018 2:40pm - 2:45pm AEST
Chancellor 3

2:45pm AEST

Evaluating influence
Joanna Farmer (beyondblue)

Evaluation theory has primarily emerged from the desire to measure the impact of discrete programs or interventions. However, for many organisations, especially not-for-profits, the primary goal is advocacy, and attributing behaviour change to any one action is challenging. These organisations often still have to demonstrate impact to funders, boards and government - so how do you evaluate influence? The presenter draws on theory, and on her experience evaluating advocacy and influence models, to provide simple and practical steps for understanding and attributing change.

Chairs
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Speakers
avatar for Joanna Farmer

Joanna Farmer

Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Masters of Evaluation at Melbourne Uni.


Wednesday September 19, 2018 2:45pm - 2:50pm AEST
Chancellor 3

2:50pm AEST

Systemic transformation in action: Turbo-charging evaluation and impact in the New Zealand science system
Helen Percy (AgResearch Limited), Toni White (AgResearch Limited)
How do we ‘turbo-charge’ evaluation and impact in the New Zealand science and innovation system?
What does it take for research, government and industry organisations to explore a collective approach to tackling the challenge of evaluating science impact?
This presentation tells our story of collaboration for system transformation: collaborating across organisations through the Impact Planning and Evaluation Network and - through a facilitated forum - gaining a shared understanding, language and benchmarking of current evaluative capacity; identifying what is needed to turbo-charge the current state; and taking initial steps towards systemic transformational change.

Chairs
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Speakers
avatar for Helen Percy

Helen Percy

Science Impact Leader - Adoption and Practice Change, AgResearch Ltd
Helen Percy is a Science Impact Leader – Adoption and Practice Change at AgResearch. She is currently leading a cross-organisation programme implementing the recommendations of AgResearch’s Adoption and Practice Change Roadmap, which includes developing organisational capability... Read More →
avatar for Toni White

Toni White

Evaluator & Social Researcher, Plant and Food Research; ImpactAhead
• Strong background in biological sciences • Focus on bringing evaluation into the sciences in NZ • Loves working in the ECB space • Keen on learning and using new methodologies • Passionate about the social sciences • Co-facilitates the Waikato/BOP branch of ANZEA • Loves cats and... Read More →


Wednesday September 19, 2018 2:50pm - 2:55pm AEST
Chancellor 3

3:30pm AEST

Integrating evaluation and design roles: Innovations in recent NGO projects
One of the exciting developments in evaluation is for evaluators to also be involved in the program design process. This session will explore how the dual evaluation / co-design role can work in practice, using program logic as a tool.
This will be an interactive session, where we will build a program logic on the floor to show the process in action. By building and challenging the program logic in a quick, collaborative manner, participants will experience how a project design can evolve rapidly to give it a better chance of achieving long-term outcomes. This is evaluation planning and program design, rolled together.
This session uses financial literacy programs as a case study, based on work we have done recently with a cluster of NGO projects where evaluative / design thinking has made a big leap forward. The effects have been fascinating to observe. Teams have embraced major changes to the project design. The funder has supported increased funding where the program logic showed it was necessary to get the desired outcomes. The process also helped plan a better evaluation, as the program logic revealed the points where evaluation focus was most critical.

We will also explore the environment that fosters the transformation into evaluation / co-design.  Key factors include early engagement, agile evaluators, funders demanding a program logic and encouraging innovation, and capacity building amongst program managers to foster evaluative thinking. 

The session will be of interest to evaluators who are keen to broaden their practice into co-design, and to active co-designers who wish to share insights.

Chairs
avatar for Ian Patrick

Ian Patrick

Director, Ian Patrick & Associates
Dr. Ian Patrick has wide experience in the evaluation, design and management of social sector programs. This includes a strong background in design and implementation of M&E systems, conduct of program evaluations, strategic planning, analysis and research, and training and academic... Read More →

Speakers
avatar for Robert Drake

Robert Drake

Director, SmartSteps
Director of SmartSteps, a consultancy specialising in the design and evaluation of financial literacy programs. Internationally, Robert works with the World Bank on the design and evaluation of financial literacy and inclusion programs in Indonesia. He was previously General Manager... Read More →
avatar for Vanessa Hood

Vanessa Hood

Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation... Read More →



Wednesday September 19, 2018 3:30pm - 4:00pm AEST
Chancellor 3

3:30pm AEST

Buka Hatene - an innovative model promoting adaptive management for improved aid effectiveness in Timor-Leste
Louise Maher (M&E House, Timor-Leste)

In Timor-Leste, an innovative approach to transforming monitoring, evaluation and learning capacity and quality for improved aid effectiveness has been developed. The Australian government has established M&E House, a monitoring and evaluation focussed facility designed to contribute to a high-performing and continually improving development program by ensuring that (1) the Australian Embassy is equipped with the evidence and capacity to continually improve decision-making and tell a clear performance story, and (2) implementing partners generate and use evidence to learn, adapt and produce user-focused reports.

M&E House will transform current practice into a whole-of-program adaptive performance management system. M&E specialists implement the program, supported by an Australian organisational partnership. M&E House has facilitated the development of a whole-of-program performance assessment framework identifying shared outcomes and indicators for improved integration, collaboration and reporting across program boundaries, and will develop an underpinning information management system. Strategic reviews on cross-sectoral issues provide evidence for improved systems-level programming. Implementing partners are supported to develop and implement M&E plans, apply adaptive practice, and improve reporting. Evaluation capacity building is focussed on improving foundational capabilities, changing mindsets, and building motivation.

The M&E House model allows for the application of a single M&E approach, which is utilisation-focussed and realist, and which consolidates evidence from mixed methods. It enables M&E methods to be trialled, improved and scaled out. It ensures M&E expertise is accessible to stakeholders, and keeps M&E front-of-mind for implementers. A lean and influential approach ensures targeted information is available for decision-makers. It allows trusting relationships to develop, ensuring stakeholder participation and engagement in improving program performance.

Baseline data on M&E systems justifies the need for an innovative solution, and early evidence after one year indicates that the M&E House model may be an effective and relevant solution to transforming MEL systems for improved aid effectiveness in Timor-Leste.

Speakers
avatar for Louise Maher

Louise Maher

Team Leader, M&E House (Buka Hatene)
Louise Maher is the Team Leader of M&E House, which supports the Australian development program in Timor-Leste to tell a clear results story. Louise has a background in physiotherapy, public health, and international development. She is interested in building organisational motivation... Read More →


Wednesday September 19, 2018 3:30pm - 4:00pm AEST
Chancellor 5

3:30pm AEST

Learning from Failure: A Safe Space Session
Matt Healey (First Person Consulting)

The increasing appetite from government, philanthropy and other funders for innovative approaches to complex social and environmental challenges has driven many towards trends such as design thinking, human-centred design and co-design. These design approaches emphasise (among other things) a willingness to try and fail and, most importantly, to learn from that failure.

For evaluators, failure (or the potential for failure) is a risk to be mitigated. Should failure occur or mistakes be made, they tend to be kept in-house or otherwise not shared more broadly. To fail means disappointing clients, stakeholders (internal and external) and the communities we seek to benefit. 

Given that, and the increasing emphasis on integrating design into our practice, how can evaluators come together to learn from our collective failures and mistakes? How can we pass this learning on to the next generation of evaluators in a way that acknowledges their own experiences and perspectives? What opportunities does such failure unearth for the evaluation sector and field?

This interactive session addresses these questions through facilitated discussion and shared reflection. Through a mix of lightning talks, small group discussions and whole room consensus-making, the session will elicit sharing about times that mistakes were made and what lessons can be learned from those mistakes—for conference attendees and the field of evaluation.

This session will result in a set of agreed upon principles that (hopefully) lay the groundwork for the future sharing of instances where mistakes were made and the lessons learned. This session will be guided by a set of house rules to ensure that attendees feel comfortable in sharing. Upon entry, participants will provide their name, contact details and consent to these principles, which will also enable follow-up after the session.

Chairs
avatar for Catherine Hastings

Catherine Hastings

PhD Candidate, Macquarie University
I am in my final year of a PhD developing explanations for Australian family homelessness. Prior to this, I worked as an applied social research and evaluation consultant. My interests are realist and complex evaluations in areas related to social equality and social justice.

Speakers
avatar for Matt Healey

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... Read More →


Wednesday September 19, 2018 3:30pm - 4:30pm AEST
Conference centre

3:30pm AEST

Is this strategy working?: The systems thinking approach to investing for impact
Lewis Atkinson (Haines Centre for Strategic Management LLC)

The systems thinking approach is an important tool for evaluators
because it is a way to:
  • clarify the system level that you are trying to change
  • be people-centric by focusing on clear measures of people's 'better-off-ness'
  • rapidly build evaluation capacity
  • establish a common language for measuring impact
  • focus on evidence-based practice and continuous improvement
  • turn strategic reflection into practical action
  • ensure a participative process with stakeholders to co-design systems change
  • provide a low-tech/low-cost introduction to measuring the outcomes of programs
  • use iterative hypothesis testing to validate the theory of change for programs
  • ensure accountability for the theory of change over time, by whom and at which system level
  • create a narrative that is evidence-based and reported as a contribution to social impact

The authors use group reflection and discussion based on a Results Based Accountability (RBA) adoption case study within a medium-sized NFP company delivering community services in Queensland. Practitioners will be exposed to how RBA and other systems thinking tools and participative methods for stakeholder engagement are used to build evaluation capacity, create an evaluative culture, encourage timely utilisation of feedback loops and foster a commitment to strategic learning.
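For readers unfamiliar with RBA, the sketch below illustrates its three performance-measure questions (how much did we do, how well did we do it, is anyone better off); the service counts are invented purely for illustration and do not come from the Queensland case study.

```python
# Illustrative RBA-style performance measures for a community service.
# The counts are invented; RBA organises performance measures around
# three questions: how much did we do, how well did we do it,
# is anyone better off?
clients_seen = 240                      # "How much did we do?"
plans_completed_on_time = 204           # quality numerator
clients_with_improved_wellbeing = 156   # "better off" numerator

how_much = clients_seen
how_well = plans_completed_on_time / clients_seen            # "How well did we do it?"
better_off = clients_with_improved_wellbeing / clients_seen  # "Is anyone better off?"

print(f"How much: {how_much} clients")
print(f"How well: {how_well:.0%} of support plans completed on time")
print(f"Better off: {better_off:.0%} of clients reporting improved wellbeing")
```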

By the end of this session, participants will be able to:
  • understand how systems thinking can be applied to facilitate evaluative thinking
  • understand how Results Based Accountability (RBA) is used to build evaluation capacity
  • understand how to use RBA to report change for people at different system levels
  • understand that the systems thinking approach can accommodate any validated method of measuring change over time.

Chairs
avatar for David Roberts

David Roberts

Principal Consultant, RobertsBrown
David Roberts is a self-employed consultant with wide experience over 35 years in evaluations using both qualitative and quantitative methods. David has training and experience in Anthropology, Evaluation and Community Development. In 2016, David was awarded the Ros Hurworth Prize... Read More →

Speakers
avatar for Lewis Atkinson

Lewis Atkinson

Global Partner, Haines Centre for Strategic Management LLC
JOIN US AT THE FREE SYSTEMS THINKING CONVERSATION SERIES - see the schedule and program by cutting and pasting this link: https://bit.ly/2LBOQhw. Systems thinker (http://hainescentre.com/research-based/key-concepts-of-general-systems-theory/) and the architect of strategic change. I am also in the... Read More →


Wednesday September 19, 2018 3:30pm - 4:30pm AEST
Chancellor 6

3:30pm AEST

Personal and professional transformation through cultural safety training: Learnings and implications for evaluators from two decades of professional development
Kathleen Stacey (beyond...(Kathleen Stacey & Associates)), Sharon Gollan (Sharon Gollan & Associates)

This presentation will: 1) provide an orientation to the focus of, and our approach to, cultural safety training; 2) share learnings from 15 years of evaluation feedback from workshop participants; and 3) propose how understanding cultural safety can assist in the development, implementation and evaluation of programs designed for, or inclusive of, Aboriginal and Torres Strait Islander Australians.

The concept of cultural safety has emerged in NZ and Australia over the past 20 years - it addresses how power operates and how equity is or is not achieved on the basis of cultural identity in the context of colonisation. In Australian training contexts, it shifts the focus from learning about Aboriginal and Torres Strait Islander peoples to non-Indigenous people learning about themselves, and exploring their relationship with racism, whiteness and the dominant culture. This can be confronting, but for many it results in personal and professional transformation, particularly if undertaken as part of an organisational cultural change process.

Qualitative evaluation data has been gathered since 2004 by an Aboriginal/ non-Aboriginal partnership that has facilitated over 400 interactive two-day workshops across all Australian states and territories, involving sectors such as: health, family and community support, child protection, education, law and justice, Aboriginal affairs, and planning and transport/infrastructure. The data demonstrates different ways in which many participants experience personal and professional transformation, including how they will apply this to their work contexts.

In our experience as evaluators, a clear understanding of, and commitment to contributing to, cultural safety for Aboriginal and Torres Strait Islander Australians can result in critical changes to how programs are developed and implemented, and to whether meaningful outcomes are achieved. It is also a vital lens through which any evaluator should approach their role in evaluating programs designed for, or inclusive of, Aboriginal and/or Torres Strait Islander Australians.


Chairs
avatar for Lee-Anne Molony

Lee-Anne Molony

Director, Clear Horizon
Principles-focused evaluation • Evaluating place-based approaches • Org-level evaluation frameworks

Speakers
avatar for Sharon Gollan

Sharon Gollan

Sharon Gollan is a descendent of the Ngarrindjeri nation of South Australia, with family and cultural connections to many communities within and beyond South Australia. Sharon has worked professionally and academically in a range of human services fields in Australia. She has over... Read More →
avatar for Kathleen Stacey

Kathleen Stacey

Managing Director, beyond... (Kathleen Stacey and Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and... Read More →


Wednesday September 19, 2018 3:30pm - 4:30pm AEST
Chancellor 4

4:00pm AEST

Lessons on designing, monitoring, evaluating and reporting for policy influence programs
Ikarini Mowson (Clear Horizon), Byron Pakula (Clear Horizon)

Development aid is transforming from direct service provision to influencing policies to promote systemic change and achieve development outcomes. More and more aid programs are seeking to become catalytic drivers through influencing policies. These influencing programs have some distinct elements that mean traditional approaches to design, monitoring, evaluation and reporting are not as relevant. 

Drawing on experience in facilitating and developing monitoring and evaluation frameworks for policy influence programs, this paper presents some practical lessons that can be applied by designers, managers and evaluators. 

First, we must understand the five main characteristics of policy influence programs: complexity; unpredictable links between cause and effect; scope and scale that may shift; policy goals and commitments that may change; and outcomes and impacts that may be delayed. Second, there needs to be a clear definition of the policy changes the program expects. Policy changes may be defined broadly, allowing the program to capture policy decisions and processes, including implementation, or defined to capture every step in the policy cycle. Third, use people-centred approaches to theory of change, including stakeholder analysis, in order to step out causal pathways and make sure intermediate outcomes are clearly articulated. Fourth, monitoring systems can be strengthened by light-touch approaches, such as an influence log, to capture the intricate details that may not be recognised as triggers of change until later. Fifth, apply multiple evaluation methods to measure influence, particularly methods that assess the contribution of an intervention to policy change rather than its outputs or outcomes. Evaluating contribution is more realistic, cost-effective and practical than seeking to establish attribution or using an experimental approach. Outcome harvesting tools such as outcome mapping, episode studies or significant instances of policy and systems improvement (SIPSI) could be used in the evaluation.
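To picture the influence log mentioned in the fourth lesson, here is a minimal sketch of what one record might hold. The fields are our assumption about a light-touch log and are not a published template.

```python
# A minimal influence-log record, sketched as a dataclass. The fields are
# an assumption about what a light-touch log might capture; they are not
# a published template.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class InfluenceLogEntry:
    when: date
    actor: str                 # who the program engaged with
    policy_process: str        # which policy process or decision was touched
    activity: str              # what the program did
    observed_response: str     # any signal of uptake, however small
    evidence: list = field(default_factory=list)  # documents, minutes, emails

log = [
    InfluenceLogEntry(date(2018, 3, 14), "Ministry planning unit",
                      "National nutrition strategy",
                      "Shared synthesis brief at technical working group",
                      "Brief cited in draft strategy annex",
                      ["meeting minutes 14/03"]),
]
# At evaluation time, entries are harvested (cf. outcome harvesting) and
# assessed for the program's contribution to each policy change.
```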

Speakers
avatar for Rini Mowson

Rini Mowson

Consultant, Clear Horizon
Rini has been working in the international development and Australian community development sectors for more than thirteen years. She has worked with a broad range of social change initiatives and businesses globally. Rini has extensive experience in designing and implementing monitoring... Read More →
avatar for Byron Pakula

Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for government, private and not for profit sectors internationally and in Australia for over fifteen years. Byron is a well respected and influential manager, strategist and adviser that has applied himself to some... Read More →


Wednesday September 19, 2018 4:00pm - 4:30pm AEST
Chancellor 5

4:00pm AEST

Between the known and the unknown: exploring innovation in evidence-based programs
Alexandra Ellinson (UTS Institute for Public Policy and Governance)

As evaluators we are increasingly tasked with assessing innovation in programs—including in programs that are also intended to be evidence-based. While there isn't an inherent contradiction here, there can be some tensions. The imperative for evaluators to account for both innovation and an evidence-base creates challenges in evaluation design, delivery and reporting: particularly because innovation is often associated with high expectations around outcomes (or at least more efficient outcomes), even if the response is less thoroughly tried and tested.  

To navigate these challenges, it is helpful to clarify, in collaboration with program commissioners and deliverers, at what stages and around what aspects of a program innovation is expected to operate. Drawing on lessons from recent projects, I outline a typology that locates innovation in (1) the commissioning approach, e.g. outcomes-based contracting; (2) the funding strategy, e.g. social investment models; (3) the design process, e.g. co-design; and/or (4) the structure of information sharing within program delivery, e.g. developmental learning. As each of these is an attempt to encourage more responsive, targeted and often localised solutions, each demands a different way of prioritising the role of evidence in informing what might work best.

I conclude by reflecting on the need for evaluators to recognise how our activity can create or contribute to risk-averse program environments that are less conducive to innovation. We need to reflect critically on our 'observer effect'. Accordingly, I set out some practical considerations for evaluators - covering how we use theory, how we resource evaluation components, and how we report on outcomes - so that we work in a way that minimises these impacts.

Chairs
avatar for Ian Patrick

Ian Patrick

Director, Ian Patrick & Associates
Dr. Ian Patrick has wide experience in the evaluation, design and management of social sector programs. This includes a strong background in design and implementation of M&E systems, conduct of program evaluations, strategic planning, analysis and research, and training and academic... Read More →

Speakers
avatar for Alexandra Ellinson

Alexandra Ellinson

Manager, Evaluation, Research & Policy, Institute for Public Policy & Governance, UTS
Alexandra delivers strategic evaluations, reviews and consultancies, working in partnership with clients. She is deeply knowledgeable about the social services sector and has particular expertise in interagency responses to complex issues, such as housing and homelessness, domestic... Read More →


Wednesday September 19, 2018 4:00pm - 5:00pm AEST
Chancellor 3

4:30pm AEST

The Office of the Inspector-General's Cyclone Debbie Review: Lessons for delivering value and confidence through trust and empowerment
Iain MacKenzie (Inspector-General Emergency Management), Rowena Richardson (Inspector-General Emergency Management)

The purpose of this presentation is to demonstrate how system-level evaluation can lead to tangible improvements and benefits to communities.

In March 2017, Tropical Cyclone Debbie and subsequent severe weather events resulted in the activation of all levels of Queensland's emergency management system (the System).  Strong winds, torrential rain and flooding resulted in significant damage to homes, infrastructure and agriculture, impacting many communities. The effect of Debbie across a large area of Queensland is now well documented and the vast recovery effort continues. 

The Office of the Inspector-General Emergency Management (the Office) in Queensland is mandated to provide assurance to state government and the community on the effectiveness of the System. The Office reviewed how the System responded to Debbie. The review ensured that lessons were captured; common themes for improvement identified; and good practice shared system-wide.

The presentation will explore the rigorous evaluation methodology, including extensive consultation.
Sources of evidence included:
  • attendance at 23 formal debrief sessions undertaken by local, district and state disaster management groups, NGOs, state and commonwealth agencies 
  • engagement with 65 entities, reviewing policy, plans and other associated data that supports disaster management activities
  • analysis of specific data related to Debbie, e.g. Emergency Alert campaigns
  • research into good practice evidence and case studies to inform identified themes, analysing previous reviews undertaken by the Office and other entities
  • a community survey of 1200 residents in affected areas of Queensland to capture public opinion and validate findings.
The review found that the disaster management system in Queensland is well constructed, experienced and practised, and it identified a range of opportunities for improvement and good practice examples. Five major themes emerged from the evaluation: planning, public information and engagement, information management, evacuation, and capability. The recommendations have been accepted by government, including the implementation of a system-wide lessons management program.

Chairs
avatar for David Roberts

David Roberts

Principal Consultant, RobertsBrown
David Roberts is a self-employed consultant with wide experience over 35 years in evaluations using both qualitative and quantitative methods. David has training and experience in Anthropology, Evaluation and Community Development. In 2016, David was awarded the Ros Hurworth Prize... Read More →

Speakers
avatar for Michael Shapland

Michael Shapland

Director for Interoperability and Innovation, Office of the Inspector-General Emergency Management
Mike Shapland is Director for Interoperability and Innovation in the Office of the Inspector-General Emergency Management in Queensland.  Before joining the Office, he worked with the Department of Emergency Services and Department of Community Safety from 2004 to 2013 in roles covering... Read More →


Wednesday September 19, 2018 4:30pm - 5:00pm AEST
Chancellor 6

4:30pm AEST

The potential for system level change: Addressing political and funding level factors to facilitate health promotion and disease prevention evaluation
Joanna Schwarzman (Monash University), Ben Smith (The University of Sydney; Monash University), Adrian Bauman (The University of Sydney), Belinda Gabbe (Monash University), Chris Rissel (NSW Ministry of Health), Trevor Shilton (National Heart Foundation, Western Australia)

Despite the known importance of evaluating prevention initiatives, there are challenges to conducting any evaluation, and efforts can fall short in terms of quality and comprehensiveness. Evaluation capacity building research and strategies have to date focused on individual and organisational levels. However, the factors acting to influence evaluation practice at the level of the prevention system have not been explored. 

We conducted a national mixed-methods study with 116 government and non-government organisations that sought to identify the factors that influence evaluation practice in the prevention field. Participating organisations took part in three phases of data collection. These were qualitative interviews (n=40), a validated evaluation practice analysis survey (n=216, 93% response rate) and audit and appraisal of two years of evaluation reports (n=394 reports). 

In this presentation we focus on the determinants of evaluation practice at the prevention system level. We found the system played a key role in the demand for evaluation; however, it also presented significant challenges, particularly through time-limited funding agreements and mismatched expectations between policy makers and funded agencies. The political and funding contexts impacted on the resources available for prevention programs and the purpose, scope and reporting requirements for evaluation. We also found some prevention organisations were proactive in negotiating and modifying elements of the political, contextual and administrative requirements to improve the conditions for evaluation. Other organisations with less evaluation capacity, resources and experience were not in a position to engage in advocacy to the same degree.

Evaluation capacity building is an increasingly important component of many evaluators' roles, and there are still important gains to be made within prevention organisations and government agencies. This research builds on insights concerning organisational-level influences, and can guide evaluators, practitioners and policy makers.

Chairs
Speakers
avatar for Joanna Schwarzman

Joanna Schwarzman

PhD Candidate, Monash University
I'm in the final stages of completing my PhD research at the School of Public Health and Preventive Medicine, Monash University. Over the last three years I've been working to identify and understand the factors that influence evaluation practice in health promotion and disease prevention... Read More →


Wednesday September 19, 2018 4:30pm - 5:00pm AEST
Chancellor 5

4:30pm AEST

Inclusive and culturally safe evaluation capacity building
Sharon Babyack (Indigenous Community Volunteers), Alison Rogers (PhD Candidate, University of Melbourne), Doyen Radcliffe (Indigenous Community Volunteers)
There is an urgent need to move towards culturally safe, appropriate and relevant ways of evaluating that contribute to better outcomes for Indigenous peoples. An Indigenous non-profit community development organisation has transformed towards this goal by intentionally building evaluation capacity over a period of four years. The organisation now incorporates participatory monitoring and evaluation approaches into community development practices to improve measurement and capture outcomes with the communities.
The transformation adopted essential principles including inclusion, flexibility, empowerment, ownership and effective communication. These principles were incorporated to ensure that everyone involved was brought along on the journey to strengthen the monitoring, evaluation and learning systems.
An independent researcher was engaged to assess the degree to which the organisation was able to build evaluation capacity. This organisation’s journey of change and the methodology used to make the assessment may be useful for other organisations who could undertake a self-assessment or for other researchers who could adapt the process. Acknowledging that there are no common measures for assessing the sustainability of evaluation capacity building, this presentation will contribute to knowledge on this topic by sharing an example that has been implemented in practice.  

Chairs
avatar for Lee-Anne Molony

Lee-Anne Molony

Director, Clear Horizon
Principles-focused evaluation; evaluating place-based approaches; organisation-level evaluation frameworks

Speakers
avatar for Sharon Babyack

Sharon Babyack

General Manager Impact & Strategy, ICV - Indigenous Community Volunteers
While at ICV, I've delivered the Monitoring, Evaluation and Learning Review project, co-designed the M&E database and framework and developed and run the consultation and M&E training processes with our regional teams. I'm currently co-leading our team as we undertake participatory... Read More →
avatar for Doyen Radcliffe

Doyen Radcliffe

Regional Manager, Indigenous Community Volunteers
Doyen Radcliffe is a Yamatji Naaguja Wajarri man from the Midwest Region of Western Australia. Doyen is a community-minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing... Read More →
avatar for Alison Rogers

Alison Rogers

PhD Candidate, The University of Melbourne
Alison Rogers is a PhD candidate with the Centre for Program Evaluation at The University of Melbourne. She is also the Strategic and Innovation Adviser with the Indigenous Australian Program of The Fred Hollows Foundation based in Darwin, Northern Territory.


Wednesday September 19, 2018 4:30pm - 5:00pm AEST
Chancellor 4

4:30pm AEST

Values and Synthesis: Evaluation's Power Core
Amy Gullickson (University of Melbourne), Kelly Hannum (Aligned Impact, LLC)

Values, criteria, standards, and synthesis together form the lens that defines the worth of the object being evaluated and the quality of its performance. To answer the question of how good a particular something is, we must combine values with information about how the evaluand is performing. Values determine what good looks like, but to be useful, they must be translated into explicit criteria, indicators and performance standards. Those choices of criteria and standards then influence what information is needed to make an evaluative judgement. Once the data is collected, the operationalised values are combined with that information using a synthesis method to arrive at evaluative judgements about performance.
The values that drive the evaluation, and the integrity of the synthesis method, are key to promoting fairness, equity, accessibility and sustainability - they are the core power in the task of evaluation. Yet, despite their importance, they have been largely missing from evaluation research, training, and practice.
In this session we review these primary elements (values, criteria, standards, and synthesis) and present steps for applying them in practice to enhance the equity and integrity of evaluations. The session will conclude with a facilitated discussion on research needs, further ideas for application, and potential ways to stay connected on this topic.

Chairs
avatar for Catherine Hastings

Catherine Hastings

PhD Candidate, Macquarie University
I am in my final year of a PhD developing explanations for Australian family homelessness. Prior to this, I worked as an applied social research and evaluation consultant. My interests are realist and complex evaluations in areas related to social equality and social justice.

Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →
avatar for Kelly Hannum

Kelly Hannum

Aligned Impact LLC
I am the President of Aligned Impact LLC and a consultant with the Luminare Group. I am particularly interested in mixed methods and equitable evaluation.


Wednesday September 19, 2018 4:30pm - 5:00pm AEST
Conference centre
 
Thursday, September 20
 

8:00am AEST

Plenary two: Penny Hagen "Scaling up, out and deep: What we are learning about social innovation for transformation"
Penny Hagen (Design Strategist and Participatory Design Coach, Auckland)

This talk shares challenges and questions emerging from ongoing social innovation efforts in Aotearoa New Zealand. Outcomes of such initiatives include new relationships, attitudes, capacities, practices, structures and connections across parts of the 'system'. There is a focus on co-design, prototyping, growing capability and providing 'biodegradable support'. Efforts are place-based and grounded in culture, recognising different forms of power, resource and knowledge. As we explore the potential for systemic and structural change, we are finding that terms such as impact, scale and success need to be closely examined. An integrated evaluative practice helps us to focus more keenly on what is working and why, and holds us to account, but we are still learning what will be most meaningful in service of the transformative intent. This exploratory talk reflects on what we are trying and learning thus far, and why.

Chairs
avatar for Duncan Rintoul

Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: evaluation capacity building; design thinking and innovation; evaluative practice in education, particularly in schools; public sector... Read More →

Speakers
avatar for Penny Hagen

Penny Hagen

Co-design Lead, Auckland Co-design Lab
Penny assists organisations and teams to apply co-design and design-led approaches to the design and implementation of strategy, programs, policies and services. She specialises in projects with social outcomes and supports capability building with teams and organisations wanting... Read More →


Thursday September 20, 2018 8:00am - 9:30am AEST
Conference centre

9:30am AEST

Ethics in evaluation: navigating ethical requirements and processes to improve the quality of evaluation
Ellie McDonald (Department of Health and Human Services), Lisa Thomson (DHHS), Meredith Jones (DHHS), Eleanor Williams (DHHS), Jan Browne (DHHS)

Navigating how and when to apply for ethics approval is often a challenge for evaluators. Determining when an evaluation is aligned with quality assurance and when the proposed evaluation plan should be assessed through a formal ethics process is not always clear cut. Now, with the emergence of new ways to access data and evolving practices such as 'human-centred design', it is essential that we have the knowledge and processes in place to tackle ethical considerations effectively. The Centre for Evaluation and Research (the Centre) in the Victorian Department of Health and Human Services recently consulted with departmental staff and human research ethics secretariats from 14 government departments and NGOs across Australia to better understand today's challenges facing ethics approval. The purpose of this review was to investigate the ethical barriers program and policy areas are experiencing when conducting an evaluation, research or co-design project. In a world where emergent technologies, design methodologies and data accessibility are constantly changing, how can we support evaluators and researchers to navigate ethical boundaries in a timely and reasonable way? The findings provided insight into a range of strategies that could be used to encourage more accessible ethical processes. The Centre found that:
  • staff are seeking more tailored guidance and support regarding ethics and ethics process 
  • ethical approval processes would be more effective if secretariats reviewed and provided advice prior to submission
  • an alternative low-risk process would encourage more staff to comply with ethics requirements rather than seeking ways to go around them
  • more diverse membership of Human Research Ethics Committees both in terms of cultural background and subject matter expertise would improve the ability of committees to process applications appropriately 

The Centre will discuss the findings of this review. More broadly, this presentation will discuss the ways that organisations and people can support ethical research and evaluation, from largescale data linkage exercises through to the elements of smaller scale qualitative participatory or human-centred methodologies.  

Chairs
avatar for Clara Walker

Clara Walker

Evaluation Coordinator, Cancer Council Victoria
I am an evaluator with experience working in government, health service delivery and preventative health. In my current role, I am working to integrate monitoring and evaluation into a statewide, complex health promotion program. I am interested in the design and conduct of evaluation... Read More →

Speakers
avatar for Ellie McDonald

Ellie McDonald

Evaluation and Research Policy Officer, DHHS
Ellie is an emerging evaluator, with a background in public policy and international relations. Currently working in the Centre for Evaluation and Research at the Victorian Department of Health and Human Services, Ellie undertakes internal program evaluations and provides advice and... Read More →
avatar for Eleanor Williams

Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data... Read More →


Thursday September 20, 2018 9:30am - 10:00am AEST
Chancellor 6

9:30am AEST

Realist axiology: A realist perspective on 'valuing' in evaluation
Gill Westhorp (Charles Darwin University)

Evaluation intends to contribute to learning and to inform decision-making by providing information about the value, worth or merit of interventions, initiatives or innovations. Over the past 20 years, realist evaluation has transformed parts of the evaluation sector by introducing new ways to think about what programs are and how they work. The approach is grounded in realist ontology (the philosophy of 'what exists') and realist epistemology (the philosophy of knowledge). However, there has been little significant work on realist axiology - the philosophy of value and valuing - in evaluation. This presentation will open the axiological black box, enquiring into the ways that a realist understanding of value and valuing may inform evaluation. It will present and briefly discuss seven questions, each with implications for evaluation practice:
  • What is the relationship between the ideas of "values" and "value"?
  • Can there be a realist axiology that derives from, or is at least consistent with, realist ontology? Are there implications of realist ontology for 'values' with particular importance for evaluation, such as 'responsibility'?
  • In realist analysis, are values (in both senses) contexts, mechanisms, or outcomes?
  • What are the relationships between programmes' inherent value and values and those of intended beneficiaries? Can 'the realist question' be adapted to evaluate value positions and differences?
  • How might we take a realist approach to value and values themselves, recognising that what we value, and to what extent, is different in different contexts?
  • How might we take a realist approach to the ethical frameworks which guide our work - research ethics and evaluators' codes of ethics?
This presentation is intended to stimulate discussion about an under-developed area of realist evaluation practice. By doing so, it has the potential to transform evaluation practice in ways which may in turn contribute to the transformation of policies and programs. 

Chairs
avatar for Catherine Manley

Catherine Manley

Principal, Miles Morgan Australia
Catherine found her home in evaluation while completing her master's degree and learning from Sandra Mathison at the University of British Columbia in Vancouver. She now works in areas of evaluation and strategic research across Australia within areas of social policy relating to... Read More →

Speakers
avatar for Gill Westhorp

Gill Westhorp

Professorial Research Fellow, Charles Darwin University
Gill leads the Realist Research Evaluation and Learning Initiative (RREALI) at Charles Darwin University. RREALI develops new methods and tools within a realist framework, supports development of competency in realist approaches and provides realist evaluation and research services... Read More →


Thursday September 20, 2018 9:30am - 10:30am AEST
Chancellor 4

9:30am AEST

Strengthening the professionalisation of evaluation in Australia, workshop 1
AES Learning and Professional Practice Committee

In 2017, the AES commissioned BetterEvaluation and ANZSOG to explore options for strengthening the capacity and professionalisation of the evaluation sector. The report explores options to:

  • increase motivation
  • increase capacity
  • increase opportunities.

The Learning and Professional Practice Committee (LPP) of the AES Board is interested in your views about priorities for skill development, learning pathways, embedding professional competencies and opportunities to increase demand for and strengthen the operating environment for evaluation.

We are holding two workshop-style sessions, and participants are invited to attend either one or both.
Workshop 1 will identify and discuss issues of most interest and concern to members. Workshop 2 will build on the first, and help shape the direction for the AES in strengthening the professionalisation of evaluation in Australia.

The outcomes of the workshop sessions will be shared at the conference closing plenary.


Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →
avatar for Sue Leahy

Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program... Read More →
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.


Thursday September 20, 2018 9:30am - 10:30am AEST
Chancellor 3

9:30am AEST

Evaluation literacy: Exploring the skills needed to motivate and enable others to access, understand and use evaluation information in non-government organisations
Alison Rogers (The Fred Hollows Foundation), Leanne Kelly (Windermere), Alicia McCoy (BeyondBlue)

The motivations and abilities of individuals to gain access to, understand and use evaluative information are highly varied. Evaluation literacy can make evaluation more appropriate, understandable and accessible. This world café session intends to reveal and share ways that we engage with colleagues to enhance evaluation literacy. The session is aimed at internal evaluators in non-government organisations, employees who practise and promote evaluation, and external evaluators working with organisations. We invite participants to share their experiences and learn from others. The session will examine a key issue: how do individuals promote evaluation among their colleagues in non-government organisations?

Understanding social connections between colleagues and elucidating interpersonal dynamics are useful for considering how to transform teamwork dynamics. Drawing upon a social psychological theory called social interdependence theory, the presenters will facilitate the world café discussion around setting cooperative goals. Focused on ways of promoting evaluation, the questions will be structured around:
  • How do you set common goals that link all individuals? 
  • How are individuals held accountable for their contribution?
  • How do you ensure there are opportunities to connect? How do you provide encouragement? 
  • What is your preferred communication style? 
  • How do you incorporate opportunities for reflection?  
The world café session will use examples from the literature as a starting point. Participants rotating through the questions will have an opportunity to share their real-world experiences and hear from others. Participants will leave with an increased understanding of this topic, grounded in evidence from the literature, theory and practical examples. This useful networking opportunity will equip practitioners attempting to promote evaluation among their colleagues with practical strategies to enhance their practice.

Chairs
Speakers
avatar for Alicia McCoy

Alicia McCoy

Head of Research and Evaluation, Beyond Blue
Alicia is Head of Research and Evaluation at Beyond Blue. She has 15 years’ experience working in the health and community sectors in a range of evaluation, research and clinical roles. Alicia is a social worker by profession and her PhD at The University of Melbourne on the practice... Read More →
avatar for Alison Rogers

Alison Rogers

PhD Candidate, The University of Melbourne
Alison Rogers is a PhD candidate with the Centre for Program Evaluation at The University of Melbourne. She is also the Strategic and Innovation Adviser with the Indigenous Australian Program of The Fred Hollows Foundation based in Darwin, Northern Territory.


Thursday September 20, 2018 9:30am - 10:30am AEST
Conference centre

9:30am AEST

Transforming evaluation culture and systems within the Australian aid program: Embracing the power of evaluation to promote learning, transparency, and accountability.
David Slattery (Department of Foreign Affairs and Trade), Tracey McMartin (Department of Foreign Affairs and Trade)

Evaluation is a core means of assessing the effectiveness of Australian aid. Over the past five years, DFAT has progressively transformed its evaluation culture and systems from one of non-compliance with policy requirements and non-publication of results to what is now a structured and systematic approach to assessing and providing feedback on performance. Once a focus for strong external criticism of Australia's aid administration, evaluation is now regarded as one of its biggest strengths. This paper will identify and examine the key drivers of this transformation, including the importance of strong institutional leadership; clarity over accountabilities for delivering evaluations; flexibility to determine priorities and to design evaluations that will address these priorities; realism about the capacity of programs to commission and use evaluations; mechanisms for protecting the independence of evaluations; and a culture that values independent viewpoints and contestation and is willing to be transparent about the challenges it faces.

Chairs
avatar for Michael Shapland

Michael Shapland

Director for Interoperability and Innovation, Office of the Inspector-General Emergency Management
Mike Shapland is Director for Interoperability and Innovation in the Office of the Inspector-General Emergency Management in Queensland.  Before joining the Office, he worked with the Department of Emergency Services and Department of Community Safety from 2004 to 2013 in roles covering... Read More →

Speakers
avatar for Tracey McMartin

Tracey McMartin

Department of Foreign Affairs and Trade
program evaluation, utilisation focused evaluation, M&E systems


Thursday September 20, 2018 9:30am - 10:30am AEST
Chancellor 5

10:00am AEST

How algorithms shape our lives: evaluating the unseen
Increasingly, decisions about our lives are being made by algorithms. This is the case across government, large corporates and social networking platforms. Algorithms at Centrelink decide whether you are targeted for debt recovery, those at the bank decide whether you get a home loan, and Facebook decides what 'fake news' will appear in your feed. In some cases, such as the Centrelink robodebt and Facebook fake news scandals, the poor outcomes have been widely and publicly criticised. In many other cases, you are probably not even aware if, and how, an algorithm is making decisions about your life.

This session explores the ways in which algorithms shape our everyday lives and the role evaluation has to play in safeguarding us from these unseen decision makers.

Chairs
avatar for Clara Walker

Clara Walker

Evaluation Coordinator, Cancer Council Victoria
I am an evaluator with experience working in government, health service delivery and preventative health. In my current role, I am working to integrate monitoring and evaluation into a statewide, complex health promotion program. I am interested in the design and conduct of evaluation... Read More →

Speakers
avatar for Kristy Hornby

Kristy Hornby

Senior Manager - Victorian Government Account Lead, Grosvenor Performance Group
I love it when work feels like joy. It flows when I know I’m doing good work, when I’m discussing ideas with great people, when I’m doing work with social purpose, and I get lost in time. In my work at Grosvenor, I lead our Melbourne team to meet client expectations, build their... Read More →


Thursday September 20, 2018 10:00am - 10:30am AEST
Chancellor 6

11:00am AEST

When an evaluator benefits: the challenges of managing values and power in evaluating with a lived experience
Joanna Farmer (beyondblue)

Traditionally, we tend to think of evaluators as external 'agents of evaluation' working with a number of program stakeholder groups, including program clients, to provide findings and recommendations to program staff. Increasingly, this reality is changing as evaluation capacity increases within service delivery organisations - evaluators often come from 'within'.
Interest in participatory evaluation has grown, with a range of evaluation approaches that could be considered under the umbrella of participation, such as empowerment and democratic evaluation. However, these approaches continue to present a false binary between evaluator and program beneficiary. This fails to recognise that sometimes evaluators come from within the community that the program is designed to assist. These approaches posit that the evaluator neutrally applies the standards and criteria established by others. For example, in democratic evaluation "the evaluator acts as a broker in exchanges of information between groups who want knowledge of each other." (MacDonald, 1996)

In this paper, the presenter will draw on her experience as both an evaluator (within program delivery and as an external evaluator) and a mental health lived experience advocate. She proposes that when evaluating programs of which the evaluator is a potential beneficiary there are challenges not currently accounted for in participatory evaluation approaches, and traditional conflict of interest processes. 

However, these challenges can be managed. Here, she presents a range of considerations for the 'evaluator as beneficiary' and practical solutions to manage potential conflict of values and power. 

The power of evaluation as a democratic enabler of lived experience capacity should not be underestimated, and addressing the challenges head-on will produce more meaningful outcomes for both evaluators and communities.

Chairs
avatar for Kelly Tapley

Kelly Tapley

Evaluation & Impact Manager, SuperFriend - Industry Funds’ Mental Health Initiative
Since joining SuperFriend in 2010, Kelly has gained a wealth of subject matter expertise in workplace mental health promotion. With qualifications in psychology and psychophysiology over 15 years experience in health research project management across corporate, public and not for... Read More →

Speakers
avatar for Joanna Farmer

Joanna Farmer

Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Masters of Evaluation at Melbourne Uni.


Thursday September 20, 2018 11:00am - 11:30am AEST
Chancellor 6

11:00am AEST

Evaluative thinking and strategic learning - nice words, do they make any difference?
Zazie Tolmer (Clear Horizon), Mila Waise (Department of Health and Human Services)

The presenters are involved in delivering the Children and Youth Area Partnerships (CYAP), a Victorian government-led Collective Impact initiative delivered through place-based area partnerships in eight sites across Victoria. The Area Partnership members are intentionally diverse and together a) identify systemic and local factors that contribute to the vulnerability of children, young people and their families, b) design and test new ways of thinking and prototypes to overcome these, and c) seek to influence uptake of successful prototypes by government, business, philanthropy, community and others. Ultimately, the initiative aims to work out how government can lead collaborative place-based approaches that result in real and sustainable positive change for those experiencing vulnerability.

A key component of the approach is to embed evaluative thinking and strategic learning. We are finding that in order for the learning and local innovative practices to drive system change, a strong authorising environment and collaborative governance are needed. There needs to be a strong collective forum where learning can be further tested and innovative practice can be implemented. There needs to be a culture where partners feel 'safe to fail' and learn while continuously refining their work. There needs to be an environment where accountability is well balanced with learning and power is shared. Only this has the potential to lead to true transformation at the local and system levels and within each component/actor in the system.

The following questions will be explored in the presentation:
  1. What does evaluative thinking and strategic learning mean and look like in a government-led Collective Impact initiative? What are the tensions and 'easy fits'?
  2. What difference has it made to our work? What are the implications for our resources, the intensity of the work, the impacts and ripples?
  3. Yeah but, so what? Has any of this actually sparked the transformation we are after?
Procedure: The presentation will be delivered by three presenters, bringing perspectives and expertise from:
  • a Principal Advisor who is a place-based practitioner leading the initiative at the area level, and who, as the local backbone, drives the change process locally
  • a representative from the central government unit that provides whole-of-initiative backbone support and leads transformation within government; and
  • an evaluator who has been engaged to provide practice advice and embed a learning culture across the initiative.

The panellists will each deliver a presentation of about 10 minutes exploring their different perspectives. The presentations will be followed by 15 minutes of question time, in which answers will generate a short discussion on the themes of most interest to the audience. It is anticipated that time will permit about three to five questions, each answered by the most relevant panellists with brief commentary from the audience.

Chairs
avatar for Claire Grealy

Claire Grealy

Partner, Urbis
Motivated to influence - better experiences, better systems and better places, for everyone. Evaluation + is how I think about it - always with an eye to translating the learning into knowledge and change for the better. Passionate about working with our partners to influence change... Read More →

Speakers
avatar for Meg Beilken

Meg Beilken

Principal Advisor, Department of Education and Training
Meg Beilken is a Principal Adviser within State Government with 10 years’ experience in policy design and implementation across early childhood, schools, youth services and the out-of-home-care system. Currently, Meg is leading a cross-sectoral, collective impact initiative aiming... Read More →
avatar for Hayley Rose

Hayley Rose

Principal Advisor, Department of Education and Training
Underlying theme in career has been working with children, youth and families to increase their safety and protection. A passionate advocate for vulnerable and marginalised members of the community being able to access services and supports that will assist in meeting their optimum... Read More →
ZT

Zazie Tolmer

Principal Consultant, Clear Horizon
Anything! I'm curious and friendly! I'm currently working as an embedded evaluator at DHHS working on a Government-led Collective impact initiative to improve outcomes for vulnerable children, young people and their families.
avatar for Mila Waise

Mila Waise

Senior Policy Adviser, Department of Health and Human Services
I have a background in community development and public policy. Prior to joining the public sector, I have worked in various roles in the not-for-profit sector to support young people and families in areas of settlement, education, training and employment, mental health and family... Read More →


Thursday September 20, 2018 11:00am - 12:00pm AEST
Chancellor 4

11:00am AEST

Freaking Super Sweet Webinars: learning new tricks from young guns (aka Webinars 101: AES webinar working group reports back)
Kara Scally-Irvine (Evalstars Limited), Liz Smith (Litmus), Kahiwa Sebire (Flinders University)

The AES is transforming and wants to increase member value. We know many AES members are not located within easy reach of the regional seminars and workshops. In 2018, the Member Services Engagement (MSE) committee decided to trial the use of webinars, with a particular emphasis on enabling greater learning and connection opportunities for members unable to attend AES events. We established a webinar working group to identify potential applications of webinar technology and best practice guidelines for webinar delivery and online facilitation. In keeping with design thinking approaches, we tested our assumptions with a pilot: "A webinar on how to run webinars".

In this interactive session, the AES Webinar Working Group will share our learnings and activities so far. We will provide an overview of what a webinar is (and isn't), different delivery options within an evaluative setting (the techie bit) and our top tips and tricks for facilitating online. We will end with our reflections on the value of the tool for AES members as a vehicle for professional development, and a tool for use in evaluations. Throughout the session, we will incorporate the use of other interactive tools, e.g. PollEverywhere, that can be used to garner engagement and gather data, so attendees leave with first-hand experience of the technology options available to them.  

We hope to deliver this session as a webinar (and later as a webcast) so members not attending the conference can benefit.  

We will also seek feedback on what the membership might like to see next from the AES to support professional development. 

Chairs
Speakers
avatar for Kara Scally-Irvine

Kara Scally-Irvine

Principal Consultant & Director/Co-convenor, KSI Consulting Limited/ANZEA
Kara has over 15 years' planning, monitoring, and evaluation experience. She has expertise in both quantitative and qualitative data collection and analysis, and systems thinking. She now applies these skills to support organisations large and small, in a range of settings... Read More →
avatar for Kahiwa Sebire

Kahiwa Sebire

Manager, Learning Design/MEval Student, University of Adelaide/University of Melbourne
Enthusiastic solution-finder and life-long learner. Exploring the possibilities of authentic learning experiences and technology with sticky notes and whiteboards in tow.Studying MEval at UniMelb, interested in ECB, education, learning analytics, technology-enhanced practice, facilitation... Read More →
avatar for Liz Smith

Liz Smith

Partner, Litmus
I am a co-founder of Litmus a specialist private sector evaluation company based in Wellington New Zealand and active across Australasia. My evaluation journey started more than 20 years ago as a nurse in the John Radcliffe Hospital Oxford when assessing the effects of a new nursing... Read More →


Thursday September 20, 2018 11:00am - 12:00pm AEST
Conference centre

11:00am AEST

The Enhanced Commonwealth Performance Framework - the opportunity for the Australian evaluation community
David Morton (Department of Finance), Brad Cook (Department of Finance)

The Australian Parliament - through the Joint Committee of Public Accounts and Audit (JCPAA) - has encouraged the Department of Finance (Finance) and others to support capacity-building to further implement the enhanced Commonwealth performance framework. Evaluators have a key role. They will need to be clear about what they have to offer, and how they can help deliver better performance information to government, the Parliament and the public more broadly. They will need to be willing to adapt what evaluators do and know today, and participate in developing the flexible approaches needed in the future. The performance framework calls for approaches that deliver performance information that simultaneously supports accountability to the taxpaying public and everyday operational decisions. We encourage the Australian evaluation community to reflect on what it has to offer and how it can work with others to shape the evolution of the performance framework.

The performance framework commenced on 1 July 2015. It succeeds if it enables the Australian Parliament and public to understand the benefits of Commonwealth activity. The framework encourages entities and companies to move past over-reliance on input and output-based performance measures. There is a clear role for evaluators to contribute to this important adjustment. Opportunities lie in helping a larger cross-section of the Commonwealth public sector understand and use the evaluators' toolbox - for example, program theory and qualitative analysis - to improve the quality of published performance information available to stakeholders. The evaluation community has the opportunity to be at the centre of key expertise, and to make a critical contribution to building the capability of 'performance professionals' across the public sector.

Chairs
avatar for John Stoney

John Stoney

AES Board
I've been an AES Board Member since 2016 and currently Chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this years Conference the AAC is engaging with members to inform development of an AES Advocacy and... Read More →

Speakers
avatar for Lyn Alderman

Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has a deep disciplinary knowledge and experience in program evaluation, evaluation frameworks... Read More →
avatar for Brad Cook

Brad Cook

Assistant Secretary - Public Governance, Performance and Accountability, Department of Finance
avatar for David Morton

David Morton

Assistant Director, Department of Finance
David is an Assistant Director in the Department of Finance. From September 2014 to May 2018 he worked in various teams responsible for establishing the enhanced Commonwealth performance framework under the PGPA Act. David’s contribution included drafting guidance on developing... Read More →
avatar for David Roberts

David Roberts

Principal Consultant, RobertsBrown
David Roberts is a self-employed consultant with wide experience over 35 years in evaluations using both qualitative and quantitative methods. David has training and experience in Anthropology, Evaluation and Community Development. In 2016, David was awarded the Ros Hurworth Prize... Read More →
avatar for David Turvey

David Turvey

General Manager, Department of Industry, Innovation and Science
David Turvey is the General Manager of the Insights and Evaluation Branch within the Economic and Analytical Services Division of the Department of Industry, Innovation and Science. David oversees the Divisions research and analytical work on the drivers of firm performance and the... Read More →


Thursday September 20, 2018 11:00am - 12:00pm AEST
Chancellor 5

11:00am AEST

Developmental evaluation in Indigenous contexts: transforming power relations at the interface of different knowledge systems
Samantha Togni (RMIT University), Nan Wehipeihana (Kinnect Group), Kate McKegg (Kinnect Group), Sonya Egert (Inala Indigenous Health Service)

Innovation is required in Indigenous settings to strengthen communities and address challenging and complex social issues. Evaluation in these contexts is important to understand innovation effectiveness and takes place at the interface of different knowledge systems. Therefore, the challenge for evaluation in these contexts is to transform the power and privilege inherent in evaluation and to be centred on Indigenous voices, values and aspirations.

Developmental evaluation is designed to support innovation development in complex and dynamic contexts. Informed by complexity theory and systems thinking, developmental evaluation is relationship-based and pays attention to different perspectives, inter-relationships, context, boundaries and emergence. As the practice of developmental evaluation continues to evolve, recognition of its ability to respond to different cultures, diverse communities and Indigenous peoples' worldviews is increasing. Understanding how this is achieved is important. 

Indigenous and non-Indigenous evaluator panellists will critically reflect on our developmental evaluation practice experience in New Zealand and Australian Indigenous contexts, in relation to transforming power relations at the interface of different knowledge systems. The panellists will reflect on what genuine co-creation that recognises different worldviews looks like in practice, the dynamic role and orientation of the evaluator, and how developmental evaluation grounded in culture can address power and privilege, facilitate collaboration in innovation and support Indigenous peoples' aspirations. We will also discuss the challenges and limitations of using developmental evaluation in culturally diverse contexts. To promote rich discussion, we will invite audience participation through questions and sharing of experiences of developmental evaluation in Indigenous or culturally diverse contexts.

The history of evaluation too often has been detrimental to, and marginalised, Indigenous people and communities. Our western frames of thinking and reasoning are simply not adequate for meeting the aspirations of Indigenous communities. Developmental evaluation offers an approach to include diverse knowledges in these pursuits.

Chairs
avatar for James Smith

James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research - with much of his work sitting at the health/education nexus. Previous to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education... Read More →

Speakers
avatar for Kate McKegg

Kate McKegg

Director, The Knowledge Institute
Kate has worked in the fields of evaluation, evaluation capacity building, research, policy and public sector management since the late 1980s. She has developed specialist skills in developmental evaluation, programme evaluation, evaluation capacity building, strategy, policy, research... Read More →
avatar for Samantha Togni

Samantha Togni

Evaluation & Social Research Consultant, S2 Consulting
Samantha Togni is an evaluation and social research consultant based in Alice Springs. She has more than 20 years’ experience in Indigenous health and wellbeing research and evaluation, working with rural and remote Aboriginal organisations in northern and central Australia. Her... Read More →
avatar for Nan Wehipeihana

Nan Wehipeihana

Ms, Kinnect Group
Nan Wehipeihana has more than 20 years' experience designing, leading and managing evaluations. Nan's work is characterised by a commitment to protecting, evidencing and growing the space to be Maori in Aotearoa New Zealand and offering insights into Maori worldviews and values. Internationally... Read More →


Thursday September 20, 2018 11:00am - 12:00pm AEST
Chancellor 3

11:30am AEST

Sharing research results to shape future services
Kiri Parata (Whakauae Research for Māori Health and Development), Gill Potaka-Osborne (Whakauae Research for Māori Health and Development, NZ), Rachel Brown (Whakauae Research for Māori Health and Development, NZ)

Transforming Māori lives through excellent research
Rangatiratanga 
Hauora tangata 
Manaaki tangata 
Mātauranga 
Ngākau tapatahi me te aurere 
Transforming Māori lives!

This waiata (song) was composed by staff of Whakauae Research for Māori Health and Development (Whakauae Research Services), an iwi (tribal) owned and mandated research centre in Aotearoa New Zealand. The research centre focuses primarily on Māori public health research, evaluation, and health services and health policy research. The waiata describes ngā mātāpono (values) of the organisation for achieving Pae Ora (healthy futures) and transformation for our Māori people, and aligns with New Zealand Health Strategy documentation. This presentation describes how Whakauae has supported the development of three Māori evaluators using a pragmatic approach within a Kaupapa Māori paradigm. The presentation will include three case studies and the methods employed to engage, research and evaluate alongside whānau (families) and their communities. Whakauae Research Services is committed to dissemination and translation using a range of methods; however, significant challenges remain in this space, including research designs that don't adequately allow the time and resources to meaningfully engage with end users. Despite these challenges, three distinct dissemination methods were undertaken, using infographics, posters and booklets that echo whānau and provider voices. A key learning from the project is that researchers and health providers should consider appropriate and useful dissemination methods in the early stages of any research. Early consideration benefits interest groups by ensuring useful methods can be applied and challenges in translating research results can be managed effectively. The findings from this study show that Māori, being diverse populations, often live simultaneously in a range of cultural worlds. Therefore, research that attempts to impact on future wellbeing needs to recognise, reflect and cater for diversity both within providers and whānau.

Chairs
avatar for Kelly Tapley

Kelly Tapley

Evaluation & Impact Manager, SuperFriend - Industry Funds’ Mental Health Initiative
Since joining SuperFriend in 2010, Kelly has gained a wealth of subject matter expertise in workplace mental health promotion. With qualifications in psychology and psychophysiology over 15 years experience in health research project management across corporate, public and not for... Read More →

Speakers
avatar for Kiri Parata

Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. He uri ahau nō Kāpiti, kō Te Ātiawa, kō Ngāti Toarangatira, kō Ngāti Raukawa, kō Ngāti Ruanui, kō Ngāi Tahu ōku iwi. I work as a Māori health researcher and evaluator, this mahi brings... Read More →
avatar for Gill Potaka-Osborne

Gill Potaka-Osborne

Researcher, Whakauae Research Services
Ko Aotea te waka. Ko Ruapehu te maunga. Ko Whanganui te awa. Ko Ātihaunui-ā-Pāpārangi te iwi. Ko Ngāti Tuera, Ngāti Pamoana, Ngāti Pareraukawa ngā hapū. Ko Pungarehu, ko Parikino, ko Koriniti, ko Ngātokowaru Marae ngā marae. Ko Gill Potaka-Osborne au. In 2005, I began employment... Read More →


Thursday September 20, 2018 11:30am - 12:00pm AEST
Chancellor 6

12:00pm AEST

Ethical Dilemmas in Evaluation Practice
Anne Markiewicz (Anne Markiewicz and Associates)

This session will consider a range of ethical dilemmas faced by evaluators in their evaluation practice. The context for ethical evaluation practice will be set through a short introductory presentation that outlines the four foundation ethical principles of respect, relevance, responsibility and reciprocity. This presentation will be followed by consideration of a number of scenarios where ethical dilemmas exist in each of the four 'R' areas. The presentation of four scenarios will then be followed by opportunities for members of the audience to pose their own ethical dilemmas from their practice experiences. 

This session will be highly interactive as common evaluation challenges and dilemmas are identified and responses to ethical dilemmas are discussed and considered.

Chairs
Speakers
avatar for Anne Markiewicz

Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development... Read More →


Thursday September 20, 2018 12:00pm - 1:00pm AEST
Conference centre

12:00pm AEST

Evaluation capability building: Transforming evaluation culture or spinning wheels?
Delyth Lloyd (Victorian Department of Health and Human Services), Vanessa Hood (Rooftop Social), Megan Kerr (Victorian Department of Education and Training), Kate Nichols (Victorian Department of Economic Development, Jobs, Transport and Resources), Amanda Reeves (Victorian Department of Education and Training), Roberta Thorburn (Australian Government Department of Industry, Innovation and Science), Eleanor Williams (Victorian Department of Health and Human Services), Martin Hall (New South Wales Department of Education)

Building an organisation's evaluation culture and capability is not an exact science. Different approaches suit different contexts and must be responsive to the organisation's individual characteristics. Factors such as leadership, systems, processes, and staff attitudes and skills will inform which strategies will be most effective in transforming an organisation's evaluation culture. Who leads the evaluation capability effort, and what resources are available, will also shape the approach. Sometimes evaluation capability building is led by a central team; other times it is dispersed throughout the organisation or contracted in via external consultants. Sometimes the funding and resources are flowing, while other times there is only a trickle.

So what is the current situation in the public sector at State and Commonwealth level, a sector undergoing marked transformation and reform with increased demand for accountability, outcomes-thinking, evaluation and evidence-driven ways of working? In this context, what different approaches are being used to help strengthen organisational evaluation culture and capability building? Are evaluation capability building endeavours equipping government organisations to thrive in this time of change?

This interactive session will explore the current evaluation capability and culture building approaches being used in six large State and Commonwealth government departments:
  1. Victorian Department of Education and Training
  2. Victorian Department of Health and Human Services
  3. Victorian Department of Economic Development, Jobs, Transport and Resources
  4. Australian Department of the Environment and Energy
  5. New South Wales Department of Education
  6. Australian Government Department of Industry, Innovation and Science
The session will be invaluable for those who work in, or with, any government agency as well as those interested in evaluation capability building more broadly. Each organisation will showcase different evaluation culture and capability building approaches tailored to their context. A facilitated mini-workshop will then invite participants to reflect on the implications for their own organisations and co-create practical strategies for enhancing evaluation capability and culture building practice in different contexts.

Chairs
Claire Grealy
Partner, Urbis
Motivated to influence - better experiences, better systems and better places, for everyone. Evaluation + is how I think about it - always with an eye to translating the learning into knowledge and change for the better. Passionate about working with our partners to influence change...

Speakers
Martin Hall
Principal Project Officer, Centre for Education Statistics and Evaluation

Vanessa Hood
Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation...

Megan Kerr
Manager, Evaluation, Victorian Department of Education and Training
Megan is a public policy professional with over 15 years' experience in policy and program design, implementation, evaluation and research. Megan has worked across education, health, and community development settings in the government and non-government sectors and is currently the...

Delyth Lloyd
Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.

Kate Nichols
Senior Evaluator, Department of Economic Development, Jobs, Transport & Resources
I've been a practising evaluator since Missy Elliott released 'Work It', which (a) reveals a bit too much about my age, but (b) gives you a sense of how much I'm into this stuff. I've recently returned to an evaluation role in the Victorian public sector after working in a private sector...

Amanda Reeves
A/Manager, Performance and Evaluation Division, Department of Education and Training
Amanda is an experienced evaluation practitioner and policy analyst at the Department of Education Victoria. Amanda has led evaluation projects in a variety of roles in government, the not-for-profit sector and as an external consultant in education, youth mental health and industry...

Roberta Thorburn
Senior Evaluation Officer, Department of Industry, Innovation and Science
I am an evaluation officer in the central evaluation unit of the Australian Government Department of Industry, Innovation and Science. Prior to this, I worked in the central evaluation unit of the Department of the Environment and Energy. While I moved from one central evaluation...

Eleanor Williams
Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...


Thursday September 20, 2018 12:00pm - 1:00pm AEST
Chancellor 4

12:00pm AEST

Developing an AES Advocacy and Influence Strategy: A consultation and co-design session for AES members

Influence is one of the key components of the AES 2015-2019 Strategic Plan. The AES Advocacy and Alliances Committee is developing an Advocacy and Influence Strategy in order for the AES to project its 'voice' and to enable it to better serve its members and the profession.

The Strategy is underpinned by the key principles of:
  • Collaboration: within the AES membership and between members and clients
  • Inclusiveness: sharing information and ideas with clients and members
  • Continual professional growth: (within membership and clients)
  • Professional service: on behalf of and to our members
  • Innovation: new ways to respond to new times

In keeping with these principles, the Advocacy and Alliances Committee is offering an opportunity for AES members to be involved during the Conference in a consultation and needs analysis session that will contribute to the design of the Strategy. The session will explore what needs or issues members have regarding advocacy and influence, and their thinking about the most relevant and useful approaches.


A background paper will be made available for participants to read prior to the session. 

Chairs
John Stoney
AES Board
I've been an AES Board Member since 2016 and currently Chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this year's Conference the AAC is engaging with members to inform development of an AES Advocacy and...

Speakers
Alexandra Ellinson
Manager, Evaluation, Research & Policy, Institute for Public Policy & Governance, UTS
Alexandra delivers strategic evaluations, reviews and consultancies, working in partnership with clients. She is deeply knowledgeable about the social services sector and has particular expertise in interagency responses to complex issues, such as housing and homelessness, domestic...

Margaret MacDonald
Director, MacDonald Wells Consulting
Margaret is a leadership, public policy and evaluation consultant who works mostly on social and health policy issues. She has a passion for collaborative practice, systems thinking and linking ideas to practice. Margaret is particularly interested in the interplay between policy...



Thursday September 20, 2018 12:00pm - 1:00pm AEST
Chancellor 5

12:00pm AEST

Developmental evaluation, biostatistics, primary health care research and Indigenous voices: Culture clash or symbiotic relationship?
Deborah Askew (The University of Queensland), Samantha Togni (S2 Consulting), Philip Schluter (University of Canterbury), Sonya Egert (Inala Indigenous Health Service)

We implemented a transformative model of primary health care for Aboriginal and Torres Strait Islander peoples with complex chronic disease. This research project used developmental evaluation to develop, adapt and understand why and how our intervention had the impact it did.  Therefore, this project brought together different paradigms, different priorities and different languages. Our challenge was to unite these different perspectives to improve health outcomes for Indigenous people.

Quantitative research is characterised as value-free, structured, logical and reductionist, with the researcher distant from and independent of the research. In contrast, developmental evaluation requires flexibility, innovation and tolerance for ambiguity, and the evaluator is inseparable from the process of refining and adapting the intervention. Improving the health of Australia's Indigenous people requires honouring the Aboriginal definition of health. Bringing these worldviews together required identifying shared values and beliefs.

Indigenous and non-Indigenous researcher and evaluator panellists will critically reflect on the challenges, opportunities and successes we experienced implementing, refining, adapting and evaluating our model of care and bringing together these different knowledge systems. The panellists will reflect on how their personal ideologies and values created a space where the importance of each different worldview was recognised and given its rightful place in the project; how tensions at the interface were recognised and celebrated as opportunities to learn; and how developmental evaluation facilitated the successful conduct of the research project and improved Indigenous peoples' health. To promote audience participation, we will facilitate paired discussions and feedback where participants can share their own stories of successes, failures, and learnings in similar situations.

The history of research and evaluation has too often privileged outcomes that are frequently of little benefit to Indigenous people and communities. Developmental evaluation offers an approach to facilitate symbiotic relationships rather than tragic culture clashes.

Chairs
James Smith
Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research - with much of his work sitting at the health/education nexus. Previous to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education...

Speakers
Deborah Askew
Associate Professor in General Practice Research, The University of Queensland
I am a primary health care researcher, focusing on research and evaluation in Aboriginal and Torres Strait Islander health and wellbeing. My work is focused on addressing the social determinants of health to improve health outcomes.

Samantha Togni
Evaluation & Social Research Consultant, S2 Consulting
Samantha Togni is an evaluation and social research consultant based in Alice Springs. She has more than 20 years’ experience in Indigenous health and wellbeing research and evaluation, working with rural and remote Aboriginal organisations in northern and central Australia. Her...


Thursday September 20, 2018 12:00pm - 1:00pm AEST
Chancellor 3

12:00pm AEST

Working with values in evaluation
Keryn Hassall (The Australia and New Zealand School of Government)

Values underpin evaluative decision-making, and evaluation experts have advocated for clarity of values in evaluation. But there are conceptual and practical challenges to working with values. This session draws on the findings of values research in social psychology and combines this with techniques for values inquiry developed by evaluation thought leaders.
The learning objectives of this skill-building session are to (a) give participants some understanding of the research into values, to enable them to comfortably talk about values, and (b) provide ways to include values explicitly through all stages of evaluation, and particularly for evaluative synthesis.

Values are trans-situational goals that motivate people's action and serve as guiding principles in their lives. In a society or organisation, values are the broadly shared abstract ideas about what is good and desirable. These social values serve to justify actions taken in pursuit of these goals, and are implicitly and explicitly embedded in policies and programs. Values underpin the programs we evaluate, and how we evaluate them. Working with values allows evaluators to make clearer decisions about evaluative criteria and evaluation methods, to interpret the distribution of outcomes, and to make evaluative judgements.

Participants will learn about research into values, with a framework for understanding values that can be used to facilitate discussions about values in evaluation. The session will guide participants through using this framework to interpret and map values as they appear in a social context - in policies, programs, documents and organisations. It will show the importance of understanding and being explicit about values in all stages of program development - through the process of needs analysis and developing a program theory.

Participants will learn about techniques for eliciting and clarifying values, and discuss ways to incorporate values in each stage of an evaluation, and how this clarity about values can enable more effective evaluative synthesis.

Chairs
Kelly Tapley
Evaluation & Impact Manager, SuperFriend - Industry Funds’ Mental Health Initiative
Since joining SuperFriend in 2010, Kelly has gained a wealth of subject matter expertise in workplace mental health promotion. With qualifications in psychology and psychophysiology and over 15 years' experience in health research project management across corporate, public and not for...

Speakers
Keryn Hassall
Aptavit
I'm interested in how people think - about evaluation, about policies and programs, about their work, and how people think about other people. I have two primary areas of professional focus: (1) Organisational capability and practice change - using organisation theory and practical...


Thursday September 20, 2018 12:00pm - 1:00pm AEST
Chancellor 6

2:00pm AEST

Principles before rules: Child-centred, family-focused and practitioner-led evaluation in child protection
Stefan Kmit (Department for Child Protection)

Supporting a learning environment through evaluation requires more than just the monitoring of service indicators. The South Australian Department for Child Protection (DCP) is committed to principles-based evaluation processes with children, families and practitioners that acknowledge the improvement journey is just as significant as the final outcome. Known as 'the rudder for navigating complex dynamic systems' (Patton, 2018), principles-focused evaluations enable us to think beyond structures and processes to redirect focus to service user experience and outcomes. Our ability to form and reform service approaches based on what key stakeholders tell us underpins a continuous improvement and questioning culture.

This is all driven by a passion to identify how we can best (a) use the voice of the child; (b) shift our view of children and families from service users to service shapers; (c) orient findings towards practitioner learning; and (d) create more opportunities for closer collaboration across the board.

Recent 'evidence-informed practice' (Moore, 2016) evaluations of the DCP Young People's Council and the DCP Volunteer program have featured the voice of children and their families and carers within the system. Using these as case studies, we will examine the evaluation design and engagement strategies incorporated with children and families and share critical learnings about applying a principles-focused approach.

Chairs
Rhianon Vichta
Research and Evaluation Coordinator, Brisbane Youth Service
Rhianon Vichta became an evaluator after spending more than 20 years delivering, designing, managing and working to improve social programs both in Australia and overseas. Throughout her career journey from crisis counsellor to CEO, she continued to seek answers to the fundamental...

Speakers
Stefan Kmit
A/Manager, Research and Evaluation, Department for Child Protection


Thursday September 20, 2018 2:00pm - 2:30pm AEST
Chancellor 5

2:00pm AEST

The Lived Experience Evaluators Project: Combining design thinking and innovation to build cultural capital in the evaluation sector
Anna Strempel (Asylum Seeker Resource Centre)

This presentation will share lessons from the Lived Experience Evaluators Project (LEEP). The Asylum Seeker Resource Centre (ASRC) worked with human-centred service designers to develop this pilot, which trains people seeking asylum who have professional backgrounds to become evaluators. Participants complete a paid internship in which mentors from the evaluation sector support them to design and conduct evaluations for the ASRC. 
The anticipated outcomes are: 
  1. People seeking asylum gain skills, experience and opportunities that will help them to secure professional employment
  2. The ASRC has access to evaluators with valuable lived experience
  3. The input of evaluators from diverse backgrounds increases cultural capacity within the evaluation sector. 
The results of the pilot evaluation will help us decide whether and how to scale the model up. The project is exciting because of its potential to transform the evaluation sector while creating positive outcomes for a highly marginalised group, whose expertise is often overlooked. Further, it provides a real-world case study of how to integrate evaluation and design. 

We introduced Clear Horizon and TACSI's InDEEP (Integrated Design, Evaluation and Engagement with Purpose) framework during the later stages of the project; we can draw some conclusions about the value of using such a framework by comparing our experience in the early stages, where we were feeling our way through the collaboration, with the process that followed the adoption of InDEEP. One of our early findings has been that using the InDEEP framework helped to clarify the respective roles of designers and evaluators. Our experience suggests that diving in without a clear framework can result in design 'crowding out' evaluation, or vice versa. The InDEEP framework helped us to integrate the two disciplines and ensure they were mutually beneficial. This presentation will explore these and other lessons from the LEEP pilot.

Chairs
Christina Thornley
Lead Advisor Innovation and Collaboration, Education Council of Aotearoa New Zealand
Christina Thornley leads the Council’s Strengthening Appraisal professional learning project across English and Māori medium schooling and early childhood education settings. She has a strong interest in promoting teaching as a self-managing profession. The appraisal project focuses...

Speakers
Anna Strempel
Monitoring & Evaluation Manager, Asylum Seeker Resource Centre
After starting my career in the environmental sustainability sector I moved through several community engagement, international development, advocacy and research roles before finding my current home in the M&E field. I have lived and worked in several parts of Indonesia and have...


Thursday September 20, 2018 2:00pm - 2:30pm AEST
Chancellor 3

2:00pm AEST

'What about me?': A campfire session to co-design transformational self-care guidelines for evaluators
Emma Williams (Northern Institute, CDU), John Stoney (Northern Institute, CDU)

Evaluators often - and increasingly - work in high risk, high stress situations. These include data collection in fragile states and conflict situations, but also work in relatively 'safe' environments where evaluands' traumatic circumstances are intense enough to leave the evaluator with vicarious trauma. Data collection when evaluating institutions of power presents its own challenges. Reporting, too, can be a high risk, high stress point for evaluators: 'telling truth to power' is seldom easy, and there are situations where it can affect evaluators' career prospects and, in some settings, personal safety. Even the stress of juggling multiple projects with tight timelines that leave little room for sleep, let alone adequate space for reflection, can impact evaluator well-being. This presentation introduces guidelines drafted in response to these issues, based on primary and secondary research:
  • Evaluation planning: self-care guidelines based in part on a transformation of ethical practice questions. (These often assume that the researcher/evaluator holds power and is not at risk; reverse-engineering the questions to consider potential risks to evaluator well-being proved a fruitful source of self-care guidelines.)
  • Debriefing guidelines: for use by evaluators after particularly stressful situations, based in part on transformed disaster management tools
  • Self-assessment: This checklist enables evaluators to assess their own capacity - including capacity for evaluative judgement - in high risk, high stress situations.

The campfire session will use a co-design variant process involving pre-circulated materials to enable session participants to test and refine these draft guidelines.

Chairs
Dwi Ratih S. Esti
Flinders University
I've been interested in evaluations since joining the Directorate for Monitoring, Evaluating and Controlling of Regional Development at the Ministry of National Development Planning of the Republic of Indonesia in 2007. At the moment, I am conducting evaluation research as my doctoral...

Speakers
John Stoney
AES Board
I've been an AES Board Member since 2016 and currently Chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this year's Conference the AAC is engaging with members to inform development of an AES Advocacy and...

Emma Williams
Associate Professor, RREALI CDU Maburra
Emma is a Credentialed Evaluator and an Associate Professor at Charles Darwin University in Australia, where she researches, conducts and teaches evaluation, particularly realist evaluation. She has moved between government, academe and private consulting in evaluation over more than...



Thursday September 20, 2018 2:00pm - 3:00pm AEST
Conference centre

2:00pm AEST

Evaluative Rubrics: A tool for making explicit evaluative judgements
Nan Wehipeihana (Research Evaluation Consultancy Limited - a member of the Kinnect Group), Judy Oakden (Pragmatica Limited - a member of the Kinnect Group), Kate McKegg (The Knowledge Institute - a member of the Kinnect Group), Julian King (Julian King & Associates - a member of the Kinnect Group)

Evaluation rubrics are a powerful and influential approach to evaluation-specific methodology that can be used in collaborative/participatory or independent evaluations to build a clear, shared understanding of how quality, value, and effectiveness are defined. Evaluative rubrics make explicit the basis for evaluative judgments about effectiveness or performance, as well as importance. 

Drawing from their experience of using rubrics in many evaluation settings, the presenters in this panel session will provide an overview of rubrics, as well as more detail about different kinds of rubrics and their uses, their strengths and weaknesses, and the ability of rubrics to explore and integrate shared values, providing a clear and transparent basis for making decisions.
Objectives: In this panel presentation, participants will gain insights from panel members' practice about rubrics in the following areas:

An overview of rubrics
  • What are rubrics?
  • Where do they come from?
  • What are the components of a rubric?
  • Why are they useful / transformative for evaluation practice?

Different kinds of rubrics
  • What different types of rubrics are there?
  • What are their key features?
  • What are the design considerations for each?
  • What is the comparative value of each type for making evaluative judgments?

The strengths and weaknesses of rubrics
  • What are the strengths of rubrics?
  • Troubleshooting, faults and mishaps: overcoming the weaknesses of rubrics in practice
  • How do they transform evaluation practice?

Using rubrics to integrate shared values
  • Whose perspectives and values count when using rubrics?
  • How do you weave different values into the design and use of a rubric?
  • Why does this matter?


Chairs
Evylyn Brophy
Associate Director, Urbis
I am passionate about helping others strengthen their evaluation capabilities. I work with program deliverers across sectors – including crime prevention, education, regional development, and primary health – to help develop capacity to measure their own impact, and enhance the...

Speakers
Kate McKegg
Director, The Knowledge Institute
Kate has worked in the fields of evaluation, evaluation capacity building, research, policy and public sector management since the late 1980s. She has developed specialist skills in developmental evaluation, programme evaluation, evaluation capacity building, strategy, policy, research...

Nan Wehipeihana
Ms, Kinnect Group
Nan Wehipeihana has more than 20 years' experience designing, leading and managing evaluations. Nan's work is characterised by a commitment to protecting, evidencing and growing the space to be Māori in Aotearoa New Zealand and offering insights into Māori worldviews and values. Internationally...


Thursday September 20, 2018 2:00pm - 3:00pm AEST
Chancellor 4

2:00pm AEST

Challenging the status quo: the emerging evaluators panel
This panel will spark conversations exploring ideas that challenge the status quo in evaluation. The session will also seek to establish a community of practice for emerging evaluators.
The panel will introduce emerging evaluators from a range of professional backgrounds. With the focus on ideas that challenge the status quo, each panel member will offer their unique perspective and experience, drawing on ideas around the role of evaluators in alleviating poverty, how evaluation can drive Aboriginal sovereignty, and opportunities for inclusivity and integrating lived experience into evaluation. Facilitated by emerging evaluators, the session will include opportunities for the audience to pose questions to the panel.
Panel members
  • Skye Bullen, PCT Consulting
  • Fran Demetriou, Lirata Consulting
  • Joanna Farmer, beyondblue 
  • Sarah Leslie, Fred Hollows
  • Rini Mowson, Clear Horizon
Facilitators: Eunice Sotelo and Nathan Delbridge from Clear Horizon
Structure of the session – 50 minutes 
  • Part 1 – Panel introduction (15 min)
  • Part 2 – Panel discussion and audience questions (30 min)
  • Part 3 – Close and next steps (5 min) 


Chairs
Anthea McClintock
Senior Manager Evaluation, NSW DPC
I lead the Program Evaluation Unit of the Department of Premier and Cabinet. Our team is currently evaluating regional infrastructure programs funded by the NSW State Government. Prior to working with DPC, I worked for the Department of Primary Industries, ABARE and the Industry Commission...

Speakers
Skye Bulleen
Founder & Director, PCT Consulting
Skye is an emerging evaluator and founder of PCT Consulting. She has worked on place based evaluation nationally (namely Maranguka in Bourke and Ready Set Go in Port Stephens) and brings experience in social governance and research to her role. Skye is currently completing her PhD...

Nathan Delbridge
Senior Consultant, Clear Horizon
Nathan brings an interdisciplinary skill-set to his evaluation projects that draws on his experience as an environmental analyst, systems thinker and social researcher. Driven by a curiosity about how people connect with place, Nathan completed The University of Melbourne’s Master...

Francesca Demetriou
Evaluator, Lirata
I’m an early career evaluator with a background in social research. I have worked with NFPs, NGOs and government across a range of different sectors, including health, housing and homelessness, education, employment, and refugee settlement services. I’m especially interested in...

Joanna Farmer
Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Master of Evaluation at Melbourne Uni.

Sarah Leslie
Monitoring and Evaluation Advisor, The Fred Hollows Foundation
I support staff across the Foundation to design and implement monitoring and evaluation plans and use the information they collect to inform their work.

Rini Mowson
Consultant, Clear Horizon
Rini has been working in the international development and Australian community development sectors for more than thirteen years. She has worked with a broad range of social change initiatives and businesses globally. Rini has extensive experience in designing and implementing monitoring...

Eunice Sotelo
Research Analyst, Clear Horizon
I am an early career evaluator with a passion for education. As a former classroom teacher, I'm particularly interested in agile and collaborative ways of working, learning and assessing outcomes so we can build better initiatives and solutions.


Thursday September 20, 2018 2:00pm - 3:00pm AEST
Chancellor 6

2:30pm AEST

Youth Partnership Project: Applying place-based collective impact and evaluating for systems change
Maria Collazos (Save the Children)
‘Wicked problems’ demand a new way of thinking and working: one which moves beyond independent programs with isolated impact to a collaborative approach with a common goal. By rethinking the system and how it operates, we can discover new solutions with population level impact. Being able to measure this impact is key. This practice-focused presentation explores systems change evaluation, using the place-based collective impact initiative, the Youth Partnership Project (YPP), as a case study.
Despite significant investment in the community, there have been persistent issues of youth crime and anti-social behaviour in the south-east corridor of Perth. The YPP was formed as a strategic project to develop a cross-sector early intervention system for the region, and is a demonstration site for Western Australian reform. The project brings together a broad group of cross-sector partners to systematically identify the most vulnerable young people in the community and collaboratively address complex needs which are the responsibility of multiple agencies.
This presentation will delve into the challenge of evaluating systems change in initiatives with multiple levels of impact, from individual to systemic. We consider how impacts at these different levels affect one another and draw on the YPP’s approach of using developmental evaluation to provide a framework for continuous learning, emergent strategies and monitoring effectiveness and efficiency. Finally, we discuss cost-benefit analysis as an advocacy tool to articulate the need for prevention-focused collaboration and system reform.

Chairs
Rhianon Vichta
Research and Evaluation Coordinator, Brisbane Youth Service
Rhianon Vichta became an evaluator after spending more than 20 years delivering, designing, managing and working to improve social programs both in Australia and overseas. Throughout her career journey from crisis counsellor to CEO, she continued to seek answers to the fundamental...

Speakers
Maria Collazos
Project Design and Development Officer, Save the Children
BA (Political Science), Grad Cert (Social Impact). Maria works to strengthen the quality of program delivery through effective design of program logics and impact evaluation frameworks. Maria works at a strategic and an operational level, bringing together theory and practice. She...


Thursday September 20, 2018 2:30pm - 3:00pm AEST
Chancellor 5

2:30pm AEST

In their own words: How we (the boring adults) worked with young people (the cool kids) in Papua New Guinea to develop a bilingual post-program survey, why we did it, and why it was a good idea
Lauren Siegmann (Clear Horizon), Dr Ceridwen Spark (RMIT University), Junior Muke (Equal Playing Field)

We were evaluating a program in Papua New Guinea that worked with young people to prevent violence against women. This program had been diligently collecting pre and post survey data; when the evaluation started there was a dataset of approximately 2,000 pre and post surveys, and it was expected that we would use this data in the evaluation. The surveys were validated instruments that had been used in evaluations of similar programs, and for this reason the data was seen as high quality. On closer examination, however, it was clear to us that the data had limited value. We saw no meaningful trends in the survey responses, and we concluded that the young people completing these surveys likely did not understand the questions. The language in the survey was formal, and some students found it easier to talk about concepts like gender in Tok Pisin, a local language, rather than English. It was likely that the constructs the survey was measuring did not align with the changes the students told us they were experiencing. The pre and post questions misunderstood the way in which attitudinal change happens for young people.

We worked with young people who had been in the program to redesign the survey so that it 1) captured the types of changes that students told us happened for them as a result of the program; 2) used their own words and language to describe these changes; and 3) was bilingual, so that students could choose to complete the survey in their preferred language.
In this presentation, we discuss the participatory methods we used to develop the survey, the ways in which we validated it, and the politics surrounding its redevelopment.

Chairs
Christina Thornley
Lead Advisor Innovation and Collaboration, Education Council of Aotearoa New Zealand
Christina Thornley leads the Council’s Strengthening Appraisal professional learning project across English and Māori medium schooling and early childhood education settings. She has a strong interest in promoting teaching as a self-managing profession. The appraisal project focuses...

Speakers
Junior Muke
Program Monitoring and Evaluation Officer, Equal Playing Field Inc. Papua New Guinea
I am from Papua New Guinea. I am an emerging Indigenous evaluator and was part of recent evaluation research on a respectful relationships education program with primary school students aged 13-15 in the nation's capital of Papua New Guinea.

Lauren Siegmann
Senior Consultant, Clear Horizon
Lauren is a funny, compassionate, and dedicated evaluation specialist. Her work focuses on collaborative evaluations which involve program staff and program participants in the design and conduct of evaluation, and in the sense-making of evaluation data. She has a particular interest...


Thursday September 20, 2018 2:30pm - 3:00pm AEST
Chancellor 3

3:00pm AEST

Just add water: The ingredients of an evaluation consultant
Matt Healey (First Person Consulting)

The AES' Professional Learning Competency Framework presents a range of areas on which to focus learning and development. The Framework also acknowledges that "people bring different strengths, knowledge and skills to their work as evaluators". Given that, what are the core ingredients of a 'good' evaluation consultant beyond the competencies?
I will discuss some of these ingredients, drawing on reflections on my position as an early career evaluator (less than five years) and co-founder of a small evaluation firm. Importantly, I want attendees to walk away thinking about how we can support other early career evaluators in their practice.

Chairs
Evylyn Brophy
Associate Director, Urbis
I am passionate about helping others strengthen their evaluation capabilities. I work with program deliverers across sectors – including crime prevention, education, regional development, and primary health – to help develop capacity to measure their own impact, and enhance the...

Speakers
Matt Healey
Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation...


Thursday September 20, 2018 3:00pm - 3:05pm AEST
Chancellor 4

3:00pm AEST

Transforming evaluation: Necessary but not sufficient to make a meaningful contribution to society
Julie McGeary 

This presentation aims to challenge the notion that transforming the way evaluators practise will be instrumental in solving the main problems facing our profession. 

Drawing on the views of eminent evaluators, and the presenter's own sixteen years of evaluation experience in the public sector, it will be argued that transformative approaches to evaluation are necessary but not sufficient to overcome the constraints increasingly imposed by the authorising environment in which we operate. 

Few would disagree that evaluations should be relevant, meet market expectations and meaningfully contribute to society. These aims are not new; the struggle to achieve them is ongoing with mixed results. Audiences at the 2017 AES Conference in Canberra heard Sandra Mathison provide a gloomy assessment of evaluation's current ability to contribute to the public good. She offered three reasons for this: evaluation is constrained by the dominant socio-political ideology, it lacks independence, and it is a conserving practice, generally maintaining the status quo.

A decade earlier, Eleanor Chelimsky discussed the clashes that occur between evaluative independence and the political culture it challenges. She warned of the danger of focusing too much on the easier-to-control methodology issues, and being distracted from the much harder-to-control, but larger, problem of evaluation's political context.

Certainly, those who supply evaluations should keep abreast of emerging evaluation theories and practices and the potential advantages offered by innovative tools and technologies. Harnessing technological advances and new ways of thinking can lead to profound and radical change in our practice and credibility. But in our urgency to transform evaluation, let's not overlook the context in which we operate. The social, political and cultural forces explored in this presentation ultimately determine whose values are considered, whose expectations dominate, and how meaningfully our evaluations are able to contribute to society.

Chairs
Dwi Ratih S. Esti
Flinders University
I've been interested in evaluations since joining the Directorate for Monitoring, Evaluating and Controlling of Regional Development at the Ministry of National Development Planning of the Republic of Indonesia in 2007. At the moment, I am conducting evaluation research as my doctoral...

Speakers
Julie McGeary
DEDJTR
Julie McGeary is an Evaluation Specialist with the Victorian Department of Economic Development, Jobs, Transport and Resources. She has been conducting and contracting evaluations within the government primary industries sector for over 16 years. She holds a Master of Assessment and...


Thursday September 20, 2018 3:00pm - 3:30pm AEST
Conference centre

3:00pm AEST

From outputs to outcomes: A system transformation approach for the Victorian child and family service sector
Emily Mellon (Centre for Excellence in Child and Family Welfare)

The Victorian child and family service sector is undergoing a profound transformation from a service system to a learning system where experimentation, rapid knowledge sharing and continuous improvement will be the norm. The sector is transforming from a disparate network of services to an integrated learning system that better serves Victoria's children and families. 

The learning system assumes a culture of inquiry, experimentation and learning which requires certain knowledge, skills and motivation akin to an evaluation capacity building (ECB) effort. This paper explores how the Victorian child and family services' Outcomes Practice and Evidence Network (OPEN) is supporting community sector organisations to move from outputs to outcomes to better serve vulnerable children and families. 

We will share the Outcomes Practice and Evidence Network approach, which has drawn on Learning Organisation, Knowledge Translation and ECB literature to develop a framework for systemic capacity building. Significantly, there is an appreciation that both bottom-up and top-down efforts are required for system transformation; the challenges of cohesively framing and delivering these efforts will be discussed. In addition, some of the particular strategies used to bridge the gap between research and practice, demystify and improve evaluation practice, and support the sector to create, share and use better quality evidence will be presented. Importantly, we will focus on a specific case example from the child and family service sector to demonstrate the impact of our approach and the experience of system transformation efforts at the local level.

Chairs
Rhianon Vichta
Research and Evaluation Coordinator, Brisbane Youth Service
Rhianon Vichta became an evaluator after spending more than 20 years delivering, designing, managing and working to improve social programs both in Australia and overseas. Throughout her career journey from crisis counsellor to CEO, she continued to seek answers to the fundamental...

Speakers
Emily Mellon
OPEN Project Manager, Centre for Excellence in Child and Family Welfare
Emily is passionate about supporting sustainable social change through harnessing collective wisdom, nurturing inquiry mindedness and encouraging the utilisation of evidence. Emily is an advocate for creating space to inspire and hear the perspectives and insights of young people...


Thursday September 20, 2018 3:00pm - 3:30pm AEST
Chancellor 5

3:00pm AEST

The promise and practice of partner-led evaluation: a policy research programme case study
Stuart Raetz (Australian Red Cross), Jessica Dart (Clear Horizon Consulting), Tiina Pasanen (Overseas Development Institute), Julien Colomer (International Union for the Conservation of Nature)

This presentation will reflect on the partner-led approach taken in an evaluation of a global policy research programme. The International Forestry Knowledge programme (KNOWFOR) was a £38 million UK Aid funded partnership between the Center for International Forestry Research (CIFOR), the International Union for Conservation of Nature (IUCN) and the World Bank Program on Forests (PROFOR) that ran from 2012 to 2017.

The partner-led approach involves shared ownership of, leadership of and responsibility for the evaluation among multiple actors. In KNOWFOR the evaluation partners took a lead role in design and planning, data collection, analysis, interpretation and reporting. Partners were supported by an external evaluation facilitator (Clear Horizon Consulting), who played a coordination role, while an external quality assurer (the Overseas Development Institute [ODI]) provided independence and credibility. The partners chose a partner-led evaluation to build on their ownership of a shared M&E system, harness organisational knowledge and enhance their ability to learn.

Based on the shared KNOWFOR experience of those involved in the evaluation, as well as independent observers, this presentation will argue that partner-led evaluation has the potential to create meaningful dialogue and learning within and between donors and implementing partners. However, the potential advantages of partner-led evaluation need to be seen in the light of several challenges highlighted by the KNOWFOR evaluation. These included coordinating between partners and timeframes, ensuring independence from bias, balancing partner versus programme learning, supporting differing levels of partner capacity and ensuring shared ownership of the evaluation findings.

Overall, the KNOWFOR case highlights the potential of partner-led evaluation to provide an opportunity for inter-organisational learning. In the right institutional environment this approach also presents an opportunity to decentre the traditional donor/recipient relationship. The KNOWFOR case provides rich insight into these dynamics and challenges.

Chairs
Anthea McClintock
Senior Manager Evaluation, NSW DPC
I lead the Program Evaluation Unit of the Department of Premier and Cabinet. Our team is currently evaluating regional infrastructure programs funded by the NSW State Government. Prior to working with DPC, I worked for the Department of Primary Industries, ABARE and the Industry Commission...

Speakers
Jess Dart
Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australasian Evaluation Society (AES), Jess is a recognised leader with over 25 years of...

Stuart Raetz
Stuart has over 10 years' experience working as an M&E specialist in the Asia-Pacific region. He has consulted across a range of sectors including agriculture, natural resources, climate change, community development and emergency management and has experience working for and with...


Thursday September 20, 2018 3:00pm - 3:30pm AEST
Chancellor 6

3:05pm AEST

Measuring a healthy workplace environment in 10 questions: Developing a rapid environmental audit tool for Victorian workplaces
Clara Walker (Cancer Council Victoria), Amy Timoshanko (Cancer Council Victoria)

The Victorian Government’s Achievement Program supports workplaces to create healthy environments which can contribute to improved health and wellbeing of employees and the broader community. This paper outlines the scoping, development and testing of a rapid audit tool to capture change in workplace environments, practices, policies and culture relating to health and wellbeing. The tool was developed from a synthesis of existing tools and a rapid review of best practice. Testing was conducted with a sample of workplaces, and feedback was sought from experts and health promotion professionals. The presentation shares lessons learned from the scoping, development and testing process.

Chairs
Christina Thornley
Lead Advisor Innovation and Collaboration, Education Council of Aotearoa New Zealand
Christina Thornley leads the Council’s Strengthening Appraisal professional learning project across English and Māori medium schooling and early childhood education settings. She has a strong interest in promoting teaching as a self-managing profession. The appraisal project focuses...

Speakers
Clara Walker
Evaluation Coordinator, Cancer Council Victoria
I am an evaluator with experience working in government, health service delivery and preventative health. In my current role, I am working to integrate monitoring and evaluation into a statewide, complex health promotion program. I am interested in the design and conduct of evaluation...


Thursday September 20, 2018 3:05pm - 3:10pm AEST
Chancellor 3

3:05pm AEST

What happens when the public is not a monolithic audience?
Judith Lovell (Charles Darwin University), Kathleen Wallace (Charles Darwin University), Al Strangeways (Charles Darwin University)

This session illuminates a public engagement initiative called 'Monumental: in a small town way' that transformed how some of its audience engaged with a donated public artwork (2010) in Alice Springs. Installing a 'founding father'-type statue (white man, gun in hand) without the 'due process' of the Public Arts Committee could be described as flaunting our region's love of informality. Our subsequent engagement (2017) of a public audience initiated 20 arts-based responses to the man, story, and statue. Their images tell of a public engagement initiative linking realist philosophy and arts processes, and a subsequent transformation of our understanding of evaluating public engagement initiatives.

Chairs
Evylyn Brophy
Associate Director, Urbis
I am passionate about helping others strengthen their evaluation capabilities. I work with program deliverers across sectors – including crime prevention, education, regional development, and primary health – to help develop capacity to measure their own impact, and enhance the...

Speakers
Judith Lovell
Senior Research Fellow, Charles Darwin University
Dr Judith Lovell is a Senior Research Fellow with the Northern Institute at Charles Darwin University. Part of the Northern Institute at CDU in Alice Springs, Judy’s current research interests include healthy and strong inland societies in central Australia. How can public art...


Thursday September 20, 2018 3:05pm - 3:10pm AEST
Chancellor 4

3:10pm AEST

'Bring a friend to work day': The value of dragging non evaluator colleagues along to the AES Conference
Liam Downing (Charles Sturt University)

Since 2015, I've brought non-evaluator colleagues to AES Conferences. In a striking correlation, I've also seen the number of people at my organisation using evaluative thinking in their day to day work grow over the same period. While correlation is not causation, I'd like to use this presentation to share qualitative evidence gathered from said colleagues about how they contribute to evaluative thinking and practice within my own organisation. In a not-unrelated argument, I'll also show you that these non-evaluators think evaluators are pretty fun, and in some cases even start to consider themselves evaluators and act as such.

Chairs
Evylyn Brophy
Associate Director, Urbis
I am passionate about helping others strengthen their evaluation capabilities. I work with program deliverers across sectors – including crime prevention, education, regional development, and primary health – to help develop capacity to measure their own impact, and enhance the...

Speakers
Liam Downing
Equity Programs Evaluation Coordinator, Charles Sturt University
Liam has 10 years of research and evaluation experience, and is currently evaluating equity programs at Charles Sturt University (CSU) with a specific focus on programs run through the Higher Education Participation and Partnerships Program (HEPPP). In addition to ascertaining the...


Thursday September 20, 2018 3:10pm - 3:15pm AEST
Chancellor 4

3:10pm AEST

Evaluation and Transformation: It's the Politics, Stupid
Chris Roche (La Trobe University)
This presentation will argue that evaluation is an inherently political process, and that this reality cannot be ignored or wished away, particularly if evaluation seeks to contribute to more transformative change.
I will explore why ignoring politics is naive and dangerous, and suggest some practical ways that evaluation can embrace politics more effectively.
A number of synthesis reviews in different sectors underline the importance of politics, and the political and institutional context, in contributing to the likelihood of research and evaluation uptake. This includes, for example, health policy (Liverani et al., 2013), nutrition policy (Cullerton et al., 2016), transport policy (Sager, 2007), and low carbon technology policy (Auld et al., 2014).
There are also some substantive explorations of this issue in relation to evidence (Parkhurst, 2017), results and evidence in international development (Eyben et al., 2015), and evaluation (Taylor and Balloch, 2005).
Amongst other things these studies note:
  • That despite the recognition that politics is important, it is often underexplored in evaluation design and outreach;
  • That there are tried and tested approaches to exploring these issues from political science, organisational studies and related fields which could be better drawn on;
  • That there is a tendency to see politics as a problem to be got around or bypassed, rather than an inevitable and important part of policy processes and decision making;
  • That there is a tendency to simply blame a lack of ‘political will’ for the failure to follow through on evaluation findings, without any attempt to unpack why that is the case, what the interests are in maintaining the status quo, or what underpinning values, norms or ideas might be at play.
If we accept that this is the case, then I argue that much of the work that has been done in the international development sector on 'thinking and working politically' (https://twpcommunity.org/) and on 'knowledge, power and politics' (Jones et al., 2013) could be embraced in a more politically savvy approach to evaluation, one which aims to speak truth to power.

Chairs
Christina Thornley
Lead Advisor Innovation and Collaboration, Education Council of Aotearoa New Zealand
Christina Thornley leads the Council’s Strengthening Appraisal professional learning project across English and Māori medium schooling and early childhood education settings. She has a strong interest in promoting teaching as a self-managing profession. The appraisal project focuses...

Speakers
Chris Roche
Associate Professor, La Trobe University
I am the Director of the Institute for Human Security and Social Change at La Trobe University, a Senior Research Partner of the Developmental Leadership Program (www.dlprog.org) and a member of the intellectual leadership team of the Centre for Development Impact and Learning (CEDIL...


Thursday September 20, 2018 3:10pm - 3:15pm AEST
Chancellor 3

3:15pm AEST

Sizing up social campaigns - Evaluation in a market research world
Gerard Atkinson (ARTD Consultants)

Evaluation by its nature is a hybrid discipline, and often we find our work overlapping with other disciplines. In the consulting world, this translates to competing against other sectors for the same work. This presentation looks at one such example - evaluation of social impact marketing campaigns. When the goals of a campaign are so much more than moving product, what value do we add as evaluators in assessing whether a campaign worked? And what skills do evaluators need in order to go toe-to-toe with the market research field? 

Chairs
Evylyn Brophy
Associate Director, Urbis
I am passionate about helping others strengthen their evaluation capabilities. I work with program deliverers across sectors – including crime prevention, education, regional development, and primary health – to help develop capacity to measure their own impact, and enhance the...

Speakers
Gerard Atkinson
Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: program and policy evaluation; workshop and community facilitation; business analytics and data visualisation; market and social research; financial and operational modelling; non-profit, government and business strategy...


Thursday September 20, 2018 3:15pm - 3:20pm AEST
Chancellor 4

3:15pm AEST

Transforming the experience of seriously ill children, young people and their families - A real life example of evaluation in action
Sarah Moeller (Starlight Children's Foundation), Claire Treadgold (Starlight Children's Foundation)

Starlight Express Rooms (SERs) provide a medical-free zone where children can escape from the hospital environment. Every three years, Starlight undertakes an evaluation of all nine SERs in Australia. In this Ignite session, we will share the story - and the learnings - of this review, highlighting how effective evaluation contributes to transforming the hospital experience of seriously ill children. We will share insights on the importance of stakeholder engagement, the challenges of capturing the voices of children and young people in a meaningful way (and yes, there will be a burping frog involved), and the effective dissemination of results.

Chairs
avatar for Christina Thornley

Christina Thornley

Lead Advisor Innovation and Collaboration, Education Council of Aotearoa New Zealand
Christina Thornley leads the Council’s Strengthening Appraisal professional learning project across English and Māori medium schooling and early childhood education settings. She has a strong interest in promoting teaching as a self-managing profession. The appraisal project focuses... Read More →

Speakers
avatar for Sarah Moeller

Sarah Moeller

Research & Evaluation Manager, Starlight Children's Foundation
Sarah Moeller is the Manager, Research and Evaluation at the Starlight Children’s Foundation. Qualifications: MA (Social Work) University of Melbourne; BSc University of Melbourne.


Thursday September 20, 2018 3:15pm - 3:20pm AEST
Chancellor 3

3:20pm AEST

If what you are doing scares you, you're probably on the right track: 5 things I've learned about how to co-design an evaluation
Jenne Roberts (Menzies School of Health Research)

Co-design is transforming the way evaluators work, yet there is not a lot of guidance on how to facilitate a co-design process. This presentation will highlight five things I have learned during two recent evaluation co-design processes: one for an internationally funded HIV program in Indonesia and the other for a Collective Impact project with four Aboriginal community-controlled organisations in the Northern Territory. I will cover: Start with the end; Walk alongside; Work in both worlds; Done is better than perfect; Radical beats routine. Rest assured: if what you are doing scares you, you are probably on the right track.

Chairs
avatar for Evylyn Brophy

Evylyn Brophy

Associate Director, Urbis
I am passionate about helping others strengthen their evaluation capabilities. I work with program deliverers across sectors – including crime prevention, education, regional development, and primary health – to help develop capacity to measure their own impact, and enhance the... Read More →

Speakers
avatar for Jenne Roberts

Jenne Roberts

Evaluation Manager, Menzies School of Health Research
Jenne is an international health evaluator, working in Indigenous health in Australia and international public health, mostly in South East Asia. Jenne is interested in identifying the efforts and interventions that spark positive social and health impact, and engaging intended beneficiaries... Read More →


Thursday September 20, 2018 3:20pm - 3:25pm AEST
Chancellor 4

4:00pm AEST

Plenary three: Karol Olejniczak "Transforming evaluation practice with serious games"
Karol Olejniczak (Assistant Professor, University of Warsaw, Centre for European Regional and Local Studies (EUROREG UW), Warsaw, Poland)

During the presentation we will explore the innovative and dynamically developing practice of serious games to find inspiration for addressing some of the key challenges of our evaluation practice.

We will start with a two-dimensional typology of games for evaluation, distinguishing between the level of complexity of a policy issue, and the intended primary purpose of the inquiry. Then I will present four types of games for evaluation, illustrating them with exemplars of real-life application. These are: games for testing retention of skills and knowledge, games for teaching knowables, games for crash-testing mechanisms, and games for exploring system dynamics.

In conclusion, everyone will assess, using a real-time survey, the potential utility of the presented game types for advancing evaluation practice. And, of course, as with all games, there will be the opportunity to win prizes.


Chairs
avatar for Duncan Rintoul

Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: * evaluation capacity building * design thinking and innovation * evaluative practice in education, particularly in schools * public sector... Read More →

Speakers
avatar for Karol Olejniczak

Karol Olejniczak

Assistant Professor, University of Warsaw, Centre for European Regional and Local Studies
Karol Olejniczak is an Assistant Professor of public policy at EUROREG - University of Warsaw, Poland, and a visiting scholar at The George Washington University, Washington D.C. He is also a co-founder of policy research company Evaluation for Government Organization (EGO s.c.).His... Read More →


Thursday September 20, 2018 4:00pm - 5:30pm AEST
Conference centre

5:30pm AEST

AES Annual General Meeting & 'What could the Romans actually do for us?' - an interactive Forum on the concept of an Evaluator-General and what it could mean for AES members
Join the AES Board as we celebrate another year’s achievements by members of the AES.

Followed by:
'What could the Romans actually do for us?' - an interactive Forum on the concept of an Evaluator-General and what it could mean for AES members

Initially proposed by Nicholas Gruen, and more recently supported by the AES and stakeholders such as the Department of Industry, Innovation and Science, the concept of an independent Evaluator-General reporting to the Australian Parliament is showing signs of gaining traction. While many support the concept in principle, what might its actual design and implementation look like, and what could this mean for AES members both internal and external to Government? Led by a panel, this interactive session will explore with the AES membership the possible implications for doing evaluation, for evaluation capacity and capability building, and for creating an independent arm, plus what it could mean for the status of evaluation, the sector and the profession.

Speakers
avatar for Lyn Alderman

Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has a deep disciplinary knowledge and experience in program evaluation, evaluation frameworks... Read More →


Thursday September 20, 2018 5:30pm - 7:00pm AEST
Conference centre
 
Friday, September 21
 

8:00am AEST

Plenary four: Sharon Gollan & Kathleen Stacey "Cultural accountability in evaluating Aboriginal initiatives and programs"
Sharon Gollan (Leader and facilitator of Cultural Respect and Safety training, South Australia) and Kathleen Stacey (Managing Director and Principal Consultant, beyond..., South Australia)
In our paper on Wednesday, we emphasised that a clear understanding of, and commitment to contributing to, cultural safety is a vital lens through which any evaluator should approach their role in evaluating programs designed for, or inclusive of, Aboriginal and/or Torres Strait Islander Australians. This requires evaluators to think deeply and critically about power, inclusion and the relationship between them.

We will introduce participants to cultural accountability and invite them to reflect on an evaluation of an Aboriginal initiative or program with which they are familiar because they conducted it, were involved in it, or read about it. By applying the lens of cultural accountability to this reflection, we hope to generate different learnings and identify new ideas about how to undertake similar evaluations in the future.

Chairs
avatar for Duncan Rintoul

Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: * evaluation capacity building * design thinking and innovation * evaluative practice in education, particularly in schools * public sector... Read More →

Speakers
avatar for Sharon Gollan

Sharon Gollan

Sharon Gollan is a descendant of the Ngarrindjeri nation of South Australia, with family and cultural connections to many communities within and beyond South Australia. Sharon has worked professionally and academically in a range of human services fields in Australia. She has over... Read More →
avatar for Kathleen Stacey

Kathleen Stacey

Managing Director, beyond... (Kathleen Stacey and Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and... Read More →


Friday September 21, 2018 8:00am - 9:00am AEST
Conference centre

9:00am AEST

Evaluation reports: Writing, editing and wrangling Word
Ruth Pitt (Department of Social Services)

Despite the increasing popularity of visual presentation methods, writing is still a core skill for evaluators. Evaluators need to write for diverse audiences and produce attractive, error-free reports while facing tight deadlines and budgets. I have previously worked as an editor, an evaluator and a consultant supporting organisations to improve their evaluation documents. In my current role, I receive and review numerous evaluation reports. These experiences have given me insight into the common problems with evaluation reports, why they occur and how to fix them. In this skill building session, I will share tips and tricks for improving your writing when facing a deadline, whether the final version is due in one hour, one day or one week.

One hour: the clock is ticking and you've only just finished writing. I'll demonstrate affordable editing software that can quickly reduce errors and improve consistency. I'll also provide a handout outlining the features and costs of other options.

One day: the final report is due tomorrow and your draft is... okay. I'll provide a checklist of common problems that can be addressed in one day, and demonstrate Word features that will help you find and fix them.

One week: you planned carefully and left plenty of time for revising and editing. I'll share practical steps for improving the structure, readability and visual appeal of your reports.

The skill building session is suitable for evaluators of any level of experience who would like refresher training in writing and editing, particularly on how technology can support (rather than thwart) efforts to deliver a quality report on time.

Chairs
avatar for Stuart Raetz

Stuart Raetz

Stuart has over 10 years' experience working as an M&E specialist in the Asia-Pacific region. He has consulted across a range of sectors including agriculture, natural resources, climate change, community development and emergency management and has experience working for and with... Read More →

Speakers
avatar for Ruth Pitt

Ruth Pitt

Assistant Director, Evaluation Unit, Australian Government Department of Social Services
Ruth Pitt is Assistant Director of the Evaluation Unit at the Australian Government Department of Social Services. Her evaluation experience includes diverse roles in consulting, government and not-for-profit organisations, in Australia and overseas. Her qualifications include a Master... Read More →


Friday September 21, 2018 9:00am - 10:00am AEST
Chancellor 6

9:00am AEST

Traps for young players: a panel session by new evaluators for new evaluators
Dan Borg, Jennifer Thompson (VicRoads), Victoria Cook (Department of Economic Development, Jobs, Transport and Resources), Ellie McDonald (Department of Health and Human Services)

What are the common traps for young players newly transitioned to evaluation? New to evaluation and asking this question? Then this is the session for you. Come and hear about the lessons learnt from a panel of practitioners who have recently transitioned to evaluation through diverse pathways. Hear also from evaluators with dedicated roles in building evaluation capability (and the common issues encountered). You'll also have the opportunity in this facilitated session to share your experiences and lessons learnt.

Part panel, part facilitated forum, we will discuss pathways into evaluation practice; successes and challenges in making the transition; all aspects of the evaluation journey (from first conversations with clients/commissioners to evaluation reporting); lessons in maximising evaluation use; and where to turn for help.

The session will involve a mix of a facilitated panel, audience Q&A and facilitated group activities designed to encourage audience participation and the sharing of experiences.

Chairs
avatar for Charlie Tulloch

Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.

Speakers
avatar for Dan Borg

Dan Borg

Independent consultant
Dan is an evaluator with a background in environmental science and a PhD from the University of Melbourne. Dan has experience conducting evaluations in natural resource management, emergency management and health in the public service and not-for-profit sectors. Dan is passionate... Read More →
avatar for Victoria Cook

Victoria Cook

Senior Evaluator | Outcomes Performance and Evaluation Strategy and Planning Group, Department of Economic Development, Jobs, Transport and Resources
I have over twelve years' experience in M&E in both government and private consulting. My passion is ECB in organisations and I am currently employed at a very large Government Department doing so. I enjoy working on complex program logic models and M&E plans. I have a very strong... Read More →
avatar for Ellie McDonald

Ellie McDonald

Evaluation and Research Policy Officer, DHHS
Ellie is an emerging evaluator, with a background in public policy and international relations. Currently working in the Centre for Evaluation and Research at the Victorian Department of Health and Human Services, Ellie undertakes internal program evaluations and provides advice and... Read More →
avatar for Jen Thompson

Jen Thompson

Senior Project and Policy Officer, VicRoads
I am a big picture thinker who has found a natural home in evaluation. I am passionate about active transport, urban planning, public health and road safety. I have bundled these together in my work in policy and project management throughout my career. I now work at VicRoads, developing... Read More →


Friday September 21, 2018 9:00am - 10:00am AEST
Chancellor 3

9:00am AEST

Evolving the evaluation deliverable
Gerard Atkinson (ARTD Consultants)

A key principle of utilisation-focused evaluation is that it needs to be useful to stakeholders, whether they are evaluation commissioners, policy developers, or the general public. Much of the theory of utilisation-focused evaluation centres on the process of evaluations, and the early and sustained engagement of stakeholders. Consideration is also given to the way an evaluation is communicated, a.k.a. the "deliverable", focusing on tailoring the communication of findings to match how different stakeholders absorb information. 

In prior decades, the sole deliverable was almost always a written report. As users of evaluations became more time poor, visual techniques for conveying information gained popularity. Slideshow presentations became a key part of communicating findings, to the point of replacing written reports in some cases. More recently, as evaluations have utilised large data sets and responded to a desire to make findings interactive, dashboards have gained in prominence as the core deliverable. However, each of these is an imperfect solution. Slideshows often omit some of the technical details required by those seeking to operationalise the findings, and dashboards are strongly focused on presenting quantitative analyses. So the question arises: what's next?

This interactive session is an opportunity for participants to bring their own ideas and needs, and brainstorm what might be the next step in the evolution of the evaluation deliverable. Starting with an overview of the evolution of the deliverable and the aims of utilisation-focused evaluation, participants will then work together in small groups with creative stimuli to explore ideas for new types of deliverables that overcome current challenges in usability and communication. Groups will consider what the next generation deliverable might look like, how it might be developed in an evaluation process, how it fits with existing deliverables, and what skills will be needed to design and deliver these in collaboration with stakeholders.

Chairs
avatar for Matt Healey

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... Read More →

Speakers
avatar for Gerard Atkinson

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: - program and policy evaluation - workshop and community facilitation - business analytics and data visualisation- market and social research - financial and operational modelling- non-profit, government and business strategy... Read More →


Friday September 21, 2018 9:00am - 10:00am AEST
Conference centre

9:00am AEST

Designing research and evaluation for a complex system: The Stronger Smarter Approach to Aboriginal and Torres Strait Islander education
Cathy Jackson (Stronger Smarter Institute), John Davis (Stronger Smarter Institute)

The mission of the Stronger Smarter Institute is to create transformative change in outcomes for Aboriginal and Torres Strait Islander students. Through the Stronger Smarter Leadership Program, we support educators to reject the deficit thinking that comes from a racialised view of education and become agents of change. In the classroom, the strength-based Stronger Smarter Approach posits that Indigenous students can be both Strong and Smart: students can be both strong in culture and smart in the classroom.

In this presentation we describe how we have developed our evaluation model over several years to work towards an understanding of the question 'When does the Stronger Smarter Approach work best?' We show how we draw on Indigenous ways of knowing, being and doing, using the Bunya Bunya Cycle to guide our understanding of a complex problem requiring local solutions. The Bunya Bunya Cycle moves us away from the Western positioning of the 'researchers' and 'the researched' towards privileging the Indigenous voice and sharing and giving back. We draw on Complexity Theory to understand Indigenous education as a complex system and the Stronger Smarter Approach as an intervention with simultaneous causal strands. Drawing on Realist Evaluation theory, we assert the agency of educators in choosing how to respond, and we show how a strength-based approach leads us to confidence in the power of educators and Indigenous communities to deliver local solutions. We describe how these understandings have led us to develop a series of emergent logic models that will continue to be refined as our research evolves.

Chairs
avatar for Carol Vale

Carol Vale

Managing Director, Murawin
I am a Dunghutti woman with an extensive career in public sector management and service delivery in the realm of Aboriginal Affairs. My academic background is primarily in the social sciences and leadership development particularly as they relate to overcoming disadvantage. What should... Read More →

Speakers
avatar for Jana Andrade

Jana Andrade

Research Analyst, Stronger Smarter Institute
Jana is a Brazilian Statistician with experience in business intelligence, quality control, statistical model development, research support in biostatistics, electoral research, lecturing at university and survey development and subsequent result analysis for all research purposes... Read More →
avatar for Cathy Jackson

Cathy Jackson

Head of Research, Stronger Smarter Institute
Cathy joined the Stronger Smarter Institute seven years ago following on from a 20-year career at the Queensland University of Technology. At the Institute, she has responsibilities in designing and managing the overall evaluation program for the Institute’s professional development... Read More →


Friday September 21, 2018 9:00am - 10:00am AEST
Chancellor 4

9:00am AEST

Taking an intersectional approach to evaluation and monitoring: moving from theory to practice
Sarah Kearney (Our Watch), Anna Trembath (Our Watch), Elise Holland (Our Watch)

In recent decades, there have been growing efforts to apply intersectional theory to the fields of gender equality, health promotion, and other areas of social policy. While much of the focus so far has been on understanding how to apply an intersectional lens to policy and programming, of equal importance is the application of an intersectional approach to monitoring and evaluation, and its potential to reveal meaningful distinctions and similarities in order to better a) understand the impact of social interventions and b) monitor progress toward social policy outcomes.

The panel consists of practice specialists with expertise in evaluation and monitoring from Our Watch, the national foundation for the prevention of violence against women. Each panellist applies an intersectional approach to designing either project-level evaluations or monitoring frameworks for tracking population-level change.  This panel will open by exploring the concept of intersectionality and its role in the development of transformative social policy.
Building on this theoretical understanding, the panellists will be interviewed by a facilitator on how they have embedded intersectionality into their monitoring and evaluation projects, drawing primarily from examples of violence prevention interventions and initiatives aimed at promoting gender equality. Examples will include the evaluation of a national cultural change campaign (delivered across digital platforms) and the development of a monitoring mechanism that tracks population-level progress towards the prevention of violence against women.

The panel will conclude with an interactive facilitated discussion. Audience members will be asked to interrogate evaluation case studies (provided by the panellists), discussing whether the examples are intersectional, and identifying practical steps to advance an intersectional approach to the case study. At the conclusion of the panel, participants will be directed toward relevant resources to support them to move from 'inclusive' evaluations that simply recruit for diversity, towards transformative, intersectional evaluation design.

Chairs
avatar for Jenne Roberts

Jenne Roberts

Evaluation Manager, Menzies School of Health Research
Jenne is an international health evaluator, working in Indigenous health in Australia and international public health, mostly in South East Asia. Jenne is interested in identifying the efforts and interventions that spark positive social and health impact, and engaging intended beneficiaries... Read More →

Speakers
avatar for Loren Days

Loren Days

Loren Days is currently Senior Policy Advisor, Intersectionality at Our Watch where she specialises in developing strategies to embed an intersectional approach across the organisation. She is a qualified lawyer who has experience in policy, human rights, legal and regulatory ref... Read More →
avatar for Sarah Kearney

Sarah Kearney

Manager, Evaluation and Learning, Our Watch
I am an experienced social policy evaluation specialist with a passion for preventing gender-based violence. For the past few years, I've led evaluation at Our Watch, Australia's national foundation to prevent violence against women. Together with my colleagues, we've learned how... Read More →


Friday September 21, 2018 9:00am - 10:00am AEST
Chancellor 5

10:00am AEST

Realities of monitoring and evaluation in a not-for-profit
Sophia Harryba (UnitingCare Wesley Bowden), Eboni Tiller (UnitingCare Wesley Bowden)

As a not-for-profit that has dedicated significant resources and commitment to embedding monitoring and evaluation within our work, we believe we have a unique contribution to the conference. 

We will be focusing on the 'Realities of monitoring and evaluation in a not-for-profit', where we will highlight the successes, challenges and lessons learned since beginning our M&E journey.

We have two primary foci for our presentation:
  1. The impact of monitoring and evaluation on our everyday work and how staff have experienced the change; 
  2. The challenges in building capacity within the organisation for ownership of the monitoring and evaluation framework 

Chairs
avatar for Charlie Tulloch

Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.

Speakers
avatar for Eboni Tiller

Eboni Tiller

Research and Evaluation Project Officer, UnitingCare Wesley Bowden
Eboni is a Research and Evaluation Project Officer at UnitingCare Wesley Bowden (UCWB) – a mid-size NGO in South Australia. Within UCWB, Eboni works on implementing evaluation across the organisation. In her other role, Eboni works with Collective Impact initiatives across South... Read More →


Friday September 21, 2018 10:00am - 10:05am AEST
Chancellor 3

10:00am AEST

Transforming evaluation to better address complexity
Julie Elliott (RMIT, PhD candidate)

Over the past 20 years, some insights from complexity science have been adopted into evaluation, including applications in Developmental Evaluation, systems approaches in evaluation and realist evaluation. But mainstream evaluation practice, including evaluation tied to Results Based Management and performance management, is largely built upon assumptions that interventions always operate under conditions of equilibrium, outcomes can be predicted in advance and the relationship between cause and effect is linear and unidirectional.

For many interventions, such as place-based initiatives, social change strategies and those that aim to establish conditions to stimulate social innovation, this is not the case. Instead they exhibit the features of Complexity: collective patterns of culture and group identity fold back onto the individuals who formed them through an interplay of connection, interdependence and human agency that is dependent upon memories from the past, learning and anticipation of the future, including second-guessing or out-guessing what others will do. Complexity shifts how we see the world. It replaces 'reductionism' and understands human social interaction as always complex and emergent.

This paper begins by summarising the initial uptake of some complexity ideas and methods in evaluation. It then sets out some more radical ideas and methods from Complexity Science with potential utility in evaluation. It ends with some suggestions for how the theory and practice of evaluation might be transformed to better address complexity. 

Chairs
avatar for Matt Healey

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... Read More →

Speakers
avatar for Julie Elliott

Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.


Friday September 21, 2018 10:00am - 10:30am AEST
Conference centre

10:00am AEST

Visionary, maybe, but how viable? Understanding executive leaders' thinking about evaluation mainstreaming within child and family welfare
Amanda Jones (Berry Street)

The need within the child welfare sector to understand, evidence and improve programmatic impact is intensifying. Evaluation mainstreaming (EM) holds much promise for meeting that need. EM can be understood to represent a major organisational change endeavour, and also has the key characteristics of a complex innovation. 

Leadership has been identified as a critical factor in both organisational change initiatives and innovation implementation. The Organisational Evaluation Capacity Building (OECB) field also recognises leadership as a key building block in initiating, implementing and sustaining OECB. It is a critical factor in gauging organisational readiness. Executive leadership, specifically, provides critical leverage for this purpose of resetting an organisation towards major change. 

The construct of leadership readiness, however, is not well understood in the OECB literature. We do not know what executive leaders think about mainstreaming during the pre-adoption phase, when they first encounter and consider the merit of EM. How the context of child welfare practice might mediate executive leader views is also given limited attention.

This presentation explores the thinking of executive leaders about EM within the specific context of a large child welfare organisation within Victoria, Australia. The attitudes, value propositions and other thinking of the entire executive leadership were collected at two points in time: prior to deliberating about EM for inclusion in the forthcoming triennial strategic plan, and following formal plan sign-off.

Findings are useful both for understanding how executive leaders think about the desirability and feasibility of EM, and the nature of readiness to buy-in. The implication for evaluation theory and practice is that the OECB field constructs and assesses leadership too narrowly. It would benefit from unpacking the pre-adoption leadership readiness stage to a greater degree, and drawing on change management and implementation science theory and tools to assist with its conceptualisation and measurement. 

Chairs
avatar for Stuart Raetz

Stuart Raetz

Stuart has over 10 years' experience working as an M&E specialist in the Asia-Pacific region. He has consulted across a range of sectors including agriculture, natural resources, climate change, community development and emergency management and has experience working for and with... Read More →

Speakers
avatar for Amanda Jones

Amanda Jones

Senior Manager - Evaluation, Policy & Research, Berry Street
I have over 20 years' experience in evaluation, public policy and research for not-for-profit welfare organisations and government. I have also worked in the private sector in a research capacity consulting to state and local government, and as a manager of a community health counselling... Read More →


Friday September 21, 2018 10:00am - 10:30am AEST
Chancellor 6

10:00am AEST

"It's about involving Aboriginal people in every aspect of decision making": Understanding the enablers and drivers of evaluation in Indigenous higher education in Australia
James Smith (Curtin University), Kellie Pollard (Charles Darwin University), Kim Robertson (Charles Darwin University)

Growing Indigenous participation and success in higher education has frequently been highlighted as a priority for improving the health, social and economic outcomes of Indigenous peoples and Australian society. Recent academic scholarship has reinforced the importance of strengthening evaluation in Indigenous higher education contexts in Australia to achieve this goal. This has paralleled national and global commentary about the importance of data sovereignty within Indigenous affairs policy and program settings. Despite successive calls from high-level Indigenous advisory groups for the Australian Government to invest in a performance, monitoring and evaluation framework that is tailored to the unique needs and priorities of the Indigenous higher education sector, this has not yet occurred. In this presentation we draw on in-depth interviews with 24 Indigenous scholars from all state and territory jurisdictions across Australia to describe evaluation in higher education from an Indigenous standpoint. The research subsequently privileges Indigenous voices and identifies enablers and drivers likely to strengthen evaluation of Indigenous success in higher education contexts. These include growing Indigenous leadership; increasing funding and resources; investing in strategy development; leading innovative policy development, implementation and reform; investing in cultural transformation and quality improvement; addressing white privilege and power; improving Indigenous student outcomes; valuing Indigenous knowledges and prioritising Indigenous epistemologies; incentivising cultural competence; embracing political challenges as opportunities; promoting cultural standards and accreditation; reframing curricula to explicitly incorporate Indigenous knowledges and practices; investing in an Indigenous workforce; and recognising sovereign rights. We discuss these findings in the context of three primary domains of control - Indigenous control, Government control and University control. In doing so, we unpack the social-political complexities of negotiating evaluation work specific to Indigenous success in higher education. We show how significant transformations can be achieved in policy and practice contexts in higher education if Indigenous standpoints are prioritised.

Chairs
avatar for Carol Vale

Carol Vale

Managing Director, Murawin
I am a Dunghutti woman with an extensive career in public sector management and service delivery in the realm of Aboriginal Affairs. My academic background is primarily in the social sciences and leadership development particularly as they relate to overcoming disadvantage. What should... Read More →

Speakers
avatar for Kim Robertson

Kim Robertson

Senior Analyst, Indigenous Policies and Programs, Charles Darwin University
Kim Robertson is Senior Analyst, Indigenous Policies and Programs with the Office of the Pro Vice-Chancellor Indigenous Leadership at Charles Darwin University and was a member of the Steering Group for Professor Smith’s 2017 NCSEHE Equity Fellowship investigating ways of strengthening... Read More →
avatar for James Smith

James Smith

Father Frank Flynn Fellow and Professor of Harm Minimisation, Menzies School of Health Research
James is the Father Frank Flynn Fellow and Professor of Harm Minimisation at Menzies School of Health Research, with much of his work sitting at the health/education nexus. Prior to this role he was a 2017 Equity Fellow with the National Centre for Student Equity in Higher Education... Read More →


Friday September 21, 2018 10:00am - 10:30am AEST
Chancellor 4

10:00am AEST

Q: Can realist evaluations be designed to be more suitable for use in Indigenous contexts? (A: It depends)
Emma Williams (Northern Institute, CDU), Kevin Dolman (Northern Institute, CDU)

Realist evaluation has been growing in popularity over the past 20 years, and is now being used in Australian and Canadian Indigenous contexts. This presentation, developed by realist Indigenous and non-Indigenous colleagues, looks at how (and to what degree) realist evaluations can be designed to be culturally safe, and more suitable for use in different Indigenous contexts. We note that a single proposed solution is impossible, given the diversity of Australian Indigenous peoples, and describe issues that arise in different Indigenous contexts. One area of innovation is methods, identifying how techniques developed in a European context - such as realist interviewing - have been and can be further adapted to suit preferred ways of sharing information in different Indigenous contexts. More challenging is understanding how the ontology and epistemology of realist evaluation, and particularly its understanding of causation, align with the ontologies, epistemologies and understandings of causality of different Indigenous peoples. Steps towards a cross-cultural understanding of realist philosophy are presented, together with the challenges this presents. The impact of who 'owns' the evaluation, who leads and shapes it, will also be discussed with reference to realist evaluations. 

Chairs
avatar for Jenne Roberts

Jenne Roberts

Evaluation Manager, Menzies School of Health Research
Jenne is an international health evaluator, working in Indigenous health in Australia and international public health, mostly in South East Asia. Jenne is interested in identifying the efforts and interventions that spark positive social and health impact, and engaging intended beneficiaries... Read More →

Speakers
avatar for Kevin Dolman

Kevin Dolman

Principal Analyst, Indigenous Evaluation Consultant
Dr Kevin J. Dolman is an Eastern Arrernte man who has worked for thirty years in the government, private and community sectors across a spectrum of Indigenous social, economic and cultural policy. He is a consultant in evaluation and policy fidelity with a special interest in how... Read More →
avatar for Emma Williams

Emma Williams

Associate Professor, RREALI CDU Maburra
Emma is a Credentialed Evaluator and an Associate Professor at Charles Darwin University in Australia, where she researches, conducts and teaches evaluation, particularly realist evaluation. She has moved between government, academe and private consulting in evaluation over more than... Read More →


Friday September 21, 2018 10:00am - 10:30am AEST
Chancellor 5

10:05am AEST

When do we have enough evidence!!!
Zazie Tolmer (Clear Horizon)

In a design process, whether it's co-design or another approach, not knowing when we know enough, or have enough evidence on the problem, insights, opportunities etc., can paralyse the process. The 'expert' cannot always be there to provide the assessment and confidence for design groups to move on. To address this challenge, the team I am working with at the Department of Health and Human Services has developed and tested a new 'Ah help me! Do I have enough evidence or good enough quality to continue?' tool. In this Ignite presentation I'd like to share and test it with you.

Chairs
avatar for Charlie Tulloch

Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.

Speakers
ZT

Zazie Tolmer

Principal Consultant, Clear Horizon
Anything! I'm curious and friendly! I'm currently working as an embedded evaluator at DHHS, working on a government-led Collective Impact initiative to improve outcomes for vulnerable children, young people and their families.


Friday September 21, 2018 10:05am - 10:10am AEST
Chancellor 3

10:10am AEST

We should be democratising evaluation, not sanctifying it
Duncan Rintoul (Rooftop Social)

I'm a card-carrying evaluator - literally. But most of the people I work with aren't: they're project managers, designers, policy analysts... the list goes on. For evaluation to 'make a difference', it's got to get closer to the action - out of the realm of a specialised technical sub-specialty and into the skillsets and mindsets of non-specialists.
 

Chairs
avatar for Charlie Tulloch

Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.

Speakers
avatar for Duncan Rintoul

Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: * evaluation capacity building * design thinking and innovation * evaluative practice in education, particularly in schools * public sector... Read More →


Friday September 21, 2018 10:10am - 10:15am AEST
Chancellor 3

10:20am AEST

TLDR (too long, didn't read): Let's knife evaluation reports
Elizabeth Smith (Litmus)

Boring, long reports are killing evaluation. Getting well-evidenced reports to under 30 pages is an art form. We are on a mission to get our reports into the Times top 100 best sellers. Or, better yet, to have them dog-eared and thumb-marked, and used to create change.

Come and hear how we are executing our mission. We will share the processes that get a client to say 'I love your report', and how we overcome the traditionalists and blockers.

Let's make evaluation great again. Embrace the reporting revolution. 

Chairs
avatar for Charlie Tulloch

Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.

Speakers
avatar for Liz Smith

Liz Smith

Partner, Litmus
I am a co-founder of Litmus, a specialist private sector evaluation company based in Wellington, New Zealand, and active across Australasia. My evaluation journey started more than 20 years ago as a nurse at the John Radcliffe Hospital, Oxford, when assessing the effects of a new nursing... Read More →


Friday September 21, 2018 10:20am - 10:25am AEST
Chancellor 3

11:00am AEST

Doing evaluation: Task analysis as a pathway to progress evaluation education
Amy Gullickson (University of Melbourne Centre for Program Evaluation)

Formal evaluation education has followed a pathway primarily dictated by the discipline in which it is taught (e.g., public administration, community psychology, education, health) and generally the focus has been on good practice in research methods (LaVelle, 2014). Informal training and professional development has been more focused on evaluation-specific skills, dictated by market demands. The AES Evaluators’ Professional Learning Competencies (AES Professional Learning Committee, 2013) cover both research and evaluation-specific ground and have been an important first step to understanding what is required to do the tasks of evaluation. However, competency sets tend to ignore the tasks inherent in doing the work (Brannick, Levine, & Morgeson, 2007). To create effective educational processes, we need to analyse the tasks inherent in the work of evaluation, identify the skills, knowledge or other abilities required, and the type of learning necessary to develop them (e.g., basic, complex). Once that is complete, we can then assess the required levels of performance on those skills and tasks (e.g., novice, expert). These steps are essential if we are to meet the needs of our diverse cadre of practicing evaluators and provide clear pathways for learning and professional recognition. In this session the presenter will report an initial task analysis on the logic of evaluation, the processes and results, and lessons learned to springboard a discussion on implications for educating people to conduct evaluation for both professional development and formal education.

Chairs
avatar for Helen Watts

Helen Watts

General Manager Strategy and Planning, Corangamite Catchment Management Authority
With a passion for achieving real natural resource management outcomes that are based on sound evidence and on people's ability to contribute, learn and adapt. Love talking complexity, systems and the nexus between people and place

Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →


Friday September 21, 2018 11:00am - 11:30am AEST
Chancellor 4

11:00am AEST

Total value measurement: Are we counting what actually counts?
Les Trudzik (ACIL Allen)

Evaluation noun
1. the making of a judgement about the amount, number, or VALUE of something; assessment. (en.oxforddictionaries.com)

Have you ever been in the situation of believing something is of inherent value, but been frustrated at not being able to put an adequate measure on that belief?

Evaluation by its very definition requires careful thought about measuring all the attributes of value of the subject or topic under assessment. Not just the tangible benefits but also the intangible. The latter can often be significant but difficult to quantify, especially so in public and social policy settings, where there are increasing needs to consider cross-sectoral and transformational factors, and many of the derived benefits or value may have an indirect relationship with the specific outputs of the program.

Multi-criteria analysis is typically used as the 'go-to' way to assess a range of tangible and intangible value measures. But this approach fails to recognise that intangible value by its nature does not usually combine in the same additive way that conventional financial or economic value does. Intangible value, capacity building as an example, is not lost or reduced when given to or shared with others, but is available to both and as such conforms to a network economics model where there are increasing, not decreasing, marginal returns. It is also important to understand the ways in which the different value attributes can interrelate and influence each other.

This presentation will outline how evaluators can address the challenges of first classifying, and then understanding and combining, all the attributes of value that may be delivered; that is, assessing the total value, not just the value that is easy to count.

Chairs
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →

Speakers
avatar for Les Trudzik

Les Trudzik

Director, ACIL Allen Consulting
Les oversees the evaluation practice within ACIL Allen. He has over thirty years' experience in advising public and private sector organisations, with a focus on strategy and policy development, program evaluation and review, and organisational performance improvement. Les has worked... Read More →


Friday September 21, 2018 11:00am - 11:30am AEST
Chancellor 3

11:00am AEST

"Stories for Purpose" – transforming the use of documentary film, participatory media and participatory forums in Monitoring and Evaluation, in order to create evidence based visual reports
Susan Rooney-Harding (The Story Catchers), Margaret Howard (Department of Planning, Transport and Infrastructure)

Resources
https://vimeo.com/199815869
https://vimeo.com/258778259

This session looks at how we use documentary film and stakeholder participation in monitoring and evaluation to create visual reports. It's the story behind the numbers that brings a traditionally dry process to life.

We use qualitative data collection methodologies, participatory media (where the audience can play an active role in the process of collecting, reporting, analysing and sharing media content) and documentary videographers to collect stories. These stories are then used in the creation of a series of documentaries to be employed in participatory forums and in the monitoring, evaluation and reporting process.

Working with monitoring and evaluation specialists, we use a variety of evidence-based methodologies. A monitoring and evaluation specialist conducts participatory forums with stakeholders to unpack the documentary films; the findings are then used in producing a traditional written report. This report and previously collected media are then used to create a short visual documentary report (approx. 8-10 min) to accompany the written report.

We will look at the participatory monitoring and evaluation process that was used in the APY Lands in South Australia with the 'On the Right Track Remote' driver licensing program. We will show the final documentary and discuss its uses for communications, how it can shape program direction and policy change, help change legislation and support the refunding of programs.

A lot of our work is with government agencies working with Indigenous programs that are looking to implement more inclusive and culturally appropriate evaluation and reporting methodologies.

We will discuss the user experience (that of the client) of the above methodology as employed in the 'On the Right Track Remote' program and review how the documentary piece has been used.

Chairs
avatar for Ruth McCausland

Ruth McCausland

Senior Research Fellow, School of Social Sciences, UNSW
Dr Ruth McCausland is Director of Research and Evaluation for the Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW, and Senior Research Fellow in the School of Social Sciences, UNSW. Her research focuses on women, young people, people with disabilities and... Read More →

Speakers
avatar for Margaret Howard

Margaret Howard

Manager Living Neighbourhoods and Travel Behaviour, Department of Planning, Transport and Infrastructure
I lead an area of my department that delivers behaviour change programs in relation to personal transport, with a focus on reducing car use, and on active, safe, green travel. I'm also the manager of our Aboriginal Road Safety and Driver Licensing programs, which aim to address long... Read More →
avatar for Susan Rooney-Harding

Susan Rooney-Harding

Director, The Story Catchers
Susan Rooney-Harding: Director, Videographer, Editor, Business Development, and Project Management. Susan is a documentary filmmaker and a creative qualitative data specialist. Her inquisitive and intuitive nature is central to her ability to capture meaningful stories for a greater... Read More →


Friday September 21, 2018 11:00am - 12:00pm AEST
Chancellor 6

11:00am AEST

Evaluation Ready: Transforming government processes and ensuring evaluability
Ruth Pitt (Department of Social Services), Lyn Alderman (Department of Social Services), Katherine Barnes (Department of Industry, Innovation and Science), David Turvey (Department of Industry, Innovation and Science)

Abstract 1: Evaluation Ready: transforming government processes

The Australian Government's Digital Transformation Agenda, announced in the 2015/16 federal budget, includes establishing two grants hubs, the Business Grants Hub (located in the Department of Industry, Innovation and Science) and the Community Grants Hub (located in the Department of Social Services). These hubs are intended to streamline how grant programs are designed, established and managed across the Commonwealth. This centralisation of grants administration is a significant systems-level change that presents both challenges and opportunities for evaluation.

This presentation will outline the work being done at the two Departments to embed evaluation services within their respective grants hubs, looking at key successes to date and challenges ahead. In particular, it will examine how the hubs have moved evaluation planning into the design phase by ensuring evaluation is included in the costings for new programs and by providing 'evaluation readiness' services. These services align with the utilisation-focused evaluation approach of holding 'launch workshops' to assess and enhance evaluation readiness, with the aim of improving the timing and relevance of future evaluation activities (Patton 2012). The speakers will discuss the implications of these services for evaluative thinking and practice. How each of the hubs capitalises on the opportunities offered through centralised evaluation services will be of interest to evaluators who are interested in transforming evaluation from being on the periphery of programs to being at the heart of their design and delivery.

Reference: Patton (2012) Essentials of Utilization-Focused Evaluation, Sage Publications.

Abstract 2: Evaluation Ready: ensuring evaluability

Ensuring good evaluation involves more than just hiring evaluators and setting them to work. It requires preparation, capability and an evaluation mindset. It requires programs to be evaluation-ready. How reassuring would it be if program developers knew from the outset the types of evaluations planned for their program and when they would commence? If they knew the questions that would be asked and the methods and indicators that would be used to answer them? And if evaluators were confident that the data required would be collected, tested and available for use when needed? In short, if evaluability were assured?

This presentation explores how the Department of Industry, Innovation and Science's Evaluation Ready tool has improved the evaluability of its programs. At or near the design stage of a new program, Evaluation Unit staff work with policy and program specialists to develop a program logic, evaluation questions, data requirements and an evaluation schedule. These documents comprise an evaluation strategy, which informs program documentation including application forms and reporting templates. The tool has been reviewed and refined to enhance speed and consistency of application. The unit's ambition is to have it considered public sector best practice.

In this presentation, the experience of applying the Evaluation Ready tool in the Department of Industry, Innovation and Science is explained and its impact on evaluability is assessed. Examples illustrate how the process has all but eliminated the need for scoping studies and evaluability assessments. We show how the process interacts with program rollout arrangements and performance reporting frameworks for individual programs. And as it hasn't always been easy, we highlight some of the challenges encountered and lessons learned along the way.

Chairs
avatar for Keren Winterford

Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years' experience working in the international development sector, in multiple capacities with Managing Contractors, NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous... Read More →

Speakers
avatar for Lyn Alderman

Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has a deep disciplinary knowledge and experience in program evaluation, evaluation frameworks... Read More →
avatar for Ruth Pitt

Ruth Pitt

Assistant Director, Evaluation Unit, Australian Government Department of Social Services
Ruth Pitt is Assistant Director of the Evaluation Unit at the Australian Government Department of Social Services. Her evaluation experience includes diverse roles in consulting, government and not-for-profit organisations, in Australia and overseas. Her qualifications include a Master... Read More →
avatar for David Turvey

David Turvey

General Manager, Department of Industry, Innovation and Science
David Turvey is the General Manager of the Insights and Evaluation Branch within the Economic and Analytical Services Division of the Department of Industry, Innovation and Science. David oversees the Division's research and analytical work on the drivers of firm performance and the... Read More →


Friday September 21, 2018 11:00am - 12:00pm AEST
Chancellor 5

11:00am AEST

We are Women! We are Ready! Amplifying our voice through Participatory Action Research
Tracy McDiarmid (International Women's Development Agency), Amanda Scothern (International Women's Development Agency), Paulina Belo (Alola Foundation)

Our organisation's work is grounded in the principles of gender equality and women's rights, delivered in partnership with inspiring organisations across the Asia Pacific region. We recognise that gender equality requires incremental and transformative change which occurs over generations, and that strengthening women's movements through collective action and learning is a key strategy in achieving change. Capturing those changes in the voices of diverse women is at the heart of our commitment to ethical, feminist, participatory evaluation.

This Interactive Session models the principles and practices of our organisation's approach.  It will explore how evaluations can be designed to strengthen the capacity of diverse women as co-researchers; to build on and generate knowledge as a resource of and for the women who create, own and share it; and to design evaluative spaces that promote authentic, inclusive forms of evidence.  

A campfire approach will highlight our recent experience, including the design of a mid-term reflection using feminist participatory action research methodologies and the development of our Feminist Research Framework (Nov 2017), and will engage session participants to enquire into and explore other applications of these principles and practices, drawing on their own experience. Our discussion will include evaluation design (experiences, challenges, applicability to different contexts) and methodological practices such as appreciative inquiry, narrative and performative methods.

Key learnings are envisaged on topics such as participatory design processes (ensuring delivery and community partners are involved in the development of key questions and appropriate methodologies), capacity building (empowering diverse women as co-researchers in data collection and analysis), and accessible and applicable learning (communicating and using findings relevant to diverse partners to support political, economic and social change).  Peer-to-peer exchange will be captured, and will inform the circulation of sector guidance drawing on experience and learning of session participants.

Chairs
avatar for Joanna Farmer

Joanna Farmer

Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Masters of Evaluation at Melbourne Uni.

Speakers
avatar for Tracy McDiarmid

Tracy McDiarmid

Senior Program Quality Manager, International Women's Development Agency
Dr Tracy McDiarmid completed her PhD at the University of Western Australia before working in the fields of Australian social policy, governance, disaster risk reduction, and gender. Tracy has experience in international development with a variety of International Non-Government Organisations... Read More →
avatar for Alejandra Pineda

Alejandra Pineda

Programs Coordinator, Myanmar, IWDA
I have worked with the International Women's Development Agency for three years, working in collaboration with IWDA's five Myanmar partner organisations to ensure the successful implementation of their programming on women's political participation, and women's safety and security... Read More →


Friday September 21, 2018 11:00am - 12:00pm AEST
Conference centre

11:30am AEST

Reconciliation Action Plans as drivers of social change: The engagement process in the evaluation of the Gold Coast 2018 Commonwealth Games RAP
Kate Frances (Cultural and Indigenous Research Centre Australia), Ross Williams (Cultural and Indigenous Research Centre Australia)

In a nationwide first for Australian events, a Reconciliation Action Plan (RAP) has been developed for the Gold Coast 2018 (GC2018) Commonwealth Games. An independent evaluation of the RAP has been identified as a priority initiative by the Queensland Government and between November 2017 and July 2018, researchers undertook the evaluation of this RAP. The primary data collection methodology involved face-to-face consultations with a range of stakeholders.

For many stakeholders, the GC2018 RAP represents an important framework for improvement in opportunities (particularly employment, training and contracting opportunities), relationships and respect between Aboriginal and Torres Strait Islander peoples and the broader Australian community. Recognising that RAPs represent an emerging trend as drivers of social change, especially in the ways in which organisations are publicly committing to specific actions that produce or contribute towards tangible outcomes for Aboriginal and Torres Strait Islander people and communities, the evaluators assessed how far their evaluation practices reflected the RAP paradigm. 

This paper will explore how the evaluators of the GC2018 Commonwealth Games RAP occupied the same space as RAPs - as drivers of social change - by embodying the concepts central to RAPs in their engagement with Aboriginal and Torres Strait Islander colleagues and stakeholders.


Chairs
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →

Speakers
KF

Kate Frances

Principal Consultant, Cultural and Indigenous Research Centre Australia
Kate joined CIRCA as a senior researcher with over 13 years’ experience in academia and the not-for-profit sector. Kate’s experience includes research and evaluation projects over a diverse portfolio including child health, health promotion, justice, and social services for a... Read More →
avatar for Ross Williams

Ross Williams

Research Consultant, Cultural and Indigenous Research Centre Australia
My name is Ross Williams, traditional name Timmulbar (Lightning), and I am a very proud Bindal/Juru leader of the Bowen to Townsville region and a Torres Strait Islander from my mother's side. I have worked with traditional owner groups, Aboriginal and Torres Strait Islander organisations... Read More →


Friday September 21, 2018 11:30am - 12:00pm AEST
Chancellor 3

11:30am AEST

Transforming evaluation relationships: Evaluators as responsive and flexible mentors
Timothy Carey (Centre for Remote Health, Flinders University), George Tremblay (Center for Behavioral Health Innovation, Antioch University New England, US), Jim Fauth (Center for Behavioral Health Innovation, Antioch University New England, US)

In 2017 the Australian-American Fulbright Commission funded a research project at the Center for Behavioral Health Innovation (BHI), Antioch University New England to investigate the important factors for initiating and sustaining ongoing monitoring and evaluation within an organisation.

The research was conducted by an Australian Fulbright Scholar who interviewed 15 people from different organisations with whom BHI had partnered at various times to establish evaluation procedures and protocols for a range of different projects. A surprising finding of the project was the potential to transform the role of external evaluators. Based on the data gathered from the research participants it appears that external consultants can offer important expertise and guidance in an ongoing way.

Rather than working with organisations for discrete periods of time to reach conclusions about a specific program's effectiveness, research participants described the value in having flexible and responsive mentors who were external to the organisation but available in an ongoing capacity. Transforming the way in which the role of evaluators is conceptualised enabled service providers to change their attitudes from fearing evaluation to embracing it as a learning process that is crucial to effective service delivery. While evaluation expertise remained an important aspect of the external evaluator's role, they were able to expand the support they provided and establish a different relationship with organisations. Participants described the value of having summaries of research evidence presented to them by the external evaluators, as well as having resources such as PowerPoint slides prepared. Importantly, having the external evaluators as an ongoing presence meant service providers were much more likely to maintain fidelity to the relevant model. Transforming relationships with external evaluators required reorganising mindsets concerned with the traditional role of evaluators; however, the benefits of this transformation appeared to be engaged, committed, and motivated service providers.

Chairs
avatar for Helen Watts

Helen Watts

General Manager Strategy and Planning, Corangamite Catchment Management Authority
With a passion for achieving real natural resource management outcomes that are based on sound evidence and based on people's ability to contribute, learn and adapt.Love talking complexity, systems and the nexus between people and place

Speakers
avatar for Tim Carey

Tim Carey

Director, Centre for Remote Health, Flinders University
Tim is a teacher, researcher, and clinician with a background in clinical psychology. He worked for five years in the National Health Service in Scotland and, in this setting, developed innovative approaches to appointment scheduling and cognitive therapy. He evaluated these innovations... Read More →


Friday September 21, 2018 11:30am - 12:00pm AEST
Chancellor 4

12:00pm AEST

For all in tents and porpoises: the use of spell check in evalaution
Evie Cuthbertson (Grosvenor Management Consulting)

One of the most important technological advances has been the spell check tool.  Saviour of many an evaluation report from embarrassing grammatical errors and spelling mistakes.
However spell check unchecked may be your ondoing.........

Example slides (more will be done if submission successful):
  • Who is Hoo Ristic?  And what does he do?
  • Efficiency, funkiness and appropriateness
  • Activities, inputs, outages and outrages
  • Designing your random control trolls
Slides will be presented in comic strip format (using clip art, speech bubbles etc). 

Chairs
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →

Speakers
avatar for Evie Cuthbertson

Evie Cuthbertson

Senior Manager, Grosvenor Performance Group
Evie is a seasoned evaluation consultant. Her work is characterised as being sensible, practical and accessible. Over the last 20 years she has designed and delivered a range of evaluation related projects and worked with a diverse mix of... Read More →


Friday September 21, 2018 12:00pm - 12:05pm AEST
Chancellor 3

12:00pm AEST

Evaluation fatigue and the tragedy of the commons: Are we plundering our participants' finite resources of patience and trust?
Adrian Field (Dovetail Consulting Ltd)

The 'tragedy of the commons' explores the drivers and consequences of systems of shared resources, where individual users - acting independently according to their own self-interest - behave contrary to the common good of all users, by depleting or spoiling that resource through their collective action.

First developed in the 1800s with regard to cattle grazing on common land, it has subsequently been explored more widely to encompass such issues as the Earth's natural resources and the knowledge commons.

In this presentation I pose the question: what if people's willingness to take part in evaluation and research activities is itself a finite resource? In a world where almost every day we receive an invitation to take part in consumer research, and individual data is routinely collected and shared, are we irrevocably depleting a resource that as evaluators we fundamentally rely on? What are the transformations that we need to make in our mindsets, and instigate more broadly in our professional and policy networks?

This presentation was inspired by a daily transactional experience of depositing a cheque in a bank, whereupon I was asked a few days later to complete an online survey about the experience. It led me to record over subsequent months the extent to which I was being invited to take part in surveys, and alongside this to reflect on the implications of this near-daily bombardment for our own practice. 

The presentation will explore and discuss with participants the options available to us as evaluators in our work; the wider forces at play that can undermine our individual best efforts; and the transformations we need beyond our individual practice if we are to support the credibility and impact of our work.

Chairs
avatar for Helen Watts

Helen Watts

General Manager Strategy and Planning, Corangamite Catchment Management Authority
With a passion for achieving real natural resource management outcomes that are based on sound evidence and based on people's ability to contribute, learn and adapt.Love talking complexity, systems and the nexus between people and place

Speakers
avatar for Adrian Field

Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →


Friday September 21, 2018 12:00pm - 12:30pm AEST
Chancellor 4

12:00pm AEST

Why do well designed M&E systems seldom inform decision making?
Byron Pakula (Clear Horizon), Damien Sweeney (Clear Horizon)

Monitoring and evaluation is broadly accepted as part of good project design and implementation. However, M&E systems regularly fail to feed back information to improve learning or to change the actions of managers, donors and decision makers. As the aid program transforms itself, focusing more on problem-driven iterative adaptation, the emphasis on reflecting, learning and changing is ever increasing. The authors conducted a stocktake of investment-level M&E systems across an entire DFAT aid portfolio - including desktop review, key informant interviews, and a detailed rubric based on the DFAT Reporting and M&E Standard.

While the stocktake focused on one country, lessons were further developed based on a broad range of experience. It found that the majority of M&E plans were well designed, though sometimes overly complicated. However, quality diminished along the M&E pathway: in implementing the M&E plans, communicating information, and using information for learning and adaptive management. It also identified that implementing partners were often dependent on M&E advisers with varying approaches and, in some cases, varying quality.

Partner-led, participatory and engaging approaches lead to improved reporting and learning. Good M&E ideally involves the participation of program design and program implementation staff to support ownership and understanding of M&E systems. Moreover, engaging donors in the reflection and reporting processes supports communication and facilitates decision making. Supporting this, embedding evaluation in the implementation team through Evaluation Capacity Building (ECB) is integral to the quality of M&E systems. Making this "an intentional process to increase individual motivation, knowledge, and skills and to enhance a group and/or organisation's ability to conduct and use monitoring and evaluation", as per Labin et al. (2012), helps build and reinforce a culture of M&E, leading to the use of information to generate knowledge that supports adaptive management and learning.

Chairs
avatar for Ruth McCausland

Ruth McCausland

Senior Research Fellow, School of Social Sciences, UNSW
Dr Ruth McCausland is Director of Research and Evaluation for the Yuwaya Ngarra-li partnership between the Dharriwaa Elders Group and UNSW, and Senior Research Fellow in the School of Social Sciences, UNSW. Her research focuses on women, young people, people with disabilities and... Read More →

Speakers
avatar for Byron Pakula

Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for government, private and not for profit sectors internationally and in Australia for over fifteen years. Byron is a well respected and influential manager, strategist and adviser that has applied himself to some... Read More →
avatar for Damien Sweeney

Damien Sweeney

Principal Consultant, Clear Horizon
Damien Sweeney is a Senior Consultant at Clear Horizon Consulting. Damien is a sustainability generalist and M&E practitioner, bringing together his experiences across numerous sectors, from local government, to seafood industry, and consulting. Damien has worked with leading behaviour... Read More →


Friday September 21, 2018 12:00pm - 12:30pm AEST
Chancellor 6

12:00pm AEST

Realist Evaluation: Tracing the Evolution of Realist Program Theory over the Years of the Resilient Futures Project in South Australia
Bronny Walsh (Community Matters)

Evaluators have been working with a Research Institute to undertake a collaborative, capacity building evaluation of a pilot program called 'Resilient Futures'. The program aimed to improve wellbeing for young people from disadvantaged communities by delivering resiliency training and mentoring support through schools and youth sector agencies. The evaluation was intended to inform future decision-making about the Resilient Futures program, and to inform program improvement over time. A realist evaluation methodology was selected because it was a learning-oriented methodology which could contribute to program refinement, while also explaining different outcomes for different sub-groups and in different contexts. 

The program was being developed, tested and refined during the evaluation.  It started as a small-scale pilot in a few agencies, underwent a complete transformation of the delivery model and became a large-scale program delivered to hundreds of young people through multiple agencies. The program model moved from delivery of a pre-designed program in which high fidelity was expected, to supporting and resourcing the delivery agencies to adapt and use core materials in ways that were appropriate to their own setting. This required a significant change in the program theory and a change in evaluation methods. Realist evaluation is intended to be iterative, gradually developing and refining program theory through recurrent rounds of evaluation.  

This paper demonstrates how it can respond to transformation within programs. It will trace the evolution of the evaluation, demonstrating the changes in program theory, evaluation questions and methods required as the program evolved. Key findings from the final round of the evaluation will also be presented. 

Chairs
avatar for Keren Winterford

Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years' experience working in the international development sector, in multiple capacities with Managing Contractors, NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous... Read More →

Speakers
avatar for Bronny Walsh

Bronny Walsh

Research and Evaluation Partner, Walsh & Associates
My background is in criminology and I currently manage the SA Drug Use Monitoring in Australia (DUMA) research program, which involves interviewing police detainees about drugs and crime. I also do a lot of work with Realist Evaluation trying to unpack why, and in what circumstances... Read More →


Friday September 21, 2018 12:00pm - 12:30pm AEST
Chancellor 5

12:00pm AEST

Inclusive Systemic Evaluation: Gender equality, Environments, Marginalised voices for Social justice (ISE4GEMS) - A New UN Women Approach for the SDG Era
Anne Stephens (James Cook University)

This presentation will introduce participants to a systemic thinking evaluation guidance produced by UN Women. The ISE4GEMS is a new approach for the Sustainable Development Goals era which, due to the many interrelated and interconnected SDGs, requires evaluators to think systemically, systematically and intersectionally. We introduce the GEMs framework - a framework for complex and systemic intersectional analysis which calls attention to culturally appropriate and ethical practices in the evaluation planning, conduct, analysis and dissemination phases. The ISE4GEMS seeks to promote social transformation by understanding complex phenomena through a systemic approach and, importantly, by building evaluation capacity at every stage. The GEMs framework invokes an ethical imperative in the systemic methodological approach: to hear from different voices, values and forms of evidence to promote fairness, equity, accessibility and sustainability. This presentation will discuss both the theory and learned practice of its application with the UN and other global participants.

Chairs
avatar for Joanna Farmer

Joanna Farmer

Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Masters of Evaluation at Melbourne Uni.

Speakers
avatar for Kathryn Meldrum

Kathryn Meldrum

Educational Designer, James Cook University
Kathryn is an emerging evaluator drawing on close to twenty years of project conceptualisation, design, conduct and reporting experience. She has worked in the education, health and crime prevention sectors.
avatar for Jill Thomas

Jill Thomas

Manager Student Success, James Cook University
Jill Thomas is a professional evaluator with background in business analysis, program funding, contract management, project reviews and reporting. Jill’s experience spans the higher education sector, Aboriginal health care and finance sectors, including not for profit and commercial... Read More →


Friday September 21, 2018 12:00pm - 12:30pm AEST
Conference centre

12:05pm AEST

Charting a course through unpredictable seas: How Amaze is using evaluative approaches to adapt to large-scale sector reform without losing sight of long term outcomes
Natasha Ludowyk (Ludowyk Evaluation), Braedan Hogan (Amaze)
The roll-out of the NDIS requires the disability sector to be more agile and evidence-based than ever before. Amaze (the peak body for Autism in Victoria) has invested in significant organisational transformation to meet the requirements of the new system, including how impact is measured, balanced with the capacity to influence system reform through advocacy.
Braedan Hogan (Amaze) and Natasha Ludowyk (Ludowyk Evaluation) will describe the challenges of transforming to deliver services and advocacy, and to meaningfully measure impact, in a rapidly evolving sector, and how evaluative strategies have been applied to each of these within a holistic MEL framework.

Chairs
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →

Speakers
avatar for Braedan Hogan

Braedan Hogan

Manager, Public Affairs and NDIS Transition, Amaze
Braedan is the Manager of Public Affairs and NDIS Transition at Amaze, the peak body for autistic Victorians, their families and supporters. Braedan oversees a range of areas including public affairs and advocacy, the information service and peer support program, along with leading Amaze’s approach... Read More →
avatar for Natasha M Ludowyk

Natasha M Ludowyk

Founder, Ludowyk Evaluation
I am a social researcher and program evaluator, experienced in conducting both qualitative and quantitative research and evaluation with a broad range of populations to inform co-design, program improvement, monitoring and evaluation, communications and strategic decision-making... Read More →


Friday September 21, 2018 12:05pm - 12:10pm AEST
Chancellor 3

12:10pm AEST

Alcohol Culture Change: developing an overarching framework and method to evaluate activities under the VicHealth Alcohol Culture Change Initiative
Virginia Lewis (La Trobe University), Michael Livingstone (La Trobe University), Katherine Silburn (La Trobe University), Genevieve Hargrave (Victorian Health Promotion Foundation), Emma Saleeba (Victorian Health Promotion Foundation), Geraldine Marsh (La Trobe University)
The challenge: evaluate nine projects working to shift alcohol culture within different target groups. Funded by VicHealth’s Alcohol Culture Change Initiative (ACCI), the projects are delivered by different kinds of organisations. Our approach: using the Initiative’s underlying Alcohol Cultures Framework, we developed an overarching evaluation framework that highlights the similarities between the projects. This has supported development of a minimum set of common impact indicators to be used across all of the projects, and recommended questions for project communications and events. How is it going? This presentation will discuss the feasibility and usefulness of this approach to managing complexity.

Chairs
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →

Speakers
VL

Virginia Lewis

Associate Professor & Director of AIPCA, Australian Institute for Primary Care & Ageing, La Trobe University
I have been working as an evaluator and health services researcher for more than 25 years, working across the health sector and beyond. I have a special interest in evaluation of policies and programs implemented within complex systems. I am committed to working collaboratively with... Read More →


Friday September 21, 2018 12:10pm - 12:15pm AEST
Chancellor 3

12:15pm AEST

Improving the quality of suicide prevention programs: Strengthening the evidence-base with evaluation and collaborative partnerships
Michelle Kwan (Suicide Prevention Australia)
In a sector with a relatively ‘immature’ evidence base, is it possible to systemise quality improvement? When asked to project manage the design, development and implementation of a nationally coordinated, evidence-based, online resource to support service planning, delivery and continuous quality improvement, what proved to be the biggest challenge and time commitment?
Stakeholder engagement.
The Hub is an Australia-first resource, created to strengthen best practice in suicide prevention. It exists to support and inform Government and others involved in service planning and commissioning at a local and regional level, and is a useful reference tool for communities seeking to implement suicide prevention activities.

Chairs
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →

Speakers
avatar for Michelle Kwan

Michelle Kwan

Knowledge Exchange Manager, Suicide Prevention Australia
Michelle leads a National suicide prevention quality improvement program with a focus on building a stronger evidence base for suicide prevention in Australia. Michelle is interested in supporting organisations to improve their work through better evaluation, and is currently seeking... Read More →


Friday September 21, 2018 12:15pm - 12:20pm AEST
Chancellor 3

1:30pm AEST

'Drive out fear': creating space for evaluative thinking and speculation for practitioners and organisations
Carolyn Page (The Clear English Company), Susan Garner (Public Sector Policy Solutions (PSPS)), Rob Richards (Public Sector Policy Solutions (PSPS))

At a time when government and not-for-profit organisations are being asked to improve their practices in describing performance and applying the lessons from evaluation, it is important to take a tough look at some of the impediments. In an era of increased political exposure for senior managers, there is considerable risk in airing a policy or implementation failure — even though the 'take away' learning may be the most positive thing to emerge from it. In an Academic Symposium on 'Improving Performance Information — Developing an Entity Performance Story' hosted by the Department of Finance in 2016, Professor Brian Head, of the Institute of Social Science Research at the University of Queensland, noted the strength of institutional and cultural barriers to talking about negative outcomes, going so far as to suggest that there is almost no 'space' for this in government practice: 'We need to have confidential spaces in which we can have these discussions — a 'cone of silence'. We should make it a place we can really have these discussions.' 

At the team and individual level, the common divide between 'policy' and 'program' expertise can also result in forms of organisational silence that rob policy debate of essential insights. For these practitioners, too, there may be no safe 'space' to share insights and to speculate about theory and practice, to spot risks, or debate alternative approaches to our pressing policy challenges. 

From their private practice in policy analysis, evaluation, evidence, and organisational change management—and from their three-year collaboration as public policy and evaluation trainers—the presenters will provide suggestions about 'what works', focussing on the principles of organisational and individual learning. 

Chairs
MN

Marion Norton

Manager Research and Evaluation, Qld DJAG

Speakers
avatar for Susan Garner

Susan Garner

Director, Garner Willisson
I see myself as an 'accidental' evaluator having come from a career in science and public policy. I managed my first evaluation project as a policy analyst in the health portfolio. With post graduate qualifications in public policy and public health I found a natural affinity to evaluative... Read More →
avatar for Carolyn Page

Carolyn Page

Director, The Clear English Company
Carolyn Page built policy and evaluation units in three government agencies and is now an independent public policy analyst and trainer. In 2001 she developed the analytic tool 'Policy Logic' to help agencies to evaluate programs and policies in the context of government priorities... Read More →


Friday September 21, 2018 1:30pm - 2:30pm AEST
Chancellor 4

1:30pm AEST

Designing better surveys: from zero to hero
Dan Borg 

The online or hard copy survey is one of the go-to data collection tools in the evaluator's toolbox. Easy-to-use online software is making these kinds of surveys more accessible than ever before, with hundreds of questions for the budding survey designer to choose from. But what makes a quality survey? How do you know that your survey is well constructed and has the right kinds of questions, designed to elicit high quality and reliable responses? 

In this skill-building session, aimed at those new to survey questionnaire design, we will explore the art of designing a good survey questionnaire.

We will work through the fundamentals of survey design, including overall structure, common question types and good practice in their use. We will also work through some common ways in which the design of surveys can influence responses (either increasing or decreasing reliability). By understanding how we can commonly go wrong in the design of surveys, we will highlight strategies for avoiding these problems. 

Participants will leave the session with an increased understanding of:
  • The role of the survey designer in influencing the reliability of survey results
  • Common survey problems (and strategies for how to avoid these problems) 
  • Foundational principles for good survey design

Chairs
avatar for Tracey McMartin

Tracey McMartin

Department of Foreign Affairs and Trade
program evaluation, utilisation focused evaluation, M&E systems

Speakers
avatar for Dan Borg

Dan Borg

Independent consultant
Dan is an evaluator with a background in environmental science and a PhD from the University of Melbourne. Dan has experience conducting evaluations in natural resource management, emergency management and health, in the public service and not-for-profit sectors. Dan is passionate... Read More →


Friday September 21, 2018 1:30pm - 2:30pm AEST
Chancellor 6

1:30pm AEST

Strengthening the professionalisation of evaluation in Australia, workshop 2
AES Learning & Professional Practice committee

In 2017 the AES commissioned Better Evaluation and ANZSOG to explore options for strengthening the capacity and professionalisation of the evaluation sector. The report explores options to:

  • increase motivation
  • increase capacity
  • increase opportunities.

The Learning and Professional Practice Committee (LPP) of the AES Board is interested in your views about priorities for skill development, learning pathways, embedding professional competencies and opportunities to increase demand for and strengthen the operating environment for evaluation.

We are holding two workshop-style sessions and participants are invited to attend either one or both.
Workshop 1 will identify and discuss issues of most interest and concern to members. Workshop 2 will build on the first, and help shape the direction for the AES in strengthening the professionalisation of evaluation in Australia.

The outcomes of the workshop sessions will be shared at the conference closing plenary.

Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →
avatar for Sue Leahy

Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program... Read More →
avatar for Delyth Lloyd

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.


Friday September 21, 2018 1:30pm - 2:30pm AEST
Chancellor 5

1:30pm AEST

Umbrellas and rain drops: Evaluating systems change lessons and insights from Tasmania
Jess Dart (Clear Horizon), Anna Powell (Beacon Foundation), Ebeny Wood (Beacon Foundation), Jo Taylor (Paul Ramsay Foundation), Kitty te Riele (Peter Underwood Centre, University of Tasmania)

There is a gradual shift in realisation that intractable or wicked problems are going to require different types of solutions - and different ways of working together. There has been considerable energy in setting up and establishing collaborative initiatives to disrupt and change systems. These initiatives don't fit the usual confines of a program or service: they often work across sectors, are emergent, and have long establishment phases. With the challenge of how to work with this new kind of initiative comes the challenge of how we meaningfully evaluate in this space. It's a big topic, and in this panel we focus on evaluation in the establishment phase of a systems change initiative. 

Early ideas range from diagnosing what is holding the system in a non-optimal state, to looking for key anchors that enable systems change such as adaptive leadership, collaborative health, and trusting relationships.

This panel session brings you four different perspectives on the topic of evaluating initiatives with systems change endeavours: the philanthropist funder, the backbone leader, the project director and the evaluators. Each presents their challenges and their ideas for how to evaluate systems change projects, drawing on a real example of a five-year project across five schools here in Tasmania.

Chairs
avatar for Annette Madvig

Annette Madvig

Director, Nous Group
Annette has eighteen years of experience in the design, management and evaluation of complex policy and programs. She has worked in government and non-government settings in Australia and internationally, principally in Timor-Leste. She co-leads Nous' evaluation practice.

Speakers
avatar for Jess Dart

Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →
avatar for Galina Laurie

Galina Laurie

Paul Ramsay Foundation
My background is in social policy development and implementation. I am a newbie to evaluation. I am interested in how we can learn from projects aiming to bring about complex change - the results, the impacts on participants, and the ways to effect changes in systems to support equity... Read More →
AP

Anna Powell

Collective ed. State Backbone Lead, Beacon Foundation, Collective ed.
Anna is driven by a purpose to address the causes of inequality in Australia. With over 15 years of experience in building networks and coalitions for social change, Anna is currently the Collective ed. State Backbone Lead, working with a network of leaders and organisations across... Read More →
avatar for Kitty te Riele

Kitty te Riele

Professor, University of Tasmania
My research aims to enhance opportunities to access, participate and succeed in education, especially for young people from disadvantaged and under-represented communities. This includes evaluation research of initiatives that share that aim. I lead the research portfolio at the Peter... Read More →
avatar for Jo Taylor

Jo Taylor

General Manager, Paul Ramsay Foundation
Over the past 20+ years Jo has worked in the UK and Australia leading philanthropic foundations for families, corporations and government departments. Jo loves working on knotty problems, alongside communities, and working with people with differing and diverse experience in the... Read More →
avatar for Ebeny Wood

Ebeny Wood

Collective ed. Director, Beacon Foundation, Collective ed.
Ebeny came to Beacon Foundation from the University of Tasmania, where she was undertaking her doctorate on secondary student school disengagement. Her honours work was also in the field of education, with a focus on social change and schooling. Ebeny has been with Beacon since October... Read More →


Friday September 21, 2018 1:30pm - 2:30pm AEST
Chancellor 3

1:30pm AEST

Into the great wide open (data): Understanding and using big and open data in evaluations
Jessie Wilson (Allen and Clarke)

The idea of big data and open data - and the increasingly inevitable incorporation of these approaches into evaluations - is terrifying for some and tantalising for others. For those falling into the former category, a lack of understanding, familiarity, and/or confidence in approaching big/open data has the potential to limit one's own evaluative practice. In other contexts, limitations with and/or misapplications of big/open data can also impact on the validity and credibility of the evaluation designs and findings we produce.

The purpose of this interactive AES conference session is two-fold. We will: 1) address these fears, concerns, and limitations about use of big/open data in evaluations; and 2) begin to learn how to use these approaches in our own evaluative practices. Although I have a strong quantitative research background, I am just beginning my own big/open data journey within an evaluation context. As such, I promise to be encouraging and honest about how we evaluation professionals can start to become, in the words of Michael Bamberger, more 'sufficiently conversant' with these new approaches and begin building them into our ever-transforming toolkits to enhance how we evaluate policies, programs and interventions.

With the above purposes in mind, the session will use a World Café approach and practical, real-world Australasian examples to discuss and share learnings about:
  • what big data and open data are and are not, and the differences between these approaches; 
  • evaluative situations in which the use of big/open data is and is not appropriate, framed by various considerations (e.g., evaluand, evaluation methodology, evaluation questions and criteria, stage in the evaluation's project cycle); and
  • limitations of big/open data use in evaluations (e.g., data reliability and quality, ethics, consent) and management of these limitations.
Participants will also be provided with a guide for how to assess big/open data quality within an evaluation context.

Chairs
avatar for Katherine Pontifex

Katherine Pontifex

AES22 Program Co-Chair. Manager, Evaluation Services, Wellbeing SA
Katherine is the Manager, Evaluation Services at Wellbeing SA. She is an experienced evaluation expert with an extensive background working in government on health and social programs, policies and systems. Her evaluation practice is firmly grounded in a pragmatic approach with an... Read More →

Speakers
avatar for Jessie Wilson

Jessie Wilson

Senior Associate (Evaluation + Research), Allen and Clarke
I am a senior evaluator, researcher and project manager with 10+ years’ experience working with public and private sector agencies. In addition to my workshop topic (incorporating big/open data into evaluation design and implementation), you can also talk to me about cultural competency... Read More →


Friday September 21, 2018 1:30pm - 2:30pm AEST
Conference centre

2:30pm AEST

Closing plenary: It’s the AES18 Great Debate and it’s going to be huge!
Bear witness to the battle of the biggest evaluation brains as two teams fight it out to reign supreme. Six experts will use their evaluation knowledge, evaluative logic, wit and charm to win the day and prove that they are the better side. The topic is contentious and opens the way for a synthesis and lively critique of the conference theme. The competition will be heated and the insights will be world-class.
The AES18 Great Debate topic is:

Evaluation as a profession will be replaced by artificial intelligence and we should all be looking for new jobs

The debate will be battled out during the final conference plenary session with Duncan Rintoul, our MC, in the moderator’s chair.
The affirmative team will argue that artificial intelligence will replace the role traditionally played by evaluators, with some version of Siri or Alexa drawing on all existing evidence to produce an instant evaluation of a policy or program for minimal cost. 
The negative team will argue that evaluation is even more relevant than ever, and that artificial intelligence will simply help evaluators perform an even better and more in-demand role.

Followed by:
Conference close, AES President
Handover to aes19 conference

Chairs
avatar for Duncan Rintoul

Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: evaluation capacity building; design thinking and innovation; evaluative practice in education, particularly in schools; public sector... Read More →

Speakers
avatar for Lyn Alderman

Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has a deep disciplinary knowledge and experience in program evaluation, evaluation frameworks... Read More →
avatar for Penny Hagen

Penny Hagen

Co-design Lead, Auckland Co-design Lab
Penny assists organisations and teams to apply co-design and design-led approaches to the design and implementation of strategy, programs, policies and services. She specialises in projects with social outcomes and supports capability building with teams and organisations wanting... Read More →
avatar for Kate McKegg

Kate McKegg

Director, The Knowledge Institute
Kate has worked in the fields of evaluation, evaluation capacity building, research, policy and public sector management since the late 1980s. She has developed specialist skills in developmental evaluation, programme evaluation, evaluation capacity building, strategy, policy, research... Read More →
avatar for Karol Olejniczak

Karol Olejniczak

Assistant Professor, University of Warsaw, Centre for European Regional and Local Studies
Karol Olejniczak is an Assistant Professor of public policy at EUROREG - University of Warsaw, Poland, and a visiting scholar at The George Washington University, Washington D.C. He is also a co-founder of policy research company Evaluation for Government Organization (EGO s.c.). His... Read More →
avatar for Gill Westhorp

Gill Westhorp

Professorial Research Fellow, Charles Darwin University
Gill leads the Realist Research Evaluation and Learning Initiative (RREALI) at Charles Darwin University. RREALI develops new methods and tools within a realist framework, supports development of competency in realist approaches and provides realist evaluation and research services... Read More →


Friday September 21, 2018 2:30pm - 4:00pm AEST
Conference centre
 