Welcome cocktail reception, Tuesday 18 September 6:00–8:00pm, open to all delegates! Venue: Penny Royal
 
Chancellor 5
Monday, September 17
 

9:00am AEST

Valuing social outcomes to demonstrate impact (Taimur Siddiqi)
This interactive workshop will focus specifically on how to undertake a Social Return on Investment (SROI) analysis to value outcomes as part of ongoing monitoring and evaluation (M&E) activities, using M&E data. It will also encourage participants to consider the benefits and challenges of valuing outcomes. The workshop is based on peer learning, with a series of cooperative learning exercises and opportunities for group discussion. Participants will be asked to bring their own examples and will be provided with take-home templates and resources to assist them with their analyses.

Speakers
Taimur Siddiqi

The Incus Group
Taimur is an experienced evaluation and impact measurement professional who is Managing Director of The Incus Group and a current member of the AES Pathways Committee. In his consulting role, he has completed numerous evaluation and impact measurement projects, working with a range...


Monday September 17, 2018 9:00am - 5:00pm AEST
Chancellor 5
 
Tuesday, September 18
 

9:00am AEST

Behaviour architects: a game that applies behavioural insights to improve policy solutions (Karol Olejniczak)
The workshop is designed in the form of a game, with case studies that provide participants with engaging yet research-based learning experiences. All levels from beginner to advanced are welcome, and the workshop is aimed at those interested in how to evaluate and improve public policy, as well as anyone wanting to experience a game-based workshop!

Speakers
Karol Olejniczak

Assistant Professor, University of Warsaw, Centre for European Regional and Local Studies
Karol Olejniczak is an Assistant Professor of public policy at EUROREG - University of Warsaw, Poland, and a visiting scholar at The George Washington University, Washington D.C. He is also a co-founder of the policy research company Evaluation for Government Organization (EGO s.c.). His...


Tuesday September 18, 2018 9:00am - 5:00pm AEST
Chancellor 5
 
Wednesday, September 19
 

11:00am AEST

Big data, big possibilities, big challenges: Lessons from using experimental designs in evaluation of system-level educational reforms
Duncan Rintoul (NSW Department of Education), Ben Barnes (NSW Department of Education), Ian Watkins (NSW Department of Education)

For many evaluators, quasi-experimental designs fall at the first set of hurdles, due to the absence of readily available data sets and the difficulties associated with identifying appropriate comparison/control groups. At the NSW Centre for Educational Statistics and Evaluation (CESE), we have been fortunate to clear these first hurdles on occasion, only to then hit the second set: the technical challenges of working with big data.
 
This paper is a chance for participants to get their hands dirty... or at the very least to hear the stories of people with dirty hands. The presenters are senior practitioners - the Director of CESE's evaluation unit and the Principal Data Analyst responsible for statistical modelling. The paper will lift the lid on this important (but uncommon) aspect of evaluation practice: the models they build; the data management challenges they face; the internal political challenges they face; the statistical methods that bear more - or less - fruit; and how they translate 'heavy quant' back into actionable insights for policy and program management. 

Through a set of case studies, the presenters will draw out practical lessons and tips for making these designs work - including what the team has needed in terms of skillsets, models, software, datasets, mindsets and other complementary elements of evaluation design that sit alongside the quant.

Chairs
Keryn Hassall

Aptavit
I'm interested in how people think - about evaluation, about policies and programs, about their work, and how people think about other people. I have two primary areas of professional focus: (1) Organisational capability and practice change - using organisation theory and practical...

Speakers
Ben Barnes

Director Evaluation, NSW Department of Education
I began in consultancy, and made the move to the public sector in 2012. I am now Director of Evaluation at the Centre for Education Statistics and Evaluation in the NSW Department of Education. We have a team of over 30 internal evaluators, data analysts and evaluation capacity builders...
Duncan Rintoul

Manager, Evaluation Capacity Building, NSW Department of Education
I have been working in social research and evaluation since 2000. My favourite things to chat about, apart from my kids and how good Wollongong is: evaluation capacity building; design thinking and innovation; evaluative practice in education, particularly in schools; public sector...
Ian Watkins

Principal Data Analyst, NSW Department of Education
Ian has a background in psychology and statistics. He studied and taught at the University of New South Wales before moving to the public sector. He is now the Principal Data Analyst at the Centre for Education Statistics and Evaluation in the NSW Department of Education where he...


Wednesday September 19, 2018 11:00am - 12:00pm AEST
Chancellor 5

12:00pm AEST

Size Matters: Quantifying the Size of the Challenge Through Big Data, Analytics and Evaluative Thinking
Rico Namay (Ministry of Education, NZ)

Although still at an early stage in terms of unleashing the full extent of what they can offer to policy-setting, big data and analytics combined with evaluative thinking are a potent mix that could have a real and substantial impact on the way policy is set and on how research and evaluations are conducted in the future.

This presentation shows how the transformative power of linked government data and analytics, combined with the ability to ask the right questions, helps:
  • evaluate policy options,
  • make value-for-money assessments,
  • target participants for intervention programs, and
  • set specific and measurable goals for organisations, school clusters in particular.
 
Reflections on some lessons gleaned from the application of big data and analytics follow the examples.

Chairs
Keryn Hassall

Aptavit
I'm interested in how people think - about evaluation, about policies and programs, about their work, and how people think about other people. I have two primary areas of professional focus: (1) Organisational capability and practice change - using organisation theory and practical...

Speakers
Rico Namay

Principal Analyst, Ministry of Education, New Zealand
Analytics, research and evaluation enthusiast; music and film fan; food lover; and not necessarily in that order. Although Rico has no degrees in music, film or food, he holds degrees in Mathematics and Applied Mathematics and has done studies in Statistics, Econometrics and Optimisation. Rico...


Wednesday September 19, 2018 12:00pm - 12:30pm AEST
Chancellor 5

1:30pm AEST

Outcomes, dashboards and cupcakes
Jenny Riley (Navigating Outcomes Pty Ltd)

Outcomes-based performance management is heading our way. We know it, and at Windana we are getting ready, but we wanted an outcomes measurement framework that works for us and our clients; one that is meaningful, robust and proportionate. We wanted it not to be a tick-box, top-down administration burden but something that could add value to our work and perhaps even drive it.

With this in mind, Windana embarked on an outcomes measurement journey in April 2017. Our ambition was to introduce real-time outcome measurement into our 35-bed therapeutic community in Maryknoll, Victoria. We took the time to build the skills and knowledge of our team about the difference between outcomes and outputs, and our residents participated in a 'theory of change' workshop, allowing us to identify our short, medium and long term outcomes.
The consultants worked with us to recommend validated tools to collect data that aligned with our intended outcomes. We envisioned a dashboard through which this data could be fed back to our clients and staff in real time. We launched our dashboards on 4th December (this is where the cupcakes come in) and have been collecting and using the data to support our work in Maryknoll.

We will present feedback from staff and clients six months into using our live dashboards: Was it worth it? Is it adding value to our work? What are we learning?

In concluding this paper, we will share the process, including what worked well and what we could have done better. We will share our recommendations for setting up outcome measurement in other therapeutic communities and programs in the AOD sector, and our 'what next' thinking about shared measurement across the sector and opportunities for data linkage.

Chairs
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...

Speakers
Clare Davies

Executive Director Rehabilitation Services, Windana Drug and Alcohol Recovery Inc
Jenny Riley

Founder and Director, Navigating Outcomes
I help organisations and collaborations access and learn from their data so they can improve social outcomes for individuals, families and communities. My passion is to bring together digital solutions (cloud, mobile, big data analytics, social media) and the social sector. I develop...


Wednesday September 19, 2018 1:30pm - 2:00pm AEST
Chancellor 5

2:00pm AEST

New evaluation techniques for the transformation of Melbourne: time-and-place targeting technology and the decline of the 300-page evaluation report
David Spicer (Colmar Brunton), Kirstin Couper (Colmar Brunton)

The impact of disruptions initiated by the transformation of transport infrastructure is a hot topic in Melbourne. Improvements to crucial arterial roads and public transport corridors mean there is a lot for Melburnians to consider when planning a journey. We will share the results from an evaluation of the impact of twelve infrastructure disruptions, each covering different locations, time periods and transport modes. Historically, there has been concern that traditional lagging indicators from online and phone surveying could not capture accurate or timely recall of travel experience. We overcame this limitation using 'geo-targeted sampling' as part of a suite of methodologies: targeted surveys on mobile devices using GPS data to identify individuals who had been present at a specific location at a specific time.

There was no traditional 'Evaluation Report' for this study, nor did we use static 'scorecards' or similar devices across the twelve disruptions. Instead, we shaped the way that policy-makers and planners could interrogate the data relevant to their area by providing a series of interactive online dashboards. The dashboards enabled the dissemination of findings and created a space where a broad range of stakeholders could test their own hypotheses; stakeholders who may not have been able to answer their own research questions using traditional, static report or scorecard materials. This did not de-value the role of the evaluator, who was always on hand to aid with interpretation and translation of data into insights. Rather, it empowered clients and their stakeholders to take control of their own data. We will demonstrate the dashboard outputs in our presentation.

Chairs
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...

Speakers

David Spicer

Research Director, Colmar Brunton


Wednesday September 19, 2018 2:00pm - 2:30pm AEST
Chancellor 5

2:30pm AEST

Leveraging publicly available longitudinal and transactional data sources to create comparison groups in quasi-experimental and natural experimental evaluation scenarios
Gerard Atkinson (ARTD Consultants)

One of the challenges faced by evaluators is how to effectively determine the impacts of a program when a control group is not readily available. Sometimes the design of the program makes such groups impossible or unethical to create (e.g. mandatory or selective participation), or constraints on resources and scope make such investigations infeasible.

These challenges have led to the development of quasi-experimental and natural experimental approaches to evaluation. In parallel with the adoption of these techniques, the shift to policies of "open government" has enabled greater public access to data. Many of these data sets capture transformations in society over time, or provide records of how people have interacted with government and public services. In the right situations, these data can be used to augment impact evaluations by creating comparison groups for analysis.

This presentation looks at a variety of publicly available data sources, ranging from large-scale longitudinal studies such as HILDA, to geographic data such as the Geocoded National Address File (G-NAF), to transactional data such as public transport journeys. These data sets can be used to enhance the robustness of quasi-experimental and natural experimental evaluations. Through exploring example data sets and case studies, we consider the challenges of identifying and preparing such data, the privacy and ethical implications, and the value that such data can add to the evaluation process.

Chairs
Eleanor Williams

Assistant Director, Centre for Evaluation and Research, Department of Health and Human Services Victoria
In my current role as Assistant Director of the Centre for Evaluation and Research at the Department of Health and Human Services, I am working to build an evaluation culture across a large government department of over 12,000 employees. My team and I aim to improve the use of data...

Speakers
Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: program and policy evaluation; workshop and community facilitation; business analytics and data visualisation; market and social research; financial and operational modelling; non-profit, government and business...


Wednesday September 19, 2018 2:30pm - 3:00pm AEST
Chancellor 5

3:30pm AEST

Buka Hatene - an innovative model promoting adaptive management for improved aid effectiveness in Timor-Leste
Louise Maher (M&E House, Timor-Leste)

In Timor-Leste, an innovative approach to transforming monitoring, evaluation and learning capacity and quality for improved aid effectiveness has been developed. The Australian government has established M&E House, a monitoring and evaluation focussed facility designed to contribute towards a high performing and continually improving development program by ensuring that (1) the Australian Embassy is equipped with evidence and capacity to continually improve decision-making and tell a clear performance story, and (2) that implementing partners generate and use evidence to learn, adapt and produce user-focused reports.

M&E House will transform current practice into a whole-of-program adaptive performance management system. M&E specialists implement the program, supported by an Australian organisational partnership.  M&E House has facilitated the development of a whole-of-program performance assessment framework identifying shared outcomes and indicators for improved integration, collaboration and reporting across program boundaries, and will develop an underpinning information management system. Strategic reviews on cross-sectoral issues provide evidence for improved systems-level programming. Implementing partners are facilitated to develop and implement M&E plans, apply adaptive practice, and improve reporting. Evaluation capacity building is focussed on improving foundational capabilities, changing mind-sets, and building motivation.    

The M&E House model allows for application of a single M&E approach, which is utilisation focussed, realist, and consolidates evidence from mixed-methods. It enables M&E methods to be trialled, improved and scaled out. It ensures M&E expertise is accessible to stakeholders, and keeps M&E front-of-mind for implementers. A lean and influential approach ensures targeted information is available for decision-makers. It allows for trusting relationships to develop, to ensure stakeholder participation and engagement in improving program performance. 

Baseline data on M&E systems justifies the need for an innovative solution, and early evidence after one year indicates that the M&E House model may be an effective and relevant solution to transforming MEL systems for improved aid effectiveness in Timor-Leste.

Chairs
Speakers
Louise Maher

Team Leader, M&E House (Buka Hatene)
Louise Maher is the Team Leader of M&E House, which supports the Australian development program in Timor-Leste to tell a clear results story. Louise has a background in physiotherapy, public health, and international development. She is interested in building organisational motivation...


Wednesday September 19, 2018 3:30pm - 4:00pm AEST
Chancellor 5

4:00pm AEST

Lessons on designing, monitoring, evaluating and reporting for policy influence programs
Ikarini Mowson (Clear Horizon), Byron Pakula (Clear Horizon)

Development aid is transforming from direct service provision to influencing policies to promote systemic change and achieve development outcomes. More and more aid programs are seeking to become catalytic drivers through influencing policies. These influencing programs have some distinct elements that mean traditional approaches to design, monitoring, evaluation and reporting are not as relevant. 

Drawing on experience in facilitating and developing monitoring and evaluation frameworks for policy influence programs, this paper presents some practical lessons that can be applied by designers, managers and evaluators. 

First, we must understand the five main characteristics of policy influence programs: complexity; unpredictable links between cause and effect; scope and scale that may shift; policy goals and commitments that may change; and outcomes and impacts that may be delayed. Second, there needs to be a clear definition of the policy changes expected in the programs. Policy changes may be defined broadly, allowing the program to capture policy decisions and processes, including implementation; they could also be defined to capture every step in the policy cycle. Third, use people-centred approaches to theory of change, including stakeholder analysis, in order to step out causal pathways and make sure intermediate outcomes are clearly articulated. Fourth, monitoring systems can be strengthened by using light-touch approaches, such as an influence log, to capture the intricate details that may or may not turn out to be triggers of change. Fifth, apply multiple evaluation methods to measure influence, particularly methods that assess the contribution of an intervention to policy change rather than just its outputs or outcomes. Evaluating contribution is more realistic, cost-effective and practical than seeking to establish attribution or using an experimental approach. Outcome harvesting tools such as outcome mapping, episode studies or significant instances of policy and systems improvement (SIPSI) could be used in the evaluation.

Chairs
Speakers
Rini Mowson

Consultant, Clear Horizon
Rini has been working in the international development and Australian community development sectors for more than thirteen years. She has worked with a broad range of social change initiatives and businesses globally. Rini has extensive experience in designing and implementing monitoring...
Byron Pakula

Principal Consultant, Clear Horizon
Byron has gathered a broad array of professional experience working for government, private and not-for-profit sectors internationally and in Australia for over fifteen years. Byron is a well-respected and influential manager, strategist and adviser who has applied himself to some...


Wednesday September 19, 2018 4:00pm - 4:30pm AEST
Chancellor 5

4:30pm AEST

The potential for system level change: Addressing political and funding level factors to facilitate health promotion and disease prevention evaluation
Joanna Schwarzman (Monash University), Ben Smith (The University of Sydney; Monash University), Adrian Bauman (The University of Sydney), Belinda Gabbe (Monash University), Chris Rissel (NSW Ministry of Health), Trevor Shilton (National Heart Foundation, Western Australia)

Despite the known importance of evaluating prevention initiatives, there are challenges to conducting any evaluation, and efforts can fall short in terms of quality and comprehensiveness. Evaluation capacity building research and strategies have to date focused on individual and organisational levels. However, the factors acting to influence evaluation practice at the level of the prevention system have not been explored. 

We conducted a national mixed-methods study with 116 government and non-government organisations that sought to identify the factors that influence evaluation practice in the prevention field. Participating organisations took part in three phases of data collection: qualitative interviews (n=40), a validated evaluation practice analysis survey (n=216, 93% response rate), and an audit and appraisal of two years of evaluation reports (n=394 reports).

In this presentation we focus on determinants of evaluation practice at the prevention system level. We found the system played a key role in driving demand for evaluation; however, it also presented significant challenges, particularly through time-limited funding agreements and mismatched expectations between policy makers and funded agencies. The political and funding contexts impacted on the resources available for prevention programs and the purpose, scope and reporting requirements of evaluation. We also found some prevention organisations were proactive in negotiating and modifying elements of the political, contextual and administrative requirements to improve the conditions for evaluation. Other organisations, with less evaluation capacity, resources and experience, were not in a position to engage in advocacy to the same degree.

Evaluation capacity building is an increasingly important component of many evaluators' roles, and there are still important gains to be made within prevention organisations and government agencies. This research builds on insights concerning organisational-level influences, and can guide evaluators, practitioners and policy makers.

Chairs
Speakers
Joanna Schwarzman

PhD Candidate, Monash University
I'm in the final stages of completing my PhD research at the School of Public Health and Preventive Medicine, Monash University. Over the last three years I've been working to identify and understand the factors that influence evaluation practice in health promotion and disease prevention...


Wednesday September 19, 2018 4:30pm - 5:00pm AEST
Chancellor 5
 
Thursday, September 20
 

9:30am AEST

Transforming evaluation culture and systems within the Australian aid program: Embracing the power of evaluation to promote learning, transparency, and accountability
David Slattery (Department of Foreign Affairs and Trade), Tracey McMartin (Department of Foreign Affairs and Trade)

Evaluation is a core means of assessing the effectiveness of Australian aid. Over the past five years, DFAT has progressively transformed its evaluation culture and systems from one of non-compliance with policy requirements and non-publication of results to what is now a structured and systematic approach to assessing and providing feedback on performance. Once a focus of strong external criticism of Australia's aid administration, evaluation is now regarded as one of its biggest strengths. This paper will identify and examine the key drivers of this transformation, including the importance of strong institutional leadership; clarity over accountabilities for delivering evaluations; flexibility to determine priorities and to design evaluations that will address these priorities; realism about the capacity of programs to commission and use evaluations; mechanisms for protecting the independence of evaluations; and a culture that values independent viewpoints and contestation and is willing to be transparent about the challenges it faces.

Chairs
Michael Shapland

Director for Interoperability and Innovation, Office of the Inspector-General Emergency Management
Mike Shapland is Director for Interoperability and Innovation in the Office of the Inspector-General Emergency Management in Queensland. Before joining the Office, he worked with the Department of Emergency Services and Department of Community Safety from 2004 to 2013 in roles covering...

Speakers
Tracey McMartin

Department of Foreign Affairs and Trade
Program evaluation, utilisation-focused evaluation, M&E systems


Thursday September 20, 2018 9:30am - 10:30am AEST
Chancellor 5

11:00am AEST

The Enhanced Commonwealth Performance Framework - the opportunity for the Australian evaluation community
David Morton (Department of Finance), Brad Cook (Department of Finance)

The Australian Parliament - through the Joint Committee of Public Accounts and Audit (JCPAA) - has encouraged the Department of Finance (Finance) and others to support capacity-building to further implement the enhanced Commonwealth performance framework. Evaluators have a key role. They will need to be clear about what they have to offer, and how they can help deliver better performance information to government, the Parliament and the public more broadly. They will need to be willing to adapt what evaluators do and know today, and to participate in developing the flexible approaches needed in the future. The performance framework calls for approaches that deliver performance information that simultaneously supports accountability to the taxpaying public and everyday operational decisions. We encourage the Australian evaluation community to reflect on what it has to offer and how it can work with others to shape the evolution of the performance framework.

The performance framework commenced on 1 July 2015. It succeeds if it enables the Australian Parliament and public to understand the benefits of Commonwealth activity. The framework encourages entities and companies to move past over-reliance on input and output-based performance measures. There is a clear role for evaluators to contribute to this important adjustment. Opportunities lie in helping a larger cross-section of the Commonwealth public sector understand and use the evaluators' toolbox - for example, program theory and qualitative analysis - to improve the quality of published performance information available to stakeholders. The evaluation community has the opportunity to be at the centre of key expertise, and to make a critical contribution to building the capability of 'performance professionals' across the public sector.

Chairs
John Stoney

AES Board
I've been an AES Board Member since 2016 and currently chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this year's Conference the AAC is engaging with members to inform development of an AES Advocacy and...

Speakers
Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has deep disciplinary knowledge and experience in program evaluation, evaluation frameworks...

Brad Cook

Assistant Secretary - Public Governance, Performance and Accountability, Department of Finance
David Morton

Assistant Director, Department of Finance
David is an Assistant Director in the Department of Finance. From September 2014 to May 2018 he worked in various teams responsible for establishing the enhanced Commonwealth performance framework under the PGPA Act. David’s contribution included drafting guidance on developing...
David Roberts

Principal Consultant, RobertsBrown
David Roberts is a self-employed consultant with wide experience over 35 years in evaluations using both qualitative and quantitative methods. David has training and experience in Anthropology, Evaluation and Community Development. In 2016, David was awarded the Ros Hurworth Prize...
David Turvey

General Manager, Department of Industry, Innovation and Science
David Turvey is the General Manager of the Insights and Evaluation Branch within the Economic and Analytical Services Division of the Department of Industry, Innovation and Science. David oversees the Division's research and analytical work on the drivers of firm performance and the...


Thursday September 20, 2018 11:00am - 12:00pm AEST
Chancellor 5

12:00pm AEST

Developing an AES Advocacy and Influence Strategy: A consultation and co-design session for AES members

Influence is one of the key components of the AES 2015-2019 Strategic Plan. The AES Advocacy and Alliances Committee is developing an Advocacy and Influence Strategy in order for the AES to project its 'voice' and to enable it to better serve its members and the profession.

The Strategy is underpinned by the key principles of:
  • Collaboration: within the AES membership and between members and clients
  • Inclusiveness: sharing information and ideas with clients and members
  • Continual professional growth: within membership and clients
  • Professional service: on behalf of and to our members
  • Innovation: new ways to respond to new times

In keeping with these principles, the Advocacy and Alliances Committee is offering an opportunity for AES members to be involved during the Conference in a consultation and needs analysis session that will contribute to the design of the Strategy. The session will explore what needs or issues members have regarding advocacy and influence, and their thinking about the most relevant and useful approaches.


A background paper will be made available for participants to read prior to the session. 

Chairs
John Stoney

AES Board
I've been an AES Board Member since 2016 and currently chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this year's Conference the AAC is engaging with members to inform development of an AES Advocacy and...

Speakers
Alexandra Ellinson

Manager, Evaluation, Research & Policy, Institute for Public Policy & Governance, UTS
Alexandra delivers strategic evaluations, reviews and consultancies, working in partnership with clients. She is deeply knowledgeable about the social services sector and has particular expertise in interagency responses to complex issues, such as housing and homelessness, domestic...
Margaret MacDonald

Director, MacDonald Wells Consulting
Margaret is a leadership, public policy and evaluation consultant who works mostly on social and health policy issues. She has a passion for collaborative practice, systems thinking and linking ideas to practice. Margaret is particularly interested in the interplay between policy...



Thursday September 20, 2018 12:00pm - 1:00pm AEST
Chancellor 5

2:00pm AEST

Principles before rules: Child-centred, family-focused and practitioner-led evaluation in child protection
Stefan Kmit (Department for Child Protection)

Supporting a learning environment through evaluation requires more than just the monitoring of service indicators. The South Australian Department for Child Protection (DCP) is committed to principles-based evaluation processes with children, families and practitioners that acknowledge the improvement journey is just as significant as the final outcome. Known as 'the rudder for navigating complex dynamic systems' (Patton, 2018), principles-focused evaluations enable us to think beyond structures and processes to re-direct focus onto service user experience and outcomes. Our ability to form and reform service approaches based on what key stakeholders tell us underpins a continuous improvement and questioning culture.

This is all driven by a passion to identify how we can best (a) use the voice of the child; (b) shift our view of children and families from service users to service shapers; (c) orient findings towards practitioner learning; and (d) create more opportunities for closer collaboration across the board.

Recent 'evidence-informed practice' (Moore, 2016) evaluations of the DCP Young People's Council and the DCP Volunteer program have featured the voices of children and their families and carers within the system. Using these as case studies, we will examine the evaluation design and engagement strategies incorporated with children and families, and share critical learnings about applying a principles-focused approach.

Chairs
Rhianon Vichta

Research and Evaluation Coordinator, Brisbane Youth Service
Rhianon Vichta became an evaluator after spending more than 20 years delivering, designing, managing and working to improve social programs both in Australia and overseas. Throughout her career journey from crisis counsellor to CEO, she continued to seek answers to the fundamental...

Speakers

Stefan Kmit

A/Manager, Research and Evaluation, Department for Child Protection


Thursday September 20, 2018 2:00pm - 2:30pm AEST
Chancellor 5

2:30pm AEST

Youth Partnership Project: Applying place-based collective impact and evaluating for systems change
Maria Collazos (Save the Children)
‘Wicked problems’ demand a new way of thinking and working; one which moves beyond independent programs with isolated impact to a collaborative approach with a common goal. By rethinking the system and how it operates, we can discover new solutions with population-level impact. Being able to measure this impact is key. This practice-focused presentation explores systems change evaluation, using the place-based collective impact initiative the Youth Partnership Project (YPP) as a case study.
Despite significant investment in the community, there have been persistent issues of youth crime and anti-social behaviour in the south-east corridor of Perth. The YPP was formed as a strategic project to develop a cross-sector early intervention system for the region, and is a demonstration site for Western Australian reform. The project brings together a broad cross-section of partners to systematically identify the most vulnerable young people in the community and collaboratively address complex needs which are the responsibility of multiple agencies.
This presentation will delve into the challenge of evaluating systems change in initiatives with multiple levels of impact, from individual to systemic. We consider how impacts at these different levels affect one another, and draw on the YPP’s approach of using developmental evaluation to provide a framework for continuous learning, emergent strategies, and monitoring effectiveness and efficiency. Finally, we discuss cost-benefit analysis as an advocacy tool to articulate the need for prevention-focused collaboration and system reform.

Chairs
Rhianon Vichta

Research and Evaluation Coordinator, Brisbane Youth Service
Rhianon Vichta became an evaluator after spending more than 20 years delivering, designing, managing and working to improve social programs both in Australia and overseas. Throughout her career journey from crisis counsellor to CEO, she continued to seek answers to the fundamental...

Speakers
Maria Collazos

Project Design and Development Officer, Save the Children
BA Political Science and Grad Cert Social Impact. Maria works to strengthen the quality of program delivery through effective design of program logics and impact evaluation frameworks. Maria works at a strategic and an operational level, bringing together theory and practice. She...


Thursday September 20, 2018 2:30pm - 3:00pm AEST
Chancellor 5

3:00pm AEST

From outputs to outcomes: A system transformation approach for the Victorian child and family service sector
Emily Mellon (Centre for Excellence in Child and Family Welfare)

The Victorian child and family service sector is undergoing a profound transformation from a service system to a learning system where experimentation, rapid knowledge sharing and continuous improvement will be the norm. The sector is transforming from a disparate network of services to an integrated learning system that better serves Victoria's children and families. 

The learning system assumes a culture of inquiry, experimentation and learning which requires certain knowledge, skills and motivation akin to an evaluation capacity building (ECB) effort. This paper explores how the Victorian child and family services' Outcomes Practice and Evidence Network (OPEN) is supporting community sector organisations to move from outputs to outcomes to better serve vulnerable children and families. 

We will share the Outcomes Practice and Evidence Network approach, which has drawn on the Learning Organisation, Knowledge Translation and ECB literature to develop a framework for systemic capacity building. Significantly, there is an appreciation that both bottom-up and top-down efforts are required for system transformation; the challenges in cohesively framing and delivering these efforts will be discussed. In addition, some of the particular strategies used to bridge the gap between research and practice, demystify and improve evaluation practice, and support the sector to create, share and use better quality evidence will be presented. Importantly, we will focus on a specific case example from the child and family service sector to demonstrate the impact of our approach and the experience of system transformation efforts at the local level.

Chairs
Rhianon Vichta

Research and Evaluation Coordinator, Brisbane Youth Service
Rhianon Vichta became an evaluator after spending more than 20 years delivering, designing, managing and working to improve social programs both in Australia and overseas. Throughout her career journey from crisis counsellor to CEO, she continued to seek answers to the fundamental...

Speakers
Emily Mellon

OPEN Project Manager, Centre for Excellence in Child and Family Welfare
Emily is passionate about supporting sustainable social change through harnessing collective wisdom, nurturing inquiry-mindedness and encouraging the utilisation of evidence. Emily is an advocate for creating space to inspire and hear the perspectives and insights of young people...


Thursday September 20, 2018 3:00pm - 3:30pm AEST
Chancellor 5
 
Friday, September 21
 

9:00am AEST

Taking an intersectional approach to evaluation and monitoring: moving from theory to practice
Sarah Kearney (Our Watch), Anna Trembath (Our Watch), Elise Holland (Our Watch)

In recent decades, there have been growing efforts to apply intersectional theory to the fields of gender equality, health promotion, and other areas of social policy. While much of the focus so far has been on understanding how to apply an intersectional lens to policy and programming, of equal importance is the application of an intersectional approach to monitoring and evaluation, and its potential to reveal meaningful distinctions and similarities in order to better a) understand the impact of social interventions and b) monitor progress toward social policy outcomes.

The panel consists of practice specialists with expertise in evaluation and monitoring from Our Watch, the national foundation for the prevention of violence against women. Each panellist applies an intersectional approach to designing either project-level evaluations or monitoring frameworks for tracking population-level change. This panel will open by exploring the concept of intersectionality and its role in the development of transformative social policy.
Building on this theoretical understanding, the panellists will be interviewed by a facilitator on how they have embedded intersectionality into their monitoring and evaluation projects, drawing primarily on examples of violence prevention interventions and initiatives aimed at promoting gender equality. Examples will include the evaluation of a national cultural change campaign (delivered across digital platforms) and the development of a monitoring mechanism which tracks population-level progress towards the prevention of violence against women.

The panel will conclude with an interactive facilitated discussion. Audience members will be asked to interrogate evaluation case studies (provided by the panellists), discussing whether the examples are intersectional and identifying practical steps to advance an intersectional approach within each case study. At the conclusion of the panel, participants will be directed toward relevant resources to support them to move from 'inclusive' evaluations that simply recruit for diversity towards transformative, intersectional evaluation design.

Chairs
Jenne Roberts

Evaluation Manager, Menzies School of Health Research
Jenne is an international health evaluator, working in Indigenous health in Australia and international public health, mostly in South East Asia. Jenne is interested in identifying the efforts and interventions that spark positive social and health impact, and engaging intended beneficiaries...

Speakers
Loren Days

Loren Days is currently Senior Policy Advisor, Intersectionality at Our Watch, where she specialises in developing strategies to embed an intersectional approach across the organisation. She is a qualified lawyer who has experience in policy, human rights, legal and regulatory ref...
Sarah Kearney

Manager, Evaluation and Learning, Our Watch
I am an experienced social policy evaluation specialist with a passion for preventing gender-based violence. For the past few years, I've led evaluation at Our Watch, Australia's national foundation to prevent violence against women. Together with my colleagues, we've learned how...


Friday September 21, 2018 9:00am - 10:00am AEST
Chancellor 5

10:00am AEST

Q: Can realist evaluations be designed to be more suitable for use in Indigenous contexts? (A: It depends)
Emma Williams (Northern Institute, CDU), Kevin Dolman (Northern Institute, CDU)

Realist evaluation has been growing in popularity over the past 20 years, and is now being used in Australian and Canadian Indigenous contexts. This presentation, developed by realist Indigenous and non-Indigenous colleagues, looks at how (and to what degree) realist evaluations can be designed to be culturally safe, and more suitable for use in different Indigenous contexts. We note that a single proposed solution is impossible, given the diversity of Australian Indigenous peoples, and describe issues that arise in different Indigenous contexts. One area of innovation is methods, identifying how techniques developed in a European context - such as realist interviewing - have been and can be further adapted to suit preferred ways of sharing information in different Indigenous contexts. More challenging is understanding how the ontology and epistemology of realist evaluation, and particularly its understanding of causation, align with the ontologies, epistemologies and understandings of causality of different Indigenous peoples. Steps towards a cross-cultural understanding of realist philosophy are presented, together with the challenges this presents. The impact of who 'owns' the evaluation, who leads and shapes it, will also be discussed with reference to realist evaluations. 

Chairs
Jenne Roberts

Evaluation Manager, Menzies School of Health Research
Jenne is an international health evaluator, working in Indigenous health in Australia and international public health, mostly in South East Asia. Jenne is interested in identifying the efforts and interventions that spark positive social and health impact, and engaging intended beneficiaries...

Speakers
Kevin Dolman

Principal Analyst, Indigenous Evaluation Consultant
Dr Kevin J. Dolman is an Eastern Arrernte man who has worked for thirty years in the government, private and community sectors across a spectrum of Indigenous social, economic and cultural policy. He is a consultant in evaluation and policy fidelity with a special interest in how...
Emma Williams

Associate Professor, RREALI CDU Maburra
Emma is a Credentialed Evaluator and an Associate Professor at Charles Darwin University in Australia, where she researches, conducts and teaches evaluation, particularly realist evaluation. She has moved between government, academe and private consulting in evaluation over more than...


Friday September 21, 2018 10:00am - 10:30am AEST
Chancellor 5

11:00am AEST

Evaluation Ready: Transforming government processes and ensuring evaluability
Ruth Pitt (Department of Social Services), Lyn Alderman (Department of Social Services), Katherine Barnes (Department of Industry, Innovation and Science), David Turvey (Department of Industry, Innovation and Science)

Abstract 1: Evaluation Ready: transforming government processes

The Australian Government's Digital Transformation Agenda, announced in the 2015/16 federal budget, includes establishing two grants hubs: the Business Grants Hub (located in the Department of Industry, Innovation and Science) and the Community Grants Hub (located in the Department of Social Services). These hubs are intended to streamline how grant programs are designed, established and managed across the Commonwealth. This centralisation of grants administration is a significant systems-level change that presents both challenges and opportunities for evaluation.

This presentation will outline the work being done at the two Departments to embed evaluation services within their respective grants hubs, looking at key successes to date and challenges ahead. In particular, it will examine how the hubs have moved evaluation planning into the design phase by ensuring evaluation is included in the costings for new programs and by providing 'evaluation readiness' services. These services align with the utilisation-focused evaluation approach of holding 'launch workshops' to assess and enhance evaluation readiness, with the aim of improving the timing and relevance of future evaluation activities (Patton 2012). The speakers will discuss the implications of these services for evaluative thinking and practice. How each of the hubs capitalises on the opportunities offered through centralised evaluation services will be of interest to evaluators keen to see evaluation transformed from being on the periphery of programs to being at the heart of their design and delivery.

Reference: Patton (2012) Essentials of Utilization-Focused Evaluation, Sage Publications.

Abstract 2: Evaluation Ready: ensuring evaluability

Ensuring good evaluation involves more than just hiring evaluators and setting them to work. It requires preparation, capability and an evaluation mindset. It requires programs to be evaluation-ready. How reassuring would it be if program developers knew from the outset the types of evaluations planned for their program and when they would commence? If they knew the questions that would be asked and the methods and indicators that would be used to answer them? And if evaluators were confident that the data required would be collected, tested and available for use when needed? In short, if evaluability was assured?

This presentation explores how the Department of Industry, Innovation and Science's Evaluation Ready tool has improved the evaluability of its programs. At or near the design stage of a new program, Evaluation Unit staff work with policy and program specialists to develop a program logic, evaluation questions, data requirements and an evaluation schedule. These documents comprise an evaluation strategy, which informs program documentation including application forms and reporting templates. The tool has been reviewed and refined to enhance speed and consistency of application. The unit's ambition is to have it considered public sector best practice. In this presentation, the experience of applying the Evaluation Ready tool in the Department of Industry, Innovation and Science is explained and its impact on evaluability is assessed. Examples illustrate how the process has all but eliminated the need for scoping studies and evaluability assessments. We show how the process interacts with program rollout arrangements and performance reporting frameworks for individual programs. And as it hasn't always been easy, we highlight some of the challenges encountered and lessons learned along the way.

Chairs
Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years of work experience working in the international development sector, in multiple capacities with Managing Contractors, NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous...

Speakers
Lyn Alderman

Chief Evaluator, Department of Social Services
Dr Lyn Alderman brings a wealth of expertise to her role as Chief Evaluator. Lyn’s experience spans several sectors including nonprofit, corporate, higher education and vocational education. She has deep disciplinary knowledge and experience in program evaluation, evaluation frameworks...
Ruth Pitt

Assistant Director, Evaluation Unit, Australian Government Department of Social Services
Ruth Pitt is Assistant Director of the Evaluation Unit at the Australian Government Department of Social Services. Her evaluation experience includes diverse roles in consulting, government and not-for-profit organisations, in Australia and overseas. Her qualifications include a Master...
David Turvey

General Manager, Department of Industry, Innovation and Science
David Turvey is the General Manager of the Insights and Evaluation Branch within the Economic and Analytical Services Division of the Department of Industry, Innovation and Science. David oversees the Division's research and analytical work on the drivers of firm performance and the...


Friday September 21, 2018 11:00am - 12:00pm AEST
Chancellor 5

12:00pm AEST

Realist Evaluation: Tracing the Evolution of Realist Program Theory over the Years of the Resilient Futures Project in South Australia
Bronny Walsh (Community Matters)

Evaluators have been working with a Research Institute to undertake a collaborative, capacity-building evaluation of a pilot program called 'Resilient Futures'. The program aimed to improve wellbeing for young people from disadvantaged communities by delivering resiliency training and mentoring support through schools and youth sector agencies. The evaluation was intended to inform future decision-making about the Resilient Futures program, and to inform program improvement over time. A realist evaluation methodology was selected because it is a learning-oriented methodology that could contribute to program refinement while also explaining different outcomes for different sub-groups and in different contexts.

The program was being developed, tested and refined during the evaluation. It started as a small-scale pilot in a few agencies, underwent a complete transformation of the delivery model, and became a large-scale program delivered to hundreds of young people through multiple agencies. The program model moved from delivery of a pre-designed program in which high fidelity was expected to supporting and resourcing the delivery agencies to adapt and use core materials in ways appropriate to their own settings. This required a significant change in the program theory and a change in evaluation methods. Realist evaluation is intended to be iterative, gradually developing and refining program theory through recurrent rounds of evaluation.

This paper demonstrates how it can respond to transformation within programs. It will trace the evolution of the evaluation, demonstrating the changes in program theory, evaluation questions and methods required as the program evolved. Key findings from the final round of the evaluation will also be presented. 

Chairs
Keren Winterford

Research Director, Institute for Sustainable Futures, University of Technology Sydney
Dr Winterford has 20 years of work experience working in the international development sector, in multiple capacities with Managing Contractors, NGOs, as a private consultant, and more recently in development research. She currently provides research and consultancy services for numerous...

Speakers
Bronny Walsh

Research and Evaluation Partner, Walsh & Associates
My background is in criminology and I currently manage the SA Drug Use Monitoring in Australia (DUMA) research program, which involves interviewing police detainees about drugs and crime. I also do a lot of work with realist evaluation, trying to unpack why, and in what circumstances...


Friday September 21, 2018 12:00pm - 12:30pm AEST
Chancellor 5

1:30pm AEST

Strengthening the professionalisation of evaluation in Australia, workshop 2
AES Learning & Professional Practice committee

In 2017 the AES commissioned Better Evaluation and ANZSOG to explore options for strengthening the capacity and professionalisation of the evaluation sector. The report explores options to:

  • increase motivation
  • increase capacity
  • increase opportunities.

The Learning and Professional Practice Committee (LPP) of the AES Board is interested in your views about priorities for skill development, learning pathways, embedding professional competencies and opportunities to increase demand for and strengthen the operating environment for evaluation.

We are holding two workshop-style sessions and participants are invited to attend either one or both.
Workshop 1 will identify and discuss issues of most interest and concern to members. Workshop 2 will build on the first, and help shape the direction for the AES in strengthening the professionalisation of evaluation in Australia.

The outcomes of the workshop sessions will be shared at the conference closing plenary.

Speakers
Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of...
Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator, and Managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program...

Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.


Friday September 21, 2018 1:30pm - 2:30pm AEST
Chancellor 5
 