Welcome cocktail reception, Tuesday 18 September 6:00–8:00pm, open to all delegates! Venue: Penny Royal
 
Interactive session
Wednesday, September 19
 

11:00am AEST

In the deep end? Evaluation 101 for new evaluators
Charlie Tulloch (Policy Performance Pty Ltd)

Ask any evaluator how they ended up in this field, and most will say that they fell into it. Right in the deep end. This can be overwhelming, with theoretical, methodological, logistical and ethical challenges to consider. This presentation will provide an introductory overview of the evaluation field, adapted from evaluation capability building materials prepared and delivered within a large professional services firm. It will explore various definitions of evaluation, explain the rationale for undertaking evaluations, outline the role of evaluation across the government policy cycle, detail the most suitable types of evaluation, and step through practical considerations in planning, conducting and reporting on evaluations. It will draw on the AES Evaluators' Professional Learning Competency Framework to identify the skills that new evaluators should seek to build as they develop. By the end of this session, those attending the conference to learn the basics will have a better understanding of their development path and of the contribution they can make by extending their own practice: building personal capital.

Chairs
Dan Borg

Independent consultant
Dan is an evaluator with a background in environmental science and a PhD from the University of Melbourne. Dan has experience conducting evaluations in natural resource management, emergency management and health, in the public service and not-for-profit sectors. Dan is passionate...

Speakers
Charlie Tulloch

Director, Policy Performance
Policy Performance helps to plan, implement and evaluate policies and programs.



Wednesday September 19, 2018 11:00am - 12:00pm AEST
Conference centre

1:30pm AEST

New words, old approaches: Weaving together foundational principles for contributing to transformation through evaluation
Robyn Bailey (Allen + Clarke), Emma Walke (University of Sydney), Roxanne Bainbridge (Central Queensland University)

Do new terms such as 'co-design' signal substantively new or different approaches to evaluation? Or are they repackaging old concepts - concepts fundamentally important for ensuring the self-determination of Indigenous peoples? Do 'co' approaches - co-design, co-operative inquiry, co-production, co-creation - inherently address issues such as power and control over decision-making and resources, or can they further entrench current inequities?

We contend that it is not evaluation approaches in and of themselves that contribute to better outcomes for Indigenous peoples. Rather, it is the application of principles and practices that consciously address inequities in power, the diversity of voices, values and knowledge, and the benefits arising from such evaluation projects.

We have started to build a principles-based framework to guide our practice during both the co-design and the evaluation of a substantive program aimed at improving outcomes for Aboriginal and Torres Strait Islander peoples. This framework attempts to weave together principles emphasised by Aboriginal and Western forms of inquiry - differing ways of knowing, doing and being.

We invite you to a yarning circle to talk about foundational principles and practices which respect 'all learn, all teach' processes and practices. We would like to explore whether there are evaluation approaches that are inherently more culturally safe and transformative, whether it is the way in which we apply our craft that is key to realising better outcomes for Indigenous and ultimately all peoples, or whether it is something else. The knowledge generated in the session will be shared back with participants, using both visual and written mediums.

Chairs
Keryn Clark

DMEL & Research Consultant

Speakers
Robyn Bailey

Senior Associate, Evaluation + Research, Allen and Clarke
Hello! I am a Pakeha (European) New Zealander, currently working in both Australia and New Zealand. Evaluation, and contributing to better outcomes for people and communities through evaluation, is my passion. Along with colleagues, I work with both government agencies and NGO providers...
Roxanne Bainbridge

Director Centre for Indigenous Health Equity Research, Central Queensland University
I am a Gungarri/Kunja Aboriginal researcher from South Western Queensland in Australia and Professorial Research Fellow at Central Queensland University where I am Director for the Centre for Indigenous Health Equity Research. My current priority evaluation is embedded in a partnership...
Emma Walke

Academic Lead Aboriginal Health - Co-Lead Community Engagement, University Centre for Rural Health
I'm a Bundjalung woman; my family is from the Cabbage Tree Island/Ballina area of northern NSW. In my role I work with medical and allied health students who are away from their home universities, helping them understand and work better with Aboriginal and/or Torres Strait Islander peoples. I...


Wednesday September 19, 2018 1:30pm - 2:30pm AEST
Conference centre

3:30pm AEST

Learning from Failure: A Safe Space Session
Matt Healey (First Person Consulting)

The increasing appetite from government, philanthropy and other funders for innovative approaches to complex social and environmental challenges has driven many towards trends such as design thinking, human-centred design and co-design. These design approaches emphasise (among other things) a willingness to try and fail and, most importantly, to learn from that failure.

For evaluators, failure (or the potential for failure) is a risk to be mitigated. Should failure occur or mistakes be made, they tend to be kept in-house or otherwise not shared more broadly. To fail means disappointing clients, stakeholders (internal and external) and the communities we seek to benefit. 

Given that, and the increasing emphasis on integrating design into our practice, how can evaluators come together to learn from our collective failures and mistakes? How can we pass this learning on to the next generation of evaluators in a way that acknowledges their own experiences and perspectives? What opportunities does failure unearth for the evaluation sector and field?

This interactive session addresses these questions through facilitated discussion and shared reflection. Through a mix of lightning talks, small group discussions and whole room consensus-making, the session will elicit sharing about times that mistakes were made and what lessons can be learned from those mistakes—for conference attendees and the field of evaluation.

This session will result in a set of agreed-upon principles that (hopefully) lay the groundwork for the future sharing of instances where mistakes were made and the lessons learned. The session will be guided by a set of house rules to ensure that attendees feel comfortable sharing. On entry, participants will provide their name and contact details and consent to these principles, which will also enable follow-up after the session.

Chairs
Catherine Hastings

PhD Candidate, Macquarie University
I am in my final year of a PhD developing explanations for Australian family homelessness. Prior to this, I worked as an applied social research and evaluation consultant. My interests are realist and complex evaluations in areas related to social equality and social justice.

Speakers
Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation...


Wednesday September 19, 2018 3:30pm - 4:30pm AEST
Conference centre
 
Thursday, September 20
 

9:30am AEST

Strengthening the professionalisation of evaluation in Australia, workshop 1
AES Learning and Professional Practice Committee

In 2017 the AES commissioned Better Evaluation and ANZSOG to explore options for strengthening the capacity and professionalisation of the evaluation sector. The report explores options to:

  • increase motivation
  • increase capacity
  • increase opportunities.

The Learning and Professional Practice Committee (LPP) of the AES Board is interested in your views about priorities for skill development, learning pathways, embedding professional competencies and opportunities to increase demand for and strengthen the operating environment for evaluation.

We are holding two workshop-style sessions, and participants are invited to attend either one or both.
Workshop 1 will identify and discuss the issues of most interest and concern to members. Workshop 2 will build on the first and help shape the direction for the AES in strengthening the professionalisation of evaluation in Australia.

The outcomes of the workshop sessions will be shared at the conference closing plenary.


Speakers
Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of...
Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program...
Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.


Thursday September 20, 2018 9:30am - 10:30am AEST
Chancellor 3

9:30am AEST

Evaluation literacy: Exploring the skills needed to motivate and enable others to access, understand and use evaluation information in non-government organisations
Alison Rogers (The Fred Hollows Foundation), Leanne Kelly (Windermere), Alicia McCoy (BeyondBlue)

The motivations and abilities of individuals to gain access to, understand and use evaluative information are highly varied. Evaluation literacy can make evaluation more appropriate, understandable and accessible. This world café session intends to reveal and share ways that we engage with colleagues to enhance evaluation literacy. It is aimed at internal evaluators in non-government organisations, employees who practise and promote evaluation, and external evaluators working with organisations. We invite participants to share their experiences and learn from others. The session will examine a key issue: how do individuals promote evaluation among their colleagues in non-government organisations?

Understanding the social connections between colleagues and elucidating interpersonal dynamics are useful for considering how to transform teamwork. Drawing on social interdependence theory, a theory from social psychology, the presenters will facilitate the world café discussion around setting cooperative goals. Focused on ways of promoting evaluation, the questions will be structured around:
  • How do you set common goals that link all individuals? 
  • How are individuals held accountable for their contribution?
  • How do you ensure there are opportunities to connect? How do you provide encouragement? 
  • What is your preferred communication style? 
  • How do you incorporate opportunities for reflection?  
The world café session will use examples from the literature as a starting point. Rotating through the questions, participants will have the opportunity to share their real-world experiences and hear from others. Participants will leave with an increased understanding of the topic, supported by evidence from the literature, theory and practical examples. This networking opportunity will equip practitioners seeking to promote evaluation among their colleagues with practical strategies to enhance their practice.

Speakers
Alicia McCoy

Head of Research and Evaluation, Beyond Blue
Alicia is Head of Research and Evaluation at Beyond Blue. She has 15 years’ experience working in the health and community sectors in a range of evaluation, research and clinical roles. Alicia is a social worker by profession and her PhD at The University of Melbourne on the practice...
Alison Rogers

PhD Candidate, The University of Melbourne
Alison Rogers is a PhD candidate with the Centre for Program Evaluation at The University of Melbourne. She is also the Strategic and Innovation Adviser with the Indigenous Australian Program of The Fred Hollows Foundation based in Darwin, Northern Territory.


Thursday September 20, 2018 9:30am - 10:30am AEST
Conference centre

11:00am AEST

Freaking Super Sweet Webinars: learning new tricks from young guns (aka Webinars 101: AES webinar working group reports back)
Kara Scally-Irvine (Evalstars Limited), Liz Smith (Litmus), Kahiwa Sebire (Flinders University)

The AES is transforming and wants to increase member value. We know many AES members are not within easy reach of the regional seminars and workshops. In 2018, the Member Services Engagement (MSE) committee decided to trial webinars, with a particular emphasis on enabling greater learning and connection opportunities for members unable to attend AES events. We established a webinar working group to identify potential applications of webinar technology and to develop best-practice guidelines for webinars and online facilitation. In keeping with design thinking approaches, we tested our assumptions with a pilot: "A webinar on how to run webinars".

In this interactive session, the AES Webinar Working Group will share our learnings and activities so far. We will provide an overview of what a webinar is (and isn't), different delivery options within an evaluative setting (the techie bit) and our top tips and tricks for facilitating online. We will end with our reflections on the value of the tool for AES members as a vehicle for professional development, and a tool for use in evaluations. Throughout the session, we will incorporate the use of other interactive tools, e.g. PollEverywhere, that can be used to garner engagement and gather data, so attendees leave with first-hand experience of the technology options available to them.  

We hope to deliver this session as a webinar (and later as a webcast) so members not attending the conference can benefit.  

We will also seek feedback on what the membership might like to see next from the AES to support professional development. 

Speakers
Kara Scally-Irvine

Principal Consultant & Director/Co-convenor, KSI Consulting Limited/ANZEA
Kara has over 15 years' planning, monitoring and evaluation experience. She has expertise in quantitative and qualitative data collection and analysis, and in systems thinking. She now applies these skills to support organisations large and small, in a range of settings...
Kahiwa Sebire

Manager, Learning Design/MEval Student, University of Adelaide/University of Melbourne
Enthusiastic solution-finder and life-long learner. Exploring the possibilities of authentic learning experiences and technology with sticky notes and whiteboards in tow. Studying MEval at UniMelb; interested in ECB, education, learning analytics, technology-enhanced practice, facilitation...
Liz Smith

Partner, Litmus
I am a co-founder of Litmus, a specialist private-sector evaluation company based in Wellington, New Zealand, and active across Australasia. My evaluation journey started more than 20 years ago as a nurse at the John Radcliffe Hospital, Oxford, when assessing the effects of a new nursing...


Thursday September 20, 2018 11:00am - 12:00pm AEST
Conference centre

12:00pm AEST

Ethical Dilemmas in Evaluation Practice
Anne Markiewicz (Anne Markiewicz and Associates)

This session will consider a range of ethical dilemmas that evaluators face in their practice. The context for ethical evaluation practice will be set through a short introductory presentation outlining the four foundational ethical principles of respect, relevance, responsibility and reciprocity. Four scenarios will then be considered, each presenting an ethical dilemma in one of the four 'R' areas, followed by opportunities for members of the audience to pose ethical dilemmas from their own practice experience.

This session will be highly interactive as common evaluation challenges and dilemmas are identified and responses to ethical dilemmas are discussed and considered.

Speakers
Anne Markiewicz

Director, Anne Markiewicz and Associates
Anne Markiewicz is a leading monitoring, evaluation, and training specialist with over 20 years’ experience designing monitoring and evaluation frameworks and developing and implementing a range of evaluation methodologies. She has designed and delivered M&E capacity development...


Thursday September 20, 2018 12:00pm - 1:00pm AEST
Conference centre

12:00pm AEST

Developing an AES Advocacy and Influence Strategy: A consultation and co-design session for AES members

Influence is one of the key components of the AES 2015-2019 Strategic Plan. The AES Advocacy and Alliances Committee is developing an Advocacy and Influence Strategy in order for the AES to project its 'voice' and to enable it to better serve its members and the profession.

The Strategy is underpinned by the key principles of:
  • Collaboration: within the AES membership, and between members and clients
  • Inclusiveness: sharing information and ideas with clients and members
  • Continual professional growth: within the membership and clients
  • Professional service: on behalf of and to our members
  • Innovation: new ways to respond to new times

In keeping with these principles, the Advocacy and Alliances Committee is offering an opportunity for AES members to be involved during the Conference in a consultation and needs analysis session that will contribute to the design of the Strategy. The session will explore what needs or issues members have regarding advocacy and influence, and their thinking about the most relevant and useful approaches.


A background paper will be made available for participants to read prior to the session. 

Chairs
John Stoney

AES Board
I've been an AES Board member since 2016 and currently chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this year's Conference the AAC is engaging with members to inform development of an AES Advocacy and...

Speakers
Alexandra Ellinson

Manager, Evaluation, Research & Policy, Institute for Public Policy & Governance, UTS
Alexandra delivers strategic evaluations, reviews and consultancies, working in partnership with clients. She is deeply knowledgeable about the social services sector and has particular expertise in interagency responses to complex issues, such as housing and homelessness, domestic...
Margaret MacDonald

Director, MacDonald Wells Consulting
Margaret is a leadership, public policy and evaluation consultant who works mostly on social and health policy issues. She has a passion for collaborative practice, systems thinking and linking ideas to practice. Margaret is particularly interested in the interplay between policy...



Thursday September 20, 2018 12:00pm - 1:00pm AEST
Chancellor 5

2:00pm AEST

'What about me?': A campfire session to co-design transformational self-care guidelines for evaluators
Emma Williams (Northern Institute, CDU), John Stoney (Northern Institute, CDU)

Evaluators often - and increasingly - work in high-risk, high-stress situations. These include data collection in fragile states and conflict zones, but also work in relatively 'safe' environments with evaluands in traumatic situations, where a sufficiently intense experience can leave the evaluator with vicarious trauma. Data collection when evaluating institutions of power presents its own challenges. Reporting, too, can be a high-risk, high-stress point for evaluators: 'telling truth to power' is seldom easy, and there are situations where it can affect evaluators' career prospects and, in some settings, personal safety. Even the stress of juggling multiple projects with tight timelines that impose periods of little sleep, let alone adequate space for reflection, can impact evaluator well-being. This session presents guidelines drafted in response to this issue, based on primary and secondary research:
  • Evaluation planning: self-care guidelines based in part on a transformation of ethical practice questions. (These often assume that the researcher/evaluator holds power and is not at risk; reverse-engineering the questions to consider potential risks to evaluator well-being proved a fruitful source of self-care guidelines.)
  • Debriefing guidelines: for use by evaluators after particularly stressful situations, based in part on transformed disaster management tools
  • Self-assessment: a checklist that enables evaluators to assess their own capacity - including capacity for evaluative judgement - in high-risk, high-stress situations.
The campfire session will use a co-design variant process with pre-circulated materials to enable session participants to test and refine these draft guidelines.

Chairs
Dwi Ratih S. Esti

Flinders University
I've been interested in evaluation since joining the Directorate for Monitoring, Evaluating and Controlling of Regional Development at the Ministry of National Development Planning of the Republic of Indonesia in 2007. At the moment, I am conducting evaluation research as my doctoral...

Speakers
John Stoney

AES Board
I've been an AES Board member since 2016 and currently chair the Advocacy and Alliances Committee (AAC), which supports the Influence portfolio of the 2016-19 AES Strategic Plan. At this year's Conference the AAC is engaging with members to inform development of an AES Advocacy and...
Emma Williams

Associate Professor, RREALI CDU Maburra
Emma is a Credentialed Evaluator and an Associate Professor at Charles Darwin University in Australia, where she researches, conducts and teaches evaluation, particularly realist evaluation. She has moved between government, academe and private consulting in evaluation over more than...



Thursday September 20, 2018 2:00pm - 3:00pm AEST
Conference centre
 
Friday, September 21
 

9:00am AEST

Evolving the evaluation deliverable
Gerard Atkinson (ARTD Consultants)

A key principle of utilisation-focused evaluation is that it needs to be useful to stakeholders, whether they are evaluation commissioners, policy developers, or the general public. Much of the theory of utilisation-focused evaluation centres on the process of evaluations, and the early and sustained engagement of stakeholders. Consideration is also given to the way an evaluation is communicated, a.k.a. the "deliverable", focusing on tailoring the communication of findings to match how different stakeholders absorb information. 

In prior decades, the sole deliverable was almost always a written report. As users of evaluations became more time-poor, visual techniques for conveying information gained popularity. Slideshow presentations became a key part of communicating findings, to the point of replacing written reports in some cases. More recently, as evaluations have utilised large data sets and responded to a desire to make findings interactive, dashboards have gained prominence as the core deliverable. However, each of these is an imperfect solution. Slideshows often omit technical details required by those seeking to operationalise the findings, and dashboards are strongly focused on presenting quantitative analyses. So the question arises: what's next?

This interactive session is an opportunity for participants to bring their own ideas and needs, and brainstorm what might be the next step in the evolution of the evaluation deliverable. Starting with an overview of the evolution of the deliverable and the aims of utilisation-focused evaluation, participants will then work together in small groups with creative stimuli to explore ideas for new types of deliverables that overcome current challenges in usability and communication. Groups will consider what the next generation deliverable might look like, how it might be developed in an evaluation process, how it fits with existing deliverables, and what skills will be needed to design and deliver these in collaboration with stakeholders.

Chairs
Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation...

Speakers
Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in program and policy evaluation; workshop and community facilitation; business analytics and data visualisation; market and social research; financial and operational modelling; and non-profit, government and business...


Friday September 21, 2018 9:00am - 10:00am AEST
Conference centre

11:00am AEST

We are Women! We are Ready! Amplifying our voice through Participatory Action Research
Tracy McDiarmid (International Women's Development Agency), Amanda Scothern (International Women's Development Agency), Paulina Belo (Alola Foundation)

Our organisation's work is grounded in the principles of gender equality and women's rights, delivered in partnership with inspiring organisations across the Asia Pacific region. We recognise that gender equality requires incremental and transformative change which occurs over generations, and that strengthening women's movements through collective action and learning is a key strategy in achieving change. Capturing those changes in the voices of diverse women is at the heart of our commitment to ethical, feminist, participatory evaluation.

This Interactive Session models the principles and practices of our organisation's approach.  It will explore how evaluations can be designed to strengthen the capacity of diverse women as co-researchers; to build on and generate knowledge as a resource of and for the women who create, own and share it; and to design evaluative spaces that promote authentic, inclusive forms of evidence.  

A campfire approach will highlight our recent experience, including the design of a mid-term reflection using feminist participatory action research methodologies and the development of our Feminist Research Framework (Nov 2017). It will engage session participants to enquire into, and explore other applications of, these principles and practices, drawing on their own experience. Our discussion will cover evaluation design (experiences, challenges, applicability to different contexts) and methodological practices such as appreciative inquiry, narrative and performative methods.

Key learnings are envisaged on topics such as participatory design processes (ensuring delivery and community partners are involved in the development of key questions and appropriate methodologies), capacity building (empowering diverse women as co-researchers in data collection and analysis), and accessible and applicable learning (communicating and using findings relevant to diverse partners to support political, economic and social change).  Peer-to-peer exchange will be captured, and will inform the circulation of sector guidance drawing on experience and learning of session participants.

Chairs
Joanna Farmer

Manager, Deloitte
I am an early career evaluator with a passion for mental health. I have a particular interest in how to meaningfully support people with lived experience of mental health to participate in evaluation. I am currently studying the Masters of Evaluation at Melbourne Uni.

Speakers
Tracy McDiarmid

Senior Program Quality Manager, International Women's Development Agency
Dr Tracy McDiarmid completed her PhD at the University of Western Australia before working in the fields of Australian social policy, governance, disaster risk reduction, and gender. Tracy has experience in international development with a variety of International Non-Government Organisations...
Alejandra Pineda

Programs Coordinator, Myanmar, IWDA
I have worked with the International Women's Development Agency for three years, working in collaboration with IWDA's five Myanmar partner organisations to ensure the successful implementation of their programming on women's political participation, and women's safety and security...


Friday September 21, 2018 11:00am - 12:00pm AEST
Conference centre

1:30pm AEST

Strengthening the professionalisation of evaluation in Australia, workshop 2
AES Learning & Professional Practice committee

In 2017 the AES commissioned Better Evaluation and ANZSOG to explore options for strengthening the capacity and professionalisation of the evaluation sector. The report explores options to:

  • increase motivation
  • increase capacity
  • increase opportunities.

The Learning and Professional Practice Committee (LPP) of the AES Board is interested in your views about priorities for skill development, learning pathways, embedding professional competencies and opportunities to increase demand for and strengthen the operating environment for evaluation.

We are holding two workshop-style sessions, and participants are invited to attend either one or both.
Workshop 1 will identify and discuss the issues of most interest and concern to members. Workshop 2 will build on the first and help shape the direction for the AES in strengthening the professionalisation of evaluation in Australia.

The outcomes of the workshop sessions will be shared at the conference closing plenary.

Speakers
Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of...
Sue Leahy

Managing Director, ARTD Consultants
Sue is an accomplished evaluator, policy analyst and facilitator and managing Principal at ARTD, a leading evaluation and public policy company based in NSW. She joined ARTD in 2009 from the NSW Department of Family and Community Services, where she managed a wide-ranging program...
Delyth Lloyd

Manager, Centre for Evaluation and Research, Dept. Health and Human Services Victoria
Capacity building, professionalisation, cultural competence, participation, facilitation, and technology.


Friday September 21, 2018 1:30pm - 2:30pm AEST
Chancellor 5

1:30pm AEST

Into the great wide open (data): Understanding and using big and open data in evaluations
Jessie Wilson (Allen and Clarke)

The idea of big data and open data - and the increasingly inevitable incorporation of these approaches into evaluations - is terrifying for some and tantalising for others. For those in the former category, a lack of understanding, familiarity and/or confidence in approaching big/open data can limit one's evaluative practice. In other contexts, limitations or misapplications of big/open data can also undermine the validity and credibility of the evaluation designs and findings we produce.

The purpose of this interactive AES conference session is two-fold. We will: 1) address these fears, concerns, and limitations about use of big/open data in evaluations; and 2) begin to learn how to use these approaches in our own evaluative practices. Although I have a strong quantitative research background, I am just beginning my own big/open data journey within an evaluation context. As such, I promise to be encouraging and honest about how we evaluation professionals can start to become, in the words of Michael Bamberger, more 'sufficiently conversant' with these new approaches and begin building them into our ever-transforming toolkits to enhance how we evaluate policies, programs and interventions.

With the above purposes in mind, the session will use a World Café approach and practical, real-world Australasian examples to discuss and share learnings about:
  • what big data and open data are and are not, and the differences between these approaches;
  • evaluative situations in which the use of big/open data is and is not appropriate, framed by various considerations (e.g., evaluand, evaluation methodology, evaluation questions and criteria, stage in the evaluation's project cycle); and
  • limitations of big/open data use in evaluations (e.g., data reliability and quality, ethics, consent) and management of these limitations.
Participants will also be provided with a guide for how to assess big/open data quality within an evaluation context.

Chairs
Katherine Pontifex

AES22 Program Co-Chair. Manager, Evaluation Services, Wellbeing SA
Katherine is the Manager, Evaluation Services at Wellbeing SA. She is an experienced evaluation expert with an extensive background working in government on health and social programs, policies and systems. Her evaluation practice is firmly grounded in a pragmatic approach with an...

Speakers
Jessie Wilson

Senior Associate (Evaluation + Research), Allen and Clarke
I am a senior evaluator, researcher and project manager with 10+ years’ experience working with public and private sector agencies. In addition to my workshop topic (incorporating big/open data into evaluation design and implementation), you can also talk to me about cultural competency...


Friday September 21, 2018 1:30pm - 2:30pm AEST
Conference centre
 