
Collecting and using youth development outcomes data to improve youth work practice

Published online by Cambridge University Press: 14 June 2017

Abstract

There is increasing scrutiny on the factors necessary to ensure that youth development programs consistently enhance the learning and development of young people. One of these key factors is the involvement of high-quality youth work practitioners who can facilitate an individual or group process to the benefit of all participants. While reflective learning is a core best-practice principle of youth work, there is little emphasis on practitioners’ own structured learning and development beyond their initial qualification. Based on findings from a pilot project testing the first practitioner-led outcomes framework in New South Wales, Australia, this article examines the role of outcomes data in contributing to the ongoing development of youth workers and youth development organisations. It argues that external performance data are critical to both individual and organisational development, and can enhance existing reflective practices such as workplace supervision.

Copyright © Youth Action 2017

Introduction

There is no doubt that the conditions surrounding volunteer and paid youth work are changing. In Australia, the provision of youth work historically has been a social justice response to the disengagement of young people from power and decision-making structures (Bessant 1997; Sercombe 2010). New reform agendas have, however, seen paid professional youth work, which is largely a service ‘purchased’ by government organisations, swept up in currents of increasing performance management and accountability (Morley et al. 2001; Moxley and Manela 2001; Tilbury 2006). As the Australian Research Alliance for Children and Youth (2009) explains, ‘recently the community sector has experienced increased pressure to measure its operations, activities (outputs) and their outcomes, and provide interested parties with the results of this measurement’. These cultural shifts have seen the rise of new discourses within human services — concepts like evidence-based practice, implementation science and outcomes measurement.

A major challenge for youth work is to meet these increased expectations of rigour, which involves building data collection and analysis capabilities while maintaining a focus on the needs of young people and delivering effective programs and services. A secondary challenge is to use these data to refine and improve programming and service provision in order to produce better outcomes for young people. This creates a period of tension for services, with the ‘old school’ of effective youth work (e.g. a strong emphasis on rapport and relationships built over time) needing to embrace the ‘new school’ of accountability and scientific rigour.

Within this context, Seymour's (2012, 2015) Queensland Youth Development Project provides one of the most rigorous approaches available for isolating the effective components of youth work. I suggest that outcomes data can add a similar rigour to everyday practice, and contribute to a local evidence base informed by the feedback of clients. Within the project, Seymour posits that worker reflection and development, alongside rigorous evaluation, are key to good practice in youth work. Specifically, Principle 3, Indicator 6 proposes that, ‘Programs are developed, implemented and evaluated using a variety of methods and involving a range of stakeholders’. Other points include making evidence-based claims in promotional material, and including self-reflection as part of workers’ learning plans.

To illustrate how outcomes data can feed into these good practice guidelines, this article takes as a case study a pilot project that was a collaboration between Youth Action, the peak body for young people and youth services in New South Wales, and seven youth work organisations in the Nepean–Blue Mountains district of New South Wales, Australia. The district begins where the Blue Mountains rise from the Sydney basin, approximately 55 kilometres west of Sydney. It spans a narrow strip of villages that cluster along the major highway through the Blue Mountains National Park. The district houses approximately 12,000 young people and, while it is a relatively affluent area, many young people are disadvantaged by the isolating geography of the mountains, and particular communities of disadvantage are nestled within the complex mix of mountain residents.

The region was chosen because of its strong network of smaller, community-run but well-developed service providers. These organisations employ paid professional youth workers supported by some community volunteers, and undertake a range of strengthening and intensive intervention services, including recreation groups, developmental skills groups, social enterprise, casework support and counselling.

Together, we collaborated in the development and trial of a practitioner-led shared outcomes measurement framework. The aim of this collaborative project was to support the development of a system created by and for youth work practitioners that could be used to collect meaningful outcomes information to meet the needs of their funding bodies, to support their practice and to improve their services to young people.

Background to outcomes data collection in youth work organisations

In my role supporting youth work organisations in New South Wales, I am heavily involved in discussions regarding contracting, including the collection of data to support government investment decisions. One of the major criticisms contracted organisations express about top-down (i.e. imposed by government) collection requirements is that the data they collect are ‘meaningless’. On the other hand, government contractors often express concern that organisations try to avoid accountability by collecting unreliable, ‘soft’ data. They want organisations that can demonstrate their effectiveness through the data they collect, and that use these data to show their commitment to the ongoing improvement of service provision.

It is clear that there is an inherent tension in data collection for youth work organisations — data need to be both meaningful and useful to the organisation, and rigorous enough in their collection to satisfy needs for accountability. For this reason, we enlisted seven youth work organisations to develop a shared, practitioner-led outcomes framework that would apply across all of their programs. Our aim was to test the following questions:

  • Was the development of a shared outcomes framework possible?

  • Would a framework provide enough useful data to facilitate the improvement of service provision to young people?

  • What conditions were necessary for a practitioner-led system to satisfy the rigour requirements of government contractors?

The shared outcomes framework

Despite the diversity of programs and services offered by the participating organisations, practitioners agreed that every program and service offered contributed to building confidence, improving positive connections, improving life circumstances and increasing the overall optimism of young people. These four service outcomes were chosen as the headline measures for the shared outcomes framework. They are outlined in more detail in Table 1.

Table 1 Overview of four headline outcome measures collected under the shared outcomes framework

The place of outcomes data in youth work practice

A central tenet of youth work is the sanctity of the relationship between the worker and young person. The young person (or group of young people) always has someone on their side against the injustices of an unfair and skewed system, which has often created or contributed to many of their challenges. However, an outcomes orientation asks the question ‘What effect is this strong relationship having for the young person?’ I believe this question is more important than almost any other in youth work — if the answer to this question is not positive, then why should youth work exist?

As a worker, however, it can be challenging to notice or collect information on the positive effects of our work. Many of our effects are cumulative: small contributions made across a long, deepening relationship with a young person, often during the turmoil of adolescence. This makes capturing the impact of our work difficult; youth work lends itself to longitudinal evaluation, which is beyond the scope of many organisations and programs. Conversely, it can be extremely confronting to face the possibility that our work is only marginally effective, or ineffective, given the effort we have put into forming good relationships with young people.

To illustrate with an example from outside the youth work field: as a life and performance coach, I have seen first hand the value of data for improving workplace performance. I was coaching a manager who was tasked with increasing the sales profit from his region. As we progressed, he was able to discuss the new clients his team had added, the new things his staff were doing to attract customers, how they were ‘selling’ differently, and how they were running promotions to drive greater sales. Despite all of this work, the strategies proved ineffective and the team did not increase their profit. It was only through external data that the success or failure of the endeavour became clear. Similarly, without external data from young people relating to performance and effectiveness, supervision and program improvement can rely only on the perspectives of a worker. Using the feedback of young people as a form of external data holds a worker (and supervisor) responsible for driving improvements in techniques, approaches and outcomes beyond what is possible with internal worker observations alone.

The shared outcomes framework as external data

Across the shared outcomes project, we saw several examples of youth work organisations rigorously challenging their ways of working, their focus and even their beliefs in how change is created for young people. One participating organisation reported, ‘We have made outcomes data core to our team processes — especially in team meetings and supervision.’ A focus on the improvement of the ‘skills’ measure has seen significant changes in planning and running of group programming (the major activity of the organisation), and a steady and intentional trend of improvement in outcomes data.

Similarly, the inclusion of external data related to optimism has challenged many organisations to question the logic of their programs, or the ways they assume their programs affect clients (Besharov and Call 2016). At the closure workshop, spurred on by low optimism outcomes data, several organisations discussed whether it was possible to affect the general optimism of a young person at all, given how many internal and external developmental factors shape it. Suggestions included replacing the optimism measure with a different developmental construct, such as feelings of self-efficacy (capability) or confidence. These were the most rigorous discussions among practitioners regarding how programs create change for young people that I have had the pleasure of experiencing, and they were spurred by the existence of underwhelming external outcomes data.

Conclusion

I have argued that the Queensland Youth Development Project is a framework that isolates the effective components of youth work and provides a tool for reflection on how we think about the work we do. I suggest that outcomes data can add a similar rigour to everyday practice, and contribute to a local evidence base informed by the feedback of young people.

The organisations that participated in the work discussed in this article are continuing their data collection under the framework they co-developed. Most continue to use the data actively to improve their work and individual practice. They have also widened their expertise to include developing youth-friendly evaluation techniques that gather meaningful data to feed into the outcomes framework. Many are now key participants and voices in the early intervention sector reforms taking place, and ground their views firmly in the experience they accrued while developing the framework.

These organisations have shown how outcomes data can play a significant role in service innovation and the development of individual workers by acting as a source of external, often critical, feedback. Rather than viewing these data as a threat, they should be viewed as an opportunity for reflection. The data may verify what we do, but equally they may challenge us to change our practice and ways of working to make us more effective agents of change in the lives of young people.

In order to thrive in the ‘new’ world of increased accountability, and to continue moving youth work forward as a practice, the collection of robust outcomes data as part of a monitoring and evaluation approach is vital.

Acknowledgements

I would like to acknowledge Kathryn Seymour for her enthusiasm, advice and support. Thanks go to the various managers and staff of the organisations with which we worked on the shared outcomes pilot project — especially Damian Cooper, Dot Knox, James Wood, Jake Nauta, Julia Partington and David Poullier.

References

Australian Research Alliance for Children and Youth 2009. ‘Measuring the outcomes of community organisations’, https://www.aracy.org.au/publications-resources/command/download_file/id/111/filename/Measuring_the_outcomes_of_community_organisations.pdf.
Besharov, D.J. and Call, D.M. 2016. ‘Using logic models to strengthen performance measurement’, in Improving public services: International experiences in using evaluation tools to measure program performance. London: Oxford University Press.
Bessant, J. 1997. ‘Free market economics and new directions for youth workers’, Youth Studies Australia 16 (2): 393–419.
Morley, E., Vinson, E. and Hatry, H.P. 2001. Outcome measurement in nonprofit organisations: Current practices and recommendations. Independent Sector and the Urban Institute, http://www.acds.ca/web/images/webpages/evaluation/MTD_Module_5_INSERT_1_Outcome_Measurement_in_Nonprofits.pdf.
Moxley, D. and Manela, R. 2001. ‘Expanding the conceptual basis of outcomes and their use in the human services’, Families in Society: The Journal of Contemporary Social Services 82 (6): 569–77.
Sercombe, H. 2010. Youth work ethics. London: Sage.
Seymour, K. 2012. Good practice principles for youth development organisations, 2nd ed. Brisbane: Key Centre for Ethics, Law, Justice and Governance, Griffith University.
Seymour, K. 2015. ‘Deficits or strengths? Re-conceptualising youth development program practice’. Unpublished PhD thesis, Griffith University, Brisbane.
Tilbury, C. 2006. ‘Accountability via performance measurement: The case of child protection services’, Public Administration 65 (3): 48–61.