
REPORTING PROCESSES THAT INFLUENCE CULTURE CHANGE IN TERTIARY EDUCATION

    Jan P. Hausman

    Bay of Plenty Polytechnic, Tauranga, New Zealand

    This paper was presented at the Australasian Evaluation Society International Conference, Sydney, Australia, 29 August – 2 September 2011.

Abstract

    Annual programme review reports have been in common usage for a number of years, but do they really make a difference? What really happens? Is it just a report of programme activity for the previous year, or does this reporting process work as an impetus for change?

    In recent times, New Zealand has shifted from a compliance-based model of quality assurance to an evaluative, outcomes-focused approach to quality. At Bay of Plenty Polytechnic, this has led to changes in the way programme delivery and student outcomes are reflected on, discussed and reported. This paper will provide a brief overview of the changing context of accountability in tertiary education over the last two decades, as New Zealand moved from an audit-based system of quality towards a system of self-assessment and external evaluation and review. Against this background, the paper will then describe changes in the 2009 and 2010 approaches to annual programme review (APR) which, together with an academic committee restructure, show the shift in thinking and practice that is occurring.

Introduction

    The shift from a compliance-based audit model of quality assurance to an outcomes-focused evaluative model is designed to move tertiary education in New Zealand away from a simple reporting process towards reflecting on the outcomes for our students and community, and towards continuous improvement. This involves a shift in thinking from

    WHAT, SO WHAT

    to

    WHAT, SO WHAT, NOW WHAT?

    This shift in thinking needs a change in culture, and we need to consider ways of influencing that culture change. The approach taken to annual programme review, in conjunction with meeting external requirements, can be one mechanism for influencing the evaluative process.

    This paper will provide a very brief overview of the historical context of accountability in tertiary education over the last two decades, as New Zealand moved from an audit-based system of quality towards a system of self-assessment and external evaluation and review. With this background, the paper will then examine the changed approaches to annual programme review (APR) promoted in 2010 and 2011.


The key changes have included:

    - a more collaborative team approach to APR discussion;
    - one School's trial of a whole-of-school approach to sharing the outcomes and the issues;
    - reporting by the Heads of Schools on “whole of school programme health”; and
    - an overall institutional evaluation of “programme delivery and resourcing” that has highlighted institutional improvements that can support programme delivery.

    The paper also notes a simultaneous organisational review of academic structures, processes and reporting, and the impacts that this has had on creating culture change.

    These examples will show the shift in thinking and practice that is occurring, and signal where further change in the evaluative culture of the polytechnic might lead in the future.

The early focus on accountability

    APR has been used as a reporting tool for accountability and quality improvement since the mid-1990s, and was introduced by many polytechnics as part of implementing the quality management systems that emerged from 1995 onwards.

    Early APRs tended to report in terms of inputs, processes and outputs (Hausman, 1996, p. 3). The compliance-focused APR tools linked compliance to accountability, or value for money, with rules enforced by a system of surveillance (Harvey & Jones, cited in Hausman, 1996, p. 33). This compliance view treats improvement as a secondary function of the monitoring process (Harvey, 1993, p. 10).

    We know from many authors writing in the 1990s (Fullan, 1992; Hausman, 1996; Hopkins, 1994; Leithwood, 1992; Murphy, 1990; Stewart & Prebble, 1993) that creating an environment where continuous improvement might thrive requires a culture of open discussion, collaborative and consensual decision making, and an open democratic climate (Hopkins, 1994).

    A study about programme review completed in 1996 (Hausman, 1996, p. 40) noted that:

        It is questionable whether a direct, definable link can be made to suggest that programme review enhances the quality of the student experience. What is indicated however is that the nature of review itself, involving reflection and collaborative dialogue with peers has the potential to enhance a culture of critical continuous review. This then increases the chance of student needs being identified and met, and the potential for content to be continually validated. The identification of systems which assist in the monitoring of student progress increases the opportunity to recognize early indicators where students might require remedial assistance.

The move from audit to outcomes-based self-evaluation

    In New Zealand, a system of self-evaluation and external evaluation and review was introduced by the New Zealand Qualifications Authority (NZQA) in trial mode in 2009, and then fully in 2010. This evaluative, outcomes-focused quality assurance replaced the audit-based approach to quality assurance that had been operating for the previous 19 years. Each Tertiary Education Organisation (TEO) was required to develop its own system of self-evaluation (also called self-assessment). The intention is that self-evaluation becomes the everyday way of working. A cyclic system of External Evaluation and Review (EER) was also introduced, with the EER team's role being to make a confidence judgement of the TEO's educational performance, based on student outcomes, and of the TEO's capability in self-assessment.

    Bay of Plenty Polytechnic has several tools it uses for self-evaluation, but the key tool is the Annual Programme Review report (APR). This focuses on answering the NZQA key evaluation questions below (a brief illustrative sketch of the report structure follows the list):

    1. How well do programmes and activities match the needs of learners and stakeholders?

    2. How well are learners guided and supported?

    3. How effective is the teaching?

    4. How well do learners achieve?

    5. What is the value of the outcomes for key stakeholders including learners?

    6. How effective are governance and management in supporting educational achievement?
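
    To make this concrete, the following is a minimal sketch, in Python, of how an APR report keyed to these six questions might be represented. It is illustrative only: the names NZQA_KEQS and blank_apr_report are hypothetical, not part of the polytechnic's actual template or systems.

        # Hypothetical sketch: an APR report holds one narrative response per
        # NZQA key evaluation question (KEQ). Names are illustrative only.
        NZQA_KEQS = {
            1: "How well do programmes and activities match the needs of learners and stakeholders?",
            2: "How well are learners guided and supported?",
            3: "How effective is the teaching?",
            4: "How well do learners achieve?",
            5: "What is the value of the outcomes for key stakeholders including learners?",
            6: "How effective are governance and management in supporting educational achievement?",
        }

        def blank_apr_report(programme: str, year: int) -> dict:
            """Return an empty APR report: one response field per KEQ."""
            return {
                "programme": programme,
                "year": year,
                "responses": {keq: "" for keq in NZQA_KEQS},
            }

        # The teaching team then fills in each response through discussion and
        # consensus, e.g. report["responses"][4] for learner achievement.
        report = blank_apr_report("Example Programme", 2010)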

The Annual Programme Review (APR)

    The Annual Programme Review Report Template is structured with headings and question prompts. We are interested in the “health of the programme”. The evaluation of this is achieved through team discussion and consensus agreement. An internal review of the 2009 APRs revealed insufficient depth in drawing together all of the components that contribute to the student learning experience and outcomes, and a lack of analysis of the available data. What needed to happen was to move staff from the factual reporting thinking of “what” and “so what” to “what, so what and now what?” (King, 2008).

    In late 2009 we held workshops to develop report writing skills. An earlier paper (Hausman, 2010) reported on mind mapping techniques used to encourage collaborative discussion. These techniques were then applied in group meetings, both on and off campus, to discuss what had worked well, what had not, and what needed to change. Overall, this promoted a more collaborative approach to discussing the health of the programme. We achieved some change, but not enough. Robbins (2011), in his leadership teaching, noted “if you always do what you've always done, you'll always get what you always got”. This comment is a clear call that if you want something to change, then you need to change the approaches. In 2010 we made some changes to the timing of these reports: we required all 2009 APR reports to be completed and presented to the School Board of Studies in March 2010. We also changed the way different staff members were involved. Two further layers of analysis were then applied (see Fig 1).


Fig 1 MODEL OF THE LAYERED APPROACH TO ANNUAL PROGRAMME REVIEW

    [Figure: the Heads of School for Applied Science, Applied Technology, Business Studies, and Design and Humanities each read all APRs and produced a Health of the School Report.]

    The first layer involved each Head of School providing a “State of the School's Health” report by reviewing all APRs, retention and completion reports, and by taking into account other available data such as programme and teaching evaluations and complaints. Each Head of School was able to identify trends in teaching and learning and student issues across the School, so that areas of school-wide improvement could be identified and worked on. This meant that rather than the APRs just reflecting what was happening in a single programme of study, there was now a school-wide focus to share ideas across a range of programmes.

    The next layer of analysis and reporting involved the Director Academic and the Academic Manager reading all APRs plus the four Head of School reports. They then wrote an institution-wide report identifying trends and matters that needed to be considered and addressed at an institutional level. The report was discussed at Academic Board, and an improvement focus was then taken. This involved all four Heads of School, the Directors Academic and Maori and Community Development, and the Academic Manager meeting on four occasions to discuss the areas for improvement and to develop institutional strategies to achieve the changes.
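
    To picture the layered roll-up, here is a minimal sketch, again in Python and again hypothetical (these classes and functions are illustrative, not the polytechnic's systems): programme-level APRs feed school-level health reports, which in turn feed a single institution-wide report.

        from dataclasses import dataclass, field

        @dataclass
        class APRReport:
            """One report per programme, produced by the teaching team."""
            programme: str
            findings: list[str] = field(default_factory=list)

        @dataclass
        class SchoolHealthReport:
            """Layer 1: one report per school, produced by the Head of School."""
            school: str
            trends: list[str] = field(default_factory=list)

        def school_health(school: str, aprs: list[APRReport]) -> SchoolHealthReport:
            # Stands in for the Head of School reading every APR (plus retention,
            # completion and evaluation data) and drawing out school-wide trends.
            return SchoolHealthReport(school, [f for apr in aprs for f in apr.findings])

        def institution_report(school_reports: list[SchoolHealthReport]) -> list[str]:
            # Layer 2: Director Academic and Academic Manager read all APRs and
            # the four school reports to identify institution-level matters.
            return [t for r in school_reports for t in r.trends]

    The human judgement at each layer is of course not reducible to list concatenation; the sketch only shows how each layer consumes the outputs of the layer below.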

Other organisational change

    In 2010 we also decided that we needed to reduce some of the bureaucratic layers of decision making at the polytechnic. There appeared to be too many layers, and these slowed decisions related to approval processes for changes that would bring improvements in student outcomes, such as the introduction of new course content or assessments. We also wanted to change the layers involved in the approval of results, so that approval sat at Group Leader level rather than with a formal Board of Studies. In addition, we wanted to speed up decision making that should be made at the teaching team level, such as deciding the most appropriate student to be granted a “top student” award.

    We also wanted a closer focus on student engagement, retention and progress, to ensure that student completion rates and progression to the next level of study were as good as we could make them. To achieve those goals, a less formal structure of meetings was trialled in the School of Design and Humanities.

    The established structure was in two parts:

    - The first was a School Board of Studies, with membership comprising the Head of School, the School Academic Adviser, all Programme Coordinators, an academic staff member from another school, and a Head of School from another school who acted as chair. Administrative staff acted as minutes secretary. The role of this committee was to receive reports, and we noted that much of this “paper heavy” committee's time was used for rubber stamping of decisions rather than robust academic debate.

    - The second was a number of Programme Assessment Committees, each comprising a programme coordinator, as many teaching staff as possible, the Academic Adviser, an independent academic staff member, and a programme administrator as minute secretary. The role of this committee was to check results to make sure that all entries were accurately recorded.

    We replaced the established structure.

    Group Programme Committees were set up. These put a cross-section of teaching and support staff into an environment where they could talk about their students, about what was working well, and about what needed further work and improvement. This also meant that ideas could be shared across programme areas.

    The Programme Assessment Committees that functioned to carry out a checking of results were removed, and responsibility for the accuracy of results was put where it belongs: with the tutors, Programme Coordinators and Group Leaders.

    Reports from the Group Programme Committees were then taken to the second new group, the School Leadership Team. This group comprises the Head of School, all Group Leaders, the School Academic Adviser and the Administration Team Leader. They were given the authority to make both academic and school management decisions while referring relevant matters to higher-level decision makers. This School Leadership Team replaced the academic forum of the School Board of Studies. Although this is a less formal structure, more people are involved in the discussions, including both academic and allied staff.

    The concept was challenging at first, but after nine months of trial, there have been positive outcomes. One Group Leader noted a move from retrospective solutions to problems to proactive problem identification and action. She thought that the Group Programme Committees had helped to develop the team in a way that the Board of Studies had not, and had created more talk outside of meetings. She also noted that her team enjoyed having the accountability for results approval.

    With the regular feed-in through the Group Programme Committees, the Head of School felt that she was better informed about student progress, about teaching, learning and assessment, and also about staff matters. This appeared to make her production of the Health of the School report in March of this year much easier and quicker to achieve.

    A further process change was apparent in March 2011. The Head of the School of Applied Science chose to call an all-school meeting for the presentation of the 2010 APRs. That is, rather than using the smaller membership of the School Board of Studies, he invited all administrative, tutorial and technician staff to participate in the APR presentations. This wider participation meant that staff learned about other programme areas and about strategies that had been used to support students. The Head of School also called for his team to take a closer look at trends in their programmes of study over the last three years. In this way, he reinforced the need for greater analysis of the data.

Where to next?

    As mentioned at the outset of this paper, we are on a continuum of improvement.

    Early indications are that the change in the way that Annual Programme Review reporting is done is having a positive impact on culture change.

    Our place on the continuum is affected by the evolving understanding of, and confidence in, self-evaluation. It is also affected by the increasing levels of trust applied through a more organic academic and administrative decision-making structure. Finally, it is affected by a multi-layered but increasingly integrated application of improvement processes.

    As a heads-up on improvement for the future, we are now looking at a greater synergy between the BoPP Strategic Directions and Planning process, performance development, business unit self-evaluation, and programme health. It is an exciting place to be: watch this space.

References

    Fullan, M.G. (1992) “Visions that Blind”, Educational Leadership, February, pp. 19-20.

    Harvey, L. (1993) “Continuous quality improvement: A system-wide view to quality in higher education”, draft contribution for Knight, P. (Ed.) (1994) System-wide Curriculum Change. SEDA/OU Press.

    Hausman, J.P. (1996) The Role of Programme Review in Enhancing Quality within a New Zealand Polytechnic: How does it work and why does it work? Administrative project submitted in partial fulfilment of the requirements for the degree of Master of Educational Administration, Massey University.

    Hausman, J.P. (2010) “Annual Programme Review: Episodic self-assessment event or real catalyst for continuous improvement?”, Self-Assessment for Quality: How do you know good when you see it? Proceedings of the conference hosted by Otago Polytechnic, 2nd and 3rd December 2010, pp. 71-78.

    Hopkins, D. (1994) “Process Indicators for School Improvement”, in OECD, Making Education Count: Developing and Using International Indicators. Centre for Educational Research and Innovation, pp. 149-170.

    King, S. (1998) Discussions at the Expert Advisory Group on Quality Assurance, Wellington.

    Leithwood, K.A. (1992) “The Move Towards Transformational Leadership”, Educational Leadership, February, pp. 8-12.

    Murphy, J. (1990) SEER Manual: A Reference Guide for Higher Education Research and Development Units, Singapore.

    Robbins, A. (2011) “Anthony Robbins quotes”, ThinkExist.com Quotations Online, 1 Jun. 2011, accessed 23 Jul. 2011, <http://einstein/quotes/anthony_robbins/>.

    Stewart, D. and Prebble, T. (1993) The Reflective Principal: School Development within a Learning Community. ERDC Press: Massey University.
