This case study investigated the implementation of the new Ohio educator evaluation systems and how the relationship between the Ohio Teacher Evaluation System (OTES) and the Ohio Principal Evaluation System (OPES) affected implementation. In both evaluation systems, student growth measures (SGM) make up 50% of the final evaluation rating, and performance on a standards-based rubric makes up the other 50%.
Purposeful sampling was used to recruit twelve LEAs: eleven representing the seven Race to the Top (RttT) regions and the typologies defined by the Ohio Department of Education, and one LEA not participating in RttT. Superintendents of the LEAs were contacted by email to enlist participation. Follow-up phone interviews with the superintendents or their designated representatives provided details about the status of OTES and OPES implementation during the 2012-2013 academic year. Follow-up site visits were conducted in the two LEAs piloting or implementing OTES with all teachers. Site visits included focus groups with teachers and administrators to gather detailed data about implementation and the relationship between OTES and OPES.
Of the twelve LEAs in the sample, eight were piloting and one was implementing OTES. Five LEAs were piloting and one was implementing OPES. Two LEAs, one RttT and the non-RttT, were neither piloting nor implementing either new evaluation system. Ten superintendents reported that OPES implementation lagged behind OTES implementation because too many changes were happening at the same time and too many demands were being placed on principals.
Content analysis of principal interview and teacher focus group transcripts revealed common themes related to training, implementation decisions, responses to the evaluation systems, impacts on school culture, issues and concerns, and misunderstandings and confusion about SGM. The sample of evaluation documents did not provide the desired information about which forms were used or about the completeness and types of information included in evaluations. At the time this study was completed, none of the LEAs had finalized the measures and percentages for the SGM component of OPES for 2013-14. Five of the twelve LEAs had measures and weights under consideration for the SGM component of OTES for 2013-14, and those are included in this report.
Key findings included the following: sample LEAs were not fully implementing OTES or OPES, nor using student growth measures in evaluations, for 2012-13; educators offered generally positive feedback characterized by a "we are in this together" approach; educators perceived that the time demands for completing evaluations made the process unrealistic and unsustainable for larger buildings; and educators offered evidence of competition and perceived unfairness due to variations in how student growth is measured across grade levels and subjects. Teachers expressed many misunderstandings and questions related to student growth measures, and felt they needed data experts, resources, and examples to better understand the various student growth measures.