Central Academy (Henderson County) teacher Jo Morris says taking part in the Professional Growth and Effectiveness System field test has helped her ensure her students’ needs are met throughout the year and made her a better teacher. Photo by Amy Wallot, March 12, 2013

  • The Kentucky Department of Education, along with several partners and more than 50 school districts, is in the third year of a four-year plan to develop the Professional Growth and Effectiveness System (PGES). Schools statewide will pilot the new system in the 2013-14 school year, with full implementation scheduled for 2014-15. This is the fifth in a series of stories that will examine different aspects of the proposed system.

By Matthew Tungate
matthew.tungate@education.ky.gov

Teachers will not have to do anything they are not likely already doing to measure student growth under the proposed Professional Growth and Effectiveness System (PGES), according to teachers field testing the system and staff of the Kentucky Department of Education. The biggest changes may be in documenting that growth, they said.

Carolyn Noe, a 2nd-grade teacher at Paint Lick Elementary (Garrard County), said student growth in the PGES is based on specific, measurable, appropriate, realistic and time-bound (SMART) goals.

“It’s not that we haven’t always measured student growth, but it’s a more specific, determined way to go about it. You’re actually forcing yourself to put that down on paper, take a very close look at it and determine exactly how much student growth,” the 31-year teaching veteran said. “I think we’ve always been aware of how students are progressing, but this puts a very definite number to it, which really causes you to focus in on a student.”

For years, teachers have kept assessment journals, tracking growth over time on assessments they designed or through common assessments as a part of an established formative assessment process used at the local level, according to Todd Baldwin, executive strategic adviser with the Office of Next-Generation Learners, which is overseeing PGES design and implementation for the state Department of Education.

The fundamentals of student growth, one of the multiple measures used in the PGES, are the same as teachers have always used, he said.

“Where are kids now, where do you want them to be over time and specific to what skills or standards, and how do you know that they got there?” he said. “It’s not new.”

During the PGES field test, a teacher will pre-assess students on a certain set of knowledge, set SMART goals for where they want their class to be at the end of a certain time, decide the next steps for instruction and strategies for moving to that goal, and use assessments that are comparable across their district to see how they have done, Baldwin said.

A steering committee made up of teachers and other education stakeholders is reviewing data collected from the field test to make recommendations to the Kentucky Board of Education about how measures will be used in the summative evaluation process when it goes live statewide in 2014-15, he said.

English teacher Jo Karen Morris teaches students in grades 6-12 at Central Academy (Henderson County), an alternative school, and she wanted them to be prepared for national tests required as part of Kentucky’s Unbridled Learning school accountability system. So she set a goal based on an assessment that uses many of the same skills. Her goal is: “For the 2012-13 school year, 100 percent of my 10th graders will score at least a 92 percent on the Section 1 English test from the Cambridge Victory for the ACT, PLAN and EXPLORE Tests.”

Morris said that if she had not been part of the field test, she wouldn’t have used the assessment, wouldn’t be setting goals throughout the year and wouldn’t be meeting students’ needs as well.

“Overall I can say that it has made me a better teacher,” she said. “With this program in place, I’m almost forced to be a better teacher and keep the student needs as my priority.”

Baldwin said two factors will contribute to student growth as part of the PGES:

1. The state contribution – student growth percentile for state-assessed grades and subjects

2. The local contribution – teacher-developed student growth goals

Teachers such as those in physical education, art and other areas that are not state-assessed will use student growth goals primarily, per the recommendation of the Teacher Effectiveness Steering Committee; any additional factors regarding student growth for teachers in non-assessed areas will be considered as more research is conducted during the 2013-14 statewide pilot.

So teachers will continue to use the characteristics of highly effective teaching and learning, and classroom assessment for student learning, but districts may consider working with teachers to develop more common assessments to ensure comparability and rigor in assessment measures. Baldwin said those could be district-designed common assessments, district-adopted third-party assessments from commercial vendors, or data from assessment systems like MAP or ThinkLink.

Keown said writing her student-growth goals caused her to look more closely at her students and at how she was going to meet their needs that day, “not how was I going to get through Unit 6.”

“I always prided myself on being a very student-oriented teacher, but I had no idea how it could be more student-oriented,” she said. “When I started writing SMART goals … and I got to think about each student in a very close, personal way. So it definitely changed the way I approached it.”

Morris said using the SMART goals gives her evidence that students are meeting the goals she has set for them.

“So I feel like it puts a little more pressure on the teacher, but I think it’s a good pressure because it holds you accountable,” she said.

Central Academy Principal Lisa Horn said SMART goals are good because with informal goals, teachers can “wish for the moon and the stars and then say you did it.”

“But to actually measure what you do gives the validity to whether you have taught that subject right or not,” Horn said. “So I think it helps hold the teachers more accountable to what they’re actually teaching in the classroom.”

Baldwin said teachers should be using assessments formatively to see if students are on track to meet their student-growth goal and modify their instruction if they are not. Student growth in the PGES works the same way, he said.

“The SMART process is designed to provide something that is both measurable and useful, and provides the kind of useful feedback to the teacher to inform instruction,” Baldwin said.

There are still many issues left to be decided, he said, but data from the 2012-13 field test and the 2013-14 statewide pilot – along with national research and policy work from organizations like the National Council on Teacher Quality and the Measures of Effective Teaching Project – will inform further decisions related to the measures within the system.

“The steering committee is absolutely committed to making recommendations based on feedback from the field,” he said.

MORE INFO …
Cathy White, teacherleader@education.ky.gov, (502) 564-1479