From: Brooke Anderson
To: Scott Sandall, David Lifferth, Jack Draxler, Edward Redd, Curt Webb, Jacob Anderegg, Justin Fawson, Gage Froerer, Jeremy Peterson, Dixon Pitcher, Brad Dee, Mike Schultz, Paul Ray, Curtis Oda, Brad Wilson, Steve Handy, Stewart Barlow, Timothy D. Hawkes, Raymond Ward, Becky Edwards, Doug Sagers, Susan Duckworth, Sandra Hollins, Rebecca Houck, Joel Briscoe, Angela Romero, Mike Kennedy, Brian King, Lee Perry, Fred Cox, Sophia DiCaro, LaVar Christensen, Craig Hall, Johnny Anderson, Mark A. Wheatley, Patrice Arent, Carol Moss, Eric Hutchings, Jim Dunnigan, Lynn Hemingway, Daniel McCay, Kim Coleman, Earl Tanner, Bruce Cutler, Steve Eliason, Marie Poulson, Ken Ivory, Keven John Stratton, Robert Spendlove, Richard Cunningham, Greg Hughes, John Knotwell, Melvin Brown, Kraig Powell, Scott H. Chew, Kay Christofferson, Brian Greene, Derrin Owens, Val Peterson, Bradley Daw, Keith Grover, Jon Stanard, Dean Sanpei, Norm Thurston, Francis Gibson, Michael McKell, Marc Roberts, Merrill Nelson, Brad King, Kay Mciff, Brad Last, John Westwood, mnoel, Lowry Snow, Don Ipson,
Subject: Support for HB 201: STUDENT TESTING AMENDMENTS
Date: Wed Feb 10 02:37:08 MST 2016
Body:
Hello,

My name is Brooke Anderson. I have taught junior high English and Computer Applications in Utah schools for thirteen years, hold an MS in Information Systems from the University of Utah, and currently serve as a specialist in Evaluation, Research, and Accountability for Jordan School District. I have an interest in, and knowledge of, both the instructional and data sides of Utah education.

I support the changes to educator evaluation proposed in HB 201, namely removing student achievement data as a component, for the following reasons:
  • While we have the best student achievement data we've ever had, we still can't make a decisive claim that this student achievement is due to teacher effectiveness. The metrics we are currently using are informative, but claiming that growth or proficiency is due to the teacher is just not that simple. 
  • Based on initial study and analysis, a significant school effect can be seen in achievement data, possibly larger than the teacher effect. In practical terms: take two equally effective teachers, place them in separate schools, and their student achievement data will differ significantly. Median growth percentiles (MGPs) and other student achievement data may be telling us more about school climate and support programs than they do about individual teachers.
  • A school district might get sued over this, and we may not be able to defend the practice of using student achievement data in teacher evaluation beyond saying, "We recognize it's problematic, but the law says we have to." If my office is going to go to court over a teacher's evaluation, I'd like something more solid to rely on.
  • We can afford to give ourselves time to develop best data practices without being legally bound to risky ones.
At the very least, the role of student achievement data in teacher evaluation should be minimal until we know more about the school and teacher effects in the data.

I do appreciate the continued focus on data-driven decision-making and on identifying ways to make our public education system more effective. I look forward to the day when Utah education decisions are based on robust Utah education data.

Please vote for HB 201. Thank you for your time and efforts, and please let me know if I can be of any service.

Best wishes,
Brooke Anderson

2587 E 6710 S
Cottonwood Heights, UT 84121
801-709-8103
banders75@gmail.com