To: Francis Gibson, Aaron Osmond, Stuart Reid, Daniel McCay, Carol Moss, David Hinkins, Dean Sanpei, Greg Hughes, David Lifferth, Jim Nielson
Subject: Vote no on HB417 - Extreme Spending on Technology Can Hurt Literacy and Wastes Money
Date: Fri Feb 28 18:25:59 MST 2014
Please vote no on HB417. Spending this much money on anything demands more consideration and far more vetting of the reasoning behind it.
Is technology an elixir that cures every illness and heals every wound, to be bought at any price?
Dr. Terrence Moore has pointed out that computers are a lot more like televisions than anyone is willing to admit. It is true that art teachers can more easily show their classes great paintings, and history teachers can play actual speeches of Churchill using videos on the web. But ninety percent of the time, that is not how the computer is being used in schools. Too much technology deprives students of a richness that cannot be bought, and it up-ends teachers' roles. Pause to consider the side effects: too much technology compromises classic literacy.
Jakell Sullivan has written the following article:
Common Core’s Metric Makes Informational Texts Trump Literature
Common Core Standards’ architect David Coleman and his group, Student Achievement Partners, have created a text complexity metric designed to assess the progression of text complexity in student reading. The goal of this new metric is to elevate informational text above great and proven literary works. Hillsdale College history professor Dr. Terrence Moore detailed in his book, “Story-Killers: A Common Sense Case Against the Common Core,” how the English language arts are being destroyed by this new metric, which calls for “range” in texts. “Range,” as Dr. Moore identifies, is code for requiring modern-day, unproven, and politically biased authors to be read at accelerated rates compared to great and proven literary authors. Dr. Moore points out that this flawed Common Core reading metric actually calls for The Grapes of Wrath to be read in SECOND GRADE, while a George Clooney article would be considered a “complex text” to be read in 11th grade.
In Appendix A of the Common Core Standards we find that seven reading metric companies participated in a Student Achievement Partners’ study which helped them all align their metrics to the guidelines of the Common Core creators. Page 4 of Appendix A reads, “Each of the measures has realigned its ranges to match the Standards’ text complexity grade bands and has adjusted upward its trajectory of reading comprehension development through the grades to indicate that all students should be reading at the college and career readiness level by no later than the end of high school.”
Do English teachers need a metric aligned to Common Core (which apparently most reading metric tools now are) to understand their students’ reading levels? Utah’s HB 417 assumes that they do.
Utah’s HB 417 would spend $1 million of Utah tax dollars to provide a new technology tool that uses the Common Core–aligned Lexile reading metric for assessing English standards. The bill calls for the State Board, on or before July 1, 2014, to select one or more technology providers, through a request for proposals process, to provide licenses for a tool for students in grades 4-12.
HB 417 “enables student reading ability to be reported as a Lexile measure; uses Lexile measures to match reading materials and exercises to the comprehension level of readers.”
Is it any coincidence that MetaMetrics, the company that created the Lexile Framework for Reading, received a 3-year grant from the Bill and Melinda Gates Foundation?
Or, that Student Achievement Partners is mostly funded by Gates?
Is it any coincidence that Utah’s State School Board is being directed to use a Lexile provider for English Language Arts? What company will Utah choose?
It doesn’t really matter. They’re all the same. And, as more and more bills in Utah’s legislature simply fulfill the Obama administration’s goals for centralizing curriculum and data collection, we are losing all autonomy and agency in teaching and learning…
Additional points to consider regarding David Coleman and his role in centralizing data collection in America’s education system:
• He worked with McKinsey & Co.—the international “big data” powerhouse—which plans to acquire and then merge SBAC and PARCC, the two federally funded testing groups for Common Core, in 2014. (Utah’s testing agent, American Institutes for Research, is partnered with SBAC and plans to eventually use SBAC’s test items.)
• His New York–based data company, the Grow Network, was paid a $2.2 million contract to produce data studies for the Chicago Annenberg Challenge (CAC). The Grow Network’s objective was to “produce data to tell parents and teachers what test scores mean.” At that time, President Obama (then Senator Obama) was sitting on the CAC board, which paid for the contract, and Arne Duncan, now US Secretary of Education, was heading Chicago’s public schools.
• He was hired as the architect of the Common Core Standards.
• His group, Student Achievement Partners, managed to align all independent reading metrics to the requirements of the Common Core. One of those is Lexile, the metric required by Utah’s House Bill 417. Lexile has explained how they “integrated their measures into Obama’s Race to the Top Assessment Program application.”
• He was appointed the new head of the College Board. The College Board has shifted its mission: it is now designing curricula and curriculum frameworks and hiring Obama campaign data experts to decide who gets to attend college by accessing massive amounts of student data through curriculum platforms and tests. The GED, PSAT, AP tests, SAT, and ACT are all being aligned to Common Core.
• See “College Board’s Curricular Coup,” a nine-part series on how David Coleman and the College Board are dismantling the idea of American exceptionalism in America’s curricula and tests.