Three Thoughts on the Ugly New PISA Results – by Frederick Hess


Posted: 08 Dec 2016 09:21 PM PST
On Tuesday, the 2015 Program for International Student Assessment (PISA) results were released. The news wasn’t good. PISA is administered every three years to a sample of 15-year-olds in education systems across the globe (73 nations, states, and systems participated in 2015). The tests gauge performance in math, reading, and science. What do the results show?
Compared to the international averages, U.S. performance was middling in science, poor in math, and above average in reading. U.S. math performance dropped precipitously between 2012 and 2015, after falling noticeably from 2009 to 2012. Peggy Carr, acting commissioner of the National Center for Education Statistics (NCES), drily noted that, compared to the international average, “we also have a higher percentage of students who score in the lowest performance levels … and a lower percentage of top math performers.” U.S. performance in reading and science has also declined (slightly but steadily) since 2009, by three points in reading and six points in science.
You can peruse the NCES report for yourself, if so inclined. I don’t want to belabor things, so I’ll just offer three reflections.
One, I’m generally not a fan of using test results to assess the validity of a presidential administration’s educational efforts. Washington shenanigans are supposed to be peripheral to what happens in America’s schools, and, thankfully, that’s mostly the case. It’s hard to forget, however, that the Obama administration was cherry-picking test results to justify its machinations—back when it could find results to pick. In 2013, Secretary of Education Arne Duncan pointed to modest gains on the National Assessment of Educational Progress (NAEP) as evidence that the Common Core was working. (The argument got tendentious, especially when Duncan started selectively citing state-level scores to make his case, but we needn’t rehash that here.) When I published a scathing assessment of the Obama-Duncan legacy in fall 2015, rebuttals mostly asserted that scores had gone up on Obama’s watch. Well, just months after I wrote that piece, the new NAEP results showed unprecedented declines in reading and math. And the new PISA results tell a tale of steady decline on Obama’s watch, following years of improving U.S. performance. I won’t claim that these results demonstrate the flaws in the Obama agenda, but they sure don’t help make its case.
Two, Obama’s spinners have spent a lot of time talking up the steady increase in the U.S. graduation rate. President Obama delivered a widely covered speech this fall to celebrate that the graduation rate had climbed to 83% in 2014-15, up about four points from where it stood in 2010-11. The thing is, if more students are graduating high school even as they are faring worse on reputable assessments, it raises questions about just what those graduation rates mean. After all, diplomas are just pieces of paper—they don’t necessarily mean that students have mastered essential knowledge or skills. If a push to get students to graduate means schools are lowering the bar, turning a blind eye, or finding ways to drag them across the finish line, then those graduation rate boosts aren’t actually a cause for celebration.
Three, I often wish people were a little more reticent about racing to insist that the latest round of test results or graduation rates prove this or that. Much of the fevered discussion of the PISA results suffers from a pretty big flaw—which is that most observers don’t really understand what these international tests measure. That makes it difficult to know what one ought to make of the results. As the Brookings Institution’s Tom Loveless has observed, PISA is a test that “assesses whether students can apply what they’ve learned to solve ‘real world’ problems.” In the case of math, Loveless explains, “The PISA math assessment is based on a philosophy known as Realistic Mathematics Education (RME), championed by the Freudenthal Institute in the Netherlands. Jan de Lange of the Freudenthal Institute chairs the PISA expert group in mathematics. RME’s constructivist, problem-solving orientation is controversial among mathematicians.” Does this mean that one should look askance at the PISA findings? Does it make them more valuable? Reasonable people can disagree, but it’s useful to know what tests are measuring when we’re throwing their numbers around as evidence about educational policy or practice.
Let me be clear: I’m not saying that a given set of test results proves that Obama’s educational efforts have been misguided. I am saying that the Obama administration has been disingenuous when it has tried to use convenient data points to make its case. The reality is that these kinds of national results are so far removed from the regulatory minutiae of federal education policy, and the meaning of these test results can be so opaque, that everyone would be well served if they spent less time claiming that this or that test result or graduation rate proved a grand federal agenda was the right one.
—Frederick Hess
Frederick Hess is director of education policy studies at AEI and an executive editor at Education Next.
