Sunday, February 26, 2012

Evidence-based hypocrisy

This one meanders and is meant for heavy fire--I want to know why "evidence-based research" is an oxymoron in education. I want to know why Gardner and Marzano are revered for their research. I want to understand why the cult of personality supersedes rationality in my craft.


I love playing with numbers, and I love trying to understand the world beyond the human noise. So do a few other folks. It's what scientists do.

If you want to know if a particular action (independent variable) has a particular effect, you set up an experiment. You minimize extraneous actions as much as you can (and eliminating them all is impossible even in the simplest experiments), run your experiment, collect your results (dependent variable), then do it again. And again and again and again....

You may see a pattern emerge, you may not. The pattern you see emerge may be consistent with what you thought you knew, it may not. If you see a pattern emerge, it may have been caused by the independent variable, but, and this is critical, chances are pretty good it may not be.

If there is less than a 5% chance that your results occurred randomly, they can be considered significant--and the word "significant" means nothing more, nothing less than that. You can have results that look like change has occurred and still have no significance, and you can have significant results even when no change happened at all.
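Don't take my word for it--here's a quick sketch (mine, in Python with numpy and scipy; none of it blessed by anyone quoted here) that runs a thousand experiments where the treatment does absolutely nothing. About 5% of them sneak under the p < 0.05 wire anyway:

```python
# A minimal sketch of what "significant" buys you: simulate many
# experiments where the independent variable truly does nothing,
# and roughly 5% still come up "significant" at p < 0.05.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)   # seeded so the wiggle is repeatable
trials = 1000
false_positives = 0

for _ in range(trials):
    control = rng.normal(loc=50, scale=10, size=30)  # no treatment
    treated = rng.normal(loc=50, scale=10, size=30)  # "treatment" changes nothing
    _, p = ttest_ind(control, treated)               # two-sample t-test
    if p < 0.05:
        false_positives += 1     # looks like change; nothing happened

print(f"{false_positives / trials:.1%} of null experiments were 'significant'")
# Expect something near 5%--"significant" means unlikely under chance,
# nothing more, nothing less.
```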

One final point. Correlation does not mean causation unless you have a perfect experiment with only one variable--and this is impossible. If you want to know why some scientists get a reputation for walking around like they've got peri-anal meter sticks under their pants, try controlling anything for all variables.
***

Social scientists have three huge problems:

First, the myriad variables inherent in humans and their interactions make reducing any experiment to just one variable impossible. There are ways to minimize the noise--using huge sample sizes, for instance--but the results will always be a tad wiggly.
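A toy sketch (again mine, again Python, with invented numbers) shows why huge samples help, and why the wiggle never fully goes away--the spread of a sample average shrinks like one over the square root of the sample size:

```python
# Why big samples quiet the noise but never silence it: the spread
# of a sample mean falls like 1/sqrt(n) as the sample size n grows.
import numpy as np

rng = np.random.default_rng(0)
for n in (10, 100, 1000, 10000):
    # take 2000 samples of size n, record each sample's mean
    means = [rng.normal(size=n).mean() for _ in range(2000)]
    print(f"n = {n:>5}: spread of the sample mean = {np.std(means):.4f}")
# Each 100-fold jump in n buys a 10-fold drop in wiggle--never zero.
```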

Second, imposing an independent variable on a select population of humans to see what happens compared to a separate, similar population creates chilling ethical considerations. Autonomous mammals tend to reject such nonsense.*

One way around this is to use retrospective studies--look for a pattern among culled data instead of trying to run a true experiment. For example, I can look at the demographic data of kids taking the HSPA, and see if there is any correlation between their Zodiac sign and their math scores. If I find a significant correlation (again, less than a 5% chance these results are random), I might have something worth sharing.
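And here's why that fishing trip usually lands something--a hypothetical sketch with made-up scores and made-up signs (no real HSPA data anywhere near it): no sign has any effect on the scores, yet running a dozen sign-versus-everyone-else comparisons gives close to even odds of at least one "significant" hit:

```python
# The Zodiac fishing trip, simulated: random scores, random signs,
# twelve comparisons. With 12 shots at p < 0.05, pure noise delivers
# at least one "significant" result roughly 46% of the time.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
signs = rng.integers(0, 12, size=1200)             # 12 signs, assigned at random
scores = rng.normal(loc=200, scale=30, size=1200)  # invented scores, blind to sign

hits = 0
for sign in range(12):
    _, p = ttest_ind(scores[signs == sign], scores[signs != sign])
    if p < 0.05:
        hits += 1
        print(f"Sign {sign}: p = {p:.3f} -- book the keynote")
print(f"{hits} 'discoveries' from pure noise")
```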

The last problem may be the biggest, one that infects American education today--because social science research is so wiggly, and because it tends to use retrospective data, I can make a career latching on to a piece of data suggesting correlation, scream about it, then become the go-to guru.

This happens in the natural sciences, too--people is people--but in the natural sciences, published claims are easily tested. Scientists make a living climbing on the backs of others, destroying their colleagues' hypotheses with better ideas and better data--a lovely mud bath of human foibles exposed for all the world to see.

For reasons I still do not grasp, this doesn't happen much in education. We have "theories" without evidence, and handsome men gracing websites, paid to give sermons sharing their "research."
***

I have my suspicions. Careers are made selling snake oil, and there's a lot of money floating around public education.

I have no fear of research-based initiatives influencing what I do in the classroom. If decent, replicable studies show that my lambs will learn more science if I wear an eggplant on my head, then I will do that. In the meantime, I will continue to do what has worked reasonably well for several generations influenced by Francis Parker, John Dewey, Jerome Bruner, Margaret Donaldson, Lev Vygotsky, among many others.

I will continue to use advanced tools so long as they serve our purpose. My $2 whiteboards are superior, for what we do in science class, to our $2000 SmartBoard, but I also use 1:1 netbooks to (sometimes) good effect. My Mobi, alas, never quite took off.

I will use decent recent research, but my criteria for "decent" go beyond a pretty face and a slick repackaging of what we already know works. Daniel Willingham (who definitely does not have a slick hairdo) is a cognitive psychologist who initially studied the "brain basis of memory and learning" and now focuses on "the application of cognitive psychology to K-12 education."

I was a pediatrician before I threw my hat into the professional wrestling ring that is education. I know a lot about child development, and expected those in education would, too. (I also expected my kids to fall in love with photosynthesis at first sight--I was a bit naive going in.)

I'm all for evidence-based best practices--any superintendents out there want to try it?



*Medical research has the same issue--people are people--the Tuskegee syphilis experiment belongs in a huge Hall of Shame, as does the Fernald School, which allowed Harvard and MIT to feed radioactive cereal to the disabled children in its care. These kinds of experiments tend to be performed on the less powerful among us--the poor, the incarcerated, the children.





Young woman scientist via Shorpy/Library of Congress
Snake oil ad via Wikimedia

2 comments:

Meg Blakeney said...

Left teaching after 20+ years. 2 years in business and now I'm returning to education with a great deal of trepidation. Delighted to find someone expressing my feelings about education, the same ones that drove me away, so coherently. I wrote to Marzano and asked if his research had been peer reviewed. His reply: it has been "internally vetted". Too funny.

doyle said...

Dear Meg,

I am going to quote this far and wide--the hypocrisy is stunning.

I cannot figure out why those in education are so blind to their own fallacies.
