
Saturday, March 24, 2012

Action pseudo-research



 
My district's recent incursions into action research have been interesting. I have the extreme fortune of sharing my prep room with a retired bench research scientist, published in multiple peer-reviewed science journals. We are tackling a specific method of inquiry, but we quickly realized that getting any reasonable data will require far bigger numbers than our classes will likely generate.

Our administration responded reasonably, explaining that the process of looking at data generated in our classrooms will encourage teachers to look critically at specific classroom practices. No one is pretending that we will develop statistically significant findings (p < 0.05).
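For a sense of scale, here's a back-of-the-envelope simulation--invented numbers, not district data--of how often a two-group comparison reaches p < 0.05 when a modest real effect actually exists:

```python
# Rough sketch: how often does a comparison reach p < 0.05 when a
# modest effect (Cohen's d ~ 0.3) really exists? Effect size and
# group sizes are invented for illustration, not from any real study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def power_estimate(n_per_group, effect_size=0.3, trials=5000):
    """Fraction of simulated experiments reaching p < 0.05."""
    hits = 0
    for _ in range(trials):
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(effect_size, 1.0, n_per_group)
        hits += stats.ttest_ind(control, treated).pvalue < 0.05
    return hits / trials

for n in (25, 100, 400):  # one class, a grade level, a district
    print(f"n = {n:3d} per group -> power ~ {power_estimate(n):.2f}")
```

A single class of 25 detects that effect well under half the time; you need district-scale numbers before p < 0.05 becomes a realistic goal.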

In education, a field where Marzano's and Gardner's "research" cause real (and possibly destructive) changes in the classroom, the lack of concern for the validity of the studies used to measure effective outcomes scares the crap out of me. Charismatic personalities trample over available evidence.

On Twitter I stumbled onto a group of ed folks setting up an action research project. At least one university professor was involved, someone I've met, so I figured I'd jump in on the open invitation.

I wondered aloud about how we might generate statistical significance from our work; to be fair, I wasn't clear if that was even the goal. I was told that my medical background imposed a biased logical positivistic view, one too narrow for endeavors such as this one, and that I needed to consider other ways of viewing the world, including "intentional observation," which is, ironically, exactly how science works.

Experiment is in fact intelligent and intentional observation.
Robert Boyle, Epoch Men, 1868

(I also happened to major in philosophy, leaving Michigan with a B.S. in the subject back in 1982--logical positivism had been declared dead in the 1970s. *sigh*)
***


Medicine and education have scary parallels. Medicine only recently advanced beyond the snake oil stage, with doctors kicking and screaming every step of the way.

Docs like to treat things. Patients like to be treated. Docs like to get paid. Patients are not quite as happy to pay. Our motto is primum non nocere--"Above all, do no harm"--which is a whole lot different than "Make them better!"

Snake oil is (mostly) harmless for self-limited illnesses, it makes patients feel like they're getting something, and docs make money. You don't need antibiotics for the vast majority of cases of sinusitis. Most docs will prescribe them anyway.

Medicine started looking at itself back in the 1970s, around the same time logical positivism was declared dead by professional philosophers. I was in medical school when evidence-based medicine started taking hold, and it was both liberating and frightening--most of what we did, we did because, well, that's the way it's always been done. Sound familiar?

Education also finds snake oil useful--and it is for those selling it. Lots of folks make lots of money selling snake oil.

But in education, snake oil is harmful. A child's education is not a self-limited illness. We need to pay more attention to what we know through our research than to what we know "in our hearts." We need to pay particular attention to the folks who hide behind pseudo-research, tossing out fluorescent graphs, cooked numbers, and charismatic smiles.

I'd be glad to participate in some research. Medicine abandoned leeches not so long ago. It's time we pushed some leeches out of education.





Photo of leeches from LiveScience

Sunday, February 26, 2012

Evidence-based hypocrisy

This one meanders and is meant to draw heavy fire--I want to know why "evidence-based research" is an oxymoron in education. I want to know why Gardner and Marzano are revered for their research. I want to understand why the cult of personality supersedes rationality in my craft.


I love playing with numbers, and I love trying to understand the world beyond the human noise. So do a few other folks. It's what scientists do.

If you want to know if a particular action (independent variable) has a particular effect, you set up an experiment. You minimize extraneous actions as much as you can (and eliminating them all is impossible even in the simplest experiments), run your experiment, collect your results (dependent variable), then do it again. And again and again and again....

You may see a pattern emerge, you may not. The pattern you see emerge may be consistent with what you thought you knew, it may not. If you see a pattern emerge, it may have been caused by the independent variable, but, and this is critical, chances are pretty good it may not have been.

If there is less than a 5% chance that your results would have occurred by chance alone, they can be considered significant--and the word "significant" means nothing more, nothing less than that. You can have results that look like change has occurred and still have no significance, and you can have significant results that show change did not happen.
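A quick illustration of that distinction, with made-up numbers: a trivially small effect in an enormous sample comes back "significant," while a big-looking gap between two small groups often doesn't.

```python
# Invented numbers illustrating what "significant" does and doesn't mean.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# A trivially small effect (0.05 SD) in a huge sample: statistically
# significant, educationally meaningless.
a = rng.normal(0.00, 1.0, 20000)
b = rng.normal(0.05, 1.0, 20000)
print("huge n, tiny effect:", stats.ttest_ind(a, b).pvalue)

# A large-looking gap with only eight students per group: looks like
# change, but will frequently fail to clear the 5% bar.
c = rng.normal(0.0, 1.0, 8)
d = rng.normal(0.8, 1.0, 8)
print("tiny n, big gap:    ", stats.ttest_ind(c, d).pvalue)
```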

One final point. Correlation does not mean causation unless you have a perfect experiment with only one variable--and this is impossible. If you want to know why some scientists get a reputation for walking around like they've got peri-anal meter sticks under their pants, try controlling for all the variables in anything.
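The classic (entirely hypothetical) demonstration: let a hidden third variable drive two others, and a strong correlation appears where no causation exists.

```python
# Hypothetical confounder sketch: daylight drives both ice cream sales
# and drownings; neither causes the other, yet they correlate.
import numpy as np

rng = np.random.default_rng(0)
daylight  = rng.uniform(9, 15, 500)                 # the hidden variable
ice_cream = 2.0 * daylight + rng.normal(0, 1, 500)
drownings = 0.5 * daylight + rng.normal(0, 1, 500)

r = np.corrcoef(ice_cream, drownings)[0, 1]
print(f"correlation between ice cream and drownings: r = {r:.2f}")
```

Control for daylight and the correlation evaporates--that's the variable the perfect experiment would have held fixed.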
***

Social scientists have three huge problems:

First, the myriad variables inherent in humans and their interactions make reducing any experiment to just one variable impossible. There are ways to minimize the noise--using huge sample sizes, for instance--but the results will always be a tad wiggly.

Second, imposing an independent variable on a select population of humans to see what happens compared to a separate, similar population creates chilling ethical considerations. Autonomous mammals tend to reject such nonsense.*

One way around this is to use retrospective studies--look for a pattern among culled data instead of trying to run a true experiment. For example, I can look at the demographic data of kids taking the HSPA and see if there is any correlation between their Zodiac sign and their math scores. If I find a significant correlation (again, less than a 5% chance the result is random), I might have something worth sharing. (A sketch of how easily this goes wrong follows below.)

The last problem may be the biggest, one that infects American education today--because social science research is so wiggly, and because it tends to use retrospective data, I can make a career latching on to a piece of data suggesting correlation, scream about it, then become the go-to guru.
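To see how cheap such a "finding" is, here's the Zodiac hunt above run on purely random, invented scores: twelve identical groups, sixty-six pairwise comparisons, and chance alone hands you a few "significant" results to build a career on.

```python
# The retrospective-data trap, with invented numbers: twelve zodiac
# groups drawn from the SAME distribution, so no real effect exists.
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(2012)
signs = [rng.normal(200, 30, 150) for _ in range(12)]  # identical groups

pairs = list(combinations(signs, 2))  # 66 pairwise comparisons
false_hits = sum(stats.ttest_ind(a, b).pvalue < 0.05 for a, b in pairs)
print(f"{false_hits} of {len(pairs)} comparisons look 'significant' "
      "by chance alone")
```

At a 5% threshold you expect roughly three of the sixty-six comparisons to come back "significant" for no reason at all--any one of which could be screamed about.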

This happens in the natural sciences, too--people is people--but in the natural sciences, published claims are easily tested. Scientists make a living climbing on the backs of others, destroying their colleagues' hypotheses with better ideas and better data--a lovely mud bath of human foibles exposed for all the world to see.

For reasons I still do not grasp, this doesn't happen much in education. We have "theories" without evidence, and handsome men gracing websites, paid to give sermons sharing their "research."
***

I have my suspicions. Careers are made selling snake oil, and there's a lot of money floating around public education.

I have no fear of research-based initiatives influencing what I do in the classroom. If decent, replicable studies show that my lambs will learn more science if I wear an eggplant on my head, then I will do that. In the meantime, I will continue to do what has worked reasonably well for several generations influenced by Francis Parker, John Dewey, Jerome Bruner, Margaret Donaldson, and Lev Vygotsky, among many others.

I will continue to use advanced tools so long as they serve our purpose. My $2 whiteboards are superior, for what we do in science class, to our $2000 SmartBoard, but I also use 1:1 netbooks to (sometimes) good effect. My Mobi, alas, never quite took off.

I will use decent recent research, but my criteria for "decent" go beyond a pretty face and a slick repackaging of what we already know works. Daniel Willingham (who definitely does not have a slick hairdo) is a cognitive psychologist who initially studied the "brain basis of memory and learning" and now focuses on "the application of cognitive psychology to K-12 education."

I was a pediatrician before I threw my hat into professional wrestling education. I know a lot about child development, and expected those in education would, too. (I also expected my kids to fall in love with photosynthesis at first sight--I was a bit naive going in.)

I'm all for evidence-based best practices--any superintendents out there want to try it?



*Medical research has the same issue--people are people--the Tuskegee syphilis experiment belongs in a huge Hall of Shame, as does the Fernald School, which allowed Harvard and MIT to feed radioactive cereal to children with intellectual disabilities. These kinds of experiments tend to be performed on the less powerful among us--the poor, the incarcerated, the children.





Young woman scientist via Shorpy/Library of Congress
Snake oil ad via Wikimedia