How often does this happen? You see headlines, within days of each other, one reporting that research shows using/eating/practicing X lets you leap tall buildings in a single bound, and the other that research shows X will crush you faster than a speeding locomotive. So how do you know which research to trust? Here are a few tips for educators, consumers, or business people, although the examples come from the current crazy world of education reform.
Did They Ask The Right Question? There’s a whole book out now on The Marshmallow Test. The original research looked at delayed gratification and reported that four-year-olds who were able to resist eating the marshmallow in front of them, when promised a second one if they waited 10 minutes, showed better test scores, graduation rates, and more a decade later.
However, a more recent study changed the test parameters. They repeated the marshmallow part, but before offering temptation they engaged the children in an art project. Partway through their creative efforts, an adult said, “Oh, I have some more art supplies that will be fun for what you’re doing. Let me get them.” Half the children received the supplies. The other half received a broken promise—no supplies.
Guess what. The children who experienced adult consistency held out for the second marshmallow. Those whose trust had been betrayed quickly gobbled up the first. So was delayed gratification a factor at all, or were the original results actually pointing to whether the child had experienced adults they could trust?
Pondering whether the right questions have been asked is imperative.
Did They Consider the Right Time Frame? In today’s environment, with the emphasis on short-term results, be they test scores or quarterly financial statements, often the time frame being examined doesn’t fit the practices being evaluated. For example, some key new research on social programs in schools such as teaching emotional control or self-efficacy, shows these programs have a significant positive impact on student test performance three to five years after the start of the intervention. Similar findings are often seen in areas such as teaching students multiple ways to solve mathematics problems instead of only standard algorithms. At first, test scores may even go down, but three to five years of this method of instruction results in deeper conceptual knowledge.
Are you jettisoning things that are working before the effects have percolated long enough to bring results?
Do You Want Their Results, Anyway? Recently, using highlighters to mark texts was “debunked” as a way to study. Again, diving into the study parameters revealed two key factors that warrant caution in tossing out highlighting altogether:
- Most of the students using the highlighters were randomly marking text. They hadn’t had instruction on using the method to highlight main ideas or connections.
- The study focused on factual knowledge. The study itself reported that when students used the knowledge as part of a research project or applied it in problem-solving—in other words, when they had an actual need for the information—highlighting proved to be a reliable tool. On the other hand, it seemed to interfere with making inferences. Dicey, isn’t it?
Do you care about knowledge recall? Or are you providing students with tools for accessing information? Both are important for different purposes. The danger is that we’ll throw out a practice such as highlighting rather than discern when it is and isn’t useful.
Can They Answer the Question They’re Asking? If you’ve been watching the backlash on neuroscience and brain research, you’ve been seeing headlines about claims where none are warranted. Listen to my colleague Dr. Dario Nardi discuss his research on how the brains of people with different preferred cognitive processes react to the same stimuli. He found early on in his research that he could see the differences, but without asking the subjects what they were thinking, his interpretations were bound to be off-base. Books such as A Skeptic’s Guide to the Mind now demonstrate how spot-on Dr. Nardi’s research methods are, even though they go against the research practices necessary to be considered for most journal publications.
So before you read the research, make sure you’re clear on the real question you’re asking, what you care about, and whether you’re getting the whole story.