Archive for the ‘Ranting’ Category

Random thoughts while procrastinating, #2057

May 31, 2014

Whenever I hear “Tie A Yellow Ribbon Round The Ole Oak Tree” or “Knock Three Times,” I chafe at the preposterous scenarios therein. How can Tony Orlando & Dawn have not one but TWO #1 hits celebrating convoluted communication strategies? Why can’t their characters just use words?

That thought is often followed by another one, though: given the existence of these two atrocious nonverbal-communication songs, wouldn’t it be nice if TO&D had a third one, just to complete the set?

I’m imagining something like, “If You Can’t Stay Away, Make Me A Soufflé; Otherwise, Pasta Is Fine.”

A bridge too far

March 20, 2014

A Matter Of Trust – The Bridge To Russia, a CD and video set based on Billy Joel’s 1987 concerts in the Soviet Union, is available for pre-order.

An “Editorial Review” at amazon.com says the following:

Billy has always considered that going to Russia was the most important thing he’d ever done. The freedom and excitement of his presence permanently affected the country and played no small role in the ultimate dissolution of the U.S.S.R. in 1991.

I fear that this “Piano Man Diplomacy” hypothesis has not yet had a fair hearing among 20th century historians. Aren’t many of them still naively attributing the Soviet Union’s collapse to economic problems and the like?

[image caption: the music that toppled an empire]

The wisdom of crowds?

February 9, 2014

From the Seattle Times: 700,000 at Seahawks parade? Doesn’t add up, experts say.

It’s a lighthearted article, but it touches on the methodology of crowd estimation and uses some basic math to show that the number of parade attendees was less than the official estimate of 700,000.

How did readers respond to this dollop of evidence-based analysis? There were several themes, as exemplified by the following online quotes. (My interpretations are in brackets.)

(1) Semians: “Quit trying to overanalyze everything and simply live in the moment.” [We shouldn’t care about this information.]

(2) picklesp: “It sure as heck felt like 700,000 from ground level.” [This information doesn’t match my personal experience.]

(3) Peterkirk: “These guys are just trying to rain on our parade for whatever reason, and well tough, it didn’t rain on the parade on Wednesday and your diatribe (reporting?) isn’t going to make it rain today.” [This information doesn’t make me feel good, so I’ll ignore it.]

(4) Mr. Mytzlplk: “The ‘experts’ vary by 200,000 in their ‘real analysis.’ The fact that they’re so far off from each other tells me that they don’t know what they’re doing.” [Experts disagree about the details, so their analyses are worthless.]

(5) picklesp: “Experts get paid to pontificate.” [Experts have their own biases and agendas — which is true.]

(6) gloryhound: “I’m also skeptical of these two ‘experts’’ qualifications.” [The “experts” aren’t really experts.]

The above excerpts are from some of the readers’ HIGHEST-rated comments. Here are two of the LOWEST-rated ones:

[from dawgsage:]

Actually 2.5 sq. ft /person is a square of almost 19 inches per side. Measuring the width of my body without a coat shows approximately 19 inches shoulder to shoulder, with a coat let’s add an inch making it 20 inches. A 2.5 square foot rectangle, with one side 20 inches would then require the other side to be 16.2 inches, from front to back. Conservatively, my measurement is 10 inches front to back. This means there would be 6.2 inches forward from my front to the back of the body of the person in front, and 6.2 inches in the back of me to the body of the next person, while laterally I am shoulder to shoulder to the adjacent people. So no I do not think it is not like standing in line, you really can’t get more crowded than that unless you were in an Iraqi prison under Saddam. So I believe the basis of the low estimates are credible.

[from CO Dawg:]

Rather than just say “well I don’t believe you!” to the experts, just do this simple experiment: put on a winter jacket (remember, it was cold that day) and stand against the wall with your arms against your side, then have someone mark the wall with chalk at your elbows. Measure that width. Then turn sideways and mark again the two widest points (belly and bottom for me, your points may vary). Measure that width.

Now, grab a calculator and multiply your personal width by personal depth. That is the square footage of space you occupy if you were standing in a crowd elbow to elbow belly to back and back to belly, like at a rock concert, and represents a good indication of the maximum crowd density at the parade.

When i did this with a sweatshirt on i came up with 2.1 feet wide and 1.25 feet deep, for 2.65 square feet, a little above the minimum cited. However, if i put on a winter jacket it adds an inch to all four sides so the measurements jump up to 2.25 by 1.42 feet, or 3.2 square feet. Adding just one more inch to each measurement increases my footprint to 3.8 square feet, and adding 3 inches increases it to 4.7 square feet. I wont presume anything about your personal space requirements, but when someone is 3 inches away from me, i still feel pretty crowded. I can thus conclude that the experts have presented a reasonable range for each person’s footprint

I have no means to measure the overall footprint of the crowd along the route, but had they asked me to do crowd estimates i would have employed the same methodology they use (measurements from an aerial photo), and probably would have come up with numbers similar to their’s. I would have multiplied the overall crowd foot print by an average space per person of 3.5 square feet (generally splitting the difference between my numbers), added 15% to account for people standing outside the footprint or watching from offices, and likely come up with a forecast of somewhere between 350-400k. Which is still a heck of a crowd.

And for those of you dismissing my opinion because of my location, we’re not immune to overly enthusiastic crowd estimates in Denver, too. I was at a presidential campaign speech in Civic Park that supposedly was attended by 100,000 people, and didnt even need to do the math to know that estimate was comically high.
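CO Dawg’s method is easy to replay in a few lines of code. Below is a minimal Python sketch of the same arithmetic; the body measurements come straight from the comment above, while the route footprint is a made-up placeholder, since the comment (rightly) doesn’t claim to know it.

```python
# A replay of CO Dawg's back-of-the-envelope method, using the
# measurements quoted above. The route footprint is a hypothetical
# placeholder, since the comment doesn't supply one.

def personal_footprint(width_ft, depth_ft):
    """Square feet occupied by one person standing in a packed crowd."""
    return width_ft * depth_ft

def crowd_estimate(route_area_sqft, sqft_per_person=3.5, outside_factor=1.15):
    """People in the measured footprint, plus 15% watching from outside it."""
    return route_area_sqft / sqft_per_person * outside_factor

print(personal_footprint(2.1, 1.25))    # 2.625 sq ft (sweatshirt)
print(personal_footprint(2.25, 1.42))   # ~3.2 sq ft (winter jacket)

# With a (made-up) 1.1-million-sq-ft route footprint, the method lands
# in the commenter's 350-400k range -- well under the official 700,000.
print(round(crowd_estimate(1_100_000)))  # 361429
```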

So, to summarize: dismissal of the information for any old reason? Thumbs up! Attempts to check the math and verify its reasonableness? Thumbs down!

While no legislation hinges on this particular estimate, I’m troubled by the attitudes displayed here, i.e., limited interest in the nuances of data and relevant expertise. I submit that, in other arenas, this limited interest has led to the popularity of positions like “evolution is just a theory,” “vaccines cause autism,” “global warming is a hoax,” and “animal testing is unnecessary.”

In response, we scientists can grumpily bemoan an incurious public … or we can recognize that facts alone don’t always move the needle of public opinion, and we can get better at appealing to people’s emotions and imaginations.

[image caption: comments from a data denialist]

[image caption: do the math!]

The “13th Man” speaks his mind

February 2, 2014

When I began running marathons in 1999, my parents had very different reactions. My dad seemed interested in my competitive success; at the least, he understood my drive to achieve a certain time or ranking. My mom thought that such events were pointless and mainly wanted to know that I got home safely.

As I think about today’s Super Bowl, I find myself feeling a lot like my mother.

Here’s my three-point stance on tackle football:
1. Football puts participants’ brains and bodies at great risk (as discussed previously on this blog and in the excellent Frontline news documentary League of Denial).
2. My interest in people’s long-term health should trump my interest in their feats of athleticism.
3. Watching football is a tacit endorsement of the sport in its current, dangerous form. Therefore, I do not watch football.

I haven’t always been this way. As a child, I was a rabid supporter of the New York Jets. Some might speculate that THIS is the problem — how could a Jets fan experience football at its finest, or avoid disillusionment?

On the contrary, I was thrilled by the tackle-evading halfback Freeman McNeil and the quarterback-hunting linemen Mark Gastineau and Joe Klecko (the “New York Sack Exchange”). Watching athletes like these has always been exciting and fun. But how much should they have to sacrifice for the sake of entertaining me? Am I just another Roman spectator enjoying the spectacle of bludgeoned, bloodied gladiators?

Not anymore. When the Seattle Seahawks and the Denver Broncos take the field in a few hours, I won’t be watching.

As my mom might say, I just want everyone to make it home safe and sound.

Calculations of bees’ impact on strawberries’ market value

January 16, 2014

In the course I’m currently teaching, we’ve been reading the paper Bee pollination improves crop quality, shelf life and commercial value by Björn K. Klatt et al.

As explained nicely by Erik Stokstad, the paper documents how strawberries benefit from pollination by bees, as opposed to pollination by wind or self-fertilization. It turns out that, on average, bee-pollinated strawberries are larger than others and also have fewer odd shapes, better color, and superior firmness. This detailed look at strawberry quality is a useful extension of past studies showing that insect pollination often boosts crop quantity (i.e., yield).

In making their case for the agricultural importance of bees, Klatt et al. say that bee pollination accounts for at least $1.44 billion of the value of the $2.90 billion strawberry market in the European Union (EU). While I accept the take-home message that bees add a lot of value, I sure wish the authors had explained their calculations better.

The $1.44 billion estimate is the sum of two factors: $1.12 billion in market value of the fresh berries, and another $0.32 billion corresponding to improved shelf life. Let’s consider each of these in turn.

The figure of $1.12 billion is introduced in this section of the Results:

Bee pollination resulted in strawberry fruits with the highest commercial value (figure 1a). On average, bee pollination increased the commercial value per fruit by 38.6% compared with wind pollination and by 54.3% compared with self-pollination. Fruits resulting from wind pollination had a 25.5% higher market value than self-pollinated fruits. Pollination treatments were stronger than differences between varieties and thus had a main effect across all varieties (see table 2 for AICc and likelihood values). Our results suggest that altogether, bee pollination contributed 1.12 billion US$ to a total of 2.90 billion US$ made with commercial selling of 1.5 million tonnes of strawberries in the EU in 2009 [1]—but so far without consideration of the monetary value provided by enhanced shelf life (see below).

Figure 1 indicates that the mean value of 1000 wind-pollinated berries was ~$13.80 and the mean value of 1000 bee-pollinated berries was ~$22.40. The value of the wind-pollinated berries thus represents a ~38.6% DECREASE in value relative to the bee-pollinated berries; alternatively, the value of the bee-pollinated berries is a 62.9% INCREASE over the value of the wind-pollinated ones. A 62.9% increase takes us from $1.78 billion (the hypothetical value of strawberries pollinated only by wind) up to $2.9 billion, giving us the reported $1.12 billion boost from bees. The authors’ mention of a 38.6% increase when they meant a 38.6% decrease is not exactly a big deal, but it initially made their math baffling to me and my students.
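Spelled out, the arithmetic goes like this (a quick Python sanity check based on the reported figures):

```python
# Convert the reported 38.6% DECREASE (wind vs. bee) into the
# corresponding INCREASE (bee vs. wind).
decrease = 0.386
increase = 1 / (1 - decrease) - 1
print(f"{increase:.1%}")              # 62.9%

# Work backward from the $2.90 billion EU market total.
total = 2.90                          # billion US$
wind_only = total / (1 + increase)    # hypothetical wind-pollination-only value
print(f"{wind_only:.2f}")             # 1.78
print(f"{total - wind_only:.2f}")     # 1.12 -- the reported bee contribution
```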

Perhaps more significantly, the reported market values reflect the classification of strawberries into commercial grades by the first author. Ideally, the first author would have rated the berries while “blinded,” i.e., without knowing which ones came from which treatments (bees, wind, or self). The paper doesn’t mention blinding, so I fear that there was the potential for a pro-bee bias.

Now for the $0.32 billion due to improved shelf life:

Bee pollination strongly impacted the shelf life of strawberries by improving their firmness (figure 2a). The firmness values of each treatment and variety were related to shelf life, measured as the number of days until 50% of fruits had been lost owing to surface and fungal decay (see the electronic supplementary material, S3). Higher firmness resulting from bee pollination potentially elongated the shelf life of strawberry fruits by about 12 h compared with wind pollination, and by more than 26 h compared with self-pollination. After 4 days in storage, only 29.4% of the wind-pollinated fruits and none self-pollinated fruit were still marketable, whereas, at the same time, 40.4% of the bee-pollinated fruits remained in a marketable condition. Thus, bee pollination accounted for a decrease of at least 11.0% in fruit losses during storage. These findings suggest that the value for bee pollination calculated in section 3a(i) has to be increased to accommodate this impact on the shelf life of strawberries. Hence, pollination benefits on the shelf life of strawberries potentially added another 0.32 billion US$ to the commercial value of strawberry pollination.

Here it’s clear that the authors got $0.32 billion by multiplying 11% by $2.9 billion. What’s less clear is the meaning of “storage” (did the unspecified storage conditions simulate those typically used by strawberry farmers/distributors/vendors?) and the reason(s) why a duration of 4 days was used in this calculation (is this a typical time between harvesting and consumers’ purchases?).
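For what it’s worth, the numbers themselves are easy to reproduce from the quoted passage:

```python
# The 11.0% is the difference in fruit still marketable after 4 days of
# storage; the $0.32 billion applies that difference to the market total.
bee_marketable, wind_marketable = 0.404, 0.294
loss_reduction = bee_marketable - wind_marketable
print(f"{loss_reduction:.1%}")           # 11.0%
print(f"{loss_reduction * 2.90:.2f}")    # 0.32 (billion US$)
```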

Such details aside, here’s a more general question relevant to both components of the $1.44 billion estimate. To what extent are commercial strawberries pollinated by bees in the wild?

The calculation assumes that the study site — a field in Germany — is representative of most or all commercial strawberry farms in Europe. The study site was intentionally set up near well-established bee hives and nests; are most or all European strawberry farms situated similarly? Perhaps the answer is obvious to people with relevant expertise, but the paper doesn’t say. It’s worth noting that if only half of commercial strawberry fields enjoy bee pollination, the estimates of bees’ economic impact would need to be cut in half.

Considering that the paper trumpets a billion-dollar claim in its abstract, more information on the calculations underlying that claim would have been appropriate. At least that’s how I see it — comments from real ecologists (Jeremy?), as well as others, are welcome!

“Wheat Belly” by William Davis, MD

December 11, 2013

This post is for my dad, who bought me Wheat Belly so that I could read it and tell him what I thought of it.

WHAT’S GOOD ABOUT THIS BOOK?

1. Just as Born to Run questioned (in its entertaining but overblown way) whether we should trust shoe companies to tell us what our footwear needs are, this book advocates a healthy skepticism regarding agribusiness-influenced nutritional guidelines.

2. The book may help some people understand celiac disease and gluten-free diets. I hadn’t previously read much about gluten and was pleased to discover details about the gliadin and glutenin proteins that compose it.

3. Likewise, I was pleased to read about wheat’s history of genetic and morphological changes, which author William Davis discusses in detail. Even if the health implications of these changes are debatable, it’s interesting to contrast the ancient einkorn wheat (14 chromosomes) and emmer wheat (28 chromosomes) with modern dwarf Triticum species (42 chromosomes).

4. There may be some truth to at least one of the author’s provocative assertions, i.e., that the breeding of wheat for more optimal baking properties may have contributed to the rise in celiac disease (van den Broeck et al. 2010).

WHAT’S BAD ABOUT THIS BOOK?

1. The central message — that wheat is evil — is irrationally and offensively simple-minded. There are a few surprising nuances in the book, such as the admission that the wheat hybridization work of Norman Borlaug helped solve the problem of world hunger. Yet Davis continually returns to his “wheat is evil” rhetoric without apparent hesitation or irony.

2. The language is often hyperdramatic at the expense of clarity and accuracy. For example, Joe Schwarcz (2013) bemoans the way wheat is blamed for osteoporosis because of its production of (tiny amounts of) sulfuric acid. “Davis panics readers with totally irrelevant statements about sulphuric acid causing burns if spilled on skin,” Schwarcz says. “Get it in your eyes and you will go blind. True, but what does that have to do with traces formed in the blood from cysteine?” Melissa McEwan (2011) accuses Davis of “anti-technology scare-mongering, preying on the agricultural ignorance of the average consumer.” She notes, “Telling me … that wheat has ‘undergone extensive agricultural genetics-engineered changes’ is hardly terrifying to me, as this describes almost all seeds on the market today.”

3. Davis writes with smug confidence about the devastating effects of wheat on human health, yet the scientific evidence for many of his claims is weak, misrepresented, or nonexistent. Pete Bronski (2012) dissects three examples of Davis’s misuse of the biomedical literature; additional examples are below.

WHAT ARE THE CENTRAL CLAIMS MADE BY THIS BOOK, AND HOW REASONABLE ARE THEY?

Wheat Belly makes numerous claims. Below are five that I personally consider important, along with some comments about their validity.

1. Wheat is unique in elevating blood sugar to unhealthy levels.

There are two issues here.

First, wheat products do not have an exceptional, uniformly high Glycemic Index (GI), the standard measure of glucose release into the blood. As McEwan writes, “If there is something special about wheat spiking blood sugar, why do some wretched coarser breads measure in the low thirties and forties (lower than many fruits and sweet potatoes), and so many gluten free breads measure so much higher? Davis mentions that the latter is often made from extremely refined processed rice, tapioca, and corn. And thus we have the answer — highly digestible carbohydrates, no matter what their provenance, are high glycemic.”

A second issue is that the GI of any carbo-rich food, including wheat, can be reduced by eating it with protein and fat. Davis describes a personal experiment in which he ingested four ounces of organic whole-wheat bread, and, sure enough, his blood glucose rose a lot. But who (other than my Uncle Scott) eats large quantities of low-fat bread, all by itself, in one sitting? It’s a pretty artificial situation, and one that is easily remedied by the addition of some peanut butter or meat or whatever.

2. Wheat is addictive.

There are multiple reasons to question this claim, which is based on the idea that wheat protein gets broken into peptides known as exorphins, which act on the brain as opioids.

First, exorphins released from wheat cannot exit the gastrointestinal tract without being further digested into non-drug-like molecules. Fred Brouns et al. (2013) explain: “Gliadorphin consists of seven amino acids (Tyr-Pro-Gln-Pro-Gln-Pro-Phe) and, as such, cannot be absorbed by the intestine. This is because the intestine peptide transporter PepT1 transports only di- and tripeptides (Gilbert et al., 2008) and transporters for larger peptides have not been identified. Gliadorphin is therefore not present in intact form in the human circulatory system and cannot reach and have an effect on the cells of the central nervous system.”

Second, as covered by Julie Jones (2012), proteins from milk and other sources yield exorphins too, so wheat is not unique in this respect.

Third, Jones (2012) also points out that gluten stimulates the release of the hormones cholecystokinin and glucagon-like peptide 1, which contribute to a feeling of fullness or satiety. This is contrary to the vicious cycle proposed by Davis, in which ingesting some wheat leads to a craving for more.

3. Wheat causes obesity. Eliminating it leads to weight loss.

Michael Casper (2012) summarizes the evidence as follows: “We have to take Davis’s word that his anti-wheat diet works, because most of the cases of weight loss and recovery from illness are from his own practice.”

Jones (2012) and Brouns et al. (2013) also cite a paper from the Framingham Heart Study (E.A. Molenaar et al., Diabetes Care 2009) showing a negative correlation between whole-wheat consumption and risk of obesity. While this finding of a correlation is not a smoking gun, it is the opposite of what Davis would predict.

If wheat itself were a cause of weight gain, replacing wheat with an equivalent number of non-wheat calories would lead to weight loss. No controlled peer-reviewed study along these lines is cited by Davis; perhaps he should use some of his book royalties to finance one.

4. Wheat contributes to many other diseases besides diabetes and obesity (autism, schizophrenia, etc.).

Davis devotes full chapters to wheat’s effects on the body’s pH (Chapter 8), advanced glycation end-products (AGEs) (Chapter 9), heart disease (Chapter 10), the brain (Chapter 11), and skin (Chapter 12). There is no concise way to address all of these, but, as examples, Chris Masterjohn (2011) pokes many holes in Davis’s stories about pH and AGEs, and Jones (2012) notes a lack of strong links between wheat and autism, ADHD, or schizophrenia.

Casper (2012) adds, “Davis’s claims about wheat and schizophrenia are based on very old papers — nothing within the past 25 years — and this is another example of an abuse of definitions in order to further an anti-wheat agenda.”

5. Modern wheat is the product of scientific crossbreeding experiments. Its safety has never been tested.

Davis does not define the safety tests that he thinks are absent but needed. Apparently he considers observational studies inadequate and would prefer clinical trials (i.e., much stronger evidence than he uses to support his wheat-is-evil hypothesis). In any case, he misleads us with his implication that crossbreeding leads to radical, unpredictable, likely-to-be-dangerous changes.

For example, Davis says, “Analyses of proteins expressed by a wheat hybrid compared to its two parent strains have demonstrated that, while 95 percent of the proteins expressed in the offspring are the same, 5 percent are unique, found in neither parent.” However, his reference (Xiao Song et al., Theoretical and Applied Genetics 2009) simply reports that parents and offspring differ in the amounts of some of the proteins they produce, not in which proteins are present. Davis: “In one hybridization experiment, fourteen new gluten proteins were identified in the offspring.” Reality: the 14 proteins were slightly mutated versions of parental proteins, not completely new ones, and the study cited (Xin Gao et al., Planta 2010) involved somatic cell hybridization, a complicated laboratory technique not typically used by wheat breeders (NWIC 2012). Davis: “The genetic modifications created by hybridization for the wheat plants themselves were essentially fatal, since the thousands of new wheat breeds were helpless when left to grow in the wild.” Reality: yes, a plant that once grew in the wild has been adapted for domestic food production. The varieties now optimized for food production are no longer optimized for surviving in the wild, and thus do not thrive outside farms. This is neither surprising nor worrisome nor unique to wheat.

Finally, while other grains have also undergone extensive optimization, Davis never comments on the changes to the genomes of corn, rice, etc. In Casper’s (2012) words,

Davis argues in the second chapter of Wheat Belly that wheat has been genetically modified in the past few decades beyond all recognition, and this modification is somehow the source of the danger that wheat poses to us. It is a cunning argument, because it is certainly the case that genetic modification has taken place, and it is difficult to disprove that such changes have been harmless. But why stop at wheat? Corn, for instance, has certainly been subjected to dramatic genetic modification, as discussed in a 2003 article from the journal PLOS Biology. Why single out the genetic modifications of one crop, and not consider the implications of all the others? Or is Corn Belly intended as a sequel?

CONCLUSION

Wheat Belly offers a simple, seemingly research-based solution to a myriad of ailments. It has proven tremendously popular, especially among low-carb advocates. However, its narrative is based on “cherry-picked data, inflammatory hyperbole, misused science, irrelevant references and opinion masquerading as fact” (Schwarcz 2013). As such, it is hardly the slam-dunk case against wheat that it purports to be.

REFERENCES

Pete Bronski. Wheat Belly, busted. No Gluten, No Problem (blog), March 20, 2012.

Fred J.P.H. Brouns, Vincent J. van Buul, and Peter R. Shewry. Does wheat make us fat and sick? Journal of Cereal Science 58: 209-215, 2013.

Michael Casper. Wheat Belly, a book review. Caspersfarm’s Blog, September 13, 2012.

Julie Jones. Wheat Belly — an analysis of selected statements and basic theses from the book. Cereal Foods World 57(4): 177-189, 2012.

Chris Masterjohn. Wheat Belly — the toll of hubris on human health. The Daily Lipid (blog), October 12, 2011.

Melissa McEwan. Wheat Belly. Hunt Gather Love (blog), October 8, 2011.

National Wheat Improvement Committee (NWIC). Wheat improvement: the truth unveiled. USDA.gov (website), 2012.

Joe Schwarcz. Wheat Belly gives me a belly ache. McGill Office for Science and Society (blog), June 29, 2013.

Hetty C. van den Broeck et al. Presence of celiac disease epitopes in modern and old hexaploid wheat varieties: wheat breeding may have contributed to increased prevalence of celiac disease. Theoretical and Applied Genetics 121(8): 1527-1539, 2010.

Misconceptions about misconceptions?

October 28, 2013

In checking my Google Scholar profile the other day, I was happy to find that a recent paper in CBE Life Sciences Education cited my 2012 review article on the use of music in science education. I was less happy to discover that my paper was cited as an example of bad pedagogy.

April Cordero Maskiewicz and Jennifer Evarts Lineback summarize their own paper as follows:

The goal of this paper is to inform the growing BER [Biology Education Research] community about the discussion within the learning sciences community surrounding misconceptions and to describe how the learning sciences community’s thinking about students’ conceptions has evolved over the past decade. We close by arguing that one’s views on how people learn will necessarily inform pedagogy. If we view students’ incorrect ideas as resources for refinement, rather than obstacles requiring replacement, then this model of student thinking may lead to more effective pedagogical strategies in the classroom.

I find this position interesting and sensible. I agree, of course, that how we teach should be based on research on how people learn. More specifically, I’m sympathetic to the viewpoint that (in the authors’ words) “Learning … is not the replacement of one concept or idea with another”; rather, “students learn by transforming and refining their prior knowledge into more sophisticated forms.”

The article provides two useful examples of how instructors can build upon students’ naive views of evolution, rather than simply rejecting them as wrong. So far so good.

Then comes the section “The use of the term misconceptions in current BER [Biology Education Research] Literature,” in which Maskiewicz & Lineback assert that many instructors have been slow to adopt this transform-and-refine-prior-knowledge view of learning. It’s a significant point because if everyone already holds this view and teaches according to it, there’s not much to discuss. Accordingly, Maskiewicz & Lineback searched the past three years of CBE Life Sciences Education for problematic as well as enlightened uses of the word “misconception.” Here’s what they found:

In some of these articles, the authors seemed to equate misconception with the more traditionally accepted definition of a deeply held conception that is contrary to scientific dogma (Baumler et al., 2012; Cox-Paulson et al., 2012; Crowther, 2012). Others, in contrast, seemed to use the term to reflect an ad hoc mistake or error in student understanding, one that exists prior to or emerges through instruction but, in either case, is not robust, nor does it interfere with learning (Jenkinson and McGill, 2011; Klisch et al., 2012). The authors who considered misconceptions to be “deeply rooted” spoke of instructional strategies designed to specifically elicit, confront, and replace students’ incorrect conceptions (i.e., Crowther, 2012). In contrast, authors for whom misconceptions were more tentatively held and/or emergent, suggested that students’ incorrect ideas can be amended through tailored instruction grounded in those ideas (i.e., Klisch et al., 2012). This latter perspective on learning is consistent with approaches supported by recent research in the learning sciences community (Carpenter et al., 1989; Ruiz-Primo and Furtak, 2007; Pierson, 2008).

Not only am I being dissed, but Baumler et al. (2012) and Cox-Paulson et al. (2012) are too! So, do we deserve it? Let’s look at the use of the term “misconception” in each of the articles cited.

From Baumler et al. (2012):

Questions of conservation lend themselves well to a “teachable moment” regarding the choice of nucleotide versus protein BLAST. In one group of 28 students, students were asked to provide a written response justifying their choice of using BLASTP or BLASTN. Twelve of the 14 pairs of students provided answers that were complete and exhibited clear comprehension of relevant concepts, including third position wobble. One pair gave an answer that was adequate, although not thorough, while the last pair’s response invoked introns, an informative answer, in that it revealed a misconception grounded in a basic understanding of the Central Dogma, concerning the absence of splicing in bacteria.

From Cox-Paulson et al. (2012):

Student misconceptions about DNA replication and PCR have been well documented by others (Phillips et al., 2008; Robertson and Phillips, 2008), and this exercise provided an opportunity to increase understanding of these topics.

From Crowther (2012):

My own opinion is that songs can be particularly useful for countering two types of student problems: conceptual misunderstandings and failures to grasp hierarchical layers of information. Prewritten songs may explain concepts in new ways that clash with students’ mental models and force revision of those models, or may organize information for improved clarity (e.g., general principles in the chorus, key details in the verses, other details omitted). Songwriting assignments could have similar benefits by forcing students to do the work of concisely restating concepts in their own words and organizing the information in a musical format. As an example of using music to counter misconceptions, I once team-taught a “biology for engineers” course in which my coinstructor complained that many students failed to internalize the difference between genotype and phenotype. I wrote and performed a song to drive home this distinction, the chorus being, “Genotype, ooh… It’s the genes you possess—nothing more, nothing less! Versus phenotype, ooh… Your appearance and health and reproductive success!”

Note that these were the sole instances of the word “misconception” in each article. Do they illustrate what Maskiewicz & Lineback say they illustrate? I don’t think so.

The first claim made by Maskiewicz & Lineback is that some papers (e.g., the three cited) consider misconceptions to be “deeply held” or “deeply rooted.” None of the papers cited uses either phrase, nor do I see any discussion of misconceptions’ deepness in the passages above.

The second claim is, “The authors who considered misconceptions to be ‘deeply rooted’ spoke of instructional strategies designed to specifically elicit, confront, and replace students’ incorrect conceptions (i.e., Crowther, 2012).” The “deeply rooted” business aside, is Crowther indeed advocating wholesale swapping of students’ incorrect conceptions for correct ones? No. “Prewritten songs may explain concepts in new ways that clash with students’ mental models and force revision of those models.” That is, the models should be revised — NOT discarded! As far as I can tell, this is consistent with Maskiewicz & Lineback’s recommendations up to this point. (Later in the article, they propose abandoning the term “misconceptions” altogether.)

I suspect that Maskiewicz & Lineback found the above wording (with its talk of clashing, forcing, and failures) overly adversarial, and I concede that the tone is not ideal. But the passage is basically agreeing with them!

Not being an expert on addressing misconceptions (or whatever they should be called), I was glad to get Maskiewicz & Lineback’s perspective. But if their best example of the problem is a paragraph that neglects to mention the positive aspects of one particular misconception, perhaps the problem is not as big as they are making it out to be.

A fake mini-review of “In Search of Santa”

April 14, 2013

In Search of Santa (2004) appears to be a movie for young children. It’s a G-rated cartoon lasting 75 minutes and featuring an abundance of cuddly penguin characters like the protagonists — twin sister princesses voiced by Hilary and Haylie Duff. Yet beneath the crude CGI animation, cliched moral lessons, and preposterous plot twists is a seething, searing indictment of the academic world and its self-important, soulless inhabitants.

The movie’s villains are Agonysla, Derridommis, and Mortmottimes, a trio of royal advisers collectively known as the Terribly Deep Thinkers. As they explain in their theme song, “We’re the Terribly Deep Thinkers/We’re walking almanacs/Our bones are old and brittle/But our minds are sharp as tacks. We’re overeducated/We’re snooty brainiacs/We possess an excess/Of many useless facts.”

Early in the film the triumvirate puts Princess Crystal on trial for the crime of believing in Santa Claus. Later they trap her and her sister, Princess Lucinda, in the Cave of Profundity while publicly declaring the sisters “lost at sea,” thus clearing their own path to the throne in the event of the king and queen’s demise. “You just use big words to hide small thoughts,” Crystal admonishes the trio, yet she and Lucinda appear doomed until — spoiler alert — a precocious baby leopard seal leads the other penguins to the imprisoned sisters, exposing the Terribly Deep Thinkers’ avarice and egotism.

So masterful is director William R. Kowalchuk’s disguise of his anti-intellectualist parable as kiddie entertainment that reviewers have called it “a cheap Saturday morning cartoon slapped together to cash in on the Hilary Duff wave” and “forgettable fare that may only satisfy the youngest and most undiscerning viewers.” Such opinions aside, the sisters’ improbable triumph over the Terribly Deep Thinkers is not simply an instance of Duff-mania gone awry, but rather a wit-seeking missile strike against ivory towers everywhere.

A Praeg-matic view of exercise evangelism

March 28, 2013

Like any popular fitness magazine, Northwest Runner has an unabashedly pro-exercise flavor, and that’s OK. But in the April 2013 NWR, triathlon coach Wade Praeger goes too far in his scornful dismissal of past and present concerns about possibly negative aspects of exercise.

Praeger’s column is titled “The Real and Imagined Perils of Being an Endurance Athlete.” It begins:

Back when I started running in the 70s, I often had to deal with the questions, “Why are you running so much?” and “What do you think about out there?” Like many of you, I put up with or ignored those silly questions and just kept on truckin’.

In fact, “Why are you running so much?” and “What do you think about out there?” are perfectly reasonable questions. Sure, it can be tedious to answer them over and over and over, but true running ambassadors will respond willingly and patiently, thus demystifying the sport for their acquaintances and perhaps even gaining a few converts. To brush off such inquiries as silly does not help the cause.

Praeger continues:

…Every month I read another article about the perils of endurance athletics…. In a recent article in the journal Heart, Dr. James O’Keefe from Saint Luke’s Mid America Heart Institute in Kansas City proposed that people who run too often or “too fast” have the same mortality as sedentary slobs…. Though his work has been thoroughly debunked and discredited by other cardiologists, the tenor of the article is repeated elsewhere in our popular culture.

These words suggest that O’Keefe is a lone quack whose only success has been in attracting publicity. Actually, he leads a team of research physicians who are publishing their work in respected peer-reviewed journals. Not everyone agrees with their findings, but the team is making a legitimate contribution to the field of exercise research, as Amby Burfoot has explained.

After further discussion, including examples of physicians’ genuinely nutty warnings from the 1890s, Praeger writes (under the heading of “Spreading misinformation”):

…I see at least three distinct causes for all of this hand-wringing and proscriptive do-goodery… Firstly … people who don’t work out need some justification for their non-participation… Secondly, there is the age-old Protestant distrust of having fun… Lastly, there is some semantic confusion between the meanings of “health” and “fitness”…

In listing these three poor reasons for worrying about exercise, Praeger implies that there are no good reasons for doing so.

He concludes:

…When it comes to athletics and fitness, we all make choices and set priorities for ourselves. And anytime you try to impose your choices and priorities (your values) on someone else, you are being a prude, or a snob, or just a plain old pain in the ass.

I agree. The problem is that, by lumping together all exercise-related skepticism and rejecting it all as equally ludicrous, Praeger sounds as snobbish as those he’s criticizing. He should work harder to distinguish between imagined risks and sincere, reasonable questions.

[This was published as a letter in the May 2013 issue of Northwest Runner.]

A [long-ago] Season on the Brink

February 2, 2013

I just read A Season on the Brink, John Feinstein’s chronicle of Bob Knight’s 1985-86 season coaching the Indiana Hoosiers basketball team. I was appalled.

Admittedly, my perspective is that of an outsider, both sports-wise and time-wise. I’ve had limited immersion in big-time college athletics; my cross country and track experiences at an NCAA Division III school of 2,000 students hardly seem relevant. Meanwhile, the culture of sports has changed over the past 27 years, and it’s questionable whether the events of the ’80s can be judged by the standards of today.

I couldn’t help myself, though. The farther I got into this book, the more I wanted to yell at the protagonist: “You insufferable, self-righteous, disingenuous prick! What right do you have to subject these students to incessant bullying, verbal abuse, and mind games? Do you really think that the goal of winning basketball games justifies such tactics? Do you? Huh? You’re the worst coach I’ve ever encountered! You make me sick!”

Such rhetoric would be an oversimplification of the truth, not entirely fair, and unnecessarily mean-spirited. But that’s exactly how Knight sounds in talking to his players much of the time. Thanks to the author’s thorough reporting, there is no doubt about this. Here’s just one example of a post-game harangue, typical except for a paucity of profanities, after a valiant Indiana comeback falls short in a road game against Michigan State.

“Don’t even hang your heads,” Knight said angrily. “Don’t bother, because you don’t care. Don’t even try to tell me that you care. Every time you make a mistake you just nod your head. I told you at the half about those six points that we gave them. Ricky, you foul on the rebound with the score tied. Jesus. Harris and Jadlow, I’ve never had two more disappointing people here in my life. You two haven’t contributed two ounces to what we’re trying to do. You don’t improve or change from one day to the next.

“Boys, I want to tell you how long a season you’re in for if you don’t compete any harder than that.” He paused. His voice was almost choked now. “I never thought I would see the day when Indiana basketball was in the state it’s in right now.”

They went home dreading what was to come. The assistant coaches were genuinely frightened about what might happen next.

Feinstein provides numerous anecdotes like this, yet the book is not intended as a hatchet job. It begins with an unabashedly pro-Bob introduction from Knight’s friend Al McGuire, and ends with the following comments from the author.

As I finish this, I am reminded of an incident that took place in January. After the Indiana-Illinois game during which Bob kicked and slammed a chair, and kicked a cheerleader’s megaphone, Dave Kindred, the superb columnist for The Atlanta Constitution, wrote that he was disappointed to see Knight acting this way again. Kindred, a longtime friend of Knight’s, ended the column by writing, “Once again I find myself wondering when it comes to Bob Knight if the end justifies the means.”

A few days later, Knight called Kindred. “You needed one more line for that damn column,” Knight said. “You should have finished by saying, ‘And one more time, I realize that it does.'”

Kindred thought for a moment and then said, “Bob, you’re right.”

I agree.

Feinstein provides evidence throughout the book that Knight cares deeply about his players, despite outward appearances to the contrary; that he occasionally gives them heartfelt praise; that he sticks up for them and helps them out after they graduate; and that he picks mostly on those who can handle it the best. But I just can’t get beyond Knight’s basic attitude, which I could paraphrase as, “Winning is the most important thing in the world, and I’m willing to say and do almost anything to get my players to crave winning as much as I do.” That’s not an approach I can accept.

[Related: my review on amazon.com]