Articles claiming that wine-tasting expertise is “junk” or “fake” are a time-tested winner in the media, guaranteed to get non-experts nodding in agreement, secure in their judgment that because they can’t tell a Zinfandel from a Syrah, no one else can either. These articles almost always take at face value poorly designed studies purporting to demonstrate that even people with some wine-tasting experience fail to make accurate judgments about wine varietals or wine quality.
But they almost always leave out the most important evidence that wine-tasting expertise is real: the fact that some people pass the rigorous exams required to earn the title of Master of Wine or Master Sommelier. You can’t pass these exams without possessing genuine wine expertise.
So Scott Alexander should be commended for putting this fact front and center in “Is Wine Fake?”, his article in the magazine Asterisk. Alexander writes:
But I recently watched the documentary Somm, about expert wine-tasters trying to pass the Master Sommelier examination. As part of their test, they have to blind-taste six wines and, for each, identify the grape variety, the year it was produced, and tasting notes (e.g., “aged orange peel” or “hints of berry”). Then they need to identify where the wine was grown: certainly in broad categories like country or region, but ideally down to the particular vineyard. Most candidates — 92% — fail the examination. But some pass. And the criteria are so strict that random guessing alone can’t explain the few successes.
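To get a feel for why chance can’t account for those passes, here is a rough back-of-envelope sketch. The per-guess odds below are purely illustrative assumptions of mine, not figures from the Master Sommelier exam:

```python
# Back-of-envelope sketch: how likely is it to ace the blind tasting
# by guessing alone? All probabilities here are illustrative
# assumptions, not actual exam statistics.

p_variety = 1 / 20  # assumed odds of blindly naming the right grape variety
p_vintage = 1 / 10  # assumed odds of landing in the right vintage window
p_region  = 1 / 10  # assumed odds of naming the right region

p_one_wine = p_variety * p_vintage * p_region  # one wine fully correct
p_all_six  = p_one_wine ** 6                   # all six wines correct

print(f"one wine by chance: {p_one_wine:.1e}")  # 5.0e-04
print(f"all six by chance:  {p_all_six:.1e}")   # 1.6e-20
```

Even if the real grading is far more forgiving than “all six wines fully correct,” the candidates who pass are performing many orders of magnitude above what guessing would predict.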
He then discusses the results of the annual Oxford and Cambridge blind-tasting competition:
Top scorers were able to identify grape varieties and countries for four of the six wines. In general, tasters who did well on the reds also did well on the whites, suggesting a consistent talent. And most tasters failed on the same wines (e.g., the Grenache and Friulano), suggesting those were genuinely harder than others.
Alexander also shows why the poorly designed studies I referred to earlier fail as evidence: the “experts” in them really weren’t experts, and the studies involved deception, which confounded the results.
Blind wine tasting is really, really hard. It takes years of practice and study to learn to do it well, so why expect ordinary consumers, or people with only modest experience, to show consistent results? These studies are the equivalent of asking your average undergraduate English major to pass a comprehensive, detailed exam on the language of James Joyce’s Ulysses. It’s not going to happen. We don’t expect expertise in literature to be easily attainable. Why think that about wine tasting?
Kudos to Scott Alexander for getting this right.
Didn’t Roald Dahl say something like one peek at the label is better than years of study?
Thanks for bringing attention to a well-written piece on the topic. As a wine enthusiast who usually tastes blind (and with a trained/experienced palate), I don’t understand the continuing controversy on this topic. Experience and training matter in any profession. Comparing wines I enjoy to what the average consumer enjoys is a complete waste of time. What matters in evaluating a wine is: Are there faults? Is it balanced? Will it age well? Will it pair well with food? Etc. I am so tired of pro critics’ scores and notes…
Congratulations on a great article! Wine tasting is an art, honed through years of experience. While there are few real experts, there are many people who can judge a good wine simply by taste. In short, if you like it, it is good. If not…oh well.