For many postgraduate students, a Ph.D. thesis will be their magnum opus – the zenith of their academic achievement. And with such a significant amount of time and effort being invested, it’s important that study topics are chosen wisely. Hence, it’s comforting to know that the world of academic research is a far more inclusive, eclectic and remarkably unusual place than one might first assume. However left-field a particular subject might seem, there are almost certainly countless other research papers that wipe the floor with it in the weirdness stakes. Here are 30 of the very strangest.
30. Ovulation: A Lap Dancer’s Secret Weapon
To investigate the theory that estrus – the interval of amplified fertility and sexual awareness often referred to as “heat” in mammals – is no longer present in human females, researchers turned to an unlikely source: lap dancers. A team from the University of New Mexico led by evolutionary psychologist Geoffrey Miller enlisted the help of 18 professional dancers. These dancers documented their ovulatory cycles, shift patterns and the amount of tips they received over the course of 60 days. Published in 2007 in the journal Evolution and Human Behavior, “Ovulatory cycle effects on tip earnings by lap dancers: economic evidence for human estrus?” noted a distinct correlation between estrus and greater income from gratuities, representing what the researchers called “the first direct economic evidence for the existence and importance of estrus in contemporary human females.”
29. Which Can Jump Higher, the Dog Flea or the Cat Flea?
Froghoppers aside, fleas are the overachieving long jumpers of the animal kingdom. Fleas have body lengths of between 0.06 and 0.13 inches but can leap horizontal distances more than a hundred times those figures. But were all fleas created equal in the jumping stakes? To find out which would triumph between the dog- and cat-dwelling varieties, researchers from the Ecole Nationale Vétérinaire de Toulouse, France meticulously recorded the leaping efforts of a collection of both species of flea. Published in 2000, the resulting paper, “A comparison of jump performances of the dog flea, Ctenocephalides canis, and the cat flea, Ctenocephalides felis felis,” declared the dog flea the winner. Yes, the canine-inclined insect jumps both higher and further than its feline-partial opponent. In 2008 the research team scooped the Annals of Improbable Research’s Ig Nobel Prize in the biology category – the Ig Nobel Prizes being awards that recognize the feats of those who “make people laugh… and then think.”
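To get a feel for those numbers, here is a quick back-of-the-envelope sketch using only the figures quoted above; the human comparison at the end is an illustrative assumption, not something from the paper.

```python
# Rough scale of a flea's jump, using the figures quoted above
# (body length 0.06-0.13 in, horizontal jumps of 100+ body lengths).
# The human comparison is an illustrative assumption, not from the paper.

jump_multiple = 100  # "more than a hundred times" body length

for body_in in (0.06, 0.13):
    jump_in = body_in * jump_multiple
    print(f"{body_in:.2f} in flea -> at least {jump_in:.1f} in ({jump_in * 2.54:.1f} cm)")

# A 6 ft (72 in) person jumping 100 body lengths would cover 600 ft.
person_in = 72
print(f"Equivalent human long jump: {person_in * jump_multiple / 12:.0f} ft")
```

On that scale, even the smallest flea out-jumps any human long jumper by two orders of magnitude relative to body size.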
28. On Ethicists and Theft
Death row pardons, lottery wins and rain on your wedding day – all (arguably non-ironic) subjects referenced by Alanis Morissette in her 1996 single “Ironic.” One topic that would probably merit inclusion – despite the research not being published until 2009 (in Philosophical Psychology) – is the revelation that books on ethics are more liable to be absent from the shelves of university libraries than comparable books on other philosophical subjects. “Do Ethicists Steal More Books?” by University of California, Riverside professor of philosophy Eric Schwitzgebel revealed that the more recent, esoteric ethics books “of the sort likely to be borrowed mainly by professors and advanced students of philosophy” were “about 50 percent more likely to be missing” than their non-ethics counterparts. However, Professor Schwitzgebel believes this is a good thing, as “the demand that ethicists live as moral models would create distortive pressures on the field.”
27. Wet Underwear: Not Comfortable
Even babies know it: wet underwear is uncomfortable. Yet precisely why this is so is a question that went unanswered by hard science until 1994, when the journal Ergonomics published “Impact of wet underwear on thermoregulatory responses and thermal comfort in the cold.” The authors were Martha Kold Bakkevig of SINTEF Unimed in Trondheim, Norway and Ruth Nielsen of the Technical University of Denmark in Kongens Lyngby. Bakkevig and Nielsen investigated “the significance of wet underwear” by monitoring the skin and core temperatures, as well as the weight loss, of eight adult male subjects wearing wet or dry underwear in controlled cold conditions. Apart from the obvious “significant cooling effect of wet underwear on thermoregulatory responses and thermal comfort,” the research also discovered that the thickness of the underwear exerted a greater effect on these factors than the material used to make the garment. So now you know.
26. Do Woodpeckers Get Headaches?
In much the same way that we’d presume dragons don’t get sore throats, it would be a reasonable assumption that woodpeckers don’t suffer from headaches – but assumptions are a poor substitute for the authoritative grip of scientific fact. Published in 2002 in the British Journal of Ophthalmology, “Cure for a headache” came courtesy of Ivan Schwab, an ophthalmologist at the University of California, Davis. Schwab’s paper details the raft of physiological traits that woodpeckers have developed to avoid brain damage and bleeding or detached eyes when hammering their beaks into trees at up to 20 times a second, 12,000 times a day. In addition to a very broad but surprisingly squishy skull and sturdy jaw muscles, the woodpecker has a “relatively small” brain – which probably explains a lot.
25. Booty Calls: the Best of Both Worlds?
Compromise, according to U.S. poet and author Phyllis McGinley at least, is what “makes nations great and marriages happy.” It’s also the backbone of the booty call, if research published in 2009 is anything to go by. Appearing in The Journal of Sex Research, “The ‘booty call’: a compromise between men’s and women’s ideal mating strategies,” was written by researchers from the department of psychology at New Mexico State University. The study analyzed the booty-calling behavior of 61 students from the University of Texas at Austin. What’s more, it confirmed its central thesis that “the booty call may represent a compromise between the short-term sexual nature of men’s ideal relationships and the long-term commitment ideally favored by women.” Lead researcher Dr. Peter K. Jonason, now working at the University of Western Sydney, shared follow-up papers in 2011 and 2013, for The Journal of Sex Research and Archives of Sexual Behavior, respectively.
24. Mosquitoes Like Cheese
The mosquito is a formidable and destructive pest. And while it’s known that exhalation of carbon dioxide by its victims acts as a highly compelling invitation to dinner, other smelly signals have been less well documented. Published in The Lancet, Bart Knols’ 1996 research, “On human odor, malaria mosquitoes, and Limburger cheese,” changed that. The entomologist described how Anopheles gambiae, Africa’s most prolific malaria-spreading mosquito, exhibited a keen partiality for biting human feet and ankles. Crucially, the research also showed that these mosquitoes can be attracted to Limburger cheese, a stinky fromage that shares many characteristics with the whiff of human feet, offering potential use as a synthetic bait for traps. Interestingly, Knols is one of the few people to have won an Ig Nobel (for entomology in 2006) and a Nobel Peace Prize (shared in 2005 as part of the International Atomic Energy Agency).
23. Weighing Up Lead and Feathers
It doesn’t require a degree in physics – or philosophy – to understand that a pound of lead and a pound of feathers weigh the same. Yet the question of whether or not they feel the same is rather less straightforward. To examine this, researchers from the department of psychology at Illinois State University enlisted the help of 23 blindfolded volunteers, recording their perceptions of the weight of either a pound of lead or a pound of feathers contained within boxes of precisely the same shape and size. Published in 2007, the paper – “‘Which feels heavier – a pound of lead or a pound of feathers?’ A potential perceptual basis of a cognitive riddle” – discovered that participants rated the pound of lead as seeming weightier with an “above chance” frequency. The suggestion is that factors such as the “muscular forces” required to handle an object could also play a role in perceptions of weight.
22. Cat Food – Yummy?
Despite their notorious penchant for carrying fully – or sometimes only partially – dead rodents in their mouths, cats are surprisingly fussy eaters. What’s more, the pet food industry has found that kitties themselves represent unreliable and expensive test subjects in the pursuit of more appealing cat food flavors. Professor Gary Pickering of the department of biological sciences at Brock University in Ontario, Canada detailed a better option in 2009: the human palate. “Optimizing the sensory characteristics and acceptance of canned cat food: use of a human taste panel” describes the bizarre methodology by which human tasters “profile the flavour and texture of a range of cat food products” – including evaluating “meat chunk and gravy/gel constituents.” The impact of this on the number of job applications to the beer- and chocolate-tasting industries remains to be seen.
21. The Unhidden Dangers of Sword Swallowing
While “cat food taster” is unlikely to appear on anybody’s dream job list, at least that profession is unencumbered by the daily risk of serious injury. Sword swallowing, on the other hand, though occupying a similar position on the league table of tastiness, is a rather more hazardous occupation. In order to establish just how hazardous, radiologist Brian Witcombe and world champion sword swallower Dan Meyer analyzed the “technique and complications” of 46 members of the Sword Swallowers’ Association International. Published in 2006 in the British Medical Journal, their research, “Sword swallowing and its side effects,” found that performers had a heightened chance of injury when “distracted or adding embellishments” – as in the case of one unfortunate swallower who lacerated his throat after being disturbed by a “misbehaving macaw on his shoulder.” In 2007 Witcombe and Meyer together received the Ig Nobel Prize in medicine in view of the pair’s “penetrating medical report.”
20. Beer Bottle vs. Human Skull
Common weekend warrior tales would suggest that a beer bottle makes a good weapon in the event of a bar brawl. But would a full or an empty bottle inflict the most damage, and would that damage include fracturing a human skull? These important questions were answered in 2009 by a team of researchers from the University of Bern with their seminal paper, “Are full or empty beer bottles sturdier and does their fracture-threshold suffice to break the human skull?” Dr. Stephan Bolliger and his colleagues tested the breaking energy of full and empty beer bottles using a drop tower. Moreover, they discovered that a “full bottle will strike a target with almost 70 percent more energy than an empty bottle,” but that either is capable of breaking a human skull. Good to know. In a great twist of irony, Dr. Bolliger and co. picked up a 2009 Ig Nobel Prize in the “Peace” category.
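The “almost 70 percent” figure is easy to sanity-check: at the same impact speed, kinetic energy scales linearly with mass. The bottle masses below are rough illustrative assumptions for a half-liter bottle, not values taken from the Bern paper.

```python
# Sanity check of the "almost 70 percent more energy" claim: at a fixed
# impact speed, kinetic energy (0.5 * m * v**2) scales linearly with mass.
# Masses are rough illustrative assumptions, NOT values from the Bern paper.

mass_empty_kg = 0.60
mass_full_kg = 1.00
speed_m_s = 5.0  # assumed swing speed; it cancels out of the ratio

def kinetic_energy(mass_kg: float, speed_m_s: float) -> float:
    return 0.5 * mass_kg * speed_m_s ** 2

extra_pct = (kinetic_energy(mass_full_kg, speed_m_s)
             / kinetic_energy(mass_empty_kg, speed_m_s) - 1) * 100
print(f"Full bottle carries {extra_pct:.0f}% more energy at the same speed")
# prints: Full bottle carries 67% more energy at the same speed
```

Since the speed cancels, only the mass ratio matters – which is why a full bottle hits so much harder whatever the swing.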
19. The Propulsion Parameters of Penguin Poop
The titles of scientific research papers can sometimes be fairly impenetrable to the layman; other times they may take a more direct approach. Published in 2003, “Pressures produced when penguins pooh – calculations on avian defecation” certainly belongs to the latter category. The paper’s authors, Victor Benno Meyer-Rochow of the then International University Bremen (now Jacobs University Bremen) and Eötvös Loránd University’s Jozsef Gal, decided to address the question of how much internal pressure penguins generate for poop-firing purposes. With knowledge of just a few parameters – including the thickness of and distance covered by the fecal matter – the researchers were able to calculate that the birds employed pressures of up to 60 kPa (kilopascal) to eject their bodily waste. The project was inspired by a blushing Japanese student who, during a lecture, asked Dr. Meyer-Rochow how the penguins “decorated” their nests.
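For a sense of scale, the reported 60 kPa can be converted into more familiar terms; the vent radius used for the force estimate below is an illustrative assumption, not a measurement from the paper.

```python
import math

# Putting the reported "up to 60 kPa" into context. The vent radius is an
# illustrative assumption, not a value measured in the paper.

pressure_pa = 60_000        # 60 kPa, as reported
atmosphere_pa = 101_325     # standard atmospheric pressure

print(f"{pressure_pa / atmosphere_pa:.2f} atm")  # about 0.59 atm

# Force on an assumed circular vent of radius 4 mm: F = p * A
radius_m = 0.004
force_n = pressure_pa * math.pi * radius_m ** 2
print(f"~{force_n:.1f} N on a {radius_m * 1000:.0f} mm radius vent")
```

Roughly three-fifths of an atmosphere, in other words – a respectable figure for a bird ejecting waste clear of its nest.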
18. Lady Gaga and Pop Art
Lady Gaga clearly sees herself as something of an artist: her third album is called Artpop, and last year she voiced her desire to “bring art culture into pop in a reverse Warholian expedition.” But does anyone else agree? In 2012 University of Cambridge student Amrou Al-Kadhi decided to write a few words – 10,000 to be precise – on the subject for his final year undergraduate dissertation. The paper, looking at Lady Gaga’s place in the history of pop art and her role as a voice of cultural criticism, initially encountered some resistance from the Cambridge history of art department. However, after several meetings, the provision of a barrage of YouTube links to Gaga videos such as “Telephone” (which apparently demonstrated her postmodern aesthetic) and “a bit of work,” permission for Al-Kadhi to undertake the research was granted.
17. Even Chickens Prefer Beautiful People
A 2002 research paper by Stefano Ghirlanda, Liselotte Jansson and Magnus Enquist at Stockholm University decided to make inroads into the question – most likely contemplated by very, very few people – of whether “Chickens prefer beautiful humans.” The study saw six chickens trained to “react to” images of an ordinary male or female face. They were then tested on a series of images ranging from the average face to a face with exaggerated male or female characteristics, and a group of 14 (human) students were given the same test. Perhaps surprisingly, the chickens “showed preferences for faces consistent with human sexual preferences.” The researchers claim this offers evidence for the hypothesis that human preferences stem not from “face-specific adaptations” but from “general properties of nervous systems” – perhaps overlooking the possibility that their human test group just had very unusual tastes.
16. Erase Bad Memories, Keep Good Ones
Painful, embarrassing, or traumatic memories have an annoying habit of accumulating over the course of an average lifetime. As Courtney Miller, assistant professor at the Florida campus of The Scripps Research Institute, puts it, “Our memories make us who we are, but some of these memories can make life very difficult.” With that in mind, Miller led a team of researchers to try and find out whether certain unwanted memories – specifically, drug-related ones – could be erased without damaging other memories. Published in 2013, “Selective, Retrieval-Independent Disruption of Methamphetamine-Associated Memory by Actin Depolymerization” found that, in mice at least, this kind of bespoke amnesia is entirely possible. How? By means of inhibiting the formation of a particular molecule in the brain. “The hope is,” said Miller, “that our strategies may be applicable to other harmful memories, such as those that perpetuate smoking or post-traumatic stress disorder.”
15. The Rectal Route to Curing Hiccups
When beset by a flurry of hiccups, a few minutes of putting up with the involuntary jolting is usually sufficient to get them to subside. However, other times they can become a far more unmanageable problem, beyond the healing scope of even the oldest of wives’ tales. In such situations there’s a surprising but highly effective cure. Published in 1990, “Termination of intractable hiccups with digital rectal massage” details the case of a 60-year-old patient whose seemingly non-stop hiccups were brought to an immediate halt by a massaging finger in the rectum. A second occurrence a few hours later was curbed in a similar fashion. The research from the Bnai Zion Medical Center in Israel notes that “no other recurrences were observed.” The inspiration for the report was Dr. Francis Fesmire, who penned a medical case report with the same title in 1988 and with whom the researchers shared an Ig Nobel in 2006. Fesmire passed away in 2014, and one fitting epitaph from an entertainment-oriented research magazine mused, “Dr. Fesmire found joy and fame by putting his finger on – nay, in – the pulse of his times.”
14. Can Pigeons Tell a Picasso From a Monet?
Flying, pecking and defecating may dominate the pigeon’s skill set, but the birds can now add “appreciation of fine art” to the list. Published in 1995, “Pigeons’ discrimination of paintings by Monet and Picasso” came courtesy of Shigeru Watanabe, Junko Sakamoto and Masumi Wakita at Keio University in Japan. And sure enough, the paper presents evidence that pigeons are indeed able to distinguish between works by the two artists. The birds were trained to recognize pieces by either Monet or Picasso; crucially, they then demonstrated the ability to identify works by either creator that had not been shown to them during the training period. Not bad for rats with wings. Professor Watanabe – who went on to explore paddy birds’ appreciation of the spoken word – put the paper into context, saying, “This research does not deal with advanced artistic judgments, but it shows that pigeons are able to acquire the ability to judge beauty similar to that of humans.”
13. The Nature of Navel Lint
It’s a phenomenon that most people will be familiar with: small balls of lint accumulating in the belly button. Still, until fairly recently the mechanism behind this process lacked a satisfactory explanation from the realm of science. Fortunately, that all changed in 2009 when Georg Steinhauser, a chemist and researcher at the Vienna University of Technology, published a research paper entitled “The nature of navel fluff.” After gathering 503 samples of navel lint, Dr. Steinhauser concluded that the culprit behind this common occurrence is hair on the abdomen, which dislodges small fibers from clothing and channels them into the belly button. As the Austrian himself has pointed out, “The question of the nature of navel fluff seems to concern more people than one would think at first glance.”
12. The Effects of Cocaine on Bees
The effects of cocaine on human body movement can be observed in nightclubs the world over on just about any given weekend. And as it turns out, the tediously familiar overestimation of dancing prowess is not just limited to humans. In a 2009 paper entitled “Effects of cocaine on honey bee dance behavior,” a team of researchers led by Gene Robinson, entomology and neuroscience professor at the University of Illinois at Urbana-Champaign, analyzed how honey bees are affected by low doses of cocaine. Honey bees are known to perform dances when they locate an abundant food source; and the team found that administering the drug prompted bees to circle about 25 percent quicker as well as dance more exuberantly and for longer. The bees also exaggerated the scale of their bounty. No surprise there then.
11. Fruit Bat Fellatio
Grainy black-and-white footage of two bats engaged in some X-rated nocturnal activity is precisely what a group of researchers from China and the U.K. chose to analyze in their 2009 paper, “Fellatio by fruit bats prolongs copulation time.” The group looked at the copulatory behavior of the short-nosed fruit bat and observed that “females were not passive during copulation but performed oral sex.” More interestingly, the researchers also discovered that the longer the bats spent engaged in fellatio, the longer the copulation itself lasted – and that when fellatio was absent, pairs spent much less time mating.
10. The Possibility of Unicorns
It’s a question that has plagued the internet for decades: could unicorns really exist? The short answer, at least, is no. Still, King’s College London philosophy undergraduate Rachael Patterson decided to investigate whether a full dissertation on the more theoretical aspects of the subject would yield the same conclusion. Her paper, “The Possibility of Unicorns: Kripke v Dummett,” picks up on previous theses by British philosopher Michael Dummett and American logician and philosopher Saul Kripke. Why? In order to see if any more rainbow-hued light could be shed on this important question, of course. Reassuringly, perhaps, neither Kripke nor Dummett claim that these mythical creatures live in reality – although Dummett does posit the idea that in another world they might.
9. Does Country Music Make You Suicidal?
Country music is one of the most popular genres of music in the United States, with a huge audience that encompasses all age ranges. Yet given its recurrent themes of wedded disharmony and excessive drinking, Steven Stack of Wayne State University and Auburn University’s Jim Gundlach decided to probe whether country music might have an influence on municipal suicide rates in America. Published in 1992, their research paper, “The Effect of Country Music on Suicide,” actually discovered a strong link between the amount of country music radio airplay in any particular city and the suicide rate among the white population in that area. The reaction was mixed: Stack and Gundlach initially received hate mail, but in 2004 they won the Ig Nobel Prize for medicine.
8. Do Cabbies Have Bigger Brains?
The notoriously demanding exam that London’s black cab drivers must pass is called the “Knowledge” – and with good reason. Covering around 25,000 streets inside a six-mile radius of central London, the test generally requires three to four years of preparation and multiple attempts at the final exam before success is achieved. University College London neuroscientist Eleanor Maguire was inspired to take a closer look at this feat of memory after researching similar examples in the animal kingdom. Published in 2000, the resulting study, “Navigation-related structural change in the hippocampi of taxi drivers,” discovered that “cabbies” had physically larger posterior hippocampi – the areas of the brain responsible for spatial memory – than their non-cabbie counterparts. Professor Maguire’s follow-up study (with Dr. Katherine Woollett) in 2011 confirmed that trained cabbies were better at remembering London landmarks but not as good at recalling complex visual information compared to the unsuccessful trainees.
7. Shrews: To Chew or Not to Chew?
Ever felt so hungry that you could eat a horse? How about a shrew? While such scenarios are never likely to present themselves to the average person, scientists can be an altogether more experimental bunch. Take the 1995 paper, “Human digestive effects on a micromammalian skeleton,” by Brian Crandall and Peter Stahl, anthropologists working at the State University of New York. The paper investigated what would happen to a shrew – first skinned, disemboweled, parboiled and cut into segments – if it was swallowed, sans chewing, by a human. Interestingly, many of the shrew’s smaller bones “disappeared” in transit through the human digestive system, while other portions of the skeleton showed “significant damage” despite the lack of chewing – a promising result for those studying human and animal remains. Following this peculiar paper, Brian Crandall became a science educator hoping to motivate future generations of (hungry) scientists.
6. Gay Dead Duck Sex
In 1935 Austrian physicist Erwin Schrödinger tried to highlight the absurdity of newly developed aspects of quantum theory. In his thought experiment, the strange quantum properties of a system are drawn on to suspend a hypothetical cat in a state of being simultaneously dead and alive. Sixty-six years later, a new piece of research saw the cat replaced by two ducks, in far less paradoxical though no less opposing states of life and death – but now with the crucial addition of gay sex. Published in 2001, “The first case of homosexual necrophilia in the mallard Anas platyrhynchos” describes Kees Moeliker’s bizarre experience. The Dutch ornithologist witnessed a male duck copulating for 75 minutes with the corpse of another male duck, freshly deceased after flying into a window. More recently, Moeliker has presided over an annual commemorative event and public conversation on how to make sure birds stop flying into windows. The event’s name? Dead Duck Day.
5. Love and Sex With Robots
“Intimate Relationships With Artificial Partners” – ludicrous science fiction, or serious science fact? According to its author, British chess International Master David Levy, “It may sound a little weird, but it isn’t.” Levy earned a Ph.D. from Maastricht University for the thesis, which covered sociology, psychology, artificial intelligence and robotics, among other fields. He conjectured that human-robot love, marriage and even consummation are “inevitable” by 2050. Roboticist Ronald Arkin of Atlanta’s Georgia Institute of Technology points out, “Humans are very unusual creatures. If you ask me if every human will want to marry a robot, my answer is probably not. But will there be a subset of people? There are people ready right now to marry sex toys.”
4. A Better Approach to Penile Zipper Entrapment
Unfortunately, the horror injury that befalls Ben Stiller’s character Ted, in 1998’s There’s Something About Mary, often escapes the realm of fiction to bestow real-world agony upon boys and men who wish they’d opted for a button fly. A 2005 paper by Dr. Satish Chandra Mishra from Charak Palika Hospital in New Delhi, India looked at reported methods of intervention for this most unpleasant of problems and found that many common approaches either take too long or can actually make the circumstances worse. Mishra’s paper, “Safe and painless manipulation of penile zipper entrapment,” details instead a “quick, simple and non-traumatic” method using wire cutters and a pair of pliers – though “painless” does seem a highly ambitious adjective in this particular context.
3. Flatulence As Self-Defense
The idea of a correlation between fear and bodily emissions of one variety or another is not surprising, but a 1996 paper by author Mara Sidoli detailed a much more extreme example of this relationship. In “Farting as a defence against unspeakable dread,” Sidoli described the miserable tale of Peter, a “severely disturbed adopted latency boy” who endured a difficult and traumatic early life. Despite various setbacks in his later development, Peter demonstrated “considerable innate resilience.” However, he also developed what Sidoli called a “defensive olfactive container,” using his flatulence “to envelop himself in a protective cloud of familiarity against the dread of falling apart, and to hold his personality together.” With such a vivid and prose-rich approach to scientific research, it should come as no surprise that Sidoli scooped the Ig Nobel for literature in 1998.
2. Harry Potter = Jesus Christ
Putting an end, once and for all, to the notion that literary theory sometimes lacks real-world application, “Jesus Potter Harry Christ” is a thesis by Ph.D. student Derek Murphy that looks at “the fascinating parallels between two of the world’s most popular literary characters.” What’s more, after Murphy successfully exceeded his Kickstarter funding goal of $888, the thesis was transformed into a commercially available book, published in 2011, which won the Next Gen Indie Book Award for Best Religious Non-Fiction that same year. Though the idea of analyzing the similarities between J.K. Rowling’s boy wizard creation and the Son of God might seem like a frivolous endeavor, Murphy – who is currently doing his Ph.D. at Taiwan’s National Cheng Kung University – assures his public that the book’s contents are “academic and heavily researched.” Now, where’s the fun in that?
1. Rectal Foreign Bodies
Published in the journal Surgery in 1986, “Rectal foreign bodies: case reports and a comprehensive review of the world’s literature” does exactly what it says on the tin. The research, by doctors David B. Busch and James R. Starling, based in Madison, Wisconsin, looked at two cases of patients with “apparently self-inserted” anal objects, as well as available documentation on the subject.
Other factors taken into account included the patient’s age and history and the number and type of objects removed. The resulting list of 182 foreign bodies makes for an eye-watering read: of particular note are the dull knife (“patient complained of ‘knife-like pain’”) and the toolbox (“inside a convict; contained saws and other items usable in escape attempts”). The doctors’ paper was recognized for its literary value with an Ig Nobel Prize in 1995. One person’s pain is clearly another’s pleasure.
In Jonathan Franzen’s 2001 novel, “The Corrections,” a disgraced academic named Chip Lambert, who has abandoned Marxist theory in favor of screenwriting, goes to the Strand Bookstore, in downtown Manhattan, to sell off his library of dialectical tomes. The works of Theodor W. Adorno, Jürgen Habermas, Fredric Jameson, and various others cost Chip nearly four thousand dollars to acquire; their resale value is sixty-five. “He turned away from their reproachful spines, remembering how each of them had called out in a bookstore with a promise of a radical critique of late-capitalist society,” Franzen writes. After several more book-selling expeditions, Chip enters a high-end grocery store and walks out with an overpriced filet of wild Norwegian salmon.
Anyone who underwent a liberal-arts education in recent decades probably encountered the thorny theorists associated with the Institute for Social Research, better known as the Frankfurt School. Their minatory titles, filled with dark talk of “Negative Dialectics” and “One-Dimensional Man,” were once proudly displayed on college-dorm shelves, as markers of seriousness; now they are probably consigned to taped-up boxes in garages, if they have not been discarded altogether. Once in a while, the present-day Web designer or business editor may open the books and see in the margins the excited queries of a younger self, next to pronouncements on the order of “There is no document of culture which is not at the same time a document of barbarism” (Walter Benjamin) or “The whole is the false” (Adorno).
In the nineteen-nineties, the period in which “The Corrections” is set, such dire sentiments were unfashionable. With the fall of the Soviet Union, free-market capitalism had triumphed, and no one seemed badly hurt. In light of recent events, however, it may be time to unpack those texts again. Economic and environmental crisis, terrorism and counterterrorism, deepening inequality, unchecked tech and media monopolies, a withering away of intellectual institutions, an ostensibly liberating Internet culture in which we are constantly checking to see if we are being watched: none of this would have surprised the prophets of Frankfurt, who, upon reaching America, failed to experience the sensation of entering Paradise. Watching newsreels of the Second World War, Adorno wrote, “Men are reduced to walk-on parts in a monster documentary film which has no spectators, since the least of them has his bit to do on the screen.” He would not revise his remarks now.
The philosophers, sociologists, and critics in the Frankfurt School orbit, who are often gathered under the broader label of Critical Theory, are, indeed, having a modest resurgence. They are cited in brainy magazines like n+1, The Jacobin, and the latest iteration of The Baffler. Evgeny Morozov, in his critiques of Internet boosterism, has quoted Adorno’s early mentor Siegfried Kracauer, who registered the information and entertainment overload of the nineteen-twenties. The novelist Benjamin Kunkel, in his recent essay collection “Utopia or Bust,” extolls the criticism of Jameson, who has taught Marxist literary theory at Duke University for decades. (Kunkel also mentions “The Corrections,” noting that Chip gets his salmon at a shop winkingly named the Nightmare of Consumption.) The critic Astra Taylor, in “The People’s Platform: Taking Back Power and Culture in the Digital Age,” argues that Adorno and Max Horkheimer, in their 1944 book “Dialectic of Enlightenment,” gave early warnings about corporations “drowning out democracy in pursuit of profit.” And Walter Benjamin, whose dizzyingly varied career skirted the edges of the Frankfurt collective, receives the grand treatment in “Walter Benjamin: A Critical Life” (Harvard), by Howard Eiland and Michael W. Jennings, who earlier edited Harvard’s four-volume edition of Benjamin’s writings.
The Frankfurt School, which arose in the early nineteen-twenties, never presented a united front; it was, after all, a gaggle of intellectuals. One zone in which they clashed was that of mass culture. Benjamin saw the popular arena as a potential site of resistance, from which left-leaning artists like Charlie Chaplin could transmit subversive signals. Adorno and Horkheimer, by contrast, viewed pop culture as an instrument of economic and political control, enforcing conformity behind a permissive screen. The “culture industry,” as they called it, offered the “freedom to choose what is always the same.” A similar split appeared in attitudes toward traditional forms of culture: classical music, painting, literature. Adorno tended to be protective of them, even as he exposed their ideological underpinnings. Benjamin, in his resonant sentence linking culture and barbarism, saw the treasures of bourgeois Europe as spoils in a victory procession, each work blemished by the suffering of nameless millions.
The debate reached its height in the wake of Benjamin’s 1936 essay “The Work of Art in the Age of Its Technological Reproducibility,” a masterpiece of contingent optimism that praises mass culture only insofar as mass culture advances radical politics. Many readers will sympathize with Benjamin, who managed to uphold a formidable critical tradition while opening himself to the modern world and writing in a sensuous voice. He furnishes a template for the pop-savvy intellectual, the preferred model in what remains of literary life. Yet Adorno, his dark-minded, infuriating brother, will not go away: his cross-examination of the “Work of Art” essay, his pinpointing of its moments of naïveté, strikes home. Between them, Adorno and Benjamin were pioneers in thinking critically about pop culture—in taking that culture seriously as an object of scrutiny, whether in tones of delight, dismay, or passionate ambivalence.
The worst that one Frankfurt School theorist could say of another was that his work was insufficiently dialectical. In 1938, Adorno said it of Benjamin, who fell into a months-long depression. The word “dialectic,” as elaborated in the philosophy of Hegel, causes endless problems for people who are not German, and even for some who are. In a way, it is both a philosophical concept and a literary style. Derived from the ancient Greek term for the art of debate, it indicates an argument that maneuvers between contradictory points. It “mediates,” to use a favorite Frankfurt School word. And it gravitates toward doubt, demonstrating the “power of negative thinking,” as Herbert Marcuse once put it. Such twists and turns come naturally in the German language, whose sentences are themselves plotted in swerves, releasing their full meaning only with the final clinching action of the verb.
Marx adapted Hegel’s dialectic to the economic sphere, seeing it as an engine of progress. By the early twenties, a Marxist-Leninist state had ostensibly emerged in Russia, but the early members of the Frankfurt School—notably, Adorno, Horkheimer, Marcuse, Friedrich Pollock, Erich Fromm, Franz Neumann, and Leo Lowenthal—were far from starry-eyed about it. Although Marx was central to their thought, they were nearly as skeptical of Communist ideology as they were of the bourgeois mind-set that Communism was intended to supplant. “At the very heart of Critical Theory was an aversion to closed philosophical systems,” Martin Jay writes, in his history “The Dialectical Imagination” (1973).
Nazism sundered the lives of the critical theorists, almost all of whom were Jewish. Benjamin committed suicide on the Franco-Spanish border, in 1940; the others escaped to America. Much of their work in exile focussed on totalitarianism, although they assessed the phenomenon from a certain remove. For them, the genocidal state was not merely a German problem, something that resulted from listening to too much Wagner; it was a Western problem, rooted in the Enlightenment urge to dominate nature. Raymond Geuss, in the preface to a new edition of the Frankfurt School’s U.S.-government-sponsored wartime intelligence reports, notes that Nazi Germany, with its barrage of propaganda and of regulated entertainment, was seen as an “archetypally modern society.” Anti-Semitism was, from this perspective, not merely a manifestation of hatred but a means to an end—a “spearhead” of societal control. Therefore, the defeat of Mussolini and Hitler, in 1945, fell short of a final defeat of Fascism: the totalitarian mind lurked everywhere, and America was hardly free of its influence.
Chronically disapproving as these thinkers were, they were not disengaged from the culture of their day. In order to dissect it, they bent over it. One great contribution that they made to the art of criticism was the idea that any object, no matter how seemingly trivial, was worth a searching glance. In the second volume of the Harvard Benjamin edition, covering the turbulent final years of the Weimar Republic, Benjamin variously analyzes Mickey Mouse (“In these films, mankind makes preparations to survive civilization”), children’s books and toys, a food fair, Charlie Chaplin, hashish, and pornography (“Just as Niagara Falls feeds power stations, in the same way the downward torrent of language into smut and vulgarity should be used as a mighty source of energy to drive the dynamo of the creative act”). You often feel a tension between the intensity of the scrutiny and the modesty of the subject, as if an electron microscope were being used to read the fine print on a contract. Adorno, during his American exile, took it upon himself to analyze astrology columns in the Los Angeles Times. Upon reading the advice “Accept all invitations,” he hyperventilates: “The consummation of this trend is the obligatory participation in official ‘leisure-time activities’ in totalitarian countries.”
Benjamin took a different tack. In his maturity, he struggled to reconcile materialist and theological concerns: on the one hand, the Marxist tradition of social critique; on the other, the messianic tradition that preoccupied the Jewish historian Gershom Scholem, a close friend from student days. (The struggle yielded Benjamin’s most famous image, in the 1940 “Theses on the Philosophy of History”: the “angel of history” who is blown backward into the future by the storm of progress.) The messianic urge set off sparks of mystical hope that were fundamentally foreign to Adorno. Tellingly, when Benjamin addressed the subject of astrology, he was more sympathetic than censorious, seeing it as evidence of a largely extinct identification with nature: “Modern man can be touched by a pale shadow of this on southern moonlit nights in which he feels, alive within himself, mimetic forces that he had thought long since dead.”
To read the biographies of Benjamin and Adorno side by side—Eiland and Jennings’s new book, seven hundred and sixty-eight pages long, takes a place on the shelf next to Stefan Müller-Doohm’s hardly less massive 2003 life of Adorno—is to see the fraying of the grand old European bourgeoisie. Benjamin was born in Berlin in 1892; his father, Emil Benjamin, was an increasingly successful entrepreneur, his mother something of a grande dame. “Berlin Childhood Around 1900,” the most lyrical of Benjamin’s works, conjures the sumptuousness of his family home, although his all-seeing eye pierces its burnished surface: “As I gazed at the long, long rows of coffee spoons and knife rests, fruit knives and oyster forks, my pleasure in this abundance was tinged with anxiety, lest the guests we had invited would turn out to be identical to one another, like our cutlery.”
Adorno was born in Frankfurt in 1903, in conditions of comparable ease. His father, Oscar Wiesengrund, ran a wine-merchant business, and his mother, Maria Calvelli-Adorno, had sung opera. From earliest childhood, Adorno, as he chose to call himself on leaving Germany, swam in music, forming ambitions to become a composer. “Early on, I learned to disguise myself in words,” Benjamin wrote. Adorno hid in sounds.
Benjamin had the more complicated personality. Staggeringly intelligent, he was so consumed by the life of the mind that he routinely lost track of reality. Even Scholem found him “fanatically closed off.” At the same time, Benjamin indulged in bohemian tendencies: gambling, prostitutes, drinking, drugs. After failing to win an academic position, he took on journalistic assignments, coming to prefer “inconspicuous forms” over the “pretentious, universal gesture of the book.” His family life was disorderly. Those who picture him as an innocent martyr, poring over Baudelaire as history closes in on him, may be disheartened to read of his callous treatment of his wife, Dora Sophie, from whom he begged money while conducting a string of “smutty affairs,” as Dora put it. “All he is at this point is brains and sex,” she wrote.
Adorno, a cannier and less conflicted character, established himself in academia, writing dissertations on Husserl and Kierkegaard. He also studied composition with Alban Berg, one of the supreme musical figures of the twentieth century. Adorno was industrious, imperious, brusquely brilliant—the picture of the child prodigy who never fully grows up. But there was a bohemian strain in him, too. Kracauer, who began guiding Adorno when the latter was still of high-school age, wrote an autobiographical novel called “Georg” in which Adorno appears as a “little prince” named Fred, or Freddie. (Adorno was nicknamed Teddie.) Georg and Freddie go to all-night fancy-dress balls and one night end up in bed together, hovering on the edge of erotic contact.
Benjamin and Adorno met in Frankfurt in the early twenties, when Adorno was still a university student. At first, Adorno acted like a Benjamin disciple, virtuosically interrogating culture high and low. Later, he behaved more as master than as follower, subjecting Benjamin’s work to sometimes scathing criticism. In the new biography, Adorno comes across as a petty enforcer, trying to make Benjamin conform to Frankfurt School norms. Yet Eiland and Jennings may misunderstand the give-and-take of the relationship. In one letter, Adorno urges Benjamin to stop paying halfhearted tribute to Marxist concepts and instead to pursue a more idiosyncratic vision. Benjamin, for his part, was no hapless victim. When Adorno sent along a scenario for an ill-conceived music-theatre piece based on Mark Twain, Benjamin’s unconcealed disdain—“I believe I can imagine what you were attempting here”—probably caused Adorno to abandon the project. The two served each other best by challenging assumptions at every turn; it was a mutual admonition society.
With the advent of the Nazis, Benjamin left Germany at once, taking up residence primarily in France. Adorno, whose post-doctoral thesis was published the day Hitler took power, hesitated to break from Germany, occasionally making slight gestures of accommodation with the regime. When his part-Jewish ancestry made his position impossible, he settled for a time in Oxford. In 1935, Horkheimer took the Institute for Social Research to New York; in 1938, Adorno reluctantly joined him. He and his wife, Gretel, urged Benjamin to follow them, casting New York in a seductive light. In one letter, Adorno announces that Seventh Avenue in the Village “reminds us of boulevard Montparnasse.” Gretel adds, “There is no need to search for the surreal here, for one stumbles over it at every step.” Presciently, though, she anticipates that Benjamin will be unable to leave Paris: “I fear you are so fond of your arcades that you cannot part with their splendid architecture.”
She was referring to the “Arcades Project,” Benjamin’s would-be magnum opus—a kaleidoscopic study centered on the glass-covered shopping arcades of nineteenth-century Paris, intermingling literary analysis and cultural history with semi-Marxist sociology. At the heart of the scheme was Baudelaire, the prototype of the compromised modern artist, who casts off the mask of genius and surrenders to the life of the street. Baudelaire is depicted as a ragpicker, cobbling poetry from discarded fragments. At the same time, he stands apart from the crowd, enacting a ceremony of “mourning for what was and lack of hope for what is to come.” Baudelaire’s fascinated indecision in the face of nascent popular culture mirrors Benjamin’s own. The fact that the “Arcades Project” never came to fruition—a magnificent chaos of materials was published in English in 1999—suggests that, for this most hypersensitive of thinkers, the ambivalence was paralyzing.
When Benjamin committed suicide, apparently in the mistaken belief that he could not leave Nazi-occupied France, he carried with him an American entry visa, which the Institute for Social Research had obtained for him. It is hard to picture what might have happened if he had made it to New York—or, for that matter, to Jerusalem, where Scholem tried to get him to settle. The story might still have ended sadly: Eiland and Jennings emphasize that Benjamin had been tempted by suicide long before the cataclysm of 1940. Adorno, for his part, eked out a living at various institutes and think tanks in America, and when he returned to Frankfurt, in 1949, he became a monument of German intellectual life. He died in 1969, of a heart attack, after a hike in the shadow of the Matterhorn.
Last year, the German publisher Suhrkamp, as part of its ongoing critical edition of Benjamin’s works, released a volume devoted entirely to “The Work of Art in the Age of Its Technological Reproducibility.” It contains five distinct versions of the essay and related manuscripts, dating from the years 1935 to 1940, and four hundred pages of commentary. Benjamin might have scorned the scholarly fuss, but he knew the value of what he had achieved. The essay’s governing question, about what it means to create or consume art when any work can be mechanically reproduced, has grown ever more pressing in the digital age, when Bach’s complete cantatas or the Oxford English Dictionary can be downloaded in moments. In Benjamin’s lifetime, intellectuals busied themselves debating whether the new forms—photography, film, radio, popular music—constituted art. Benjamin pushed past such panel-discussion topics to the more fundamental issue of how technology changed all forms, ancient and contemporary.
First, Benjamin introduces the concept of the “aura,” which he defines as the “here and now of the artwork—its unique existence in a particular place.” To know Leonardo or Rembrandt, one must be in a room with their paintings. Chartres exists only at Chartres. The journey toward art resembles a pilgrimage. The treasures of the canon have always been embedded in ritual, whether it is medieval dogma or the “art for art’s sake” theology of the nineteenth century. In the age of reproduction, however, aura decays. When copies compete with originals, and when new works are produced with technology in mind, the old values of “creativity and genius, eternal value and mystery” fall away. Far from lamenting this development, Benjamin hails it: “For the first time in world history, technological reproducibility emancipates the work of art from its parasitic subservience to ritual.”
Free of that velvet prison, art can assume a political role. Benjamin’s dream of a radicalized mass culture emerged, in part, from his conversations with Bertolt Brecht, who believed that popular media could be marshalled to revolutionary ends, as in his and Kurt Weill’s “The Threepenny Opera.” Benjamin called the process “reception in distraction,” meaning that the masses can internalize, say, Chaplin’s images of a mechanized dehumanization and begin to question the rules of society. These spectators approach watching a film not as supplicants before an altar; rather, they take pleasure in the images and appraise them critically. They do not passively contemplate; they are alert eyewitnesses. Indeed, in the documentary films of Dziga Vertov, the masses themselves become actors, and the divide between author and public disintegrates. Benjamin’s essay is furiously perceptive, although he never quite specifies how a filmmaker can sustain an explicitly radical agenda within the commercial mainstream. Chaplin’s decision to flee to Europe in the fifties illustrates the difficulty.
When Adorno read “The Work of Art,” he readily accepted the concept of the aura and its decay. Unsentimental about his own highbrow milieu, he had already done his bit to puncture the affectations of bourgeois aesthetics, and in particular the fantasy that classical music floats above society, in an apolitical haze. In the 1932 essay “On the Social Situation of Music,” Adorno wrote, “The same type of conductor who undertakes an insatiably engrossed celebration of the Adagio of Bruckner’s Eighth lives a life closely akin to that of the head of a capitalist combine, uniting in his hand as many organizations, institutes, and orchestras as possible.” Later in the decade, in the study “In Search of Wagner,” Adorno depicted the composer of the “Ring” as a master illusionist and a harbinger of Fascism.
Benjamin’s pivot toward popular culture was, however, another matter. In a 1936 letter, Adorno complained that his friend had too cavalierly consigned bourgeois art to the “counter-revolutionary” category, failing to see that independent spirits—the likes of, say, Berg, Pablo Picasso, and Thomas Mann—could still carve out a space of expressive freedom. (Adorno believed that Benjamin was too much under the spell of Brecht, who appeared ready to cast highbrow forms on the rubbish heap.) Benjamin, Adorno said in his letter, had “startled art out of every one of its tabooed hiding places,” but he was in danger of falling under new illusions, romanticizing film and other pop forms. Adorno wrote, “If anything can be said to possess an auratic character now, it is precisely the film which does so, and to an extreme and highly suspect degree.” The cinema was the new Chartres, a venue of communal rapture.
This is an insight as profound as any found in Benjamin’s essay. Pop culture was acquiring its own cultic aspect, one neatly configured for technological dissemination. Why, after all, would the need for ritual subside when the economic system remained the same? (Benjamin once wrote, “Capitalism is a purely cultic religion, perhaps the most extreme that ever existed.”) Celebrities were rising to the status of secular gods: publicity stills froze their faces in the manner of religious icons. Pop musicians elicited Dionysian screams as they danced across the altar of the stage. And their aura became, in a sense, even more magical: instead of drawing pilgrims from afar, the pop masterpiece is broadcast outward, to a captive world congregation. It radiates and saturates.
When Adorno issued his own analyses of pop culture, though, he went off the beam. He was too irritated by the new Olympus of celebrities—and, even more, by the enthusiasm they inspired in younger intellectuals—to give a measured view. In the wake of “The Work of Art,” Adorno published two essays, “On Jazz” and “On the Fetish Character of Music and the Regression of Listening,” that ignored the particulars of pop sounds and instead resorted to crude generalizations. Notoriously, Adorno compares jitterbugging to “St. Vitus’ dance or the reflexes of mutilated animals.” He shows no sympathy for the African-American experience, which was finding a new platform through jazz and popular song. The writing is polemical, and not remotely dialectical.
In the 1936 letter to Benjamin, Adorno offers a subtler argument—more of a plea for parity. Commercial logic is triumphant, he says, ensnaring culture high and low: “Both bear the stigmata of capitalism, both contain elements of change. . . . Both are torn halves of an integral freedom to which, however, they do not add up. It would be romantic to sacrifice one for the other.” In particular, it would be a mistake to romanticize the new mass forms, as Benjamin seems to do in his mesmerizing essay. Adorno makes the opposite mistake of romanticizing bourgeois tradition by denying humanity to the alternative. The two thinkers are themselves torn halves of a missing picture. One collateral misfortune of Benjamin’s early death is that it ended one of the richest intellectual conversations of the twentieth century.
If Adorno were to look upon the cultural landscape of the twenty-first century, he might take grim satisfaction in seeing his fondest fears realized. The pop hegemony is all but complete, its superstars dominating the media and wielding the economic might of tycoons. They live full time in the unreal realm of the mega-rich, yet they hide behind a folksy façade, wolfing down pizza at the Oscars and cheering sports teams from V.I.P. boxes. Meanwhile, traditional bourgeois genres are kicked to the margins, their demographics undesirable, their life styles uncool, their formal intricacies ill suited to the transmission networks of the digital age. Opera, dance, poetry, and the literary novel are still called “élitist,” despite the fact that the world’s real power has little use for them. The old hierarchy of high and low has become a sham: pop is the ruling party.
The Internet threatens final confirmation of Adorno and Horkheimer’s dictum that the culture industry allows the “freedom to choose what is always the same.” Champions of online life promised a utopia of infinite availability: a “long tail” of perpetually in-stock products would revive interest in non-mainstream culture. One need not have read Astra Taylor and other critics to sense that this utopia has been slow in arriving. Culture appears more monolithic than ever, with a few gigantic corporations—Google, Apple, Facebook, Amazon—presiding over unprecedented monopolies. Internet discourse has become tighter, more coercive. Search engines guide you away from peculiar words. (“Did you mean . . . ?”) Headlines have an authoritarian bark (“This Map of Planes in the Air Right Now Will Blow Your Mind”). “Most Read” lists at the top of Web sites imply that you should read the same stories everyone else is reading. Technology conspires with populism to create an ideologically vacant dictatorship of likes.
This, at least, is the drastic view. Benjamin’s heirs have suggested how messages of dissent can emanate from the heart of the culture industry, particularly in giving voice to oppressed or marginalized groups. Any narrative of cultural regression must confront evidence of social advance: the position of Jews, women, gay men, and people of color is a great deal more secure in today’s neo-liberal democracies than it was in the old bourgeois Europe. (The Frankfurt School’s indifference to race and gender is a conspicuous flaw.) The late Jamaican-born British scholar Stuart Hall, a pioneer of cultural studies, presented a double-sided picture of youth pop, defining it, in an essay co-written with Paddy Whannel, as a “contradictory mixture of the authentic and the manufactured.” In the same vein, the NPR pop critic Ann Powers wrote last month about listening to Nico & Vinz’s slickly soulful hit “Am I Wrong” in the wake of the unrest in Ferguson, Missouri, and catching the song’s undercurrents of unease. “Pop is all about commodification: the soft center of what adapts,” Powers writes. “But sometimes, when history collides with it, a simple song gains dimension.”
One way or another, the Frankfurt School mode of criticism—its skeptical ardor, its relentless scouring of mundane surfaces—has spread far. When online recappers expend thousands of words debating the depiction of rape on “Game of Thrones,” or when writers publish histories of sneakers or of the office cubicle, they show intense awareness of mass culture’s ability to shape society. And in some cases the analysis takes a recognizably dialectical turn, as in Hua Hsu’s 2011 essay, for Grantland, on Kanye West and Jay-Z’s album “Watch the Throne.” A dispassionate hip-hop fan, Hsu ponders the spectacle of two leading rappers making an “album against austerity,” in which they mark their ascension to a world of “MoMA and Rothko, Larry Gagosian, and luxury hotels across three continents,” and at the same time forfeit a hip-hop tradition of fantasy and protest. Citing the Kanye track “Power”—“Grab a camera, shoot a viral / Take the power in your own hands”—Hsu writes, “This version of power is entrancing—it explains an entire generation. But it also confuses ubiquity for importance, the familiarity of a celebrity’s face for true authority.” There is no telling how Adorno and Benjamin might have negotiated such contemporary labyrinths. Perhaps, on a peaceful day, they would have accepted the compromise devised by Fredric Jameson, who has written that the “cultural evolution of late capitalism” can be understood “dialectically, as catastrophe and progress all together.”
These implacable voices should stay active in our minds. Their dialectic of doubt prods us to pursue connections between what troubles us and what distracts us, to see the riven world behind the seamless screen. “There is no document of civilization which is not at the same time a document of barbarism”: Benjamin’s great formula, as forceful as a Klieg light, should be fixed as steadily on pop culture, the ritual apparatus of American capitalism, as it has been on the art works of the European bourgeoisie. Adorno asked for only so much. Above all, these figures present a model for thinking differently, and not in the glib sense touted by Steve Jobs. As the homogenization of culture proceeds apace, as the technology of surveillance hovers at the borders of our brains, the spaces in which such thinking can unfold are becoming rarer and more confined. I am haunted by a sentence from Virginia Woolf’s “The Waves”: “One cannot live outside the machine for more perhaps than half an hour.” ♦