Given his pan-intellectual dexterity and neurotically analytic zeal, his taste for vicious circles and recursive loops, his ability to locate beauty and comedy in atonal technical jargon, David Foster Wallace is the perfect parachute buddy for a free fall into the mathematical and metaphysical abyss that is infinity. Just don’t expect a smooth landing—or any sort of landing, really. Less than a third the size of Infinite Jest, Everything and More sees the king of cross-purposes taking on the biggest, most dread-inducing paradox of all, with results as thrilling and maddening, as open-ended and self-devouring, as much of his fiction. A perversely tidy philosophical history of the messiest of ideas, the book is also a fibrous extreme-math gauntlet that, even as it insists on rigor, knows it can’t possibly live up to its title.
Tortured self-consciousness is a strange default mode for an educational manual, but it’s one Wallace has deftly applied to numerous ends. Creeping vine-like from a thicket of data-fetish digressions and second-guessing qualifications, his prose contorts itself into a position of consternated empathy, and it’s fascinating—even heartwarming—to observe this manic, menschy intensity thawing out the chillier echelons of calculus and set theory. As brain-twistingly abstruse as the math gets, this much is clear: David Foster Wallace feels your pain.
E&M opens on an apologetic note (“Unfortunately this is a Foreword you actually have to read”), and proceeds to fret over the variable math-savviness of potential readers and the corresponding bogglement or boredom likely to ensue. Wallace’s solicitous solution is to pepper the text (which abounds in hopscotching footnotes, torrential glossaries, and nested interpolations) with a suitably Greek-looking acronym—“IYI,” for “if you’re interested”—to denote passages that are skippable or skimmable, depending on whether you already know the stuff or probably aren’t going to get it. When the pages become more math-congested, he sounds like a kindly dentist, dispensing little squeezes of encouragement: “It would maybe be good to prepare yourself, emotionally, for having to read the following more than once.”
Discussions of ∞—the ultimate abstraction, backlit with Godlike glow—tend to begin in the spiritual ether. Wallace opts to begin in bed, in the pre-alarm early-morning netherworld when “abstract thinking tends most often to strike.” (At this point he’s already whipped out his trusty OED and subjected abstract to an emergency vivisection.) From under the covers (“Another sure sign it’s abstract thinking: you haven’t moved yet”), he gets deep and technical, recasting fuzzy ∞ panic as a crescendo of piercing ontological dilemmas: “In what way do abstract entities exist?” “At what point do the questions get so abstract . . . and the cephalalgia so bad that we simply can’t handle thinking of it anymore?”
Void thus stared into, Wallace pulls back to plot the history of ∞-as-concept, which is largely one of apprehension and equivocation—first feared (by the Greeks, who outlawed it along with zero and the negative numbers), then dismissed and fudged, and finally reimagined as a valid mathematical entity. A key paradox: ∞ is both an agoraphobic and a claustrophobic idea. Big ∞ concerns the never-ending nature of the counting numbers. Little ∞ (or infinitesimals) requires a dizzying perspective shift: ∞ isn’t just what looms at the unthinkably far right of the number line but can in fact be found between, say, 0 and 1, or within any interval at all: Any finite-seeming bit of the number line is infinitely dense with intermediate points. This is closely related to Zeno’s notorious dichotomy, a literally paralyzing proposition which argues that to traverse any distance, an infinite number of subdistances must first be traversed, a supposedly impossible task, meaning motion itself is impossible.
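The standard dissolution of the dichotomy, which the book builds toward, is that infinitely many subdistances can sum to a finite total via a convergent geometric series. A minimal sketch of that arithmetic (the code and function name are mine, not Wallace’s):

```python
# Zeno's dichotomy: to cross a unit distance you must first cover 1/2,
# then 1/4, then 1/8, and so on -- infinitely many legs. Yet the partial
# sums 1/2 + 1/4 + ... + 1/2^n approach 1 without ever exceeding it.

from fractions import Fraction

def partial_sum(n):
    """Exact sum of the first n subdistances: 1/2 + 1/4 + ... + 1/2^n."""
    return sum(Fraction(1, 2**k) for k in range(1, n + 1))

# The gap remaining after n legs is exactly 1/2^n, shrinking toward 0:
for n in (1, 4, 10):
    s = partial_sum(n)
    print(n, s, 1 - s)
```

Exact rational arithmetic (rather than floats) keeps the pattern visible: after n legs the runner is short of the goal by precisely 1/2ⁿ, which is why the full infinite sum is exactly 1.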
Efforts to represent continuity mathematically—an important ∞-related tussle—suffered a body blow in “the Greek version of Watergate”: the Pythagoreans’ distressing discovery of irrational numbers. Aristotle’s slippery distinction between acceptable potential infinity and nonexistent actual infinity allowed math to hunker down in contented denial for centuries. Calculus emerged in the late 1600s, providing a framework for continuity and change, but Wallace likens its early evolution to a “stock-market bubble”—spurred by real-world applications, it continued to advance on the shakiest of foundations.
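The scandal can be made concrete: √2, the diagonal of a unit square, cannot be written as a ratio p/q, because p² = 2q² has no solution in positive integers (if it did, both sides could be halved forever, the classic infinite-descent proof). A brute-force check over a small range, offered as illustration rather than proof (function name and search bound are mine):

```python
# The Pythagorean crisis: sqrt(2) is irrational, i.e. p^2 = 2*q^2 has no
# integer solutions. Searching a finite range can't prove this, but it
# shows the equation failing everywhere we look.

import math

def find_rational_root2(limit):
    """Search for 0 < q <= limit such that 2*q*q is a perfect square."""
    for q in range(1, limit + 1):
        p = math.isqrt(2 * q * q)   # integer square root (floor)
        if p * p == 2 * q * q:
            return (p, q)           # would be a counterexample
    return None

assert find_rational_root2(10_000) is None  # no p/q found, as the proof demands
```

The `math.isqrt` call gives an exact integer square root, avoiding the floating-point rounding that would make such a check unreliable.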
The post-calculus breakthroughs that serve as E&M’s climax take place in 19th-century Germany. Recognized today as the founder of set theory and transfinite math, Georg Cantor dared to conceive of actual infinities—and eventually came up with the astonishing proof that some infinities are bigger than others. He was a polarizing figure who died in a sanitarium while trying to prove that Francis Bacon authored Shakespeare’s plays. Wallace relegates much of his tragic, Nash-like bio to terse footnotes, and rejects the glib archetype of the nutty math professor. It’s particularly inappropriate here, considering Cantor’s great achievement was to liberate ∞ from its long-standing role as sanity hazard: “Saying that ∞ drove Cantor mad is sort of like mourning St. George’s loss to the dragon.”
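The “bigger infinities” claim rests on Cantor’s diagonal argument: given any purported list of all infinite binary sequences, you can build a sequence that differs from the n-th entry at position n, so it appears nowhere on the list. A schematic sketch (the functional representation here is my own, not the book’s):

```python
# Cantor's diagonal argument, schematically. A "listing" maps an index n
# to the n-th sequence; each sequence maps a position k to a binary digit.
# The diagonal sequence flips the n-th digit of the n-th sequence, so it
# disagrees with every listed sequence somewhere -- no listing is complete.

def diagonal(listing):
    """Return a sequence differing from listing(n) at position n, for all n."""
    return lambda n: 1 - listing(n)(n)

# A toy listing: s_n has a 1 at every position divisible by n + 1.
listing = lambda n: (lambda k: 1 if k % (n + 1) == 0 else 0)

d = diagonal(listing)
# d escapes the listing: it differs from s_n at position n, for every n.
assert all(d(n) != listing(n)(n) for n in range(100))
```

Since the argument works against *any* listing, the sequences (and hence the real numbers) cannot be paired off one-to-one with the counting numbers: a strictly larger infinity.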
Expunging the tone of avuncular condescension that taints much pop-science prose, E&M is also a heartfelt reaction to the blindly mnemonic methods that too many math teachers favor, even at the college level. It’s a measure of Wallace’s generosity that he rarely tells the reader to just accept something on faith alone. He’s almost monomaniacal in prioritizing ideas over narrative templates. Kurt Gödel, “modern math’s absolute Prince of Darkness,” puts in a wrench-throwing late appearance with his evocatively named Incompleteness Theorem. In view of the subject, it’s only fitting that there’s no ending, let alone a happy one.