Change Is the Only Constant
The Wisdom of Calculus in a Madcap World
By Ben Orlin
Excerpt
INTRODUCTION
“What is,” said the philosopher Parmenides, not quite a million days ago, “is uncreated and indestructible, alone, complete, immovable and without end.” It’s a bold philosophy. Parmenides permitted no divisions, no distinctions, no future, no past. “Nor was it ever, nor will it be,” he explained; “for now it is, all at once, a continuous one.” To Parmenides, the universe was like Los Angeles traffic: eternal, singular, and unchanging.
A million days later, it remains a very stupid idea.
C’mon, Parmenides. You can lull us with poetry and ply us with adjectives, but we’re not dupes. A million days ago, there were no Buddhists, Christians, or Muslims, because Buddha, Jesus, and Muhammad had yet to be born. A million days ago, Italians did not eat tomato sauce, because “Italy” wasn’t a concept and the closest tomatoes grew 6000 miles away. A million days ago, 50 or 100 million humans walked the Earth; now, that many people visit Disney-branded theme parks each year.
In fact, Parmenides, only two things were the same a million days ago as today: (1) the ubiquity of change, and (2) your philosophy’s profound and irredeemable wrongness.
That’s the last we’ll hear of Parmenides in this book (although his savvier disciple Zeno will pop up later). Good riddance to toga-clad stoners, I say. For now, we jump ahead, past his wiser contemporary Heraclitus (“you can’t step in the same river twice”), to arrive in the late 17th century, a mere 120,000 or 130,000 days ago. That’s when a scientist named Isaac Newton and a polymath named Gottfried Leibniz birthed this book’s protagonist. It was a fresh form of mathematics, a language of change, a stab at quantifying the flux and flow of this wobbling top called Earth.
Today, we call that math “calculus.”
The first tool of calculus is the derivative. It’s an instantaneous rate of change, telling us how something is evolving at a specific moment in time. Take, for example, the apple’s velocity precisely as it strikes Newton’s noggin. A second earlier, the fruit was moving a smidge slower; a second later, it will be moving in a different direction entirely, as will the history of physical science. But the derivative does not care about the second before, or the second after. It speaks only to this moment, to an infinitesimal sliver of time.
Calculus’s second tool is the integral. It is the sum of infinite pieces, each infinitesimally small. Picture how a series of circles, each shadow-thin, can unify to create a solid object: a sphere. Or how a group of humans, each as tiny and negligible as an atom, can together constitute a whole civilization. Or how a series of moments, each of them zero seconds in itself, can amount to an hour, an eon, an eternity.
Each integral speaks to a totality, to something galactic, which the panoramic lens of our mathematics can somehow capture.
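In modern dress, that circle-stacking move is a Riemann sum, and it's easy to mimic. Below is a minimal sketch in Python (my own illustration, not the book's): approximate a sphere's volume by summing thin disks, and watch the total close in on the true value, 4/3 × π × r³, as the slices thin.

```python
# A sketch of the integral's core move (my illustration, not the book's):
# build a sphere out of shadow-thin circles, i.e., sum the volumes of
# thin disks stacked from the bottom of the sphere to the top.
import math

def sphere_volume(radius, slices):
    dz = 2 * radius / slices              # thickness of each disk
    total = 0.0
    for i in range(slices):
        z = -radius + (i + 0.5) * dz      # height of this disk's center
        disk_area = math.pi * (radius**2 - z**2)
        total += disk_area * dz           # add one nearly-negligible piece
    return total

for n in (10, 100, 10_000):
    print(f"{n:>6} slices: {sphere_volume(1.0, n):.6f}")
# The sums approach 4/3 * pi = 4.188790..., the sphere's true volume.
```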
The derivative and integral have earned a lofty reputation as specialized technical tools. But I believe they can offer more. You and I are like little boats, knocked by waves, spun by whirlpools, thrown by rapids. The derivative and the integral, I hold, are pocket-sized philosophies: extendable oars for navigating this flood-swollen river of a world.
Hence, this book, and its attempts to distill wisdom from mathematics.
In the first half, Moments, we’ll explore tales of the derivative. Each extracts an instant from the babbling stream of time. We’ll consider a millimeter of the moon’s orbit, a nibble of buttered toast, a dust particle’s erratic leap, and a dog’s split-second decision. If the derivative is a microscope, then each of these stories is a carefully chosen slide, a scene in miniature.
In the second half, Eternities, we’ll call upon the integral and its power to unify infinite droplets into a single stream. We’ll encounter a circle fashioned from tiny slivers, an army raised from myriad soldiers, a skyline built of anonymous buildings, and a cosmos heavy with a billion trillion stars. If the integral is a widescreen cinema, then each of these stories is a sweeping epic that you’ve got to see in theaters. The TV at home won’t do it justice.
I want to be clear: the object in your hands won’t “teach you calculus.” It’s not an orderly textbook, but an eclectic and humbly illustrated volume of folklore, written in nontechnical language for a casual reader. That reader may be a total stranger to calculus, or an intimate friend; I’m hopeful that the stories will bring a little mirth and insight either way.
This storybook is by no means complete—missing are the tales of Fermat’s bending light, Newton’s secret anagram, Dirac’s impossible function, and so many others. But in an ever-changing world, no volume is ever exhaustive, no mythology ever finished. The river runs on.
BEN ORLIN
DECEMBER 2018
The moment of change is the only poem.
—ADRIENNE RICH
MOMENTS
MOMENT I.
Time claims another victim.
I.
THE FUGITIVE SUBSTANCE OF TIME
Jaromir Hladik has written several books, none to his satisfaction. One, he deems “a product of mere application.” Another is “characterized by negligence, fatigue, and conjecture.” A third attempted to refute a fallacy, but did so with arguments “not any less fallacious.” I myself have only birthed books as flawless and sparkling as toothpaste commercials, but even so, I can empathize—especially with the little hypocrisy that gets Hladik through the day. “Like every writer,” Jorge Luis Borges tells us, Hladik “measured the virtues of other writers by their performance, and asked that they measure him by what he conjectured or planned.”
And what has Hladik planned? Oho! Hladik is glad you asked: It’s a verse drama titled The Enemies, and it will be nothing less than his masterpiece. It will gild his legacy, cow his brother-in-law, even redeem “the fundamental meaning of his life”—if only he can clear the small hurdle of, you know, writing it.
Here I apologize, because our story takes a dark turn. Hladik—a Jew in Nazi-controlled Prague—is arrested by the Gestapo. A perfunctory trial leads to a death sentence. On the eve of his execution, he prays to God:
If I exist at all, if I am not one of Your repetitions and errata, I exist as the author of The Enemies. In order to complete this drama, which may serve to justify me, to justify You, I need one more year. Grant me that year, You to whom belong the centuries and all time.
The sleepless night passes, the execution day dawns, and then, just as the sergeant barks the final command to the firing squad, just as Hladik braces for death, just as all appears irretrievably lost… the universe freezes.
God has granted him a secret miracle. This single instant—with a raindrop rolling down his cheek and the fatal bullets still en route—has been enlarged, extended, dilated. The world is suspended, but his thoughts are not. Now Hladik can complete his drama, composing and polishing the stanzas entirely in his mind. The moment will endure for a year.
Here, on the cusp of a fate no one could envy, Hladik receives a gift that’s the envy of all.
“The aim of every artist,” William Faulkner once wrote, “is to arrest motion, which is life, by artificial means and hold it fixed.” (Hladik himself is, of course, a work of fiction by Jorge Luis Borges.) Tempus fluit, wrote Isaac Newton: “time flows.” Tempus fugit, declared sundials in the Middle Ages: “time flees.” Although our purposes vary, all of us—artists, scientists, even those glib know-nothings we call “philosophers”—chase the same impossible prize. We want to grasp time, to hold the singular moment in our hands, the way that Hladik did.
Alas, time dodges and evades. Consider the famous “paradox of the arrow,” from incurable Greek troll Zeno of Elea.
The idea: Picture an arrow flying through the air. Now, in your mind, freeze it at a single moment, like Hladik’s firing squad. Is the arrow still moving? No, of course not—a freeze-frame is, by definition, frozen. In any given instant, the arrow is motionless. But if time is made of moments… and the arrow is in no moment ever moving… then how, exactly, can it move?
Philosophers in ancient China played similar mind games. “The dimensionless cannot be accumulated,” one wrote. “Its size is a thousand miles.” In the mathematical sense, a moment is dimensionless: It possesses no length, no duration. It’s zero seconds long. But since two times zero is still zero, two moments will also amount to zero time. The same holds for 10 moments, or a thousand, or a million. In fact, any countable number of moments will still total up to zero seconds.
But if no supply of moments ever amounts to any time, then where do months and years and cricket matches come from? How can infinitesimal moments make up an infinite timeline?
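To put the trap in bare symbols (my notation, not the book's): any countable pile of zero-length moments sums to zero, and yet an hour chopped into n equal slivers always reassembles into an hour, no matter how large n grows.

$$0 + 0 + 0 + \cdots = 0, \qquad \text{and yet} \qquad \underbrace{\frac{T}{n} + \frac{T}{n} + \cdots + \frac{T}{n}}_{n\ \text{pieces}} = T \quad \text{for every } n.$$

Calculus's escape route, as we'll see, is to reason with the shrinking slivers rather than with the zero-size moments themselves.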
Virginia Woolf noted that time “makes animals and vegetables bloom and fade with amazing punctuality.” But it “has no such simple effect upon the mind of man. The mind of man, moreover, works with equal strangeness upon the body of time.”
We chase the moment across history, mutilating time as we go. With hourglasses and candle clocks, we sliced days into hours. With pendulums and escapements, we carved hours into minutes (the etymology: “a minute fraction of an hour”), and thence into seconds (as in “a second order” of tiny; a minute fraction of a minute). From there we decomposed time into milliseconds (half a flap of a fly’s wings), microseconds (a fancy strobe light’s flash), and nanoseconds (each enough time for light to travel a foot), not to mention pico-, femto-, atto-, zepto-, and yoctoseconds. After that the names peter out, presumably because Dr. Seuss ran out of ideas, but still we slice on. Eventually, eternity crumbles into units of “Planck time,” about a billionth of a trillionth of a yoctosecond, or just enough for light to travel a hundred-millionth of a trillionth of the way across a proton. No instrument can reach beyond this ultimate in brevity: physicists insist that it’s the smallest meaningful unit of time in the universe, as far as we understand (or, like me, fail to understand).
| LENGTH | SECONDS | SIGNIFICANCE |
|---|---|---|
| 1 minute | 60 | Longest recorded gap between superhero movie releases |
| 1 second | Uh… 1 | Length of a sneeze, or 0.1% of a kilosneeze |
| 1 millisecond | 1/1,000 | Average human attention span |
| 1 microsecond | 1/1,000,000 | Length at which video buffering becomes intolerable |
| 1 nanosecond | 1/1,000,000,000 | Time required for a dog to decide it doesn’t trust me |
| 1 Planck time | about 5 × 10⁻⁴⁴ | Time after which I lose the thread when physicists discuss quantum effects, such as the Planck time |
| 1 moment | Zero | ?!?!?!?!?!?!?! |
Where, oh where, is “the moment”? Is it somewhere past Planck? If we can neither gather moments into intervals, nor break intervals into moments, then what even are these invisible, indivisible things? As I write this book in the tick-tock world of ordinary time, in what effervescent nonworld is Hladik writing his?
In the 11th century, mathematics first articulated a tentative answer. While European mathematics was pulling out its hair trying to calculate the date of Easter, Indian astronomy was busy predicting eclipses. This required pinpoint precision. Astronomers began throwing around units of time so brief that it would be almost a millennium before any timepiece could possibly measure them. One truti amounted to less than 1/30,000 of a second.
These practically infinitesimal slivers of time paved the road to a concept called tatkalika-gati: instantaneous motion. How fast, and in what direction, is the moon moving right at this exact moment?
And then, what about this moment?
And what about now?
And now?
These days, tatkalika-gati goes by a more pedestrian name: “the derivative.”
Consider a speeding bicycle. The derivative measures how fast its position is changing—i.e., the bike’s velocity in a given moment. On a graph of the bike’s position over time, it’s the steepness of the curve. A steeper curve signifies a faster bike, and thus a greater derivative.
Of course, in any given moment, the bicycle is like Zeno’s arrow: motionless. Thus, we can’t calculate derivatives via freeze-frame. Instead, we work by zooming in. First, determine the bike’s speed over a 10-second interval; then, try a 1-second interval; then, a 0.1-second interval, and a 0.01-second interval, and a 0.001-second interval…
In this sly manner, we sneak up on the instant, drawing closer and closer and closer until a pattern becomes clear.
| START | END | AVERAGE SPEED |
|---|---|---|
| 12:00:00 | 12:00:10 | 39 mph |
| 12:00:00 | 12:00:01 | 39.91 mph |
| 12:00:00 | 12:00:00.1 | 39.98 mph |
| 12:00:00 | 12:00:00.01 | 39.997 mph |

At precisely noon… SPEED: 40 mph.
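The same zooming-in procedure is easy to mimic numerically. Here is a minimal sketch in Python, with a made-up position function (my assumption, not the book's data): shrink the interval and watch the average speeds creep toward the instantaneous one.

```python
# Sneaking up on the instant (illustrative sketch; the position
# function is invented for this example, not taken from the book).

def position(t):
    """Bike's position in miles, t hours after noon (hypothetical)."""
    return 40 * t - 10 * t**2   # the bike is gradually slowing down

for dt in (0.1, 0.01, 0.001, 0.0001):
    avg_speed = (position(dt) - position(0)) / dt   # miles per hour
    print(f"average over first {dt} hours: {avg_speed:.4f} mph")

# The averages creep toward 40 mph: the derivative, the speed at noon itself.
```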
For another example, take a fizzing reaction, as two chemicals wed their little chemical parts to form a new baby chemical. The derivative measures how fast the product’s concentration is growing—i.e., the rate of the reaction in a given moment.
Or consider an island overrun with rabbits. The derivative measures how fast the population’s size is changing—i.e., its rate of growth in a given moment. (For such a graph, we must briefly entertain the fiction of “fractional rabbits,” but if your suspension of disbelief has come this far, I trust it to survive any challenge.)
This bread-and-butter mathematical concept is oddly like a poet’s fancy. The derivative is “instantaneous change”: movement captured in a moment, like lightning in a bottle. It’s the repudiation of Zeno, who said that nothing can happen in a single instant, and the vindication of Hladik, who believed that anything can.
By now, perhaps you’ve guessed the end of Hladik’s tale. For 12 months, he composes his play. He writes not “for posterity,” Borges tells us, “nor even for God, of whose literary preferences he possessed scant knowledge.” Instead, he writes for himself. He writes to satisfy what Thomas Wolfe considered the perpetual itch of the artist:
to fix eternally in the patterns of an indestructible form a single moment of man’s living, a single moment of life’s beauty, passion, and unutterable eloquence, that passes, flames and goes, slipping for ever through our fingers with time’s sanded drop, flowing for ever from our desperate grasp even as a river flows and never can be held.
Hladik has held the river. It does not matter that no one will read The Enemies, or that the bullets will, in no time at all, resume their course. It matters only that he has composed his book, that it will now live forever, in this single moment, which is its own kind of eternity.
MOMENT II.
Isaac Newton decides the moon is an apple, and vice versa.
II.
THE EVER-FALLING MOON
Isaac Newton was a curious child. I mean “curious” as in “always questing for knowledge,” and also as in “utterly weird.” According to one story, he would become so engrossed in reading that his pet cat grew fat off of his untouched meals. Or consider his first exploration of optics. Ever meet a kid so curious he’d risk his own sight for a glimpse of the truth? He writes in his journal: “I took a bodkin”—that’s a thick, blunt needle—“& put it betwixt my eye & [the] bone as neare to [the] backside of my eye as I could: & pressing my eye… there appeared severall white, darke & coloured circles.”
It’s a shame, but today we rarely remember Newton as a self-mutilating owner of obese housecats. Instead, we remember him as a guy who got hit on the head by fruit.
Actually, the noggin impact was a later embellishment. As Sir Isaac himself told the tale, all it took was a glimpse of a falling apple to set the clockwork of his mind into historic motion. “As he sat alone in a garden,” recalled Henry Pemberton, a personal friend of Newton’s, “he fell into a speculation on the power of gravity.” The apple’s fall prompted him to reflect that no matter how high we rise—rooftops, treetops, mountaintops—gravity does not diminish. It is, to repurpose a phrase from Albert Einstein, “spooky action at a distance.” The matter of Earth seems to attract the matter of objects, no matter how far removed they are.
The curious young man probed further. (No bodkins this time; just thoughts.) What if gravity reaches beyond the mountaintops? What if its pull extends outward farther than we’ve guessed?
What if it goes all the way to the moon?
Aristotle would never have believed it. The stars obey perfect patterns, symphonic cycles, like my wife’s family organizing a dinner party. Life on Earth is anarchy, a mud splatter, like me organizing a dinner party. How could the two realms possibly follow the same laws? What eye-gouging madman would dare unify the terrestrial and the celestial?
Well, in the spring of 1666, that madman was 23 years old, relaxing in the shade of his mother’s garden. He watched an apple fall, and then, by some inspired stroke, he imagined a second falling apple, this one as distant as the moon. One small step for a McIntosh, one giant leap for fruitkind.
He knew the distance, roughly: if Earth’s surface sits one unit from its center, then the moon is 60 units away.
At such a tremendous remove, how might gravity act?
Even the loftiest mountains offer no clue. Compared to the moon, the peak of Everest sits practically on the skin of the Earth, only a cosmic hair’s breadth away. But let’s suppose—in an enormous and only slightly ahistorical leap—that gravity decays at greater distances. The farther you go, the weaker its force. I’m referring to Newton’s famous “inverse square law”:
At twice the distance, there’s 1/4 the gravity.
At triple the distance, 1/9 the gravity.
At ten times the distance, just 1/100 the gravity.
Our brave spacefaring apple, 60 times farther away from Earth’s core than its timid orchard-dwelling cousins, would experience just 1/3600 the gravity. If you’ve never divided something by 3600, let me editorialize: it makes things a lot smaller.
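Spelled out as an equation (the text's own numbers, simply put in symbols), with distance measured in Earth-radii from the planet's center:

$$\frac{\text{gravity at the moon's distance}}{\text{gravity at the surface}} = \left(\frac{1}{60}\right)^{2} = \frac{1}{3600}$$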
Drop an apple near the surface of the Earth, and it falls 4.9 meters in the first second. That’s about the height of a second-story window.
Drop our astro-apple from moon height, and in the first second it descends just over 1 millimeter. That’s the thickness of a nice credit card.
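Both figures follow from the same one-second fall (distance = ½gt², a standard formula, not the book's notation), scaled down by the inverse square law:

$$d_{\text{surface}} = \tfrac{1}{2}\,(9.8\ \text{m/s}^2)\,(1\ \text{s})^2 = 4.9\ \text{m}, \qquad d_{\text{moon}} = \frac{4.9\ \text{m}}{3600} \approx 1.4\ \text{mm}$$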
Back then, the explanation of the moon’s orbit remained an open mystery. René Descartes’s vortex theory—that all heavenly objects are swept along in their paths by swirling bands of particles, like bath toys circling a drain—reigned as the favored explanation. But this was a time of change: Newton’s annus mirabilis, his “miracle year” (which “miraculously” lasted more like 18 months). During this solitary stretch at his mother’s cottage in Woolsthorpe, England, waiting out the plague that was ravaging London, Newton developed the ideas that would launch modern math and science. He articulated his laws of motion, unlocked the optical secrets of the prism, managed to keep his eyes free of household objects, and discovered calculus.
Along the way, he dethroned Descartes’s vortices with the toss of an apple.
As Newton’s predecessor and soul brother Galileo knew, horizontal motion does not affect vertical motion. Drop one apple; launch an identical apple sideways; and they’ll hit the ground at the same moment. Sure, their horizontal trajectories diverge, but their vertical motions obey the same dictatorial force: gravity.
Now, take your apples to a very high mountaintop, and throw them with superhuman speed. Congratulations: you’ve stepped inside a celebrated diagram from Newton’s masterwork (the Principia), illustrating the peculiar physics of high-speed falling.
Here, thanks to the planet’s curvature, our tidy vertical/horizontal distinction evaporates. One moment’s “horizontal” is the next moment’s “vertical.” The stronger the throw, the more prolonged the fall.
Throw the apple hard—like, major-league-pitcher hard—and it will travel a little distance before falling to earth. It may reach point A or B.
Throw the apple really hard—like, Red-Sox-pitcher-throwing-at-an-uppity-Yankees-player hard—and its horizontal motion now leads it away from the planet, prolonging the fall. Perhaps it travels all the way to C.
Throw the apple stupendously hard—like, Henry-Rowengartner-on-steroids hard—and it flies away from Earth so fast that each moment’s falling merely restores the apple to its original height. The apple can thereby fall forever.
An orbit is just a perpetual fall, with no Cartesian vortices required.
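You can watch this happen in a toy simulation. The sketch below (my own, with made-up launch numbers: a 300-kilometer launch height and three throwing speeds) steps an apple forward under the inverse square law and reports whether it ever hits the ground.

```python
# Toy version of Newton's cannonball diagram (my illustration, not the
# book's): throw an apple sideways from 300 km up and let gravity --
# always pointing at Earth's center -- curve its path.
import math

GM = 3.986e14        # Earth's mass times G, in m^3/s^2
R_EARTH = 6.371e6    # Earth's radius, in meters

def throw(speed_m_per_s, seconds=20_000, dt=1.0):
    x, y = 0.0, R_EARTH + 300_000        # start 300 km above the surface
    vx, vy = speed_m_per_s, 0.0          # thrown horizontally
    for _ in range(int(seconds / dt)):
        r = math.hypot(x, y)
        if r < R_EARTH:
            return "falls to earth"
        a = GM / r**2                    # inverse square law
        vx -= a * (x / r) * dt           # accelerate toward the center...
        vy -= a * (y / r) * dt
        x += vx * dt                     # ...then drift (semi-implicit Euler)
        y += vy * dt
    return "keeps falling around the Earth"

for v in (1_000, 4_000, 7_700):          # meters per second
    print(f"thrown at {v} m/s: {throw(v)}")
```

At the two slower speeds the apple comes down somewhere over the horizon; near 7.7 kilometers per second, each moment's fall merely restores its height, and it never lands.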
How would this work with our intrepid moon-apple? Well, this is calculus, so consider a virtually infinitesimal moment: a single second of travel. Over such a brief interval, the curved arc of the orbit might as well be a straight line.
Alongside that line, mark the distance that the apple would fall in this second if left to gravity’s devices alone—the millimeter-and-change we calculated above.
Now what? Newton’s next move is a nifty geometric argument. We’ve created a little right-angled triangle: the fall is one short leg, and the hypotenuse (i.e., the longest side) is the distance we want—the apple’s travel in that second. So we embed it in a larger triangle that shares the same proportions, one whose longest side is the orbit’s full diameter.

Because the triangles are the same shape, their sides relate by the same ratio:

$$\frac{\text{distance fallen}}{\text{distance traveled}} = \frac{\text{distance traveled}}{\text{diameter of the orbit}}$$

Solving this equation yields the following solution:

$$\text{distance traveled} = \sqrt{\text{distance fallen} \times \text{diameter}} \approx \sqrt{(0.0014\ \text{m}) \times (7.7 \times 10^{8}\ \text{m})} \approx 1000\ \text{m}$$
Our apple descends, you may recall, at the gentle rate of 1 millimeter per second—about 3% the ground speed of a sloth. And yet, to keep the fruit in orbit, we must fling it sideways at a speed of 1 kilometer per second—about triple the speed of sound.
This strikes me as simple, extraordinary—and, on its face, implausible. The moon, falling like a flung apple? Really, Sir Isaac? Can you confirm this wackadoodle thought experiment with any—oh, what’s it called—evidence?
Well, consider the time our moon-apple needs to circle the Earth. At such an enormous distance, it will have to traverse a path 2.5 million kilometers in circumference. Moving just over 1 kilometer per second, how long will this take?
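A back-of-envelope computation, sketched below with rounded figures of my own (not the book's), suggests where this is heading:

```python
# The punchline arithmetic, roughed out (my rounded numbers, not the
# book's): lap length divided by sideways speed gives the orbit's period.
import math

R = 60 * 6.371e6                   # moon's distance: 60 Earth-radii, meters
drop = 0.0014                      # meters fallen in the first second

speed = math.sqrt(drop * 2 * R)    # sideways speed from the triangle ratio
lap = 2 * math.pi * R              # circumference of the orbit (~2.4M km)
print(speed)                       # ~1,000 m/s, as derived above
print(lap / speed / 86_400)        # ~27 days
```

The result lands within a whisker of the moon's actual orbital period of about 27.3 days—exactly the kind of evidence the thought experiment needs.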