Book Review and Highlights: The One Device, by Brian Merchant

The One Device by Brian Merchant

“Steve Jobs will forever be associated with the iPhone. He towers over it, he introduced it to the world, he evangelized it, he helmed the company that produced it. But he did not invent it.

Proving the lone-inventor myth inadequate does not diminish Jobs’s role as curator, editor, bar-setter—it elevates the role of everyone else to show he was not alone in making it possible.”

In “The One Device,” Brian Merchant spends more than 400 pages shattering the illusion that the iPhone is, indeed, just that: one device, invented by one person.

Instead, Merchant takes us back in time, sometimes hundreds of years, to explore the people, ideas, and progress that eventually allowed Steve Jobs to weave together one of the most impactful—and most profitable—products in history.


At times the technical detail is drawn too finely. The book bogs down in spots, mulling over information in a way that would never fly on a modern iPhone assembly line. In other spots, the pacing soars. When Merchant goes inside the firmly locked doors at Apple, describing the incredible effort so many people put forth to create the iPhone, the tension and sense of urgency are palpable and the story moves quickly.

The most interesting passages detail the drama and office politics that swirled around the iPhone as it evolved from long-shot idea to prototypes and, eventually, the assembly line. Jobs lives up to his ruthless reputation in the book, but we also see that many people sacrificed their time, freedom—and sometimes their health or marriages—to bring the iPhone to life.

We also learn about the iPhone’s impact on the lithium mines deep in the Atacama Desert in Chile. And Merchant sneaks into the monolithic (and dystopian) Foxconn compound in China to show us what life is like for the thousands of workers who assemble the devices we use every day.

Like the iPhone, this book packs a lot inside its frame: deep history, tales of Jobs’s Machiavellian machinations, and the backbiting, secrecy, and sacrifice that swirled around the creation of the device.

If you’re a fan of Apple, Steve Jobs, and/or the iPhone, this book will shatter popular myths of Steve Jobs as the singular inventor or the iPhone itself as, well, The One Device. 

The iPhone is the result of many people who made technological progress over hundreds of years. It exists because of the sacrifice of many people at Apple, and thousands of people doing incredibly hard work along the supply chain, from raw materials to finished product.

Jobs, rather than a singular inventor, is shown to be a curator, building on technological advancements of the past, pushing others to carry them forward, and synthesizing it all into the device we know today. Merchant does a good job chasing down that path and laying it out for us.


Kindle highlights from The One Device

Teardown

“Every once in a while, a revolutionary product comes along that changes everything,” he said. “Well, today, we’re introducing three revolutionary products of this class. The first one is a wide-screen iPod with touch controls. The second is a revolutionary mobile phone. And the third is a breakthrough internet communications device. An iPod, a phone, and an internet communicator… “Are you getting it? These are not three separate devices; this is one device, and we are calling it iPhone. “Today,” he added, “Apple is going to reinvent the phone.” Then it did.

The top car brand, the Toyota Corolla: 43 million units. The bestselling game console, the Sony PlayStation: 382 million. The number-one book series, Harry Potter: 450 million books. The iPhone: 1 billion.

Profit margins on the iPhone have been reported to be as high as 70 percent, and as “low” as 41 percent.

The iPhone isn’t just a tool; it’s the foundational instrument of modern life.

The left side of the unit is filled with a long, flat battery—it takes up half the iPhone’s real estate.

The logic board, a bracket-shaped nest that houses the chips that bring the iPhone to life, wraps around to the right. A family of cables snake around the top.

“There’s four different cables connecting the display assembly to the rest of the phone,”

“One is going to be the digitizer that receives your touch input. So that works with a whole array of touch capacitors embedded in the glass. You can’t actually see them, but when your finger’s there… they can detect where you’re touching. So that’s its own cable, and the LCD has its own cable, the fingerprint sensor has its own cable. And then the last cable up here is for the front piece and camera.”

This book is an attempt to trace those cables—not just inside the phone, but around the world and throughout history.

“There was an extraordinary cult of Jobs as seemingly the inventor of this world-transforming gizmo, when he wasn’t,” the historian David Edgerton says. “There is an irony here—in the age of information and the knowledge society, the oldest of invention myths was propagated.” He’s referring to the Edison myth, or the myth of the lone inventor—the notion that after countless hours of toiling, one man can conjure up an invention that changes the course of history.

It is very rare that there’s a single inventor of a new technology—or even a single responsible group. From the cotton gin to the lightbulb to the telephone, most technologies are invented simultaneously or nearly simultaneously by two or more teams working entirely independently.

Ideas really are “in the air,” as patent expert Mark Lemley puts it.

the iPhone is what’s often called a convergence technology. It’s a container ship of inventions, many of which are incompletely understood.

multitouch was developed decades earlier by a trail of pioneers from places as varied as CERN’s particle-accelerator labs to the University of Toronto to a start-up bent on empowering the disabled.

while I made Apple officials fully aware of this project from the outset and repeatedly spoke with and met their PR representatives, they declined my many requests to interview executives and employees.

i: Exploring New Rich Interactions
iPhone in embryo

Apple’s user-testing lab at 2 Infinite Loop had been abandoned for years. Down the hall from the famed Industrial Design studio, the space was divided by a one-way mirror so hidden observers could see how ordinary people navigate new technologies. But Apple didn’t do user testing, not since Steve Jobs returned as CEO in 1997. Under Jobs, Apple would show consumers what they wanted, not solicit their feedback. But that deserted lab would make an ideal hideaway for a small group of Apple’s more restless minds, who had quietly embarked on an experimental new project.

Their mission was vague but simple: “Explore new rich interactions.” The ENRI group, let’s call it, was tiny.

“There was a core little secret group,” says one member, Joshua Strickon, “with the goal of re-envisioning input on the Mac.”

The story of the iPhone starts, in other words, not with Steve Jobs or a grand plan to revolutionize phones, but with a misfit crew of software designers and hardware hackers tinkering with the next evolutionary step in human-computer symbiosis.

So, the key proto-iPhoners were European designers and East Coast engineers. They all arrived at Apple during its messy resurgent years, just before or just after the return of Jobs.

“What are the new features that we want to have in our experiences?”

At the time, touch tech was largely limited to resistive screens—think of old ATMs and airport kiosks. In a resistive touchscreen, the display is composed of layers—sheets coated with resistive material and separated by a tiny gap. When you touch the screen with your finger, you press the two layers together; this registers the location of said touch.

Instead of relying on force to register a touch, capacitive sensing puts the body’s electrochemistry to work. Because we’re all electrical conductors, when we touch a capacitive surface, it creates a distortion of the screen’s electrostatic field, which can be measured as a change in capacitance and pinpointed rather precisely.
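
To make that concrete, here is a minimal sketch (purely illustrative, not Apple's algorithm) of how a grid of capacitance readings becomes touch points: any cell whose reading deviates from its untouched baseline by more than a threshold counts as touched, and neighboring touched cells are grouped into a single contact, which is what lets a capacitive screen report two fingers at once.

```python
# Minimal sketch of capacitive touch detection (illustrative only, not Apple's code).
# Each cell holds a capacitance delta from its untouched baseline; a touch shows up
# as a cluster of cells whose delta exceeds a threshold.

THRESHOLD = 5.0  # hypothetical units

def find_touches(deltas, threshold=THRESHOLD):
    """Group above-threshold cells into contacts and return their centroids."""
    rows, cols = len(deltas), len(deltas[0])
    seen, touches = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or deltas[r][c] < threshold:
                continue
            # Flood-fill the cluster of neighboring activated cells.
            stack, cluster = [(r, c)], []
            seen.add((r, c))
            while stack:
                y, x = stack.pop()
                cluster.append((y, x))
                for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and (ny, nx) not in seen and deltas[ny][nx] >= threshold):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            # Report the capacitance-weighted centroid as the touch location.
            total = sum(deltas[y][x] for y, x in cluster)
            cy = sum(y * deltas[y][x] for y, x in cluster) / total
            cx = sum(x * deltas[y][x] for y, x in cluster) / total
            touches.append((cy, cx))
    return touches

grid = [
    [0, 0, 0, 0, 0],
    [0, 9, 8, 0, 0],   # one finger here...
    [0, 7, 6, 0, 0],
    [0, 0, 0, 0, 7],   # ...and another here
    [0, 0, 0, 6, 9],
]
print(find_touches(grid))  # two contacts -> multitouch
```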

CHAPTER 1: A Smarter Phone
Simon says, Show us the road to the smartphone

The year was 1993. The visionary inventor was Frank Canova Jr., who was working as an engineer in IBM’s Boca Raton, Florida, labs. Canova conceived, patented, and prototyped what is widely agreed to be the first smartphone, the Simon Personal Communicator, in 1992.

“I really don’t see the iPhone as an invention so much as a compilation of technologies and a success in smart packaging,”

“The iPhone is a confluence technology. It’s not about innovation in any field,” he says.

“The innovations of the Simon are reflected in virtually all modern touchscreen phones.”

The next generation of the Simon, the Neon, never made it to the market, but its screen rotated when you rotated the phone—a signature feature of the iPhone. Yet today, the Simon is merely a curious footnote in computing history. So the question is: Why didn’t the Simon become the first iPhone?

Robida imagined people using his tScope (twenty-first-century branding was still beyond his grasp, so I’ve taken the liberty of helping him out) for entertainment—watching plays, sports, or news from afar; Maurier pictured people using it to stay ultraconnected with family and friends. Those are two of the most powerful draws of the smartphone today; two of its key functions—speed-of-light social networking and audiovisual communication—were outlined as early as the 1870s.

“In a historical sense, the computer is no more than an instantaneous telegraph with a prodigious memory, and all the communications inventions in between have simply been elaborations on the telegraph’s original work,” according to the history of technology scholar Carolyn Marvin.

“the last quarter of the nineteenth century has a special importance,” Marvin writes. “Five proto–mass media of the twentieth century were invented during this period: the telephone, phonograph, electric light, wireless, and cinema.” If you’re counting, these are the prime ingredients for the smartphone you’ve got in your pocket right now.

“Why don’t you use a real ear?” he asked. Bell was game. The surgeon cut an ear from a dead man’s head, including its eardrum and the associated bones. Bell took the skull fragment and arranged it so a straw touched the eardrum at one end and a piece of smoked glass at the other. When he spoke loudly into the ear, the vibrations of the eardrum made tiny markings on the glass. “It was one of the most extraordinary incidents in the whole history of the telephone,” Casson noted.

The satirical magazine Punch presciently ran a cartoon in its Forecasts for 1907 issue that depicted the future of mobile communications: A married couple sitting on the lawn, facing away from each other, engrossed in their devices. The caption reads: These two figures are not communicating with one another. The lady is receiving an amatory message, and the gentleman some racing results. That cartoon was lampooning the growing impact of telephones on society, satirizing a grim future where individuals sat alone next to one another, engrossed in the output of their devices and ignoring their immediate surroundings—lol?

The first truly mobile phone was, quite literally, a car phone. In 1910, the tinkerer and inventor Lars Magnus Ericsson built a telephone into his wife’s car; he used a pole to run a wire up to the telephone lines that hung over the roads of rural Sweden. “Enough power for a telephone could be generated by cranking a handle, and, while Ericsson’s mobile telephone was in a sense a mere toy, it did work,” Jon Agar, the mobile-phone historian, notes.

“When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole,” the famed scientist and inventor Nikola Tesla told Collier’s magazine.

“We shall be able to communicate with one another instantly, irrespective of distance. Not only this, but through television and telephony we shall see and hear one another as perfectly as though we were face to face, despite intervening distances of thousands of miles; and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket.”

Cell phones became feasible pretty much immediately after the discovery of transistors, the key ingredient to modern computers, at Bell Labs.

Science fiction shaped what would become the smartphone as well, with two sources of inspiration looming large: “Star Trek. No doubt,” Garcia says. “The tricorder and the communicator are direct influences, and I’ve spoken to several innovators who have specifically cited Trek.” The second is 2001: A Space Odyssey, which featured a device called a Newspad.

The first device to be packaged explicitly as a “smartphone” was the Ericsson R380—a standard-looking cell phone that flipped open to reveal a touchscreen that could be used with a stylus. Nokia had phones that could run apps and play music. There was even another device, launched in 1998, that was actually called the iPhone—it was billed as a three-in-one “Internet Touchscreen Telephone” that functioned as an email reader, telephone, and internet terminal and was sold by a company called InfoGear. “Without those, the iPhone would never have happened,” Garcia says,

The technologies that shape our lives rarely emerge suddenly and out of nowhere; they are part of an incomprehensibly lengthy, tangled, and fluid process brought about by contributors who are mostly invisible to us. It’s a very long road back from the bleeding edge.

CHAPTER 2: Minephones
Digging out the core elements of the iPhone

Mining on Cerro Rico is a decentralized affair. The site is nominally owned by Bolivia’s state-run mining company Comibol, but miners don’t draw pay from the state; they work essentially as freelancers in loose-knit cooperatives. They gather the tin, silver, zinc, and lead ores and sell them to smelters and processors, who in turn sell to larger commodity buyers. This freelance model, combined with the fact that Bolivia is one of the poorest countries in South America, makes regulating work in the mines difficult. That lack of oversight helps explain why as many as three thousand children are believed to work in Cerro Rico.

Thanks to an obscure amendment to the 2010 Dodd-Frank financial-reform bill aimed at discouraging companies from using conflict minerals from the Democratic Republic of the Congo, public companies must disclose the source of the so-called 3TG metals (tin, tantalum, tungsten, and gold) found in their products. Apple says that it set about mapping its supply chain in 2010. In 2014, the company began publishing lists of the confirmed smelters it uses and said it was working to rid its supply chain of smelters buying conflict minerals altogether.

Apple directly purchases few of the raw materials that wind up in its products.

What is the iPhone actually composed of at its most elemental level?

“It’s twenty-four percent aluminum,” Michaud says.

The iPhone is 0.02 percent tungsten,

“There were no precious metals detected in any major quantities, maybe a dollar or two,” Michaud says. “Nickel is worth nine dollars a pound and there’s two grams of it.” It’s used in the iPhone’s microphone.

There’s more arsenic in the iPhone than any of the precious metals, about 0.6 grams, though the concentration is too low to be toxic.

Silicon accounts for 6 percent of the phone, the microchips inside. The batteries are a lot more than that: They’re made of lithium, cobalt, and aluminum.

Aluminum

Aluminum is the most abundant metal on Earth. It’s also the most abundant metal in your iPhone, due to its anodized casing.

Cobalt

Most of the cobalt that ends up in the iPhone is in its lithium-ion battery, and it comes from the Democratic Republic of the Congo.

So, according to Michaud’s calculations, producing a single iPhone requires mining 34 kilos of ore, 100 liters of water, and 20.5 grams of cyanide, per industry average.

Many of the iPhone’s base elements are dug out in conditions that most iPhone users wouldn’t tolerate for even a few minutes. Cash-poor but resource-rich countries will face an uphill struggle as long as there’s a desire for these metals—demand will continue to drive mining companies and commodities brokers to find ways to get them.

CHAPTER 3: Scratchproof
Break out the Gorilla Glass

If your grandparents ever served you a casserole in a white, indestructible-looking dish with blue cornflowers on the side, then you’ve eaten out of the material that would give rise to the glass that protects your iPhone. That dish is made of CorningWare, a ceramic-glass hybrid created by one of the nation’s largest, oldest, and most inventive glass companies.

in September 2006, just four months before Steve Jobs planned to reveal the iPhone to the world, he showed up at Apple HQ in a huff. “Look at this,” he said to a midlevel executive, holding up a prototype iPhone with scratch marks all over its plastic display—a victim of sharing his pocket with his keys. “Look at this. What’s with the screen?” “Well, Steve,” the exec said, “we have a glass prototype, but it fails the one-meter drop test one hundred out of one hundred times—” Jobs cut him off. “I just want to know if you are going to make the fucking thing work.”

“We switched from plastic to glass at the very last minute, which was a curveball,” Tony Fadell, the head of the original iPhone’s engineering team, tells me with a laugh.

Jobs told Weeks he doubted Gorilla Glass was good enough, and began explaining to the CEO of the nation’s top glass company how glass was made. “Can you shut up,” Weeks interrupted him, “and let me teach you some science?” It was one of the rare occasions Jobs was legitimately taken aback in a meeting like that, and he fell silent. Weeks took to the whiteboard instead, and outlined what made his glass superior. Jobs was sold, and, recovering his Jobsian flair, ordered as much as Corning could make—in

Outside the idyllic town known for its bountiful tobacco harvests, a key component of one of the world’s bestselling devices is forged in a state-of-the-art glass factory. It’s one of the few parts of the iPhone that’s manufactured in the United States.

CHAPTER 4: Multitouched
How the iPhone became hands-on

It’s clear why Jobs would want to lay claim to multitouch so aggressively: it set the iPhone a world apart from its competition. But if you define multitouch as a surface capable of detecting at least two or more simultaneous touches, the technology had existed, in various forms, for decades before the iPhone debuted. Much of its history, however, remains obscured, its innovators forgotten or unrecognized.

Which brings us to Bent Stumpe. The Danish engineer built a touchscreen back in the 1970s to manage the control center for CERN’s amazingly named Super Proton Synchrotron particle accelerator. He offered to take me on a tour of CERN, to show me “the places where the capacitive multitouch screen was born.” See, Stumpe believes that there’s a direct lineage from his touchscreen to the iPhone. It’s “similar to identical” to the extent, he says, that Apple’s patents may be

“Musicians have a longer history of expressing powerful creative ideas through a technological intermediary than perhaps any other profession that ever has existed,” Buxton says. “Some people would argue weapons, but they are perhaps less creative.”

“It is certainly true that a touch from the sense of a human perspective—like what humans are doing with their fingers—was always part of a musical instrument. Like how you hit a note, how you do the vibrato with a violin string and so on,” Buxton says. “People started to work on circuits that were capable of capturing that kind of nuance. It wasn’t just, ‘Did I touch it or not?’ but ‘How hard did I touch it?’ and ‘If I move my fingers and so on, could it start to get louder?’”

The first device that we would recognize as a touchscreen today is believed to have been invented by Eric Arthur Johnson, an engineer at England’s Royal Radar Establishment, in 1965. And it was created to improve air traffic control.

E. A. Johnson’s touchscreen was indeed adopted by Britain’s air traffic controllers, and his system remained in use until the 1990s. But his capacitive-touch system was soon overtaken by resistive-touch systems, invented by a team under the American atomic scientist G. Samuel Hurst as a way to keep track of his research. Pressure-based resistive touch was cheaper, but it was inexact, inaccurate, and often frustrating—it would give touch tech a bad name for a couple of decades.

computer interface more fluid and efficient. Westerman’s chief motivator still was improving the hand-friendliness of keyboards; the pad was less repetitive and required lighter keystrokes.

The success of the dissertation had energized both teacher and student, and Elias and Westerman began to think they’d stumbled on the makings of a marketable product. They patented the device in 2001 and formed their company, FingerWorks, while still under the nurturing umbrella of the University of Delaware.

interest in the start-up slowly percolated. They were selling a growing number of pads through their website, and their dedicated users were more than just dedicated; they took to calling themselves Finger Fans and started an online message board by the same name. But at that point, FingerWorks had sold around fifteen hundred touchpads.

When the iPhone was announced in 2007, everything suddenly made sense. Apple filed a patent for a multitouch device with Westerman’s name on it, and the gesture-controlled multitouch technology was distinctly similar to FingerWorks’.

“People with chronic RSI injuries were suddenly left out in the cold, in 2005, by an uncaring Steve Jobs,” Dstamatis wrote. “Apple took an important medical product off the market.”

Which brings us back to Jobs’s claim that Apple invented multitouch. Is there any way to support such a claim? “They certainly did not invent either capacitive-touch or multitouch,” Buxton says, but they “contributed to the state of the art. There’s no question of that.”

Why did it take so long for touch to become the central mode of human-machine interaction when the groundwork had been laid decades earlier? “It always takes that long,” Buxton says. “In fact, multitouch went faster than the mouse.” Buxton calls this phenomenon the Long Nose of Innovation, a theory that posits, essentially, that inventions have to marinate for a couple of decades while the various ecosystems and technologies necessary to make them appealing or useful develop.

“The thing that concerns me about the Steve Jobs and Edison complex—and there are a lot of people in between and those two are just two of the masters—what worries me is that young people who are being trained as innovators or designers are being sold the Edison myth, the genius designer, the great innovator, the Steve Jobs, the Bill Gates, or whatever,” Buxton says. “They’re never being taught the notion of the collective, the team, the history.”

But imagine watching the rise of the smartphone and the tablet, watching the world take up capacitive touchscreens, watching a billionaire CEO step out onto a stage and say his company invented them—thirty years after you were certain you proved the concept. Imagine watching that from the balcony of your third-floor one-bedroom apartment in the suburbs of Geneva that you rent with your pension and having proof that your DNA is in the device but finding that nobody seems to care. That kind of experience, I’m afraid, is the lot of the majority of inventors, innovators, and engineers whose collective work wound up in products like the iPhone.

We aren’t great at conceiving of technologies, products, even works of art as the intensely multifaceted, sometimes generationally collaborative, efforts that they tend to be.

ii: Prototyping
First draft of the one device

one day in the summer of 2003, Ive led Jobs into the user-testing facility adjacent to his design studio, where he unveiled the ENRI project and gave him a hands-on demonstration of the powers of multitouch. “He was completely unimpressed,” Ive said. “He didn’t see that there was any value to the idea.”

Jobs spent the next few days thinking it over, and evidently changed his mind. Soon, in fact, he decided that he loved it.

Even now, Huppi’s amused when the Mossberg incident comes up. “Steve said, ‘Yeah, I went to my engineers and said “I want a thing that does this this and this”’—and that’s all total bullshit because he had never asked for that.”

The hardware effort to transform the rig into a working prototype—which at the time was a multitouch tablet—was given a code name: Q79. The project went on lockdown.

“This paper from Sony implied that you could do true multitouch with rows and columns,” Strickon says. It meant a lattice of electrodes laid out on the screen could do the sensing. Josh Strickon considers this to be one of the most crucial moments in the course of the project. The paper, he says, presented a “much more elegant way” to do multitouch. It just hadn’t been done on a transparent surface yet. So, tracing the outline of Sony’s SmartSkin, he patched together a DIY multitouch screen.

Ording’s good nature was tested from time to time too. He’d been sitting in weekly meetings with the CEO but often found Jobs’s mean streak too much to handle. “There was a period of time,” Ording says, “that for a couple months or so, half a year or whatever, I didn’t go to the Steve meetings.” Jobs would chew out his colleagues in a mean-spirited way that made Ording not want to participate. “I just didn’t want to go. I was like, ‘No, Steve’s an asshole.’ Too many times he would be nasty for no good reason,” he says. “No one understood, because most people would die to go to these meetings—like, ‘Oh, it’s Steve.’ But I was tired of it.”

The marketing department’s ideas for how to sell the new touch-based device didn’t exactly inspire confidence either. They put together a presentation to show how they could position the tablet to sell to real estate agents, who could use it to show images of homes to their clients. “I was like, ‘Oh my God, this is so off the mark,’” Strickon says.

‘We’re giving out a new award—for innovation,’” Strickon recalls. He brought the Q79 team up on stage and gave them trophies: life-size red polished apples made of stone. He wouldn’t, or couldn’t, say anything else about it. “They literally said nothing. Nothing,” Strickon says. “They’re giving this team an award and couldn’t tell you what it was.”

“Classic internal secrecy bullshit,”

“There was no product there,” Christie says. “Bas had a couple of demos, one was twisting this image with two fingers and the other was scrolling a list. That was all lacking a compelling virtue. It was like, okay—why? There was always a little skepticism.… Apple’s trackpad was so good at that point compared to the competition.”

the materials were putting the device in the thousand-dollar range, basically the same cost as a laptop. “And I think that’s when Steve made that call; Steve Jobs was like, ‘We can’t sell this—it’s too expensive,’” Huppi says.

Jobs had fallen seriously ill, and he would take multiple months off in 2004 to have long-overdue surgery to remove a malignant tumor on his pancreas. “Steve getting sick the first time, that sort of stopped things in the tracks,” Strickon says. “Nothing was happening when Steve was out. It was just completely odd.” And so Q79 began to sputter.

The iPhone

The project languished until the end of 2004 when an executive decision came down. Jobs had decided Apple needed to do a phone. “I got a call from Steve,” Ording says. “‘We’re gonna do a phone. There’s gonna be no buttons. Just a touchscreen.’ Which was big news.”

Jobs would soon pit the iPod team against a Mac software team to refine and produce a product that was more specifically phone-like. The herculean task of squeezing Apple’s acclaimed operating system into a handheld phone would take another two years to complete. Executives would clash; some would quit. Programmers would spend years of their lives coding around the clock to get the iPhone ready to launch, scrambling their social lives, their marriages, and sometimes their health in the process.

But it all had been set into motion years before. The concept of the iPhone wasn’t the product of Steve Jobs’s imagination—though he would fiercely oversee, refine, and curate its features and designs—but of an open-ended conversation, curiosity, and collaboration. It was a product born of technologies nurtured by other companies and then ingeniously refined by some of Apple’s brightest minds—people who were then kept out of its public history.

CHAPTER 5: Lion Batteries
Plugging into the fuel source of modern life

Chile’s Atacama Desert is the most arid place on Earth apart from the freeze-dried poles.

we have this barren, unearthly place to thank for keeping our iPhones running. Chilean miners work this alien environment every day, harvesting lithium from vast evaporating pools of marine brine.

Lithium is the lightest metal and least dense solid element, and while it’s widely distributed around the world, it never occurs naturally in pure elemental form; it’s too reactive. It has to be separated and refined from compounds, so it’s usually expensive to get. But here, the high concentration of lithium in the salar brines combined with the ultradry climate allows miners to harness good old evaporation to obtain the increasingly precious metal.

Lithium-ion batteries are the power source of choice for laptops, tablets, electric cars, and, of course, smartphones.

Lithium-ion batteries were first pioneered in the 1970s because experts feared humanity was heading down a different, more literal, road of death due to its dependence on oil.

A battery is basically just three parts: two electrodes (an anode with a negative charge and a cathode with a positive charge) and an electrolyte running between them.

And it worked like most of our modern batteries do today, through oxidation and reduction. The chemical reactions cause a buildup of electrons in the anode (in Volta’s pile, it’s the zinc), which then want to jump to the cathode (the copper). The electrolyte—whether it’s brine-soaked cloth or a dead frog—won’t let it. But if you connect the battery’s anode and cathode with a wire, you complete the circuit, so the anode will oxidize (lose electrons), and those electrons will travel to the cathode, generating electrical current in the process.
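
As a worked example of that anode-to-cathode electron flow, here is a simplified textbook picture of a Volta-style zinc and copper pile (my notation, not anything taken from the book):

```latex
% Simplified half-reactions for a Volta-style zinc/copper pile in brine
% (illustrative; hydrogen evolution at the copper disc is what closes the loop)
\begin{align*}
\text{anode (zinc, oxidation):} \quad & \mathrm{Zn} \;\rightarrow\; \mathrm{Zn^{2+}} + 2e^{-} \\
\text{cathode (copper, reduction):} \quad & 2\,\mathrm{H^{+}} + 2e^{-} \;\rightarrow\; \mathrm{H_{2}}
\end{align*}
```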

Not only does the li-ion battery power our gadgets, but it’s the bedrock of electric vehicles. It’s somewhat ironic, then, that it was invented by a scientist employed by the world’s most notorious oil company.

Exxon.

“We came up with the concept of intercalation and built the first room-temperature lithium rechargeable cells at Exxon,” Whittingham says. Intercalation is the process of inserting ions between layers in compounds; lithium ions in the anode travel to the cathode, creating electricity, and since the reaction is reversible, the lithium ions can travel back to the anode, recharging the battery.
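
A compact way to write that reversible shuttle is the standard textbook pair of half-reactions for a cobalt-oxide and graphite lithium-ion cell (again my notation, not the book's; commercial cells vary in chemistry):

```latex
% Reversible intercalation in a typical LiCoO2 / graphite lithium-ion cell (textbook picture)
\begin{align*}
\text{cathode:} \quad & \mathrm{LiCoO_2} \;\rightleftharpoons\; \mathrm{Li_{1-x}CoO_2} + x\,\mathrm{Li^{+}} + x\,e^{-} \\
\text{anode:} \quad & x\,\mathrm{Li^{+}} + x\,e^{-} + \mathrm{C_6} \;\rightleftharpoons\; \mathrm{Li_{x}C_6}
\end{align*}
```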

Whittingham’s brainchild was a leap ahead of both. Powerful and lightweight, it could power much smaller portable consumer electronics (think the iPod versus the Walkman)—if it worked.

Whittingham’s work was continued by the man who would make the consumer-electronics boom possible.

The mining operation itself, smack-dab in the middle of the salt desert, is unusual. There’s no entrance carved out of rock, no deepening pit into the earth. Instead, there’s a series of increasingly electric-colored, massive brine-filled evaporating pools that perfectly reflect the mountains that line the horizon. They’re separated by endless mounds of salt—the by-product of the mining effort.

Underneath all that encrusted salt, sometimes just one to three meters below, there’s a giant reservoir of brine, a salty solution that boasts a high concentration of lithium.

When I tell John Goodenough that I’m calling him from a lithium mine in the Atacama Desert, he lets loose a howling hoot. Goodenough is a giant in his field—he spearheaded the most important battery innovations since Whittingham’s lithium breakthrough—and that laugh has become notorious. At age ninety-four, he still heads into his office nearly every day,

Fueled by Goodenough’s research and Sony’s product development, lithium batteries became a global industry unto themselves. As of 2015, they made up a thirty-billion-dollar annual market. And the trend is expected to continue, abetted by electric and hybrid vehicles.

At the evaporation ponds, Enrique says, “You’re always pumping in and pumping out.” First, the workers start an evaporation route, which precipitates rock salt. Pump. Then they get potassium salt. Pump. Eventually, they concentrate the brine solution until it’s about 6 percent lithium.

This vast network of clear to blue to neon-green pools is only the first step in creating the lithium that ends up in your batteries. After it’s reduced to a concentrate, the lithium is shipped by tanker truck to a refinery in Salar del Carmen, by the coast.

The refinery operation is an industrial winter wonderland. Salt crystals grow on the reactors, and lithium flakes fall like snow on my shoulders. That’s because 130 tons of lithium carbonate are whipped up here every day and shipped from Chile’s ports. That’s 48,000 tons of lithium a year. Because there’s less than a gram of lithium in each iPhone, that’s enough to make about forty-three billion iPhones.
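
Taking the quoted figures at face value, and assuming roughly a gram of lithium per phone as the passage implies, the arithmetic lands in the same ballpark as Merchant's forty-three billion:

```python
# Back-of-the-envelope check of the quoted lithium figures (illustrative only;
# the per-phone gram figure is an assumption based on the passage above).
tons_per_day = 130
tons_per_year = tons_per_day * 365             # ~47,450, which the book rounds to 48,000
grams_per_phone = 1.0                          # assumed: "less than a gram" per iPhone
phones_per_year = tons_per_year * 1_000_000 / grams_per_phone
print(f"{tons_per_year:,} tons/year -> roughly {phones_per_year:.1e} phones")
# ~4.7e10, the same order of magnitude as the forty-three billion quoted above.
```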

“The battery is the key to a lot of the psychology behind these devices,” iFixit’s Kyle Wiens points out. When the battery begins to drain too fast, people get frustrated with the whole device it powers. When the battery’s in good shape, so is the phone. Predictably, the lithium-ion battery is the subject of a constant tug-of-war; as consumers, we demand more and better apps and entertainment, more video rendered in ever-higher res. Of course, we also pine for a longer-lasting battery, and the former obviously drains the latter. And Apple, meanwhile, wants to keep making thinner and thinner phones. “If we made the iPhone a millimeter thicker,” says Tony Fadell, the head of hardware for the first iPhone, “we could make it last twice as long.”

Goodenough believes that a new and better battery—one whose key ingredient is sodium, not lithium—is on the horizon. “We are on the verge of another battery development that will also prove societally transformational,” he says. Sodium is heavier and more volatile than lithium, but cheaper and more easily accessible.

“If you pull an iPhone apart, I think the big question we ask folks like Apple is, Do you want more efficient electronics or more energy density in your battery?”

CHAPTER 6: Image Stabilization
A snapshot of the world’s most popular camera

If future archaeologists were to dust off advertisements for the most popular mass-market cameras of the nineteenth and twenty-first centuries, they would notice some striking similarities in the sloganeering of the two periods. Exhibit A: You Press the Button, We Do the Rest. Exhibit B: We’ve taken care of the technology. All you have to do is find something beautiful and tap the shutter button.

Exhibit A comes to us from 1888, when George Eastman, the founder of Kodak, thrust his camera into the mainstream with that simple eight-word slogan.

Eastman had initially hired an ad agency to market his Kodak box camera but fired them after they returned copy he viewed as needlessly complicated. Extolling the key virtue of his product—that all a consumer had to do was snap the photos and then take the camera into a Kodak shop to get them developed—he launched one of the most famous ad campaigns of the young industry.

Exhibit B, of course, is Apple’s pitch for the iPhone camera. The spirit of the two campaigns, separated by over a century, is unmistakably similar: both focus on ease of use and aim to entice the average consumer, not the photography aficionado.

As he did in most areas of his company, Eastman handled the promotional details himself. And he had a gift for it—“almost an innate ability to frame sentences into slogans, to come up with visual images that spoke directly and colorfully to everyone.” Remind you of anyone?

In the beginning, the 2-megapixel camera that Apple tacked onto its original iPhone was hardly a pinnacle of innovation. Nor was it intended to be. “It was more like, every other phone has a camera, so we better have one too,” one senior member of the original iPhone team tells me.

“There’s over two hundred separate individual parts” in the iPhone’s camera module, Graham Townsend, the head of Apple’s camera division, told 60 Minutes in 2016.

“Steve didn’t like the external iSight because he hated warts,” Bilbrey says, “he hated anything that wasn’t sleek and design-integrated.”

Jobs walked over to Bilbrey, the room dead silent, and said, “Okay, what can you do?” Bilbrey said, “Well, we could go with a CMOS imager inside and—” “You know how to make this work?” Jobs said, cutting him off. “Yeah,” Bilbrey managed. “Well, can you do a demonstration? In a couple weeks?” Jobs said impatiently. “And I said, yeah, we could do it in a couple weeks.”

Compression artifacts are what you see when you try to watch YouTube over a slow internet connection or, in ye olden times, when you’d try to watch a DVD on an old computer with a full hard drive—you’d get that dreaded picture distortion in the form of pixelated blocks. This happens when the system applies what’s called lossy compression, which dumps parts of the media’s data until it becomes simple enough to be stored on the available disk space (or be streamed within the bandwidth limitations, to use the YouTube example). If the compressor can’t reconstruct enough data to reproduce the original video, quality tanks and you get artifacts.
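
If you want to see where those blocks come from, here is a toy illustration (nothing like a real codec, let alone Apple's): keep only one value per 8x8 block of pixels and throw the rest away, and the reconstruction comes back visibly tiled.

```python
# Toy illustration of block-based lossy compression (not any real codec):
# keep only the average value of each 8x8 block and discard the rest.
# Reconstructing from that reduced data produces the familiar "blocky" artifacts.
import numpy as np

def blocky_compress(image, block=8):
    h, w = image.shape
    out = np.empty_like(image, dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = image[y:y + block, x:x + block]
            out[y:y + block, x:x + block] = tile.mean()  # one value stands in for 64 pixels
    return out

# A smooth gradient turns into visible 8x8 blocks after "compression".
gradient = np.linspace(0, 255, 64).reshape(1, 64).repeat(64, axis=0)
print(np.unique(blocky_compress(gradient)).size)  # 8 distinct values instead of 64
```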

“If we don’t create the blocks, we don’t have to remove them. Now that sounds obvious, but how do you reconstruct video if you don’t have a block?” His idea, he says, was building out an entire screen that was nothing but blocks. He wrote an algorithm that allowed the device to avoid de-blocking, making the entire frame of video available for playback.

They worked out the glitch and showed it to Jobs the next day; he signed off on it with as much brevity as he’d dismissed it the day before: “It looks great.”

Selfies are as old as cameras themselves (even older, if you count painted self-portraits). In 1839, Robert Cornelius was working on a new way to create daguerreotypes, the predecessor of photography. The process was much slower, so, naturally, he indulged the urge to uncover the lens, run into the shot, wait ten minutes, and replace the lens cap. He wrote on the back, The first light picture ever taken, 1839.

Today, smartphone camera quality is close enough to the digital point-and-clicks that the iPhone is destroying large swaths of the industry. Giants like Nikon, Panasonic, and Canon are losing market share, fast.

One of the features that routinely gets top billing in Apple’s ever-improving iPhone cameras is the optical image stabilization. It’s a key component; without it, the ultralight iPhone, influenced by every tiny movement of your hands, would produce impossibly blurry photos.

It was developed by a man whom you’ve almost certainly never heard of: Dr. Mitsuaki Oshima. And thanks to Oshima, and a vacation he took to Hawaii in the 1980s, every single photo and video we’ve described above has come out less blurry.

He made the connection between the jittery camera and his vibrating gyroscope: It occurred to him that he could eliminate blur by measuring the rotation angle of a camera with a vibrating gyro, and then correct the image accordingly. That, in the simplest terms, is what image stabilizers do today.
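
In the simplest possible terms, and with invented numbers, the correction Oshima described looks like the sketch below: integrate the gyro's angular rate into a rotation angle over the exposure, convert that angle into an image shift, and move the other way. (Real optical image stabilization shifts a lens element rather than pixels, and the math is more involved.)

```python
# Minimal sketch of gyro-based stabilization (illustrative; real OIS physically
# moves a lens element, this just shows the geometry of the correction).
import math

FOCAL_LENGTH_PX = 2800.0   # assumed focal length expressed in pixels

def shake_correction(gyro_rate_rad_s, dt_s):
    """Integrate angular rate over one exposure and return the compensating pixel shift."""
    angle = gyro_rate_rad_s * dt_s                    # small-angle rotation during the exposure
    image_shift = FOCAL_LENGTH_PX * math.tan(angle)   # how far the scene moved on the sensor
    return -image_shift                               # move the opposite way to cancel the blur

# Hand jitter of ~0.5 deg/s over a 1/30 s exposure:
print(round(shake_correction(math.radians(0.5), 1 / 30), 2), "pixels of correction")
```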

CHAPTER 7: Sensing Motion
From gyroscopes to GPS, the iPhone finds its place

Jean-Bernard-Léon Foucault—not to be confused with the more strictly philosophical Foucault, Michel—had set out to prove that the Earth rotated on its axis. In 1851, he suspended a bob and a wire from the ceiling of the Paris Observatory to show that the free-swinging pendulum would slowly change direction over the course of the day, thus demonstrating what we now call the Coriolis effect.

For his next experiment, Foucault used a gyroscope—essentially a spinning top with a structure that maintains its orientation—to more precisely demonstrate the same effect. At its fundamental level, it’s not so different from the gyroscope that’s lodged in your iPhone—which also relies on the Coriolis effect to keep the iPhone’s screen properly oriented. It’s just that today, it takes the form of a MEMS—a microelectromechanical system—crammed onto a tiny and, frankly, beautiful chip.

The gyroscope in your phone is a vibrating structure gyroscope (VSG). It is—you guessed it—a gyroscope that uses a vibrating structure to determine the rate at which something is rotating.

The gyroscope is one of an array of sensors inside your phone that provide it with information about how, exactly, the device is moving through the world and how it should respond to its environment. Those sensors are behind some of the iPhone’s more subtle but critical magic—they’re how it knows what to do when you put it up to your ear, move it horizontally, or take it into a dark room. To understand how the iPhone finds its place in the universe—especially in relation to you, the user—we need to take a crash course through its most crucial sensors, and two of its location-tracking chips.

The Accelerometer

“iPhone’s built-in accelerometer detects when the user has rotated the device from portrait to landscape, then automatically changes the contents of the display accordingly,” Apple’s press team wrote in 2007,

the computer’s new accelerometer. It had a purpose beyond enabling laptop light-saber duels, of course; when someone knocked a laptop off a table and it went into freefall, the accelerometer would automatically shut off the hard drive to protect the data. So Apple already had a starting point.
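
Both tricks, flipping the display orientation and catching a freefall, come down to reading one three-axis acceleration vector; here is a rough sketch with invented thresholds:

```python
# Rough sketch of the two accelerometer behaviors mentioned above (thresholds invented).
import math

def interpret_accel(ax, ay, az):
    """ax/ay/az in g's: x along the short side, y along the long side, z out of the screen."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude < 0.3:                    # well below 1 g: nothing is holding the device up
        return "freefall -> park the hard drive / brace for impact"
    if abs(ay) > abs(ax):                  # gravity mostly along the long side
        return "portrait"
    return "landscape"

print(interpret_accel(0.02, -0.99, 0.05))  # held upright -> portrait
print(interpret_accel(-0.98, 0.03, 0.10))  # turned sideways -> landscape
print(interpret_accel(0.01, 0.02, 0.01))   # dropped -> freefall
```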

The Proximity Sensor

Let’s get back to that original iPhone announcement, which noted that the “iPhone’s built-in light sensor automatically adjusts the display’s brightness to the appropriate level for the current ambient light, enhancing the user experience and saving power at the same time.”

Proximity sensors are how your iPhone knows to automatically turn off its display when you lift it to your ear and automatically revive it when you go to set it back down. They work by emitting a tiny, invisible burst of infrared radiation.

If the object—your face—is close, then it’s pretty intense, and the phone knows the display should be shut off. If it receives low-intensity light, then it’s okay to shine on.
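
Under the hood, both the proximity and ambient-light behaviors reduce to comparing a light reading against a threshold or a scale; a minimal sketch with made-up numbers:

```python
# Minimal sketch of the proximity and ambient-light logic (all numbers invented).

PROXIMITY_THRESHOLD = 800    # reflected IR above this -> something (an ear) is close

def display_should_be_on(reflected_ir):
    return reflected_ir < PROXIMITY_THRESHOLD

def backlight_level(ambient_lux, max_level=255):
    """Scale screen brightness with ambient light, clamped to the panel's range."""
    return min(max_level, max(10, int(ambient_lux / 4)))

print(display_should_be_on(950))   # phone against your ear -> False, screen goes dark
print(display_should_be_on(120))   # pulled away -> True, screen comes back
print(backlight_level(40))         # dim room -> low backlight (10)
print(backlight_level(2000))       # daylight -> full backlight (255)
```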

GPS

The story of why your iPhone can effortlessly direct you to the closest Starbucks begins, as so many good stories do, with the space race.

October 4, 1957, and the Soviets had just announced that they’d successfully launched the first artificial satellite, Sputnik 1, into orbit.

the Soviets had eschewed the heavy array of scientific equipment they’d initially intended to launch and instead outfitted Sputnik with a simple radio transmitter.

anyone with a shortwave radio receiver could hear the Soviet satellite as it made laps around the planet. The MIT team of astronomers assigned to observe Sputnik noticed that the frequency of radio signals it emitted increased as it drew nearer to them and decreased as it drifted away. This was due to the Doppler effect, and they realized that they could track the satellite’s position by measuring its radio frequency—and also that it could be used to monitor theirs. It took the U.S. Navy only two years from that point to develop the world’s first satellite navigation system, Transit. In the 1960s and 1970s, the U.S. Naval Research Laboratory worked in earnest to establish the Global Positioning System.
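
The relation the MIT team leaned on is just the first-order Doppler shift: for a transmitter at frequency f with radial velocity v_r relative to the receiver (negative while approaching), the received frequency is approximately

```latex
% First-order Doppler shift for a moving transmitter (v_r << c; v_r < 0 while approaching)
f_{\text{received}} \;\approx\; f\left(1 - \frac{v_r}{c}\right)
```

so the tone rises as the satellite approaches and falls as it recedes, and tracking that curve pins down the orbit.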

Google Maps did not, in fact, originate with Google. It began as a project headed up by Lars and Jens Rasmussen, two Danish-born brothers who were both laid off from start-ups in the wake of the first dot-com bubble-burst. Lars Rasmussen, who describes himself as having the “least developed sense of direction” of anyone he knows,

After years of failing to interest anyone in the technology—people kept telling them they just couldn’t monetize maps—they eventually sold the thing to Google, where it would be transfigured into an app for the first iPhone.

Magnetometer

it’s a compass basically.

Now, the magnetometer, accelerometer, and gyroscope all feed their data into one of Apple’s newer chips: the motion coprocessor, a tiny chip that the website iMore describes as Robin to the main processor’s Batman. It’s an untiring little sidekick, computing all the location data so the iPhone’s brain doesn’t have to, saving time, energy, and power.

CHAPTER 8: Strong-ARMed
How the iPhone grew its brain

Kay is one of the forefathers of personal computing; he’s what you can safely call a living legend. He directed a research team at the also-legendary Xerox PARC, where he led the development of the influential programming language Smalltalk, which paved the way for the first graphical user interfaces.

“remembered praying Moore’s law would run out with Moore’s estimate,” Kay says, describing the famous law put forward by computer scientist and Intel co-founder Gordon Moore. It states that every two years, the number of transistors that can fit on a square inch of microchip will double.

“Rather than becoming something that chronicled the progress of the industry, Moore’s law became something that drove

It describes the reason that we can fit the equivalent of a multiple-ceiling-high supercomputer from the 1970s into a black, pocket-size rectangle today—and the reason we can stream high-resolution video seamlessly from across the world, play games with complex 3-D graphics, and store mountains of data all from our increasingly slender phones.

You might have heard it said that the computer in your phone is now more powerful than the one that guided the first Apollo mission to the moon. That’s an understatement. Your phone’s computer is way, way more powerful. Like, a hundred thousand times more powerful. And it’s largely thanks to the incredible shrinking transistor.

The transistor may be the most influential invention of the past century. It’s the foundation on which all electronics are built, the iPhone included;

computers are programmed to understand a binary language—a string of yes-or-no, on-or-off, or 1-or-0—humans need a way to indicate each position to the computer. Transistors can interpret our instructions to the computer; amplified could be yes or on or 1; not amplified, no, off, 0. Scientists found

More transistors mean, on a very basic level, that more complex commands can be carried out. More transistors, interestingly, do not mean more power consumption. In fact, because they are smaller, a larger number of transistors mean less energy is needed. So, to recap: As Moore’s law proceeds, computer chips get smaller, more powerful, and less energy intensive. Programmers realized they could harness the extra power to create more complex programs, and thus began the cycle you know and perhaps loathe: Every year, better devices come out that can do new and better things; they can play games with better graphics, store more high-res photos, browse the web more seamlessly, and so on.

In 1954, Texas Instruments released the Regency TR-1, the first transistor radio.

In 1971, a scrappy upstart of a company named Intel released its first microchip, the 4004. Its transistors were spread over twelve square millimeters. There were ten thousand nanometers between each transistor. As the Economist helpfully explained, that’s “about as big as a red blood cell… A child with a decent microscope could have counted the individual transistors of the 4004.” Transistor count: 2,300.

The first iPhone processor, a custom chip designed by Apple and Samsung and manufactured by the latter, was released in 2007. Transistor count: 137,500,000.

That sounds like a lot, but the iPhone 7, released nine years after the first iPhone, has roughly 24 times as many transistors. Total count: 3.3 billion.
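
Those quoted counts line up well with Moore's law; a quick sanity check using the 4004 and the iPhone 7-era chip as endpoints:

```python
# Quick Moore's-law sanity check using the transistor counts quoted above.
import math

count_4004, year_4004 = 2_300, 1971
count_a10, year_a10 = 3_300_000_000, 2016   # iPhone 7 era

doublings = math.log2(count_a10 / count_4004)
years = year_a10 - year_4004
print(f"{doublings:.1f} doublings over {years} years -> one every {years / doublings:.1f} years")
# ~20.5 doublings in 45 years, i.e. roughly one every 2.2 years: close to Moore's prediction.
```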

Moore’s law is beginning to collapse because chipmakers are running up against subatomic space constraints.

Computers may have to switch to new methods altogether, like quantum computing, if they’re going to continue to get faster.

The ARM processor that Wilson designed has become the most popular in history; ninety-five billion have been sold to date, and fifteen billion were shipped in 2015 alone.

ARM chips are in everything: smartphones, computers, wristwatches, cars, coffeemakers, you name it.

Speaking of naming it, get ready for some seriously nested acronyms. Wilson’s design was originally called the Acorn RISC Machine, after the company that invented it, Acorn, and RISC, which stands for reduced instruction set computing. RISC was a CPU design strategy pioneered by Berkeley researchers who had noticed that most computer programs weren’t using most of a given processor’s instruction set, yet said processor’s circuitry was nonetheless burning time and energy decoding those instructions whenever it ran. Got it? RISC was basically an attempt to make a smarter, more efficient machine by tailoring a CPU to the kinds of programs it would run.

“The low-power big thing that the ARM is most valued for today, the reason that it’s on all your mobile phones, was a complete accident,” Wilson says. “It was ten times lower than Steve had expected. That’s really the result of not having the right sort of tools.”

Wilson had designed a powerful, fully functional 32-bit processor that consumed about one-tenth of a watt of energy.

Shortly after, in an effort to continue to simplify their designs, Wilson and company built the first so-called System on Chip, or SoC, “which Acorn did casually, without realizing it was a world-defining moment.” The SoC basically integrates all the components of a computer into one chip—hence the name.

“ARM became successful for a completely different reason, and that was the way the company was set up,” she says. Wilson calls it the “ecosystem model”: ARM designers would innovate new chips, work closely with clients with specific demands, and then license the end designs, as opposed to selling or building them in-house. Clients would buy a license giving them access to ARM’s design catalog; they could order specifications, and ARM would collect a small royalty on each device sold.

Steve Jobs was at first adamantly opposed to allowing anyone besides Apple to create apps for the iPhone. It would take a deluge of developers calling for access, a band of persistent hackers jailbreaking the walled garden, and internal pressure from engineers and executives to convince Jobs to change course. It was, essentially, the equivalent of a public protest stirring a leader to change policy.

Remember, the original iPhone, per Apple’s positioning, was to be:

• A wide-screen iPod with touch controls
• A phone
• An internet communicator

The revolutionary “there’s an app for that” mentality was nowhere on display.

“Steve gave us a really direct order that we weren’t going to allow third-party developers onto our device,” Andy Grignon, a senior engineer who worked on the iPhone, says. “And the primary reasons were: it’s a phone above anything else. And the second we allow some knucklehead developer to write some stupid app on our device, it could crash the whole thing and we don’t want to bear the liability of you not being able to call 911 because some badly written app took the whole thing down.”

Jobs had an intense hatred of phones that dropped calls,

His Nokia, or whatever it was that he was using at the time, if it crashed on him, the chances were more than likely that he’d fling and smash it. I saw him throw phones. He got very upset at phones. The reason he did not want developer apps on his phone is he did not want his phone crashing.”

“The thing with Steve was that nine times out of ten, he was brilliant, but one of those times he had a brain fart, and it was like, ‘Who’s going to tell him he’s wrong?’” Bilbrey says.

the iPhone was one of the most intuitive, powerful mobile-computing devices ever conceived—but it took an enterprising group of hackers to let consumers really use it like one. Since their exploits were regularly covered by tech blogs and even the mainstream media, it demonstrated to the public—and Apple—a thirst for third-party apps.

“The iPhone was almost a failure when it first launched,” Bilbrey says. “Many people don’t realize this. Internally, I saw the volume sales—when the iPhone was launched, its sales were dismal. It was considered expensive; it was not catching on. For three to six months. It was the first two quarters or something. It was not doing well.”

Bilbrey says the reason was simple. “There were no apps.”

“Scott Forstall said, ‘Steve, I’ll put the software together, we’ll make sure to protect it if a crash occurs. We’ll isolate it; we won’t crash the phone.’”

“Let me just say it: We want native third party applications on the iPhone, and we plan to have an SDK [software developer’s kit] in developers’ hands in February,” Steve Jobs wrote in an announcement posted to Apple’s website. “We are excited about creating a vibrant third party developer community around the iPhone and enabling hundreds of new applications for our users.”

Really, it was Facebook, if I may use them sort of metaphorically, that had figured out what a phone is for.”

“It was a one-to-one correlation. When we announced that we were going to have apps, and we started allowing them, that’s when people started buying the iPhone,”

“The guy who wrote iFart made a million dollars on that fucking app. Of course we laugh at it today, but Jesus, dude, a million fucking dollars. From an app that plays fart noises.”

“One way to think about apps are as ‘simplified interfaces for visualizing data,’” Rothstein says. “Any app, whether social media, mapping, weather, or even a game, takes large amounts of data and presents it through a small interface, with a variety of buttons or gestures for navigating and manipulating that data.

A volvelle does the same.

a volvelle is basically a paper wheel with a data set inscribed on it that’s fastened to another paper wheel with its own data set. The wheels can be manipulated to produce easily parsable information about the world.

The point is that people have been using tools to simplify and wield data and coordinate solutions for centuries.

In 2016, one report estimated that the app economy was worth $51 billion and that it would double by 2020.

“The app industry is now bigger than Hollywood,” Dediu tells me, “but nobody really talks about it.”

“Note also that high tech once led to a world of leisure; it now leads to unrelenting work.”

there’s another thing about the app economy: It’s almost all games. In 2015, games accounted for 85 percent of the App Store’s revenue,

The other major-grossing segment of the app market besides games is subscription services. As of the beginning of 2017, Netflix, Pandora, HBO Go, Spotify, YouTube, and Hulu all ranked in the twenty top-grossing apps on the App Store.

Which brings us back to Alan Kay. “New media just ends up simulating existing media,”

You’re going to wind up with something that is relatively inexpensive, within consumer buying habits.

So it needed to mass-produce ideas so a few percentage of the population could get them. And that was enough to transform Europe. Didn’t transform most people in Europe; it transformed the movers and shakers. And that’s what’s happened.” Coincidentally, some

CHAPTER 9: Noise to Signal
How we built the ultimate network

As of 2016, there were 7.4 billion cell phone subscribers in a world of 7.43 billion people (along with 1.2 billion LTE wireless data subscribers).

their personal drivers, and for business. By 1973, the networks were broad and technology advanced enough that Motorola’s Martin Cooper was able to debut the first prototype mobile-phone handset, famously making a public call on the toaster-size plastic cell.

Wi-Fi began long before the web as we know it existed and was actually developed along the same timeline as ARPANET. The genesis of wireless internet harkens back to 1968 at the University of Hawaii, where a professor named Norman Abramson had a logistics problem. The university had only one computer, on the main campus in Honolulu, but he had students and colleagues spread across departments and research stations on the other islands. At the time, the internet was routed through Ethernet cables—and routing an Ethernet cable hundreds of miles underwater to link the stations wasn’t an option.

Not entirely unlike the way harsh northern terrain drove the Scandinavians to go wireless, the sprawling expanse of Pacific Ocean forced Abramson to get creative. His team’s idea was to use radio communications to send data from the terminals on the small islands to the computer on Honolulu, and vice versa. The project would grow into the aptly named ALOHAnet, the precursor to Wi-Fi. (One of those reverse-engineered acronyms, it originally stood for Additive Links On-line Hawaii Area.) The ARPANET is the network that would mutate into the internet, and it’s fair to say that ALOHAnet would do the same with Wi-Fi.

we should thank Norwegian teens for popularizing texting. A researcher named Friedhelm Hillebrand, the chairman of GSM’s nonvoice services committee, had been carrying out informal experiments on message length at his home in Bonn, Germany. He counted the characters in most messages he tapped out and landed on 160 as the magic number for text length. “This is perfectly sufficient,” he thought. In 1986, he pushed through a requirement mandating that phones on the network had to include something called short message service, or SMS. He then shoehorned SMSing into a secondary data line originally used to send users brief updates about network status.

Its creators thought that text messaging would be useful for an engineer who was, say, out in the field checking on faulty wires—they’d be able to send a message back to the base. It was almost like a maintenance feature, Agar says.

It was a minuscule part of the sprawling GSM, and engineers barely texted. But teenagers, who were amenable to a quick, surreptitious way to send messages, discovered the service. Norwegian teenagers, Agar says, took to texting in far greater numbers than any network engineers ever did. During the nineties, texting was largely a communication channel for youth culture.

This principle plays out again and again throughout the history of technology: Designers, marketers, or corporations create a product or service; users decide what they actually want to do with it.

Note: This is another key idea for the summary/review, along with the iPhone as a collection of evolving technologies rather than a breakthrough invention.

In the summer of 2014, Joel Metz, a cell-tower climber and twenty-eight-year-old father of four, was working on a tower in Kentucky, 240 feet above the ground. He was replacing an old boom with a new one when his colleagues heard a loud pop; a cable suddenly flew loose and severed Metz’s head and right arm, leaving his body dangling hundreds of feet in the air for six hours. The gruesome tragedy is, sadly, not a fluke. Which is why it’s absolutely necessary to interrupt the regularly scheduled story of collaboration, progress, and innovation with a reminder that it all comes at a cost, that the infrastructure that makes wireless technology possible is physically built out by human labor and high-risk work, and that people have died to grow and maintain that network. Too many people. Metz’s death was just one of scores that have befallen cell-tower climbers over the past decade.

You might recall the complaints about AT&T’s network that poured in after the iPhone debuted; it was soon overloaded, and Steve Jobs was reportedly furious. AT&T’s subsequent rush to build out more tower infrastructure for better coverage, ProPublica’s report indicated, contributed to hazardous working conditions and the higher-than-usual death toll.

We might not have developed wireless radio communications without Marconi, cell phones without Bell Labs, a standardized network without EU advocates—and we wouldn’t get reception without the sacrifice of workers like Joel Metz. Our iPhones wouldn’t have a network to run on without all of the above.

iii: Enter the iPhone Slide to unlock

If you worked at Apple in the mid-2000s, you might have noticed a strange phenomenon afoot. People were disappearing. It happened slowly at first. One day there’d be an empty chair where a star engineer used to sit. A key member of the team, gone. Nobody could tell you exactly where they went. “I had been hearing rumblings about, well, it was unclear what was being built, but it was clear that a lot of the best engineers from the best teams had been slurped over to this mysterious team,” says Evan Doll, who was a software engineer at Apple then.

“It was really intense, probably professionally one of the worst times of my life,” Andy Grignon says. “Because you created a pressure cooker of a bunch of really smart people with an impossible deadline, an impossible mission, and then you hear that the future of the entire company is resting on it. So it was just like this soup of misery,” Grignon says. “There wasn’t really time to kick your feet back on the desk and say, ‘This is going to be really fucking awesome one day.’ It was like, ‘Holy fuck, we’re fucked.’ Every time you turned around there was some just imminent demise of the program just lurking around the corner.”

Like many mass-adopted, highly profitable technologies, the iPhone has a number of competing origin stories. There were as many as five different phone or phone-related projects—from tiny research endeavors to full-blown corporate partnerships—bubbling up at Apple by the middle of the 2000s. But if there’s anything I’ve learned in my efforts to pull the iPhone apart, literally and figuratively, it’s that there are rarely concrete beginnings to any particular products or technologies—they evolve from varying previous ideas and concepts and inventions and are prodded and iterated into newness by restless minds and profit motives.

Rokring Out

In 2004, Motorola was manufacturing one of the most popular phones on the market, the ultrathin Razr flip phone. Its new CEO, Ed Zander, was friendly with Jobs, who liked the Razr’s design, and the two set about exploring how Apple and Motorola might collaborate.

Partnering with Motorola was an easy way to try to neutralize a threat to the iPod. Motorola would make the handset; Apple would do the iTunes software.

Inside Apple, however, expectations for the Rokr could not have been lower. “We all knew how bad it was,” Fadell says. “They’re slow, they can’t get things to change, they’re going to limit the songs.” Fadell laughs aloud when discussing the Rokr today. “All of these things were coming together to make sure it was really a shitty experience.”

“Steve was gathering information during those meetings” with Motorola and Cingular, Richard Williamson says. He was trying to figure out how he might pursue a deal that would let Apple retain control over the design of its phone.

“Apple is best when it’s fixing the things that people hate,” Greg Christie tells me. Before the iPod, nobody could figure out how to use a digital music player; as Napster boomed, people took to carting around skip-happy portable CD players loaded with burned albums. And before the Apple II, computers were mostly considered too complex and unwieldy for the layperson. “For at least a year before starting on what would become the iPhone project, even internally at Apple, we were grumbling about how all of these phones out there were all terrible,”

‘Oh my God, we need to go in and clean up this market too—why isn’t Apple making a phone?’”

“We were spending all this time putting iPod features in Motorola phones,” Bell says. “That just seemed ass-backwards to me. If we just took the iPod user experience and some of the other stuff we were working on, we could own the market.” It was getting harder to argue with that logic. The latest batches of MP3 phones were looking increasingly like iPod competitors, and new alternatives for dealing with the carriers were emerging.

On November 7, 2004, Bell sent Jobs a late-night email. “Steve, I know you don’t want to do a phone,” he wrote, “but here’s why we should do it: Jony Ive has some really cool designs for future iPods that no one has seen. We ought to take one of those, put some Apple software around it, and make a phone out of it ourselves instead of putting our stuff on other people’s phones.” Jobs called him right away. They argued for hours, pushing back and forth. Bell detailed his convergence theory—no doubt mentioning the fact that the mobile-phone market was exploding worldwide—and Jobs picked it apart. Finally, he relented. “Okay, I think we should go do it,” he said.

And they had to establish the fundamentals; for instance, What should it look like when you fire up your phone? A grid of apps seems like the obvious way to organize a smartphone’s functions today—now that it’s like water, as Chaudhri says—but it wasn’t a foregone conclusion. “We tried some other stuff,” Ording says. “Like, maybe it’s a list of icons with the names after them.” But what came to be called SpringBoard emerged early on as the standard. “They were little Chiclets, basically,” Ording says. “Now, that’s Imran too, that was a great idea, and it looked really nice.”

“In January, in the New Year, he blows a gasket and tells us we’re not getting it,” Christie says. The fragments might have been impressive, but there was no narrative drawing the disparate parts together; it was a jumble of half-apps and ideas. There was no story. “It was as if you delivered a story to your editor and it was a couple of sentences from the introductory paragraph, a few from the body, and then something from the middle of the conclusion—but not the concluding statement.”

“He said, ‘You have two weeks.’ It was February of 2005, and we kicked off this two-week death march.”

When Fadell heard that a phone project was taking shape, he grabbed his own skunkworks prototype design of the iPod phone before he headed into an executive meeting. “There was a meeting where they were talking about the formation of the phone project on the team,” Grignon says. “Tony had [it] in his back pocket, a team already working on the hardware and the schematics, all the design for it. And once they got the approval for it from Steve, Tony was like, ‘Oh, hold on, as a matter of fact’—whoo-chaa! Like he whipped it out, ‘Here’s this prototype that we’ve been thinking about,’ and it was basically a fully baked design.”

Which Phone

There were two options: (a) take the beloved, widely recognizable iPod and hack it to double as a phone (that was the easier path technologically, and Jobs wasn’t envisioning the iPhone as a mobile-computing device but as a souped-up phone), or (b) transmogrify a Mac into a tiny touch-tablet that made calls (which was an exciting idea but frayed with futuristic abstraction).

Development had continued on the Rokr throughout 2005. “We all thought the Rokr was a joke,” Williamson says.

During the demonstration, Jobs held the phone like an unwashed sock.

at about the same moment that Jobs was announcing “the world’s first mobile phone with iTunes” to the media, he was resolving to make it obsolete.

“Listen. We’re going to change plans.… We’re going to do this iPod-based thing, make that into a phone because that’s a much more doable project. More predictable.” That was Fadell’s project. The touchscreen effort wasn’t abandoned, but while the engineers worked on whipping it into shape, Jobs directed Ording, Chaudhri, and members of the UI team to design an interface for an iPod phone, a way to dial numbers, select contacts, and browse the web using that device’s tried-and-true click wheel.

There were now two competing projects vying to become the iPhone—a “bake-off,” as some engineers put it. The two phone projects were split into tracks, code-named P1 and P2, respectively. Both were top secret. P1 was the iPod phone. P2 was the still-experimental hybrid of multitouch technology and Mac software.

Eventually, the executives overseeing the most important elements of the iPhone—software, hardware, and industrial design—would barely be able to tolerate sitting in the same room together. One would quit, others would be fired, and one would emerge solidly—and perhaps solely—as the new face of Apple’s genius in the post-Jobs era.

“We prototyped a new way,” Grignon says of the early device. “It was this interesting material… it still had this touch sensitive click wheel, right, and the Play/Pause/Next/Previous buttons in blue backlighting. And when you put it into phone mode through the UI, all that light kind of faded out and faded back in as orange. Like, zero to nine in the click wheel in an old rotary phone, you know, ABCDEFG around the edges.”

After everyone else, including Fadell, started to agree that multitouch was the way forward, Schiller became the lone holdout. He “just sat there with his sword out every time, going, ‘No, we’ve got to have a hard keyboard. No. Hard keyboard.’ And he wouldn’t listen to reason as all of us were like, ‘No, this works now, Phil.’ And he’d say, ‘You gotta have a hard keyboard!’” Fadell says.

“Phil is not a technology guy,” Brett Bilbrey, the former head of Apple’s Technology Advancement Group, says. “There were days when you had to explain things to him like a grade-school kid.” Jobs liked him, Bilbrey thinks, because he “looked at technology like middle America does, like Grandma and Grandpa did.”

“We’re making the wrong decision!” Schiller shouted. “Steve looked at him and goes, ‘I’m sick and tired of this stuff. Can we get off of this?’ And he threw him out of the meeting,”

Remember, even after the iPhone’s launch, Steve Jobs would describe it as “more like an iPod” than a computer. But those who’d been in the trenches experimenting with the touch interface were excited about the possibilities it presented for personal computing and for evolving the human-machine interface. “There was definitely discussion: This is just an iPod with a phone. And we said, no, it’s OS X with a phone,” Henri Lamiraux says. “That’s

“Once we had OS X ported and these basic scrolling interactions nailed, the decision was made: We’re not going to go with the iPod stack, we’re going to go with OS X.”

The software for the iPhone would be built by Scott Forstall’s NeXT mafia; the hardware would go to Fadell’s group.

CHAPTER 10 Hey, Siri Where was the first artificially intelligent assistant born?

Before Siri was a core functionality of the iPhone, it was an app on the App Store launched by a well-funded Silicon Valley start-up. Before that, it was a research project at Stanford backed by the Defense Department with the aim of creating an artificially intelligent assistant.

Symbolic reasoning describes how the human mind uses symbols to represent numbers and logical relationships to solve problems both simple and complex.

With the resonant opening line “I propose to consider the question, ‘Can machines think?’” in his 1950 paper “Computing Machinery and Intelligence,” Alan Turing framed much of the debate to come. That work discusses his famous Imitation Game, now colloquially known as the Turing Test, which describes criteria for judging whether a machine may be considered sufficiently “intelligent.”

DARPA. The Defense Advanced Research Projects Agency (or ARPA before 1972) had funded a number of AI and speech-recognition projects in the 1960s, leading Raj Reddy and others to develop the field and inspiring the likes of Tom Gruber to join the discipline. In 2003, decades later, DARPA made an unexpected return to the AI game. The agency gave the nonprofit research outfit SRI International around two hundred million dollars to organize five hundred top scientists in a concerted research effort to build a virtual AI.

The project had begun the year after the first iPhone launched, and as the Siri project took shape, it was clear that it would be aimed at smartphones. “The Siri play was mobile from the beginning,” he says. “Let’s do an assistant, let’s make it mobile. And then let’s add speech, when speech is ready.… By the second year, the speech-recognition technology had gotten good enough we could license it.”

In 2010, after they’d settled on the name and with the speech-recognition technology ready for prime time, they launched the app. It was an immediate success. “That was pretty cool,” Gruber says. “We saw it touched a nerve when we launched in the App Store as a start-up and hit top of its category in one day.” It did not take long for Apple to come knocking.

after that from Apple,” Gruber says. That call came directly from Steve Jobs himself.

Gruber says Siri can’t offer emotional intelligence—yet. He says they need to find a theory to program first. “You can’t just say, oh, ‘Be a better girlfriend’ or ‘Be a better listener.’ That’s not a programmable statement. So what you can say is ‘Watch behaviors of the humans’ and ‘Here’s the things that you want to watch for that makes them happy, and here’s one thing that is bad, and do things to make them happier.’ AIs will do that.”

CHAPTER 11 A Secure Enclave What happens when the black mirror gets hacked

Half an hour after I showed up at Def Con, my iPhone got hacked. The first rule of attending the largest hacker conference in North America is to disable Wi-Fi and Bluetooth on all your devices. I had done neither. Soon, my phone had joined a public Wi-Fi network, without my permission.

So, to keep safe the things that hackers might want most—bank-account info, passwords—Apple designed the Secure Enclave. “We want the user’s secrets to not be exposed to Apple at any point,” Krstić told a packed house in Mandalay Bay Casino in Las Vegas. The Secure Enclave is “protected by a strong cryptographic master key from the user’s passcode… offline attack is not possible.”

“All fingerprint information is encrypted and stored inside a secure enclave in our new A7 chip. Here it’s locked away from everything else, accessible only by the Touch ID sensor. It’s never available to other software, and it’s never stored on Apple servers or backed up to iCloud.”

Basically, the enclave is a brand-new subcomputer built specifically to handle encryption and privacy without ever involving Apple’s servers. It’s
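To make the “subcomputer” idea concrete, here is a minimal sketch, in Swift, of how an app can ask the Security framework for a key that is generated inside the Secure Enclave. This is my own illustration, not anything from the book or Apple’s sample code; the application tag is a placeholder, and it only works on hardware that actually has the enclave.

```swift
import Foundation
import Security

// Sketch: ask iOS for a private key that is generated inside the Secure Enclave.
// The app only ever receives an opaque handle; the key material itself never
// leaves the chip and is never sent to Apple. (Placeholder tag; real devices only.)
var error: Unmanaged<CFError>?

// The key may only be *used* (for signing or decryption); it can never be exported.
guard let access = SecAccessControlCreateWithFlags(
    kCFAllocatorDefault,
    kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
    .privateKeyUsage,
    &error
) else { fatalError("Could not create access control: \(String(describing: error))") }

let attributes: [String: Any] = [
    kSecAttrKeyType as String:       kSecAttrKeyTypeECSECPrimeRandom,  // P-256, the enclave's key type
    kSecAttrKeySizeInBits as String: 256,
    kSecAttrTokenID as String:       kSecAttrTokenIDSecureEnclave,     // generate *inside* the enclave
    kSecPrivateKeyAttrs as String: [
        kSecAttrIsPermanent as String:    true,
        kSecAttrApplicationTag as String: Data("com.example.enclave-demo".utf8), // placeholder tag
        kSecAttrAccessControl as String:  access
    ]
]

guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
    fatalError("Key generation failed: \(String(describing: error))")
}
print(privateKey)   // an opaque reference, not the key bytes
```

The detail that matters for Merchant’s point is the last step: the app gets back only a handle, while the private key itself stays on the chip and never touches Apple’s servers.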

Hacking as the techno-cultural phenomenon that we know today probably picked up steam with the counterculture-friendly phone phreaks of the 1960s. At the time, long-distance calls were signaled in AT&T’s computer routing system with a certain pitch, which meant that mimicking that pitch could open the system.

The culture of hacking, reshaping, and bending consumer technologies to one’s personal will is as old as the history of those technologies. The iPhone is not immune. In fact, hackers helped push the phone toward adopting its most successful feature, the App Store.

“Hi, everyone, I’m Geohot. This is the world’s first unlocked iPhone,” George Hotz announced in a YouTube video that was uploaded in August 2007. It’s since been viewed over two million times.

“They went through a really aggressive, top-down hardening campaign for the entire iOS platform over the last few years,” Guido says, “and instead of thinking about it from a tactical perspective, of, like, ‘Let’s just fix all the bugs,’ they came at it from a really architectural perspective and thought about the attacks they were gonna face and kind of predicted where some of them were going.” They stopped playing cat-and-mouse with hackers and started rewriting the rules, setting out mousetraps long before the mice had a chance to sneak into the house.

“Don’t try to hack the iPhone, it’s too hard, you won’t get anything out of it,” Guido says. That’s the attitude of most black-hat hackers. “Apple can smack you down really quickly. They issue patches that people actually apply.”

iPhones have nonetheless been subject to a number of high-profile hacks. Charlie Miller famously managed to get the App Store to approve a malware app that allowed him to break Apple’s stranglehold on the device. For five hundred dollars, Michigan State University professor Anil Jain was able to build a device that fooled the iPhone’s fingerprint sensors.

The landscape is changing—as Guido noted, there are more persons of interest with iPhones and more of an imperative to hack into those phones. It’s less likely to be done by loose-knit hackers out for the lulz or to earn a few bucks; it’s more likely to come from a government agency or a well-paid firm that does business with government agencies.

Apple has reportedly opened over seventy iPhones at the behest of law enforcement, though many of those were before the Secure Enclave necessitated a novel software hack from Apple.

CHAPTER 12 Designed in California, Made in China The cost of assembling the planet’s most profitable product

trade name, Foxconn. Foxconn is the single largest employer on mainland China; there are 1.3 million people on its payroll. Worldwide, among corporations, only Walmart and McDonald’s employ more.

Shenzhen was the first SEZ, or special economic zone, that China opened to foreign companies, beginning in 1980. At the time, it was a fishing village that was home to some twenty-five thousand people. In one of the most remarkable urban transformations in history, today, Shenzhen is China’s third-largest city,

90 percent of the world’s consumer electronics pass through Shenzhen.

“Most employees last only a year” was a common refrain. Perhaps that’s because the pace of work is widely agreed to be relentless, and the management culture was often described as cruel.

“They just didn’t keep their promises, and that’s another way of tricking you.” He says Foxconn promised them free housing but then forced them to pay exorbitantly high utility bills for electricity and water. The current dorms sleep eight to a room, and he says they used to be twelve to a room. But Foxconn would shirk on social insurance and be late or fail to pay bonuses. And many workers sign contracts that subtract a hefty penalty from their paychecks if they quit before a three-month introductory period. “We thought Foxconn was a good factory to work in, but we found out once we got there that it was not.”

This culture of high-stress work, anxiety, and humiliation contributes to widespread depression. Xu says there was another suicide a few months ago. He saw it himself. The victim was a college student who worked on the iPhone assembly line. “Somebody I knew, somebody I saw around the cafeteria,” he says. After being publicly scolded by a manager, he got into a quarrel. Company officials called the police, though the worker hadn’t been violent, just angry. “He took it very personally,” Xu says, “and he couldn’t get through it.” Three days later, he jumped out of a ninth-story window.

Imagine another factory. This one measures one and a half miles wide by one mile long, spans sixteen million square feet of factory floor space, and includes ninety-three towering buildings. It has its own dedicated power plant. It employs over a hundred thousand workers who toil for nearly twelve hours a day. Those workers have migrated from rural regions all across the country in search of higher wages. In all, it’s a marvel of efficiency and production—it’s described as an “almost self-sufficient and self-contained industrial city.” No, it’s not run by Foxconn in the 2010s. It’s Henry Ford’s Rouge River complex in the 1930s.

As of 2012, each iPhone required 141 steps and 24 labor hours to manufacture. It has likely risen since then. That means that, in a very conservative estimate, workers spent 1,152,000,000 hours screwing, gluing, soldering, and snapping iPhones together in a single three-month period.
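The arithmetic implied by that figure is easy to check; this is my own back-of-the-envelope, not Merchant’s:

```swift
// Back-of-the-envelope check of the figure quoted above (my arithmetic, not the book's).
let totalLaborHours = 1_152_000_000.0   // hours per three-month period, as quoted
let hoursPerPhone   = 24.0              // labor hours per iPhone, per the 2012 estimate
let impliedUnits    = totalLaborHours / hoursPerPhone
print(impliedUnits)                     // 48,000,000 phones in a quarter, at 24 hours each
```

Roughly 48 million phones in a quarter is in the same ballpark as Apple’s busiest quarters of that era, so the book’s “very conservative estimate” hangs together.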

In 2011, President Obama held a dinner meeting with some of Silicon Valley’s top brass. Naturally, Steve Jobs was in attendance, and he was discussing overseas labor when Obama interrupted. He wanted to know what it would take to bring that work home. “Those jobs aren’t coming back,” Jobs famously said. It wasn’t just that overseas labor was cheaper—which it was—it was also that the sheer size, industriousness, and flexibility of the workforce there was necessary to meet Apple’s manufacturing needs.

Apple turns over its entire inventory every five days, meaning each iPhone goes from the factory line in China to a cargo jet to a consumer’s hands in a single workweek.

Unlike in Ford’s factories, Chinese assembly workers making ten to twenty dollars a day (in 2010 dollars) would have to pay the equivalent of three months’ wages for the cheapest new iPhone. In reality, they’d have to scrimp and save for a year—remember, many workers barely make enough to live on unless they’re pulling overtime—to be able to buy one. So none of them did. We didn’t meet a single iPhone assembler who actually owned the product he or she made hundreds of each day.

The blocks keep coming, so we keep walking. Longhua starts to feel like the dull middle of a dystopian novel, where the dread sustains but the plot doesn’t, or the later levels of a mediocre video game, where the shapes and structure start to feel uglily familiar, where you could nod off into a numb drift.

there was a different kind of ugliness. For whatever reason—the rules imposing silence on the factory floors, its pervasive reputation for tragedy, or the general feeling of unpleasantness the environment itself imparts—Longhua felt heavy, even oppressively subdued.

What was remarkable about Foxconn City was that the whole of its considerable expanse was unrepentantly dedicated to productivity and commerce. You were either working, paying, or shuffling grayly in between.

app for driving the workers that make the devices that enabled apps. They were hoping to spread the word, that licensing the AshCloud app could become another part of their business; a couple factories had already been using it, they said. Now, factory workers could be controlled, literally, by the devices they were manufacturing. I thought of one ex-Foxconn worker we interviewed. “It never stops,” he said. “It’s just phones and phones and phones.”

CHAPTER 13 Sellphone How Apple markets, mythologizes, and moves the iPhone

By the time the iPhone was actually announced in 2007, speculation and rumor over the device had reached a fever pitch, generating a hype that few to no marketing departments are capable of ginning up. I see at least three key forces at work. Together, they go a little something like this:

1. Shroud products in electric secrecy leading up to…
2. Sublime product launches featuring said products that are soon to appear in…
3. Immaculately designed Apple Stores.

“Apple is so secretive that there is essentially an entire industry built around creating, spreading and debunking rumors about the company,” the Huffington Post declared in 2012.

secrecy plays a powerful role in ratcheting up demand among consumers. In a 2013 paper for Business Horizons, “Marketing Value and the Denial of Availability,” David Hannah and two fellow business professors at Simon Fraser University theorized how Apple’s secrecy benefits its product sales. “According to reactance theory, whenever free choice—for example, of goods or services—is limited or restricted, the need to retain freedoms makes humans desire them significantly more than previously, especially if marketers can convince people that those freedoms are important.

These Stevenotes aren’t a novel format. Alexander Graham Bell, recall, went on tours and put on shows in exhibition halls and convention centers across the Eastern Seaboard to demonstrate his new telephone.

But the most famous tech demo of all was the one that may have most informed Jobs’s style. In 1968, an idealistic computer scientist named Doug Engelbart brought together hundreds of interested industry onlookers at the San Francisco Civic Center—the same civic center where the iPhone 7 demo was made nearly fifty years later—and introduced a handful of technologies that would form the foundational DNA of modern personal computing. Not only did Engelbart show off publicly a number of inventions like the mouse, keypads, keyboards, word processors, hypertext, videoconferencing, and windows, he showed them off by using them in real time. The tech journalist Steven Levy would call it “the mother of all demos,” and the name stuck.

“As windows open and shut, and their contents reshuffled, the audience stared into the maw of cyberspace,” Levy writes. “Engelbart, with a no-hands mike, talked them through, a calming voice from Mission Control as the truly final frontier whizzed before their eyes. Not only was the future explained, it was there.” The model for today’s tech-industry keynote presentations was forged, almost instantly; the presentation style was perhaps not as influential as the technologies presented, but they were closely intertwined.

Jobs’s keynotes were the product of CEO John Sculley: “A marketing expert, he envisioned the product announcements as ‘news theater,’ a show put on for the press. The idea was to stage an event that the media would treat as news, generating headlines for whatever product was introduced. News stories, of course, are the most valuable advertising there is.” Sculley thought that entertaining a crowd should be the priority, so product demos should be “like staging a performance,” he wrote in his autobiography, Odyssey. “The way to motivate people is to get them interested in your product, to entertain them, and to turn your product into an incredibly important event.”

Believing that the most anticipated new color of the new model has sold out requires a suspension of disbelief. “It’s completely manufactured by one of the most brilliant marketing teams,” Buxton says. “They designed the production, supply chain, and everything. So with the greatest tradition of Spinal Tap, if needed, they could turn the volume up to eleven to meet demand.”

Apple Stores began opening in the early aughts. Initially opposed by the board, they’ve proven to be a sales behemoth. In 2015, they were the most profitable per square foot of any retail operation in the nation by a massive margin; the stores pulled in $5,546 per square foot. With two-thirds of all Apple revenues generated by iPhones, that’s a lot of hocked handsets.

CHAPTER 14 Black Market The afterlife of the one device

Shenzhen has long been known for manufacturing cheap iPhone knockoffs with names like Goophone or Cool999 that mimic the look of the iconic device but could hardly pass as the real thing. But the phones here are identical to any you’d find in an Apple Store, just used.

In 2015, China shut down a counterfeit iPhone factory in Shenzhen, believed to have made some forty-one thousand phones out of secondhand parts.

If an iPhone has had its battery replaced, is it not still an iPhone? Or what if the screen isn’t made of Gorilla Glass? What if it had extra RAM? Shenzhen phone hackers can jack a phone’s memory to twice the amount in standard iPhones. These are all just tweaked, refurbished iPhones, but are apt to be called “counterfeit” by the media.

To understand why this place exists, we need to go back even farther: In the 1970s and 1980s, as plastic, lead, and toxic-chemical-filled electronics were hitting the consumer market in quantities never before seen, disposing of them became a serious concern. Landfills stuffed with cathode-ray tubes and lead circuit boards (lead solder used to be ubiquitous) posed environmental threats, and citizens of rich countries began to demand environmental controls on e-waste disposal. Those controls, however, led to the rise of the “toxic traders,” who bought the e-waste and shipped it to be dumped in China, Eastern Europe, or Africa.

“About 41.8 million metric tonnes of e-waste was generated in 2014 and partly handled informally, including illegally,” a 2016 UN report, “Waste Crimes,” noted. “This could amount to as much as USD $18.8 billion annually. Without sustainable management, monitoring and good governance of e-waste, illegal activities may only increase, undermining attempts to protect health and the environment, as well as to generate legitimate employment.”

“Go to Kenya, go to Mombasa, go to Nairobi.” Some of the best device-repair technicians Minter has ever seen, he says, can be found there. And the waste dumping is no longer the “toxic colonialism” of yore. Some African and Asian companies are eager to import working secondhand phones. Usually not iPhones, but Android phones and even the cheap Chinese knockoffs will find a second life in African or South Asian markets.

There are few places around the globe that remain untouched by the influence of the iPhone. Even where it’s an aspirational device, it has nonetheless driven mass adoption of smartphones built in its likeness and kindled, yes, a nearly universal desire, as Jon Agar put it.

iv: The One Device Purple reign

By 2006, the basic contours of the iPhone project had been defined. Members of the Mac OS team and the NeXT mafia would engineer the software; the Human Interface team would work closely with them to improve, integrate, and dream up new designs; and the iPod team would wrangle the hardware. A team was working day in and day out to identify and strip lines of code from Mac OS to make it fit on a portable device.

“Email was a big function of these phones,” Ganatra says. “We saw that from the BlackBerry. So we knew that we had to nail email. I think that was something that was big on Scott and Steve’s minds: We can’t come out with a smartphone and try to take on the king of email and not have a great client ourselves.”

the UI designer Freddy Anzures, who’d been working on the unlock concept with Imran, took a flight out to New York from San Francisco. The team had been thinking about how to open the phone, and, well, there he was: He stepped into the airplane bathroom stall and slid to lock. Then, of course, he slid to unlock. That, he thought, would be a great design hack. It was a smart way to activate a touchscreen whose sensors always needed to be on.

The story of the Home button is actually linked to both a feature on the Mac and everyone’s favorite science-fictional user interface—the gesture controls in Minority Report. The Tom Cruise sci-fi film, based on a Philip K. Dick story, was released in 2002, right when the ENRI talks were beginning. Ever since, the film has become shorthand for futuristic user interface—the characters wave their hands around in the air to manipulate virtual objects and swipe elements away. It’s also an ancestor of some of the core user-interface elements of the iPhone.

“There were these known truths that we discussed,” Ganatra says, “and [that] we knew couldn’t be violated.”

1. Home button always takes you back home.
2. Everything has to respond instantly to a user’s touch.
3. Everything has to run at least sixty frames per second. All operations.

We knew we didn’t want to have anything like a user manual. If you ship one of those, you’ve already kind of failed.”

The iPhone would be billed as three devices in one—a phone, a touch music player, and an internet communicator.

WAP allowed users to access stripped-down versions of websites, often text-only or with only low-resolution images. “We called it the baby web—you got these dumbed-down web pages,” Williamson says. “We thought that it was maybe possible to take full-on web content to display in one place.”

AT&T to get persistent connections. “Without a persistent connection, you can’t do things like notifications, and things like iMessage become a lot more difficult. They said, ‘No, we can’t do persistent connections! We have millions of devices! No, no, we can’t do that!’” They did that.

We wanted to bring them out of the mentality where, you know, they were in, which was, ‘We want you to pay for ringtones, pay for text messages,’ and into this reality of ‘It’s just a computer that needs an IP connection, and we want that and everything you can get with an IP connection.’

For a moment, there was a chance that the iPhone would kill QWERTY. “The radical idea was that we had no physical keyboard,”

“We showed Steve all of these things and he shot them all down. Steve wanted something that people could understand right away,”

“Steve loved this stuff,” Grignon says. “He loved to set up division. But it was a big ‘fuck you’ to the people who couldn’t get in. Everyone knows who the rock stars are in a company, and when you start to see them all slowly get plucked out of your area and put in a big room behind glass doors that you don’t have access to, it feels bad.”

“People were more concerned about the downside of a Steve interaction than the potential benefit. I don’t want to make it seem like it was this gulag environment, but there was definitely a strong undercurrent of fear, paranoia, that definitely was a part of any interaction that a team might have with Steve, for sure.”

Sometimes, new recruits had to sign a preliminary NDA first, agreeing that they would never discuss the existence of the next NDA they were about to sign, in case they didn’t want to sign that one.

To this day, Tony Fadell sounds exasperated when the conversation turns to iPhone politics. “The politics were really hard,” he says. “And they got even worse over time. They became emboldened by Steve, because he didn’t want the UI—I could see it—but he wouldn’t let anyone else on the hardware team see it, so there was this quasi-diagnostics operating system interface.

“With something like the iPhone, everything defers to the display,” Ive said. “A lot of what we seem to be doing in a product like that is getting design out of the way.

“So the best way of lowest impedance is a solder joint. It doesn’t deteriorate over time like a connector.”

He adds, “We weren’t given the mission of ‘make this reparable’; we were given the mission of ‘make a great product that we can ship.’ We don’t care about removable batteries. We’ve never made any of our batteries removable on the iPod. So, when you’re new to doing something like this, you’re not tied. We were all very naive as well. It wasn’t like we were phone engineers.”

“Chips nowadays are basically software,” Grignon says. “When you make a new piece of silicon, there’s some dude in Korea that’s gotta actually type out code, and it gets compiled into a piece of metal, silicon. Like any software program, there’s bugs in there,

Inside Apple, the successful launch meant that Forstall had triumphed, Grignon says. “It set the stage politically for what was eventually going to happen, which was Tony being ousted. That was foretold. You saw that in the intro, when he swiped him to delete. In the introduction, Steve is showing how easy it is to manage your contact list, right? And he’s introducing swipe to delete. And he’s like, If there’s something here you don’t want, no complicated thing, blah blah blah, you just swipe it away—and it was Tony Fadell. You just flick it, I can delete him, and he’s gone. And I was like, Ahhhh, and the audience was doing this clap-clap—except for at Apple, everyone who was on the project was like, ‘Holy fuck.’ That was a message. He was basically saying, ‘Tony’s out.’ Because in rehearsals, he wasn’t deleting Tony. He just deleted another random contact.”

When the iPhone launched in June 2007, lines snaked around Apple Stores around the world. Diehards vying to be the first to own the Jesus phone waited outside for hours, even days. The media covered the buzz exhaustively. But despite all the spectacle, after a strong opening weekend—when Apple says it moved 270,000 units in thirty hours or so—sales were actually relatively slow. For now, the app selection was locked, the phone ran only on painfully slow 2G networks, and nothing was customizable, not even the wallpaper.

Microsoft CEO Steve Ballmer famously scoffed, “Five hundred dollars? Fully subsidized? With a plan?… That is the most expensive phone in the world.”

if you think about it, this is a Mac. We took a Mac and we squished it into a little box. It’s a Mac Two. It’s the same DNA. The same continuity.”

“Products like multitouch were incubated for many, many years,” Doll says. “Core Animation as well had been worked on for quite a while prior to the phone. Scott Forstall, who led up the whole iPhone effort as a VP, was a rank-and-file engineer working on these same frameworks that evolved into what you use to build iPhone apps,” he says. “And those were not invented in a year, or created in a year, they were created over probably twenty years, or fifteen years before the iPhone came around.”

Those frameworks are made of code that’s been written, improved, and recombined since the 1980s—since the days of NeXT, before the modern Apple era—by some of the same people who were instrumental in building the iPhone.

Apple had been banking code, ideas, and talent for twenty years at that point. “There was a compounding interest effect that was happening,”

“It’s not just a question of waking up one morning in 2006 and deciding that you’re going to build the iPhone; it’s a matter of making these nonintuitive investments and failed products and crazy experimentation—and being able to operate on this huge timescale,”

When the right market incentives arrived at its doorstep, Apple tapped into that bank of nonintuitive investments that had been accruing interest for decades. From its code base to its design standards, Apple drew from its legacy of assets to translate the ancient dream of a universal communicator into a smartphone.

“Thirty-six people I worked with at Apple have died,” he says. “It is intense.”

the actual origin of computing as we know it today probably begins not with the likes of Charles Babbage or Alan Turing or Steve Jobs but with a French astronomer, Alexis Clairaut, who was trying to solve the three-body problem. So he enlisted two fellow astronomers to help him carry out the calculations, thus dividing up labor to more efficiently compute his equations.


Steve Jobs will forever be associated with the iPhone. He towers over it, he introduced it to the world, he evangelized it, he helmed the company that produced it. But he did not invent it. I think back to David Edgerton’s comment that even now, in the age of information animated by our one devices, the smartphone’s creation myth endures. For every Steve Jobs, there are countless Frank Canovas, Sophie Wilsons, Wayne Westermans, Mitsuaki Oshimas. And I think back to Bill Buxton’s long nose of innovation, and to the notion that progress drives ideas continually into the air. Proving the lone-inventor myth inadequate does not diminish Jobs’s role as curator, editor, bar-setter—it elevates the role of everyone else to show he was not alone in making it possible.

Technology is an advancing tide, he means, and even the achievements that led to something as popular and influential as the iPhone will eventually be swept away. “It doesn’t last,” he says.
