It’s Time to Push Tech Forward, and Rebuild What It Broke

Making progress means making (sometimes devastating) mistakes. And then learning from them.

In 1904, a group of Canadian workers began the hard slog of constructing the longest bridge span in the world, across the Saint Lawrence River just south of the city of Quebec. It was a wildly ambitious project. And it wasn't just for the Quebecois: Railroads were revolutionizing commerce and communications, and the bridge would connect people and allow trains to run from New Brunswick in the east to Winnipeg in the west.

The river was 190 feet deep at the center, and ice piled high above the water's surface in the winter. Nothing about the bridge's construction would be easy. The engineers chose a complex cantilever design, a cutting-edge approach but a cost-efficient one too. Ambition creates risks, and warning signs started to appear. The steel trusses weighed more than expected. Some of the lower chords of the bridge seemed misaligned or bent. Workers raised concerns. But the project's leaders pressed ahead.


Exactly 100 years later, in February 2004, a young entrepreneur named Mark Zuckerberg founded The Facebook. His ambition was nothing less than to remake the internet around personal relationships and then to remake the world around Facebook. When the company filed to go public in 2012, he published a letter to potential investors. “Facebook was not originally created to be a company. It was built to accomplish a social mission—to make the world more open and connected,” he wrote. “We don't build services to make money; we make money to build better services.” An open and connected world, he wrote, would make the economy stronger and businesses better. Facebook was building a bridge and relentlessly increasing its span.

One day in August 1907, several years into the construction of the bridge over the Saint Lawrence River, calamity struck in the space of 15 seconds. Every major section of the structure's nearly complete southern half collapsed. Workers were crushed or swept into the current. Another group of men found temporary safety but drowned under the rising tide. In all, 75 people died, including 33 Mohawk steelworkers from the nearby Kahnawake reserve.

By now, you surely see where I'm going with this. In 2016, Facebook was struck by calamity too. The core algorithm of the company's News Feed was weaponized by Russian operatives and purveyors of fake news. A platform designed for connecting people turned out to be a remarkable accelerant for political divisions. The election was a mess, whatever your politics, and Facebook was partly to blame. The company's philosophy—move fast and break things—was fine when the only thing at stake was whether your aunt could reconnect with her high school ex. That philosophy lost its roguish charm when democracy itself was up for grabs. Then, in 2018, Facebook faced the worst crisis of its short existence when news broke that a shady political outfit called Cambridge Analytica had siphoned off data from as many as 87 million users of the platform.

For several years now, we've been living in a time of intense backlash against the technology industry. It's not clear when it started, but if one had to choose a date, November 8, 2016, isn't a bad one. Within six months of the election, Molotov cocktails were being chucked at the captains of Silicon Valley from all directions—and employees of the biggest tech companies were among those lighting the wicks. Antitrust law, disdained for decades, suddenly became exciting. Worries that had been playing as background music in society for years—online privacy, fears of artificial intelligence taking jobs—began to crescendo. Ad targeting was redefined as surveillance capitalism. Self-driving cars were redefined as death traps. #DeleteUber became a meme. The reputation of an entire industry tanked, just as had happened eight years earlier to finance. In 2016, WIRED ran a photograph of Mark Zuckerberg on the cover with the line “Could Facebook Save Your Life?” Fifteen months later, we ran a photo-illustration of him bloodied and bruised. No words were necessary.

There's no question that the tech industry had it coming. It had become arrogant. The nerds had ascended, culturally and socially, and had become enchanted with their own virtuous self-image. They spoke like Saint Francis in public while privately worshipping Mammon. In hindsight, Facebook's missionary IPO letter reads like a parody. But the backlash has included some gratuitous swipes too. Take self-driving cars. They don't text while driving; they don't drink. If we can get them to work, they'll save tens of thousands of lives a year. Almost everything we do has become simpler, easier, and more efficient in some way because of software. Even Facebook deserves a certain sympathy as it tries to juggle the conflicting priorities of privacy, transparency, and safety—as the public demands perfection on all three.

Now let's return to the Quebec bridge. After the catastrophe, the site became a place of pilgrimage and, eventually, renewal. The Canadian government needed the railway link, and it took over the design and construction of the bridge. New plans were drawn up, involving stronger supports and a new kind of truss. The cantilever arms on either side went up and stood strong. By 1916, the only major task remaining was to link the two sides with a 5,000-ton center span. It was maneuvered into place via tugboat, and soon the workers began to lift it up with huge hangers. But once more, calamity struck. The hoisting system failed, and the giant centerpiece plunged into the river, taking 13 workers with it.

Soon, though, the Canadian government's engineers tried a third time. Many lives had been lost, but connecting the two sides of the river remained essential for the country's prosperity. So the builders reconstructed the collapsed center span and, just three years after the second collapse, the Prince of Wales presided over its opening. The bridge held. Soon, cars and trains were crossing the same river in which so many people had died. For a century, it has stood as the longest cantilever bridge in the world. Quebec and Canada have prospered for it.

The Quebec Bridge across the Saint Lawrence River near Quebec City. Photograph: Getty Images

More important, the collapses became an ethical touchstone. A professor named H. E. T. Haultain decided that he wanted to commemorate the story, and he called on the poet and author Rudyard Kipling, who had previously written an ode to engineers, for help. Haultain worked with Kipling and the leaders of Canada's main engineering universities to develop what's called “The Ritual of the Calling of an Engineer.” And for nearly a century, graduates have gone through a ceremony in which they recite their obligations to their craft: “I will not henceforward suffer or pass, or be privy to the passing of, Bad Workmanship or Faulty Material in aught that concerns my works before mankind as an Engineer, or in my dealings with my own Soul before my Maker.”

At the end of the ceremony, they are given iron rings to wear as a reminder of these obligations: Move slow and get things right. According to myth, the first of these rings were forged from pieces of the collapsed bridge. The rings are worn on the pinky finger of the dominant hand, so that they click on the table when the engineer signs or stamps a blueprint.

The culture of civil engineers has always been different from the culture of software engineers. The former are formally certified and regulated; the latter can learn their craft from scratch in their basement. And there's good reason for civil engineers to demand more rigor than their software engineer brethren. If you make a mistake in a line of code, you can fix it from your chair. Repairing a steel beam submerged in an icy river is a different matter. Software companies grow, too, according to the logic of network effects and increasing returns to scale. They have to move fast if they want to thrive. Such rules and logic rarely, if ever, apply in the physical world.

So, yes, the cultures have to be different. And the problems differ too. The bridge collapsed twice because of failure in execution; Facebook's problem was more a failure of imagination and the inability to see how the platform could be used for harm.

That said, Silicon Valley, and software engineers everywhere, could still learn something from the culture that asks its adherents to wear those iron rings. Tech companies operate in digital worlds, but their actions have consequences in the physical one. And when you build things, you are responsible to the people who use them. You have to think through what could go wrong instead of assuming everything will go right. You have to build as if you have a ring forged from a shattered bridge on your pinky.

Sometimes systems crack, and then they shatter. But sometimes the crack leads to the remedy. And that's what we need now: a coming together of many tribes to fix the mess we're in, and to learn from the mistakes the industry has made. We need action from governments, builders, users, people inside Silicon Valley, and people everywhere else writing code.

The point isn't to stop progress but to enable it. Much of the magic in our lives comes from software: the music we hear, the movies we watch, the stories we tell. We live longer, eat better, and keep in our pockets computers more powerful than the supercomputers that guided the first people in space. We can record police abuses with our smartphones and hold power to account. Gene editing could help us feed the planet; we may send people to Mars; technological acumen is redefining global politics. A thousand inefficient businesses have been pulled up from the roots, as better ideas have sprung from the soil.

And that's what this issue is about: the builders who understand the consequences of their choices. It's about people who recognize the awesome responsibility of the technological powers bequeathed to us by our predecessors. There's Kate Darling, whose research is redefining the way we think about our moral responsibilities to robots. There's Patrick Collison, who, along with his brother, created a company, Stripe, that makes it vastly easier for people all over the world to start businesses and process payments without tearing down the entire financial system. There's Laura Boykin, using pocket DNA sequencing to save Africa's cassava crop. There's Eva Galperin, protecting her fellow hackers from stalkerware and authoritarians. There's Moriba Jah, working to map the garbage orbiting our planet.

These are people who build things fast but who are also fixing things. They're using technology to take us to new places. They're thinking deeply—but not to the point of paralysis—about the problems society faces and the ways that technology can help. They're people who realized that the bridge they were building may have collapsed, but rather than abandon it, they built it anew with forethought. Or, as the ceremony of the Iron Ring requires one to pledge, “My Time I will not refuse; my Thought I will not grudge; my Care I will not deny towards the honour, use, stability and perfection of any works to which I may be called to set my hand.”




NICHOLAS THOMPSON (@nxthompson) is WIRED's editor in chief.

This article appears in the November issue.

Let us know what you think about this article. Submit a letter to the editor at mail@wired.com.

