Debates about technology and progress are often framed in terms of “optimism” vs. “pessimism.” For instance, Steven Pinker, Matt Ridley, Johan Norberg, Max Roser, and the late Hans Rosling have been called the “New Optimists” for their focus on the economic, scientific, and social progress of the last two centuries. Their opponents, such as David Runciman and Jason Hickel, accuse them of being blind to real problems in the world, such as poverty, and to risks of catastrophe, such as nuclear war.
Economic historian Robert Gordon calls himself “the prophet of pessimism.” His book The Rise and Fall of American Growth warned that the days of high economic growth are over for the United States and will not return. Gordon’s opponents include a group he calls the “techno-optimists,” such as Andrew McAfee and Erik Brynjolfsson, who have predicted a growth spurt in productivity from information technology.
It’s tempting to choose sides. But while it can be rational to be optimistic or pessimistic on any specific question, these terms are too imprecise to be adopted as a general intellectual identity. Those who identify as optimists can be too quick to dismiss or downplay the problems of technology, while self-styled technology pessimists or progress skeptics can be too reluctant to believe in solutions.
As we look forward to the post-pandemic recovery, once again we’re being tugged between the optimists, who highlight all the diseases that may soon be beaten through new vaccines, and the pessimists, who warn that humanity will never win the evolutionary arms race against microbes. But this represents a false choice. History provides us with powerful examples of people who were brutally honest in identifying a crisis but were equally active in seeking solutions.
At the end of the 19th century, William Crookes—physicist, chemist, and inventor of the Crookes tube (an early type of vacuum tube)—was the president of the British Association for the Advancement of Science. On September 7, 1898, he used the traditional annual address to the association to issue a dire warning.
The British Isles, he said, were at grave risk of running out of food. His reasoning was simple: the population was growing exponentially, but the amount of land under cultivation could not keep pace. The only way to continue to increase production was to improve crop yields. But the limiting factor on yields was the availability of nitrogen fertilizer, and the sources of nitrogen, such as the rock salts of the Chilean desert and the guano deposits of the Peruvian islands, were running out. His argument was detailed and comprehensive, based on figures for wheat production and land availability from every major European country and colony; he apologized in advance for boring his audience with statistics.
He criticized the “culpably extravagant” waste of nonrenewable nitrogen resources. To those who looked myopically only at recent years of the harvest, which had been quite sufficient, he pointed out that those years had been unusually fruitful, which masked the problem. The bounty of the recent past was no guarantee of prosperity in the future.
In a sense, Crookes was an “alarmist.” His purpose was to draw attention to a problem caused by progress and growth. He sought to open the eyes of the complacent. He began by saying that “England and all civilized nations stand in deadly peril,” variously referring to “a colossal problem” of “urgent importance,” an “impending catastrophe,” and “a life-and-death question for generations to come.” To those who would call him alarmist, he insisted that his message was “founded on stubborn facts.”
Crookes caused a sensation, and many critics spoke against his message. They pointed out that wheat wasn’t the only food, that people would moderate consumption of it if necessary, and that land for wheat could be taken from what was used for meat and dairy production, especially as prices rose. They said that he underestimated the opportunities for American farmers to supply food to other nations, by better adapting their methods to the soil and climate so as to increase production.
Writing in Nature in 1899, one R. Giffen compared Crookes to Thomas Malthus, and to others who had predicted shortages of various natural resources—such as Eduard Suess, who had said that gold would run out, and William Stanley Jevons, who warned about Peak Coal. Giffen’s tone is weary as he notes that “there has been much experience of these discussions since the time of Malthus.” Every time, he explains, we’ve been unable to make precise forecasts because the anticipated limits to growth are too far in the future, or we know too little about their causes.
But Crookes had always intended his remarks to take “the form of a warning rather than of a prophecy.” In the speech, he said:
“It is the chemist who must come to the rescue … Before we are in the grip of actual dearth the chemist will step in and postpone the day of famine to so distant a period that we and our sons and grandsons may legitimately live without undue solicitude for the future.”
Crookes’s plan was to tap a virtually unlimited source of nitrogen: the atmosphere. Plants can’t use atmospheric nitrogen directly; instead, they use other nitrogen-containing compounds, which in nature are manufactured from atmospheric nitrogen by certain bacteria, a process called fixation. Crookes said that the artificial fixation of atmospheric nitrogen was “one of the great discoveries awaiting the ingenuity of chemists,” and he was optimistic that it could happen soon, calling it “a question of the not-far-distant future.”
He devoted a significant part of his speech to exploring this solution. He pointed out that nitrogen can be burned at sufficiently high temperatures to create nitrate compounds, and that this can be done using electricity. He even estimated practical details, such as the cost of the nitrates produced this way, which was competitive at market rates, and whether the process could be scaled up to industrial levels: the new hydroelectric plant at Niagara Falls, he concluded, would alone provide all the electricity needed to make up the gap he had forecast.
Crookes knew that synthetic fertilizer wasn’t a permanent solution, but he was satisfied that when the problem reappeared in the distant future, his successors would be able to deal with it. His alarmism was not a philosophical position, but a contingent one. Once the facts of the situation were changed by the invention of suitable technology, he was happy to call off the alarm.
Was Crookes correct? By 1931, the year he had said we could run out of food, it was clear that his predictions had not been perfect. The harvest had increased, but not because crop yields greatly improved. Instead, acreage had actually increased, to a degree Crookes had thought impossible. This happened in part because of improvements in mechanization, including the gas tractor. Mechanization drove down labor costs, which made marginal lands profitable to farm. As often happens, a solution came from an unexpected direction, invalidating the assumptions of forecasters both optimistic and pessimistic.
But if Crookes was not correct in his detailed predictions, he was correct in essence. His two key points were accurate: one, that food in general and yields in particular were problems that would have to be reckoned with in the next generation or so; two, that synthetic fertilizer from the fixation of atmospheric nitrogen would be a key aspect of the solution.
Less than two decades after his speech, the German chemist Fritz Haber and industrialist Carl Bosch developed a process to synthesize ammonia out of atmospheric nitrogen and hydrogen gas. Ammonia is a chemical precursor of synthetic fertilizers, and the Haber-Bosch process is still one of the most important industrial processes today, providing fertilizer for almost half the world’s food production.
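For reference, the reaction at the heart of the Haber-Bosch process is standard textbook chemistry (not spelled out in the speech itself): nitrogen and hydrogen combine over an iron catalyst at high temperature and pressure to yield ammonia.

```latex
% Haber-Bosch ammonia synthesis (iron catalyst, high temperature and pressure)
N_2 + 3\,H_2 \;\longrightarrow\; 2\,NH_3
```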
The chemist, ultimately, did come to the rescue.
So was Crookes an optimist or a pessimist? He was pessimistic about the problem—he was not complacent. But he was optimistic about finding a solution—he was no defeatist, either.
In the 20th century, fears of overpopulation and food supply once again reared their heads. In 1965, the world population growth rate reached an all-time high of 2% per year, enough to double every 35 years; and as late as 1970, it is estimated, over a third of people in developing countries were undernourished.
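The doubling-time figure follows from the standard rule for exponential growth: a quantity growing at a constant rate r per year doubles in ln 2 / ln(1 + r) years. A quick check of the arithmetic (Python used here purely for illustration):

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity growing at a constant annual rate to double."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# At the 1965 peak growth rate of 2% per year:
print(round(doubling_time(0.02)))  # ~35 years
```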
The 1968 book The Population Bomb, by Paul and Anne Ehrlich, opened with a call for surrender: “The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate.” In 1970, Paul Ehrlich reinforced the defeatism, saying that in a few years “further efforts will be futile” and “you may as well look after yourself and your friends and enjoy what little time you have left.” Because they saw the situation as hopeless, the Ehrlichs supported a proposal to cut off aid to countries such as India that were seen as not doing enough to limit population growth.
Fortunately for India and the rest of the world, others were not ready to give up. Norman Borlaug, working in Mexico in a program funded by the Rockefeller Foundation, developed high-yield varieties of wheat that resisted fungal disease, used fertilizer more efficiently, and could grow across a wide range of latitudes. In the 1960s, thanks in part to the new grains, Mexico transformed itself from an importer to an exporter of wheat, and India and Pakistan nearly doubled their yields, averting the famine that the Ehrlichs saw as inevitable.
Yet even after winning the Nobel Peace Prize for his accomplishments, Borlaug never lost sight of the challenge involved in making agriculture keep up with population, and never considered it solved for good. In his 1970 Nobel lecture, he called the increases in food production “still modest in terms of total needs” and, pointing out that half the world was undernourished, said “no room is left for complacency.” He warned that “most people still fail to comprehend the magnitude and menace of the ‘Population Monster.’” “And yet,” he continued, “I am optimistic for the future of mankind.” Borlaug was confident that human reason would eventually bring population under control (and indeed, the global birth rate has been declining ever since).
The risk of adopting an “optimistic” or “pessimistic” mindset is the temptation to take sides on an issue depending on a general mood, rather than forming an opinion based on the facts of the case. “Don’t worry,” says the optimist; “accept hardship,” counters the pessimist.
We can see this play out in debates over covid and lockdowns, over climate change and energy usage, over the promise and peril of nuclear power, and in general over economic growth and resource consumption. As the debates escalate, each side digs in: the “optimists” question whether a threat is even real; the “pessimists” deride any proposed technological solution as a false “quick fix” that merely allows us to rationalize postponing the difficult but inevitable cutbacks. (For an example of the latter, see the “moral hazard” arguments against geoengineering as a strategy to address climate change.)
To embrace both the reality of problems and the possibility of overcoming them, we should be fundamentally neither optimists nor pessimists, but solutionists.
The term “solutionism,” usually in the form of “technocratic solutionism,” has been used since the 1960s to mean the belief that every problem can be fixed with technology. This is wrong, and so “solutionism” has been a term of derision. But if we discard any assumptions about the form that solutions must take, we can reclaim it to mean simply the belief that problems are real, but solvable.
Solutionists may seem like optimists because solutionism is fundamentally positive. It advocates vigorously advancing against problems, neither retreating nor surrendering. But it is as far from a Panglossian, “all is for the best” optimism as it is from a fatalistic, doomsday pessimism. It is a third way that avoids both complacency and defeatism, and we should wear the term with pride.
Why can’t tech fix its gender problem?
Not competing in this Olympics, but still contributing to the industry’s success, were the thousands of women who worked in the Valley’s microchip fabrication plants and other manufacturing facilities from the 1960s to the early 1980s. Some were working-class Asian- and Mexican-Americans whose mothers and grandmothers had worked in the orchards and fruit canneries of the prewar Valley. Others were recent migrants from the East and Midwest, white and often college educated, needing income and interested in technical work.
With few other technical jobs available to them in the Valley, women would work for less. The preponderance of women on the lines helped keep the region’s factory wages among the lowest in the country. Women continue to dominate high-tech assembly lines, though now most of the factories are located thousands of miles away. In 1970, one early American-owned Mexican production line employed 600 workers, nearly 90% of whom were female. Half a century later the pattern continued: in 2019, women made up 90% of the workforce in one enormous iPhone assembly plant in India. Female production workers make up 80% of the entire tech workforce of Vietnam.
Venture: “The Boys Club”
Chipmaking’s fiercely competitive and unusually demanding managerial culture proved to be highly influential, filtering down through the millionaires of the first semiconductor generation as they deployed their wealth and managerial experience in other companies. But venture capital was where semiconductor culture cast its longest shadow.
The Valley’s original venture capitalists were a tight-knit bunch, mostly young men managing older, much richer men’s money. At first there were so few of them that they’d book a table at a San Francisco restaurant, summoning founders to pitch everyone at once. So many opportunities were flowing it didn’t much matter if a deal went to someone else. Charter members like Silicon Valley venture capitalist Reid Dennis called it “The Group.” Other observers, like journalist John W. Wilson, called it “The Boys Club.”
The venture business was expanding by the early 1970s, even though down markets made it a terrible time to raise money. But the firms founded and led by semiconductor veterans during this period became industry-defining ones. Gene Kleiner left Fairchild Semiconductor to cofound Kleiner Perkins, whose long list of hits included Genentech, Sun Microsystems, AOL, Google, and Amazon. Master intimidator Don Valentine founded Sequoia Capital, making early-stage investments in Atari and Apple, and later in Cisco, Google, Instagram, Airbnb, and many others.
Generations: “Pattern recognition”
Silicon Valley venture capitalists left their mark not only by choosing whom to invest in, but by advising and shaping the business sensibility of those they funded. They were more than bankers. They were mentors, professors, and father figures to young, inexperienced men who often knew a lot about technology and nothing about how to start and grow a business.
“This model of one generation succeeding and then turning around to offer the next generation of entrepreneurs financial support and managerial expertise,” Silicon Valley historian Leslie Berlin writes, “is one of the most important and under-recognized secrets to Silicon Valley’s ongoing success.” Tech leaders agree with Berlin’s assessment. Apple cofounder Steve Jobs—who learned most of what he knew about business from the men of the semiconductor industry—likened it to passing a baton in a relay race.
Predicting the climate bill’s effects is harder than you might think
Human decision-making can also cause models and reality to misalign. “People don’t necessarily always do what is, on paper, the most economic,” says Robbie Orvis, who leads the energy policy solutions program at Energy Innovation.
This is a common issue for consumer tax credits, like those for electric vehicles or home energy efficiency upgrades. Often people don’t have the information or funds needed to take advantage of tax credits.
Likewise, there are no assurances that credits in the power sector will have the impact that modelers expect. Finding sites for new power projects and getting permits for them can be challenging, potentially derailing progress. Some of this friction is factored into the models, Orvis says. But there’s still potential for more challenges than modelers expect.
Putting too much stock in results from models can be problematic, says James Bushnell, an economist at the University of California, Davis. For one thing, models could overestimate how much of the behavior change is attributable to tax credits. Some of the projects claiming tax credits would probably have been built anyway, Bushnell says, especially solar and wind installations, which are already becoming more widespread and cheaper to build.
Still, whether or not the bill meets the modelers’ expectations, it’s a step forward in providing climate-friendly incentives: it replaces solar- and wind-specific credits with broader clean-energy credits, giving developers more flexibility in choosing which technologies to deploy.
Another positive of the legislation is all its long-term investments, whose potential impacts aren’t fully captured in the economic models. The bill includes money for research and development of new technologies like direct air capture and clean hydrogen, which are still unproven but could have major impacts on emissions in the coming decades if they prove to be efficient and practical.
Whatever the effectiveness of the Inflation Reduction Act, however, it’s clear that more climate action will be needed to meet emissions goals in 2030 and beyond. Even if the modelers’ predictions are correct, the bill alone will not be enough for the US to meet its stated goal under the Paris agreement of cutting emissions to half of 2005 levels by 2030.
The path ahead for US climate action isn’t as certain as some might wish it were. But with the Inflation Reduction Act, the country has taken a big step. Exactly how big is still an open question.
China has censored a top health information platform
The suspension has been met with glee by nationalist bloggers on social media, who accuse DXY of receiving foreign funding, bashing traditional Chinese medicine, and criticizing China’s health-care system.
DXY is one of the front-runners in China’s digital health startup scene. It hosts the largest online community Chinese doctors use to discuss professional topics and socialize. It also provides a medical news service for a general audience, and it is widely seen as the most influential popular science publication in health care.
“I think no one, as long as they are somewhat related to the medical profession, doesn’t follow these accounts [of DXY],” says Zhao Yingxi, a global health researcher and PhD candidate at Oxford University, who says he followed DXY’s accounts on WeChat too.
But in the increasingly polarized social media environment in China, health care is becoming a target for controversy. The swift conclusion that DXY’s demise was triggered by its foreign ties and critical work illustrates how politicized health topics have become.
Since its launch in 2000, DXY has raised five rounds of funding from prominent companies like Tencent and from venture capital firms. But even that commercial success has caused it trouble this week. One of its major investors, Trustbridge Partners, raises funds from sources like Columbia University’s endowment and Temasek, Singapore’s state holding company. After DXY’s accounts were suspended, bloggers used that fact to try to back up their claim that DXY has been under foreign influence all along.
Part of the reason the suspension is so shocking is that DXY is widely seen as one of the most trusted online sources for health education in China. During the early days of the covid-19 pandemic, it compiled case numbers and published a case map that was updated every day, becoming the go-to source for Chinese people seeking to follow covid trends in the country. DXY also made its name by taking down several high-profile fraudulent health products in China.
It also hasn’t shied away from sensitive issues. For example, on the International Day Against Homophobia, Transphobia, and Biphobia in 2019, it published the accounts of several victims of conversion therapy and argued that the practice is not backed by medical consensus.
“The article put survivors’ voices front and center and didn’t tiptoe around the disturbing reality that conversion therapy is still prevalent and even pushed by highly ranked public hospitals and academics,” says Darius Longarino, a senior fellow at Yale Law School’s Paul Tsai China Center.