For the past several years, economists and government leaders have regularly sounded alarms about the dangers of big tech monopolies. On her 2020 campaign website, for example, Senator Elizabeth Warren said “big tech companies have too much power, too much power over our economy, our society, our democracy.” In the months since the election, politicians on both the left and right have expressed concerns over how to encourage competition and innovation among the big tech leaders, and even how to hold onto democratic ideals in the face of digital misinformation and conspiracy theories.
The challenge with a company like Facebook is that its business model actively encourages tribalism and anger, which is not the way markets usually work, says Paul Romer, an economics professor at New York University who previously served as the chief economist of The World Bank and was the co-recipient of the 2018 Nobel Prize in Economic Sciences. “When economists defend the market, we have this very simple idea in mind, where I as a buyer give something and get some good back,” he says. “None of those features are characteristic of this new market for digital services, where advertising is like the hidden method of capturing compensation for these firms.”
Users, he says, “are being manipulated in ways that they don’t fully understand.”
Regulators won’t work because big tech firms are too powerful, Romer maintains, while traditional antitrust laws are not well-suited to deal with this problem. But a progressive tax on digital advertising revenue, passed by state legislatures, could create a unique incentive for companies such as Google and Facebook to split up their businesses and discourage growth by acquisition.
Such a progressive tax model, however, needs to be aggressive: “The kind of tax that I think would create a big incentive to change at, say, Google and Facebook, the two biggest firms in this market, has to be a tax where the average tax rate they pay right now, given their size, is 35% of their revenue.”
Show notes and links:
“Taxing Digital Advertising,” Paul Romer, May 1, 2021
“Maryland Breaks Ground with Digital Advertising Tax,” National Law Review, March 17, 2021
“Once Tech’s Favorite Economist, Now a Thorn in Its Side,” Steve Lohr, New York Times, May 20, 2021
Laurel Ruma: I’m Laurel Ruma from MIT Technology Review and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. Our topic today is taxing digital advertising. Can taxes specifically aimed at breaking up big tech be levied to encourage competition, innovation, and help democracy? The five largest tech companies, Facebook, Amazon, Apple, Alphabet/Google and Microsoft are worth a combined $7 trillion. What economic efficiencies can be gained in the fight for fairness? Two words for you: Rethinking capitalism.
My guest is Paul Romer, an economics professor at New York University who served as the chief economist of The World Bank. Paul was the co-recipient of the 2018 Nobel Prize in Economic Sciences for his work integrating technological innovations into long-run macroeconomic analysis. For the first time, this integrated ideas and innovation into economic models and clarified the societal benefits that are possible when people come together to collaborate in new ways.
This episode of Business Lab is produced in association with Omidyar Network.
Welcome to The Business Lab, Paul.
Paul Romer: It’s good to be here.
Laurel: United States Senator Elizabeth Warren said, and I quote, “Big tech companies have too much power, too much power over our economy, our society, our democracy.” What is the danger of monopolies, of these large powerful companies?
Paul: That’s a well-crafted sentence by Senator Warren because it ends on the most important point. The real danger here is the threat to our democracy. The second most important one is the threat to the social fabric that determines our quality of life. One of the problems with economics and the way it has approached antitrust is that it has neglected those two issues and focused on very narrow questions: Are firms charging too much for some service? And does that mean that some people aren’t using as much of it as they could? But that captures only a small fraction of the damage that’s being done by having firms that are so large, and firms that are using a particular business model, this model based on targeted digital advertising, which has created so many bad incentives, and which creates such unusual risks for our democratic system.
Laurel: What are some of those risks?
Paul: The nature of the advertising model is that these firms want to keep people engaged watching the screen, so that they see more ads. Facebook discovered, and their research has been published on this, that if they could create more contention, more animosity, more anger, people would stay engaged for a longer period of time. And so we’ve got a business model which is actively encouraging some of the most damaging sides of human nature, this tribalism, this anger, this tendency to treat your opponent as an enemy who’s almost inhuman. So this is not the way markets usually work. When economists defend the market, we have this very simple idea in mind, where I as a buyer give something, I give money to a seller. I get some good back. And then if I don’t like what I get back, I can take my business elsewhere. None of those features are characteristic of this new market for digital services, where advertising is like the hidden method of capturing compensation for these firms. And users are being manipulated in ways that they don’t fully understand.
Laurel: So what kind of regulatory actions could have or should have been taken to confront the growth of some of these enormous companies?
Paul: To be honest, push back if you don’t like this answer, but I tend to like to look forward. We could look at decisions that we made in the past that were a mistake. But I think the really important ones are: What should we do now?
Laurel: To go ahead and challenge that, is it something that needs to be looked at perhaps more frequently? I mean, do we have to wait until something really bad happens, until an election is almost overthrown?
Paul: Well, I will say I think we’ve been negligent. Economists and people who shape opinion, people who worry about policy, I think we’re guilty of gross negligence in letting this problem fester and become so bad. So I think it’s very clear to me that we need to do something to stop the trajectory that we’re on. And I think it’s a huge mistake on all of our parts that we didn’t act sooner. But the real question is: What do we do now?
Laurel: There’s two issues here, right? One is the way that these enormous companies make the money, and then the enormity of these enormous companies.
Paul: Well, of those two, I think this business model, based on targeted digital advertising, has created these enormous incentives for spying on people and collecting information. A few years ago, I started saying that these firms know more about me than the Stasi knew about people in East Germany. And that was kind of like a controversial thing to say back then. Now everybody just accepts that. They think this is just the inevitable consequence of the market and technology. But they’ve lost the outrage, and they’ve lost the sense of how dangerous it is to let any small group of people have that much information that they can use to manipulate us.
Laurel: We’ve fallen into this trap of thinking, “Well, we use these services for free, so giving them a little bit of my data, I’m okay with.” But that’s not really what we’re talking about anymore, is it?
Paul: I think this one is a tricky one because by and large, the cost from, say, each person letting these companies have all this information is not something that each individual bears. It’s really a cost to society, so letting them have information from all of us means that they have enormous monopoly power. They can collect enormous returns and accumulate this enormous amount of wealth that you described. But it also gives them the ability to, for example, display targeted political ads, where one demographic group is being shown a message from one candidate that the rest of us never see. And those ads, just like the strategy for engagement, those ads often appeal to animosity, tribalism, anger. Again, we’re using advertising to enhance, to develop the worst side of human nature. And you don’t have to look very far in history to see how bad things can turn out when you amplify and normalize this very ugly, angry side of our instincts about us versus them.
Laurel: A slight shift: It seems as soon as we as a society identify something as too big to fail, it fails, causing unknown and often catastrophic outcomes. I’m thinking of Boeing as an example. So what do you think about Boeing and how large it’s become and what that actually means?
Paul: After the 2008 financial crisis, I wrote a paper saying that the FAA, combined with the NTSB, the National Transportation Safety Board, those two agencies were the gold standard for regulation. We should be trying to have a similar kind of structure for regulating financial markets. Well, fast forward a decade and a half, what’s happened is that Boeing, as this concentrated interest, was able to work through the Congress and cite the messages from economists about how regulation slows down innovation. And Boeing managed to eviscerate what used to be this very effective regulatory system at the FAA with some oversight by the NTSB.
And then Boeing, as a result, because there was no regulatory oversight, built this real kludge of an airplane that turned out to be incredibly dangerous and killed people. So it’s a story of the erosion of regulatory capacity that was achieved through pretty straightforward means, for example, just cutting or limiting the budget at the FAA, so they couldn’t hire enough people to do the job they were assigned to do, which was to regulate Boeing. So this was a case where, by undercutting the regulation, Boeing hurt its workers, hurt its shareholders, and killed people. It was a really terrible turn of events, but I think it’s a caution for us, because some people, Facebook among them, are saying, “Well, let’s just have some regulators that regulate the tech firms.”
What the Boeing episode tells us is that a firm that’s strong enough can actually corrupt and eviscerate any regulatory system, and can often capture those regulators. So I’m very pessimistic that any regulatory body can actually rein in and control these firms. And of course, I think that’s why Facebook is advocating for regulation because they know that’s the measure that would leave them in the strongest position. So when I started thinking, well, what can we do about these firms? I started from the very beginning and said, “We’ve got a system with checks and balances, with a kind of executive branch, where regulators sit. You’ve got the judiciary that hears antitrust cases. And you’ve got the legislature.” Which of these three systems is the one to use to try and deal with the problems that we’re facing?
I concluded that I think regulators would just not work because the firms we’re dealing with are already way too powerful. And I also, this is a separate point that we could explore, but I also think that the judiciary and antitrust, traditional antitrust laws, are not well suited to dealing with this problem. So the way forward, it seemed to me, was for us as voters to say to our legislators, “We don’t want to live in a society like this, where a few individuals have so much power, and where they’re using that power to kind of undermine the quality of social life and threaten our democracy.” So if we said that to our legislators, we’d tell the legislators, “Pass a law that stops this bad behavior.” And then the tax that I proposed was a measure that legislatures could pass that could do a lot to solve the problems that we’re facing.
Laurel: Let’s talk a little bit about that. You mentioned a progressive tax on advertising. How would that work?
Paul: When you impose a tax, you have to anticipate that people will do things to avoid paying tax. So I designed a tax where the things they would do to try and avoid paying tax are exactly the things we want them to do. So we want this tax to be progressive. The bigger the total advertising revenue the firm collects, the higher the tax rate. So if one of these firms splits itself in two, like if Facebook were to spin Instagram out, the total tax bill for the two firms would be smaller when they’re separate compared to when it’s part of one combined entity. So the progressivity in the tax encourages split ups, spin outs. It discourages growth by acquisition.
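The split-up incentive that progressivity creates can be sketched with a few lines of arithmetic. The bracket thresholds and rates below are purely illustrative assumptions, not the Maryland schedule or Romer’s actual federal proposal; the point is only that under any schedule with rising marginal rates, two half-size firms owe less in total than one combined firm.

```python
# Hypothetical marginal brackets: (upper bound of bracket in $B, marginal rate).
# Rates rise with ad revenue, as in a progressive schedule; the numbers are made up.
BRACKETS = [(10, 0.00), (30, 0.10), (60, 0.30), (float("inf"), 0.60)]

def ad_tax(revenue_b):
    """Tax owed (in $B) on digital ad revenue under the progressive schedule."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if revenue_b <= lower:
            break
        # Tax only the slice of revenue that falls inside this bracket.
        tax += (min(revenue_b, upper) - lower) * rate
        lower = upper
    return tax

combined = ad_tax(100)   # one firm with $100B in ad revenue
split = 2 * ad_tax(50)   # the same revenue divided across two separate firms
assert split < combined  # progressivity makes the split-up cheaper
```

With these made-up numbers, a single firm with $100 billion in ad revenue owes $35 billion (a 35% average rate, even though no bracket exceeds 60%), while two independent $50 billion firms owe $8 billion each, so splitting up cuts the total bill by more than half.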
The other thing is that I suggested it be a tax imposed on revenue from digital advertising. So if these firms don’t want to pay this tax, they could shift to a subscription model, the kind of model that Netflix uses, or a service like Duolingo uses, so that people actually pay something to get access to some valuable service. So you can do this, but this tax has to be big enough to create a real stick that if you don’t do something to change, you’re going to pay a lot of tax to the government if you stick with this very damaging model.
Laurel: I was absolutely captivated by this model and the fact that it’s real in the US state of Maryland. The state legislature passed legislation, Senate Bill Two, to create an advertising tax on tech companies, and it works like this: a tax of between 2.5% and 10% would be applied to digital ad sales attributed to IP addresses in the state of Maryland. And that would be a huge amount of money raised, something like $250 million annually. So you were part of that effort to really push this through the legislature. What did you say in your testimony to support this idea?
Paul: Just to kind of just recap where we are, they’ve actually passed this bill. The governor vetoed it at the end of last year, but the legislature overrode the veto, so this bill is now law in Maryland. It is going to be challenged by these tech companies, usually operating through some front organizations that they’ll use to challenge it in court. So we have some ways to go in this fight, the fight’s not over. But the message I gave to the legislators, I mean first, I wrote an op-ed in the New York Times, which is what somebody there read and then reached out to me about pursuing this idea. They were interested in this partly because they had made a commitment to significantly improve their educational system and they were looking for sources of revenue.
But they also understood the problems with big tech, and understood the appeal of going after a tax which actually is targeting harmful behavior. To set expectations, I think there’s a chance that the current bill will be overturned in court. There’s going to be a lot of legal resources that are deployed to try and fight this. And one of the things I told the legislators in private is just expect that the first bill might be overturned. Watch and see what this really somewhat politicized federal judiciary is going to say is wrong with the bill, and be ready to pass a new version that avoids the problems that they complain about. So this is a longer term battle plan we have to have, and we shouldn’t be worried about setbacks along the way.
The other point I made to them was that most taxes discourage good things. If you imposed a tax on going to school, fewer people would go to school. That’d be a bad tax. But this is a tax that discourages a bad thing, and that’s the best kind of tax to pursue when you need revenue. I liken it to my co-recipient for the prize, Bill Nordhaus’s idea of a tax on carbon emissions, which has the same motivation: to stop people from doing something that is very harmful for all of us.
The other thing is that the tax rates that they thought were politically feasible in Maryland are frankly too low to make much difference for these tech firms. Even if every state in the United States, or the federal government adopted a tax at the rates that they’re looking at, progressive from 0%, to 2%, to 10%, this would be kind of small change for these tech companies. So I have a new proposal that I’m about to launch for the national government, where we impose taxes that get much higher and which I think really are strong enough to change behavior in these tech firms. And one other thing we might want to talk about is why it’s so important to tax revenue rather than corporate income because the corporate income tax is a deeply flawed and failing way to try and tax corporations.
Laurel: That seems to be an issue in the United States that’s coming up more and more, as companies look for creative ways to avoid paying on those corporate revenue numbers.
Paul: It’s really a losing battle because conceptually, income is the difference between revenue and cost. Revenue and cost are incurred in different places, so you can’t say, “Where is income earned?” That creates at this level of principle, I mean, forget about how hard it is to get the information you need to impose this tax. Even if you had all the information you wanted, reasonable people can differ about where income is earned because it’s a difference in two things. That creates all this opportunity for firms to shift the legal location for income and to move income to these low tax jurisdictions, so you get this race to the bottom, different jurisdictions are competing by offering lower and lower corporate tax rates.
Some people think you can patch this and try and limit this behavior. I think you’re just fighting a losing battle, and we really need to switch to something like taxing revenue because we know where revenue is collected. We know that there are ads that these firms get paid to serve up, that are shown to people in Maryland, or in Massachusetts, or California. And so this empowers each of those states to tax revenue that is incurred in those states. And they don’t face this issue of a race to the bottom.
Laurel: We’re increasing taxes, but we’re doing it for a good reason because education needs more money. We’re also doing it because these large companies aren’t paying their fair share. 10% may sound like a large number, but not when you’re talking about hundreds of billions of dollars. But this is a start. Right? So the Omidyar Network is looking at how you actually implement various policy ideas to rebalance this inequity in the data economy. This is one solution. Can you think of others? Are you looking at others?
Paul: It’s important to emphasize that this will not address all of the issues we face associated with firms that are so large and so powerful. Apple, for example, does not capture much revenue through advertising, and it’s got a very strong market position that people may want to think about other measures that might limit its power. I frankly am not as worried about Apple because Apple isn’t destroying our democracy and undermining the quality of life. But there are traditional reasons why you might not want firms that are so powerful.
Amazon, for example, is now collecting a growing share of its revenue through advertising, but it also had very strong positions in just being the platform for matching buyers and sellers. So it would still be a very powerful force, even if it just abandoned digital advertising revenue. So in both of these cases, there’s room to think about other measures that could deal with the traditional problems of firms that are too large. In terms of the specific measures that one could employ, the one part of antitrust law that’s been significantly underutilized and should be brought back is merger review. It should be much harder for one of these dominant firms to acquire a new firm that could potentially grow into a competitor, such as the Facebook purchase of Instagram or WhatsApp.
In a properly functioning system, those mergers and acquisitions should not have been allowed, so that’s an easy thing to do. The part of antitrust which I think is just doomed is trying to bring a lawsuit and charge them with committing a crime, and then get a judge to agree to break them up based on the “crime” they’ve committed. This is a very crude way to try and limit size, and it puts judges in a position which is really untenable for them. It is a very complicated type of penalty to impose, and so the tendency, even in cases where there’s a clearly demonstrated violation of the antitrust law, as there was with Microsoft, has been for the appeals courts to overturn the breakup remedy that the Justice Department had proposed.
And to be clear, I worked with the Justice Department in crafting this remedy. The appeals courts refused to implement something that they felt was so aggressive and so intrusive. And I think that’s the problem we’ll face with any lawsuit that tries to now force Facebook to spin out Instagram. So the only way I see to get those two things separate now is to create a very strong incentive, so that they’ll save $10 billion a year in taxes if they split it into two companies instead of running it as one company.
Laurel: So perhaps we should get down into these details about a progressive tax on advertising. If that is one possible lever, how does that progressive tax work? And would it necessarily be federal, or could it be state by state, by municipality?
Paul: I think that it could be either. And this is why it’s so important to pick revenue because different jurisdictions could make their own decisions on this. This has implications internationally as well. The US could decide how much it wants to tax ad revenue, but Canada could make its own decision on that. Germany and France could make their own decisions. So we want to empower all of these different jurisdictions to make their own decisions in response to the wishes of their citizens and voters. So we want to get away from a system where you have to have these international tax treaties where everybody’s agreeing to do the same thing to have the tax system work, and that’s really where we are with the corporate income tax.
But in terms of the level of taxation, I want to be clear about this. The kind of tax that I think would create a big incentive to change at, say, Google and Facebook, the two biggest firms in this market, I think this has to be a tax where the average tax rate they pay right now, given their size, is on the order of 35%. So 35% of their revenue would be collected by the government if they don’t change, if they just stick with business as usual. And to get to an average tax rate, if your tax rate is kind of gradually increasing as you come up, you start with a big bracket where there’s no tax at all, and then it’s a 5% tax, 10% tax. To get an average tax rate of 35%, you need to have marginal tax rates, like the tax on the highest bracket of revenue. You need marginal tax rates that are 50%, 60%, even approaching 70%.
So this needs to be a very aggressive tax. People will scream like stuck pigs when I go public, as I guess I’m doing right now about what these tax rates need to be. But there’s a couple of easy ways to respond to this. I mean, one is, these companies will say, “If you took 30% or 40% of our revenue, you would kill us.” Well, that’s actually not true–30% or 40% of their revenue would just move them back to what they were earning in 2019, 2020. They’ve experienced enormous growth. Everybody thought they were viable in 2018, 2019, 2020, so it can’t be true that you take away 30% of their revenue, suddenly revenue that was great three years ago is now impossible to live on in this new model. And of course, this is because their costs are mainly fixed costs. They can just scale up how many of these ads they serve up without incurring a lot more cost.
So they could certainly be viable if they had to pay 30%, 40% of their revenue to the government. And this would actually collect a reasonable amount of revenue that could be used, say, to finance the infrastructure bill: $50 billion, $60 billion and growing per year in tax revenue. The other thing about a tax that is aggressive is that a firm at the scale of Google or Facebook might pay $12 billion to $15 billion in tax a year. If they split themselves in half, that would go down dramatically, maybe from $12 billion to $6 billion, or from $15 billion to $6 billion. And if they split themselves into four pieces, the total tax bill across all of the surviving firms could be as low as $2 billion.
And the reason to be so aggressive about this is that if these companies scream as they will, the answer is just, listen, guys, if you don’t want to pay the tax, just switch to a subscription model. Just don’t use the ads. Or if you don’t want to pay the tax, just split yourself up into independent companies. So I think we have to be ready to tolerate and remain firm in the face of these screams of outrage about high marginal tax rates and just insist that, listen, we are the citizens in this country. And in a democracy, we get to decide what kind of society we’re going to live in. And we don’t want to live in a society that lets you continue to do what you’re doing right now.
Laurel: And those are certainly unique characteristics of the data economy. So we now have these issues of: How do we reduce disinformation? How do we increase privacy? Rebalancing the wealth and reducing the economic dependency on these large firms, to think that you could break up one of them into four different companies and still have each one be worth $2 billion at least is quite something else.
Paul: Worth probably, I don’t know, $25 billion or more. But they’d collectively still be paying $2 billion a year, say, in tax.
Laurel: I’m sorry. You’re correct. Thank you.
Paul: There’s a movie I like, Chinatown, with Jack Nicholson, where at the very end of the movie something terrible happens to an innocent woman who’s killed. And Nicholson is devastated. And some friend says to him, “Forget it, Jake. It’s Chinatown.” The message is, you can’t do anything. This is so complicated. The forces you’re fighting are so powerful. You can’t do anything about this. Well, this is kind of the message economists have been sending for decades now. It’s the market, forget it. It’s the market. You can’t control what the market does. If you’ve got these firms that are now dominating political advertising, forget about it. Forget it. You can’t do anything.
That’s just so false. As citizens, we can decide we don’t want them to have that kind of power in our markets for political advertising. We don’t want all of these secret targeted ads that are inflaming the passions. And so the economists need to stop encouraging this learned helplessness amongst the citizenry, and we need to be saying, “It is up to us to decide what kind of a society we want to live in.” And if we make a decision, we get our legislators to make a change.
And by the way, I think that despite the polarization we’re seeing right now, this issue might be one where you could attract some attention from both the left and the right because the right has been keenly aware of the enormous power, say, that Mark Zuckerberg possesses, or Jack Dorsey possesses at Twitter. And so they are now kind of shifting away from their usual defense of, well, it’s the market, so it must be good, and recognizing, no, there’s some aspects of this market equilibrium that we think are really bad, that are kind of inconsistent with the principles of freedom and free speech that this country was founded on. So I’m mildly optimistic that this is something where we could reach some kind of a consensus and actually do something.
Laurel: Speaking of representation, on which America is founded, there have been rumblings in Congress holding these firms accountable. Are you hopeful that might actually happen?
Paul: Well, I think those rumblings have been somewhat useful in raising attention. But they’re mostly, so far at least, theater. There’s really no consensus around an agenda for what we could do. There are people, like Senator Warren and Senator Warner, who’ve been thinking about measures we could adopt. But there’s been no coalescing around some practical measure. So we need to move beyond these showpieces, where we express outrage and watch these executives squirm. We need to get to the point where we actually do something that will make a difference.
Laurel: And what a great call to action that is. Thank you, Paul, for joining us today on The Business Lab.
Paul: Thank you. This is the first time I’ve actually told people, no, I mean marginal tax rates as high as 65%, 75%, so you may get some animated responses when this goes live. But people should also go look at my blog because I’ll actually have analytics behind this available on my blog. And anybody who’s interested can learn more there.
Laurel: That was Paul Romer, Nobel Prize-winning economist and professor at New York University, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review, overlooking the Charles River. That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology. And you can find us in print, on the web, and at dozens of events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.
This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Collective Next. Thanks for listening.
This podcast episode was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
How do I know if egg freezing is for me?
The tool is currently being trialed in a group of research volunteers and is not yet widely available. But I’m hoping it represents a move toward more transparency and openness about the real costs and benefits of egg freezing. Yes, it is a remarkable technology that can help people become parents. But it might not be the best option for everyone.
Read more from Tech Review’s archive
Anna Louie Sussman had her eggs frozen in Italy and Spain because services in New York were too expensive. Luckily, there are specialized couriers ready to take frozen sex cells on international journeys, she wrote.
Michele Harrison was 41 when she froze 21 of her eggs. By the time she wanted to use them, two years later, only one was viable. Although she did have a baby, her case demonstrates that egg freezing is no guarantee of parenthood, wrote Bonnie Rochman.
What happens if someone dies with eggs in storage? Frozen eggs and sperm can still be used to create new life, but it’s tricky to work out who can make the decision, as I wrote in a previous edition of The Checkup.
Meanwhile, the race is on to create lab-made eggs and sperm. These cells, which might be made from a person’s blood or skin cells, could potentially solve a lot of fertility problems—should they ever prove safe, as I wrote in a feature for last year’s magazine issue on gender.
Researchers are also working on ways to mature eggs from transgender men in the lab, which could allow them to store and use their eggs without having to pause gender-affirming medical care or go through other potentially distressing procedures, as I wrote last year.
From around the web
The World Health Organization is set to decide whether covid still represents a “public health emergency of international concern.” It will probably decide to keep this status, because of the current outbreak in China. (STAT)
Researchers want to study the brains, genes, and other biological features of incarcerated people to find ways to stop them from reoffending. Others warn that this approach is based on shoddy science and racist ideas. (Undark)
A watermark for chatbots can expose text written by an AI
For example, since OpenAI’s chatbot ChatGPT was launched in November, students have already started cheating by using it to write essays for them. News website CNET has used ChatGPT to write articles, only to have to issue corrections amid accusations of plagiarism. Building the watermarking approach into such systems before they’re released could help address such problems.
In studies, these watermarks have already been used to identify AI-generated text with near certainty. Researchers at the University of Maryland, for example, were able to spot text created by Meta’s open-source language model, OPT-6.7B, using a detection algorithm they built. The work is described in a paper that’s yet to be peer-reviewed, and the code will be available for free around February 15.
AI language models work by predicting and generating one word at a time. After each word, the watermarking algorithm randomly divides the language model’s vocabulary into words on a “greenlist” and a “redlist” and then nudges the model to choose words from the greenlist.
The more greenlisted words in a passage, the more likely it is that the text was generated by a machine. Text written by a person tends to contain a more random mix of words. For example, for the word “beautiful,” the watermarking algorithm could classify the word “flower” as green and “orchid” as red. The AI model with the watermarking algorithm would be more likely to use the word “flower” than “orchid,” explains Tom Goldstein, an assistant professor at the University of Maryland, who was involved in the research.
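The greenlist mechanism described above can be sketched in a few lines of Python. This is a minimal illustration, not the researchers’ actual implementation: the hash-based seeding rule, the 50/50 split, and the function names are all assumptions made for the example. The key ideas it captures are that each greenlist is derived deterministically from the previous word (so a detector can re-create it without access to the model) and that detection just counts how often the text lands on the greenlist.

```python
import hashlib
import random


def green_list(prev_token: str, vocab: list[str], fraction: float = 0.5) -> set[str]:
    """Derive a deterministic 'greenlist' from the previous token.

    Seeding a local RNG with a hash of the previous word means the
    generator (which nudges sampling toward this set) and the detector
    (which re-derives it later) always agree on the same split.
    """
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16) % (2**32)
    rng = random.Random(seed)
    shuffled = vocab[:]
    rng.shuffle(shuffled)
    return set(shuffled[: int(len(shuffled) * fraction)])


def green_fraction(tokens: list[str], vocab: list[str]) -> float:
    """Detection side: count how many words fall on their greenlist.

    Unwatermarked human text should hover near the split fraction (0.5
    here); watermarked machine text scores well above it.
    """
    hits = sum(
        1
        for prev, cur in zip(tokens, tokens[1:])
        if cur in green_list(prev, vocab)
    )
    return hits / max(len(tokens) - 1, 1)
```

A detector would flag a passage whose green fraction is statistically too high to have occurred by chance; the real system works over the model’s full token vocabulary rather than a toy word list.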
The Download: watermarking AI text, and freezing eggs
That’s why the team behind a new decision-making tool hopes it will help to clear up some of the misconceptions around the procedure—and give would-be parents a much-needed insight into its real costs, benefits, and potential pitfalls. Read the full story.
This story is from The Checkup, MIT Technology Review’s weekly newsletter giving you the inside track on all things health and biotech. Sign up to receive it in your inbox every Thursday.
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Elon Musk held a surprise meeting with US political leaders
Allegedly in the interest of ensuring Twitter is “fair to both parties.” (Insider $)
+ Kanye West’s presidential campaign advisors have been booted off Twitter. (Rolling Stone $)
+ Twitter’s trust and safety head is Musk’s biggest champion. (Bloomberg $)
2 We’re treating covid like flu now
Annual covid shots are the next logical step. (The Atlantic $)
3 The worst thing about Sam Bankman-Fried’s spell in jail?
Being cut off from the internet. (Forbes $)
+ Most crypto criminals use just five exchanges. (Wired $)
+ Collapsed crypto firm FTX has objected to a new investigation request. (Reuters)
4 Israel’s tech sector is rising up against its government
Tech workers fear its hardline policies will harm startups. (FT $)
5 It’s possible to power the world solely using renewable energy
At least, according to Stanford academic Mark Jacobson. (The Guardian)
+ Tech bros love the environment these days. (Slate $)
+ How new versions of solar, wind, and batteries could help the grid. (MIT Technology Review)
6 Generative AI is wildly expensive to run
And that’s why promising startups like OpenAI need to hitch their wagons to the likes of Microsoft. (Bloomberg $)
+ How Microsoft benefits from the ChatGPT hype. (Vox)
+ BuzzFeed is planning to make quizzes supercharged by OpenAI. (WSJ $)
+ Generative AI is changing everything. But what’s left when the hype is gone? (MIT Technology Review)
7 It’s hard not to blame self-driving cars for accidents
Even when it’s not technically their fault. (WSJ $)
8 What it’s like to swap Google for TikTok
It’s great for food suggestions and hacks, but hopeless for anything work-related. (Wired $)
+ The platform really wants to stay operational in the US. (Vox)
+ TikTok is mired in an eyelash controversy. (Rolling Stone $)
9 CRISPR gene editing kits are available to buy online
But there’s no guarantee these experiments will actually work. (Motherboard)
+ Next up for CRISPR: Gene editing for the masses? (MIT Technology Review)
10 Tech workers are livestreaming their layoffs
It’s a candid window into how these notoriously secretive companies treat their staff. (The Information $)