What most of the posts share is a sense of desperation. As WeChat has grown to be the super app that’s used in almost all aspects of life, having your primary account banned can be devastating. The Weibo posts describe how having their WeChat accounts banned made it difficult for people to get messages from colleagues, potential employers, or family members. Some write they are now on the brink of depression.
Meanwhile, Tencent’s customer-service Weibo accounts posted only robotic responses under these posts, asking users to provide more information. Two Weibo users told MIT Technology Review that posting under the hashtag didn’t help their appeals process at all.
Life after WeChat
Being banned from WeChat turns you into a ghost on the ubiquitous platform. “After losing WeChat, it feels like you lost connection to the world,” says Chen. “Even though you can still log into your WeChat account, read the messages others sent you and the group messages, and make digital payments, you can’t interact with them or reply to them.”
WeChat started allowing banned users to export their contacts in 2020, so if they choose to register a new account and start over, they can add their friends back one by one. But for most WeChat users who have had the app for over a decade, this means adding thousands of contacts manually and explaining to them what they did to trigger the ban.
Chen used his old account for 11 years and had over 1,400 contacts. It took him several hours to add back 500 contacts from his backup account. “When I was adding contacts back, I was questioned if I was a scammer and the person called me to confirm. If I don’t have this person’s number or other confirmation methods, maybe they will straightaway refuse to befriend me,” Chen says. Then there are also the subscriptions, bookmarked content, public accounts he follows, and all the other information tied to his WeChat account that he needs to migrate.
On Friday, after the discussion of the protest had ebbed, many WeChat users were discovering which of their friends had been banned, or helping those friends spread their new WeChat handles. A 2020 article that offered a helpful checklist on what to do after being banned by WeChat gained at least 70,000 views overnight.
News of the suspensions clearly had a chilling effect too, as people weighed whether to talk about the protest now that it was clear doing so could get their accounts banned. By holding people’s access to digital services hostage, the government was able to obstruct the spread of information and increase its control.
Not everyone is willing to become a hostage. While Tina has heard about the posts on Weibo begging Tencent for help, that’s not what she wants to do. She understands the severity of political censorship and doesn’t believe posting will help.
So far, she has only told her close contacts about what happened and plans to try living her life without a WeChat account, at least for a little while. She has always felt she spent too much time on social media apps anyway—maybe this forced leave could be a detox experience.
“Many people were registering their second accounts yesterday. But I told them I won’t. I want to give it a try. If, let’s say, I can still live my life normally without WeChat, I think I can choose not to register another account,” she says. “I don’t think an individual should be bound so close with [WeChat] together.”
The Download: metaverse fashion, and looser covid rules in China
Fashion creator Jenni Svoboda is designing a beanie with a melted cupcake top, sprinkles, and doughnuts for ears. But this outlandish accessory isn’t destined for the physical world—Svoboda is designing for the metaverse. She’s working in a burgeoning, if bizarre, new niche: fashion stylists who create or curate outfits for people in virtual spaces.
Metaverse stylists are increasingly sought-after as frequent users seek help dressing their avatars—often in experimental, wildly creative looks that defy personal expectations, societal standards, and sometimes even physics.
Stylists like Svoboda are among those shaping the metaverse fashion industry, which is already generating hundreds of millions of dollars. But while, to the casual observer, it can seem outlandish and even obscene to spend so much money on virtual clothes, there are deeper, more personal reasons why people are hiring professionals to curate their virtual outfits. Read the full story.
Making sense of the changes to China’s zero-covid policy
On December 1, 2019, the first known covid-19 patient started showing symptoms in Wuhan. Three years later, China is the last country in the world holding on to strict pandemic control restrictions. However, after days of intense protests that shocked the world, it looks as if things could finally change.
Beijing has just announced wide-ranging relaxations of its zero covid policy, including allowing people to quarantine at home instead of in special facilities for the first time.
Uber’s facial recognition is locking Indian drivers out of their accounts
Uber checks that a driver’s face matches what the company has on file through a program called “Real-Time ID Check.” It was rolled out in the US in 2016, in India in 2017, and then in other markets. “This prevents fraud and protects drivers’ accounts from being compromised. It also protects riders by building another layer of accountability into the app to ensure the right person is behind the wheel,” Joe Sullivan, Uber’s chief security officer, said in a statement in 2017.
But the company’s driver verification procedures are far from seamless. Adnan Taqi, an Uber driver in Mumbai, ran into trouble with it when the app prompted him to take a selfie around dusk. He was locked out for 48 hours, a big dent in his work schedule—he says he drives 18 hours straight, sometimes as much as 24 hours, to be able to make a living. Days later, he took a selfie that locked him out of his account again, this time for a whole week. That time, Taqi suspects, it came down to hair: “I hadn’t shaved for a few days and my hair had also grown out a bit,” he says.
More than a dozen drivers interviewed for this story detailed instances of having to find better lighting to avoid being locked out of their Uber accounts. “Whenever Uber asks for a selfie in the evenings or at night, I’ve had to pull over and go under a streetlight to click a clear picture—otherwise there are chances of getting rejected,” said Santosh Kumar, an Uber driver from Hyderabad.
Others have struggled with scratches on their cameras and low-budget smartphones. The problem isn’t unique to Uber. Drivers with Ola, which is backed by SoftBank, face similar issues.
Some of these struggles can be explained by natural limitations in face recognition technology. The software starts by converting your face into a set of points, explains Jernej Kavka, an independent technology consultant with access to Microsoft’s Face API, which is what Uber uses to power Real-Time ID Check.
“With excessive facial hair, the points change and it may not recognize where the chin is,” Kavka says. The same thing happens when there is low lighting or the phone’s camera doesn’t have a good contrast. “This makes it difficult for the computer to detect edges,” he explains.
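The low-light failures Kavka describes can be illustrated with a toy sketch (this is not Uber’s or Microsoft’s actual pipeline, and the threshold is invented for illustration): face detectors rely on intensity edges, and a dim, murky selfie has so little contrast that there are no strong edges to find. Here, “contrast” is crudely approximated as the standard deviation of grayscale pixel intensities.

```python
import statistics

def contrast(pixels):
    """Standard deviation of grayscale intensities (0-255) across the frame."""
    flat = [p for row in pixels for p in row]
    return statistics.pstdev(flat)

def detectable(pixels, min_contrast=30.0):
    """Hypothetical pre-check: below this contrast, edge detection
    has little to work with and a face is unlikely to be found."""
    return contrast(pixels) >= min_contrast

# A well-lit selfie: strong light/dark variation across the frame.
daylight = [[40, 200, 45, 210], [220, 30, 215, 35]]
# A dusk selfie: every pixel is a similar murky gray.
dusk = [[90, 95, 92, 97], [94, 91, 96, 93]]

print(detectable(daylight))  # True
print(detectable(dusk))      # False
```

Real systems measure image quality in far more sophisticated ways, but the underlying constraint is the same: without sufficient contrast, the landmark points the software depends on cannot be located reliably.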
But the software may be especially brittle in India. In December 2021, tech policy researchers Smriti Parsheera (a fellow with the CyberBRICS project) and Gaurav Jain (an economist with the International Finance Corporation) posted a preprint paper that audited four commercial facial processing tools—Amazon’s Rekognition, Microsoft Azure’s Face, Face++, and FaceX—for their performance on Indian faces. When the software was applied to a database of 32,184 election candidates, Microsoft’s Face failed to even detect the presence of a face in more than 1,000 images, an error rate of more than 3%—the worst among the four.
It could be that the Uber app is failing drivers because its software was not trained on a diverse range of Indian faces, Parsheera says. But she says there may be other issues at play as well. “There could be a number of other contributing factors like lighting, angle, effects of aging, etc.,” she explained in writing. “But the lack of transparency surrounding the use of such systems makes it hard to provide a more concrete explanation.”
The Download: Uber’s flawed facial recognition, and police drones
One evening in February last year, a 23-year-old Uber driver named Niradi Srikanth was getting ready to start another shift, ferrying passengers around the south Indian city of Hyderabad. He pointed the phone at his face to take a selfie to verify his identity. The process usually worked seamlessly. But this time he was unable to log in.
Srikanth suspected it was because he had recently shaved his head. After further attempts to log in were rejected, Uber informed him that his account had been blocked. He is not alone. In a survey conducted by MIT Technology Review of 150 Uber drivers in the country, almost half had been either temporarily or permanently locked out of their accounts because of problems with their selfies.
Hundreds of thousands of India’s gig economy workers are at the mercy of facial recognition technology, with few legal, policy or regulatory protections. For workers like Srikanth, getting blocked from or kicked off a platform can have devastating consequences. Read the full story.
I met a police drone in VR—and hated it
Police departments across the world are embracing drones, deploying them for everything from surveillance and intelligence gathering to chasing criminals. Yet none of them seem to be trying to find out how encounters with drones leave people feeling—or whether the technology will help or hinder policing work.
A team from University College London and the London School of Economics is filling in the gaps, studying how people react when meeting police drones in virtual reality, and whether they come away feeling more or less trusting of the police.
MIT Technology Review’s Melissa Heikkilä came away from her encounter with a VR police drone feeling unnerved. If others feel the same way, the big question is whether these drones are effective tools for policing in the first place. Read the full story.
Melissa’s story is from The Algorithm, her weekly newsletter covering AI and its effects on society. Sign up to receive it in your inbox every Monday.