A similar video was captured in Arcadia, California, in September 2019. Dressed in what looks like pajamas, a woman runs into the frame of another doorbell camera. She, too, is looking over her shoulder as she knocks, but her attacker catches up quickly. As she screams “No!” and tries to resist, the man drags her by her hair onto the front lawn. The view is obstructed, but he appears to hit her repeatedly and stomp on her. Finally, he says, “Get up or I’ll kill you.”
These videos reveal traumatic moments, and experts say the individuals captured on camera have no control over what happens to the images. In both cases, the camera belongs to a stranger, and so does the video. The homeowner is the one who agrees to Amazon’s terms of service and chooses how to share the video—whether it’s uploaded to the Neighbors app, given to the police, or handed over to the media.
The person in the footage “has no relationship with the company… and never agreed to their likeness being cut up, made into a product,” says Angel Díaz, senior counsel with the Liberty and National Security Program at the Brennan Center for Justice. Critics such as Díaz contend that such videos essentially become free marketing material for Ring, which trades on fear and voyeurism.
The company counters that videos like these, upsetting as they are, can help protect the public. “Ring built Neighbors to empower people to share important safety information with each other and connect with the public safety agencies that serve them,” Daniels, the Ring spokesperson, wrote in an emailed statement.
And, Ring says, it takes steps to protect the privacy of people who appear in such videos. “When it comes to sharing customer videos with media or to our owned channels, our current policy is that we either obtain a release or blur the face of every identifiable person in the video before we share.”
When violent incidents like these are caught on camera and shared, it may appear on the surface that the system of video surveillance and of neighbors looking out for each other is working as it should. Video evidence can certainly aid police and prosecutors. But advocates for domestic violence victims say that when these intimate moments are made public, the people involved are victimized again, stripped of the power to make their own decisions. The women in such videos may have wanted and needed help, advocates say—but not necessarily from the police.
In Manor, Texas, for example, police charged the man in the video with third-degree felony kidnapping. But the woman in the video later told local reporters that she was looking for an attorney to try to get the charges dropped.
“They’re selling fear in exchange for people giving up their privacy.”
Angel Díaz, Brennan Center for Justice
Yann LeCun has a bold new vision for the future of AI
Melanie Mitchell, an AI researcher at the Santa Fe Institute, is also excited to see a whole new approach. “We really haven’t seen this coming out of the deep-learning community so much,” she says. She also agrees with LeCun that large language models cannot be the whole story. “They lack memory and internal models of the world that are actually really important,” she says.
Natasha Jaques, a researcher at Google Brain, thinks that language models should still play a role, however. It’s odd for language to be entirely missing from LeCun’s proposals, she says: “We know that large language models are super effective and bake in a bunch of human knowledge.”
Jaques, who works on ways to get AIs to share information and abilities with each other, points out that humans don’t have to have direct experience of something to learn about it. We can change our behavior simply by being told something, such as not to touch a hot pan. “How do I update this world model that Yann is proposing if I don’t have language?” she asks.
There’s another issue, too. If they were to work, LeCun’s ideas would create a powerful technology that could be as transformative as the internet. And yet his proposal doesn’t discuss how his model’s behavior and motivations would be controlled, or who would control them. This is a weird omission, says Abhishek Gupta, the founder of the Montreal AI Ethics Institute and a responsible-AI expert at Boston Consulting Group.
“We should think more about what it takes for AI to function well in a society, and that requires thinking about ethical behavior, amongst other things,” says Gupta.
Yet Jaques notes that LeCun’s proposals are still very much ideas rather than practical applications. Mitchell says the same: “There’s certainly little risk of this becoming a human-level intelligence anytime soon.”
LeCun would agree. His aim is to sow the seeds of a new approach in the hope that others build on it. “This is something that is going to take a lot of effort from a lot of people,” he says. “I’m putting this out there because I think ultimately this is the way to go.” If nothing else, he wants to convince people that large language models and reinforcement learning are not the only ways forward.
“I hate to see people wasting their time,” he says.
The Download: Yann LeCun’s AI vision, and smart cities’ unfulfilled promises
“We’re addicted to being on Facebook.”
—Jordi Berbera, who runs a pizza stand in Mexico City, tells Rest of World why he has turned to selling his wares through the social network instead of through more conventional food delivery apps.
The big story
“Am I going crazy or am I being stalked?” Inside the disturbing online world of gangstalking
Jenny’s story is not linear, the way that we like stories to be. She was born in Baltimore in 1975 and had a happy, healthy childhood—her younger brother Danny fondly recalls the treasure hunts she would orchestrate. In her late teens, she developed anorexia and depression and was hospitalized for a month. Despite her struggles, she graduated high school and was accepted into a prestigious liberal arts college.
There, things went downhill again. Among other issues, chronic fatigue led her to drop out. When she was 25, she flipped her car on Florida’s Sunshine Skyway Bridge in an apparent suicide attempt. At 30, after experiencing delusions that she was pregnant, she was diagnosed with schizophrenia. She was hospitalized for half a year and began treatment, regularly receiving shots of an antipsychotic drug. “It was like having my older sister back again,” Danny says.
On July 17, 2017, Jenny jumped from the tenth floor of a parking garage at Tampa International Airport. After her death, her family searched her hotel room and her apartment, but the 42-year-old didn’t leave a note. “We wanted to find a reason for why she did this,” Danny says. And so, a week after his sister’s death, Danny—a certified ethical hacker—decided to look for answers on Jenny’s computer. He found she had subscribed to hundreds of gangstalking groups across Facebook, Twitter, and Reddit: online communities where self-described “targeted individuals” say they are being monitored, harassed, and stalked 24/7 by governments and other organizations, and where the internet legitimizes their beliefs. Read the full story.
The US Supreme Court has overturned Roe v. Wade. What does that mean?
Access to legal abortion is now subject to state laws, allowing each state to decide whether to ban, restrict or allow abortion. Some parts of the country are much stricter than others—Arkansas, Oklahoma and Kentucky are among the 13 states with trigger laws that immediately made abortion illegal in the aftermath of the ruling. In total, around half of states are likely to either ban or limit access to the procedure, with many of them refusing to make exceptions, even in pregnancies involving rape, incest and fetuses with genetic abnormalities. Many specialized abortion clinics may be forced to close their doors in the next few days and weeks.
While overturning Roe v. Wade will not spell an end to abortion in the US, it is likely to reduce the number of abortions and force those seeking them to turn to other methods. People living in states that ban or heavily restrict abortion may consider traveling to areas that will continue to allow it, although crossing state lines can be time-consuming and prohibitively expensive for many people facing financial hardship.
The likelihood that anti-abortion activists will use surveillance and data collection to track and identify people seeking abortions is also higher following the decision. This information could be used to criminalize them, making it particularly dangerous for those leaving home to cross state lines.
Vigilante volunteers already stake out abortion clinics in states including Mississippi, Florida and North Carolina, filming people’s arrival on cameras and recording details about them and their cars. While they deny the data is used to harass or contact people seeking abortions, experts are concerned that footage filmed of clients arriving and leaving clinics could be exploited to target and harm them, particularly if law enforcement agencies or private groups were to use facial recognition to identify them.
Another option is to order so-called abortion pills to discreetly end a pregnancy at home. The pills, which are safe and widely prescribed by doctors, are significantly less expensive than surgical procedures, and already account for the majority of abortions in the US.