But behind every filter is a person dragging lines and shifting shapes on a computer screen to achieve the desired look. Beauty may be subjective, and yet society continues to promote stringent, unattainable ideals that—for women and girls—are disproportionately white, slender, and feminine.
Instagram publishes very little data about filters, especially beauty filters. In September 2020, Meta announced that over 600 million people had tried at least one of its AR features. The metaverse is a concept much bigger than Meta or any other company investing in AR and VR products: Snap and TikTok also capture huge numbers of filter users, and Snap is investing in place-based AR as well. Meta’s own product suite includes the Oculus headset and Ray-Ban smart glasses, but the company remains focused on what made Facebook popular—the face.
Beauty filters, especially those that dramatically alter the shape of a face and its features, are particularly popular—and contested. Instagram banned these so-called deformation effects from October 2019 until August 2020 because of concerns about their impact on mental health. The policy has since been updated to outlaw only filters that encourage plastic surgery. The policy states that “content must not promote the use or depict the sale of a potentially dangerous cosmetic procedure, as per the Facebook Community Standards. This includes effects that depict such procedures through surgery lines.” According to a statement to MIT Technology Review in April 2021, this policy is enforced by “a combination of human and automated systems to review effects as they are submitted for publishing.” Creators told me, however, that deformation filters are often flagged inconsistently, and it’s not clear what exactly counts as encouraging cosmetic surgery.
“It became sensational”
Though many people use beauty filters merely for fun and entertainment, even those puppy ears are a big technical feat. First, they require face detection: an algorithm interprets the various shades of pixels picked up by a camera to identify a face and its features. A digital mask of a standard face is then applied to the image of the real face and adjusted to its shape, aligning the mask’s virtual jawline and nose with the person’s. On that mask, graphics developed by coders create the effects seen on the screen. Only in the past few years has computer vision technology allowed all of this to happen in real time, on a face in motion.
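The mask-alignment step described above can be sketched as fitting a similarity transform (scale, rotation, and translation) that maps a standard mask’s landmark points onto the landmarks detected on the real face. This is a minimal illustration of the idea, not Spark AR’s actual implementation; the function names are my own.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform (scale + rotation + translation)
    mapping src landmarks onto dst landmarks.
    src, dst: (N, 2) arrays of corresponding points."""
    # Solve for [a, b, tx, ty] in:  x' = a*x - b*y + tx,  y' = b*x + a*y + ty
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = src[:, 0]   # coefficient of a in the x' rows
    A[0::2, 1] = -src[:, 1]  # coefficient of b in the x' rows
    A[0::2, 2] = 1.0         # tx
    A[1::2, 0] = src[:, 1]   # coefficient of a in the y' rows
    A[1::2, 1] = src[:, 0]   # coefficient of b in the y' rows
    A[1::2, 3] = 1.0         # ty
    a, b, tx, ty = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)[0]
    M = np.array([[a, -b], [b, a]])  # rotation + uniform scale
    t = np.array([tx, ty])           # translation
    return M, t

def align_mask(mask_pts, M, t):
    """Warp the standard mask's points onto the detected face."""
    return mask_pts @ M.T + t
```

In practice a face tracker supplies dozens of landmarks per video frame, and the transform is re-estimated every frame, which is what keeps the mask glued to a moving face.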
Spark AR is Instagram’s software development kit, or SDK; it lets creators of augmented-reality effects more easily make and share the face filters that fill the Instagram feed. It was in a deep rabbit hole of Spark AR filter demonstration videos on YouTube that I first came across Florencia Solari, a creative AR technologist and a well-known creator of filters on Instagram. She showed me how to make a face filter that promised to plump and lift my cheeks and fill out my lips for that Kardashianesque, surgically enhanced face shape.
“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”
“I have this inflate tool that I am going to apply with symmetry,” Solari said, “because any modifications that I do to this face, I want to be symmetrical.” I tried to keep up by dragging the outline of my digital mannequin’s cheekbone up and out with my cursor. Next, I right-clicked on the map of her bottom lip and selected “Increase” several times, playing God. Soon, with Solari as my guide, I had a filter that, while sloppy and simple, I could upload to Instagram and unleash to the world.
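Solari’s “apply with symmetry” option can be sketched as mirroring every edit across the face’s vertical midline, so that a push on one cheek is matched by the same push, x-flipped, on the other. A toy sketch with hypothetical names, not Spark AR’s actual tooling, which exposes this through its visual editor:

```python
import numpy as np

def move_with_symmetry(vertices, index, mirror_index, offset):
    """Apply `offset` to one landmark, and the mirrored offset
    (x component flipped) to its counterpart on the other side
    of the face, keeping the edit symmetric about the midline."""
    v = vertices.astype(float).copy()
    v[index] += offset
    v[mirror_index] += np.array([-offset[0], offset[1]])
    return v
```

Dragging the right cheekbone “up and out,” as in the session above, would then automatically produce the same up-and-out pull on the left.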
Solari is part of a new class of AR and VR creators who have made a career by mastering this technology. She started coding when she was around nine years old and was drawn to the creativity of virtual-world development. Making her own filters on Instagram was a hobby at first. But in 2020, Solari left a full-time job as an AR developer at Ulta Beauty to pursue online AR full time as an independent consultant. She’s recently worked with Meta and several other big brands (which she says she can’t disclose) to create branded AR web experiences, including filters.
Solari’s very first filter, called “vedette++,” went viral back in September 2019. “I tried to make an interpretation of what the superstar of the future would be,” Solari says. The filter applies an iridescent, slightly green shine to the skin, which is smoothed all over and inflated under each eye to the point that it looks as if half a clementine has been shoved inside each cheek. Lips double in size, and face shape is adjusted so that a distinct jawline tapers into a small chin. “It was kind of a mix of an alien, but with a face that looked like it was full of Botox,” says Solari. “It really became, like, sensational.”
The Frost nails its uncanny, disconcerting vibe in its first few shots. Vast icy mountains, a makeshift camp of military-style tents, a group of people huddled around a fire, barking dogs. It’s familiar stuff, yet weird enough to plant a growing seed of dread. There’s something wrong here.
Welcome to the unsettling world of AI moviemaking. The Frost is a 12-minute movie from Detroit-based video creation company Waymark in which every shot is generated by an image-making AI. It’s one of the most impressive—and bizarre—examples yet of this strange new genre. Read the full story, and take an exclusive look at the movie.
—Will Douglas Heaven
Microplastics are everywhere. What does that mean for our immune systems?
Microplastics are pretty much everywhere you look. These tiny pieces of plastic pollution, less than five millimeters across, have been found in human blood, breast milk, and placentas. They’re even in our drinking water and the air we breathe.
Given their ubiquity, it’s worth considering what we know about microplastics. What are they doing to us?
The short answer is: we don’t really know. But scientists have begun to build a picture of their potential effects from early studies in animals and clumps of cells, and new research suggests that they could affect not only the health of our body tissues, but our immune systems more generally. Read the full story.
In the oceans, bits of plastic can end up collecting various types of bacteria, which cling to their surfaces. Seabirds that ingest them not only end up with a stomach full of plastic—which can starve them—but are also introduced to types of bacteria they wouldn’t otherwise encounter. This seems to disturb their gut microbiomes.
There are similar concerns for humans. These tiny bits of plastic, floating and flying all over the world, could act as a “Trojan horse,” as some researchers put it, introducing harmful drug-resistant bacteria and their genes.
It’s a deeply unsettling thought. As research plows on, hopefully we’ll learn not only what microplastics are doing to us, but how we might tackle the problem.
Read more from Tech Review’s archive
It is too simplistic to say we should ban all plastic. But we could do with revolutionizing the way we recycle it, as my colleague Casey Crownhart pointed out in an article published last year.
We can use sewage to track the rise of antimicrobial-resistant bacteria, as I wrote in a previous edition of the Checkup. At this point, we need all the help we can get …
… which is partly why scientists are also exploring the possibility of using tiny viruses to treat drug-resistant bacterial infections. Phages were discovered around 100 years ago and are due a comeback!
Our immune systems are incredibly complicated. And sex matters: there are important differences between the immune systems of men and women, as Sandeep Ravindran wrote in this feature, which ran in our magazine issue on gender.
Artists are often the first to experiment with new technology. But the immediate future of generative video is being shaped by the advertising industry.

Waymark made The Frost to explore how generative AI could be built into its products. The company makes video creation tools for businesses looking for a fast and cheap way to make commercials. Waymark is one of several startups, alongside firms such as Softcube and Vedia AI, that offer bespoke video ads for clients with just a few clicks.
Waymark’s current tech, launched at the start of the year, pulls together several different AI techniques, including large language models, image recognition, and speech synthesis, to generate a video ad on the fly. Waymark also drew on its large data set of non-AI-generated commercials created for previous customers. “We have hundreds of thousands of videos,” says CEO Alex Persky-Stern. “We’ve pulled the best of those and trained it on what a good video looks like.”
To use Waymark’s tool, which it offers as part of a tiered subscription service starting at $25 a month, users supply the web address or social media accounts for their business, and it goes off and gathers all the text and images it can find. It then uses that data to generate a commercial, using OpenAI’s GPT-3 to write a script that is read aloud by a synthesized voice over selected images that highlight the business. A slick minute-long commercial can be generated in seconds. Users can edit the result if they wish, tweaking the script, editing images, choosing a different voice, and so on. Waymark says that more than 100,000 people have used its tool so far.
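The pipeline described above can be sketched in outline. Everything here is a stand-in: the function names are hypothetical, and each stage is a stub in place of a real web scraper, a language model such as GPT-3, and a text-to-speech engine.

```python
def scrape_assets(url):
    """Stub: gather text and images from the business's site or socials."""
    return {"text": "Family-run bakery since 1982.", "images": ["storefront.jpg"]}

def write_script(text):
    """Stub: where a language model would draft the ad copy from the text."""
    return f"Visit us! {text}"

def synthesize_voice(script):
    """Stub: where a text-to-speech engine would render the narration."""
    return ("audio", script)

def make_ad(url):
    """Chain the stages: scrape, script, voice over the scraped images."""
    assets = scrape_assets(url)
    script = write_script(assets["text"])
    voiceover = synthesize_voice(script)
    return {"script": script, "voiceover": voiceover, "images": assets["images"]}
```

The point of the sketch is the shape of the pipeline, not the stages themselves: each stub could be swapped for a real service without changing the flow.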
The trouble is that not every business has a website or images to draw from, says Parker. “An accountant or a therapist might have no assets at all,” he says.
Waymark’s next idea is to use generative AI to create images and video for businesses that don’t yet have any—or don’t want to use the ones they have. “That’s the thrust behind making The Frost,” says Parker. “Create a world, a vibe.”
The Frost has a vibe, for sure. But it is also janky. “It’s not a perfect medium yet by any means,” says Rubin. “It was a bit of a struggle to get certain things from DALL-E, like emotional responses in faces. But at other times, it delighted us. We’d be like, ‘Oh my God, this is magic happening before our eyes.’”
This hit-and-miss process will improve as the technology gets better. DALL-E 2, which Waymark used to make The Frost, was released just a year ago. Tools that generate short video clips have only been around for a few months.
The most revolutionary aspect of the technology is being able to generate new shots whenever you want them, says Rubin: “With 15 minutes of trial and error, you get that shot you wanted that fits perfectly into a sequence.” He remembers cutting the film together and needing particular shots, like a close-up of a boot on a mountainside. With DALL-E, he could just call it up. “It’s mind-blowing,” he says. “That’s when it started to be a real eye-opening experience as a filmmaker.”