Police are flying surveillance over Washington. Where were they last week?
Nor were resources an issue. The United States Capitol Police, or USCP, is one of the best-funded police forces in the country. It is responsible for security across just 0.4 square miles of land, but that area hosts some of the most high-profile events in American politics, including presidential inaugurations, lying-in-state ceremonies, and major protests. The USCP is well-staffed, with 2,300 officers and civilian employees, and its annual budget is at least $460 million—putting it among the top 20 police budgets in the US. In fact, it’s about the size of the Atlanta and Nashville police budgets combined. For comparison, the DC Metropolitan Police Department—which works regularly with the USCP and covers the rest of the District’s 68 square miles—has a budget of $546 million.
The USCP is different from state and local departments in other important ways, too. As a federal agency that has no residents inside its jurisdiction, for example, it answers to a private oversight board and to Congress—and only Congress has the power to change its rules and budgets. Nor is it subject to transparency laws such as the Freedom of Information Act, which makes it even more veiled than the most opaque departments elsewhere in the country.
All of this means there is little public information about the tools and tactics that were at the USCP’s disposal ahead of the riots.
But “they have access to some pretty sophisticated stuff if they want to use it,” says Stoughton. That includes the resources of other agencies like the Secret Service, the FBI, the Department of Homeland Security, the Department of the Interior, and the United States military. (“We are working [on technology] on every level with pretty much every agency in the country,” the USCP’s then-chief said in 2015, in a rare acknowledgment of the force’s technical savvy.)
What should have happened
With such resources at its disposal, the Capitol Police would likely have made heavy use of online surveillance ahead of January 6. Such monitoring usually involves not just watching online spaces, but tracking known extremists who had been at other violent events. In this case, that would include the “Unite the Right” rally in Charlottesville, Virginia, in 2017 and the protest against coronavirus restrictions at the Michigan state capitol in 2020.
Exactly what surveillance was happening before the riots is unclear. The FBI declined a request for comment, and the USCP did not respond. “I’d find it very hard to believe, though, that a well-funded, well-staffed agency with a pretty robust history of assisting with responding to crowd control situations in DC didn’t do that type of basic intelligence gathering,” says Stoughton.
Ed Maguire, a professor of criminal justice at Arizona State University and an expert on protests and policing, says undercover officers would usually operate in the crowd to monitor developments. That can be the most effective surveillance tool for managing potentially volatile situations, he says, but it requires a degree of preparedness and planning that appears to have been lacking.
Major events of this kind would usually involve a detailed risk assessment, informed by monitoring efforts and FBI intelligence reports. These assessments determine all security, staffing, and surveillance plans for an event. Stoughton says that what he sees as inconsistency in officers’ decisions to retreat or not, as well as the lack of an evacuation plan and the clear delay in securing backup, point to notable mistakes.
This supports one of the more obvious explanations for the failure: that the department simply misjudged the risk.
What seems to have happened
It appears that Capitol Police didn’t coordinate with the Park Police or the Metropolitan Police ahead of the rally—though the Metropolitan Police were staffed at capacity in anticipation of violence. Capitol Police Chief Steven Sund, who announced his resignation in the wake of the riots, also asserts that he requested additional National Guard backup on January 5, though the Pentagon denies this.
Like other police forces, the USCP has been accused of racial bias. Departments in New York, Seattle, and Philadelphia are among those looking into whether their own officers took part in the assault, and the Capitol Police itself has suspended “several” employees and will investigate 10 officers over their role.
But one significant factor may have altered the volatility of the situation, Maguire says: police clashes with the Proud Boys in the weeks and days before the event, including a violent rally in Salem, Oregon, and the arrest of the white supremacist group’s leader, Henry Tarrio, fractured the right wing’s assumption that law enforcement was essentially on its side. On January 5, Maguire tweeted about hardening rhetoric and threats of violence as this assumption started to fall apart.
Inside the conference where researchers are solving the clean-energy puzzle
The Advanced Research Projects Agency for Energy (ARPA-E) funds high-risk, high-reward energy research projects, and each year the agency hosts a summit where funding recipients and other researchers and companies in energy can gather to talk about what’s new in the field.
As I listened to presentations, met with researchers, and—especially—wandered around the showcase, I often had a vague feeling of whiplash. Standing at one booth trying to wrap my head around how we might measure carbon stored by plants, I would look over and see another group focused on making nuclear fusion a more practical way to power the world.
There are plenty of tried-and-true solutions that can begin to address climate change right now: wind and solar power are being deployed at massive scales, electric vehicles are coming to the mainstream, and new technologies are helping companies make even fossil-fuel production less polluting. But as we knock out the easy wins, we’ll also need to get creative to tackle harder-to-solve sectors and reach net-zero emissions. Here are a few intriguing projects from the ARPA-E showcase that caught my eye.
“I heard you have rocks here!” I exclaimed as I approached the Quaise Energy station.
Quaise’s booth featured a screen flashing through some fast facts and demonstration videos. And sure enough, laid out on the table were two slabs of rock. They looked a bit worse for wear, each sporting a hole about the size of a quarter in the middle, singed around the edges.
These rocks earned their scorch marks in service of a big goal: making geothermal power possible anywhere. Today, the high temperatures needed to generate electricity using heat from the Earth are only accessible close to the surface in certain places on the planet, like Iceland or the western US.
Geothermal power could in theory be deployed anywhere, if we could drill deep enough. Getting there won’t be easy, though, and could require drilling 20 kilometers (12 miles) beneath the surface. That’s deeper than any oil and gas drilling done today.
Rather than grinding through layers of granite with conventional drilling technology, Quaise plans to get through the more obstinate parts of the Earth’s crust by using high-powered millimeter waves to vaporize rock. (It’s sort of like lasers, but not quite.)
The emergent industrial metaverse
Annika Hauptvogel, head of technology and innovation management at Siemens, describes the industrial metaverse as “immersive, making users feel as if they’re in a real environment; collaborative in real time; open enough for different applications to seamlessly interact; and trusted by the individuals and businesses that participate”—far more than simply a digital world.
The industrial metaverse will revolutionize the way work is done, but it will also unlock significant new value for businesses and societies. By allowing businesses to model, prototype, and test dozens, hundreds, or millions of design iterations in real time and in an immersive, physics-based environment before committing physical and human resources to a project, industrial metaverse tools will usher in a new era of solving real-world problems digitally.
“The real world is very messy, noisy, and sometimes hard to really understand,” says Danny Lange, senior vice president of artificial intelligence at Unity Technologies, a leading platform for creating and growing real-time 3-D content. “The idea of the industrial metaverse is to create a cleaner connection between the real world and the virtual world, because the virtual world is so much easier and cheaper to work with.”
While real-life applications of the consumer metaverse are still developing, industrial metaverse use cases are purpose-driven, well aligned with real-world problems and business imperatives. The resource efficiencies enabled by industrial metaverse solutions may increase business competitiveness while also continually driving progress toward the sustainability, resilience, decarbonization, and dematerialization goals that are essential to human flourishing.
This report explores what it will take to create the industrial metaverse, its potential impacts on business and society, the challenges ahead, and innovative use cases that will shape the future. Its key findings are as follows:
• The industrial metaverse will bring together the digital and real worlds. It will enable a constant exchange of information, data, and decisions and empower industries to solve extraordinarily complex real-world problems digitally, changing how organizations operate and unlocking significant societal benefits.
• The digital twin is a core metaverse building block. These virtual models simulate real-world objects in detail. The next generation of digital twins will be photorealistic, physics-based, AI-enabled, and linked in metaverse ecosystems.
• The industrial metaverse will transform every industry. Currently existing digital twins illustrate the power and potential of the industrial metaverse to revolutionize design and engineering, testing, operations, and training.
The Download: China’s retro AI photos, and experts’ AI fears
Across social media, a number of creators are generating nostalgic photographs of China with the help of AI. Even though these images get some details wrong, they are realistic enough to trick and impress many of their followers.
The pictures look sophisticated in terms of definition, sharpness, saturation, and color tone. Their realism is partly down to a major update to Midjourney, the image-making artificial-intelligence program, released in mid-March; the new version is better not only at generating human hands but also at simulating various photography styles.
It’s still relatively easy, even for untrained eyes, to tell that the photos are generated by an AI. But for some creators, their experiments are more about recalling a specific era than about tricking their audience. Read the full story.
Zeyi’s story is from China Report, his weekly newsletter giving you the inside track on tech in China. Sign up to receive it in your inbox every Tuesday.
Read more of our reporting on AI-generated images:
+ These new tools let you see for yourself how biased AI image models are. Bias and stereotyping are still huge problems for systems like DALL-E 2 and Stable Diffusion, despite companies’ attempts to fix them. Read the full story.