America’s sequencing boom may be throwing money at the wrong problem
Instead of trying to work through these issues at the national level, the sequencing contracts allow individual public health agencies to request the names and contact information of people who have tested positive for variants of concern. But that just pushes the same problems of data ownership down the chain.
“Some states are very good and want to know a lot about variants that are circulating in their state,” says Labcorp’s Brian Krueger. “The other states are not.”
Public health epidemiologists often have little experience with bioinformatics, the use of software to analyze large datasets like genomic sequences. Only a few agencies have preexisting sequencing programs, and even if more did, having each jurisdiction analyze just a small slice of the dataset undercuts how much knowledge can be gleaned about real-world variant behavior.
Getting around those issues—making it easier to connect sequences and clinical metadata on a large scale—would require more than just root-and-branch reform of privacy regulations, however. It would need a reorganization of the entire healthcare and public health systems in the US, where each of the 64 public health agencies operates as a fiefdom, and there is no centralization of information or power.
“Metadata is the single biggest uncracked nut,” says Jonathan Quick, managing director of pandemic response, preparedness, and prevention at the Rockefeller Foundation. (The Rockefeller Foundation helps fund coverage at MIT Technology Review, although it has no editorial oversight.) Because it’s so hard for public health agencies to put together datasets big enough to really understand real-world variant behavior, our understanding has to come from vaccine manufacturers and hospitals adding sequencing to their own clinical trials, he says.
It’s frustrating to him that so many huge datasets of useful information already exist in electronic medical records, immunization registries, and other sources, but can’t easily be used.
“There’s a whole lot more that could be learned, and learned faster, without the shackles we put on the use of that data,” says Quick. “We can’t just rely on the vaccine companies to do surveillance.”
Boosting state-level bioinformatics
If public health labs are expected to focus more on tracking and understanding variants on their own, they’ll need all the help they can get. Doing something about variants case-by-case, after all, is a public health job, while doing something about variants on a policy level is a political one.
Public health labs generally use genomics to expose otherwise-hidden information about outbreaks, or as part of track and trace efforts. In the past, sequencing has been used to connect E. coli outbreaks to specific farms, identify and interrupt chains of HIV transmission, isolate US Ebola cases, and follow annual flu patterns.
Even those with well-established programs tend to use genomics sparingly. The cost of sequencing has dropped precipitously over the last decade, but the process is still not cheap, particularly for cash-strapped state and local health departments. The machines themselves cost hundreds of thousands of dollars to buy, and more to run: Illumina, one of the biggest makers of sequencing equipment, says labs spend an average of $1.2 million annually on supplies for each of its machines.
Health agencies don’t just need money; they also need expertise. Surveillance requires highly trained bioinformaticians to turn a sequence’s long strings of letters into useful information, as well as people to explain the results to officials, and convince them to turn any lessons learned into policy.
Fortunately, the OAMD has been working to support state and local health departments as they try to understand their sequencing data, employing regional bioinformaticians to consult with public health officers and facilitating agencies’ efforts to share their experiences.
It is also pouring hundreds of millions into building and supporting those agencies’ own sequencing programs—not just for covid, but for all pathogens.
But many of those agencies are facing pressure to sequence as many covid genomes as possible. Without a cohesive strategy for collecting and analyzing data, it’s unclear how much utility those programs will have.
“We’ll miss a ton of opportunities if we just give health departments money to set up programs without having a federal strategy so that everyone knows what they’re doing,” says Warmbrod.
Initial visions, usurped
Mark Pandori is director of the Nevada state public health laboratory, one of the programs OAMD supports. He has been a strong proponent of genomic surveillance for years. Before moving to Reno, he ran the public health lab in Alameda County, California, where he helped pioneer a program using sequencing to track how infections were being passed around hospitals.
Turning sequences into usable data is the biggest challenge for public health genomics programs, he says.
“The CDC can say, ‘go buy a bunch of sequencing equipment, do a whole bunch of sequencing.’ But it doesn’t do anything unless the consumers of that data know how to use it, and know how to apply it,” he says. “I’m talking to you about the robotics we need to get things sequenced every day, but health departments just need a simple way to know if cases are related.”
When it comes to variants, public health labs are under many of the same pressures the CDC faces: everyone wants to know what variants are circulating, whether or not they can do anything with the information.
Pandori launched his covid sequencing program hoping to cut down on the labor needed to investigate potential covid outbreaks, quickly identifying whether cases caught near each other were related or coincidental.
His lab was the first in North America to identify a patient reinfected with covid-19, and later found the B.1.351 variant in a hospitalized man who had just come back from South Africa. With rapid contact tracing, the health department was able to prevent it from spreading.
The emergent industrial metaverse
Annika Hauptvogel, head of technology and innovation management at Siemens, describes the industrial metaverse as “immersive, making users feel as if they’re in a real environment; collaborative in real time; open enough for different applications to seamlessly interact; and trusted by the individuals and businesses that participate”—far more than simply a digital world.
The industrial metaverse will revolutionize the way work is done, but it will also unlock significant new value for businesses and societies. By allowing businesses to model, prototype, and test dozens, hundreds, or even millions of design iterations in real time, in an immersive, physics-based environment, before committing physical and human resources to a project, industrial metaverse tools will usher in a new era of solving real-world problems digitally.
“The real world is very messy, noisy, and sometimes hard to really understand,” says Danny Lange, senior vice president of artificial intelligence at Unity Technologies, a leading platform for creating and growing real-time 3-D content. “The idea of the industrial metaverse is to create a cleaner connection between the real world and the virtual world, because the virtual world is so much easier and cheaper to work with.”
While real-life applications of the consumer metaverse are still developing, industrial metaverse use cases are purpose-driven, well aligned with real-world problems and business imperatives. The resource efficiencies enabled by industrial metaverse solutions may increase business competitiveness while also continually driving progress toward the sustainability, resilience, decarbonization, and dematerialization goals that are essential to human flourishing.
This report explores what it will take to create the industrial metaverse, its potential impacts on business and society, the challenges ahead, and innovative use cases that will shape the future. Its key findings are as follows:
• The industrial metaverse will bring together the digital and real worlds. It will enable a constant exchange of information, data, and decisions and empower industries to solve extraordinarily complex real-world problems digitally, changing how organizations operate and unlocking significant societal benefits.
• The digital twin is a core metaverse building block. These virtual models simulate real-world objects in detail. The next generation of digital twins will be photorealistic, physics-based, AI-enabled, and linked in metaverse ecosystems.
• The industrial metaverse will transform every industry. Currently existing digital twins illustrate the power and potential of the industrial metaverse to revolutionize design and engineering, testing, operations, and training.
The Download: China’s retro AI photos, and experts’ AI fears
Across social media, a number of creators are generating nostalgic photographs of China with the help of AI. Even though these images get some details wrong, they are realistic enough to trick and impress many of their followers.
The pictures look sophisticated in terms of definition, sharpness, saturation, and color tone. Their realism is partly down to a major update to the image-making artificial-intelligence program Midjourney, released in mid-March, which is better not only at generating human hands but also at simulating various photography styles.
It’s still relatively easy, even for untrained eyes, to tell that the photos are generated by an AI. But for some creators, their experiments are more about trying to recall a specific era than trying to trick their audience. Read the full story.
Zeyi’s story is from China Report, his weekly newsletter giving you the inside track on tech in China. Sign up to receive it in your inbox every Tuesday.
Read more of our reporting on AI-generated images:
+ These new tools let you see for yourself how biased AI image models are. Bias and stereotyping are still huge problems for systems like DALL-E 2 and Stable Diffusion, despite companies’ attempts to fix them. Read the full story.
Evolutionary organizations reimagine the future
The global technology consultancy Thoughtworks describes organizations that can respond to marketplace changes with continuous adaptation as “evolutionary organizations.” It argues that, instead of focusing only on technology change, organizations should focus on building capabilities that support ongoing reinvention. While many organizations recognize the benefit of adopting agile approaches in their technology capabilities and architectures, they have not extended these structures and ways of thinking throughout the operating model, which would allow their impact to extend beyond that of a single transformation project.
Global spending on digital transformation is growing at a brisk pace: 16.4% per year according to IDC. The firm’s 2021 “Worldwide Digital Transformation Spending Guide” forecasts that annual transformation expenditures will reach $2.8 trillion in 2025, more than double the spending in 2020.[1] At the same time, research from Boston Consulting Group shows that 7 out of 10 digital transformation initiatives fall short of their objectives. Organizations that succeed, however, achieve almost double the earnings growth of those that fail and more than double the growth in the total value of their enterprises.[2] Understanding how to make these transitions successful, then, should be of key interest to all business leaders.
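IDC’s figures are internally consistent: compounding 16.4% annual growth over the five years from 2020 to 2025 yields a factor of roughly 2.14, which squares with the “more than double” claim. A minimal sanity check of that arithmetic (the 2020 baseline is implied by the excerpt, not stated in it):

```python
# Compound 16.4% annual growth over 2020-2025 (five years)
# and back out the implied 2020 spending from the $2.8T forecast.
growth_rate = 0.164
years = 5
factor = (1 + growth_rate) ** years       # total growth multiple
spend_2025 = 2.8                          # trillions of USD (IDC forecast)
implied_2020 = spend_2025 / factor        # implied 2020 baseline

print(f"growth factor: {factor:.2f}")     # ~2.14, i.e. "more than double"
print(f"implied 2020 spend: ${implied_2020:.2f}T")
```

The compounding factor of about 2.14 implies 2020 spending of roughly $1.3 trillion, consistent with the forecast’s doubling claim.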
This MIT Technology Review Insights report is based on a survey of 275 corporate leaders, supplemented by interviews with seven experts in digital transformation. Its key findings include the following:
• Digital transformation is not solely a technology issue. Adopting new technology for its own sake does not set the organization up to continue to adapt to changing circumstances. Among survey respondents, however, transformation is still synonymous with tech, with 70% planning to adopt a new technology in the next year, but only 41% pursuing changes to their business model.
• The business environment is changing faster than many organizations think. Most survey respondents (81%) believe their organization is more adaptable than average and nearly all (89%) say that they’re keeping up with or ahead of their competitors—suggesting a wide gap between the rapidly evolving reality and executives’ perceptions of their preparedness.
• All organizations must build capabilities for continuous reinvention. The only way to keep up is for organizations to continuously change and evolve, but most traditional businesses lack the strategic flexibility necessary to do this. Nearly half of business leaders outside the C-suite (44%), for example, say organizational structure, silos, or hierarchy are the biggest obstacle to transformation at their firm.
• Focusing on customer value and empowering employees are keys to organizational evolution. The most successful transformations prioritize creating customer value and enhancing customer and employee experience. Meeting evolving customer needs is the constant source of value in a world where everything is changing, but many traditional organizations fail to take this long view, with only 15% of respondents most concerned about failing to meet customer expectations if they fail to transform.
• Rapid experimentation requires the ability to fail and recover quickly. Organizations agree that iterative, experimental processes are essential to finding the right solutions, with 81% saying they have adopted agile practices. Fewer are confident, however, in their ability to execute decisions quickly (76%)—or to shut down initiatives that aren’t working (60%).