

Embracing culture change on the path to digital transformation



Meanwhile, young financial services companies were coming to market with innovative products and services and NAB was finding it difficult to compete. “Many customers today are expecting an Amazon experience, a Google experience, a Meta experience, but we were still operating in the 1990s,” says Day. “We stood back, and we looked at it, and we decided that our entire culture needed to change.”

What ensued was nothing less than an internal transformation. “Our original teams didn’t have a lot of tech skills, so to tell them that they were going to have to take on all of this technical accountability, an operational task that had previously been handed to our outsourcers, was daunting,” says Day.

Day and his team rolled out a number of initiatives to instill confidence across the organization and train people in the necessary technical skills. “We built confidence through education, through a lot of cultural work, a lot of explaining the strategy, a lot of explaining to people what good looked like in 2020, and how we were going to get to that place,” says Day.

This episode of Business Lab is produced in association with Infosys Cobalt.

Full transcript:

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma. And this is Business Lab. The show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. Our topic today is digital transformation. Most organizations have begun the journey to digitize their services and operations, and some are further along than others in bringing disruption to the marketplace. How do you bring transformation to organizations that are in highly regulated, service-based industries where competitive differentiation requires innovation?

Two words for you: internal transformation.

My guest is Steve Day, the chief technology officer of enterprise technology at National Australia Bank.

This podcast is produced in partnership with Infosys Cobalt.

Welcome, Steve.

Steve: Thank you, Laurel. It’s a pleasure to be here.

Laurel: National Australia Bank or NAB is undergoing a significant digital transformation. Gartner recently found that IT executives see the talent shortage as the largest barrier to deploying emerging technologies, specifically cloud-based technologies, but NAB uses insourcing. Most listeners are familiar with outsourcing, what exactly is insourcing and how does it relate to outsourcing?

Steve: Yeah. I think it’s all in the name. Insourcing would be the exact opposite of outsourcing. And to give you a little bit of history, National Australia Bank, like many banks, decided to outsource a large part of its operations in the 1990s. We basically pushed all our operations and a large part of our development capability out to third parties with the intent of lowering costs and making our operations far more process driven. I think those two objectives were achieved, but we did have an unintended consequence. We basically froze our operations in time, and that created a situation. If you roll forward to 2018, we realized that we were still operating like we were in the 1990s. We were very waterfall driven. Our systems were highly process driven, but in a very manual way, and it took us a very long time to roll out new products and services that our customers really needed.

It was at about that time that we realized we needed to do something different. We spoke with our outsourcers, of course, but to be honest, they weren’t motivated to reduce our internal costs and to help us become far more agile. They were very happy for us to be paying them large amounts of money to do large amounts of work. So at that point, we decided to bring our capability back into the business.

Laurel: So waterfall being the opposite of agile, right? You were finding that was hindering your progress as a company, correct?

Steve: It really was hindering our progress. We were very slow. It took us years to roll out new products and services. We had some young financial services companies knocking on the doors, startups and the like, that were agile and able to compete really quickly, and we needed to change. We needed to look at a different way to roll out our products so that we could give customers what they’re expecting. Many customers today are expecting an Amazon experience, a Google experience, a Meta experience, but we were still operating in the 1990s. That’s when we really made our call. We stood back and we looked at it, and we decided that our entire culture needed to change.

We did that by building a series of tech guilds. We built a cloud guild, a data guild, and an insourcing framework. We built our NAB Engineering Foundation, all with the goal of building a culture of innovation, of cloud, of agile, and of being able to deliver great products and services to our customers in a cost-effective but very safe way. And as part of that, we started on our cloud migrations, and that is really moving at pace now.

Laurel: Insourcing seems to be working so far, but it didn’t happen overnight, as you said. And even though 2018 wasn’t that long ago, what was the journey like to first realize that you had to change the way you were working and then convince everyone to work in a very different way?

Steve: We did realize that if we didn’t get the culture embedded that we would not be successful. So building that capability and building the culture was number one on the list. It was five years ago. It feels like a very long time ago to me. But we started that process and through the cloud guild we trained 7,000 people in cloud and 2,700 of those today are industry certified and working in our teams. So we’ve made really good progress. We’ve actually moved a lot of the original teams that were a bit hesitant, a bit concerned about having to move to this whole new way of working. And remember that our original teams didn’t have a lot of tech skills, so to tell them that they were going to have to take on all of this technical accountability, an operational task that had previously been handed to our outsourcers, was daunting. And the only way we were going to overcome that was to build confidence. And we built confidence through education, through a lot of cultural work, a lot of explaining the strategy, a lot of explaining to people what good looked like in 2020, and how we were going to get to that place.

Laurel: NAB’s proportion of apps on public cloud will move from one third to about 80% by 2025, but security and regulatory compliance have been primary concerns for organizations in regulated industries like healthcare and financial services. How has NAB addressed these concerns in the cloud?

Steve: Initially, there was a lot of concern. People were not sure about whether cloud was resilient, whether it was secure, whether it could meet the compliance requirements of our regulators, or whether the board and our senior leadership team would be happy to take such a large change to the way we did business. We actually flew the board over to meet with many of the companies in the Valley to give them an idea of what was going on. We did a huge education program for our own teams. We created a new thing called The Executive Guild, so that middle management would have a great feel on what we were doing and why we were doing it. And as part of that, we created a set of tools that would help us move safely.

One of those was CAST, a framework that we use to migrate applications to cloud. CAST stands for Cloud Adoption Standards and Techniques. And it really covers all the controls we use and how we apply those controls in our environment to make sure that when we migrate applications to cloud, they are the absolute safest they can be. It’s safe to say that when we built CAST, we actually did an uplift in our requirements. That enabled a lot of people to see that we were taking it very seriously, and that it was actually quite a high bar to achieve this compliance. But we were willing to invest, and we invested a lot in getting the applications to that level.

Another thing we did was build compliance as code. Infrastructure as code, which cloud is built on, allows you to then create compliance as code. So all of the checks and balances that used to be done manually by people with clipboards, as I used to say, are now being done in the code itself. And because a server is no longer a piece of tin in the corner but an actual piece of code, a piece of software, you can run a lot of compliance checks on it, also from software.
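As a concrete illustration of the compliance-as-code idea Day describes, a policy check can live alongside the infrastructure definition and run automatically on every change. The rules and field names below are hypothetical, a minimal sketch rather than NAB’s actual CAST controls:

```python
# Minimal compliance-as-code sketch. The policy rules and server
# schema are invented for illustration; real tooling (e.g. AWS Config
# or Open Policy Agent) applies the same principle at scale.

def check_compliance(server: dict) -> list:
    """Return a list of policy violations for one server definition."""
    violations = []
    if not server.get("encryption_at_rest", False):
        violations.append("encryption_at_rest must be enabled")
    if server.get("public_ip", False):
        violations.append("servers must not expose a public IP")
    disallowed = set(server.get("open_ports", [])) - {443}
    if disallowed:
        violations.append("disallowed open ports: %s" % sorted(disallowed))
    return violations

# A definition that would fail review: port 22 is open.
server = {"encryption_at_rest": True, "public_ip": False, "open_ports": [443, 22]}
print(check_compliance(server))
```

Because the server is itself data, the same check can gate every deployment in a pipeline, replacing the manual checklist.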

A third thing that we did to give everyone a sense of comfort is we didn’t pin the success of NAB to the success of any one cloud company. We came up with a public, multi-cloud strategy, and that meant that at least for all our significant applications, we would run them on two different cloud providers. Now that would be expensive if you did every cloud in the most robust way, which would be active-active across both clouds. So we created our multi-cloud framework, which was about categorizing each application across multiple dimensions and then assigning that workload to one of six multi-cloud treatments. Multi-cloud treatment one is basically no multi-cloud: it’s an app of convenience, it doesn’t really matter if that application goes away, and we allow it to sit in one cloud. That runs all the way through to our most critical applications, which we insist on running active-active across both clouds. In our case, that would be MCT6. So given all of those frameworks, the tools, and the focus that we put on that, I think we gave the organization and its leadership some confidence that what we were doing was the right move and that it would give us the ability to serve customers well, while also remaining safe.
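The six-tier scheme Day describes can be pictured as a scoring function over an application’s risk dimensions. The dimensions, scores, and thresholds below are invented for illustration; NAB’s actual framework categorizes applications along its own criteria:

```python
# Hypothetical sketch of assigning a workload to a multi-cloud
# treatment (MCT) tier, from 1 (single cloud is acceptable) to 6
# (active-active across two clouds). Scoring scheme is invented.

def assign_mct(criticality: int, data_sensitivity: int, availability_need: int) -> int:
    """Each dimension is scored 0-2; the total maps onto MCT 1-6."""
    score = criticality + data_sensitivity + availability_need  # 0..6
    return max(1, min(6, score))

# A convenience app sits in one cloud (MCT1); a core banking
# system runs active-active across both clouds (MCT6).
print(assign_mct(0, 0, 1), assign_mct(2, 2, 2))
```

The value of a scheme like this is that the expensive treatment (active-active) is reserved for the workloads whose scores justify it, rather than applied uniformly.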

Laurel: How has cloud enabled innovation across NAB? I can see it in the teams and you’ve even upskilled executives to be comfortable with technology and what agile means and how you’re going to change the way that things are done. But what else are you seeing that’s just brought some kind of a particular efficiency that is a particularly proud moment for you?

Steve: I think I would go back to that description I just gave you about infrastructure as code being an incredible enabler of innovation. I mentioned compliance as code, but there’s also all kinds of operational innovation that you can perform when your infrastructure is software rather than hardware. Just being able to replicate things very quickly. The fact that you can have as many development environments as you need to develop your applications quickly and efficiently, because when you’re finished with them, you just turn them off and stop paying for them. The fact that we can move to serverless-type applications now that don’t actually require any infrastructure sitting below them, which enables our application teams to just get on and develop their applications without having to interact with anyone. Things like grid computing, which creates massive computing power for a short burst of time: you pay a lot, but only for a very short amount of time, so you end up not paying very much at all, yet you achieve massive things, such as predicting what the market’s going to do at times of concern. Infrastructure-aware apps. Some of the amazing things we are doing in cyber at the moment to understand cyberattacks and thwart them in a much more elegant way than we have in the past. Financial operations that enable us to take control of the elasticity of that cloud environment. All of those things add up to a platform of innovation that people can build on, one that really enables creative innovation.

Laurel: And how does that turn into benefits for customers? Because user experience is always an important consideration when building out tech services and as you mentioned, customers certainly expect Google- or Meta-like experiences. They want online, fast, convenient, anywhere they are, on any device, so how is something like artificial intelligence at an ATM serving both the need for improved security and improved user experience?

Steve: Great question. I think for improved security, fraud is a great one. There are so many scams going on right now, and AI has really enabled us to be able to detect fraud and to work with our customers, to prevent it in many cases. We’re seeing patterns of fraud or the ways that fraudsters actually approach their victims, and we’re able to pick that up and intervene in many cases. Operational predictions on things that are going to fail or break. And then things that are just better for customers like faster home loans. A large number of our home loans are approved in under an hour now because the AI allows us to take calculated risks, basically to do risk management in a really fast and efficient way. And then there are small things. There’s some great stuff like if I get a check, I just take a picture of it from my banking app on the iPhone and it’s instantly processed. Those sorts of things are really leading to better customer experiences.

Laurel: That’s my favorite as well, but a home loan under an hour, that’s pretty amazing.

Steve: And that’s because we have a history of what that customer’s done with us. We no longer have to have that customer fill in large surveys of what their monthly spending is and what their salary is and all of that. We have all that data. We know all that about the customer and to have to ask them again, is just silly to be frank. We can take all that information and process it directly out of their account. All we need is the customer’s permission. The open banking legislation and things that have come through at the moment that allow us to gain access to information with the customer’s permission through their other financial services, that also enables us to have a good understanding of that customer’s ability to meet their repayments.

We also do a lot of AI on things like valuations. The amount of AI going into valuing a property now is absolutely incredible. In the past, you had to send somebody out to a house to do the valuation so that they could appreciate things like road noise, right? How much road noise does that property have? What are the aspects of that house? Now, by looking at, say, Google Maps and seeing how many cars per hour flow past that house and what the topography of the landscape around it is, we can actually do calculations and tell exactly what the road noise is at that property. And we’re able to use layers and layers of information like that, along with: is the house on a flood plain? Is the house overflown by aircraft? What material is the house made of? Does it have a swimming pool? Does it have solar panels? We can pick a lot of that up from satellite imagery and do the valuation on the property much faster than we have in the past. And that enables us to provide these really fast turnarounds on things like home loans.

Laurel: That’s amazing. And of course, all of that helps keep innovation up at the bank, but then also improve your own efficiencies and money. Making money is part of being a business. And then you put the money back into making better experiences for your customers. So it’s sort of a win-win for everyone.

Steve: Yeah, I think so. I haven’t loaned money for a house since all of that has been put into place, but I’m really looking forward to the next time I do and having such a good experience.

Laurel: Collaborating with your customers is very important and collaborating with your competitors could be as well. So NAB teamed up with cloud providers and other global banks on an open digital finance challenge to prototype new banking services on a global scale. Why did NAB decide to do this? And what are some of the global financial challenges this initiative was looking to solve?

Steve: I think creating great partnerships to encourage innovation is a path forward. Like everyone, we don’t have a monopoly on great ideas. And I think if we limited ourselves to the ideas we came up with, we wouldn’t be serving our customers’ best interests. Searching globally for great ideas and then going through a process of seeing whether they can actually be productionized is a great way of bringing innovation into the bank.

My favorite at the moment is Project Carbon, which is seven banks around the world all getting together to create a secure clearinghouse for voluntary carbon credits, which if you think about that and where the world’s going and how important that will be going forward, it’s just absolutely wonderful that we’ve got this situation being built today. But yeah, there’ll be things that create more secure payments, faster payments, more convenient payments, more resilient ledgers, and I mentioned faster home loans, etc. It’s just an exciting time to be in the industry.

Laurel:  And to be so open and willing to work with other folks as well. What else are you excited about? There’s so much innovation happening at NAB and across the financial services industry, what are you seeing in the next three to five years?

Steve: I’m seeing a faster pace of change. One of the things I’m aware of at the moment, things are changing so fast, that it’s really hard to predict what is going to come up in the near future. But one thing we know for sure is we will need a platform that enables us to pivot quickly to whatever that is. So I’m actually most excited about the opportunity to build a platform that is incredibly agile and allows us to pivot and to move and to exploit some of these great ideas that are coming in from global partners, or internally or wherever they’re coming from. Our new graduates come up with quite a few themselves. How do we get those ideas to production really quickly in a safe way? And I think that is what really excites me is the opportunity to build such a platform.

Laurel: Steve, thank you so much for joining us on the Business Lab. This has been a fantastic conversation.

Steve: Thank you, Laurel.

Laurel: That was Steve Day, the chief technology officer of enterprise technology at National Australia Bank, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review overlooking the Charles River. That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology. And you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Collective Next. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.


AI and data fuel innovation in clinical trials and beyond



Laurel: So mentioning the pandemic, it really has shown us how critical and fraught the race is to provide new treatments and vaccines to patients. Could you explain what evidence generation is and then how it fits into drug development?

Arnaub: Sure. So as a concept, generating evidence in drug development is nothing new. It’s the art of putting together data and analyses that successfully demonstrate the safety and the efficacy and the value of your product to a bunch of different stakeholders, regulators, payers, providers, and ultimately, and most importantly, patients. And to date, I’d say evidence generation consists of not only the trial readout itself, but there are now different types of studies that pharmaceutical or medical device companies conduct, and these could be studies like literature reviews or observational data studies or analyses that demonstrate the burden of illness or even treatment patterns. And if you look at how most companies are designed, clinical development teams focus on designing a protocol, executing the trial, and they’re responsible for a successful readout in the trial. And most of that work happens within clinical dev. But as a drug gets closer to launch, health economics, outcomes research, epidemiology teams are the ones that are helping paint what is the value and how do we understand the disease more effectively?

So I think we’re at a pretty interesting inflection point in the industry right now. Generating evidence is a multi-year activity, both during the trial and in many cases long after the trial. And we saw this as especially true for vaccine trials, but also for oncology or other therapeutic areas. In covid, the vaccine companies put together their evidence packages in record time, and it was an incredible effort. And now I think what’s happening is the FDA’s navigating a tricky balance where they want to promote the innovation that we were talking about, the advancements of new therapies to patients. They’ve built in vehicles to expedite therapies such as accelerated approvals, but we need confirmatory trials or long-term follow up to really understand the evidence and to understand the safety and the efficacy of these drugs. And that’s why that concept that we’re talking about today is so important, is how do we do this more expeditiously?

Laurel: It’s certainly important when you’re talking about something that is life-saving innovations, but as you mentioned earlier, with the coming together of both the rapid pace of technology innovation as well as the data being generated and reviewed, we’re at a special inflection point here. So, how has data and evidence generation evolved in the last couple years, and then how different would this ability to create a vaccine and all the evidence packets now be possible five or 10 years ago?

Arnaub: It’s important to set the distinction here between clinical trial data and what’s called real-world data. The randomized controlled trial is, and has remained, the gold standard for evidence generation and submission. And we know within clinical trials, we have a really tightly controlled set of parameters and a focus on a subset of patients. And there’s a lot of specificity and granularity in what’s being captured. There’s a regular interval of assessment, but we also know the trial environment is not necessarily representative of how patients end up performing in the real world. And that term, “real world,” is kind of a wild west of a bunch of different things. It’s claims data or billing records from insurance companies. It’s electronic medical records that emerge out of providers and hospital systems and labs, and even increasingly new forms of data that you might see from devices or even patient-reported data. And RWD, or real-world data, is a large and diverse set of different sources that can capture patient performance as patients go in and out of different healthcare systems and environments.

Ten years ago, when I was first working in this space, the term “real-world data” didn’t even exist, or it was treated like a swear word; it’s a term that was basically created in recent years by the pharmaceutical and regulatory sectors. So, I think what we’re seeing now, the other important piece or dimension, is that the regulatory agencies, through very important pieces of legislation like the 21st Century Cures Act, have jump-started and propelled how real-world data can be used and incorporated to augment our understanding of treatments and of disease. So, there’s a lot of momentum here. Real-world data is used in 85% to 90% of FDA-approved new drug applications. So, this is a world we have to navigate.

How do we keep the rigor of the clinical trial and tell the entire story, and then how do we bring in the real-world data to kind of complete that picture? It’s a problem we’ve been focusing on for the last two years, and we’ve even built a solution around this during covid called Medidata Link that actually ties together patient-level data in the clinical trial to all the non-trial data that exists in the world for the individual patient. And as you can imagine, the reason this made a lot of sense during covid, and we actually started this with a covid vaccine manufacturer, was so that we could study long-term outcomes, so that we could tie together that trial data to what we’re seeing post-trial. And does the vaccine make sense over the long term? Is it safe? Is it efficacious? And this is, I think, something that’s going to emerge and has been a big part of our evolution over the last couple years in terms of how we collect data.
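The linkage described here, tying a patient’s trial record to their post-trial real-world records, can be sketched as a simple keyed join. All of the data shapes, identifiers, and field names below are invented for illustration and are not Medidata Link’s actual design; real linkage also involves tokenization and consent handling not shown:

```python
# Illustrative sketch of linking patient-level trial data to
# real-world follow-up records by a shared patient ID.

trial = {
    "p001": {"arm": "vaccine", "completed": True},
    "p002": {"arm": "placebo", "completed": True},
}
real_world = [
    {"patient": "p001", "source": "claims", "event": "booster"},
    {"patient": "p002", "source": "emr", "event": "infection"},
    {"patient": "p999", "source": "emr", "event": "unrelated"},  # not enrolled
]

def link_records(trial, real_world):
    """Attach each real-world record to its trial patient, if enrolled."""
    linked = {pid: dict(rec, followup=[]) for pid, rec in trial.items()}
    for rw in real_world:
        if rw["patient"] in linked:
            linked[rw["patient"]]["followup"].append(rw)
    return linked

linked = link_records(trial, real_world)
print(linked["p001"]["followup"])
```

The point of the join is exactly what the transcript describes: the tightly controlled trial record and the messier post-trial record end up on one timeline per patient, so long-term safety and efficacy questions can be asked of both together.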

Laurel: That collecting data story is certainly part of maybe the challenges in generating this high-quality evidence. What are some other gaps in the industry that you have seen?

Arnaub: I think the elephant in the room for development in the pharmaceutical industry is that despite all the data and all of the advances in analytics, the probability of technical success, or regulatory success as it’s called for drugs, moving forward is still really low. The overall likelihood of approval from phase one consistently sits under 10% for a number of different therapeutic areas. It’s sub 5% in cardiovascular, it’s a little bit over 5% in oncology and neurology, and I think what underlies these failures is a lack of data to demonstrate efficacy. It’s where a lot of companies submit or include what the regulatory bodies call a flawed study design, an inappropriate statistical endpoint, or in many cases, trials are underpowered, meaning the sample size was too small to reject the null hypothesis. So what that means is you’re grappling with a number of key decisions if you look at just the trial itself and some of the gaps where data should be more involved and more influential in decision making.
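The “underpowered” failure mode has concrete arithmetic behind it. As a hedged sketch (the response rates below are illustrative, not drawn from any specific trial), the standard normal-approximation formula for comparing two proportions shows how quickly the required sample size grows as the detectable effect shrinks:

```python
# Required sample size per arm for a two-proportion z-test
# (normal approximation), at two-sided alpha and a target power.
from statistics import NormalDist
import math

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Patients per arm to detect p1 vs p2 at the given alpha and power."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # critical value for two-sided test
    z_b = z.inv_cdf(power)           # quantile for the target power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

print(n_per_arm(0.40, 0.50))  # 388 patients per arm for a 10-point effect
print(n_per_arm(0.40, 0.45))  # 1534: halving the effect roughly quadruples n
```

A trial sized for the optimistic effect but observing the smaller one is exactly the underpowered scenario the regulators flag: too few patients to reject the null hypothesis even when the drug works.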

So, when you’re designing a trial, you’re evaluating, “What are my primary and my secondary endpoints? What inclusion or exclusion criteria do I select? What’s my comparator? What’s my use of a biomarker? And then how do I understand outcomes? How do I understand the mechanism of action?” It’s a myriad of different choices and a permutation of different decisions that have to be made in parallel, all of this data and information coming from the real world; we talked about the momentum in how valuable an electronic health record could be. But the gap here, the problem is, how is the data collected? How do you verify where it came from? Can it be trusted?

So, while volume is good, these gaps contribute to a significant chance of bias in a variety of different areas. Selection bias, meaning there are differences in the types of patients who you select for treatment. There’s performance bias, detection bias, and a number of issues with the data itself. So, I think what we’re trying to navigate here is how you can do this in a robust way, putting these data sets together while addressing some of those key issues around drug failure that I was referencing earlier. Our approach has been to use a curated historical clinical trial data set that sits on our platform to contextualize what we’re seeing in the real world and to better understand how patients are responding to therapy. In theory, and in what we’ve seen with our work, that should help clinical development teams use data in a novel way to design a trial protocol, or to improve some of the statistical analysis work that they do.



Power beaming comes of age



The global need for power to provide ubiquitous connectivity through 5G, 6G, and smart infrastructure is rising. This report explains the prospects of power beaming; its economic, human, and environmental implications; and the challenges of making the technology reliable, effective, wide-ranging, and secure.

The following are the report’s key findings:

Lasers and microwaves offer distinct approaches to power beaming, each with benefits and drawbacks. While microwave-based power beaming has a more established track record thanks to the lower cost of equipment, laser-based approaches are showing promise, backed by an increasing flurry of successful trials and pilots. Laser-based beaming has high-impact prospects for powering equipment in remote sites, the low-earth-orbit economy, electric transportation, and underwater applications. Lasers’ chief advantage is the narrow concentration of beams, which enables smaller transmission and receiver installations. On the other hand, their disadvantage is the disturbance caused by atmospheric conditions and human interruption, although there are ongoing efforts to tackle these deficits.

Power beaming could quicken energy decarbonization, boost internet connectivity, and enable post-disaster response. Climate change is spurring investment in power beaming, which can support more radical approaches to energy transition. Due to solar energy’s continuous availability, beaming it directly from space to Earth offers superior conversion compared to land-based solar panels when averaged over time. Electric transportation—from trains to planes or drones—benefits from power beaming by avoiding the disruption and costs caused by cabling, wiring, or recharge landings.

Beaming could also transfer power from remote renewables sites such as offshore wind farms. Other areas where power beaming could revolutionize energy solutions include refueling space missions and satellites, 5G provision, and post-disaster humanitarian response in remote regions or areas where networks have collapsed due to extreme weather events, whose frequency will be increased by climate change. In the short term, as efficiencies continue to improve, power beaming has the capacity to reduce the number of wasted batteries, especially in low-power, across-the-room applications.

Public engagement and education are crucial to support the uptake of power beaming. Lasers and microwaves may conjure images of death rays and unanticipated health risks. Public backlash against 5G shows the importance of education and information about the safety of new, “invisible” technologies. Based on decades of research, power beaming via both microwaves and lasers has been shown to be safe. The public is comfortable living amidst invisible forces like wi-fi and wireless data transfer; power beaming is simply the newest chapter.

Commercial investment in power beaming remains muted due to a combination of historical skepticism and uncertain time horizons. While private investment in futuristic sectors like nuclear fusion energy and satellites booms, the power-beaming sector has received relatively little investment and venture capital relative to the scale of the opportunity. Experts believe this is partly a “first-mover” problem as capital allocators await signs of momentum. It may be a hangover of past decisions to abandon beaming due to high costs and impracticality, even though such reticence was based on earlier technologies that have now been surpassed. Power beaming also tends to fall between two R&D comfort zones for large corporations: it does not deliver short-term financial gain, but it is also not long term enough to justify a steady financing stream.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.



The porcelain challenge didn’t need to be real to get views



“I’ve dabbled in the past with trying to make fake news that is transparent about being fake but spreads nonetheless,” Durfee said. (He once, with a surprising amount of success, got a false rumor started that longtime YouTuber Hank Green had been arrested as a teenager for trying to steal a lemur from a zoo.)

On Sunday, Durfee and his friends watched as #PorcelainChallenge gained traction, and they celebrated when it generated its first media headline (“TikTok’s porcelain challenge is not real but it’s not something to joke about either”). A steady parade of other headlines, some more credulous than others, followed. 

But reflex-dependent viral content has a short life span. When Durfee and I chatted three days after he posted his first video about the porcelain challenge, he already could tell that it wasn’t going to catch as widely as he’d hoped. RIP. 

Nevertheless, viral moments can be reanimated with just the slightest touch of attention, becoming an undead trend ambling through Facebook news feeds and panicked parent groups. Stripping away their original context can only make them more powerful. And dubious claims about viral teen challenges are often these sorts of zombies—sometimes giving them a second life that’s much bigger (and arguably more dangerous) than the first.

For every “cinnamon challenge” (a real early-2010s viral challenge that made the YouTube rounds and put participants at risk for some nasty health complications), there are even more dumb ideas on the internet that do not trend until someone with a large audience of parents freaks out about them. 

Just a couple of weeks ago, for instance, the US Food and Drug Administration issued a warning about boiling chicken in NyQuil, prompting a panic over a craze that would endanger Gen Z lives in the name of views. Instead, as BuzzFeed News reported, the warning itself was the most viral thing about NyQuil chicken, spiking interest in a “trend” that was not trending.

And in 2018, there was the “condom challenge,” which gained widespread media coverage as the latest life-threatening thing teens were doing online for attention—“uncovered” because a local news station sat in on a presentation at a Texas school on the dangers teens face. In reality, the condom challenge had a few minor blips of interest online in 2007 and 2013, but videos of people actually trying to snort a condom up their nose were sparse. In each case, the fear of teens flocking en masse to take part in a dangerous challenge did more to amplify it to a much larger audience than the challenge was able to do on its own. 

The porcelain challenge has all the elements of future zombie content. Its catchy name stands out like a bite on the arm. The posts and videos seeded across social media by Durfee’s followers—and the secondary audience coming across the work of those Durfee deputized—are plausible and context-free. 


Copyright © 2021 Seminole Press.