
AI is Neutral Technology: What May Be Harmful in Social Media Can Help Healthcare



Netflix’s new “The Social Dilemma” documentary has been eye-opening for millions of viewers, sparking conversation — and concern — about how the algorithms used by social media platforms manipulate human behavior.

By Dr. Darren Schulte, MD, Chief Executive Officer at Apixio.

These platforms leverage artificial intelligence that has become shockingly good at analyzing, predicting, and influencing user behavior. The film asserts that the resulting unintended consequences have created real-life dystopian outcomes: excessive screen time that causes real-world relationships to suffer, addictive behavior, alarming societal divisiveness, and even higher rates of depression, self-harm, and suicide.

These consequences compound as users look to social media for validation, while big tech corporations profit enormously by harvesting and analyzing user data and manipulating behavior to benefit advertisers.

While the film appears to give machine learning algorithms a bad rap, these algorithms aren’t inherently evil. It all depends upon what the algorithms are trained to do.

In fact, AI algorithms have tremendous potential to transform health care by improving individual patient outcomes and overall population health, enabling more personalized medicine, reducing waste and costs, and accelerating the discovery of new treatments and preventive measures.

The same type of algorithms showcased in “The Social Dilemma” can be trained to analyze data generated by patients, care providers, and devices (like wearables).

The algorithms can even monitor body functions (like lab tests and vital signs) to provide deeper and more accurate insight into individual health, health-related habits, and behaviors over time.

By combining that individual data with anonymous, aggregated population data, we can discover better treatments, refine clinical guidelines, and develop new therapies to improve overall population health. Consider just a few of the ways this could play out:

  • Improve response to emergent diseases like COVID-19. One of the problems we’ve had with effectively treating COVID-19 patients is that there’s been a lot of experimentation and trial and error, and the data on the results of those therapies has been slow to propagate across the global medical community.

Hospitals and physicians only have data on the patients they treat themselves. With no cohesive system for sharing patient data, providers in America, for example, have not been able to benefit quickly enough from the knowledge and experience of providers in Asia and Europe, where the virus spread first.

By leveraging AI to mine aggregated medical records from millions of individuals, we could see what treatments have been most effective for specific patient cohorts.

Even further, we could analyze the characteristics of those already infected to see which attributes make one more likely to develop the most severe symptoms. By identifying vulnerable populations faster, we can then take targeted steps to prevent infection and implement the most effective treatments.

As we have seen, analyzing and exchanging this data manually takes far too long, contributing to the disease’s propagation and death toll. With AI, we can surface this knowledge much faster and potentially reduce the impact of the next novel disease.

  • Provide better patient surveillance. Identifying how – and how fast – COVID-19 spreads has also been a significant challenge. Scientists traditionally use a metric called R0 (pronounced “R naught”), a measure of the average number of people infected by one infectious individual.

Using R0 to predict COVID-19’s spread has been problematic for several reasons, including the fact that different groups use different models and data, and asymptomatic individuals can spread the disease without knowing that they are infected.
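To see why small differences in this number matter so much, consider a naive branching model (an illustrative simplification, not a method from the article): each case infects R0 new people per transmission generation, so case counts compound geometrically.

```python
# Naive branching model: one initial case, each infection produces R0 new
# cases per transmission generation. Illustrative only; real epidemic models
# account for immunity, interventions, and asymptomatic spread.

def projected_cases(r0: float, generations: int) -> float:
    """Cumulative infections starting from a single case."""
    return sum(r0 ** g for g in range(generations + 1))

for r0 in (0.9, 1.5, 2.5):
    print(f"R0={r0}: ~{projected_cases(r0, 10):,.0f} total cases after 10 generations")
```

Below 1, an outbreak fizzles out; above 1, it compounds rapidly, which is why disagreements between models and unseen asymptomatic spread make the metric so hard to pin down.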

AI can help resolve this issue and improve patient surveillance by analyzing the medical records of patients who tested positive alongside contact-tracing data that indicates the potential for infection. By combining this data and analyzing it at scale, medical authorities can determine where to implement aggressive testing programs and more restrictive shelter-in-place measures to slow the spread of disease, as the sketch below illustrates.
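Here is a toy illustration of combining those two data sources. The data shapes and field names are invented, not any real surveillance system; it simply ranks regions by traced exposures from positive cases to prioritize testing.

```python
# Hypothetical sketch: join positive-test records with contact-tracing counts
# to rank regions for aggressive testing. All data and fields are invented.
from collections import Counter

positives = [
    {"patient": "p1", "region": "north"},
    {"patient": "p2", "region": "north"},
    {"patient": "p3", "region": "south"},
]
contacts = {"p1": 12, "p2": 4, "p3": 2}  # patient -> traced close contacts

exposure_by_region = Counter()
for case in positives:
    exposure_by_region[case["region"]] += contacts.get(case["patient"], 0)

# Regions with the most traced exposures are candidates for targeted testing.
for region, exposures in exposure_by_region.most_common():
    print(region, exposures)
```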

  • Improve the quality of care. Health care providers want to deliver the best quality of care to their patients. But one of the challenges they face is measuring quality and patient outcomes with empirical evidence. With patient data scattered across different sources like electronic health records (EHRs), lab results, and imaging studies, it is difficult to aggregate and analyze.

By implementing systems that consolidate this data and allow providers to use AI to mine it for insights, physician practices and hospitals can identify trends among patients and implement quality improvement programs.

For example, if they see that individuals with certain characteristics fail to follow up on important health concerns, providers can intervene with appointment reminders, transportation resources, telehealth options, or other measures to keep patients engaged in their own care.

On the flip side, insurers are also concerned about care quality and ensuring patients get the best possible outcome at the lowest possible cost.

AI can help insurers track and measure patient outcomes as they move through the care system—from a primary care provider to a specialist to a hospital for surgery and into a rehab facility, for example—and identify providers or treatment protocols that may not be delivering optimal results. Insurers can then work with providers to implement new approaches to improve success rates and overall patient outcomes.

  • Identify and mitigate concerning trends. During a typical patient encounter, doctors only have access to the medical information for the patient in front of them. Consulting that patient’s history provides a limited view of factors that might indicate declining health, and with data scattered across different systems, doctors do not always have all the data they need at hand.

AI can help surface broader indicators that a patient’s health may be declining over time.

By analyzing aggregate data across a large population, AI can show that patients with certain vital signs or trends in their data might be headed toward developing certain conditions, like diabetes or heart disease.

Physicians can use this information as a predictor of potential trouble and begin implementing preventative action. Some solutions can alert physicians to these insights as notifications within the Electronic Health Record (EHR) during the patient encounter. This allows physicians to take swift action to prevent disease progression.
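To make that concrete, here is a minimal sketch of the kind of population-trained risk model described above, using scikit-learn on synthetic data. The feature names (systolic blood pressure, BMI, fasting glucose) and the alerting idea are illustrative assumptions, not Apixio’s method or any real EHR integration.

```python
# Illustrative only: synthetic cohort, hypothetical features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic cohort: [systolic BP, BMI, fasting glucose] for 1,000 patients.
X = rng.normal(loc=[125, 28, 100], scale=[15, 5, 15], size=(1000, 3))
# Synthetic outcome: higher glucose and BMI raise diabetes risk.
risk_score = 0.04 * (X[:, 2] - 100) + 0.08 * (X[:, 1] - 28)
y = risk_score + rng.normal(scale=0.5, size=1000) > 0.5

model = LogisticRegression().fit(X, y)

# Score a new patient during an encounter; a high probability could trigger
# a notification inside the EHR for the physician to review.
patient = [[138, 33, 121]]
print(f"Predicted diabetes risk: {model.predict_proba(patient)[0, 1]:.0%}")
```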

  • Enable personalized medicine. The health care industry has been moving toward personalized medicine for years, aiming to transform the “one-size-fits-all” approach to care into a customized plan for each individual. But this is practically impossible without access to aggregated data and insights that only AI can provide.

Consider the AI social media companies use to create and leverage personas to prompt engagement and drive advertising dollars. If we were to apply the same technique to build health care personas for each person, we could then provide this information to providers (with the patient’s permission).

Providers could then use tools like notifications, nudges, cues, or other communication (just like social media) to elicit positive behavior for better health.

For example, providers could target at-risk patients with prescription reminders, diet recommendations, or other resources relevant to their specific health situation.

  • Reduce diagnostic and treatment errors. Even the best providers can overlook important details and make mistakes, especially with the pressure they are under to squeeze more patients into a typical day.

Just as algorithms can help social platforms surface insights about their audience to woo advertisers, physicians can use algorithms to surface insights to diagnose and treat conditions accurately. For example, AI can highlight confounding conditions or risk factors for patients, allowing doctors to consider the individual’s entire health profile when making decisions.

AI can also aid in surfacing potential drug interactions that could put patients at risk. All of this can substantially lower the risk of errors that cause patients harm, not to mention reduce the risk of malpractice accusations.

In the same way that algorithms can identify Facebook users who might be interested in a new lawnmower and serve up an appropriate ad, they can help providers identify high-risk patients before they develop costly care needs. By culling through data to identify risk factors, AI allows providers to implement preventive and early-intervention strategies.

For example, an algorithm might spot a specific obesity indicator that correlates with the risk for Type II diabetes, or identify patients with high blood pressure who are at greater risk of heart attack, stroke, or kidney disease.

These insights can be delivered at the point of care, even during a patient encounter. If a patient displays a specific set of symptoms, as the data is entered into the EHR, the physician is alerted to the risk and can review trends in disease progression or confounding conditions to plot the best course of action.

  • Identify optimal treatment pathways through data-based referrals. Traditionally, when a patient needs to see a specialist (for surgery or physical therapy, for example), physicians refer them to providers with whom they have existing relationships.

Unfortunately for patients, this does not always mean they get the best care for their unique situation. Does the provider have experience working with patients with co-morbidities? Do they specialize in complex surgeries or more typical procedures?

AI allows providers to refer each patient to the best provider for their unique needs based on hard evidence of success and proven outcomes, rather than simply on existing ties.

For example, if a patient with diabetes needs a knee replacement, AI can help primary care providers to identify orthopedic specialists and rehabilitation providers with proven, demonstrably better results in handling patients with this co-existing condition.

  • Reduce spending waste. About 30% of healthcare spending is considered “waste,” totaling up to $935 billion. Nearly $80 billion alone can be attributed to overtreatment or low-value care.

In other words, providers order more tests, services, and procedures that aren’t necessarily the best option—or even necessary at all—mostly in an effort to protect themselves against accusations of not doing enough and to meet insurers’ requirements (e.g., ordering X-rays before an MRI when an injury is clearly soft-tissue related, or sending patients for multiple repeat mammograms before conducting an ultrasound to evaluate a suspicious lump).

By mining data with algorithms, providers and insurers can focus on the tests and procedures that demonstrate high value or are necessary for specific cases. For example, is it necessary for patients on certain medications to get blood tests every 90 days? Do wellness visits add value for patients?

By looking at what is most effective across the larger population, AI can help point physicians in the right direction earlier, reducing unnecessary diagnostics and placing the patient on the path to better health more quickly.

AI can thereby reduce wasteful spending by identifying the diagnostics that are most effective and economical, potentially saving patients and payers millions every year on ineffective tests and treatments.

  • Accelerate drug and treatment discovery. The current pathway to new drugs, vaccines, and treatments is long and arduous. On average, it takes at least ten years for new drugs to go from discovery to marketplace, with trials alone taking as long as seven years on average. For new vaccines, the average time to market is up to 12 years (which puts hope for a COVID-19 vaccine by year’s end into perspective).

One of the reasons the process is so slow is the lack of advanced data and analytics capabilities in the process.

The use of AI to analyze patient and drug performance data could substantially accelerate the time to market for new drugs and vaccines, which could save lives.

Just as the lack of data analytics meant doctors struggled to devise effective COVID-19 protocols, the inability to rapidly analyze trial data and evaluate new use cases for existing drugs prevents patients from getting the treatment they need.

Algorithms can accelerate this analysis and get much-needed medicines into the hands of patients faster.

  • Reduce administrative burdens. The time providers spend on manual administrative work adds up to a significant cost and takes away from direct, face-to-face time with patients.

AI can help reduce this burden and lower operational costs by automating manual processes like prior authorizations and by reducing retrospective chart reviews, surfacing the right data to the right people earlier. The right data, quickly obtainable, helps physicians make better, faster decisions.

These administrative efficiencies enabled by AI ultimately lower the cost of health care services for both patients and payers and free up more resources to improve direct patient care.

The harm in social media arises when data is used to influence human behavior in ways that bring negative consequences.

For the most part, technology is neutral. But in the wrong hands with the wrong motives or objectives, the use of algorithms can raise serious ethical questions.

The same algorithms that cause us to feel more anxious, isolated, or depressed when leveraged by social media can also be used to help us heal, stay healthy, and achieve optimal well-being.

The questions all come down to the algorithm’s objective and the training, testing, and user feedback data it relies on. The reality is that managing both individual and public health in the 21st century requires access to data and insights.

Without data-driven insights, we are just guessing at what works in health care and what doesn’t.

Leveraging algorithms to analyze health care data empowers physicians to devise a truly personalized care plan for each individual. Physicians can improve the overall quality of care and lower health care costs by tapping into collective insight and knowledge gleaned from millions of patient records.

Image Credit: Karolina Grabowska; Pexels

How Preql is Transforming Data Transformation


More than one million small businesses use ecommerce platform Shopify to reach a global audience of consumers. That includes direct-to-consumer (DTC) all-stars like Allbirds, Rothy’s and Beefcake Swimwear.

But online sellers like these are also ingesting data from platforms like Google Analytics, Klaviyo, Attentive and Facebook Ads, which quickly complicates weekly reporting.

That’s where data transformation comes in.

dbt and Preql 

As the name implies, data transformation tools help convert data from its raw format into clean, usable data that enables analytics and reporting. Centralizing and storing data is easier than it’s ever been, but creating reporting-ready datasets requires aligning on business definitions, designing output tables, and encoding logic into a series of interdependent SQL scripts, or “transformations.” Businesses are making significant investments in data infrastructure tooling, such as ingestion tools, data storage, and visualization/BI, without having the internal expertise to transform their data effectively. But they quickly learn that if they can’t effectively structure their data for reporting, they won’t get value from the data they’re storing—or from the investment they’ve made.
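To make the idea concrete, here is a minimal sketch of a transformation in pandas rather than SQL. The tables, column names, and the weekly ROAS definition are illustrative assumptions, not Preql’s or dbt’s implementation: raw per-source data goes in, a clean reporting table comes out.

```python
# Minimal transformation sketch: raw orders and ad spend in, a weekly
# reporting table out. All tables and columns are invented for illustration.
import pandas as pd

orders = pd.DataFrame({
    "order_date": pd.to_datetime(["2023-01-02", "2023-01-03", "2023-01-09"]),
    "revenue": [120.0, 80.0, 200.0],
})
ad_spend = pd.DataFrame({
    "date": pd.to_datetime(["2023-01-02", "2023-01-03", "2023-01-09"]),
    "spend": [30.0, 25.0, 60.0],
})

# Encode the business definition once: weekly revenue, spend, and ROAS.
weekly = (
    orders.resample("W", on="order_date")["revenue"].sum().to_frame()
    .join(ad_spend.resample("W", on="date")["spend"].sum())
)
weekly["roas"] = weekly["revenue"] / weekly["spend"]
print(weekly)
```

A dbt project does the same job declaratively, as a DAG of SQL models; the point in either case is that the business definition (“weekly ROAS”) is encoded once and reused everywhere.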

The space includes two kinds of players: the established dbt and newer startups like Preql.

Founded in 2016, dbt “built the primary tool in the analytics engineering toolbox,” as the company says. It is now used by more than 9,000 companies and backed by more than $414 million in funding.

But dbt is a tool for developers at companies with established analytics engineering teams.

Preql, on the other hand, is a startup building a no-code data transformation tool that targets business users who might not have expertise in programming languages but who nevertheless need trusted, accessible data.

Preql’s goal is to automate the hardest, most time-intensive steps in the data transformation process so businesses can be up and running within days as opposed to the six- to 12-month window for other tools. 

“We built Preql because the transformation layer is the most critical part of the data stack, but the resources and talent required to manage it make reliable reporting and analytics inaccessible for companies without large data functions,” said Gabi Steele, co-founder and co-CEO of Preql.

The startup is therefore positioning itself as an alternative to hiring full analytics engineering teams solely to model and manage business definitions—especially among early-stage companies that are first building out their data capabilities. 

In other words, Preql is the buffer between the engineering team and the people who actually need to use the data.

“Data teams tend to be highly reactive. The business is constantly asking for data to guide decision making, but in the current transformation ecosystem, even small changes to data models require time and expertise. If business users can truly manage their own metrics, data talent will be able to step out of the constant back and forth of fulfilling reporting requests and focus on more sophisticated analyses,” said Leah Weiss, co-founder and co-CEO of Preql.

But that’s not to say dbt and Preql are bitter rivals. In fact, they are part of the same data transformation community—and there’s a forthcoming integration.

“One way to think about it is we want to help the organizations get up and running really quickly and get the time to value from the data they’re already collecting and storing without having to have the specialized talent that’s really well versed in dbt,” Steele added. “But as these companies become more sophisticated, we will be outputting dbt, so they can leverage it if that’s the tool that they’re most comfortable with.”

A Closer Look at Preql

The startup raised a $7 million seed round in May, led by Bessemer Venture Partners, with participation from Felicis.

Preql collects business context and metric definitions and then abstracts away the data transformation process. It helps organizations get up and running with a central source of truth for reporting without having a data team or writing SQL.

Preql reads in data from the warehouse and writes back clean, reporting-ready schemas. It partners with data ingestion tools that move data from source applications into the warehouse, such as Airbyte and Fivetran, and with cloud data warehouses like Snowflake, Redshift, and BigQuery. For businesses that consume data in BI tools, it also partners with Looker, Tableau, and Sigma Computing.

DTC Target

Preql is initially focused on the DTC market in part because the metrics, such as customer acquisition cost (CAC), conversion rate, and lifetime value (LTV), are standardized. DTC companies also tend to have lean operations.

“We’ve found that these companies are working really hard to download data from disparate sources—third-party platforms that they use, Shopify, their paid marketing platforms—in order to get a sense of even basic business health and performance,” Weiss said. 

They also tend to use manual reporting processes, which means “it’s often an operations person who’s downloading data from a bunch of sources, consolidating that in spreadsheets, making a bunch of manual interventions and then outputting weekly reporting or quarterly reporting,” she added. 

But much of what these companies want to measure about performance is consistent and a lot of the data sources are structured the same way.

“With Preql, we were able to make some assumptions about what we wanted to measure with the flexibility to customize a few of those definitions that are specific to our business,” added Cynthia Plotch, co-founder at Stix, a women’s health essentials ecommerce site. “Preql gave us clean, usable data for reporting.  We were up and running with weekly reporting within days, saving us months of effort if we had to invest in data engineering teams.”

Data Transformation in 2027

Steele and Weiss believe the next five years will be about “delivering on the promise of the modern data stack.”

In other words, answering questions like: Now that we have scalable storage and ingestion, how can we make sure we can actually leverage data for decision making? And how can we build trust in reporting so we can build workflows around it and act on it? 

This is because a lot of companies struggle to move on to predictive analytics and machine learning because they never solved the fundamental issue of creating trusted, accessible data. 

 What’s more, Preql believes the next phase of tools will go beyond building infrastructure to deliver more value as data talent sits closer and closer to the business.

“Data analytics will only get more complicated because the number of data sources is growing, along with their complexity, and the need is becoming more acute for real time results. And the more data you have, the more granular the questions become and even more is expected of it,” Amit Karp, partner at Bessemer Venture Partners added. “I think we’re in the very early innings of what’s going to be a very long wave—five, ten or even 20 years down the road.  It’s a giant market.”

Rekha Ravindra

Rekha has 20+ years of experience leading high-growth B2B tech companies and has built deep expertise in data infrastructure – helping to take often very complex technology and ideas and make them understandable for broader business and tech audiences.


Can Traditional Companies Act Like Start-Ups?


Much has been made of the culture clash between older, slower, more traditional companies and younger, more dynamic, faster-moving tech start-ups. Each has advantages and disadvantages but, generally speaking, it is very hard to reconcile the two approaches, as they are naturally in opposition to each other.

The general motto among start-ups of “move fast and break things” has led to very quick yet massive successes, with some companies, Google and Amazon being the most obvious examples, growing larger than traditional competitors that have been around for decades. But it has also led to a lot of unintended damage to traditional industries like transportation and publishing, their ‘disruption’ doing as much harm as good. And, more often than not, start-ups can see millions or even billions in investment wasted on bad ideas and unproven tech (Theranos, anyone?). “Fake it till you make it” means that, eventually, you actually do need to make it.

Image Credits: Pexels

Meanwhile, traditional companies, while providing more useful and regular forms of employment, great institutional knowledge, and decades of business experience, have their own problems. Because they often resemble large, inefficient bureaucracies, they are slow to move and respond to change. Old companies can be blind to, and even fearful of, innovation and new technology. This can leave them dead in the water when the future finally arrives. Kodak, for example, went from venerated, dominant business to almost nothing in just a few years because it refused to accept the revolution of digital photography.

But is there a way to integrate the two approaches? To take the best from both cultures and business plans and use those aspects to move into the future? To get big, old businesses to work, at least in some ways, like small, agile, young start-ups? Yes, but it isn’t easy.

Innovation Without Disruption

As stated, one of the greatest fears of traditional companies is having their business, or their entire sector, undercut by a growing start-up. While independent start-ups are expected to disrupt, be change agents, or however you want to put it, more traditional companies are prone to be much more risk averse. Naturally, one of the smartest things that an old company can do to avoid being left behind is to lead the disruption themselves.

Image Credits: Pexels

Many traditional businesses are currently investing in, and should continue to invest in, the digital transformation of their business model, from top to bottom. This, however, is a slow process, especially at sizable companies. The use of machine learning, predictive analysis, AI, and other cutting-edge digital tools allows old business models to become more efficient and to respond to changes in supply and demand, and to market tumult, in better and smarter ways. But it isn’t as easy as flipping a switch.

A New Business to Try New Things

Quite a few traditional businesses are spinning out new sectors, tech labs, and other separate silos to do the work of digital innovation for them. This isn’t uncommon. Businesses have, basically forever, had subsidiaries. The problem is that old businesses have trouble actually committing to the idea.

Often, the business that is spun-out is, essentially, a temporary one. The leaders of the core business get cold feet, limit the new project’s mandate, and pull it back in as soon as possible. Such hesitance is limiting in today’s digital world, where the next revolutionary innovation is always just around the corner.

Image Credits: Pexels

Furthermore, spin-outs with good ideas and potential for growth are frequently allowed to die on the vine or go to seed. To put it plainly, the core business doesn’t invest in the digital spin-out’s success. The great advantage of digital companies is their ability to scale with almost lightning speed. But core businesses have to be ready with resources and support for the scale-up to even happen, let alone work. Otherwise, a grand opportunity will go to waste.

If a business spin-out does well enough, it should be allowed to grow and change as it needs to, provided that it remains successful and worthwhile. Whether the goal is for the new business to simply make money in an area the core business isn’t directly addressing, or to develop digital innovations for the core business to take up, if it works, it works. Don’t get in the way of success just because it is new or comes in an unfamiliar form. At the same time, core businesses must be careful about how they measure success for these new experiments. Measuring the new company or spin-out with the same metrics as the core business can choke its momentum and give an inaccurate picture. After all, newer, smaller businesses or initiatives shouldn’t be expected to be profitable immediately.

Cultural Change, From the Executive Level On Down

All the innovation in the world won’t mean anything if the people running the business itself refuse to change. Older companies, and older executives, can become set in their ways, dismissive of new technologies and ways of doing business, and ignore the automation and efficiencies of advanced digital tools. We saw this at the beginning of the widespread use of the internet twenty years ago, and we’re seeing it now.

More important than this is the need for people in positions of real power in companies to implement the changes needed for innovation and advancement, and to do so thoroughly and effectively. There must be a willingness to let start-up culture infiltrate and influence the way business is done at every level, or it won’t be effective enough to help.

Image Credits: Pexels

It is painfully common for large, traditional companies to put money into research and development of new ideas and new technologies, only for executives and other decision makers to ignore what’s in front of them, either because of cost, or risk, or something as simple as a fear of the future.

But the future of business is changing in a digital world. Things move and change with almost frightening speed. The Covid-19 pandemic is absolute proof of that: it wasn’t just companies with digital tools at the ready that were able to survive. While they had an advantage, it was the companies that were able to acknowledge the rapidly changing situation, and react to it quickly and efficiently, that kept things going and, in some cases, even improved their bottom lines.

But It’s More Than Just a Cultural Change

One of the biggest advantages of tech start-up culture is that it is forward-facing. It is an attitude toward business and technology that is not just looking toward the future (every business does that) but actively trying to grapple with it, and even to shape it, if possible. Traditional, legacy businesses need to admit that the world is not static and that they have a responsibility to influence how their industry develops.

Part of that responsibility is letting innovators be innovators. If a large company spins out a business unit to study and improve its digital technology, that company can’t then balk when those innovators recommend widespread change, or create a new idea that could shake the company, or its whole industry, to its core.

Image Credits: Pexels

Conclusion

To put it as simply as possible, for an older, more traditional company to reap the benefits of adopting a start-up model, it has to actually adopt it. It can’t just make superficial changes; it needs to truly invest. But that kind of investment carries risk, which can make more traditional companies nervous. The work of transformation must actually be done.

That means supporting digital innovations and changes when they make things more efficient. It means letting spin-out businesses actually try new things, and grow to scale when they hit upon something new and successful. It means executives getting out of the way so the forces of change can actually, you know, change things. Otherwise, the ‘traditional’ company will just be the ‘old’ company, sitting around waiting for some new tech upstart to disrupt it into obsolescence.

Demos Parneros

CEO | President | Board Director

Demos Parneros is an experienced and innovative retail and e-commerce leader, helping Staples grow from a startup to a Fortune 100 company, serving as President of North American Retail and E-commerce businesses. He subsequently took on the role of CEO at Barnes & Noble, leading a focused transformation plan, which eventually led to the sale of the company. In addition to previously serving on several high-profile company boards, Demos now leads CityPark LLC, where he has invested in 15 companies, including several leading-edge retail tech startups.


Understanding Edge Computing and Why it Matters to Businesses Today

By Hady Shaikh


The edge computing market is expected to reach $274 billion by 2025, spanning segments like the internet of things, public cloud services, and patents and standards.

Much of this growth is driven by enterprises shifting their data centers to the cloud, which has enabled them to move beyond cloud systems to edge computing and extract the maximum potential from their computing resources.

This blog will provide a closer understanding of edge computing and how it helps businesses in the technology sector.

Understanding edge computing

From a technical standpoint, edge computing is a distributed computing framework that bridges the gap between enterprise applications and data sources, including IoT devices or local edge servers.

Put more simply, edge computing helps businesses improve experiences for people, and improve profitability, through better response times and bandwidth availability.

Why does edge computing matter for businesses?

Consider the most significant industry zones worldwide, for instance the GCC region, which is heavily focused on areas like cloud services. There, the transition from cloud technology to edge computing is now more prominent than ever as enterprises look to leverage the technology’s potential.

And with only 3% of businesses at an advanced stage in digital transformation initiatives, the potential of edge computing is up for grabs.

It doesn’t matter if you’re running a mobile app development company, the grocery store next door, or a next-gen enterprise: you need to understand how edge computing helps businesses and invest in the technology.

Predictive maintenance

Edge computing is primarily sought in industries where losses of high-value assets have a massive impact on the business.

The technology has enabled report delivery systems to send and receive documentation in seconds, where it usually took days to weeks.

Consider the example of the oil and gas industry, where some enterprises already utilize edge computing. Predictive maintenance has allowed them to proactively manage their pipelines and locate underlying issues before problems accumulate.

Support for remote operations

The pandemic has forced businesses to opt for remote operations, or at least a hybrid work model, with the workforce spread across different geographical boundaries.

This drastic shift has brought in the use of edge apps that give employees secure access to their organization’s official servers and systems.

Edge computing helps remote operations and hybrid teams by reducing the volume of data traveling across networks, providing computing density and adaptability, limiting data redundancy, and helping users meet compliance and regulatory guidelines.

Faster response time

Businesses can enjoy lower latency by deploying computational processes near edge devices. For instance, employees may experience delays when corresponding with colleagues on another floor because traffic is routed through a server that could sit anywhere in the world.

An edge computing application, by contrast, would route the data transfer within the office premises, lowering delays and considerably saving bandwidth at the same time.

You can quickly scale this example of in-office communication up to the fact that around 50% of the data created by businesses worldwide is created outside the cloud. Put simply, edge computing allows near-instant transmission of data.
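A minimal sketch of that bandwidth argument, assuming a made-up temperature sensor and a JSON wire format (purely illustrative, not any vendor’s protocol): summarize readings at the edge and ship only the summary upstream, instead of every raw sample.

```python
# Illustrative only: fake sensor readings and an assumed JSON wire format.
import json
import statistics

# 1,000 raw temperature samples collected at the edge device.
raw_samples = [20.1 + 0.01 * i for i in range(1000)]

# Cloud-only approach: ship every raw sample upstream.
raw_payload = json.dumps({"samples": raw_samples})

# Edge approach: summarize locally, ship only the summary.
summary_payload = json.dumps({
    "count": len(raw_samples),
    "mean": round(statistics.mean(raw_samples), 3),
    "min": min(raw_samples),
    "max": max(raw_samples),
})

print(f"raw payload: {len(raw_payload):,} bytes")
print(f"edge summary: {len(summary_payload)} bytes")
```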

Robust data security

According to Statista, global data production is expected to exceed 180 zettabytes by 2025, and data security concerns will increase proportionately.

And with businesses producing and relying on data more than ever, edge computing is a solid prospect for processing large data sets more efficiently and securely, since the processing happens near the data source.

When businesses treat the cloud as their sole savior for data storage, keeping everything in a single centralized location, they open themselves up to hacking and phishing risks.

On the other hand, an edge computing architecture adds an extra layer of security because it doesn’t depend on a single point of storage or a single application; it is distributed across different devices.

In case of a hack or phishing attempt, a single compromised component of the network can be disconnected from the rest of the network, preventing a complete shutdown.

Convenient IoT adoption

Global IoT spending is expected to surpass $410 billion by 2025. For businesses that rely on connected technology, especially in the manufacturing sector, the internet of things is in the thick of global industry today.

Such organizations are on a constant hunt to increase their computational potential and probe into IoT through more dedicated data centers.

The adoption of edge computing makes the subsequent adoption of enterprise IoT quite cheap and puts little stress on network bandwidth.

Businesses with computational prowess can leverage the IoT market without adding any major infrastructure expenses.

Lower IT costs

Global IT spending on devices, enterprise software, and communication services rose from $4.21 trillion to $4.43 trillion in 2022. A considerable share of that spending goes to cloud solutions, as the pandemic has pushed remote operations and the hybrid working model even further.

When users keep data physically closer to the network’s edge, the cost of sending it to the cloud drops, encouraging businesses to save on IT expenses.

Besides cutting costs, edge computing also helps businesses increase their ROI through enhanced data transmission speeds and the improved networks needed to experiment with new models.

How is edge computing different from cloud computing?

Although edge computing and cloud computing are counterparts for data storage and distribution, there are some key differences depending on the user’s context.

Deployment

Edge computing deploys resources at the point where data generates. In contrast, cloud computing deploys resources at global locations.

Centralization/decentralization

Edge computing operates in a decentralized fashion, while cloud computing is centralized.

Architecture

Edge computing is built on a stable architecture, while cloud resources are built on loosely coupled components.

Response time

Edge-based resources respond almost instantaneously, while cloud resources have higher response times.

Bandwidth

Edge computing requires less bandwidth, while its cloud counterpart consumes more.

The differences above might make edge computing look like a clear winner in all respects for any business. But there’s a catch!

If your business resides at multiple physical locations and you need a lower-latency network to promptly cater to customers who are away from your on-prem location, then edge computing is the right choice for you.

Top edge computing use cases

Although there are numerous examples of edge computing use cases, I’ll talk about a few that I find the most interesting.

Autonomous vehicles

Autonomous truck convoys, or platooning, are the easiest example of edge computing in autonomous vehicles. With the entire fleet traveling close together, saving fuel expenses and limiting congestion, edge computing has the power to eliminate the need for all the drivers except the one in the front vehicle.

The idea is that the trucks will be able to communicate with one another over low-latency connections.

Remote monitoring of oil and gas industry assets

Oil and gas accidents have proved catastrophic throughout the industry’s history. This requires extreme vigilance when monitoring the assets.

Although oil and gas assets are placed at remote locations, edge computing technology facilitates real-time analytics by processing data closer to the asset, meaning less dependency on high-quality connectivity to a centralized cloud.

Smart grid

Edge computing is on course to elevate the adoption of smart grids, enabling enterprises to handle their energy consumption better.

Modern factories, plants, and office buildings use edge platform-connected sensors and IoT devices to observe energy usage and examine their consumption in real-time.

The data from real-time analytics will aid energy management companies in creating suitable, efficient workarounds, for example, watching where high-energy-consumption machinery runs during off-peak hours for electricity demand.

Cloud gaming

Cloud gaming, seemingly the next big thing in the gaming business with services like Google Stadia and PlayStation Now, leans dramatically on latency.

That is why cloud gaming companies are on a quest to build edge servers as close to gamers as possible, reducing latency and providing a fully immersive, glitch-free experience.

Final thoughts

This concludes our discussion on understanding edge computing and how it matters for enterprises worldwide.

Now that you understand the benefits of edge computing and its applications across industries and use cases, it is evident that edge computing is a great value proposition for businesses that want to gain competitive advantages and lead their spaces from the front.

Featured Image Credit: Provided by the Author

Hady Shaikh

Hady Shaikh is a professional product strategist with over 10 years of experience working with businesses in the mobile app development, product marketing, and enterprise solutions spaces. His C-suite leadership and expertise span helping clients in the MENA and US regions build top-tier digital products and access tech consultancy. Currently working as the Principal Product Strategist at TekRevol, a US-based custom software development company, Hady’s vision is to establish a robust digital foothold in the GCC region by helping clients with their product strategy and development.

