AI is Neutral Technology: What May Be Harmful in Social Media Can Help Healthcare



Netflix’s new documentary “The Social Dilemma” has been eye-opening for millions of viewers, sparking conversation — and concern — about how the algorithms used by social media platforms manipulate human behavior.

The following article is by Dr. Darren Schulte, MD, Chief Executive Officer at Apixio.

These platforms leverage artificial intelligence that has become shockingly good at analyzing, predicting, and influencing user behavior. The film asserts that the resulting unintended consequences have created real-life dystopian implications: excessive screen time that causes real-world relationships to suffer, addictive behavior, alarming societal divisiveness, and even higher rates of depression, self-harm, and suicide.

These consequences intensify as users look to social media for validation, while big tech corporations profit enormously by harvesting and analyzing user data and manipulating behavior to benefit advertisers.

While the film appears to give machine learning algorithms a bad rap, these algorithms aren’t inherently evil. It all depends upon what the algorithms are trained to do.

In fact, AI algorithms have tremendous potential to transform health care by improving individual patient outcomes and overall population health, enabling more personalized medicine, reducing waste and costs, and accelerating the discovery of new treatments and preventative measures.

The same type of algorithms showcased in “The Social Dilemma” can be trained to analyze data generated by patients, care providers, and devices (like wearables).

The algorithms can even use surveillance of body functions (like lab tests and vital signs) to provide deeper and more accurate insight into individual health, health-related habits, and behaviors over time.

By combining that individual data with anonymous, aggregated population data, we can discover better treatments, refine clinical guidelines, and discover new therapies to improve overall population health.

  • Improve response to emergent diseases like COVID-19. One of the problems with effectively treating COVID-19 patients has been the amount of experimentation and trial-and-error involved, and even the data on the results of those therapies has been slow to propagate across the global medical community.

Hospitals and physicians only have data on the patients they treat themselves, and with no cohesive system for sharing patient data, providers in America, for example, have not been able to benefit quickly enough from the knowledge and experience of providers in Asia and Europe — where the virus spread first.

By leveraging AI to mine aggregated medical records from millions of individuals, we could see what treatments have been most effective for specific patient cohorts.

Even further, we could analyze the characteristics of those already infected to see which attributes make one more likely to develop the most severe symptoms. By identifying vulnerable populations faster, we can then take targeted steps to prevent infection and implement the most effective treatments.
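To make that concrete, here is a minimal sketch of what cohort-level analysis of aggregated records could look like, written in Python with pandas. The columns (age_group, comorbidity, treatment, recovered) and the toy data are hypothetical; a real analysis would run over de-identified records at far larger scale with proper statistical controls.

```python
# A minimal, hypothetical sketch of cohort-level treatment analysis.
# Column names and values are illustrative only.
import pandas as pd

records = pd.DataFrame({
    "age_group":   ["65+", "65+", "40-64", "40-64", "65+", "40-64"],
    "comorbidity": ["diabetes", "none", "diabetes", "none", "diabetes", "none"],
    "treatment":   ["drug_a", "drug_b", "drug_a", "drug_a", "drug_b", "drug_b"],
    "recovered":   [1, 1, 1, 0, 0, 1],
})

# Recovery rate and sample size for each cohort/treatment combination.
summary = (
    records
    .groupby(["age_group", "comorbidity", "treatment"])["recovered"]
    .agg(recovery_rate="mean", patients="count")
    .reset_index()
)
print(summary.sort_values("recovery_rate", ascending=False))
```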

As we have seen, analyzing and exchanging this data manually takes far too long, contributing to the disease’s propagation and death toll. With AI, we can surface this knowledge much faster and potentially reduce the impact of the next novel disease.

  • Provide better patient surveillance. Identifying how – and how fast – COVID-19 spreads has also been a significant challenge. Scientists traditionally use a metric called R0 (pronounced “R naught”), a measure of the average number of people infected by one infectious individual.

Using R0 to predict COVID-19’s spread has been problematic for several reasons, including the fact that different groups use different models and data, and asymptomatic individuals can spread the disease without knowing that they are infected.
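As a simple illustration of why the value of R0 matters so much for projections, the sketch below (in Python, with hypothetical numbers) multiplies case counts by R0 once per generation of infections. Real epidemiological models are far more sophisticated, accounting for immunity, interventions, asymptomatic spread, and reporting lag.

```python
# Simplified illustration only: in a fully susceptible population, each
# "generation" of infections multiplies roughly by R0. Numbers are hypothetical.
def project_cases(initial_cases, r0, generations):
    """Project case counts generation by generation under a constant R0."""
    cases = [initial_cases]
    for _ in range(generations):
        cases.append(round(cases[-1] * r0))
    return cases

print(project_cases(100, r0=2.5, generations=5))  # rapid growth
print(project_cases(100, r0=0.9, generations=5))  # gradual decline
```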

AI can help resolve this issue and improve patient surveillance by analyzing the medical records of patients who tested positive alongside contact tracing data that indicates the potential for infection. By combining this data and analyzing it at scale, medical authorities can determine where to implement aggressive testing programs and more restrictive shelter-in-place measures to slow the spread of disease.

  • Improve the quality of care. Health care providers want to deliver the best quality of care to their patients. But one of the challenges they face is measuring quality and patient outcomes with empirical evidence. With patient data scattered across different sources like electronic health records (EHRs), lab results, and imaging studies, it is difficult to aggregate and analyze.

By implementing systems that consolidate this data and allow providers to use AI to mine it for insights, physician practices and hospitals can identify trends among patients and implement quality improvement programs.

For example, if they see that individuals with certain characteristics fail to follow up on important health concerns, providers can intervene with appointment reminders, transportation resources, telehealth options, or other measures to keep patients engaged in their own care.

On the flip side, insurers are also concerned about care quality and ensuring patients get the best possible outcome at the lowest possible cost.

AI can help insurers track and measure patient outcomes as they move through the care system—from a primary care provider to a specialist to a hospital for surgery and into a rehab facility, for example—and identify providers or treatment protocols that may not be delivering optimal results. Insurers can then work with providers to implement new approaches to improve success rates and overall patient outcomes.

  • Identify and mitigate concerning trends. During a typical patient encounter, doctors only have access to the medical information for the patient in front of them. Consulting their patient history provides a limited view of factors that might indicate declining health. With data scattered across different systems, doctors do not always have all the data they need at hand.

AI can help surface broader indicators that a patient’s health may be declining over time.

By analyzing aggregate data across a large population, AI can show that patients with certain vital signs or trends in their data might be headed toward developing certain conditions, like diabetes or heart disease.

Physicians can use this information as a predictor of potential trouble and begin implementing preventative action. Some solutions can alert physicians to these insights as notifications within the Electronic Health Record (EHR) during the patient encounter. This allows physicians to take swift action to prevent disease progression.
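As a rough illustration of this kind of risk model, the sketch below trains a logistic regression on synthetic vitals. The features (BMI, systolic blood pressure, fasting glucose), the synthetic labels, and the alert threshold are all hypothetical; this is not a clinical model.

```python
# Hypothetical sketch of a population-level risk model; not a clinical tool.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic vitals: columns are BMI, systolic blood pressure, fasting glucose.
X = rng.normal(loc=[27, 125, 100], scale=[5, 15, 20], size=(500, 3))
# Synthetic outcome: higher BMI and glucose raise risk in this toy example.
y = ((0.05 * X[:, 0] + 0.03 * X[:, 2] + rng.normal(0, 1, 500)) > 4.5).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new patient and surface an alert if predicted risk is high.
new_patient = [[31.0, 140.0, 118.0]]
risk = model.predict_proba(new_patient)[0, 1]
print(f"Predicted risk: {risk:.0%}")
if risk > 0.5:
    print("Flag for preventative follow-up")
```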

  • Enable personalized medicine. The health care industry has been moving toward personalized medicine for years, aiming to transform the “one-size-fits-all” approach to care into a customized plan for each individual. But this is practically impossible without access to aggregated data and insights that only AI can provide.

Consider the AI social media companies use to create and leverage personas to prompt engagement and drive advertising dollars. If we were to apply the same technique to build health care personas for each person, we could then provide this information to providers (with the patient’s permission).

Providers could then use tools like notifications, nudges, cues, or other communication (just like social media) to elicit positive behavior for better health.

For example, providers could target at-risk patients with prescription reminders, diet recommendations, or other resources relevant to their specific health situation.

  • Reduce diagnostic and treatment errors. Even the best providers can overlook important details and make mistakes, especially with the pressure they are under to squeeze more patients into a typical day.

Just as algorithms can help social platforms surface insights about their audience to woo advertisers, physicians can use algorithms to surface insights to diagnose and treat conditions accurately. For example, AI can highlight confounding conditions or risk factors for patients, allowing doctors to consider the individual’s entire health profile when making decisions.

AI can also aid in surfacing potential drug interactions that could put patients at risk. All of this can substantially lower the risk of errors that cause patients harm, not to mention reduce the risk of malpractice accusations.

Just as algorithms can identify Facebook users who might be interested in a new lawnmower and serve up an appropriate ad, they can help providers identify high-risk patients before they develop costly care needs. By culling through data to identify risk factors, AI allows providers to implement preventative and early intervention strategies.

For example, an algorithm might spot a specific obesity indicator that correlates with the risk for Type II diabetes or identify patients with high blood pressure that are at greater risk of heart attack, stroke, or kidney disease.

These insights can be delivered at the point of care, even during a patient encounter. If a patient displays a specific set of symptoms, as the data is entered into the EHR, the physician is alerted to the risk and can review trends in disease progression or confounding conditions to plot the best course of action.
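Point-of-care alerts like these are often simple rules layered on top of EHR data. The sketch below shows the general shape; the record fields and thresholds are hypothetical, and real clinical decision support rules come from validated guidelines and the EHR vendor's integration points.

```python
# Hypothetical rule-based alerting sketch; fields and thresholds are illustrative.
def check_alerts(record):
    alerts = []
    if record.get("bmi", 0) >= 30 and record.get("fasting_glucose", 0) >= 100:
        alerts.append("Elevated BMI and fasting glucose: assess type 2 diabetes risk")
    if record.get("systolic_bp", 0) >= 140 or record.get("diastolic_bp", 0) >= 90:
        alerts.append("Hypertensive reading: review cardiovascular and kidney risk")
    return alerts

encounter = {"bmi": 32.4, "fasting_glucose": 112, "systolic_bp": 146, "diastolic_bp": 92}
for alert in check_alerts(encounter):
    print(alert)
```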

  • Identify optimal treatment pathways through data-based referrals. Traditionally, when a patient needs to see a specialist (for surgery or physical therapy, for example), physicians refer to providers with whom they have existing relationships.

Unfortunately for patients, this does not always mean they get the best care for their unique situation. Does the provider have experience working with patients with co-morbidities? Do they specialize in complex surgeries or more typical procedures?

AI allows providers to refer to the best provider for each patient’s unique needs based on hard evidence of success and proven outcomes, rather than simply based on existing ties.

For example, if a patient with diabetes needs a knee replacement, AI can help primary care providers to identify orthopedic specialists and rehabilitation providers with proven, demonstrably better results in handling patients with this co-existing condition.

  • Reduce spending waste. About 30% of healthcare spending is considered “waste,” totaling up to $935 billion. Nearly $80 billion alone can be attributed to overtreatment or low-value care.

In other words, providers order more tests, services, and procedures that aren’t necessarily the best option—or even necessary at all—mostly in an effort to protect themselves against being accused of not doing enough and to meet insurers’ requirements (e.g., ordering x-rays before an MRI when an injury is clearly soft-tissue related, or sending patients for multiple repeat mammograms before conducting an ultrasound to evaluate a suspicious lump).

By mining data using algorithms, providers and insurers can focus on the tests and procedures that demonstrate high value or are necessary in specific instances. For example, is it necessary for patients on certain medications to get blood tests every 90 days? Do wellness visits add value for patients?

By looking at what is most effective across the larger population, AI can help point physicians in the right direction earlier, reducing unnecessary diagnostics and placing the patient on the path to better health more quickly.

AI can thereby reduce wasteful spending by identifying the diagnostics that are most effective and economical, potentially saving patients and payers millions every year on ineffective tests and treatments.

  • Accelerate drug and treatment discovery. The current pathway to new drugs, vaccines, and treatments is long and arduous. On average, it takes at least ten years for new drugs to go from discovery to marketplace, with trials alone taking as long as seven years on average. For new vaccines, the average time to market is up to 12 years (which puts hope for a COVID-19 vaccine by year’s end into perspective).

One of the reasons the process is so slow is the lack of advanced data and analytics capabilities in the process.

The use of AI to analyze patient and drug performance data could substantially accelerate the time to market for new drugs and vaccines, which could save lives.

Just as the lack of data analytics meant doctors struggled to devise effective COVID-19 protocols, the inability to rapidly analyze trial data and evaluate new use cases for existing drugs prevents patients from getting the treatment they need.

Algorithms can accelerate this analysis and get much-needed medicines into the hands of patients faster.

  • Reduce administrative burden. Manual administrative tasks add up to significant cost and take away from direct, face-to-face time with patients.

AI can help reduce this burden and lower operational costs by automating manual processes like prior authorizations and by reducing retrospective chart reviews through surfacing the right data to the right people earlier. The right data, quickly obtainable, helps physicians make better, faster decisions.

These administrative efficiencies enabled by AI ultimately lower the cost of health care services for both patients and payers and free up more resources to improve direct patient care.

The harm in social media arises when data is used to influence human behavior in ways that bring negative consequences.

For the most part, technology is neutral. But in the wrong hands with the wrong motives or objectives, the use of algorithms can raise serious ethical questions.

The same algorithms that cause us to feel more anxious, isolated, or depressed when leveraged by social media can also be used to help us heal, stay healthy, and achieve optimal well-being.

The questions are all about the algorithm’s objective and the training, testing, and user feedback data it relies on. The reality is that managing both individual and public health in the 21st century requires access to data and insights.

Without data-driven insights, we are just guessing at what works in healthcare and what doesn’t.

Leveraging algorithms to analyze health care data empowers physicians to devise a truly personalized care plan for each individual. The physician can improve the quality of care overall and lower health care costs by tapping into collective insight and knowledge gleaned from millions of patient records.

Image Credit: Karolina Grabowska; Pexels


How Alternative Data is Changing the Finance Sector



Alternative data has been touted as the future for companies across many industries. Financial services companies have taken a particular interest in the field, as it has the potential to either provide completely novel signals or improve existing investment strategies.

However, understanding the scale and importance of alternative data has always been challenging, as businesses in the sector are often shrouded in mystery. Investing is extremely competitive, as alpha often depends on acquiring stronger signals than other companies can.

Now, however, the veil has been lifted, even if only slightly. Finally, there is enough data to understand how deeply alternative data and web scraping have become entrenched in the industry, and how important they have become.

What are alternative data and web scraping?

Alternative data is a negatively defined term meaning everything that is not traditional data. The latter is considered to be everything that’s published regularly according to regulations, government action, or other oversight. In other words, it’s all the data from statistics departments, financial reports, press releases, etc.

Since alternative data is defined negatively, it covers every information source that is not traditional. While the definition is somewhat broad, alternative data does have its own characteristics. Namely, it is almost always unstructured, comes in various formats (e.g., text, images, videos), and is often extracted for a highly specific purpose.

Data acquisition is significantly more complicated because both the sources and the formats are varied. Data as a Service (DaaS) businesses can resolve most of the acquisition issues; however, finding one that holds the necessary information can be complex.

Web Scraping and in-house solutions in alternative data acquisition

Many companies turn to building in-house solutions for alternative data acquisition. One of the primary methods for doing so is called web scraping. In short, it’s a method of automating online public data collection by employing bots.

These solutions go through a starting set of URLs and download the data stored within. Most bots will also further collect any URLs stored on the page for continued crawling. As a result, they can blaze through many sources within seconds or minutes.

Collected data is then delivered and parsed for analysis. Some of it, such as pricing information, can be integrated into completely automated solutions. Other data, such as anything from which investment signals might be extracted, is analyzed manually by dedicated professionals.
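As a rough sketch of that crawl-and-parse loop, the example below uses Python with the requests and BeautifulSoup libraries and a placeholder seed URL. A production scraper would add politeness controls (robots.txt, rate limits), retries, proxy management, and structured storage.

```python
# Minimal crawl-and-parse sketch; the seed URL is a placeholder.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from a seed URL, returning page text keyed by URL."""
    queue, seen, pages = deque([seed_url]), {seed_url}, {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        response = requests.get(url, timeout=10)
        soup = BeautifulSoup(response.text, "html.parser")
        pages[url] = soup.get_text(" ", strip=True)          # downloaded content
        for link in soup.find_all("a", href=True):           # collect further URLs
            next_url = urljoin(url, link["href"])
            if next_url.startswith("http") and next_url not in seen:
                seen.add(next_url)
                queue.append(next_url)
    return pages

pages = crawl("https://example.com")
print(f"Collected {len(pages)} pages")
```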

Web scraping is shaping the financial services industry

As mentioned above, financial services and investment companies took a particular interest in web scraping earlier than nearly anyone else. These businesses thrive upon gaining an informational edge over their competitors or the market as a whole.

In some sense, then, it was no surprise when web scraping turned out to be a key player in the financial services industry. To find out more about how data is managed in these companies, we surveyed over 1,000 decision-makers in the financial services industry across the US and UK.

Image Credit: Oxylabs; Thank you!

 

While internal data, as expected, remains the primary source of insight for all decision-making, web scraping has nearly overtaken it in the financial services industry. Almost 71% of our respondents have indicated that they use web scraping to help clients make business decisions.

Web Scraping and Growth Tendencies

Other insights are even more illuminating. For example, while web scraping has shown clear growth tendencies, we didn’t expect 80% of the survey respondents to believe that the focus will shift toward it even more in the coming 12 months. These trends indicate a clear intent to change the dominant data acquisition methods in the industry.

Finally, there’s reason to believe that the performance of web scraping is equally impressive. One might have suspected that automated data collection is simply a byproduct of hype: big data has been a business buzzword for a long time, so some of that excitement might have transferred to web scraping.

Implementing Web Scraping

However, those who have implemented web scraping do not seem to think it’s pure hype. Over a quarter of those who have implemented the process believe it has had the most significant positive impact on revenue. Additionally, nearly half (44%) of all respondents plan to invest in web scraping the most in the coming years.

Our overall findings are consistent across regions. As the US and UK are such significant players in the sector, the conclusions likely extend to global trends, barring some exceptions where web scraping might be trickier to implement due to legal differences.

The survey has only uncovered major differences in how web scraping is handled, not whether it’s worthwhile. For example, in the US, it’s rarely the case that compliance or web scraping itself would be outsourced (12% & 8%, respectively). On the other hand, the UK is much more lenient regarding outsourced departments (22% and 15% for outsourced compliance and outsourced web scraping, respectively).

Conclusion

While the way data is being managed in the financial services industry has been shrouded in mystery for many years, we’re finally getting a better glimpse into the trends and changes the sector has been undergoing. As we can see, web scraping and alternative data play a major role in shaping the industry.

Becoming the first true adopters of web scraping, however, is only the beginning. Both the technology and the industry are still maturing, and I firmly believe we will see many new and innovative developments in data extraction and analysis in the finance sector, headed by novel web scraping applications.

Image Credit: Pixabay; Pexels; Thank you!

Julius Cerniauskas

CEO at Oxylabs

Julius Cerniauskas is Lithuania’s technology industry leader & the CEO of Oxylabs, covering topics on web scraping, big data, machine learning & tech trends.


How to Implement a Splintered Content Strategy



Content makes the marketing world go round. It doesn’t matter what your overarching marketing strategy looks like – content is the fuel source. You can’t go anywhere without it. The biggest problem is that content can be expensive to create. We operate in a business world where thousands of pieces of content are created every single second. Trying to keep up can feel like an expensive exercise in futility.

The key to successful digital marketing in an era of saturated online channels is extracting maximum value from your content. If the traditional approach is built around “single-use” content, you need to switch gears and opt for a multi-use approach that allows you to leverage the same content over and over again. One way to do this is by building out a “splintered” content strategy.

What is a Splintered Content Strategy?

The best way to understand the splintered approach to content creation is via an analogy. In the analogy, you start with one core topic that relates to your brand and readers. This topic is represented as a tree. Then, when you want to get more value out of the tree, you chop it down into big logs. These logs represent sub-topics of the larger topic and can then be split and broken down into even smaller niches. (And this process of splintering the original topic into smaller/different pieces of micro-content can go on and on.)

Content splintering is not to be confused with content republishing or duplication. The mission isn’t to reuse the same content so much as to extract more value from the original content by finding new uses, applications, angles, and related topics. Not only does this approach help you maximize your ROI, but it also creates a tightly-correlated and highly-consistent web of content that makes both search engines and readers happy.

What You’ll Need for a Splintered Content Strategy

In order to get started with creating splintered content, you’ll need a few things:

  • Keyword research. The process always begins with keyword research. First, you need to perform detailed SEO research to zero in on the keywords that specifically resonate with your target audience. This feeds your topic selection and actual content creation. (You can think of keyword research as developing a blueprint. Just like you can’t build a house without plans, you can’t implement a splintered content strategy without keyword research.)
  • General topic. Armed with the right keywords, you can begin the process of choosing a broad topic. A general topic is a very basic, overarching topic that speaks to a specific target audience.
  • Content writers. You’ll need a team of people to actually create the content. While it’s possible to do this on your own, you ideally want to hire content writers to do the heavy lifting on your behalf. This allows you to focus on the big-picture strategy.
  • Consistency. A splintered content strategy requires consistency. Yes, there are ways to automate and streamline, but you have to ensure that you’re consistently churning out content (and that the content is closely correlated).

A good splintered content strategy takes time to develop. So, in addition to everything mentioned above, you’ll also need patience and resilience. Watch what’s working, and don’t be afraid to iterate. And remember one thing: You can always splinter a piece of content into more pieces.

How to Plan and Execute a Splintered Content Strategy

Now that we’re clear on splintered content and some of the different resources you’ll need to be successful, let’s dig into the actual how-to by looking at an illustration of how this could play out. (Note: This is not a comprehensive breakdown. These are merely some ideas you can use. Feel free to add, subtract, or modify to fit your own strategy needs.)

Typically, a splintered content strategy begins with a pillar blog post. This is a meaty, comprehensive resource on a significant topic that’s relevant to your target audience. For example, a real estate agent might write a pillar blog post on “How to Sell Your House.” This post would be several thousand words and include various subheadings that drill into specific elements of selling a house.

The most important thing to remember with a pillar post is that you don’t want to get too micro with the topic. You certainly want to get micro with the targeting – meaning you’re writing to a very specific audience – but not with the topic. Of course, you can always zoom in within the blog post, and with the splinters it produces, but it’s much more difficult to zoom out.

  • Turn the Blog Post Into a Podcast Series

Once you have your pillar piece of content in place, the splintering begins. One option is to turn the blog post into a series of podcast episodes. Each episode can touch on one of the subheadings.

Mapped from the subheadings of the blog post, the episodes might look like this:

  • How to prepare for selling > Episode 1
  • How to find a real estate agent > Episode 2
  • How to declutter and stage your property > Episode 3
  • How to price your property > Episode 4
  • How to choose the right offer > Episode 5
  • How to negotiate with repair requests > Episode 6
  • How to prepare for closing day > Episode 7
  • How to move out > Episode 8

Depending on the length of your pillar content, you may have to beef up some of the sections from the original post to create enough content for a 20- to 30-minute episode, but you’ll at least have a solid outline of what you want to cover.

  • Turn Podcasts Into YouTube Videos

Here’s a really easy way to multiply your content via splintering. Just take the audio from each podcast and turn it into a YouTube video with graphic overlays and stock video footage. (Or, if you think ahead, you can record a video of yourself recording the podcast, à la Joe Rogan.)

  • Turn YouTube Videos Into Social Clips

Cut your 20-minute YouTube video down into four or five different three-minute clips and soundbites for social media. These make for really sticky content that can be shared and distributed very quickly.

  • Turn Each Podcast Into Long-Form Social Posts

Take each podcast episode you recorded and turn it into its own long-form social post. Of course, some of this content will cover information already hashed out in the original pillar post, but that’s fine. As long as you aren’t duplicating content word-for-word, it’s totally fine if there’s overlap.

  • Turn Long-Form Social Posts Into Tweets

Your long-form social posts can then be turned into a dozen or more individual short-form tweets. Find the best sentences, most shocking statements, and most powerful statistics from these posts and schedule a series of automated posts to go out over a few weeks. (You can automate this process using a tool like Hootsuite or Buffer.)

  • Turn Content Into an Email Campaign

Finally, take your best content and turn it into a series of emails to your list. You may even be able to set up an autoresponder series that slowly drips on people with a specific call-to-action.

Using the example from this article, a real estate agent might send out a series of 10 emails over 30 days with a call-to-action to get a free listing valuation.

Take Your Content Strategy to the Next Level With a Splintered Content Strategy

There isn’t necessarily one proper way to implement a splintered content strategy. But, like everything in marketing, there’s ample room for creativity.

Conclusion

Use the parts of this article that resonate with you and adapt the rest to fit your vision for your content. Just remember the core objective of this entire approach: content maximization.

The goal is to get as much value out of your content as possible. And you do that by turning each piece of content you create into at least one more piece of content. If you do this efficiently, you will be successful.

Image Credit: by Kampus Production; Pexels; Thank you!

Timothy Carter

Chief Revenue Officer

Timothy Carter is the Chief Revenue Officer of the Seattle digital marketing agency SEO.co, DEV.co & PPC.co. He has spent more than 20 years in the world of SEO and digital marketing leading, building and scaling sales operations, helping companies increase revenue efficiency and drive growth from websites and sales teams. When he’s not working, Tim enjoys playing a few rounds of disc golf, running, and spending time with his wife and family on the beach — preferably in Hawaii with a cup of Kona coffee. Follow him on Twitter @TimothyCarter


Successful AI Requires the Right Data Architecture – Here’s How



For companies that can master it, Artificial Intelligence (AI) promises to deliver cost savings, a competitive edge, and a foothold in the future of business. But while the rate of AI adoption continues to rise, the level of investment is often out of kilter with monetary returns. To be successful with AI, you need the right data architecture, and this article explains how to build it.

Currently, only 26% of AI initiatives are being put into widespread production within an organization. Unfortunately, this means many companies spend a lot of time on AI deployments without seeing tangible ROI.

All Companies Must Perform Like a Tech Company

Meanwhile, in a world where every company must perform like a tech company to stay ahead, there’s increasing pressure on technical teams and Engineering and IT leaders to harness data for commercial growth. Especially as spending on cloud storage increases, businesses are keen to improve efficiency and maximize ROI from data that are costly to store. But unfortunately, they don’t have the luxury of time.

To meet this demand for rapid results, mapping data architecture can no longer stretch on for months with no defined goal. At the same time, focusing on standard data cleaning or Business Intelligence (BI) reporting is regressive.

Tech leaders must build data architecture with AI at the forefront of their objectives.

Otherwise, they’ll find themselves retrofitting it later. In today’s businesses, data architecture should drive toward a defined outcome, and that outcome should include AI applications with clear benefits for end-users. This is key to setting your business up for future success, even if you’re not (yet) ready for AI.

Starting From Scratch? Begin With Best Practices for Data

Data Architecture requires knowledge. There are a lot of tools out there, and how you stitch them together is governed by your business and what you need to achieve. The starting point is always a literature review to understand what has worked for similar enterprises, as well as a deep dive into the tools you’re considering and their use cases.

Microsoft has a good repository for data models, plus a lot of literature on best data practices. There are also some great books out there that can help you develop a more strategic, business-minded approach to data architecture.

Prediction Machines by Ajay Agrawal, Joshua Gans, and Avi Goldfarb is ideal for understanding AI at a more foundational level, with functional insights into how to use AI and data to run efficiently. Finally, for more seasoned engineers and technical experts, I recommend Designing Data-Intensive Applications by Martin Kleppmann. This book will give you the very latest thinking in the field, with actionable guidance on how to build data applications, architecture, and strategy.

Three Fundamentals for a Successful Data Architecture

Several core principles will help you design a data architecture capable of powering AI applications that deliver ROI. Think of the following as compass points to check yourself against whenever you’re building, formatting, and organizing data:

  • Building Toward an Objective:

    The cardinal rule is to always keep your eye on the business outcome you’re working toward as you build and develop your data architecture. In particular, I recommend looking at your company’s near-term goals and aligning your data strategy accordingly.

    For example, if your business strategy is to achieve $30M in revenues by year-end, figure out how you can use data to drive this. It doesn’t have to be daunting: break the larger goal down into smaller objectives, and work toward those.

  • Designing for Rapid Value Creation:

    While setting a clear objective is key, the end solution must always be agile enough to adapt to changing business needs. For example, small-scale projects might grow to become multi-channel, and you need to build with that in mind. Fixed modeling and fixed rules will only create more work down the line.

    Any architecture you design should be capable of accommodating more data as it becomes available and leveraging that data toward your company’s latest goals. I also recommend automating as much as you can. This will help you make a valuable business impact with your data strategy quickly and repeatedly over time.

    For example, if you know you need to deliver monthly reporting, automate that process from the get-go (a minimal sketch follows this list). That way, you’ll only spend time on it during the first month. From there, the impact will be consistently efficient and positive.

  • Knowing How to Test for Success:

    To keep yourself on the right track, it’s essential to know if your data architecture is performing effectively. Data architecture works when it can (1) support AI and (2) deliver usable, relevant data to every employee in the business. Keeping close to these guardrails will help ensure your data strategy is fit for purpose and fit for the future.
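As referenced above, here is a minimal sketch of automating a recurring report, assuming source data lands in a CSV with hypothetical columns (date, region, revenue). In practice it would read from a warehouse and run on a scheduler such as cron or Airflow rather than being invoked by hand.

```python
# Hypothetical monthly-report automation sketch; column names are illustrative.
from datetime import date
from pathlib import Path

import pandas as pd

def build_monthly_report(source_csv, output_dir="reports"):
    df = pd.read_csv(source_csv, parse_dates=["date"])
    current_month = pd.Timestamp(date.today()).to_period("M")
    this_month = df[df["date"].dt.to_period("M") == current_month]

    # Aggregate the metrics the business cares about, e.g. revenue by region.
    summary = this_month.groupby("region", as_index=False)["revenue"].sum()

    out = Path(output_dir)
    out.mkdir(exist_ok=True)
    path = out / f"revenue_{date.today():%Y_%m}.csv"
    summary.to_csv(path, index=False)
    return path

# Once scheduled, the same code produces the report every month with no manual effort.
```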

The Future of Data Architecture: Innovations to Know About

While these key principles are a great starting place for technical leaders and teams, it’s also important not to get stuck in one way of doing things. Otherwise, businesses risk missing opportunities that could deliver even greater value in the long term. Instead, tech leaders must constantly be plugged into the new technologies coming to market that can enhance their work and deliver better outcomes for their business:

  • Cheaper Processing:

    We’re already seeing innovations making processing more cost-efficient. This is critical because many of the advanced technologies being developed require such high levels of computing power that they only exist in theory. Neural networks are a prime example. But as the required level of computing power becomes more attainable, we’ll have access to more sophisticated ways of solving problems.

    For example, today a data scientist must train every machine learning model. But in the future, there’s potential to build models that can train other models. Of course, this is still just a theory, but we’ll definitely see innovation like this accelerate as processing power becomes more accessible.

  • Bundled Tools:

    Additionally, when it comes to apps or software that can decrease time to value for AI, we’re in a phase now where most technology available can only do one thing well. The tools needed to productionize AI — like storage, machine learning providers, API deployment, and quality control — are unbundled.

    Currently, businesses risk wasting precious time simply figuring out which tools they need and how to integrate them. But technology is gradually emerging that can help solve for multiple data architecture use cases, as well as databases that are specialized for powering AI applications.

    These more bundled offerings will help businesses put AI into production faster. It’s similar to what we’ve seen in the fintech space. Companies initially focused on being the best in one core competency before eventually merging to create bundled solutions.

  • Data Marts vs. Data Warehouses:

    Looking further into the future, it seems safe to predict that data lakes will become the most important AI and data stack investment for all organizations. Data lakes will help organizations understand predictions and how best to execute those insights. I also see data marts becoming increasingly valuable in the future.

    Marts deliver the same data to every team in a business in a format they can understand. For example, Marketing and Finance teams see the same data represented in metrics that are familiar and, most importantly, in a format they can use. The new generation of data marts will have more than dimensions, facts, and hierarchy. They won’t just slice and dice information but will support decision-making within specific departments.
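As a rough sketch of that “same data, department-specific views” idea, the example below derives a Marketing view and a Finance view from one hypothetical fact table. Real data marts are built in a warehouse with SQL or similar tooling; pandas is used here only to illustrate the concept.

```python
# Hypothetical fact table; column names and figures are illustrative only.
import pandas as pd

orders = pd.DataFrame({
    "month":   ["2022-01", "2022-01", "2022-02", "2022-02"],
    "channel": ["web", "retail", "web", "retail"],
    "revenue": [120_000, 80_000, 150_000, 70_000],
    "cost":    [90_000, 60_000, 100_000, 55_000],
})

# Marketing view: revenue by acquisition channel, the metric that team works with.
marketing_view = orders.pivot_table(index="month", columns="channel",
                                    values="revenue", aggfunc="sum")

# Finance view: monthly margin, derived from the same underlying facts.
finance_view = (
    orders.groupby("month", as_index=False)[["revenue", "cost"]].sum()
          .assign(margin=lambda d: d["revenue"] - d["cost"])
)

print(marketing_view)
print(finance_view)
```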

Conclusion

As the technology continues to develop, it’s critical that businesses stay up to speed, or they’ll get left behind. That means tech leaders must stay connected to their teams and allow them to bring new innovations to the table.

Even as a company’s data architecture and AI applications grow more robust, it’s essential to make time to experiment, learn and (ultimately) innovate.

Image Credit: by Polina Zimmerman; Pexels; Thank you!

Atul Sharma

Atul founded Decision Intelligence company Peak in 2015 with Richard Potter and David Leitch. He has played a pivotal role in shaping Peak’s Decision Intelligence platform, which emerged as an early leader in a category that is expected to be the biggest technology movement for a generation. Peak’s platform is used by leading brands including Nike, Pepsico, KFC and Sika.
On a mission to change the way the world works, the tech scaleup has grown quickly over the last seven years and now numbers over 250 people globally. Regularly named a top place to work in the UK, this year Peak received the Best Companies 3-star accreditation, which recognizes extraordinary levels of employee engagement.
Prior to Peak, Atul spent over 20 years working in data architecture and data engineering. He has worked on designing and implementing data integration and data warehouse engagements for global companies such as Morrisons Plc, The Economist, HBOS, Admin Re (Part of Swiss Re) and Shell.
