Why You’re Probably Thinking About Real-Time Systems in the Wrong Way

Organizations across numerous industries are increasingly interested in, and are attempting to build, real-time systems that far exceed the limited capabilities of the software systems of the recent past. The issues these systems need to address impact internal operations and customer experiences, and they also extend beyond the walls of the individual organization to change the expected capabilities of the industry, and even the health of the planet.

The next generation of real-time systems comes into play in extremely diverse uses:

  • Safety and security: Delivering new levels of health and public safety in smart buildings that automatically detect and stop the spread of disease.
  • Retail: Enabling new personalized proximity marketing experiences in physical retail environments.
  • Emergencies: Detecting floods and other emergency situations, then automatically triggering evacuation protocols.

In all these scenarios, there can be no compromise in terms of responsiveness, reliability, and scalability. This demands that those in charge of development embrace a more modern way of thinking about how these high-performance real-time systems should be architected.

When the Database-First Way of Creating Real-Time Systems Fails: How Many Elevators Can You Actually Monitor Before It Breaks?

A modern super-city might contain hundreds of thousands of elevators across its buildings – all of which require constant monitoring to detect situations of interest, such as security and safety concerns. The best way to address this kind of ‘smart building’ challenge is through real-time stream processing that can handle data analytics at scale and deliver consistent and timely situational awareness.

Development would likely start with information from a single elevator, with analysis done in a simple time-series database using small batch queries. But it would be incorrect to assume that what works for one will work for hundreds, and then thousands.

The flaw in this assumption is the belief that database queries will handle the explosion of data without a massive loss in performance. The approach works as expected with a small number of elevators, but the whole system fails when the amount of data (elevators) grows beyond the capabilities of the database.

No matter how many other real-time capabilities are placed around the periphery of the traditional database in this system, the use of a database itself is what inherently breaks the system at scale.

The solution to creating a robust, scalable system is to perform the anomaly-detection analytics in memory, and then move information to the database for historical purposes. The database is the last step, not the first, in a modern real-time system.
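To make the “database last” idea concrete, here is a minimal Python sketch. The ElevatorReading record, the raise_alert and archive_async placeholders, and the threshold are all invented for illustration, not any particular product’s API; the point is only that the decision is made against a short in-memory window, and the database write happens after the decision, not before it.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class ElevatorReading:
    """Hypothetical telemetry record; field names are invented for this sketch."""
    elevator_id: str
    vibration: float

def raise_alert(reading: ElevatorReading) -> None:
    # Placeholder for whatever immediate action the system takes.
    print(f"ALERT: {reading.elevator_id} vibration={reading.vibration:.2f}")

def archive_async(reading: ElevatorReading) -> None:
    # Placeholder: in a real system this would enqueue the reading for a
    # later bulk insert into the historical database.
    pass

class InMemoryAnomalyDetector:
    """The hot path touches only a short per-elevator window held in memory;
    the database never sits between an event and the decision."""

    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.windows: dict[str, deque] = {}
        self.window = window
        self.threshold = threshold

    def process(self, r: ElevatorReading) -> None:
        w = self.windows.setdefault(r.elevator_id, deque(maxlen=self.window))
        w.append(r.vibration)
        mean = sum(w) / len(w)
        # Act immediately on readings far from the recent in-memory average...
        if len(w) > 10 and abs(r.vibration - mean) > self.threshold:
            raise_alert(r)
        # ...and only then hand the reading off for historical storage.
        archive_async(r)
```

Because each event touches only a bounded window, per-event work stays constant no matter how many elevators are being monitored.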

The Three Kinds of Real-Time Systems

While there is growing interest in real-time systems, there is accompanying noise, confusion, and misinformation about the different kinds and capabilities of real-time systems, as well as the relevance (or not) of databases to their ability to scale and perform as required. There are three types of real-time systems, each of which is relevant for solving a different class of problems.

  1. ‘Hard’ Real-Time Systems – hardware-based,
  2. Micro-Batch Real-Time Systems – ‘soft’ real-time systems that use more traditional data processes and queries,
  3. Event-Driven Real-Time Systems – ‘soft’ real-time systems that use stream or event processing.


1. ‘Hard’ Real-Time Systems

These types of systems are needed to solve problems that cannot tolerate any missed deadlines, with performance measured in a few milliseconds. No database could ever deliver this kind of performance, and in addition, all hardware and computing need to be on-premises. High-precision automated robotic assembly lines require the rigor of this type of real-time system.

2. Micro-Batch Real-Time Systems

This approach to real-time systems is most appropriate for problems that only require some real-time processing, with latencies in the hundreds of milliseconds (or even seconds), and that require little scaling. eCommerce ordering systems can be a good match for this.

Traditional data-processing queries are performed against small amounts of data (micro-batches) at a fast ‘duty cycle.’ Ground zero for fatal problems is the attempt to scale the system and shrink the latency between batches so that it functions like an event-driven real-time system.

As the number of batches increases linearly, the compute overhead and cost of continually re-running queries over the growing volume of data increase super-linearly (up to the square of the database size). At some point, the laws of physics kick in, and it becomes impossible to make the data-analysis layer of the system perform in the defined ‘real time’ at high volume. Ultimately, a database will never be as fast as event processing.
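A small, self-contained illustration of this choke point, with SQLite standing in for any query-based time-series store (the schema, batch sizes, and query are invented for this sketch): each micro-batch cycle inserts the same number of rows, but the aggregate query re-scans everything stored so far, so per-cycle work grows with the accumulated data rather than with the batch.

```python
import sqlite3
import time

# SQLite stands in for any query-based store; schema and numbers are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (elevator_id TEXT, vibration REAL, ts REAL)")

def run_micro_batch(new_rows):
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", new_rows)
    # The aggregate query scans history that keeps growing -- this
    # re-querying of accumulated data is the choke point at scale.
    return conn.execute(
        "SELECT elevator_id, AVG(vibration) FROM readings GROUP BY elevator_id"
    ).fetchall()

for cycle in range(5):
    batch = [(f"elev-{i}", 0.5, time.time()) for i in range(1000)]
    start = time.perf_counter()
    run_micro_batch(batch)
    # Per-cycle latency creeps up even though every batch is the same size.
    print(f"cycle {cycle}: {time.perf_counter() - start:.6f}s")
```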

3. Event-Driven Real-Time Systems

This is the ‘Goldilocks’ solution for applications that require action within a very short time period, in the 1-10 millisecond range. A recommendation system, whether in eCommerce or in industrial automation, is an appropriate use of this kind of real-time system.

In-memory processing, not a database, is the driving force in this system. Information (from IoT sensors, embedded AI, event brokers, etc.) is processed in flight using stream analytics, and it can then be sent to a database for historical purposes.

As the amount of data increases, the compute work scales linearly, not super-linearly as in the case of the micro-batch approach.
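As a contrast with the micro-batch loop above, here is a rough sketch of the event-driven shape, using a plain Python queue as a stand-in for an event broker and an invented vibration threshold: each event is handled in flight with a constant amount of work, so total compute grows linearly with event volume.

```python
import queue
import threading

events: queue.Queue = queue.Queue()  # stand-in for an event broker topic

def handle(event: dict) -> None:
    # Constant work per event: no query over accumulated history, so total
    # compute grows linearly with event volume. Threshold is illustrative.
    if event["vibration"] > 3.0:
        print(f"alert: {event['elevator_id']}")

def consumer() -> None:
    while True:
        event = events.get()
        if event is None:  # shutdown sentinel
            break
        handle(event)

worker = threading.Thread(target=consumer)
worker.start()
for i in range(1_000):
    events.put({"elevator_id": f"elev-{i % 10}", "vibration": 0.5 + (i % 7)})
events.put(None)
worker.join()
```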

Finding and Avoiding the Performance and Scale Choke Points in Real-Time Systems

The analysis of the three types of real-time systems shows us that systems using a traditional database storage model can never be scalable in real time, even if the ingestion is real-time.

It takes time to perform queries, and query performance degrades as a database grows – which is exactly what happens when you try to scale a system. In our earlier elevator example, ingestion was real-time, but accessing and performing queries on the information stored in the database was not real-time.

The performance of that system was ultimately gated by the worst-performing part of the entire system – the database.

In designing the next generation of real-time systems, it is essential to consider the time frame in which different information must be accessed and understood, and the scale to which you ultimately want to grow your system.

It’s Not an Either-Or Decision — Next-Generation Real-Time Systems Will Need to Be Hybrid

There is no one-size-fits-all approach to real-time systems. But it is always important to start by understanding which information needs to be stored over longer periods of time in a database for historical reporting, deeper analytics, and pattern recognition.

Next, separate out the information that requires immediate action (on the order of milliseconds) and route it through real-time event processing. The best systems will be those that combine the different models of data processing to take advantage of the benefits each offers.
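One way to picture that hybrid split, as a sketch under assumed names (the HybridPipeline class, the threshold, and the SQLite table are all invented here): the hot path makes the millisecond-scale decision in memory, while readings destined for historical reporting are buffered and flushed to the database off the critical path.

```python
import sqlite3

class HybridPipeline:
    """Hot path: millisecond-scale decision made in memory.
    Cold path: readings buffered, then flushed to a database for
    historical reporting, deeper analytics, and pattern recognition."""

    def __init__(self, db_path: str = ":memory:", flush_every: int = 100):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS history (source TEXT, value REAL)"
        )
        self.buffer: list = []
        self.flush_every = flush_every

    def on_event(self, source: str, value: float) -> None:
        if value > 3.0:  # illustrative threshold for immediate action
            print(f"immediate action for {source}")
        self.buffer.append((source, value))  # history is deferred work
        if len(self.buffer) >= self.flush_every:
            self.flush()

    def flush(self) -> None:
        # The database enters the picture last, in bulk, off the hot path.
        self.conn.executemany("INSERT INTO history VALUES (?, ?)", self.buffer)
        self.conn.commit()
        self.buffer.clear()
```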

Featured Image Credit: Natã Romualdo; Pexels; Thank you!

Mark Munro

Mark has over 35 years’ experience in IT as a Software Engineer, Consultant, Technical Architect, and Solutions Architect for various development-tools and platform companies, spanning two- and three-tier client/server, SOA, and event-driven platforms. Mark has worked for technology companies including Digital Equipment Corp, Forté Software, AmberPoint, and now Vantiq. He has helped customers across many verticals develop, design, and architect highly complex and scalable applications and systems.
Mark is currently working as the Platform and Accelerator Product Manager, working with customers, consultants, partners, and engineering to understand and help define the product direction.

Fintech Kennek raises $12.5M seed round to digitize lending


London-based fintech startup Kennek has raised $12.5 million in seed funding to expand its lending operating system.

According to an Oct. 10 tech.eu report, the round was led by HV Capital and included participation from Dutch Founders Fund, AlbionVC, FFVC, Plug & Play Ventures, and Syndicate One. Kennek offers software-as-a-service tools to help non-bank lenders streamline their operations using open banking, open finance, and payments.

The platform aims to automate time-consuming manual tasks and consolidate fragmented data to simplify lending. Xavier De Pauw, founder of Kennek, said:

“Until kennek, lenders had to devote countless hours to menial operational tasks and deal with jumbled and hard-coded data – which makes every other part of lending a headache. As former lenders ourselves, we lived and breathed these frustrations, and built kennek to make them a thing of the past.”

The company said the latest funding round was oversubscribed and closed quickly despite the challenging fundraising environment. The new capital will be used to expand Kennek’s engineering team and strengthen its market position in the UK while exploring expansion into other European markets. Barbod Namini, Partner at lead investor HV Capital, commented on the investment:

“Kennek has developed an ambitious and genuinely unique proposition which we think can be the foundation of the entire alternative lending space. […] It is a complicated market and a solution that brings together all information and stakeholders onto a single platform is highly compelling for both lenders & the ecosystem as a whole.”

The fintech lending space has grown rapidly in recent years, but many lenders still rely on legacy systems and manual processes that limit efficiency and scalability. Kennek aims to leverage open banking and data integration to provide lenders with a more streamlined, automated lending experience.

The seed funding will allow the London-based startup to continue developing its platform and expanding its team to meet demand from non-bank lenders looking to digitize operations. Kennek’s focus on the UK and Europe also comes amid rising adoption of open banking and open finance in the regions.

Featured Image Credit: Photo from Kennek.io; Thank you!

Radek Zielinski

Radek Zielinski is an experienced technology and financial journalist with a passion for cybersecurity and futurology.


Fortune 500’s race for generative AI breakthroughs


As excitement around generative AI grows, Fortune 500 companies, including Goldman Sachs, are carefully examining the possible applications of this technology. A recent survey of U.S. executives indicated that 60% believe generative AI will substantially impact their businesses in the long term. However, they anticipate a one to two-year timeframe before implementing their initial solutions. This optimism stems from the potential of generative AI to revolutionize various aspects of businesses, from enhancing customer experiences to optimizing internal processes. In the short term, companies will likely focus on pilot projects and experimentation, gradually integrating generative AI into their operations as they witness its positive influence on efficiency and profitability.

Goldman Sachs’ Cautious Approach to Implementing Generative AI

In a recent interview, Goldman Sachs CIO Marco Argenti revealed that the firm has not yet implemented any generative AI use cases. Instead, the company focuses on experimentation and setting high standards before adopting the technology. Argenti recognized the desire for outcomes in areas like developer and operational efficiency but emphasized ensuring precision before putting experimental AI use cases into production.

According to Argenti, striking the right balance between driving innovation and maintaining accuracy is crucial for successfully integrating generative AI within the firm. Goldman Sachs intends to continue exploring this emerging technology’s potential benefits and applications while diligently assessing risks to ensure it meets the company’s stringent quality standards.

One possible application for Goldman Sachs is in software development, where the company has observed a 20-40% productivity increase during its trials. The goal is for 1,000 developers to utilize generative AI tools by year’s end. However, Argenti emphasized that a well-defined expectation of return on investment is necessary before fully integrating generative AI into production.

To achieve this, the company plans to implement a systematic and strategic approach to adopting generative AI, ensuring that it complements and enhances the skills of its developers. Additionally, Goldman Sachs intends to evaluate the long-term impact of generative AI on their software development processes and the overall quality of the applications being developed.

Goldman Sachs’ approach to AI implementation goes beyond merely executing models. The firm has created a platform encompassing technical, legal, and compliance assessments to filter out improper content and keep track of all interactions. This comprehensive system ensures seamless integration of artificial intelligence in operations while adhering to regulatory standards and maintaining client confidentiality. Moreover, the platform continuously improves and adapts its algorithms, allowing Goldman Sachs to stay at the forefront of technology and offer its clients the most efficient and secure services.

Featured Image Credit: Photo by Google DeepMind; Pexels; Thank you!

Deanna Ritchie

Managing Editor at ReadWrite

Deanna is the Managing Editor at ReadWrite. Previously she worked as the Editor in Chief for Startup Grind and has over 20 years of experience in content management and content development.


UK seizes web3 opportunity simplifying crypto regulations


As Web3 companies increasingly consider leaving the United States due to regulatory ambiguity, the United Kingdom must simplify its cryptocurrency regulations to attract these businesses. The conservative think tank Policy Exchange recently released a report detailing ten suggestions for improving Web3 regulation in the country. Among the recommendations are reducing liability for token holders in decentralized autonomous organizations (DAOs) and encouraging the Financial Conduct Authority (FCA) to adopt alternative Know Your Customer (KYC) methodologies, such as digital identities and blockchain analytics tools. These suggestions aim to position the UK as a hub for Web3 innovation and attract blockchain-based businesses looking for a more conducive regulatory environment.

Streamlining Cryptocurrency Regulations for Innovation

To make it easier for emerging Web3 companies to navigate existing legal frameworks and contribute to the UK’s digital economy growth, the government must streamline cryptocurrency regulations and adopt forward-looking approaches. By making the regulatory landscape clear and straightforward, the UK can create an environment that fosters innovation, growth, and competitiveness in the global fintech industry.

The Policy Exchange report also recommends not weakening self-hosted wallets or treating proof-of-stake (PoS) services as financial services. This approach aims to protect the fundamental principles of decentralization and user autonomy while strongly emphasizing security and regulatory compliance. By doing so, the UK can nurture an environment that encourages innovation and the continued growth of blockchain technology.

Despite recent strict measures by UK authorities, such as His Majesty’s Treasury and the FCA, toward the digital assets sector, the proposed changes in the Policy Exchange report strive to make the UK a more attractive location for Web3 enterprises. By adopting these suggestions, the UK can demonstrate its commitment to fostering innovation in the rapidly evolving blockchain and cryptocurrency industries while ensuring a robust and transparent regulatory environment.

The ongoing uncertainty surrounding cryptocurrency regulations in various countries has prompted Web3 companies to explore alternative jurisdictions with more precise legal frameworks. As the United States grapples with regulatory ambiguity, the United Kingdom can position itself as a hub for Web3 innovation by simplifying and streamlining its cryptocurrency regulations.

Featured Image Credit: Photo by Jonathan Borba; Pexels; Thank you!

Deanna Ritchie

Managing Editor at ReadWrite

Deanna is the Managing Editor at ReadWrite. Previously she worked as the Editor in Chief for Startup Grind and has over 20 years of experience in content management and content development.
