
Why Context is Crucial to Successful Edge Computing


John Keever


The very nature of technology innovation lends itself to the kinds of buzzwords and jargon that can impede people’s understanding of the technologies themselves. These buzzwords range from metaphorical but ultimately easy-to-understand terms like “cloud” to downright literal ones like “Internet of Things.” Somewhere in between is where we get terms like “edge computing,” where the technology itself and the term used to describe it have one essential thing in common – they require context.


In IT, we call it a “use case,” but that term is essentially a tangible manifestation of the context in which a technology will be most effective, whether that’s a manufacturing scenario, a telematics platform, or an IoT integration. Even within IoT, context is crucial, because the same principles apply to something as simple as a smart thermostat, something as advanced as an MRI machine, or any number of use cases in between.

The real challenge when it comes to edge computing isn’t so much to create a device, but rather to make sure that device can operate and transmit data reliably.

People focus on the platform side of the business all too often because that’s where they’re going to see ROI on the data and the analytics. But if they don’t have the right things going on at the network edge, all of that wonderful back-end processing isn’t going to amount to much.

Edge computing tends to be overlooked

Edge computing tends to be overlooked because most people simply take it for granted. This happens especially often during the manufacturing process, because there’s a mindset that when you buy a device like a laptop or a smartphone, that device is going to communicate with other devices through an interface that’s driven by the user.

We are thinking — “use the smartphone to send data to the laptop, and then use the laptop to send the same data to the printer.”

In the context of IoT devices, that’s not really how things work.

Without proper edge management, maintenance costs can quickly skyrocket for a device that’s meant to be self-sustaining. And we’re not just talking about rolling trucks to troubleshoot a router. In some cases, these devices are literally designed to be buried in the ground alongside crops to measure soil moisture.

IoT devices are small-footprint devices meant to exist and operate on their own

In the IoT realm, we’re building these new, small-footprint devices that are meant to exist and operate on their own. The initial interactions we’re having with most of our customers and business partners center on the question of, “How do we connect to this thing? How do we deal with this protocol? How do we support this sensor?”

Some of the biggest challenges arise when we get down to the electronics level and start figuring out how to interface from the electronics up into the first level of the software tier.

Communication

In the world of IoT, devices are built with some form of communication standard in mind. However, the actual data they transfer – and how they transfer it – is another piece of the puzzle altogether. In addition, the devices have to be maintained for their entire lifespan.

Maybe the temperature went up, or the temperature went down, or the device is just periodically meant to pulse some information back into the network to do something.
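To make that concrete, here is a minimal sketch of the kind of decision logic such a device might run: report when a reading drifts past a threshold, or pulse a heartbeat when the link has been quiet too long. The names and thresholds are hypothetical, not taken from any specific product.

```python
HEARTBEAT_SECONDS = 60.0   # pulse at least this often, even if nothing changed
THRESHOLD_DELTA = 0.5      # report when a reading drifts this far from the last report

def should_report(reading, last_reported, last_report_time, now):
    """Decide whether an edge device should transmit a new reading.

    Reports when the value has drifted past a threshold, or when the
    heartbeat interval has elapsed with no traffic.
    """
    if last_reported is None:          # nothing sent yet: always report
        return True
    if abs(reading - last_reported) >= THRESHOLD_DELTA:
        return True                    # meaningful change: report now
    return (now - last_report_time) >= HEARTBEAT_SECONDS

print(should_report(21.1, 21.0, last_report_time=0.0, now=30.0))  # → False (small drift, recent pulse)
print(should_report(22.0, 21.0, last_report_time=0.0, now=30.0))  # → True  (large drift)
print(should_report(21.1, 21.0, last_report_time=0.0, now=61.0))  # → True  (heartbeat due)
```

Even logic this trivial has to be designed up front, because once the device is deployed there may be no user interface through which to adjust it.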

Most of the time, the people designing these things are facing these issues for the first time. People forget it’s not plug-and-play, like a laptop or printer.

Modern cellular devices consume data

Even something as simple as the data itself – and understanding how modern cellular devices consume data compared to their Wi-Fi and 3G counterparts – can derail an entire IoT project before it even gets off the ground. It’s a much more challenging world to deal with.
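As a back-of-the-envelope illustration (all figures here are hypothetical, not from any real deployment), even a modest reporting schedule adds up quickly on a metered cellular plan:

```python
# Rough monthly data budget for a cellular IoT device (illustrative figures).
payload_bytes = 200          # one sensor report, including protocol overhead
reports_per_hour = 60        # one report per minute
hours_per_month = 24 * 30

monthly_bytes = payload_bytes * reports_per_hour * hours_per_month
monthly_mb = monthly_bytes / 1_000_000
print(f"{monthly_mb:.1f} MB/month per device")  # → 8.6 MB/month per device
```

Multiply that by a fleet of thousands of devices and a per-megabyte cellular rate, and a payload format that was harmless in a Wi-Fi lab can sink the business case in the field.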

Is the device properly scaled and calibrated?

Another key area of that world involves being able to make sure that devices are properly scaled and calibrated, and that the data they transmit is handled in a meaningful way. For example, if something goes wrong with the connection, that data needs to be properly queued so that, when the connection is reestablished, it can still end up where it was meant to go.
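A minimal store-and-forward sketch of that queueing idea might look like the following. The class and parameter names are hypothetical, and a real device would persist the queue to flash so readings survive a reboot.

```python
from collections import deque

class StoreAndForward:
    """Buffer readings while the uplink is down; flush them in order on reconnect.

    `send` is any callable that transmits one reading and returns True on
    success — a stand-in for whatever transport the device actually uses.
    """
    def __init__(self, send, max_queued=1000):
        self.send = send
        self.queue = deque(maxlen=max_queued)  # oldest readings drop first if full

    def submit(self, reading):
        self.queue.append(reading)
        self.flush()

    def flush(self):
        while self.queue:
            if not self.send(self.queue[0]):
                return               # still offline; keep everything queued
            self.queue.popleft()     # confirmed sent; safe to discard

# Simulate an outage: the first two send attempts fail, then the link recovers.
sent, outcomes = [], iter([False, False, True, True, True])
sf = StoreAndForward(lambda r: next(outcomes) and (sent.append(r) or True))
for r in [1, 2, 3]:
    sf.submit(r)
print(sent)  # → [1, 2, 3]: readings arrive in order once the connection is back
```

The bounded queue is a deliberate trade-off: when the buffer fills during a long outage, the oldest readings are dropped rather than crashing a memory-constrained device.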

Many otherwise very successful companies have learned these types of lessons the hard way by not taking into account how their devices would behave in the real world. For instance, they might be testing those devices in a lab when they’re ultimately designed to use cellular data. The cost of that critical communication function ends up being so high that the device isn’t a viable product from a business standpoint.

What is the first job or function of the device — will it work as intended?

Of course, it can be even more disastrous when developers focus too much on how the device will work before they’ve put enough time into figuring out whether the physical device itself is going to work in the first place.

Whether it’s a simple telematics device for a vehicle, an advanced module for use in manufacturing, or any number of devices in between, the all-important work of making sure that a given device and its components will work as intended is often relegated to the people with the least experience.

Appreciate the complexity

In many cases, people get thrown into it, and they don’t appreciate the complexity they’re dealing with until they’ve already suffered any number of setbacks. It could be an environmental issue, a problem with battery life, or even something as simple as where an antenna needs to be placed. Then, once it’s been placed in the field, how will it be updated?

Is the item or device really ready to be shipped? Test, test, test.

When these types of devices fail after already being placed in the field, the cost of replacing and reshipping them alone can completely torpedo the entire product line. That’s why it’s so important to test them in the field in smaller groups and avoid being led down the garden path of scaling up too quickly.

Grand plans are great, but starting small and iterating over time is the ultimate case where an ounce of prevention is truly worth more than a pound of cure.

Delivering to the customer — the “last mile.” But think “first mile first.”

People often talk about edge computing as a “last mile” technology, and like the last mile of a marathon, it is the most challenging of all.

Historically, large telecom and IT companies describe the connection to a device or the edge as the “Last Mile,” as in delivering data services from the curb to the house.

But that is an incorrect viewpoint in IoT. Everything starts with the device — the originator of the data. Therefore, connecting to the device and delivering data to the application infrastructure is crossing the “First Mile.”

Either way, once we have the proper understanding and context of how edge computing functions in the real world, the finish line is already in sight.

Image Credit: valdemaras d.; pexels; thank you!

John Keever

Chief Technology Officer, Telit IoT Platforms Business Unit

John Keever currently serves as the CTO of the Telit IoT Platforms Business Unit. He came to Telit from ILS Technology, a company that Telit acquired in 2013. Mr. Keever founded ILS Technology and began serving as an executive vice president and chief technology officer in October 2000. He has more than 30 years of experience in automation software engineering and design. Mr. Keever holds patents in both hardware and software.
Mr. Keever came to ILS Technology from IBM Corporation, where he was a global services principal responsible for e-production solution architectures and deployments. Mr. Keever enjoyed over 18 years of plant floor automation experience with IBM and is the former worldwide development and support manager for the Automation Connection, Distributed Applications Environment, PlantWorks, and Data Collection hardware and software products. His prior experience within IBM includes lead marketing and solutions architecture responsibilities for General Motors, BMW, Chrysler, Tokyo Electron, Glaxo-Wellcome, and numerous other global manufacturing companies.
He holds bachelor’s and master’s degrees in mechanical engineering from North Carolina State University, with minors in both electrical engineering and mathematics. He has also completed post-graduate work in computer engineering and operating systems design at Duke University.
I’ve always been passionate about mechanical, electrical, and computer engineering, having pursued them in my bachelor’s and master’s degrees. Founding my own company, ILS Technology, and working for a global IoT enabler like Telit have given me valuable insight into both the business and technical sides of IoT. With more than 30 years of experience in automation software engineering and design, including 18 years of plant floor automation with IBM, I hope to share insights with the ReadWrite community that improve readers’ technical knowledge and offer new ideas on legacy practices.


Fintech Kennek raises $12.5M seed round to digitize lending




London-based fintech startup Kennek has raised $12.5 million in seed funding to expand its lending operating system.

According to an Oct. 10 tech.eu report, the round was led by HV Capital and included participation from Dutch Founders Fund, AlbionVC, FFVC, Plug & Play Ventures, and Syndicate One. Kennek offers software-as-a-service tools to help non-bank lenders streamline their operations using open banking, open finance, and payments.

The platform aims to automate time-consuming manual tasks and consolidate fragmented data to simplify lending. Xavier De Pauw, founder of Kennek, said:

“Until kennek, lenders had to devote countless hours to menial operational tasks and deal with jumbled and hard-coded data – which makes every other part of lending a headache. As former lenders ourselves, we lived and breathed these frustrations, and built kennek to make them a thing of the past.”

The company said the latest funding round was oversubscribed and closed quickly despite the challenging fundraising environment. The new capital will be used to expand Kennek’s engineering team and strengthen its market position in the UK while exploring expansion into other European markets. Barbod Namini, Partner at lead investor HV Capital, commented on the investment:

“Kennek has developed an ambitious and genuinely unique proposition which we think can be the foundation of the entire alternative lending space. […] It is a complicated market and a solution that brings together all information and stakeholders onto a single platform is highly compelling for both lenders & the ecosystem as a whole.”

The fintech lending space has grown rapidly in recent years, but many lenders still rely on legacy systems and manual processes that limit efficiency and scalability. Kennek aims to leverage open banking and data integration to provide lenders with a more streamlined, automated lending experience.

The seed funding will allow the London-based startup to continue developing its platform and expanding its team to meet demand from non-bank lenders looking to digitize operations. Kennek’s focus on the UK and Europe also comes amid rising adoption of open banking and open finance in the regions.

Featured Image Credit: Photo from Kennek.io; Thank you!

Radek Zielinski

Radek Zielinski is an experienced technology and financial journalist with a passion for cybersecurity and futurology.


Fortune 500’s race for generative AI breakthroughs


Deanna Ritchie


As excitement around generative AI grows, Fortune 500 companies, including Goldman Sachs, are carefully examining the possible applications of this technology. A recent survey of U.S. executives indicated that 60% believe generative AI will substantially impact their businesses in the long term. However, they anticipate a one to two-year timeframe before implementing their initial solutions. This optimism stems from the potential of generative AI to revolutionize various aspects of businesses, from enhancing customer experiences to optimizing internal processes. In the short term, companies will likely focus on pilot projects and experimentation, gradually integrating generative AI into their operations as they witness its positive influence on efficiency and profitability.

Goldman Sachs’ Cautious Approach to Implementing Generative AI

In a recent interview, Goldman Sachs CIO Marco Argenti revealed that the firm has not yet implemented any generative AI use cases. Instead, the company focuses on experimentation and setting high standards before adopting the technology. Argenti recognized the desire for outcomes in areas like developer and operational efficiency but emphasized ensuring precision before putting experimental AI use cases into production.

According to Argenti, striking the right balance between driving innovation and maintaining accuracy is crucial for successfully integrating generative AI within the firm. Goldman Sachs intends to continue exploring this emerging technology’s potential benefits and applications while diligently assessing risks to ensure it meets the company’s stringent quality standards.

One possible application for Goldman Sachs is in software development, where the company has observed a 20-40% productivity increase during its trials. The goal is for 1,000 developers to utilize generative AI tools by year’s end. However, Argenti emphasized that a well-defined expectation of return on investment is necessary before fully integrating generative AI into production.

To achieve this, the company plans to implement a systematic and strategic approach to adopting generative AI, ensuring that it complements and enhances the skills of its developers. Additionally, Goldman Sachs intends to evaluate the long-term impact of generative AI on their software development processes and the overall quality of the applications being developed.

Goldman Sachs’ approach to AI implementation goes beyond merely executing models. The firm has created a platform encompassing technical, legal, and compliance assessments to filter out improper content and keep track of all interactions. This comprehensive system ensures seamless integration of artificial intelligence in operations while adhering to regulatory standards and maintaining client confidentiality. Moreover, the platform continuously improves and adapts its algorithms, allowing Goldman Sachs to stay at the forefront of technology and offer its clients the most efficient and secure services.

Featured Image Credit: Photo by Google DeepMind; Pexels; Thank you!

Deanna Ritchie

Managing Editor at ReadWrite

Deanna is the Managing Editor at ReadWrite. Previously she worked as the Editor in Chief for Startup Grind and has more than 20 years of experience in content management and content development.


UK seizes web3 opportunity simplifying crypto regulations


Deanna Ritchie


As Web3 companies increasingly consider leaving the United States due to regulatory ambiguity, the United Kingdom must simplify its cryptocurrency regulations to attract these businesses. The conservative think tank Policy Exchange recently released a report detailing ten suggestions for improving Web3 regulation in the country. Among the recommendations are reducing liability for token holders in decentralized autonomous organizations (DAOs) and encouraging the Financial Conduct Authority (FCA) to adopt alternative Know Your Customer (KYC) methodologies, such as digital identities and blockchain analytics tools. These suggestions aim to position the UK as a hub for Web3 innovation and attract blockchain-based businesses looking for a more conducive regulatory environment.

Streamlining Cryptocurrency Regulations for Innovation

To make it easier for emerging Web3 companies to navigate existing legal frameworks and contribute to the UK’s digital economy growth, the government must streamline cryptocurrency regulations and adopt forward-looking approaches. By making the regulatory landscape clear and straightforward, the UK can create an environment that fosters innovation, growth, and competitiveness in the global fintech industry.

The Policy Exchange report also recommends not weakening self-hosted wallets or treating proof-of-stake (PoS) services as financial services. This approach aims to protect the fundamental principles of decentralization and user autonomy while strongly emphasizing security and regulatory compliance. By doing so, the UK can nurture an environment that encourages innovation and the continued growth of blockchain technology.

Despite recent strict measures by UK authorities, such as His Majesty’s Treasury and the FCA, toward the digital assets sector, the proposed changes in the Policy Exchange report strive to make the UK a more attractive location for Web3 enterprises. By adopting these suggestions, the UK can demonstrate its commitment to fostering innovation in the rapidly evolving blockchain and cryptocurrency industries while ensuring a robust and transparent regulatory environment.

The ongoing uncertainty surrounding cryptocurrency regulations in various countries has prompted Web3 companies to explore alternative jurisdictions with more precise legal frameworks. As the United States grapples with regulatory ambiguity, the United Kingdom can position itself as a hub for Web3 innovation by simplifying and streamlining its cryptocurrency regulations.

Featured Image Credit: Photo by Jonathan Borba; Pexels; Thank you!

Deanna Ritchie

Managing Editor at ReadWrite

Deanna is the Managing Editor at ReadWrite. Previously she worked as the Editor in Chief for Startup Grind and has more than 20 years of experience in content management and content development.


Copyright © 2021 Seminole Press.