Many analytics and machine learning use cases connect to data stored in data warehouses or data lakes, run algorithms on complete data sets or a subset of the data, and compute results on cloud architectures. This approach works well when the data doesn’t change frequently. But what if the data does change frequently?
Today, more businesses need to process data and compute analytics in real-time. IoT drives much of this paradigm shift as data streaming from sensors requires immediate processing and analytics to control downstream systems. Real-time analytics is also important in many industries including healthcare, financial services, manufacturing, and advertising, where small changes in the data can have significant financial, health, safety, and other business impacts.
If you’re interested in enabling real-time analytics—and in emerging technologies that leverage a mix of edge computing, AR/VR, IoT sensors at scale, and machine learning at scale—then understanding the design considerations for edge analytics is important. Edge computing use cases such as autonomous drones, smart cities, retail chain management, and augmented reality gaming networks all target deploying large scale, highly reliable edge analytics.
Several different analytics, machine learning, and edge computing paradigms are related to edge analytics.
When designing solutions requiring edge analytics, architects must consider physical and power constraints, network costs and reliability, security considerations, and processing requirements.
Why would you deploy infrastructure to the edge for analytics? Technical, cost, and compliance considerations all factor into these decisions.
Applications that impact human safety and require resiliency in the computing architecture are one use case for edge analytics. Applications that require low latency between data sources such as IoT sensors and analytics computing infrastructure are a second use case that often requires edge analytics. Examples of these use cases include:
Cost considerations are a significant factor in using edge analytics in manufacturing systems. Consider a set of cameras scanning the manufactured products for defects while on fast-moving conveyor belts. It can be more cost-effective to deploy edge computing devices in the factory to perform the image processing, rather than having high-speed networks installed to transmit video images to the cloud.
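A minimal sketch of this pattern — run inference on the edge device and ship only flagged frames upstream — might look like the following. Note that `detect_defect` is a hypothetical stand-in for a real vision model; a brightness threshold is used here purely for illustration:

```python
def detect_defect(frame, threshold=200.0):
    """Stand-in for a real vision model: flag a frame whose mean pixel
    brightness deviates strongly from the expected product appearance."""
    return sum(frame) / len(frame) > threshold

def process_conveyor(frames, upload):
    """Run inference on every frame locally; ship only flagged frames
    upstream instead of streaming raw video to the cloud."""
    uploaded = 0
    for frame in frames:
        if detect_defect(frame):
            upload(frame)  # small, rare payloads cross the network
            uploaded += 1
    return uploaded

# Simulated camera feed: 99 normal frames, 1 anomalously bright frame.
frames = [[120.0] * 64 for _ in range(99)] + [[250.0] * 64]
sent = process_conveyor(frames, upload=lambda f: None)
print(f"uploaded {sent} of {len(frames)} frames")
```

The network cost savings come from the ratio in the last line: only one frame in a hundred ever leaves the factory floor.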
I spoke with Achal Prabhakar, VP of engineering at Landing AI, an industrial AI company with solutions that focus on computer vision. “Manufacturing plants are quite different from mainstream analytics applications and therefore require rethinking AI including deployment,” Prabhakar told me. “A big focus area for us is deploying complex deep learning vision models with continuous learning directly on production lines using capable but commodity edge devices.”
Deploying analytics to remote areas such as construction and drilling sites also benefits from using edge analytics and computing. Instead of relying on expensive and potentially unreliable wide area networks, engineers deploy edge analytics infrastructure on-site to support the required data and analytics processing. For example, an oil and gas company deployed a streaming analytics solution with an in-memory distributed computing platform to the edge and reduced the drilling time by as much as 20 percent, from a typical 15 days to 12 days.
Compliance and data governance is another reason for edge analytics. Deploying localized infrastructure can help meet GDPR compliance and other data sovereignty regulations by storing and processing restricted data in the countries where the data is collected.
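One way to picture this is a routing rule at the edge that keeps restricted data in-region. The sketch below is illustrative only — the country codes, record shape, and store names are assumptions, not a real compliance implementation:

```python
EU_COUNTRIES = {"AT", "BE", "DE", "FR", "IE", "NL"}  # illustrative subset

def route_record(record):
    """Decide where a record may be stored. EU-origin data stays in an
    in-region store to respect data sovereignty rules; everything else
    may go to a global store. Names here are hypothetical."""
    if record["country"] in EU_COUNTRIES:
        return "eu-store"
    return "global-store"

print(route_record({"country": "DE"}))  # routed to the in-region store
print(route_record({"country": "US"}))  # free to leave the region
```

In practice the routing decision would be enforced by the localized infrastructure itself, but the principle is the same: the data's origin determines where it is stored and processed.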
Unfortunately, taking models and other analytics and deploying them to edge computing infrastructure isn’t always trivial. The computing requirements for processing large data sets through computationally intensive data models may require re-engineering before running and deploying them on edge computing infrastructure.
For one thing, many developers and data scientists now take advantage of the higher-level analytics platforms that are available on public and private clouds. IoT and sensors often utilize embedded applications written in C/C++, which may be unfamiliar and challenging terrain for cloud-native data scientists and engineers.
Another issue may be the models themselves. When data scientists work in the cloud and scale computing resources on-demand at relatively low costs, they are able to develop complex machine learning models, with many features and parameters, to fully optimize the results. But when deploying models to edge computing infrastructure, an overly complex algorithm could dramatically increase the cost of infrastructure, size of devices, and power requirements.
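One common response is to simplify or compress a model before deploying it to the edge. The sketch below illustrates the idea with naive symmetric int8 quantization of a handful of weights — a toy example of the footprint reduction involved, not a production quantization scheme:

```python
def quantize_int8(weights):
    """Naive symmetric linear quantization: map float weights onto the
    int8 range [-127, 127] using a single scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized values."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.03, 1.0, -0.8]  # toy "model" weights
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

float_bytes = len(weights) * 4  # float32 storage
int8_bytes = len(q) * 1         # int8 storage
print(f"{float_bytes} bytes of float32 -> {int8_bytes} bytes of int8")
```

The 4x storage reduction (and the corresponding drop in compute) is what makes otherwise oversized models feasible on small, low-power edge devices, at the cost of some precision.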
I discussed the challenges of deploying AI models to the edge with Marshall Choy, VP of product at SambaNova Systems. “Model developers for edge AI applications are increasingly focusing more on highly-detailed models to achieve improvements in parameter reduction and compute requirements,” he noted. “The training requirements for these smaller, highly-detailed models remain daunting.”
Another consideration is that deploying a highly reliable and secure edge analytics system requires designing and implementing highly fault-tolerant architectures, systems, networks, software, and models.
I spoke with Dale Kim, senior director of product marketing at Hazelcast, about use cases and constraints when processing data at the edge. He commented that, while equipment optimizations, preventive maintenance, quality assurance checks, and critical alerts are all available at the edge, there are new challenges like limited hardware space, limited physical accessibility, limited bandwidth, and greater security concerns.
“This means that the infrastructure you’re accustomed to in your data center won’t necessarily work,” Kim said. “So you need to explore new technologies that are designed with edge computing architectures in mind.”
The more mainstream use cases for edge analytics today are data processing functions, including data filtering and aggregations. But as more companies deploy IoT sensors at scale, the need to apply analytics, machine learning, and artificial intelligence algorithms in real-time will require more deployments on the edge.
The possibilities at the edge make for a very exciting future of smart computing as sensors become cheaper, applications require more real-time analytics, and developing optimized, cost-effective algorithms for the edge becomes easier.
Isaac Sacolick is the author of Driving Digital: The Leader’s Guide to Business Transformation through Technology, which covers many practices such as agile, devops, and data science that are critical to successful digital transformation programs. Sacolick is a recognized top social CIO, a long-time blogger at Social, Agile and Transformation and CIO.com, and president of StarCIO.
Copyright © 2020 IDG Communications, Inc.
Many businesses are now exploring how edge analytics differs from conventional data processing solutions and how it could benefit their operations.
Edge analytics is an approach to data analysis in which predefined analytical computations are executed on data where it is generated, instead of transferring it back to a consolidated data store. Data collection, processing, and analysis are carried out right at the edge of the network, in real time. This lets businesses set rules and thresholds that determine which information is worth sending to an on-premises or cloud data store for future use. Since edge analytics came into play, solution providers around the world have adopted the approach, along with cloud, to deal with the piles of IoT data.
Research teams across the world have studied edge analytics and produced valuable insights. When building a strong IoT solution, edge analytics strategies have proven beneficial in more than one way. The benefits edge analytics offers businesses include:
Speed: For most organizations, speed is critical to the core business. A financial firm that depends on high-speed trading systems, for example, can suffer serious consequences from a delay of mere milliseconds. In the healthcare sector, losing even a few seconds can be dire. And for companies that offer data-related services to consumers, slow performance frustrates customers and causes lasting damage to the brand. Speed is no longer just a competitive advantage; it is a baseline requirement every business should hold itself to.
At the same time, the most significant advantage of edge computing is its potential to improve network performance by reducing latency. Because IoT edge devices process data locally, the information they collect doesn’t need to travel as far as it would under a conventional cloud architecture.
Flexibility: As enterprises grow, it isn’t always possible for them to predict their IT infrastructure needs precisely, and building a dedicated data center is an expensive proposition. Advances in cloud-based technology and edge computing, however, have made it much easier for enterprises to scale their operations. Increasingly, computing, storage, and analytics capabilities are being bundled into devices with smaller footprints. Edge analytics allows organizations to expand the network’s reach and capabilities.
Reliability: While the proliferation of IoT edge computing devices increases the attack surface for networks, it also offers an array of security advantages. The conventional cloud computing model is inherently centralized, which makes it susceptible to DDoS (distributed denial of service) attacks and power outages. Edge computing distributes processing, storage, and applications across a wide variety of devices and data centers, which makes it difficult for any single disruption to take down the network.
Adaptability: The adaptability of edge analytics also makes it extremely versatile. By partnering with local edge data centers, businesses can target desirable markets without having to invest in costly infrastructure expansion. Edge data centers make it possible to serve end users efficiently with minimal latency, which has proved highly useful for content providers delivering uninterrupted streaming services. At the same time, edge computing enables IoT devices to gather significant amounts of actionable data. Instead of waiting for users to log in with their devices and connect to centralized cloud servers, edge devices are always connected and always generating data for future analysis.
It sometimes makes sense to treat edge computing not as a generic category but as two distinct types of architectures: cloud edge and device edge.
Most people talk about edge computing as a singular type of architecture. But in some respects, it makes sense to think of edge computing as two fundamentally distinct types of architectures: device edge and cloud edge.
Although a device edge and a cloud edge operate in similar ways from an architectural perspective, they cater to different types of use cases, and they pose different challenges.
Here’s a breakdown of how device edge and cloud edge compare.
First, let’s briefly define edge computing itself.
Edge computing is any type of architecture in which workloads are hosted closer to the “edge” of the network — which typically means closer to end-users — than they would be in conventional architectures that centralize processing and data storage inside large data centers.
Agrochemical companies manufacture a range of offerings for yield maximisation, pest resistance, hardiness, water quality and availability and other challenges facing farmers. These companies need to measure the efficacy of their products in real-world conditions, not just controlled experimental environments. Single-crop farms are divided into plots and a specific intervention performed in each. For example, hybrid seeds are sown in one plot while another is treated with fertilisers, and so on. The relative performance of each treatment is assessed by tracking the plants’ health in the plot where that treatment was administered.
The proliferation of big data analytics solutions has significantly redefined how businesses process data over the years. Big data analytics has already proven key to identifying and deriving meaningful insights from vast datasets. With emerging technologies like artificial intelligence, machine learning, and the cloud, data professionals are now leveraging cognitive analytics to drive real-time decision making. It offers much greater potential than big data analytics alone, unlocking the value of big data by making systems more self-reliant and the information they contain more accessible.
Since data is considered the oil of today’s digital economy, data analytics is an indispensable economic driver. Over the years it has evolved rapidly, from descriptive to diagnostic, predictive, and prescriptive approaches. Cognitive analytics is now likely to become the next frontier of this trend. It exploits high-performance computing by integrating artificial intelligence and machine learning techniques with data analytics approaches.
Experts say that you can’t become a skilled driver within a week. Driving, like many other skills, requires patience and lots of practice. Even if you’ve been driving for several years, it’s always a good strategy to learn new techniques and keep improving.
In this article, we’ll share a few tricks that’ll help you become a better driver and avoid unexpected road accidents. These tips cover everything from taking professional driving lessons in Melbourne to properly using all the features of your car.
So, without any further ado, let’s get started.
1. Learn from Experts
If you’re an absolute beginner, the first step towards becoming a better driver would be to join a dedicated driving school in Melbourne. These schools have professional driving instructors who have years of experience in training novice drivers.
They’ll help you understand the basics of driving and also give you extra tips to stay confident behind the wheel. Another potential benefit of joining a driving school is that it’ll also help you pass the driving license test more easily. Why? Because the instructors will also share different rules and regulations that you must follow during the test.
2. Always Set Your Mirrors Correctly
Another crucial tip that’ll help you become a better driver is to adjust all the mirrors correctly. Many people angle the side mirrors so far inward that they show only the rear portion of the car and not the road itself.
Keep in mind that if you do this, you won’t be able to see the cars behind you, and changing lanes will become challenging. So, learn how to adjust the side mirrors and the internal rearview mirror so that you always have a clear view of the road behind.
3. Maintain a Safe Distance From Other Cars in Traffic
While driving in traffic, make it a habit to maintain a safe distance from the car in front of you. The general rule of thumb says you should keep a distance of at least two full car lengths from the car ahead. That way, even if the other driver brakes hard, you won’t collide with their vehicle.
4. Always Use Indicators While Changing Lanes
When driving in traffic, it’s quite natural to change lanes from time to time. However, switching lanes without signaling may confuse other drivers and cause unexpected accidents.
So, while changing lanes, make sure to check the side and rearview mirrors first and then use the correct indicator. If you’re a beginner, your instructor from the driving school in Melbourne will ask you to master this tactic.