We had the pleasure to interview Mr. Bharath Ramesh, Head of HPE Edge AI Software & Solutions. In this interview we discussed IoT, cloud and edge technologies, security and future opportunities.
Hi Bharath, thanks for joining us. First I’d like to ask: what are the key drivers for customers to deploy workloads across datacenter, cloud and edge?
Customers are quickly realizing the business value of the data that they generate and already possess, much of which originates from the physical world, e.g., sound, light, pressure, temperature and vibration. Frequently this data is either not being sensed and digitized, or is trapped inside closed proprietary systems.
But technology has evolved to the point where accessing the data is no longer cost prohibitive or technically difficult.
If analysis of the data allows the user to have an improved understanding of industrial asset condition and potential downtime, this can be a valuable efficiency win. If it provides an insight into production bottlenecks that helps increase velocity, this can be a great productivity ROI story.
But data movement can be costly and risky, and choosing the appropriate location (edge, datacenter or cloud) to run the analysis workload is important.
Typically this decision is a function of where data is collected and the nearest location where sufficient compute resides to produce the insight. Increasingly both of these elements are present at the edge itself, but some insights still require a centralized viewpoint (e.g., comparing performance across plants in multiple geographies).
In such scenarios, a curated subset of data is sent from the edge to a datacenter/cloud, and all the locations operate as a continuum of analytics capabilities.
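One way to picture the edge-side curation described above is a simple aggregation step that collapses raw sensor samples into compact summaries before backhaul. This is an illustrative sketch only; the function name and record schema are hypothetical, not an HPE API.

```python
from statistics import mean

def curate_for_cloud(readings, window=60):
    """Reduce raw edge sensor readings to compact per-window summaries
    suitable for sending to a datacenter/cloud (hypothetical schema)."""
    batches = [readings[i:i + window] for i in range(0, len(readings), window)]
    return [
        {"min": min(b), "max": max(b), "mean": round(mean(b), 2), "count": len(b)}
        for b in batches
    ]

# 120 raw vibration samples collapse to two summary records,
# so only a curated subset of data crosses the network.
raw = [0.5 + 0.001 * i for i in range(120)]
summary = curate_for_cloud(raw, window=60)
print(len(summary))  # 2
```

The centralized side can still compare plants or time horizons from the summaries, while the bulk of the raw data stays at the edge where it was produced.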
HPE provides the ability for workloads and data to seamlessly move across this spectrum depending on business need rather than technology availability.
Where are the opportunities?
Many industrial customers with a well-established fleet of assets and production lines are continually looking for ways to reduce cost of operation. Improving the manufacturing process, the product or both simultaneously is a great path to yield such an outcome. Monitoring the health of production equipment allows maintenance to occur based on need vs. a fixed time schedule, thus reducing unnecessary replacements and cost of technician time. When failures can be proactively detected, it also reduces the likelihood of unplanned downtime requiring an expensive rush fix.
Traditional statistical techniques are giving way to demanding machine learning (ML) algorithms that draw upon historical information to improve prediction accuracy.
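As a toy stand-in for the condition-based monitoring idea, a baseline deviation check flags readings that stray far from an asset's historical norm. This is a traditional statistical sketch of the concept, not the ML algorithms or any HPE product logic mentioned above; the thresholds and data are invented.

```python
from statistics import mean, stdev

def needs_maintenance(history, latest, threshold=3.0):
    """Flag a reading as anomalous when it deviates more than `threshold`
    standard deviations from the asset's historical baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

# Hypothetical bearing temperatures in °C
history = [70.1, 70.4, 69.8, 70.0, 70.2, 69.9, 70.3, 70.0]
print(needs_maintenance(history, 70.2))  # normal reading: False
print(needs_maintenance(history, 78.5))  # anomalous reading: True
```

A real ML-based predictor would replace the z-score with a model trained on labeled failure history, but the maintain-on-need pattern is the same: act when the data says so, not on a fixed schedule.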
Another major cost driver for manufacturers is product quality, and a new wave of quality assurance (QA) systems are using artificial intelligence (AI) tools to greatly improve the accuracy and speed of defect detection. These systems enable even a non-IT user to train models based on sample data and use defect escapes to continually refine the QA effectiveness.
Best of all, these systems leverage data from existing assets such as cameras, and can be easily scaled to perform inspections after more production steps. Defects can now be caught earlier, where the cost of rework is lower, and fewer bad products reach the end user, with a resultant overall business saving. HPE has itself adopted such a QA framework in our server production factories with great results.
Just a few years ago, advanced computing technologies to implement these improvements were not available on the factory floor. This is no longer the case, and the ability to deploy datacenter and cloud-like capabilities in a rugged form factor on the industrial edge is a reality. An entire production line instrumented in this manner helps customers get closer to the gold standard of a 100% overall equipment effectiveness (OEE) score.
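The OEE score mentioned above is conventionally computed as the product of three factors, each a fraction of the ideal: availability, performance and quality. A minimal sketch:

```python
def oee(availability, performance, quality):
    """Overall equipment effectiveness: the product of availability,
    performance and quality, each expressed as a fraction of ideal (0-1)."""
    for factor in (availability, performance, quality):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each OEE factor must be between 0 and 1")
    return availability * performance * quality

# A line running 90% of planned time, at 95% of ideal speed,
# with 98% good parts, still lands well below the 100% gold standard.
score = oee(0.90, 0.95, 0.98)
print(f"{score:.1%}")  # 83.8%
```

Because the factors multiply, even modest losses in each dimension compound, which is why a fully instrumented line chasing 100% needs visibility into all three at once.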
What are the top edge computing architecture considerations to support digital transformation in industrial use cases?
Security must be the foremost consideration because the edge inherently has a larger attack surface than a datacenter or cloud. For example, physical access to a compute system fitted on a remote pump is easier than getting to a server in an access protected and monitored datacenter. Data must therefore be encrypted, not just in transit, but also at rest, and immutably tied to the generating system so it cannot be manipulated to drive wrong outcomes.
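One common way to tie data immutably to its generating system is a keyed message authentication code. The sketch below uses HMAC-SHA256 with a hypothetical per-device secret; in practice the key would live in a TPM or secure element on the edge system, and this is an illustration of the principle, not HPE's implementation.

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret; in a real deployment this would be
# provisioned into hardware-backed storage, never embedded in code.
DEVICE_KEY = b"edge-pump-17-secret"

def sign_reading(device_id, reading):
    """Attach an HMAC tag so the reading is verifiably tied to its
    generating device and tampering in transit or at rest is detectable."""
    payload = json.dumps({"device": device_id, "reading": reading}, sort_keys=True)
    tag = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_reading(msg):
    expected = hmac.new(DEVICE_KEY, msg["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

msg = sign_reading("pump-17", 4.2)
print(verify_reading(msg))  # True: untouched reading verifies
msg["payload"] = msg["payload"].replace("4.2", "9.9")
print(verify_reading(msg))  # False: manipulation is detected
```

Encryption then protects confidentiality in transit and at rest, while the authentication tag prevents manipulated data from driving wrong outcomes.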
The system performing the analysis must run trusted software from the metal upwards, so customers can confidently bet their business on the insight it generates.
Remotely accessed systems management provides an additional layer of scrutiny, as issues are highlighted to IT staff as soon as they occur and in many cases can be remedied without initiating a truck roll. Regular updates of firmware and software help the distributed compute infrastructure keep ahead of reported vulnerabilities.
Once a customer deploys edge analytics, the resulting user interest inevitably drives an increased workload demand.
The simple data aggregation and event processing gateway makes way for rugged edge servers such as HPE Edgeline that offer high levels of security, monitoring and flexibility. CPU-based compute blends with GPU- or FPGA-accelerated compute to run new AI inference models. Even the use of large AI training accelerators at the edge has become a reality.
Wholly wired connectivity moves to a mix of wired and wireless, so new sensors and backhaul links can be added more easily. This is a key HPE competency through our Aruba networking business.
But retrofitting an asset with compute is an expensive activity. So the Industrial IoT architecture must assume that the workload demand will grow and provision for future scale to minimize such retrofits.
How do you capture Cloud-Edge convergence in Industrial IoT?
While the edge sits in the real time data flow and offers an immediacy of insight, it lacks the macroscopic view of a cloud looking across long time horizons or wide geographies. Businesses need both types of insights, so the full benefits of Industrial IoT can only be realized when the cloud and edge seamlessly interoperate.
However, the historical security paradigm for factory floor compute is to completely air-gap it from any external network. While such an architecture keeps threats from the IT domain from easily penetrating operations, it also stifles the flow of data necessary to mesh fast and deep insights.
New security paradigms are being adopted where firewalls sitting in the DMZ between the Operations Technology (OT) and IT domains allow selected data to permeate the boundary and enable edge-to-cloud interoperability.
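The "selected data" crossing the OT/IT boundary can be pictured as an allow-list enforced at the DMZ: only explicitly cleared telemetry topics permeate, everything else stays inside operations. The topic names below are invented for illustration; a production boundary would be a firewall or data diode, not application code.

```python
# Hypothetical allow-list enforced at the OT/IT boundary: only cleared
# telemetry topics may cross from the operations network to the IT side.
ALLOWED_TOPICS = {"line1/oee", "line1/alerts", "line2/oee"}

def permit(topic):
    """Return True only for topics explicitly cleared to cross the DMZ."""
    return topic in ALLOWED_TOPICS

outbound = ["line1/oee", "line1/plc/raw", "line2/oee", "line1/recipe"]
crossed = [t for t in outbound if permit(t)]
print(crossed)  # ['line1/oee', 'line2/oee']
```

Raw PLC traffic and process recipes never leave the OT domain, while aggregate health and quality signals flow onward to feed the cloud's macroscopic view.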
Additionally, there is a strong desire to get a cloud-like user experience at the edge where the user can easily spin up or decommission resources, and access application libraries as needed for their role – all without having to involve IT.
The line of business would also like to pay only for consumed capacity rather than paying the full compute infrastructure CapEx upfront out of their budget. HPE GreenLake delivers exactly such secure cloud-like platform capabilities spanning edge, datacenter and cloud domains.
What are the challenges preventing adoption of edge technologies?
Industrial users have long been accustomed to building a customized solution from the ground up to solve their problems. But this approach is infeasible given the complexities of IoT projects.
Many of the foundational issues around industrial bus connectivity, ruggedization, security etc. are now addressable using off-the-shelf IT technology. Proven architectures exist to deploy compute across edge, datacenter and cloud, and can be simply onboarded to allow these users to begin at 75-90% complete instead of 0% each time.
It can be a challenge to convince users that this building-block approach yields faster proof of value and reduces overall cost by using a validated stack. This approach also lets users focus on the data and analysis, which is what drives their business differentiation – not the underlying technologies.
HPE provides such building-block solutions for Quality Assurance (QA), Condition Based Monitoring (CBM) etc., including Pointnext Advisory and Professional Services (A&PS) to customize the implementation.