As a sneak preview of our upcoming “Edge Computing and IoT” event, we sat down with Blaine Mathieu, CEO of Pratexo, to discuss edge computing and the power of IoT and Artificial Intelligence.
Hi Blaine, thanks for joining us today. As my first question, I’d like to ask: what is holding organizations back from further adopting IoT and AI solutions?
Of course, there are many factors. But I am convinced that a big one has been how the adoption of edge computing has lagged behind the adoption of IoT and even AI. Google Trends data shows that IoT and ML took off as search terms in 2014, but edge computing didn’t begin to trend until about three years later. It’s my position, backed up by conversations with many companies over the years, that the lag is explained by a lot of failed POCs and pilots, which drove these companies to realize that successful IoT and AI systems require the edge, because that is where most of the data needs to be analyzed and acted on.
How are organizations you work with adopting intelligence at the edge?
They are starting small, introducing basic applications running at the edge that can enable valuable actions while collecting the data necessary to train ML models. Once that data is in hand, more advanced applications, including these trained models, can be streamed back to the edge for real-time processing. The basic point is to use a platform that can easily evolve with you over time as your system gets more intelligent.
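The evolution described above, starting with a simple rule at the edge, buffering data for central training, and later deploying a trained model back to the same node, can be sketched roughly as follows. All class and function names here are illustrative assumptions for the pattern, not Pratexo’s actual platform or API.

```python
class EdgeNode:
    """Minimal sketch of an edge node that starts with a basic rule,
    collects training data, and can later be upgraded with a trained
    model for local, real-time inference."""

    def __init__(self, threshold=75.0):
        self.threshold = threshold   # phase 1 rule: alert above this value
        self.training_buffer = []    # readings kept for central training
        self.model = None            # trained model streamed back later

    def process(self, reading):
        # Always buffer the reading so it can be shipped off for training.
        self.training_buffer.append(reading)
        if self.model is not None:
            # Phase 2: run the streamed-back model locally.
            return self.model(reading)
        # Phase 1: basic rule-based action while data is being collected.
        return "alert" if reading > self.threshold else "ok"

    def deploy_model(self, model_fn):
        # A centrally trained model is "streamed back" to the edge.
        self.model = model_fn


node = EdgeNode()
print(node.process(80.0))   # phase 1, rule-based: "alert"
print(node.process(60.0))   # "ok"

# Pretend a model was trained centrally on the buffered data, then deployed.
node.deploy_model(lambda r: "alert" if r > 70.0 else "ok")
print(node.process(72.0))   # phase 2, model-based: "alert"
```

The key design point matches the interview: the node’s interface stays the same as the system gets more intelligent; only the decision logic behind it is upgraded.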
With the explosion of data being generated now, how do you see edge computing solving challenges around managing costs and reducing environmental impact?
Interestingly, probably half the systems we work on are related to sustainability efforts in some way. This is a huge trend and will be a big driver of IoT, AI, and therefore edge computing, over the coming decade. As to cost savings, this is certainly core to the value of edge computing: being able to process massive amounts of data locally instead of transporting it to, and processing it in, central clouds. But I find that the more interesting use cases are around enabling the organization to do new and better things, not only saving money.
Can you share your perspective on digital twins?
The concept of digital twins is very hot and getting hotter, and it definitely requires data to be processed at the edge. The challenge is that many organizations scope these projects too broadly, for example, a digital twin of an entire nuclear plant or an entire building complex. Those are great goals, but we need to get started with ‘micro-twins’ of smaller systems and then build up over time.
Do you see trust (through security & reliability) becoming a critical component to accelerating innovation?
So much has been written and said about security – it’s no longer an afterthought for POCs and pilots. Regarding reliability and availability, I believe the new hot area of focus is on systems that must be permanently or partially disconnected from the internet or central clouds. This may be for reasons of security, privacy, or legal compliance, or simply because they operate in tough environments where connections aren’t possible. Most POCs don’t consider this issue today, but it rears its head when the system heads to production. My company, Pratexo, has spent a lot of time working to solve these issues for our clients.