IoT For All
This week on the IoT For All Podcast, Pratexo CEO Blaine Mathieu joins us to share everything companies should know about edge computing and how it complements IoT. Blaine shares some of the most transformative and unique use cases he’s seen edge computing enable, including windmills that automatically slow to prevent collisions with birds.
Blaine also shares his thoughts on what companies should consider before deciding to incorporate edge computing into their solutions - when it might be most beneficial, especially in environments where connectivity is difficult or unstable, and some of the biggest benefits companies can expect. He also discusses distributed computing and swarm computing, how they work, and the use cases where the technology really shines.
Blaine Mathieu is a former Gartner analyst, software company founder, CEO, and multi-time CMO and Chief Products Officer at both public tech giants and private software and IIoT startups. He is based in the San Francisco Bay Area and leads the intelligent edge platform company Pratexo.
Interested in connecting with Blaine? Reach out to him on LinkedIn!
About Pratexo: Pratexo is the plug-and-play edge computing platform-as-a-service that unlocks the value of AI and IoT by simplifying and accelerating the design, provisioning, and management of your computing infrastructure at the far edge.
(01:09) Introduction to Blaine Mathieu
(03:00) How did Pratexo get started?
(05:03) Can you share some impactful edge computing use cases?
(13:25) What are the advantages of edge computing?
(18:00) How does edge computing work in settings without stable connectivity?
(20:08) Distributed computing vs. swarm computing?
(22:45) What should people consider when determining if edge computing is right for their solution?
- You are listening to the IoT For All Media Network.
- [Ryan] Hello everyone, and welcome to another episode of the IoT For All podcast. I'm your host, Ryan Chacon, and on today's episode, we have Blaine Mathieu, the CEO of Pratexo, joining us. He has been everything from a Gartner analyst to a startup founder, and is now the CEO of Pratexo, an intelligent edge computing platform company. So on this episode, we're going to talk about edge computing. We're going to talk about everything from how edge computing unlocks the value of IoT and AI, to what processing local data in real time can do for a solution, the benefits, the advantages, and we're going to answer a question: can edge computing platforms really be a one-size-fits-all solution, or is there another approach that needs to be thought about? But before we get into this episode, if any of you out there are looking to enter the fast-growing and profitable IoT market but don't know where to start, check out our sponsor Leverege. Leverege is an IoT solutions development platform that provides everything you need to create turnkey IoT products that you can white label and resell under your own brand. To learn more, go to IoTchangeseverything.com. That's IoTchangeseverything.com. And without further ado, please enjoy this episode of the IoT For All podcast. Welcome, Blaine, to the IoT For All show. Thanks for being here this week.
- [Blaine] Thanks Ryan, great to be here.
- [Ryan] Yeah, it's fantastic to have you. We're live with video now, so the podcast is just slowly moving up quality-wise. We really appreciate you taking the time out to talk more about what you have going on and answering some questions about edge computing. It's going to be a really good conversation, I'm looking forward to it.
- [Blaine] Well, I'm excited to be one of your first video guests and I've been listening to the IoT For All podcast for years, I think.
- [Ryan] Wow, thank you.
- [Blaine] Just truly impressed with the quality of the production and the guests, you guys are the industry experts and I love it.
- [Ryan] Well, I really appreciate that. The team will find a lot of appreciation in that comment, so we work hard and we're glad you're finding value. So I wanted to start off this conversation by having you give a quick introduction about yourself, background, experience, anything you think would be relevant, interesting for our audience to know about who they're listening to.
- [Blaine] Well, I'll try to keep it brief. Let's see. I'm originally Canadian, was known as an internet guru back in the old days of the first dot-com boom. Gartner brought me down to Silicon Valley, where I was one of their first internet analysts covering the space. From then on I've been an executive in enterprise software companies, which eventually became SaaS companies. I got into the big data analytics space about 10 years ago, and then moved directly into IoT, which was starting to merge with the AI and machine learning space, about six years ago or so. The year prior to joining Pratexo, I was at an interesting VC called Momenta Partners that's focused on edge AI and IoT related startups, and that's where I found Pratexo. I began with Pratexo in January of this year as their CEO.
- [Ryan] Fantastic, yeah, we actually know Momenta very well. We've spoken to them many times, met them at events, back when events were something that people went to regularly.
- [Blaine] Exactly.
- [Ryan] So speaking of Pratexo, tell us a little bit about the company. What do you all do? What's the focus? And I'd love to hear a little bit more about the backstory of the company as well, and anything that drew you in to joining.
- [Blaine] Yeah, yeah, well. So it's a really interesting story. The company was born out of a venture incubator called Northscaler, located in Norway, actually, Norway and Sweden, in 2019. Basically, as often happens with young startups, it was originally a team of brilliant technologists with a lot of background in video gaming, Electronic Arts, and in building high performance computing systems. That was sort of where this engineering team came from and what they were doing. But by 2019, they saw that a lot of this computing that used to be done in data centers was going to have to move to the edge, closer to where data was created, for all the reasons we understand: latency, security, privacy, costs, you name it. And so they began to take this knowledge from building resilient systems based on the principles of distributed computing and figure out how to move it to the edge, and even to the far edge. I joined the company, as I mentioned, earlier this year, in 2021, to really begin the commercialization of this cool technology they had built over the previous few years. And as for why I joined: fundamentally, after spending the previous 10 years in big data analytics, IoT, and this emerging industrial machine learning space, I also saw that the edge was going to be the thing. A lot of solutions and applications and technology had been built on the premise that everything was moving to the cloud. But as you mentioned earlier about unlocking the value of AI and IoT, it was pretty obvious to me that to really unlock the value of IoT and AI, you did have to move a lot of this compute capability down to the far edge, into a building, on a ship, in a transformer station, in a windmill, you name it, and that's why I joined Pratexo, to help make that happen.
- [Ryan] That's fantastic. We've had a lot of guests in the past mention edge computing and talk in detail about what it is, but I'd love it if you could expand on the unique offering that Pratexo brings to the industry, and connect that with any use cases or applications of the technology, talking more about what the technology is being used for, why it was chosen to go edge versus just the cloud, and the advantages it has brought to those individual industries and use cases you are focusing on.
- [Blaine] Sure, you said you want me to keep my answers short and this is a big question--
- [Ryan] No, no! This is a good one.
- [Blaine] This is a great question. I think of edge computing as being a lot like where IoT was, or actually, probably an even better analogy, where machine learning and AI were four or five years ago. If you remember back then, every company was a machine learning company. It didn't matter what you did, you were machine learning. Every website, every startup was a machine learning company. It didn't matter, right? And now I'm seeing this in edge computing. Everybody works on the edge, everybody's edge this and edge that. But fundamentally, in the real world, what you've got is a lot of systems that were originally architected for the cloud on the assumption that you're going to be able to put data in the cloud and process it there, and then they happen to do some stuff on the edge, maybe a limited amount of data filtering. And it was pretty clear to me that that just wasn't going to be good enough to actually realize the value of IoT, for example. We know there are many reasons why IoT is not moving as fast as it could have, or should have: the complexity of the entire ecosystem, which you talk about in a lot of your podcasts and try to figure out how to solve. But I think one of the things is there's just too much data to actually push it all to cloud-based applications. The vast majority of IoT data is dark data. It's not actually being utilized to make a decision or take an action in real time. And there's a solution to that with edge computing. And I think the same thing is in danger of happening with industrial machine learning and AI. If we think central clouds are going to be where we can run all these AI models, I think we're going to be disappointed. Central clouds are fantastic for training models, for creating models; there are huge use cases for central clouds, don't get me wrong. The edge isn't going to eat the cloud, they have to collaborate together. But then when you're actually running the model in real time, again, in a windmill, in some of these use cases I'll talk about in a second, it has to be run close to where the machines need to be controlled, the devices need to be actuated. That's just the reality. So that's why I position edge computing, and Pratexo particularly, not as yet another technology you have to master, not as, "Okay, I started with IoT and I'm starting to do some AI projects in my innovation lab, and now I've got edge computing. Oh my God," right? No, it's about how you can now finally unlock that value of IoT and the upcoming value that you want to get from AI and machine learning. So I'm happy to talk about a couple of use cases, if you like.
- [Ryan] Yeah, yeah. I think that'd be great. I think the last thing you just mentioned there is super important: people have to understand that this is not a new technology that they themselves have to learn, it's more that the potential is now there to more easily unlock the value of edge computing and put it to work in the way that it was intended.
- [Blaine] Totally, I learned that lesson well over the last nine months at Pratexo, because when I began pitching, a lot of it was about educating on edge computing, let alone on Pratexo, and as I was talking more and more about edge computing, I found the audience was like, "Oh no, there's another thing I have to master." And I realized, no, don't think about it that way. Think about it this way: you've already invested millions in your IoT infrastructure. You're not actually maximizing the value of that today, are you? No, we know we aren't, right? Think of all those cool projects you've got somebody in your innovation group working on for AI and machine learning. Well, think about edge computing as the way to get there.
- [Ryan] Okay. That makes sense.
- [Blaine] So I can talk about, yeah, let me talk about a couple of use cases. One that I love, that's really interesting, is actually commercial windmills in Europe. So here's the situation in many European countries. As you know, we've got these super large windmills that, unfortunately, have the capacity to injure birds and take out bird populations. So in many countries, they literally require a person to sit at the base of the windmill with a clipboard or a counter, counting how many birds get hit by the windmill and how many flocks of birds fly nearby, so they can track the impact on the environment. This is crazy when you think about it, that we have people actually doing this job today, right? So imagine replacing those people, or these groups of people, with a ring of high definition cameras around the windmill that are sensing the birds, actually using AI not only to detect an individual bird, because that's very hard, right? Unless the birds are quite close. But based on flight patterns and flocking patterns, you can identify the species of the bird fairly easily using machine learning. Then you take those image sensor feeds, push them down into the base of the windmill, where you're running one compute node at the windmill edge with a few AI processors to process that data in real time, and you have a really good, accurate record of how that windmill is impacting the bird population without having to hire an army of people to run around at the base of your windmill. I think it's a really cool use case related to environmental responsibility.
- [Ryan] Sure.
- [Blaine] But a huge ROI for these power companies that are starting to stand these systems up.
- [Ryan] Absolutely, I mean, IoT has taught me a ton over the last number of years. And one of those things is, it's taught me about a lot of different jobs and roles that I never knew existed. And this is one of them. So it's very fascinating. When you hear about windmills, you hear about the impact they have on birds, but you never really think about how that's monitored, how that's controlled, or what data they're able to pull from that in order to make better decisions to help this not happen. So that's very fascinating, for sure.
- [Blaine] Actually, it's not only about making better decisions in the long run; they can literally change the pitch of the propellers in real time to almost immediately slow down or stop the windmill if a flock of geese is about to fly into it. So it is amazing how they can control these systems in real time. Another one you'll like, maybe not as far out there as the windmill, is one we're working on with a Norwegian power utility. Right now they have these large transformers, right, that are scattered throughout the country. The way they can tell if a transformer is about to have a problem is, Lars trudges eight hours through the snow to a remote transformer station, opens the door, and actually listens to the transformer, okay? You can tell by the sound that transformer is making, if it's sparking or arcing, if it's popping, if it's showing various other signs, that you're going to have a potential issue with this transformer in the next few hours, days, or weeks. When you have tens of thousands of transformers spread around the country, that's a lot of Lars, and a lot of time that Lars can't be there listening because he's somewhere else. So think about setting up your edge compute nodes in each transformer station with microphones, running that sound through a machine learning algorithm and comparing it to the Modbus data feed, which has all the other sensor readings off the transformer, and being able to know in real time, and actually predict, what's going to be happening to that transformer. And good ol' Lars can do something else instead of listening to transformer stations.
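To make that transformer scenario a bit more concrete, here is a minimal, purely illustrative sketch of the kind of edge-side loop being described: an acoustic anomaly score fused with conventional sensor readings and evaluated locally, so raw data never has to leave the station. Every function name, field, and threshold here is a hypothetical stand-in, not a Pratexo API or the utility's actual logic.

```python
# Hypothetical edge-side monitoring loop for the transformer use case:
# fuse an acoustic anomaly score with Modbus-style sensor readings and
# decide locally whether to raise an alert. All values are simulated.

import random
import time

def read_acoustic_anomaly_score() -> float:
    """Stand-in for a trained audio model scoring the latest microphone window
    (0.0 = normal hum, 1.0 = strong arcing/popping signature)."""
    return random.random()

def read_modbus_telemetry() -> dict:
    """Stand-in for polling the transformer's Modbus registers
    (temperature, load current, etc.)."""
    return {"oil_temp_c": random.uniform(40, 95), "load_amps": random.uniform(50, 400)}

def evaluate(acoustic: float, telemetry: dict) -> str:
    # Both signals are combined at the edge; only the verdict needs to travel.
    if acoustic > 0.8 and telemetry["oil_temp_c"] > 85:
        return "ALERT: probable incipient fault, dispatch a technician"
    if acoustic > 0.8:
        return "WARN: abnormal acoustic signature, keep monitoring"
    return "OK"

if __name__ == "__main__":
    for _ in range(5):  # in a real deployment this loop would run continuously
        status = evaluate(read_acoustic_anomaly_score(), read_modbus_telemetry())
        print(status)   # forwarded upstream only when a connection exists
        time.sleep(1)
```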
- [Ryan] Yeah, that's very interesting. I think the application of edge computing, and what it's bringing to industries that you would probably never even have thought of before getting involved in this space, is truly fascinating. So I appreciate you sharing those use cases, and I'm sure there are plenty of other ones that you all are exploring. Before we get into potentially other use cases, I do want to ask you: when you talk to these companies and engage with these organizations and bring up edge computing, and oftentimes they may not understand edge computing or know its benefits and value, how do you promote edge computing to them? What are the true benefits and advantages of bringing edge computing into an IoT solution? I know not every use case requires it, but I know there are definitely some benefits and advantages to using edge computing. We've touched on some of them through our conversation so far, but if somebody with a non-technical background were to come and ask you, "Hey Blaine, why are we going the edge computing route? Are we complementing it with the cloud too, or are we completely nixing the cloud? What are we doing and why?" How do you usually answer that?
- [Blaine] Yeah, well, yeah. So the sort of canonical benefits of edge computing in general are things like privacy. For example, many healthcare organizations, hospitals in Europe, are prohibited by law from allowing any patient-related data, including data off IoT sensors like heart monitors, to go outside of the hospital, all right? You have to be computing that data locally in the hospital to take action in real time. So privacy is a big one. Security is another one. I mentioned the power distribution grid; actually, again by law, they're not allowed to connect to the public internet or push their data through to central clouds. They need to stay in secure VPN environments. So they need to run their ecosystem securely. Of course, there are latency issues. Again, back to the windmill example, you need to be able to turn those blades now, and even a second, or a few seconds, while you're computing the model up in the cloud and then sending the signal back down to rotate the blades, might be too late for that poor bird that's about to fly by. So latency is another sort of canonical benefit. And then it gets, I think, even a little more interesting, because we haven't talked too much about it yet, but Pratexo enables not just edge computing, but distributed computing on the edge, right? So you can have a series of edge nodes all collaborating together to solve a problem. And what does that do for you, right? First of all, it gives you high resiliency and reliability. So if one of your compute nodes goes down, the other ones can keep processing the data. In fact, in these windmills I'm talking about, we're not putting in just one compute node, it's actually a little cluster of three edge nodes that are doing this compute collaboratively, and that way it's always on, always available and reliable. Another benefit of that is scalability. As your data processing needs increase, as you add more sensors on your manufacturing line, more cameras to your windmill, whatever the case is, you can just add another compute node, and that compute node will join the local cluster, scale up its capability, and increase what it's doing accordingly. The last thing that I really like about edge computing in general is this thing, I wish there was an easier term for it, but this notion of data democratization, right? Because imagine a scenario: you're on a container ship. The average container ship these days has at least 10, and sometimes many more, separate control stations. You've got HVAC systems, you've got navigation, engine monitoring, cargo monitoring, all these different systems connected to different sets of sensors and different computers running that data. Now imagine instead you had a central distributed micro cloud at sea, okay? Where you had all the sensors and devices connected into this micro cloud made up of a bunch of separate compute nodes, so if one goes down, you've got no problem, the system is still gonna run. It's only by sort of democratizing those data feeds, exposing those data feeds to different, increasingly intelligent software applications, that you're gonna get to this future of autonomous shipping. You can never do it when you've got a bunch of disconnected, siloed IoT-based systems, and even AI and machine learning based systems, all running separately, and you're relying on your crew to be the integration factor, right?
So latency, security, privacy, scalability, and I think a longer-term one is this notion of democratizing data feeds and being able to take action on them in real time.
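As a rough, hypothetical illustration of the resiliency and scalability benefits described above (not how Pratexo actually implements its clustering), here is a small sketch in which sensor streams are rebalanced across whichever edge nodes are currently healthy, and a new node can join the cluster to absorb load:

```python
# Conceptual sketch of a distributed edge cluster: resiliency (surviving a
# node failure) and scale-out (adding a node). Names and structure are
# invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    name: str
    healthy: bool = True
    streams: list = field(default_factory=list)

def rebalance(nodes, streams):
    """Spread sensor streams across whichever nodes are currently healthy."""
    live = [n for n in nodes if n.healthy]
    if not live:
        raise RuntimeError("no healthy edge nodes left in the cluster")
    for n in live:
        n.streams.clear()
    for i, s in enumerate(streams):
        live[i % len(live)].streams.append(s)

streams = [f"camera-{i}" for i in range(6)]
cluster = [EdgeNode("node-a"), EdgeNode("node-b"), EdgeNode("node-c")]

rebalance(cluster, streams)        # normal operation: two streams per node
cluster[1].healthy = False
rebalance(cluster, streams)        # node-b fails: the survivors take three each
cluster.append(EdgeNode("node-d"))
rebalance(cluster, streams)        # scale out: a new node joins and load drops again
for n in cluster:
    print(n.name, n.streams)
```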
- [Ryan] That makes a lot of sense. And out of curiosity, when you work in these different environments, are there ever environments in which, maybe there's not access to the public internet or some kind of issue, or I guess anomaly in the way that it's set up, and if so, how does edge computing kind of work in those situations?
- [Blaine] Yeah, it's a great question, because this is another reason why, one of the many explanations for the relatively high failure rate of IoT-related POCs is the fact that they never make it out of the POC stage, or they never realize the full ROI even if they do. Same thing, I see this coming again with machine learning: when you build that POC, you're probably not that concerned about your connection to the cloud. I know security's an ongoing topic on the IoT For All podcast, and a lot of POCs aren't built to be secure from the get-go. So they work properly, they run no problem, but then, especially in industrial environments, I'd say that easily three quarters of the use cases we are involved in, and it's a wide range of industrial-related use cases, either have sporadic connections to central clouds or no connection at all, okay? Because of the reasons I said, some legal, some privacy, and back to my micro cloud at sea: you're in a storm, and even if you normally have very expensive satellite links to get data up, when you're in a storm in the North Sea, your satellite link probably doesn't work, right? So you need to have the ability to do that compute locally. And so that's why at Pratexo, we've actually invested a lot of brain power into designing and architecting systems that can run disconnected, and not just run disconnected, but that can be installed in a disconnected environment, so they don't require a connection to the cloud to install, and can also be maintained, enhanced, and monitored without requiring you to phone home, so to speak. So I'd say the majority of the use cases that we support have that kind of characteristic today.
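One common pattern behind that kind of disconnected operation is store-and-forward: keep computing and buffering results locally, and flush upstream whenever a link happens to be available. The sketch below is a generic illustration of that pattern under assumed names and fields, not Pratexo's implementation; the uplink check is a random stand-in for a real satellite or cellular status probe.

```python
# Generic store-and-forward sketch for an edge node with intermittent
# connectivity: decisions are made locally, results are buffered, and the
# buffer is flushed only when the uplink is available.

import random
from collections import deque

buffer = deque(maxlen=10_000)        # bounded local queue so storage can't overflow

def uplink_available() -> bool:
    return random.random() > 0.7     # hypothetical: the link is up roughly 30% of the time

def process_locally(reading: dict) -> dict:
    reading["anomaly"] = reading["value"] > 0.9   # the decision happens at the edge
    return reading

for i in range(20):
    result = process_locally({"seq": i, "value": random.random()})
    buffer.append(result)
    if uplink_available():
        while buffer:
            sent = buffer.popleft()
            print("synced to cloud:", sent)       # placeholder for the real upload
```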
- [Ryan] Gotcha, okay! And also one thing you mentioned earlier, you talked about distributed computing and there's another term I've heard used often, which is swarm computing. And I was wondering if you could kind of define what those are and how that kind of is impacting or coming to edge computing in a sense.
- [Blaine] Yeah, swarm computing is a really interesting topic, and sometimes I avoid it because it blows people's minds, but think of the notion of distributed computing, but more on an ad hoc basis. This is becoming increasingly common in situations that involve moving things: vehicles, robots, airplanes, you name it, they're in different situations at different points in time. One example I like to use, a pilot we're working on with an organization that'll remain nameless, is doing this with armored personnel carriers. So imagine you've got a group of APCs moving through the desert, and they've got cameras mounted on them, so they can see in a couple of directions and use object detection to detect a person, a strange person, an object, maybe even an IED or something really bad like that. And the APC that's in the front of the line has a perfect view, okay? The APCs right behind it basically can't see anything 'cause they're in a dust cloud. So what you would like to do is to be able to swarm the compute capability of the rear APCs toward the one in the front, so the one in the front can be maximally detecting in real time, using machine learning, what's going on around it. Now let's say the APCs change their configuration, they move into a two-up formation. Now you have to change the compute configuration again in real time, right? Swarming your capability in a different way. Another example I like to use is related to service dogs, actually, in rescue situations. We're working on a project like this right now as well. Imagine you're on a site like that building that collapsed in Florida recently, and you've got a pack, or a swarm, of dogs searching through that environment to find people, detect signs of life, whatever the case is. So you've got, again, a camera mounted on each dog. You know the location, the position, whether the dog is sitting or lying down, using motion detection, all in real time. And again, the compute capacity of those dogs is variable, depending on whether one's resting, working, running, you name it. So there are a lot of really interesting future use cases for swarm computing in the context of computing at the edge.
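To ground the swarm idea, here is a toy sketch of reallocating a shared pool of compute toward whichever vehicle currently has the most useful camera view, recomputed whenever the formation changes. The numbers, field names, and allocation rule are invented purely for illustration and are not drawn from the pilot being described.

```python
# Toy swarm-computing illustration: worker slots from a shared pool are
# re-allocated in proportion to each vehicle's current visibility, and the
# allocation is recomputed whenever conditions change.

def allocate_compute(vehicles, total_worker_slots=12):
    """Assign worker slots in proportion to how useful each camera feed is."""
    total_vis = sum(v["visibility"] for v in vehicles) or 1.0
    return {v["id"]: round(total_worker_slots * v["visibility"] / total_vis)
            for v in vehicles}

convoy = [
    {"id": "apc-1", "visibility": 0.9},   # lead vehicle, clear view
    {"id": "apc-2", "visibility": 0.1},   # stuck in the dust cloud
    {"id": "apc-3", "visibility": 0.1},
]
print(allocate_compute(convoy))           # most slots flow to apc-1

convoy[1]["visibility"] = 0.8             # formation changes to two-up
print(allocate_compute(convoy))           # the allocation shifts in real time
```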
- [Ryan] Yeah, swarm computing is something new to us that we don't talk about too much. I've heard it a few times, but I appreciate the definition and the explanation there; it's super fascinating stuff. One of the last questions I want to ask you before we wrap up is, when people are thinking about edge computing platforms, and computing in general, should they be thinking about these as a one-size-fits-all solution? What should they be thinking about? What do they need to know ahead of time when they're going into that edge computing conversation, even if they're not the technical people involved in the decision making, but they want to better prepare themselves to work with a company like yours? What do they need to know, and how should they be thinking about edge computing from a platform standpoint?
- [Blaine] Yeah, I think the key is to start small, but think big, and make sure you've got tools and platforms and solutions that enable that. What I mean by that is, obviously, you want to get some value very quickly, right? You want to build a detector: is there an imminent problem with this transformer or not? Just let me know. It's pretty easy to run that basic application, but then you want your software infrastructure and your supporting hardware to be able to support an evolution of those application use cases into increasingly complex, increasingly intelligent systems, and you don't want to have to rip out your software and your hardware and start over again every time you go from use case A, then you add in B, then you add in C, right? You want to be able to keep that computing infrastructure. So that's why Pratexo is really focused on not just building sort of a one-time edge compute platform, but is built around this notion of a configurator. You can continually change the configuration of your edge ecosystem so it evolves with you over time, and when you're ready, even if you're in a disconnected environment, you can stream updates up to your compute environment: new machine learning models, new applications that you want to run in the ship micro cloud, whatever the case is. So it's very flexible, and you're not locked into one use case at one point in time; you're sort of future-proofed, maybe, is the way to think about it.
- [Ryan] Yeah, that's something that we've seen across different technologies in the IoT space. When you're thinking about connectivity, for instance, choosing the right connectivity for a certain use case and being able to future-proof against potential other use cases you may want to build in addition to the one you're starting with. So the fact that you all are thinking about that and approaching from that same angle I think allows for more capabilities and more, in a sense, security in the decision making when a company is trying to decide on how to build a solution, what to build, and make sure that what they're building is for the future, because these technologies are going to be around and impacting industries all over the world for many, many years to come. And so I really appreciate your time today, kind of shedding light on a lot of these topics and what you all have going on over at Pratexo sounds absolutely fascinating. And I'd love it if you could kind of just wrap up by telling our audience a little bit more about where they can reach out. If they have questions, they want to learn more, get in touch, that kind of thing.
- [Blaine] Definitely check out Pratexo.com and I'm very easy to find on LinkedIn. So send me a message, I'd be happy to tell anybody more about edge computing, even swarm computing, and obviously especia