WEBINAR

Computer Vision on Every Asset - SaaS APM + Robotics for Energy Excellence

Image credit: GE Vernova
Experts from GE Vernova’s Power and Energy Resources software, ANYbotics, and AWS share an inside look at how GE Vernova's AI-powered Asset Performance Management (APM) software is opening new frontiers in autonomous maintenance. This on-demand webinar session explores how the shift to SaaS-based APM, built on a microservice architecture, enables integration with robotic operating systems—unlocking new efficiencies and richer asset data than ever before.

You’ll learn how GE Vernova’s APM platform with computer vision integrates with robotics from ANYbotics to deliver continuous, high-fidelity asset inspection. You’ll also learn how these autonomous capabilities, powered by AWS infrastructure, pave the way for scalable, cost-effective, and proactive maintenance programs. We’ll also discuss real-world examples of how energy organizations are using this integrated approach to reduce downtime, extend asset life, and enhance safety—turning digital transformation into operational reality.



--
TRANSCRIPT

Thanks for joining us. Good morning, good afternoon, and good evening. Thank you for taking an hour out of your day to come listen to a really exciting topic that we have on tap for you. Today we're going to be going through our approach to SaaS APM with the integration of robotics for maintenance excellence. You'll hear from a few experts as we go. It's a really important topic, and something we're hearing a lot about from the market, so we're really excited to walk you through how we're looking at the space and bringing some things to market. Quickly before we go: it's a pretty packed schedule, but we're hopefully going to stay around, so feel free to ask questions as we go.
 
We'll have some introductions from our experts here on the call, and we're going to cover the collaboration and why we're doing the work we are. We're going to get into the Autonomous Inspection products, talk through ANYbotics and what they do and why we partner, and then get into the underlying technology that we leverage with AWS. And then we'll try to answer as many questions as possible, whether through the chat or live at the end. Again, these are just general housekeeping items: feel free to submit your questions in the Q&A chat box. There is a resource center.
 
We have a lot of good content linked there to learn more. We're going to cover a mix of high-level content and some deep use cases, so there's some extra content for you to look at. If you have any questions or want to learn more, our information is available; feel free to reach out. We have three experts on this call, each with domain expertise in different areas, so if there's more information that you want to explore, we'll make sure you have it. And again, any questions we don't get to live, we'll follow up on within 48 hours with the right team copied, to answer those questions and get you the information that you need. As for me, my name is Ryan. I lead product marketing here for the Power and Energy Resources software business, with a focus on our platform and AI initiatives. I have the pleasure of working directly with this team on the partner marketing side, and I'll be your moderator, so as questions come in, I'll be moderating them.
 
I'll see if there are any that are trending and ask those questions to our experts as we go. So again, feel free to add to the chat. And without further ado, we'll get it over to the experts to introduce themselves, and we'll be underway. Sounds good. Thank you, Ryan. Hi, my name is Neha Joshi. I'm the computer vision product expert at GE Vernova. My product is Autonomous Inspection, which we will be talking more about today. I'm very, very excited that we will be talking about our partnership with ANYbotics and AWS and how it is transforming robotic inspection in the energy space. Thank you. Over to you, Oussama. Hello. Good morning, good afternoon, and good evening, everyone. I'm Oussama Darouichi, Global Director of Partnerships at ANYbotics. Very excited about the partnership with GE Vernova.
 
And also excited about the work we're doing together. I've been working with them for a couple of years now, and I'm looking forward to bringing this full solution with GE and AWS to you and to the broader industrial landscape. Looking forward to sharing more information on this with you. Over to you, Dario. Thank you very much, Oussama. Hi, my name is Dario Rivera. I'm a Principal Solutions Architect here at AWS. I've been around for about 12 years at AWS, in a number of different environments. Happy to be talking with you about our relationship with ANYbotics and GE Vernova, as well as diving deep on SageMaker, some of the capabilities it offers, how the platform you'll be learning about is aligned to those capabilities, and how it might be able to solve some of your problems as well. So with that, I'll give it back to Ryan. Perfect. All right, let's get on to why we're here today. The big picture is the convergence of what is referred to as physical AI, which Oussama will get into, and digital AI, which both Dario and Neha will touch on. There are some big-picture items here, sourced from ABI Research, Innovate Energy Now, GlobeNewswire, and some others, on why we're here: organizations on the specialty manufacturing and chemical side, primarily in oil and gas, expect to implement AI-powered maintenance solutions by next year.
 
And AI, when we talk about it, is a pretty big umbrella. It's inclusive of machine learning, agentic topics, and some other elements that are getting some of the hype right now. We're really going to focus in on the machine learning side of things with our computer vision product. 44% of those polled in this research have adopted or are piloting AI in maintenance, meaning maintenance-specific activities across robotics or software. And based on GlobeNewswire research, organizations consider preventative maintenance core to what they do: they want to get ahead of their failures, and keep getting ahead of those potential failures, versus being reactive. So when you look at the manufacturers, think about producing widgets or other items, or in the gas power space your turbines and controls: there is some form of robotics, whether that's fixed arms or mobile robotics, which we'll hear about today. And we're expecting to see some pretty large growth, which Oussama will hit on. So how do we at GE Vernova, with AWS and with ANYbotics, get ahead of some of this growth to better support your organizations in the next era of maintenance?
 
So when we think about why we're here today, the convergence of physical AI and digital AI is really going to help with a few things, and our Asset Performance Management can help with that. Continuous inspections: we can help further centralize your data, your images, emissions data, and inspection data, and you'll hear more on the use cases from Neha and Oussama. And there's the multimodal context: how do you take data sources that aren't just ones and zeros, using different sensor types and different cameras, bring them into one space, and actually analyze that and turn it into information that can be used across the applications you have deployed, or made accessible to your operators, maintenance teams, and engineering? A big part of that is that you're going to hear a lot of topics today around automation and AI.
 
Our approach here, across digital and physical, is to keep humans in the loop on how we make these decisions. So you won't hear anything around full agentic today, and you won't hear anything about automated workflows; you will hear a little bit of that from Oussama on operator routes and how that works. But our focus here at Vernova is to make sure there's a human in the loop to make those critical decisions for your business. And as a quick reminder of how we think about this today: as we go deeper and you look across our Asset Performance Management portfolio, especially a SaaS deployment, which is where AWS comes into play, we're looking at how we can help get more data into our APM ecosystem to leverage to help you make decisions. A lot of you on the call are likely familiar with our portfolio today. Neha is going to talk through a little bit of the emerging use cases and how we're thinking about embedding this technology.
 
If you focus on the bottom, your Essentials and Autonomous Inspection are really our scalable microservice foundation for how we deploy our software, and a lot has gone into that work to make what you're about to hear possible. And how we think about it going forward is: how do we use this data collected from robotics and actually turn it into time series or other data elements that you can then leverage in your APM applications, to give more fidelity to your asset data and your assets as a whole? So without further ado, I'm going to hand it over to Neha, and we're going to get into the product side and go from there. Over to you, Neha. Sounds good. Thank you, Ryan. So folks, before we dive into the entire end-to-end solution on robots, let's first understand the computer vision product itself, and then we can see how it fits into this entire end-to-end robotics solution and how ANYbotics and AWS are helping us. When we did our voice-of-customer sessions, we saw that there are two areas: manual inspection and sensor-based inspection.
 
These are the two areas where our energy sector customers are still struggling with different problems. Manual inspection is resource intensive: you have to send a person to do inspections at unsafe locations. Manual inspection is also delayed or reactive most of the time, so it's very hard to tackle a problem before it occurs; with the manual inspection process as it was, you cannot even predict something that is about to happen. Sensor-based inspections, on the other hand, get to be very expensive, because you have to attach the sensor, and sometimes you have to bring your asset down to attach the sensors. We have also seen that sensor-based inspections lose the visual context in some sense: they cannot capture surface-level issues.
 
So what we have come up with is a computer vision based product that can automatically detect the issues that are happening at the surface level. The end-to-end workflow has four steps. Step number one is collecting the images: on the premises, we support fixed cameras, we support mobile inspectors going around taking pictures, and now we are starting to support robots as well. In the future, we will also support drones and satellites; that's on our roadmap. Step number two is uploading those images into the cloud: all the images captured in step one get automatically uploaded. Step number three is where our Autonomous Inspection product runs computer vision based deep learning models to analyze those images and run image analytics on top of them. And step number four is sending the results: the readings or, depending on the use case, all the data goes to the time series, and you can also send alerts in our APM system. So basically it's a four-step process. There are some optional steps on our roadmap where you would insert a human into the loop to review the results coming back from the machine learning model, give a thumbs up or thumbs down, or revise the prediction, and then retrain the model so that the next time a similar image comes in, the results will be correct as per the human review. So this is the process.
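As a rough illustration, the four-step flow Neha describes could be sketched in Python. Everything below is a hedged, stubbed-out sketch: the function names, field names, and values are hypothetical placeholders, not the actual Autonomous Inspection API, and the upload and model steps are simulated.

```python
# Illustrative sketch of the four-step Autonomous Inspection flow.
# All function and field names here are hypothetical, not the real product API.

def collect_image(camera_id: str) -> dict:
    # Step 1: a fixed camera, mobile inspector, or robot captures an image.
    return {"camera_id": camera_id, "image_bytes": b"...", "asset_id": "pump-101"}

def upload_to_cloud(capture: dict) -> str:
    # Step 2: in practice this would upload the image to cloud object storage;
    # here we just return a placeholder URI.
    return f"cloud://inspections/{capture['asset_id']}/{capture['camera_id']}.jpg"

def analyze_image(image_uri: str) -> dict:
    # Step 3: a deep learning model (e.g. gauge reading) analyzes the image.
    # Stubbed with a fixed reading for illustration.
    return {"image_uri": image_uri, "reading": 72.5, "confidence": 0.93}

def publish_result(asset_id: str, result: dict, alert_threshold: float) -> dict:
    # Step 4: send the reading to the time series store and flag an alert
    # in APM if it crosses the configured threshold.
    return {
        "asset_id": asset_id,
        "value": result["reading"],
        "alert": result["reading"] >= alert_threshold,
    }

capture = collect_image("cam-7")
uri = upload_to_cloud(capture)
result = analyze_image(uri)
record = publish_result(capture["asset_id"], result, alert_threshold=80.0)
print(record)
```

The optional human-in-the-loop review Neha mentions would sit between steps 3 and 4, correcting `result` before it is published and feeding the correction back into model training.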
 
And now you can see how the ANYbotics robot is going to help us bring the images automatically into our ecosystem. At step number three, like I mentioned, we have deep learning and machine learning models running alongside our APM products. We have a gauge reading machine learning model that can automatically understand the needle position. You can set up threshold values so that when the needle reaches a certain number you get, say, a medium or high priority alert, and you can configure that in the UI itself in Autonomous Inspection. We support circular as well as square gauge readings at this point. The second one we have is thermal profiling, which is a statistical model that we run. On top of the model you can draw regions, and based on the region you can identify hot spots and cold spots; again, you can set up alerts as well as get these readings into your time series. You can draw up to 15 regions and monitor an entire transformer, for example, at this point. There are also a couple of models that can understand change detection at the surface level, so they can understand corrosion: they can figure out that there is corrosion, that there is a change happening in the asset, and notify you.
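The threshold behavior Neha describes for gauge readings can be illustrated with a small function. The band values below are invented for the example; in the product, these thresholds are configured per gauge in the Autonomous Inspection UI, not in code.

```python
def classify_gauge_reading(value: float, medium: float, high: float) -> str:
    # Map a gauge needle reading to an alert priority based on
    # user-configured thresholds (illustrative logic only).
    if value >= high:
        return "high"
    if value >= medium:
        return "medium"
    return "normal"

# Example: thresholds of 60 (medium) and 85 (high) on a 0-100 gauge.
print(classify_gauge_reading(40, medium=60, high=85))  # normal
print(classify_gauge_reading(70, medium=60, high=85))  # medium
print(classify_gauge_reading(90, medium=60, high=85))  # high
```

The same pattern extends to the thermal profiling regions: each of the up-to-15 drawn regions would carry its own hot-spot and cold-spot thresholds evaluated against the region's temperature reading.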
 
There is another model, which is kind of an extension of these existing models, that can understand the severity of the corrosion. So it not only detects the corrosion, it also gives you the severity, and based on that severity you can make the maintenance decision that you want to take. You can create that recommendation as well, create an alert, and create a case out of that. All of this gives you data in time series, and once the data gets into time series, your further APM applications like Rounds Pro or Mechanical Integrity, or even Predictive Analytics, can run on top of it, and you can get the maintenance action flow going. We also have an MCC panel monitoring deep learning model, which is in the pilot stage right now. So if you have any requirements for monitoring MCC panels, for things like lights, switches, and digital panel readings, please let us know. We would be happy to work with you.
 
As you can see, these are all machine learning models, so they need training to perform as per expectations, and the confidence level goes higher the more images you give them. So when we onboard our customers, we also get their pictures to train the model, so that the model is not a generic model but is trained on the customer's images. We are going to extend this solution to ANYbotics as our mobile sensors for capturing images in the flow. AWS services will help us in two areas: robot as a service for collecting images, as well as our Essentials platform for hosting these deep learning models in real time. So with that, I'm going to hand it over to Oussama to take you through the robotics journey. But before I transition, let me see if there are any questions in the Q&A. Ryan, any questions? Yeah, we have one. The first question is: great use cases; what's the approach to use cases outside of the ones we're seeing on the screen right now? How do we approach it with customers to help support other use cases? Great question. So this is just the beginning; this is just the start. We have launched our product with these five machine learning models.
 
But at the end of the day, this is software, and we are building new machine learning models. Among the new emerging use cases we are hearing about right now: clearer detection; thermography (thermal profiling and thermography are different things); sound and video processing; gas leak detection; and other types of detection. Our process is going to be working alongside the customer: we will build our models, train them with customer data, and then deploy those models into our ecosystem. Right. Perfect.
 
And we have a question as well before we move on; I think it's a pertinent one on this use case. From Raj: how accurate is the estimation and classification of corrosion severity? If you can talk to that at a high level. Yes, absolutely. So, Raj, we actually partnered with one of our customers when we brought this model into production. What we did, again, was train the model with customer images, and then we worked with them to make sure the confidence level goes higher. That's going to be our normal process. The severity level could be three or could be four, so we train the models and work with their SMEs to get to the right level of accuracy. We are also following an industry integrity standard, so we are working with both sides so that we can bring the standard into this. And I would be happy to work one on one with you to make sure that the accuracy is high enough before we deploy your model into production. Perfect. All right. Great.
 
At this point, let's move over to Oussama, for time. Neha, thank you for answering the questions. Again, everyone, the chat box is open; if we don't get to it live, please keep asking and we will follow up. Great questions so far. Over to you, Oussama. Absolutely. Thank you very much. Thank you, Neha. Hello, everyone. I'm Oussama, and I'm looking forward to walking you through the ANYbotics journey and the physical AI part of this presentation. So, who is ANYbotics? We are a robotics company based in Zurich, Switzerland, with an office in San Francisco. We have over 200 experts working on robotics, AI, and industrial automation, and our objective is to deliver safe, reliable inspection data in the world's most demanding industrial environments.
 
Therefore, we work with dangerous and difficult industries such as oil and gas, chemical facilities, power generation, utilities, metals, and mining. ANYbotics has been pioneering autonomous robots for industrial inspection, and we are active across the globe, with operations from North America and Brazil to Europe and APAC countries, where we also have a network of partners to service and deploy our robotics solutions. With GE we have a global partnership at the technological level to bring you the full solution, together with AWS. As a company, our ANYmal solution aims at providing more data-driven insights, and this data has the purpose of addressing one or more of the three challenges you see here. First, we aim at de-risking industrial inspection operations by putting robots instead of humans in hazardous environments. Second, we aim at helping our clients reduce the pressure on their costs by avoiding equipment downtime through more data; the GE APM solution, for example, turns that data into insights and actionable work orders. And finally, last but not least, in many critical markets where we are active, we see more and more of a shortage of skilled operators within our clients' facilities.
 
So our clients ask us more and more: hey, can we use robotics to automate difficult, repetitive, boring jobs, so we can put the workforce into more demanding, analytical work? These are the challenges we are addressing with our solution. And more data, as Neha mentioned, can come from having more people generating more insights; of course, if you have a workforce shortage issue, that is not an easy solution. More data can also come from more sensors, but at some point there are only so many sensors you can deploy in your facility, because otherwise you will simply destroy the return on that investment.
 
That is, if you sensorize everything everywhere. The change-making proposition that ANYbotics is providing is to use advanced autonomous mobile robots to move the sensors around the assets. We flip the paradigm: instead of having fixed sensors on assets, we have mobile sensors. And when you think about that, the cost of sensoring your assets dramatically decreases as you deploy this robot. Thanks to this dynamic Autonomous Inspection solution, you can get even more data and faster operational insights than with traditional approaches, where you have to rely only on human and fixed-sensor based technologies. This is something we are seeing as an evolution in the industry. Of course, autonomous industry has been around for many years now; we have had automated factories for many years, and we see the automotive industry, for example, adopting robotic arms very well. The second wave was automated warehousing: with Covid, we all remember the workforce shortage they had when everyone was ordering online and companies were giving signing bonuses for people to go work in the warehouses. And now we see those challenges also being translated into more dangerous environments, hence enabling autonomous industry.
 
This also applies to autonomous inspection and maintenance. ANYbotics is on the market with two solutions, ANYmal and ANYmal X. Both are mobile robots that operate completely autonomously. ANYmal, the robot you see here on screen, has been a fully certified product on the market for over two years now; it is FCC certified and completely water and dust protected. ANYmal X, the one on the right, is an Ex-certified robot for Zone 1 and Zone 2 environments, particularly relevant for oil and gas and chemical applications. As you can see in the video on the right side, the robot has been designed for industrial purposes: extremely durable, extremely solid, and able to tackle difficult environments, whether indoor or outdoor, and the industrial environment as it is. I'm happy to give you more details on those technicalities as a follow-up. At the end of the day, what we want to do is bring this mobile sensing technology to benefit your daily operations, whether we're talking about routine operator rounds, freeing up part of the personnel for higher-level tasks, or providing remote operations. For example, it can be something very close by, like a confined space.
 
You don't want to go through the hassle, so you just send in the robot to gather the data, with increased spatial awareness, or something even more ambitious, like offshore deployments and offshore operations. We also want to provide value for your asset condition monitoring by providing richer, repeatable, high-quality data to your systems. And of course, we increase site safety, for example by having robots walking around monitoring gas concentration conditions or performing first responder missions. All of this is enabled by physical AI technology, which brings you different capabilities depending on the sensor configuration that you have. The standard configuration comes with a visual camera: 4K resolution, high quality pictures, high quality videos, 20 times optical zoom. With that, as Neha mentioned before, we unlock gauge reading analysis, and you can unlock valve position reading. The standard payload also has a thermal camera for thermographic use cases, as well as a directional microphone.
 
On top of that, you can add an acoustic imaging payload, which will detect partial discharge, detect abnormal vibrations, or localize gas leakages, steam leakages, and vacuum leaks. And last but not least, we also have a gas concentration payload for toxic gases or hydrocarbons: the robot walks around, sniffs the environment, and tells you if a specific gas is present, depending on the cartridge you have in the back of the robot. And all this data is flexibly integrated into GE's APM solution through the REST API, also leveraging the AWS infrastructure.
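Oussama notes that the robot's sensor data flows into GE's APM solution through a REST API. As a hedged sketch of what one such integration message might look like (the field names, endpoint, and values below are invented for illustration and are not the actual API contract):

```python
import json

def build_inspection_payload(asset_id, sensor, value, unit, timestamp):
    # Assemble one inspection observation as a JSON-serializable document.
    # Field names are hypothetical; a real integration would follow the
    # APM API's documented schema.
    return {
        "assetId": asset_id,
        "sensorType": sensor,
        "measurement": {"value": value, "unit": unit},
        "timestamp": timestamp,
    }

payload = build_inspection_payload(
    asset_id="transformer-12",
    sensor="thermal_camera",
    value=78.4,
    unit="degC",
    timestamp="2024-05-01T10:15:00Z",
)
body = json.dumps(payload)
# A real client would POST `body` to the APM ingestion endpoint, for example:
# requests.post("https://<apm-host>/api/inspections", data=body,
#               headers={"Content-Type": "application/json"})
print(body)
```

One payload per observation keeps the mapping to APM time series simple: each message carries an asset identifier, a sensor type, a value with its unit, and a timestamp.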
 
As you will discover afterwards with Dario. These autonomous operations, this mobile sensing technology, can work in two ways. Either you have completely autonomous inspection activities, with scheduled missions for your robots at different moments of the day to do different kinds of things, checking temperatures, reading gauges, and so on; you have very large flexibility when it comes to mission configuration and mission execution. Or you may want at some point to run one specific activity with the robot to collect one specific data point; you don't want to run the whole mission just to get one data point, or a few data points around one specific asset. In that case, you can do what we call pick-and-act or remote control missions: you click on a specific asset in the 3D representation in the brain of the robot, the robot reaches that inspection location, and then you gather the data using one or more of the sensors the robot is carrying. Our clients are using this robot in many different ways, from oil and gas to metals, going through utilities, power plants, and so on. So here are some examples of use cases.
 
But of course, we're more than happy to follow up with you and hear about your specific use cases. This can be, for example, acoustic monitoring of motors to see if there is a fingerprint change over time; it can also be a thermal inspection of a specific pump to detect electrical failure; we can try to detect steam leakages, or, for example, detect the presence of H2S in a specific environment. I've been working for a couple of years now with the O&M team from GE Vernova in power plants, and the use cases running there can be categorized into two buckets: verifying dangerous areas and monitoring critical equipment. Verification of dangerous areas can be, for example, inspecting areas where potential steam leakages have occurred in other similar plants, to understand whether that steam leak is happening in their own plant. When it comes to critical equipment monitoring, they want to understand equipment degradation over long operation periods, for example by using the microphone or thermographic camera, as well as by running autonomous routine inspections. All this data then flows back into the APM software, as Neha presented before, and provides actionable insights, also using the AWS cloud. Now I'm super happy to hand over to Dario from AWS to walk us through the data integration, the data foundation, and how AWS uses the data. Yeah, Oussama, just before we make that jump.
 
We got a couple of questions here for you before we move on. A question from Gan: what type and how many sensors can this accommodate? I'm assuming that's in reference to the ANYmal and ANYmal X robots. Okay, so in the configuration that you see here on the screen on the right side, the robot can carry all the sensors at the same time. Let me pull that slide back up. Sorry about that. Here you go. The sensors here can all be carried at the same time. On the top side, the standard payload is what comes standard, with the LiDAR in the back; this is the basic configuration. But on the same robot, you can simultaneously add the acoustic imager payload as well as the gas concentration meter. Thank you. And then, Neha, I believe we have a quick one for you before we move on. From Chris: there were comments about the integration of ANYbotics in GE's APM; can this solution also be used with and integrated into other solutions? I'm assuming that's in reference to Autonomous Inspection and robotics: can that data be used and integrated into other places as well? Yes.
 
Other places meaning other systems, I believe. So other systems as well, like if they want to move it into APM, but then also into maybe a data platform. We should probably follow up on that one as a deeper dive. So, Chris, we will follow up with a deeper dive on that one, around some of the integration technology we use on the API front, to understand that question further. All right. Great. Dario, over to you. And again, the chat box is open; keep asking questions, and we're happy to keep answering. All right. Well, thank you, Oussama. Thank you, Ryan. Thank you, Neha. Certainly a lot of amazing technology in terms of how it integrates together, and it's just so cool to look at. But what I wanted to showcase for you is really the bricks and mortar that a lot of this platform is built on. With that, let's take a step back and ask: what are some of the challenges that many companies are having now, and where did we see ANYbotics and GE Vernova leveraging some of the capabilities of AWS to solve them? We talked about a lot of data collection: getting all these pictures, integrating them into one place, and then making sense of them, overcoming all these disparate data silos that are very difficult to integrate in one place or another.
 
And then there's governing the data: ensuring you have a clear understanding of not only who's getting access to the data but how the data is being used, with lineage tracking how the data has changed over time. And then scaling with operational excellence: ensuring there's infrastructure behind the scenes so that whatever you're doing with that data (building analytics, AI/ML models, even generative AI) there are capabilities to host that data, process it, and eventually extract the unique insights that allow you to make business decisions that add value for your customers. So with that in mind, we recognize that your data is a differentiator. It's the thing that ensures your business is going to be of value to the customers you're trying to target. And so we see these four capabilities, or categories, that we are constantly focusing on, and all the tools, capabilities, and processes that ensure those categories are met: storing and managing the data, governing it, acting upon it, and the experience. This is really the set of capabilities that AWS targets when it comes to managing all the data your enterprise has to deal with.
 
As we looked at the ANYbotics solution with Autonomous Inspection, we saw that the robot collects a lot of data that goes into S3 on the AWS platform, and then probably flows through a number of different buckets where data processing occurs, crunching the data in such a way that eventually Autonomous Inspection and the ML models used within it can produce the analytics represented on the dashboard: the predictive insights or other operational insights that allow you to make informed decisions. Behind the scenes is the platform from AWS that makes that happen. This is where SageMaker comes in. We're going to do a deep dive on SageMaker and some of the capabilities it offers across all those categories we talked about before, from governing to storage to acting on the data, and then giving your users an experience that allows them to make sense of all the data being collected. With that, there are three tiers we talk about from the SageMaker standpoint.
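As a rough illustration of the first step in that flow, robot captures landing in S3, here is a minimal sketch of a partitioned object-key scheme for inspection images. All names (bucket prefix, site, asset, sensor) are hypothetical, not GE Vernova's or ANYbotics' actual schema; the point is only that prefix-partitioning by site, asset, and date lets downstream jobs prune data by prefix.

```python
from datetime import datetime, timezone

def inspection_object_key(site: str, asset_id: str, sensor: str,
                          captured_at: datetime) -> str:
    """Build a partitioned S3-style object key for one inspection capture.

    Partitioning by site/asset/date keeps related captures together and
    lets downstream processing jobs or queries prune by prefix.
    """
    d = captured_at.astimezone(timezone.utc)
    return (f"raw-inspections/site={site}/asset={asset_id}/"
            f"date={d:%Y-%m-%d}/{sensor}_{d:%H%M%S}.jpg")

key = inspection_object_key(
    "plant-a", "pump-042", "thermal",
    datetime(2025, 3, 1, 14, 30, 5, tzinfo=timezone.utc))
print(key)
# -> raw-inspections/site=plant-a/asset=pump-042/date=2025-03-01/thermal_143005.jpg
```

In a real deployment, the upload itself would go through an AWS SDK call against a bucket; the key layout is what downstream buckets and jobs would organize around.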
 
That is the engine running a lot of the Autonomous Inspection data processing you just heard about from Neha. It starts with an open lakehouse, essentially a data lake, which we'll go into in a little bit. From there, data and AI governance covers all the tools that allow you to crunch that data through different sorts of transforms, as well as see, and secure, how that data is being used. And then the Unified Studio, which is a comprehensive collection of tools that data analysts, data scientists, and machine learning specialists can use very quickly against the data to make sense of it and produce the informative graphics and all the other things we often hear about that make the data useful to everyone trying to get insights from it. So let's start with the first one, the lakehouse. We know that starts with S3. Many of you have probably heard of S3, the Simple Storage Service: essentially hyperscale blob or object storage. Alongside that is Redshift, a cloud-scale SQL data warehouse that is highly integrated with S3 and lets you take a lot of that data and structure it so you can build queries off of it and get very unique insights. What's really useful is that a number of capabilities are built in between both of these tools to give you zero-ETL (extract, transform, and load) integration, so you can do all sorts of things that are commonly built off that data, and you get federated queries built at hyperscale.
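To make the federated-query idea concrete, here is a toy stand-in in plain Python: joining "warehouse" rows (think Redshift) against "lake" records (think S3/Iceberg) without copying either side into the other first, which is the spirit of zero-ETL. All table and field names are made up for illustration.

```python
# Raw inspection readings sitting in the lake (hypothetical records).
lake_readings = [
    {"asset_id": "pump-042", "temp_c": 81.5},
    {"asset_id": "pump-042", "temp_c": 84.0},
    {"asset_id": "fan-007",  "temp_c": 35.2},
]

# Curated asset master data in the warehouse (hypothetical records).
warehouse_assets = [
    {"asset_id": "pump-042", "site": "plant-a", "temp_limit_c": 80.0},
    {"asset_id": "fan-007",  "site": "plant-a", "temp_limit_c": 60.0},
]

# "Federated" join: look up each reading's limit in the warehouse side
# and flag assets whose readings exceed it.
limits = {a["asset_id"]: a for a in warehouse_assets}
alerts = sorted({r["asset_id"] for r in lake_readings
                 if r["temp_c"] > limits[r["asset_id"]]["temp_limit_c"]})
print(alerts)  # -> ['pump-042']
```

In the real services this would be a single SQL query spanning both stores; the sketch just shows why joining curated limits against raw readings, without an intermediate copy, is useful.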
 
We have something called Glue, which allows you to transform the data in a number of different ways and gives you the capabilities built in for preparing the data so that the tools that need to ingest it can make those informed decisions we talked about previously. All of this runs on open-source Apache Iceberg open APIs. So when it comes to integration with other third-party tools, as we talked about here with Salesforce and CRMs, Instagram, and so forth, all the tools used by ANYbotics as well as Autonomous Inspection with Vernova, that integration becomes really simple, given the kinds of tools SageMaker offers.
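As a minimal sketch of the "prepare the data" step such a transform job might perform (field names are hypothetical, and a real Glue job would typically run PySpark rather than plain Python): normalize units, trim identifiers, and drop unusable rows before anything downstream ingests them.

```python
from typing import Optional

def clean_record(rec: dict) -> Optional[dict]:
    """Normalize one raw reading; return None to drop unusable rows."""
    if rec.get("temp_f") is None:
        return None  # drop rows with no measurement
    return {
        "asset_id": rec["asset_id"].strip().lower(),
        # convert Fahrenheit to Celsius during preparation
        "temp_c": round((rec["temp_f"] - 32) * 5 / 9, 1),
    }

raw = [{"asset_id": " Pump-042 ", "temp_f": 212.0},
       {"asset_id": "fan-007", "temp_f": None}]
cleaned = [r for rec in raw if (r := clean_record(rec)) is not None]
print(cleaned)  # -> [{'asset_id': 'pump-042', 'temp_c': 100.0}]
```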
 
All of this comes with fine-grained access control, which brings us to the next piece: data and AI governance. That's built on something called DataZone, which allows you to understand data quality and classification, see lineage, and get fine-grained permissions. Then you have guardrails that ensure the data is not being used in a way it shouldn't be: that the machine learning models, and even the generative AI models built into SageMaker, only output things consistent with the guardrails you apply. Along with that, there's the concept of responsible AI, which AWS pays a lot of attention to, both in reaching out to the community to understand how it positions AI and its responsible use, and in ensuring it can be managed in an easy-to-use way by users of the tools SageMaker offers within the SageMaker catalog. And that's where Unified Studio comes in: it provides a massive set of tools, from SQL analytics via Redshift and Athena, to data processing with Elastic MapReduce (EMR) and AWS Glue, to model development, generative AI app development, and data streaming, to business insights with something called QuickSight, which gives you a very quick way of prompting against the data and generates dashboards for you automatically based on what you prompted.
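To illustrate what fine-grained permissions mean in practice, here is a hedged sketch of column-level access control of the kind a governance catalog enforces: each role is granted specific fields, and a query only ever sees the fields its caller is entitled to. The roles and fields here are invented for illustration, not DataZone's actual API.

```python
# Hypothetical role-to-column grants (illustrative only).
PERMISSIONS = {
    "analyst":  {"asset_id", "temp_c"},            # no location access
    "engineer": {"asset_id", "temp_c", "site"},    # full access
}

def redact(row: dict, role: str) -> dict:
    """Return only the columns the given role is permitted to see."""
    allowed = PERMISSIONS.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in row.items() if k in allowed}

row = {"asset_id": "pump-042", "temp_c": 84.0, "site": "plant-a"}
print(redact(row, "analyst"))   # -> {'asset_id': 'pump-042', 'temp_c': 84.0}
print(redact(row, "engineer"))  # full row, including 'site'
```

The same pattern extends to row-level filters and to guardrails on model output: a policy lookup sits between the caller and the data.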
 
And of course, search analytics with OpenSearch. All of this provides the capabilities you commonly look for to have great success with the data, from notebooks to visual editors to generative AI IDEs with Amazon Q Developer, letting you experiment, train, prepare, integrate, and run queries, all the way to prompting against the data to get the informed decisions we're all hearing about in the community today around generative AI and AI/ML usage. One thing we want to highlight here is the ability to move quickly on the data: ingesting it, preparing it, and then having models readily available that you can iterate on to produce the fine-tuned models for your use case, all supported under SageMaker, which has a hosting environment that allows these models to execute at scale, so you can start moving and building fast on the capabilities you're targeting. With that, I'll open it up to any questions Ryan may see from you that are related to the SageMaker platform or anything AWS-specific.
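The ingest-prepare-infer flow highlighted above can be sketched as a chain of plain functions. This is a toy model only: a real deployment would use managed pipelines and a hosted model endpoint, and every function and threshold here is a hypothetical stand-in.

```python
def ingest(raw: list) -> list:
    """Accept raw readings; drop missing values at the door."""
    return [x for x in raw if x is not None]

def prepare(xs: list) -> list:
    """Min-max scale readings to [0, 1] so the 'model' sees one range."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def infer(xs: list, threshold: float = 0.9) -> list:
    """Stand-in 'model': flag indices of readings near the observed max."""
    return [i for i, x in enumerate(xs) if x >= threshold]

# One pass through the pipeline: the 95.0 reading stands out.
anomalies = infer(prepare(ingest([70.0, 71.0, 69.5, 95.0])))
print(anomalies)  # -> [3]
```

The point of the composition is iteration speed: each stage can be swapped or fine-tuned independently while the overall flow stays fixed.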
 
Yeah, definitely. And we still have the Q&A box open for a few minutes as well, so keep asking as we go through a couple that came in. Dario, great coverage there. I think responsible AI is really important to talk through. As a reminder to those listening: two years ago we entered a strategic collaboration agreement with AWS. That's actually how the ANYbotics relationship came to be; with our gas power teams, we've been thinking for quite a while about how to use this technology in energy, and how to use it responsibly. So I just wanted to double down on that, because the approach we're taking, as you heard from Neha on how we train and develop models and how we test them, I think is really important as organizations think about going this route. So, really good topic on the responsible front. But on the AWS side, Dario, a question for you.
 
There's a question here from someone in a region where they're looking for more of a local point of presence, a local deployment. Can you give a little highlight on how AWS is approaching markets that have data sovereignty requirements? I believe this question comes from the Saudi region. Do you mind addressing that a little bit? Yeah. What we see all the time is customer demand coming from all over the world, and we obviously have to work with local governments to align on building regions within specific areas, particularly when we talk about Saudi Arabia, which has a number of different dynamics associated with it. But one thing that is very clear is that when a region is positioned within a given country, whatever that country is, the data stays within that region and will never leave it unless you, as the owner of the data, specifically choose to move it to another region or another facility. Regions are set up with something called availability zones; there are three or more availability zones per region, and each availability zone represents a collection of data centers.
 
They're connected together through hyperscale networking, and you may see data being moved around within those availability zones based on your architecture, or, in the case of some region-level services like S3, copies of data may be distributed across availability zones, but it always stays within that region. That's important to keep in mind with regard to data sovereignty in regions that have strong sovereignty requirements. Hopefully that answers it. That's a great answer; I know it's a hot topic as well. On that thread, Neha, this might be for you and Dario: how are we thinking about introducing new model types or technology into what we just talked about? That might be an expansion of the machine learning capabilities, or generative AI, which is obviously top of people's minds. I guess, Neha, to start with you: how are we thinking about exploring some of these additions to what we're doing, and what does that mean? Yeah. For our roadmap, we are definitely looking at various areas around the use of visual language models, VLMs. Similar to LLMs, there are visual models that are generally available that we can tap into. And there are use cases we see from our customers, like: if the problem hasn't happened yet for that asset, how do you train your model?
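The residency guarantee just described, replication across availability zones but never outside the chosen region, can be modeled in a few lines. Region and AZ names below follow the general AWS naming convention but are used purely for illustration.

```python
# Illustrative region-to-AZ map (names follow the usual convention,
# but this is a toy model, not a live service inventory).
REGIONS = {
    "me-south-1": ["me-south-1a", "me-south-1b", "me-south-1c"],
    "eu-west-1":  ["eu-west-1a", "eu-west-1b", "eu-west-1c"],
}

def replicate(obj_key: str, region: str) -> dict:
    """Place one replica of the object in each AZ of the given region."""
    return {az: f"{az}/{obj_key}" for az in REGIONS[region]}

placements = replicate("raw-inspections/img001.jpg", "me-south-1")
# Every replica's AZ belongs to the selected region: data never leaves it.
assert all(az.startswith("me-south-1") for az in placements)
print(sorted(placements))  # -> ['me-south-1a', 'me-south-1b', 'me-south-1c']
```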
 
I mean, that's the basic question. When we create the models and say we train the model on your assets and your problems and issues, to identify those, how do you even train it using the existing images? So we are looking into using VLMs as well as synthetic data generation; that's another area we're exploring. Also on our roadmap is bring-your-own-model, though it's an ambitious goal, because having that trained visual model and hosting it on top of SageMaker is something we will have to experiment with as well. But we are definitely looking into those. Awesome, that's perfect, great overview. Oussama, we have a couple for you, and one of these might tie in to some of the work you do on the ground as well. The question is about how robotics are going to evolve and what ANYbotics' approach is to helping customers and organizations get prepared for that use of robotics. What's your approach to help folks really understand where robots could fit and how they could work for them? Yeah, that's a great question.
 
I think the robotics component is just the physical representation of part of the solution. What really matters is the whole process, from the onboarding and pre-deployment phase, through the deployment phase and the post-deployment phase, and of course the renewal part, the evolution of the technology. The questions here are not only short-term questions but also long-term questions. In the pre-deployment phase we talk about site evaluation: is your site actually ready to host robots? At the end of the day, it's a brownfield facility designed with humans in mind, maybe sensors, but not robots. So what we want is a transparent discussion. We go on site for a site evaluation. After that, we discuss the use cases. We work with you, and also with Neha's team, to understand exactly what the solution should look like to provide value for you. Then you can quantify and expand the potential value the robotic solution can bring. And of course, our engineers come on site, not only to commission the first part of the robotics journey, but also to train your team. There's a skill set that is transferred from ANYbotics to your team, where your team becomes certified robot users, and even in some cases certified robot trainers, so they can train other colleagues within the company.
 
Then, throughout the journey, we have different modalities, from reactive support, in case you have questions, to more proactive offerings, what we call care packages, where we can arrange anywhere from 2 to 8 days a month of our engineers sitting with your team, really discussing how things are going, helping them improve the deployment, and fine-tuning it until the real value is found. So we really have a wide range of services to support you in your robotics journey. Awesome. And then, on the back of that one: in the space we operate in, ATEX compliance is really a big thing. One question was, what's really the difference between ATEX certified and not, and when is the ATEX robot expected to be available? Yeah, good question. When it comes to ATEX certification, it depends on your site. In oil, gas, and chemical facilities, some areas have different ATEX ratings, and of course you cannot go there with robotics technology unless it is ATEX certified; otherwise you need a hot work permit, which is a whole paperwork process, and that defeats the purpose of autonomous operations.
 
If you have someone babysitting the robot and they have to go through a long and tedious process to get the robot validated on site, then you're just defeating the purpose. So this depends on your site regulations. When it comes to availability, the robot will be available for delivery in 2026. We released the first generation a year and a half ago, we learned from multiple deployments, and we've been working to improve it. That robot will be available as of next year for shipment. We already have a nice preorder list, so depending on when you come, it might be later in the year. Great, that's perfect. And I think those are the questions that came through. For folks listening to the recording, you can still ask your questions; this isn't the be-all and end-all. We're still available, and you can reach out again. Our contact information is here if you have more questions or want to talk through deeper use cases; more than happy to. We owe a few people in the chat some follow-up via email on some more technical questions, and we'll be happy to follow up within 48 hours. But first of all, I just want to say thank you, everybody, for joining, and thank you for giving us 45 minutes of your time.
 
And Oussama, Neha, Dario, thanks for bringing your expertise to our customer base and the rest of the community. I think this is a great topic, and something we're really interested in hearing back on: how we can partner and deliver more value via APM, robotics, and the cloud together. So thank you so much. Thank you all for joining, and feel free to follow us on LinkedIn and reach out to us through that avenue as well. Thank you so much for joining. Bye bye. Thank you. Bye bye.