Every fall, GE Global Research holds a scientific gathering called the Whitney Symposium highlighting the latest scientific trends. Last year the two-day event explored industrial applications of artificial intelligence. We sat down with Mark Grabb and Achalesh Pandey, two GE scientists looking for ways to apply AI to jet engines, medical scanners and other machines. “We are starting to see significant performance increases from the combination of deep learning and reinforcement learning, where you have a human in the loop correcting the system,” Grabb said. “Once you build a smooth user experience and get the system going, people don’t even know they are correcting the AI along the way.” Here’s the edited version of our conversation.

“We are also starting to see more applications where AI is becoming part of a ‘living system,’ where it’s continuously learning,” says GE’s Mark Grabb. All images credit: Getty Images

GE Reports: AI research has been around for decades. What makes the present exciting?

Achalesh Pandey: One reason is the huge amount of data that’s become available, together with enormous computing power. There’s also been a complete transformation in the field of deep learning. When you look at Apple’s Siri, Amazon’s Alexa or Google Now, you see the progress in speech recognition. The same thing is now happening in image and video recognition.

Mark Grabb: We are also starting to see more applications where AI is becoming part of a “living system,” where it’s continuously learning. There’s a new analytical structure that’s being used for AI. We are starting to see significant performance increases from the combination of deep learning and reinforcement learning, where you have a human in the loop correcting the system. Once you build a smooth user experience and get the system going, people don’t even know they are correcting the AI along the way.

GER: Can you give us an example of this symbiosis between people and machines?

MG: You can start with a simple Google search. When you look at the results and select the most relevant ones, you become the human part of the learning loop.
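The search example can be sketched as a simple feedback cycle. The snippet below is a minimal, illustrative model of that loop, not GE’s or Google’s actual system: the ranker orders items by a learned score, a person selects the most relevant result, and that choice nudges the scores for the next round. All item names and learning rates here are invented for illustration.

```python
# Minimal sketch of a human-in-the-loop feedback cycle: the system ranks
# items by a learned score, a person picks the most relevant result, and
# that choice adjusts the scores for the next search.

def rank(scores):
    """Return item ids ordered by current score, best first."""
    return sorted(scores, key=scores.get, reverse=True)

def record_click(scores, clicked, lr=0.1):
    """Reward the item the human selected; decay the rest slightly."""
    for item in scores:
        if item == clicked:
            scores[item] += lr * (1.0 - scores[item])
        else:
            scores[item] -= lr * scores[item] * 0.1
    return scores

scores = {"doc_a": 0.5, "doc_b": 0.5, "doc_c": 0.5}
for _ in range(20):                    # 20 simulated searches
    results = rank(scores)
    record_click(scores, "doc_b")      # the human keeps choosing doc_b
print(rank(scores)[0])                 # doc_b now ranks first
```

The user never explicitly labels anything; the selections alone steer the ranking, which is the sense in which people “don’t even know they are correcting the AI.”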

AP: You can train Alexa to recognize your speech patterns and accent. It converts your voice into text and then searches through the database to answer the query or carry out the task. Amazon’s Echo, which is powered by Alexa, can now control GE lights, dishwashers, ovens, washing machines and other appliances.

Futuristic technology background

“We’ve started creating digital replicas of machines in the cloud,” says GE’s Achalesh Pandey.

GER: Let’s move on to bigger machines. How can you apply AI to a gas turbine?

MG: It’s the same principle. At GE, we are writing software like Predix, the cloud-based operating system for machines that allows us to connect them to the Industrial Internet. But we also have a tremendous number of domain experts. There’s a lot of physics and domain knowledge required to build good analytics and machine learning models. We have actually built AI systems that help data scientists capture the domain knowledge of the people across GE building these models more quickly and effectively. So AI comes in even in the development of the analytics themselves.

Manufacturing and service make up the next important set of applications. An AI system can provide workers with the intelligence they need to make an informed decision — say, whether to scrap or repair a turbine blade. But the human makes the final call based on his or her expertise. All of that information is collected in a closed loop to make the system smarter and smarter, so next time around it provides even better insights.
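The closed loop described above can be sketched in a few lines. Everything here is hypothetical — the feature names, the threshold, and the trivial rule standing in for a trained model are invented for illustration, not drawn from GE’s actual inspection software. The key structure is that the model only suggests, the human decides, and every decision is logged for retraining.

```python
# Hypothetical closed-loop decision aid for blade inspection: the model
# suggests scrap or repair, the inspector makes the final call, and every
# (features, suggestion, decision) record is logged so the model can be
# retrained on the cases where the expert disagreed.

training_log = []

def suggest(blade):
    """Naive rule standing in for a learned model."""
    return "scrap" if blade["crack_mm"] > 2.0 else "repair"

def inspect(blade, human_decision):
    """Record the human's final call alongside the model's suggestion."""
    entry = {
        "features": blade,
        "model": suggest(blade),
        "human": human_decision,   # the expert has the last word
    }
    training_log.append(entry)
    return human_decision

final = inspect({"crack_mm": 1.4, "hours": 12000}, human_decision="scrap")
# The disagreement (model suggested "repair", human chose "scrap") is now
# in the log and can drive the next training round.
```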

Finally, you have your pure “machine-inside-the-machine” AI as well. This could start with computer vision. Once the machine can see, it can grab things and move them somewhere. That’s just the beginning.

AP: We can also use AI to design, build and operate things in a more intelligent way. It will help us eliminate downtime in our factories and help customers like hospitals operate medical scanners more efficiently, allowing them to scan more patients.

We’ve started creating digital replicas of machines in the cloud. It could be any machine — an MRI scanner, a jet engine or a wind farm. We call this the digital twin. We can then run simulations that allow us to optimize the operations for multiple different outcomes.
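A toy version of that idea can be sketched as follows. The “physics” below is deliberately fake — a real digital twin would be calibrated against the fielded machine’s sensor data — but it shows the shape of the technique: simulate several outcomes as a function of an operating setting, then pick the setting that best trades them off.

```python
# Toy "digital twin" of a turbine: simulate power output and component
# wear as a function of rotor speed, then choose a setting that
# co-optimizes both outcomes. The formulas and numbers are illustrative.

def simulate(rotor_speed):
    """Fake physics: power has diminishing returns, wear grows fast."""
    power = rotor_speed * 10 - 0.5 * rotor_speed ** 2
    wear = 0.2 * rotor_speed ** 2
    return power, wear

def best_setting(speeds, wear_weight=0.5):
    """Pick the candidate speed with the best weighted trade-off."""
    def score(s):
        power, wear = simulate(s)
        return power - wear_weight * wear
    return max(speeds, key=score)

setting = best_setting([2, 4, 6, 8, 10])
print(setting)   # 8: more power than 6, less wear penalty than 10
```

Changing `wear_weight` shifts the chosen setting — the same twin can be optimized “for multiple different outcomes” just by reweighting the objectives.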

“I believe that in the next five years AI will start moving onto machines,” Pandey says.

GER: So you are building an outcome factory?

MG: Exactly. It’s a data factory. Just think about Google. Think about the work GE is doing connecting machines to the internet. We are building an analytical machine that is co-optimizing all of your outcomes. It has AI portals where people can give guidance, and also receive it as the optimal policies come down from the top. We know that this is where analytics is going.

AP: Our Predix platform will become a knowledge broker. Our people will be able to converse with it and exchange information. After a few iterations, we come out with the right solution. It will be accessible to anyone. In a way, we are democratizing data science and domain knowledge and building a system where all parties can work in a symbiotic manner and at scale. We are building a powerhouse of knowledge. I think this is the future.

GER: What else do you see in the future?

AP: I believe that in the next five years AI will start moving onto machines. Eventually, the machines will be so intelligent they will start collaborating among themselves and optimizing things. This is nirvana.