Dave Bartlett is a data scientist who spends much of his time sifting through gigabytes of data and seeking useful bits of information. But on his best days, he digs up treasures he hasn’t even been looking for.
Earlier this year, Bartlett, who is the chief technology officer at GE Aviation, and his team discovered that jet engines flying between certain pairs of cities in the Middle East and Asia performed differently than identical engines serving on other routes. “We looked deeper at the data coming in and saw that the engines experienced different wear patterns,” Bartlett says. “We started seeing correlations and patterns involving air quality, the weather and pilot behavior. This finding gave us new clues about why it was happening and possible courses of action to manage it.”
Bartlett and the team can gain such insights because of Predix, a powerful new software platform GE developed specifically to connect people, data and machines over the Industrial Internet. The company just announced that it will open Predix to other software developers.
Bartlett is in New York today to attend GE’s Minds + Machines conference (you can watch a livestream at the bottom of the page). He talked to GE Reports editor Tomas Kellner about Predix, the Industrial Internet, and the value of big data.
Tomas Kellner: I want to start by asking about the Predix platform, but first: I’m getting caught up on the word “platform.” What does it mean in the software context?
Dave Bartlett: Let’s start with a train platform, which is a useful analogy. It’s a safe, secure, efficient and reusable structure that allows you to easily board a train. It’s also very scalable, allowing a single rider as well as a crowd to get on and off. It also provides other services. It has kiosks where you can buy a ticket, a cup of coffee and a paper. When I lived in New York, there was one platform I used where I could even drop off and pick up my dry cleaning. Without a platform, getting on the train would be pretty dangerous and it would require a lot of effort. It would be different every time, depending on the speed of the train and whether there is someone waiting to pull you up.
TK: Let’s ride that train to the tech world. What does a technology platform look like?
DB: It is similar in many ways. One platform that everybody can relate to is the smartphone. You can use it as a base to develop new applications or to download new applications for personal use. You can use it to make a call, but you can also use it to do many other things safely and efficiently, like surf the internet, take photos, listen to your favorite music or check email. Many of these apps are like that dry cleaning kiosk. They provide a valuable service that makes people more efficient.
TK: Can you take me from a smartphone to Predix?
DB: Predix is also a technology platform, not deployed on a phone that you hold in your hand, but rather behind the closed doors of a data center connected to data lakes and other forms of big data storage. Like Google’s Android or Apple’s iOS operating systems, it has a set of software services that help developers quickly build apps for the industrial internet.
You can also deploy Predix in the cloud to make it more widely accessible and available. There’s even a version that can run on machines like jet engines, gas turbines, and locomotives. But it’s all there to do the same thing: to run apps that allow us to do big data analytics, monitor machines remotely and have them ‘talk’ to each other. Like the train platform, it provides a stable and scalable way to quickly develop and deploy new applications.
TK: Still, do we really need Predix?
DB: Well, theoretically I guess not, but good luck jumping on a moving train! The software world works pretty much the same way. If you don’t have a platform, every time you want to write software, you have to start from scratch and invent everything. That’s not a very efficient or responsible way to do business. Many of the apps for the Industrial Internet share common base services or features to manage how they perform work and share data. Predix provides those services, which translates into faster time to market for new apps and faster value for customers.
TK: Predix is a big data software platform and GE has been a big iron company. Why did GE start developing software?
DB: Software is key to transforming big iron into brilliant iron. Lots of companies can do analytical work. What makes GE unique is our installed machine base with the deep domain expertise of the engineers that built them, tightly coupled with a global network of data scientists. To do this right, you first need the physics-based analytical skills of the engineers who understand why something is happening. Secondly, you need the data scientists who, like miners of minerals, go deep down into the lode of data looking for valuable things to surface. They look for patterns and connections that we may not even have thought about. It’s this marriage of the physical and digital that creates the most powerful result.
TK: I’ve heard data scientists talk about the marriage of operational technology and information technology, or OT and IT. What is it and why is it important?
DB: Most people are familiar with IT, which is a big family that includes desktops, laptops, servers, phones and many other devices. OT, on the other hand, covers the machines: the jet engines, MRIs and turbines that can now communicate data over the Industrial Internet. Bringing these two worlds together is analogous to the union of data scientists and engineers. We need OT and IT together to monitor the machines remotely, detect and adapt to changes, and predict future behavior.
TK: Predix can predict the future?
DB: In a way, yes. It seems like ancient history, but not too long ago, if you wanted to make a phone call, you actually had to find a physical phone handset wired to the wall. A similar situation still applies in many factories. When you want to check on a machine, you have to go where it’s located and watch it run and maybe do some diagnostics right there. But our implementation of the Industrial Internet frees us from that, just as smartphones moved us away from the old limited telephony models.
Predix is a game changer. It gives us the intelligence to tell us what’s going on in a machine, whether something’s wrong, and what we can do about it before it affects our customers. Like doctors, we can also use analytics to predict when machines might fall ill in the future.
Let me give you an example. Like any other machine, a jet engine collects dirt and corrosion and you have to wash it on a periodic basis. This is a big deal since a water wash can increase its efficiency anywhere from 1 to 18 percent. But here’s the rub. If you do it too often, it becomes unnecessarily expensive. If you wait too long, you burn more fuel and wear your parts more, which will also cost you. But with a Predix app, you can tell precisely when it’s the right time to schedule your engine’s next water wash.
We call these apps Predictivity solutions. They run on Predix just like apps run on a smartphone in a secure, scalable and consistent way. Predictivity apps can also keep an eye on locomotives, collect data about heat and vibration, and predict when you need to perform maintenance or replace parts. We have many solutions like this across all GE industries. In fact, this year we are on track to drive over $1 billion of value from them.
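The water-wash tradeoff Bartlett describes is, at heart, a cost-minimization problem. Here is a minimal sketch of one way to frame it; the dollar figures and the linear-wear assumption are illustrative inventions, not GE’s actual analytics:

```python
import math

def optimal_wash_interval_days(wash_cost, fuel_penalty_per_day):
    """Days between washes that minimize average daily cost.

    Toy model: each wash costs `wash_cost`, and the extra fuel burned
    grows linearly with days since the last wash, costing an additional
    `fuel_penalty_per_day` per elapsed day. A wash interval of T days
    then costs, on average per day,
        wash_cost / T + fuel_penalty_per_day * T / 2
    which is minimized at T = sqrt(2 * wash_cost / fuel_penalty_per_day).
    """
    return math.sqrt(2 * wash_cost / fuel_penalty_per_day)

# Hypothetical numbers: a $5,000 wash vs. fuel costs climbing $20/day
interval = optimal_wash_interval_days(5000, 20)  # about 22 days
```

A real Predictivity app would learn the wear curve per engine from sensor data rather than assume it, but the balance between wash cost and fuel penalty is the same.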
TK: If Predix provides such a good value to GE, why is GE opening it up?
DB: A software platform becomes more powerful the more people use it. GE will continue using it, but making it available externally will also allow our customers and business partners to write their own software and become more successful. We want Predix to become the Android or iOS of the machine world. We want it to become the language of the Industrial Internet.
TK: What’s next on your agenda?
DB: The opportunities for solutions based on Predix are almost unlimited. They will enable our teams in the field to more effectively capture data from devices such as borescopes, wearables, and new robotics, as they are deployed. They will unlock new insights and value from data to help our customers be more profitable and provide better customer service, and they will help us get the most out of our supply chains and industrial assets.
The Predix train to better business value and customer outcomes is well on its way. Announcing this week that the platform is open to the public is our way of shouting, “All aboard!”
Nobody wants to be late. But at a busy airline hub like Atlanta or Chicago, even a brief delay in aircraft arrival can result in missed connections and cascade into a major inconvenience. The Bureau of Transportation Statistics reported that this year alone, more than 750,000 flights operated by U.S. carriers arrived late and over 25 percent of flights were delayed by 15 minutes or more or cancelled. The estimated economic costs of these delays, both for airlines and for passengers, are in the billions.
Two years ago, GE Aviation partnered with the consulting firm Accenture to tackle the problem. Their joint venture, Taleris, developed a software system for the Industrial Internet that feeds on data coming from sensors on planes, air traffic, weather and other sources. It could help a large domestic airline prevent 1,000 delayed departures and cancellations and help more than 165,000 passengers get to their destination on time. “A connected device or a machine becomes something entirely new, because interconnectedness opens up entirely new dimensions,” writes GE Chief Economist Marco Annunziata in his new report on the topic. “Combining the digital and the physical accelerates value creation in a way that we are only beginning to understand. GE is building a new kind of industrial company.”
Over the last three years, GE has invested more than $1 billion in software and analytics, and opened up software centers in the U.S., Europe and China.
Annunziata says that GE is in a unique position to connect people, data and machines because of its deep engineering expertise, the size of its industrial base, and Predix, its software platform for the Industrial Internet. There are some 28,000 GE jet engines in service, over 21,000 GE locomotives and 1.4 million GE healthcare devices like MRIs and CT scanners.
That’s a lot of machines, which is particularly useful when they get connected and trigger the network effect, also known as Metcalfe’s Law. In the 1980s, the American electrical engineer Robert Metcalfe stated that the value of a telecom network grows with the square of the number of connected users. He was talking about phones, but the law applies to computers, smartphones and industrial machines as well. “The simultaneous development of interconnected hardware and an enabling software platform under the same roof redefines the very nature of a company,” Annunziata writes. He says that the union of “physical and digital” is transforming GE “from a traditional industrial equipment producer to a full-range customer solutions provider, able to maximize customer outcomes and profitability.”
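Metcalfe’s observation is easy to state in code. A minimal sketch, with the per-connection value k left as an arbitrary assumed constant:

```python
def metcalfe_value(n, k=1.0):
    """Network value under Metcalfe's law: proportional to the number
    of possible pairwise connections, n * (n - 1) / 2, so it grows
    with the square of the number of connected users or machines."""
    return k * n * (n - 1) / 2

# Doubling the number of connected machines roughly quadruples value
ratio = metcalfe_value(2000) / metcalfe_value(1000)  # ~4.0
```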
Click on the infographic to explore the growth of the Industrial Internet.
Predix is already supporting dozens of applications ranging from railways to healthcare, where doctors can use it to track equipment, manage wait times, monitor medicine dosage and better utilize staff.
Customers like Air Asia, Norfolk Southern and Columbia Pipeline Group have embraced the software as a tool that will help them save costs and make them more competitive. Annunziata writes that GE is on track to earn $1 billion this year from software applications, and that “digital-driven industries” will enjoy faster growth and higher margins than traditional industrial players.
GE is also applying big data at home. Annunziata writes that the company is connecting research, design, engineering and manufacturing to speed up the development of new products and respond to customer demand faster. GE calls this approach the Brilliant Factory.
“We have argued in previous works that the Industrial Internet and advanced manufacturing are not only transforming individual machines and systems, but they are also changing the nature of economies of scale, transforming the economic landscape and blurring the lines between manufacturing and services,” Annunziata writes. “In a similar way, industrial companies that combine the digital and the physical become fundamentally different in the way they operate and in the value they can provide to customers and shareholders.”
You can find the full version of Annunziata’s paper here.
When the power goes out, electricity providers are often left in the dark along with their customers. That status quo is what’s keeping Naresh Acharya up at night. He is now planning to use some of the world’s most powerful supercomputers to help keep the lights on, while also allowing wind farms to produce more electricity and making the electrical grid more efficient.
“Right now, the power grid isn’t transparent,” Acharya says. “Grid operators don’t always see when something happens. We want to help them maximize the use of their assets in real time.”
Acharya works as a senior engineer at GE’s global research labs in upstate New York. He says that in order to keep their systems safe, grid operators sit down every few months to figure out the maximum amount of power that can run safely through their systems in the worst conditions.
“These analytical tools have been in place for decades and they are very rigid,” Acharya says. “The worst-case scenario may apply to just a few days during a heat wave or a winter storm. This type of thinking is leading us to overdesign and overbuild the grid. With real-time knowledge, we could be getting much more out of our assets without building out a new grid.”
Acharya’s team at GE Global Research is now working with GE Energy Consulting, the Pacific Northwest National Laboratory and Southern California Edison on a software system that could simulate and control the grid in real time.
Many of the tools currently used by utilities to manage the grid were designed for computers with a single processing core, like traditional PCs. As a result, they cannot take advantage of high-performance computers with multiple cores, which are available today. “Utilities can monitor the health of the power grid, but the problem is that anything can go wrong at any time,” Acharya says. “Today, we can’t find out quickly what are the best actions to take.”
The team is building grid analytics tools for powerful multi-core computers, like the machines at the national laboratory, that can carry out multiple tasks at a time. This method, called parallel processing, allows the team to screen data coming over the Industrial Internet, from sensors, generators and other equipment distributed along hundreds of miles of high voltage wires that make up the grid. The software is able to extract from the data deluge a few dozen key signals that have the biggest impact on the stability of grid. “It tells us where we might have a weak spot,” Acharya says.
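The screening step can be pictured as a map-and-rank over sensor streams: score every stream concurrently, then keep the few signals that matter most. The sketch below is a toy illustration; it uses threads for portability and plain variance as a stand-in stability metric, where the real tool would distribute physics-based analytics across many cores:

```python
from concurrent.futures import ThreadPoolExecutor
import statistics

def instability_score(readings):
    """Toy proxy for a signal's impact on grid stability:
    the variance of its recent readings."""
    return statistics.pvariance(readings)

def screen_sensors(streams, top_k=2, workers=4):
    """Score every sensor stream concurrently and return the names of
    the top_k most volatile signals -- the 'few key signals' an
    operator should watch first."""
    names = list(streams)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(instability_score,
                               (streams[n] for n in names)))
    ranked = sorted(zip(names, scores), key=lambda p: p[1], reverse=True)
    return [name for name, _ in ranked[:top_k]]

# Hypothetical frequency readings (Hz) from three lines
streams = {
    "line-7":  [60.0, 60.1, 59.9, 60.0],   # steady
    "line-12": [60.0, 62.5, 57.0, 63.0],   # swinging badly
    "line-3":  [60.0, 60.4, 59.6, 60.2],   # mild wobble
}
hotspots = screen_sensors(streams)  # ["line-12", "line-3"]
```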
Grid operators will be able to use the system to quickly answer questions like which generators should increase or decrease output, and what is the optimum amount of electricity that should be flowing through the grid at a given time.
The team is already able to apply parallel processing to existing GE power management software developed by GE Energy Consulting, and to speed it up. The scientists will now use their findings to develop new software specifically designed for parallel processing.
The long term goal is to help utilities maximize the use of their systems. This, in turn, could increase the amount of renewable power flowing through the grid.
Some wind farms, for example, turn the blades of their turbines out of the wind when the grid cannot take any more electricity. “The new system will help utilities to predict outages and fix equipment before it breaks down,” Acharya says. “But it will also help them bundle in more renewable power from wind and solar farms without building new grids, which is becoming harder to do.”
Getting shale gas out of the ground is one thing. But taking it to customers is quite another.
American pipeline operators are investing as much as $40 billion every year to maintain, modernize and expand their networks. The shale gas boom is putting operators under pressure to move more gas to market faster and more safely, and many U.S. pipelines have been in service for at least two decades.
“We need an agile and comprehensive pipeline solution that can be delivered quickly and allows for a more real-time view of pipeline integrity across our interstate natural gas pipelines,” says Shawn Patterson, president of operations and project delivery at Columbia Pipeline Group.
Columbia runs a 15,000-mile gas pipeline network linking the Gulf Coast to the mid-Atlantic region and the Northeast. It will soon start using GE software and big data to monitor its network in almost real time, and streamline its operations and planning.
The technology, called Intelligent Pipeline Solution, combines GE software and hardware with Accenture’s data integration expertise. It runs on Predix, GE’s industrial software platform, and links pipelines to the Industrial Internet for the first time.
The world’s pipelines stretch for some 2 million miles, enough to wrap around the equator 80 times. GE estimates that every 150,000 miles of pipeline generates an amount of data equal to the entire printed collection of the Library of Congress, or 10 terabytes.
Brian Palmer, chief executive of GE’s Measurement & Control unit, says that the new system will help customers like Columbia make the right decisions at the right time to keep their assets safe. It will help them send repair machinery and crews where they are needed most, and speed up response time to problems.
The system is designed to harvest data from sensors installed along the pipes and equipment, sync it with external data sources and deliver to customers detailed analytics and risk assessment from key points of the network. “The goal is to help pipeline operators make proactive, rather than reactive decisions,” Palmer says.
The Intelligent Pipeline Solution is the first commercial product GE and Accenture have offered since they formed their software and big data partnership in 2013. The companies expect the system to be operational in the first half of 2015.
Kids sometimes make grown-ups see complicated things in simple ways. GE’s new ad about “brilliant machines” connected to the Industrial Internet is tapping into that power.
The spot features a small boy who can’t speak but whose voice box produces beeps that allow him to talk to toys, the electrical grid, aircraft and many other machines. “Lots of companies have been trying to tell their Industrial Internet story and we had to take a different approach to make it stand out,” says Peter McCallum, senior director at BBDO, the creative agency that made the ad. “We wanted to tell an industrial-scale story at a human level to elicit emotion and ensure that it resonated with far-reaching audiences.”
The story follows the boy from birth, when his primal “beep” causes considerable distress to his parents. But by the time he is in elementary school, his special power allows him to switch the TV to the football game for dad (the ad will air for the first time during the NFL kickoff), restore electricity to an entire town and make planes fly on time. “We liked the idea that it’s his natural language,” McCallum says. “He does not have to put on a cape to have these powers. It’s sort of a metaphor for GE.”
Unlike the boy, the Industrial Internet is real. It could soon link billions of machines and devices ranging from smartphones and thermostats to jet engines and medical scanners.
GE believes the network could add between $10 trillion and $15 trillion – the size of today’s U.S. economy – to global GDP over the next 20 years. The company’s software arm has developed a software platform called Predix that allows railroads, oil drilling companies, wind farms, hospitals and other customers to perform prognostics on machines, reduce downtime and increase efficiency.
The “Boy Who Beeps” is the first in a series of stories and other content that GE plans to roll out through the rest of the year to illustrate the power of the Industrial Internet.
As power outages go, the iguana affair was a mundane one. On July 27, a hapless lizard shorted a piece of electrical equipment in the middle of the Florida Keys and knocked out power for 11,000 local residents. It was an act of nature no outage prevention system would have been able to predict. But since the lizard climbed inside a substation, repair crews were able to locate the short and restore power in 10 minutes. “Spotting problems within the limited space of a substation is relatively easy,” says Chris Prince, application engineer at GE Digital Energy. “It’s a high-density nerve center for distributing power. But good luck finding and fixing a problem that fast when it happens along the miles of overhead and underground lines connecting the substation to the consumers.”
In an era when a smartphone can quickly pinpoint its location on a map, few power providers know that customers have lost electricity before somebody calls them. Fewer still can see whether the outage was caused by a fallen tree branch, a lightning strike or faulty equipment.
Americans are taking notice. A new survey measuring public perception of grid resiliency found that many respondents were willing to pay $10 per month on top of their electricity bill to make sure that the grid becomes more reliable (see the results here).
The survey, which was commissioned by GE’s Digital Energy business, also found that a majority wanted utilities to start using digital communications and social media tools to keep them informed in real time during a power outage. “Consumers want to see investment in technology that prevents power outages and reduces the time it takes to turn power back on,” says John McDonald, director of technical strategy and policy development at GE Digital Energy.
Utilities are getting the message. As wireless networks became more ubiquitous, power companies started placing digital sensors along their lines and inside switches, breakers, smart meters and other devices. The sensors feed grid data to data collection centers where algorithms process it to create a virtual map of the distribution network.
Prince says that the network map resembles a tree where the transmission lines that come from the power generation plants are the roots and the substation is the trunk. “The feeder circuits that reach out through neighborhoods to residential customers are the branches where we’ve traditionally had the least visibility,” he says.
Prince says that an outage event could be due to a single cause or multiple causes nested together. This makes the prime cause harder to find. But digital systems that connect the grid to the Industrial Internet and string together smart sensors, controls, and software can quickly detect and locate trouble, and then isolate a likely problem area along the right branch. “Instead of telling the crew to patrol miles of a line, I can narrow the location to a block or two,” he says. “At the same time, the control system will quickly restore power to the other lines leading from the trunk that did not suffer any damage, and bring power back to those customers sooner.”
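The branch-isolation logic Prince describes can be sketched as a search over the feeder tree: walk outward from the trunk and find the highest branch whose whole subtree has gone dark. The topology and node names below are hypothetical:

```python
def locate_fault(children, dark, root="substation"):
    """Return the highest node whose entire subtree has lost power.
    The fault is likely on the segment feeding that node; every other
    branch off the trunk can be re-energized right away."""
    def fully_dark(node):
        return node in dark and all(fully_dark(c)
                                    for c in children.get(node, []))
    frontier = [root]          # breadth-first from the trunk outward
    while frontier:
        nxt = []
        for node in frontier:
            if fully_dark(node):
                return node
            nxt.extend(children.get(node, []))
        frontier = nxt
    return None                # no fully dark branch found

# A small feeder tree: two feeders off the trunk, two blocks each
children = {
    "substation": ["feeder-A", "feeder-B"],
    "feeder-A": ["block-A1", "block-A2"],
    "feeder-B": ["block-B1", "block-B2"],
}
outage = {"feeder-A", "block-A1", "block-A2"}  # sensors reporting no voltage
fault_zone = locate_fault(children, outage)    # "feeder-A"
```

Instead of patrolling miles of line, a crew can head straight for the segment feeding the returned node, while the control system restores the undamaged branches.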
McDonald says that a number of utilities have already started implementing automation on the feeder lines where they see the most value. NSTAR, for example, brings electricity to 1.1 million customers living in central and eastern Massachusetts. Starting in 2009, the company reached out to GE and started building a “self-healing” grid. Today the system consists of 2,000 smart switches and 5,000 voltage and current sensors. The system was already tested by Hurricane Irene in 2011 and Hurricane Sandy in 2012.
During Irene, 500,000 customers lost power but the system was able to reroute it and turn the lights back on for close to half of them within an hour. NSTAR calculates that the new grid visibility has allowed it to avoid 600,000 customer outages so far. “The concept of grid monitoring has been around for decades,” McDonald says. “But the advances in big data, software, fiber optics and digital wireless communications now really bring it alive.”
Not even lizards can stop it.
GE and Pivotal said they built the first industrial-scale “data lake” system that could supercharge how companies store, manage and glean insight from information harvested from machines connected to the Industrial Internet.
The system, which has already tracked more than 3 million flights and gathered 340 terabytes of data, can analyze data 2,000 times faster than previous methods and cut costs tenfold. It is so powerful that it crunched through, in just 20 minutes, a complex task that would previously have taken a month to compute.
“Big Data is growing so fast that it is outpacing the ability of current tools to take full advantage of it,” said Bill Ruh, vice president of GE Software. Dave Bartlett, computer scientist and chief technology officer for GE Aviation, said that industrial data lakes will help companies predict future problems and run machines more efficiently, sustainably and profitably. They will also help GE maintain and service machines better. “We are getting the most life out of our assets,” he said.
The industrial data lake will have numerous applications across many industries and types of hardware, from jet engines and locomotives to medical scanners.
Bartlett says a data lake can swallow massive streams of data and store it in whatever form it arrives, much like a large body of water drinks from its tributaries.
This is different from a standard data warehouse, where data is classified and categorized at the point of entry. “Instead of slicing, dicing and classifying the data, we capture the metadata, which is data about the data,” Bartlett says. “Metadata provides a more robust and varied context at the time of analysis that’s been missing from conventional data storage.”
Bartlett says that a data lake allows companies to ask many more questions from a given data set than they used to. “A numeric sequence in a database is only as meaningful as the context that can be applied,” he says. “By itself, it is just a number that the data warehouse might translate to what you paid two years ago to overhaul a particular kind of jet engine. But a data lake can provide the metadata to drive numerous analytics associated with that event, including the reasons behind the overhaul and how to better avoid or predict such overhauls in the future.”
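The contrast Bartlett draws, a schema fixed at the point of entry versus metadata applied at analysis time, can be sketched in a few lines. The record shapes and field names below are invented for illustration:

```python
import time

lake = []  # raw payloads kept exactly as they arrived, plus metadata

def ingest(payload, **metadata):
    """Swallow the payload untouched ('schema on read') and attach
    metadata describing where, when and how it arrived."""
    lake.append({"payload": payload,
                 "meta": {"ingested_at": time.time(), **metadata}})

def query(**filters):
    """Ask a question at analysis time by filtering on metadata --
    no upfront classification, unlike a warehouse where the schema
    is fixed when the data enters."""
    return [r["payload"] for r in lake
            if all(r["meta"].get(k) == v for k, v in filters.items())]

# Hypothetical engine records, in whatever shape they arrived
ingest({"egt_c": 612}, source="engine-42", kind="sensor")
ingest("overhaul: hot-section wear found", source="engine-42", kind="log")
engine_history = query(source="engine-42")  # both records, any shape
```

Because nothing is sliced or diced on the way in, the same stored records can answer questions nobody anticipated at ingestion time.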
Bartlett, who studied biology and ecosystems before he jumped into computer science, uses a biological metaphor to describe the data lake concept. “A data lake is like a pond in the woods – a richly diverse ecosystem,” he says. “You have complex food webs composed of millions of organisms, from algae and plants all the way up to top predators. Other factors such as water depth, available oxygen, nutrient levels, temperature, salinity and flow create the context of an intricate, interconnected ecosystem. If you throw a line in the water you never know what you will catch. It is an exciting place to fish! The questions and analytical opportunity are almost limitless.”
"On the other hand,” he says, “a more traditional database is more like a fish farm where all the species have been pre-classified and fed the same diet and health supplements. Some intensive tanks even employ biosecurity measures – a far contrast from the rich open natural ecosystem. If you throw a line in the water here, you have a pretty good idea of what you will catch! While useful, it has more limitations as to what it can teach us.”
Some 25 airlines are already streaming data into GE’s and Pivotal’s data lake system to better manage and maintain their fleets. The robust system is allowing service crews to better analyze performance anomalies. When a jet engine reports a temperature that’s higher than usual, for example, the system seeks insights and looks for similar events in the past, based on the type of engine, its age, service history and many other factors. “The magic happens when you marry the traditional engineering approach with the data science enabled by the data lake,” Bartlett says. “It opens up a whole new world of possible ‘what if’ questions.”
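The “look for similar events in the past” step resembles a nearest-neighbor search over the features of historical events. A minimal sketch, with invented event names and features already scaled to a common 0-to-1 range:

```python
import math

def similar_events(query, history, k=2):
    """Rank past events by Euclidean distance over the query's numeric
    features and return the k closest matches -- a toy stand-in for
    finding comparable temperature excursions in the data lake."""
    def dist(features):
        return math.sqrt(sum((query[f] - features[f]) ** 2
                             for f in query))
    return sorted(history, key=lambda e: dist(e["features"]))[:k]

# Hypothetical past anomalies (features normalized to 0..1)
history = [
    {"id": "evt-1", "features": {"delta_temp": 0.9, "engine_age": 0.8}},
    {"id": "evt-2", "features": {"delta_temp": 0.1, "engine_age": 0.2}},
    {"id": "evt-3", "features": {"delta_temp": 0.8, "engine_age": 0.7}},
]
matches = similar_events({"delta_temp": 0.85, "engine_age": 0.75}, history)
```

Production systems would fold in engine type, service history and many more factors, but the shape of the question is the same: which past events look most like this one?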
The industrial data lake works with GE’s Predix industrial software platform and massively parallel processing architecture systems like the open-source Apache Hadoop.
“When you dive into the data lake, you start seeing questions you didn’t even know how to ask,” Bartlett says. “It gives a transformational ability to your business model.”
Africa’s economic expansion was largely driven by commodities over the last decade. But today, satisfying demand involves more than just pulling ore and minerals from the ground faster. Companies like South Africa’s platinum producer Lonmin are embracing the Industrial Internet and Big Data to get smarter about their jobs.
In 2007, when platinum prices hit an all-time high, Lonmin wanted to maximize its smelter’s production and efficiency and remove bottlenecks. But workers and machines were struggling to keep up. Delays in the filtering and drying of the raw material going into the smelting furnaces, for example, led to process interruptions, which increased equipment wear and tear.
The slag plant, which concentrated and recycled the material coming out of the furnaces on the other end of the process, also experienced frequent spillages that wasted precious production time.
Many managers in this situation would start thinking about spending capital on new equipment. But Lonmin invested in software, hoping that better information and understanding of what’s happening at the mill would lead to improvement.
The Industrial Internet helped workers at Lonmin’s platinum smelter in South Africa become more productive.
Lonmin brought in GE’s Mine Performance system built around its Predix industrial software platform. The company first used the system to monitor and evaluate the filtering and drying process. The gathered data allowed Lonmin to increase throughput in the section that feeds the furnaces with raw material by 10 percent.
The results were so good that Lonmin decided to apply Mine Performance to the slag plant. Today, spillages have been eliminated and platinum recovery from slag is up by 1.5 percent. Although other factors were also involved, the process optimization software played a major role.
Finally, the smelter used the system to bring high sulfur dioxide emissions into allowable range, and to better manage equipment maintenance and reduce downtime. A new software monitoring tool flags inefficiencies and helps the team apply resources where they are most needed. “Once Mine Performance is running and the people are used to it, it’s very difficult to manage without it,” says Percy French, the smelter’s automation manager. He says that without the system, “we would incur additional cost for inefficiencies and we would definitely have equipment damage due to our inability to control the process in the same way as an analytically driven system.”
Lonmin is an important model for Africa’s future. Although the base of Africa’s economic growth has become broader in recent years, the continent still depends heavily on commodities to drive its booming trade with the world and especially China.
Before manufacturing picks up, companies like Lonmin will have to power Africa’s growth. Embracing the latest technologies is a smart way to lead.
At first glance, Air Asia’s fleet of Airbus A320 planes looks like any other passenger aircraft. But look under the hood and you will find an array of sensors and proprietary technology developed by GE that makes its pilots smarter.
That’s because the systems gather performance, weather, flight path and other data and feed it over the Industrial Internet to the cloud, so that it can be crunched by software and analytical engines built and operated by GE Aviation’s Flight Efficiency Services unit. The system looks for hidden patterns and savings opportunities, and allows the airline to cut its annual fuel bill by more than 1 percent. Doesn’t seem like much? Consider that the saving amounts to about 550 pounds of jet fuel - the equivalent of 11 packed suitcases - per hour of flight.
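The suitcase comparison is easy to check with back-of-envelope arithmetic. The sketch below is illustrative only; the 50-pound suitcase weight is an assumption (a common checked-bag limit) that the article does not state.

```python
# Back-of-envelope check of the article's fuel-savings comparison.
# Assumption (not from the article): one packed suitcase weighs ~50 lb.
SAVINGS_LB_PER_FLIGHT_HOUR = 550   # fuel saved per hour of flight, per the article
SUITCASE_LB = 50                   # assumed weight of one packed suitcase

suitcases_per_hour = SAVINGS_LB_PER_FLIGHT_HOUR / SUITCASE_LB
print(suitcases_per_hour)  # 11.0
```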
Pilots and airline managers see the results on a dashboard and use it to make better informed decisions about how much fuel they need and which path they are going to take. There are dozens of other airlines around the world already using the system, including Air New Zealand, China Airlines, WestJet, and EVA Air.
But the smarts go beyond fuel savings. GE is also working with Air Asia and the Department of Civil Aviation (the regional equivalent of the FAA) to roll out a GPS-based flight path program at 15 Malaysian airports, and another eight in Thailand and Indonesia. The goal is to improve their efficiency and possibly increase capacity.
While GPS does not sound revolutionary in other contexts, keep in mind that most aircraft still use radio beacons to determine their position. The new system, called Required Navigation Performance (RNP), was first designed by Alaska Airlines pilot Steve Fulton after going through many sweat-soaked night landings at the mountain-rimmed airport in Juneau, AK. It was further developed by GE Aviation.
“The inspiration was both frustration and concern,” Fulton said. “As pilots in southeast Alaska, we were regularly operating in difficult weather conditions with limited navigation aids. We understood that there was very little margin for error. We had training, experience, and the best in that generation of ground-based navigation equipment and the associated aircraft instrumentation. But still, even with all of that, there were times when a pilot could be put in a very tight spot.”
The system has since helped open up airports in the Himalayas, mountainous southern New Zealand, hilly downtown Rio de Janeiro and elsewhere around the world.
Subscribe to GE Reports and stay tuned for more aviation coverage from the Farnborough Airshow, which is taking place this week.
There is more to the Internet of Things (IoT) than FitBits and smartphone-controlled thermostats. While consumer goods are some of the IoT’s most visible applications, they’re just one part of the vast and game-changing phenomenon that could soon encompass 200 billion connected devices and add trillions of dollars to the economy.
In fact, experts estimate that the IoT will resonate strongly in the “invisible” industrial sector, capturing and analyzing data generated by drilling rigs, jet engines, locomotives and other heavy-duty machines.
This network is called the Industrial Internet and it’s already helping companies shave costs and boost performance. Union Pacific, America’s largest railroad company, has improved productivity by wiring its locomotives with sensors that monitor parts and supply data to algorithms that try to predict whether a component might break down and when. “Industrial data is not only big, it’s the most critical and complex type of big data,” says Jeff Immelt, chairman and CEO of GE. “Observing, predicting and changing performance is how the Industrial Internet will help airlines, railroads and power plants operate at peak efficiency.”
GE is developing sensors that could be printed inside machines. Top Image: Maintenance crews can already gather data from jet engines like the GEnx.
GE is betting big on the Industrial Internet. The company believes the network could add between $10 trillion and $15 trillion (the size of today’s U.S. economy) to global GDP over the next 20 years. Its software arm has developed a software platform called Predix that allows Union Pacific, as well as oil drilling companies, wind farms, hospitals and other customers to perform prognostics, reduce downtime and increase efficiency.
Capturing Big Data and transmitting it to dedicated servers presents its own set of technological and logistical challenges. That’s why GE, AT&T, Cisco and IBM teamed up this spring to launch the Industrial Internet Consortium. The goal of this open, not-for-profit group is to break down technology silos, improve machine-to-machine communications and bring the physical and digital worlds closer together.
To do that, member companies will pool their R&D capabilities to develop common server architectures and advanced test beds to standardize key components of the Industrial Internet.
Bill Ruh, vice president of global software at GE, recently told Mike Barlow of the O’Reilly Radar blog that turning data into usable insights will require an industry-wide effort – channeled by organizations like the IIC – to produce standardized infrastructure and processes that are fast, accurate, reliable and scalable.
Massive gas turbines are also getting connected to the Industrial Internet.
While the possibilities of the Industrial Internet are just beginning to be harnessed, companies aren’t waiting around. In a speech to power company executives, Wall Street analysts and investors at the Electrical Products Group Conference this spring, GE’s Immelt said that by the end of the year, he expected GE to launch over 40 “Predictivity” industrial analytical applications, which could generate more than $1 billion in revenue for the company.
The Internet is no longer just about email, e-commerce and Twitter, says Joe Salvo, manager of the Complex Systems Engineering Laboratory at GE Global Research. “We are at an inflection point,” he says. “The next wave of productivity will connect brilliant machines and people with actionable insight.”
GE will partner with the innovative Southern California venture capital firm Frost Data Capital on a business incubator focused on machine data, predictive analytics and the Industrial Internet. “Frost is looking to incubate really big problems that drive revolutionary change,” says Bill Ruh, vice president of GE Software. “That’s how we found them.”
The incubator, called I3 for Industrial Internet Incubator, practices lean methodology techniques that allow its startups to scale quickly. GE and Frost Data executive teams will start by identifying ideas for potential new businesses. “We don’t look at outside business plans like a typical VC,” says John Vigouroux, managing partner and president at Frost Data Capital. “We validate the ideas through the marketplace and with real customers. This is our expertise. If the problems are genuine opportunities for startups, we’ll start a company and hire a CEO.”
Vigouroux says that this approach allows Frost Data to start companies with a small amount of capital and operate them more predictably. “It’s a great model, not a home run-or-bust model,” he says. “Since we don’t need gobs and gobs of money, we can do singles, doubles, triples, whatever works for the customer. We think that seven out of 10 startups could grow up to be successful.”
The partners did not disclose the amount of funding behind I3, but Vigouroux says that the incubator is designed to participate in any financing round. I3 will be incubating and taking to market several companies simultaneously. Big Data pioneer Stuart Frost, who started Frost Data after he sold his company Datallegro to Microsoft for $275 million in 2008, calls this model “parallel entrepreneurship.”
Besides business ideas, GE brings to the incubator its expertise in predictive analytics and experience with “intelligent” machines connected to the Industrial Internet. “We will bring them the problems, the customers and industrial capability,” Ruh says. “We believe that this will move the [analytics] market even faster and give customers what they are looking for. The first mover advantage is also very important.”
Frost Data has started 17 companies so far, with GE investing in three of them. Ruh says that GE and Frost Data have already started five new companies over the last 60 days alone. He said that a big company like GE could not move so fast by itself. “We’ve discovered someone who is really good at this,” Ruh says.
The initiative could have a multiplier effect on the growth of the Industrial Internet. The Wall Street Journal pointed out yesterday that the value of the Industrial Internet will increase as more people and companies get connected to it, following Metcalfe’s law.
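Metcalfe’s law holds that a network’s value grows roughly with the square of the number of connected nodes. A minimal sketch of that relationship (the constant `k` is an arbitrary scaling factor, not a real valuation):

```python
def metcalfe_value(n: int, k: float = 1.0) -> float:
    """Metcalfe's law: network value grows with the square of connected nodes."""
    return k * n * n

# Doubling the number of connected machines quadruples the network's value.
print(metcalfe_value(200) / metcalfe_value(100))  # 4.0
```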
"Just like the development of the commercial Internet, that will require investment and innovation from lots of different people and companies," the paper wrote.
Pathologists are doctors and scientists who study tissue, blood and other biological specimens to find the cause of illness and a route to treatment. Many of them still wield a microscope as their main weapon. They load it with a sample slide, analyze the contents through the eyepiece and dictate their findings to a voice recognition system or an administrative assistant who transcribes them into a report.
But do it dozens of times a day and the job can become a pain, literally. “Every time I reach for a new slide, I have to take my eyes off the lens and check the forms for that case,” says Ian Cree, a pathology professor at Warwick Medical School in Coventry, UK. “You can get a sore neck from hours at the microscope.”
This traditional way of studying samples is now making researchers like Cree wonder whether it’s time to put the microscope away for good. Two years ago, Cree and his team started testing a new digital system that allows them to scan in images of tissue slides and patient histories, attach matching barcodes and upload everything to a massive computer database. Cree now views his samples on a computer monitor and controls their flow with his mouse.
The picture above shows how Omnyx’s Image Analysis Application can help with measurements for Dako HercepTest™, a test commonly used by pathologists in assessing treatment options for breast cancer patients. Top image: Skin melanoma showing digital measurements of Breslow’s Depth and distance to margins.
The system is efficient in other ways, too. After biomedical scientists scan the slides, which hold prepared biopsied tissues some 5 microns thick, the samples don’t travel to Cree’s desk as they once did, but instead go straight back to storage. This makes it easier to preserve and keep track of slides while the pathologist views their images to reach a diagnosis.
The technology allows one pathologist to study around 150 slides a day, increasing efficiency in the lab by about 13 percent. “Digital pathology puts everything directly on the screen in front of you, including the paperwork,” Cree says. “Everything is linked and I can even collaborate with my colleagues without stepping out into the corridor. It’s much quicker and better for everyone, including the patient.”
The system in Cree’s office is called Omnyx Integrated Digital Pathology*. It was developed by Omnyx LLC, a joint venture between GE Healthcare and the University of Pittsburgh Medical Center formed in 2008. Omnyx takes advantage of the power of the Industrial Internet, connectivity and data analysis. It could allow doctors to reach beyond hospital walls and create global “pathology networks.” “We can connect doctors in rural and underfunded hospitals with pathology experts,” says Omnyx CEO Mamar Gelaye. “The technology helps eliminate access as a variable in quality of care.”
Similar Big Data systems are already helping doctors in Sweden to analyze X-ray images of rural patients and improving diagnostics in Washington State.
Omnyx first scans samples with a high-resolution camera and stores the images in a digital archive. Pathologists can access the archive in real time and pull up the desired samples.
The system can be easily scaled up from just one lab to a hospital or even an entire healthcare network. Doctors can use it to collaborate with peers and specialists, improve the accuracy and speed of diagnosis, and quickly obtain second opinions.
Once Omnyx has gathered enough data, the system could also help pathologists to analyze the data to seek hidden correlations. “The human eye is extremely good at looking for patterns on a slide,” Cree says. “But we are very bad at determining how much of something is on it. That’s where digital pathology adds value for the patient.”
There are, for example, nine grades of prostate cancer tumor. The grade a doctor gives a tumor determines whether a patient receives radical surgery or a more conservative drug treatment. But Cree says that pathologists are about 60 percent successful in identifying the right grade when they use only a microscope and their eyes.
Machine vision and software analysis could improve the odds. “It may be that in the future we find the optimum cut for radical surgery is a tumor grade 4.3 and above, rather than 4, so that we are able to refine our advice and offer a more individualized level of care to patients,” says Dr. David Snead, the pathologist who leads the Omnyx implementation at Coventry and Warwickshire Pathology Services.
The Coventry team has used the new system to study samples from 300 patients. They are now in the middle of a 10-month-long validation study that will include 3,000 cases. The team will compare results generated with Omnyx to those obtained using only a microscope to ensure there are no discrepancies.
"We know pathology will evolve, and our solution is committed to stand ready for that transformation,” says Omnyx CEO Gelaye: “We want to use information technology to bring patient pathology cases to the right pathologist for the right diagnosis at the right time.”
* Omnyx Integrated Digital Pathology is a registered trademark.
Blowout preventers, or BOPs, are incredibly complex machines that weigh 750,000 pounds and tower 60 feet above seafloor oil wells. They serve as the last line of defense in case anything goes wrong. It takes workers about 18 months to build one and they serve for as long as 30 years.
Retrieving a BOP for servicing can cost as much as $16 million, and keeping one in shape over all those years gets expensive. The stakes are high: each day a rig is not drilling can cost the offshore operator $3 million in lost productivity. To stay safe, maintenance crews follow a rule of thumb and often replace as many as 20 percent of a BOP’s parts every time it comes to the surface. That also means they effectively rebuild an entire BOP every five years.
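The rule-of-thumb arithmetic works out as follows. This sketch assumes BOPs are retrieved roughly once a year, which the five-year figure implies but the article does not state explicitly.

```python
# Rule-of-thumb replacement arithmetic from the article (illustrative only).
FRACTION_REPLACED_PER_RETRIEVAL = 0.20  # up to 20% of parts per surfacing
RETRIEVALS_PER_YEAR = 1                 # assumption: roughly annual retrieval

years_to_full_rebuild = 1 / (FRACTION_REPLACED_PER_RETRIEVAL * RETRIEVALS_PER_YEAR)
print(years_to_full_rebuild)  # 5.0
```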
But there is a more scientific way to service BOPs. GE engineers recently developed a predictive maintenance system called Sealytics BOP Advisor that collects and analyzes data about pressures, valve positions and flows from underwater sensors attached to the machines. The system, not thumbs, helps customers determine what needs to be fixed and when. “Telling a customer what to fix after it has failed is relatively easy,” says Bob Judge, director of product management at GE Oil & Gas. “Telling them to fix something before it costs them money is the magic.”
Sealytics is one example of how GE combines the power of data analytics with the Industrial Internet to expand its services offering. People rightly consider GE a builder of “big iron” like jet engines, locomotives and gas turbines, but services already account for a stunning 75 percent of the company’s profits.
GE believes that Big Data will push that share even higher. “Industrial data is not only big, it’s the most critical and complex type of big data,” says Jeff Immelt, GE chairman and CEO. “Observing, predicting and changing performance is how the Industrial Internet will help airlines, railroads and power plants operate at peak efficiency.”
Immelt is speaking today at the Electrical Products Group Conference, an annual gathering of industrial executives, Wall Street analysts and investors in Longboat Key, FL. He will talk about “reinventing services for the future” and the role that the combination of hardware and software will play in reducing unplanned downtime and making customers more productive (see infographic).
Immelt says that GE will have launched over 40 “predictivity” analytical applications by the end of the year. He expects them to generate more than $1 billion in revenues.
Applications like Sealytics for BOPs, Trip Optimizer for locomotives, and Wind Power Up for wind turbines are built on Predix, a cloud-based software platform designed by GE software engineers specifically for the Industrial Internet. Ultimately, workers will use them to access performance data and monitor machines remotely from anywhere and at any time.
The Internet is no longer just about email, ecommerce or Twitter, says Joe Salvo, manager of the Complex Systems Engineering Laboratory at GE Global Research. “We are at an inflection point. The next wave of productivity will connect brilliant machines and people.”
GE said today it would acquire the cyber security company Wurldtech to expand its digital arsenal for protecting critical infrastructure and operations technology.
GE has started connecting jet engines, power plants, locomotives and other technology to the Industrial Internet, an emerging digital network that links machines, data and software with people.
Wurldtech’s security software is unique since it helps to protect information technology (IT) that allows machines to think, as well as operations technology (OT) – the hardware and software that monitors and controls machines and connects them to people. “The world of OT security needs to be foundationally different from traditional IT detection systems,” said Bill Ruh, GE vice president who runs GE Software. “Nine out of the world’s top 10 automation providers use Wurldtech security products.”
GE said in a statement that the acquisition will help the company boost the reliability of its Industrial Internet operations. “Securing connected machines has a unique set of complexities that are very different from protecting a datacenter,” Ruh said. “At GE, we are focused on software platform security, protecting critical infrastructure and helping to ensure the reliability of Industrial Internet operations for our customers and industries.”
Wurldtech is based in Vancouver, British Columbia. Its products like Achilles Test and Achilles Threat Intelligence help customers spot vulnerabilities in products and infrastructure, find their root cause and protect assets. The company’s systems are securing oil and gas operations, electric systems, and medical, nuclear and chemical plants around the world.
Blowout preventers, or BOPs, are incredibly complex machines that sit deep on the sea floor and serve as the last line of defense if something in the oil well goes wrong.
These 250,000-pound, 60-foot steel behemoths have to be regularly pulled up, inspected and serviced. As a rule of thumb, workers often replace as many as 20 percent of their parts to keep them safe, effectively rebuilding the entire machine every five years. “That’s one way to do it,” says Bob Judge, director of product management at GE Oil & Gas.
The other, smarter way involves connecting BOPs to the Industrial Internet, and collecting and analyzing the data they send up. “We need to move from the ‘break-fix’ model to a predictive maintenance model,” Judge says. “What if you had a technology gathering BOP data so that the next time you pull it out you know exactly what needs to be replaced and have the replacement parts available on the drilling rig?”
Judge says that it costs between $10 million and $16 million to surface a BOP. Predictive maintenance could save drilling companies millions in unplanned downtime.
He compares such systems to the oil life indicator in a 2014 model car. “Not so long ago, you changed your oil every 3,000 miles,” he says. “But a new Chevy Tahoe will tell you that you have 27 percent of oil life remaining, based on mileage and other engine conditions that affect the oil. That’s predictive analytics.”
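A toy version of such an indicator might scale mileage by an engine-condition severity factor. Every name and number below is a hypothetical illustration, not the actual algorithm in any vehicle.

```python
def oil_life_remaining(miles_since_change: float,
                       base_interval_miles: float = 7500,
                       severity: float = 1.0) -> float:
    """Hypothetical mileage-based oil-life estimate (fraction remaining).

    severity > 1.0 models harsh engine conditions (short trips, towing,
    extreme heat) that consume oil life faster than mileage alone suggests.
    """
    used = (miles_since_change * severity) / base_interval_miles
    return max(0.0, 1.0 - used)

# Harsh conditions deplete oil life faster than the odometer implies.
print(round(oil_life_remaining(4500, severity=1.2) * 100))  # 28 (percent)
```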
Judge and his team spent the last couple of years studying BOPs and data about wellbore and hydraulic system pressures, solenoid current draws, valve positions and other information. The data is already being supplied by sensors on BOPs and stored in a database called “datalogger.”
They used the data to develop a system called SeaLytics BOP Advisor that allows crews to monitor the health of BOP components and determine how many cycles they have gone through, what needs to be fixed and when. “When there is a problem, the drilling contractor will know within seconds,” Judge says.
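Cycle-based health monitoring of this kind can be sketched in a few lines. The component names, rated-cycle counts and 80 percent threshold below are illustrative assumptions, not SeaLytics internals.

```python
def needs_service(cycles_completed: int, rated_cycles: int,
                  threshold: float = 0.8) -> bool:
    """Flag a component once it has consumed a set fraction of its
    rated operating cycles (hypothetical 80% threshold)."""
    return cycles_completed / rated_cycles >= threshold

# Hypothetical BOP components: (cycles completed, rated cycles).
components = {"annular_seal": (850, 1000), "ram_packer": (300, 1000)}
flagged = sorted(name for name, (done, rated) in components.items()
                 if needs_service(done, rated))
print(flagged)  # ['annular_seal']
```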
The technology is similar to GE systems that already monitor jet engines and locomotives. It is built on Predix, the first industrial-strength platform built by GE for machine analytics.
Judge said that he had an epiphany for SeaLytics when he saw a demonstration of myEngines, which allows airlines to remotely monitor the status of their jet engine fleets and streamline maintenance and repair. “I thought: what if we could apply the same model that’s been proven to work to benefit the drilling industry,” Judge says.
Two customers, the drilling contractors Atwood Oceanics and Brazil’s Queiroz Galvão Óleo e Gás (QGOG), already ordered the technology. “Telling a customer what to fix after it has failed is relatively easy,” Judge says. “Telling them to fix something before it costs them money is the magic.”