Two years ago, the artist and musician David Byrne created a series of installations called Playing the Building, in which he converted cavernous warehouses in New York, London and Minneapolis into gigantic musical instruments.
Byrne stood on the shoulders of artists like composer Annie Gosfield, who more than a decade ago recorded sounds produced by massive metal presses, welding guns and mallets banged on acid baths at an electric motor factory in Nuremberg, Germany, and turned them into a harmonious industrial symphony called Flying Sparks and Heavy Machinery.
Industrial companies are now also joining the movement. Last year, GE and CSX let Ladytron’s Reuben Wu record a giant container shipping terminal in Ohio, and in July, GE invited DJ and musician Matthew Dear to hunt for interesting sounds at its global research headquarters in upstate New York.
GE also set up Dear with a library of 1,000 sounds generated by machines spanning its entire industrial portfolio, from jet engines to MRI scanners. Dear turned the sounds into a propulsive track called Drop Science. He talked to GE Reports editor Tomas Kellner about his sonic adventure.
Tomas Kellner: What attracted you to the project?
Matthew Dear: I’ve become known as someone who likes to mess with everyday sounds and incorporate them into music. You can hear that all the way back to my [early records like] Backstroke. My albums are always very dense with natural sounds that sound very real and electronic at the same time.
TK: Was the creative process for Drop Science different from making a track for one of your records?
MD: When I’m making a record, I usually produce a song in a week or so. It’s a very organic process. But a project like this can be a daunting and overwhelming experience. Just the sound bank I was given to start with was massive.
Matthew Dear recorded sounds for his track Drop Science at GE Global Research labs in Niskayuna, NY.
TK: How did you start?
MD: I tried to listen to as many of the sounds as I could and started looking for those that would fit. A lot of it was just noise that could be a little too aggressive. Nobody wants the sound of an engine running at full volume scratching their eardrum.
Once I found the base sounds, the sounds that I really thought could work, I started whittling them down even further. I put them into samplers and ran them through all sorts of processing equipment in my computer, little plugins that made them sound a little bit different.
It’s almost like a painter taking all his paints, starting to mix them on a board and then getting the palette ready for the painting.
"Nobody wants the sound of an engine running at full volume scratching their eardrum."
TK: How long did it take you to finish Drop Science?
MD: It took me about a week and a half to arrive at a core group of sounds that could easily become something bigger. That’s when I started playing with sequencing and the length of the track.
It was a little confusing at first because some things sounded just too machine-like, just a little too mechanical. But once I made some executive decisions, it fell into place.
TK: What kind of executive decisions?
MD: The drums, for example. I decided that I’m going to create them in my studio with my own equipment. But they will be a reflection of my time spent at the research center. It was kind of a push and pull between what happened and what I imagined happened.
TK: Do you have a favorite sound?
MD: At the lab, they had a really long tube, a test apparatus. I was running up and down and messing with it and hitting it with a stick. It sounded like you were dropping a coin into a thousand-foot well. It had a really cool, bubbly reverb sound.
Matthew Dear is making metal music.
TK: What about machine sounds?
MD: In terms of the sounds they gave me, I liked the MRI sounds the best. They were hit or miss, but those that really worked reminded me of a lot of stuff that I use in my own songs. They were very cyclical, almost like a sequencer. They sounded like an analog synthesizer that I could loop and play with.
TK: What kind of sounds did GE give you?
MD: They gave me a whole batch of sounds. There were sounds from some equipment north of the Arctic Circle in Norway and also from a jet engine test. I couldn’t go to all of those places myself. I took them and blended them with those that I recorded personally.
TK: Could you record anywhere at the lab?
MD: We didn’t really have any closed doors. I got to see a lot of things and talked to a lot of people. I did not feel at any time that GE had its own notion of what the result should sound like, as it could with a project like this. They were very open to the artistic experience. They wanted the whole thing to be a very natural process.
TK: Can you describe it?
MD: It was like a melding of minds. There was the team from m ss ng p eces filming it, we had The Barbarian Group, which is known for very creative ads that push boundaries, and then GE, which is just so massive and creative in its own sense. It did not feel at any point like it was a commercial experience.
TK: Did you draw inspiration from other projects?
MD: I’m familiar with David Byrne’s installations. It was great to be able to continue the legacy of blending machines and music.
Getting shale gas out of the ground is one thing. But taking it to customers is quite another.
American pipeline operators are investing as much as $40 billion every year to maintain, modernize and expand their networks. The shale gas boom is putting operators under pressure to move more gas to market faster and more safely, and many U.S. pipelines have been in service for at least two decades.
“We need an agile and comprehensive pipeline solution that could be delivered quickly and allows for a more real-time view of pipeline integrity across our interstate natural gas pipelines,” says Shawn Patterson, president of operations and project delivery at Columbia Pipeline Group.
Columbia runs a 15,000-mile gas pipeline network linking the Gulf Coast to the mid-Atlantic region and the Northeast. It will soon start using GE software and big data to monitor its network in almost real time, and streamline its operations and planning.
The technology, called Intelligent Pipeline Solution, combines GE software and hardware with Accenture’s data integration expertise. It runs on Predix, GE’s industrial software platform, and links pipelines to the Industrial Internet for the first time.
The world’s pipelines stretch for some 2 million miles, enough to wrap themselves 80 times around the equator. GE estimates every 150,000 miles of pipeline generates an amount of data equal to the entire printed collection of the Library of Congress, or 10 terabytes.
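The estimate above scales linearly, so it can be sanity-checked with a little arithmetic. The sketch below is only a back-of-the-envelope calculation using the figures from the text; the function name is invented for illustration.

```python
# Back-of-the-envelope check of GE's estimate: 10 TB of data per
# 150,000 miles of pipeline, applied to ~2 million miles worldwide.
TB_PER_SEGMENT = 10          # terabytes, per GE's estimate
MILES_PER_SEGMENT = 150_000  # miles of pipeline producing those 10 TB
TOTAL_MILES = 2_000_000      # rough global pipeline mileage from the text

def estimated_data_tb(miles: float) -> float:
    """Scale the 10 TB / 150,000-mile estimate to a given mileage."""
    return miles / MILES_PER_SEGMENT * TB_PER_SEGMENT

print(round(estimated_data_tb(TOTAL_MILES), 1))  # ~133.3 TB worldwide
```

By this reckoning, the world's pipelines would generate on the order of 130 Libraries of Congress worth of data.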
Brian Palmer, chief executive of GE’s Measurement & Control unit, says that the new system will help customers like Columbia make the right decisions at the right time to keep their assets safe. It will help them send repair machinery and crews where they are needed most, and speed up response time to problems.
The system is designed to harvest data from sensors installed along the pipes and equipment, sync it with external data sources and deliver to customers detailed analytics and risk assessment from key points of the network. “The goal is to help pipeline operators make proactive, rather than reactive decisions,” Palmer says.
The “Intelligent Pipeline Solution” is the first commercial product GE and Accenture have offered up since they formed their software and big data partnership in 2013. The companies expect the system to be operational in the first half of 2015.
Kids sometimes make grown-ups see complicated things in simple ways. GE’s new ad about “brilliant machines” connected to the Industrial Internet is tapping into that power.
The spot features a small boy who can’t speak but whose voice box produces beeps that allow him to talk to toys, the electrical grid, aircraft and many other machines. “Lots of companies have been trying to tell their Industrial Internet story and we had to take a different approach to make it stand out,” says Peter McCallum, senior director at BBDO, the creative agency that made the ad. “We wanted to tell an industrial-scale story at a human level to elicit emotion and ensure that it resonated with far-reaching audiences.”
The story follows the boy from birth, when his primal “beep” causes considerable distress to his parents. But by the time he is in elementary school, his special power allows him to switch the TV to the football game for dad (the ad will air for the first time during the NFL kickoff), restore electricity to an entire town and make planes fly on time. “We liked the idea that it’s his natural language,” McCallum says. “He does not have to put on a cape to have these powers. It’s sort of a metaphor for GE.”
Unlike the boy, the Industrial Internet is real. It could soon link billions of machines and devices ranging from smartphones and thermostats to jet engines and medical scanners.
GE believes the network could add between $10 trillion and $15 trillion, the size of today’s U.S. economy, to global GDP over the next 20 years. The company’s software arm has developed a software platform called Predix that allows railroads, oil drilling companies, wind farms, hospitals and other customers to perform prognostics on machines, reduce downtime and increase efficiency.
The “Boy Who Beeps” is the first in a series of stories and other content that GE plans to roll out through the rest of the year to illustrate the power of the Industrial Internet.
As power outages go, the iguana affair was a mundane one. On July 27, a hapless lizard shorted a piece of electrical equipment in the middle of the Florida Keys and knocked out power for 11,000 local residents. It was an act of nature no outage prevention system would have been able to predict. But since the lizard climbed inside a substation, repair crews were able to locate the short and restore power in 10 minutes. “Spotting problems within the limited space of a substation is relatively easy,” says Chris Prince, application engineer at GE Digital Energy. “It’s a high-density nerve center for distributing power. But good luck finding and fixing a problem that fast when it happens along the miles of overhead and underground lines connecting the substation to the consumers.”
In an era when a smartphone can quickly pinpoint its location on a map, few power providers know that customers have lost electricity before somebody calls them. Fewer still can see whether the outage was caused by a fallen tree branch, a lightning strike or faulty equipment.
Americans are taking notice. A new survey measuring public perception of grid resiliency found that many respondents were willing to pay $10 per month on top of their electricity bill to make sure that the grid becomes more reliable (see the results here).
The survey, which was commissioned by GE’s Digital Energy business, also found that a majority wanted utilities to start using digital communications and social media tools to keep them informed in real time during a power outage. “Consumers want to see investment in technology that prevents power outages and reduces the time it takes to turn power back on,” says John McDonald, director of technical strategy and policy development at GE Digital Energy.
Utilities are getting the message. As wireless networks became more ubiquitous, power companies started placing digital sensors along their lines and inside switches, breakers, smart meters and other devices. The sensors feed grid data to data collection centers where algorithms process it to create a virtual map of the distribution network.
Prince says that the network map resembles a tree where the transmission lines that come from the power generation plants are the roots and the substation is the trunk. “The feeder circuits that reach out through neighborhoods to residential customers are the branches where we’ve traditionally had the least visibility,” he says.
Prince says that an outage event could be due to a single cause or multiple causes nested together. This makes the prime cause harder to find. But digital systems that connect the grid to the Industrial Internet and string together smart sensors, controls, and software can quickly detect and locate trouble, and then isolate a likely problem area along the right branch. “Instead of telling the crew to patrol miles of a line, I can narrow the location to a block or two,” he says. “At the same time, the control system will quickly restore power to the other lines leading from the trunk that did not suffer any damage, and bring power back to those customers sooner.”
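The "narrow the location to a block or two" idea Prince describes can be sketched as a walk along one branch of the network tree: find the segment between the last sensor that still sees power and the first one that does not. The feeder layout and sensor readings below are entirely hypothetical; real distribution systems use far richer topology models and sensor data.

```python
# Illustrative sketch of fault isolation along one feeder branch of the
# grid "tree". Names and readings are invented for illustration only.

# A feeder branch as an ordered chain of sensor locations, trunk outward.
feeder_branch = ["substation", "switch_12", "block_A", "block_B", "block_C"]

# Sensor readings: True = power detected at that point.
readings = {"substation": True, "switch_12": True,
            "block_A": True, "block_B": False, "block_C": False}

def locate_fault(branch, readings):
    """Return the segment between the last live sensor and the first
    dead one, i.e. the short span a repair crew should patrol."""
    for upstream, downstream in zip(branch, branch[1:]):
        if readings[upstream] and not readings[downstream]:
            return (upstream, downstream)
    return None  # no fault detected on this branch

print(locate_fault(feeder_branch, readings))  # ('block_A', 'block_B')
```

With sensors every block or two, the search space shrinks from miles of line to a single segment, which is the crew-dispatch advantage Prince describes.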
McDonald says that a number of utilities have already started implementing automation on the feeder lines where they see the most value. NSTAR, for example, brings electricity to 1.1 million customers living in central and eastern Massachusetts. Starting in 2009, the company reached out to GE and started building a “self-healing” grid. Today the system consists of 2,000 smart switches and 5,000 voltage and current sensors. The system was already tested by Hurricane Irene in 2011 and Hurricane Sandy in 2012.
During Irene, 500,000 customers lost power but the system was able to reroute it and turn the lights back on for close to half of them within an hour. NSTAR calculates that the new grid visibility has allowed it to avoid 600,000 customer outages so far. “The concept of grid monitoring has been around for decades,” McDonald says. “But the advances in big data, software, fiber optics and digital wireless communications now really bring it alive.”
Not even lizards can stop it.
GE and Pivotal said they built the first industrial-scale “data lake” system that could supercharge how companies store, manage and glean insight from information harvested from machines connected to the Industrial Internet.
The system, which has already tracked more than 3 million flights and gathered 340 terabytes of data, can analyze data 2,000 times faster than previous methods and cut costs tenfold. It is so powerful that it crunched through, in just 20 minutes, a complex task that would previously have taken a month to compute.
“Big Data is growing so fast that it is outpacing the ability of current tools to take full advantage of it,” said Bill Ruh, vice president of GE Software. Dave Bartlett, computer scientist and chief technology officer for GE Aviation, said that industrial data lakes will help companies predict future problems and run machines more efficiently, sustainably and profitably. They will also help GE maintain and service machines better. “We are getting the most life out of our assets,” he said.
The industrial data lake will have numerous applications across many industries and types of hardware, from jet engines and locomotives to medical scanners.
Bartlett says a data lake can swallow massive streams of data and store it in whatever form it arrives, much like a large body of water drinks from its tributaries.
This is different from a standard data warehouse, where data is classified and categorized at the point of entry. “Instead of slicing, dicing and classifying the data, we capture the metadata, which is data about the data,” Bartlett says. “Metadata provides a more robust and varied context at the time of analysis that’s been missing from conventional data storage.”
Bartlett says that a data lake allows companies to ask many more questions from a given data set than they used to. “A numeric sequence in a database is only as meaningful as the context that can be applied,” he says. “By itself, it is just a number that the data warehouse might translate to what you paid two years ago to overhaul a particular kind of jet engine. But a data lake can provide the metadata to drive numerous analytics associated with that event, including the reasons behind the overhaul and how to better avoid or predict such overhauls in the future.”
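Bartlett's distinction between a warehouse (classify at the point of entry) and a lake (keep the raw payload, capture metadata, interpret at analysis time) is sometimes called schema-on-read. A toy sketch of that idea, with invented field names and payloads, might look like this:

```python
# Toy sketch of "schema-on-read" storage: raw records are kept exactly
# as they arrive, each tagged with metadata, and structure is applied
# only when a question is asked. All names are invented for illustration.

data_lake = []

def ingest(raw, **metadata):
    """Store the raw payload untouched, alongside descriptive metadata."""
    data_lake.append({"raw": raw, "meta": metadata})

def query(**criteria):
    """Interpret the lake at read time by filtering on metadata."""
    return [r["raw"] for r in data_lake
            if all(r["meta"].get(k) == v for k, v in criteria.items())]

ingest(b"\x01\x02", source="jet_engine", event="overhaul", year=2012)
ingest(b"\x9f\x00", source="mri_scanner", event="scan", year=2013)
ingest(b"\x11\x2a", source="jet_engine", event="overhaul", year=2013)

print(len(query(source="jet_engine", event="overhaul")))  # 2
```

Because nothing is discarded at ingest, a question nobody anticipated (say, filtering by a new combination of metadata fields) can still be asked later, which is the flexibility Bartlett's overhaul example points to.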
Bartlett, who studied biology and ecosystems before he jumped into computer science, uses a biological metaphor to describe the data lake concept. “A data lake is like a pond in the woods – a richly diverse ecosystem,” he says. “You have complex food webs composed of millions of organisms, from algae and plants all the way up to top predators. Other factors such as water depth, available oxygen, nutrient levels, temperature, salinity and flow create the context of an intricate, interconnected ecosystem. If you throw a line in the water you never know what you will catch. It is an exciting place to fish! The questions and analytical opportunity are almost limitless.”
"On the other hand,” he says, “a more traditional database is more like a fish farm where all the species have been pre-classified and fed the same diet and health supplements. Some intensive tanks even employ biosecurity measures – a far contrast from the rich open natural ecosystem. If you throw a line in the water here, you have a pretty good idea of what you will catch! While useful, it has more limitations as to what it can teach us.”
Some 25 airlines are already streaming data into GE’s and Pivotal’s data lake system to better manage and maintain their fleets. The robust system is allowing service crews to better analyze performance anomalies. When a jet engine reports a temperature that’s higher than usual, for example, the system seeks insights and looks for similar events in the past, based on the type of engine, its age, service history and many other factors. “The magic happens when you marry the traditional engineering approach with the data science enabled by the data lake,” Bartlett says. “It opens up a whole new world of possible ‘what if’ questions.”
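The "looks for similar events in the past" step described above can be sketched as a simple similarity ranking over archived events. The fields, weights and records below are all invented for illustration; a production system would use far more factors and learned models rather than hand-tuned scores.

```python
# Hypothetical sketch of ranking archived events by similarity to a new
# anomaly, using engine type, age and service history as the factors
# mentioned in the text. Weights are arbitrary illustrative choices.

def similarity(event, anomaly):
    score = 0.0
    if event["engine_type"] == anomaly["engine_type"]:
        score += 2.0                                   # same model matters most
    score -= abs(event["age_years"] - anomaly["age_years"]) * 0.1
    if event["last_service"] == anomaly["last_service"]:
        score += 1.0
    return score

history = [
    {"id": 1, "engine_type": "GEnx", "age_years": 4, "last_service": "overhaul"},
    {"id": 2, "engine_type": "CF6",  "age_years": 9, "last_service": "wash"},
    {"id": 3, "engine_type": "GEnx", "age_years": 5, "last_service": "wash"},
]

anomaly = {"engine_type": "GEnx", "age_years": 4, "last_service": "overhaul"}
best = max(history, key=lambda e: similarity(e, anomaly))
print(best["id"])  # 1: same engine type, same age, same service history
```

The service crew would then inspect what happened in the best-matching past events before deciding how to respond to the new temperature reading.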
The industrial data lake works with GE’s Predix industrial software platform and massively parallel processing architecture systems like the open-source Apache Hadoop. Bartlett says the combination will have numerous applications across many industries and types of hardware, from jet engines and locomotives to medical scanners.
“When you dive into the data lake, you start seeing questions you didn’t even know how to ask,” Bartlett says. “It gives a transformational ability to your business model.”
Africa’s economic expansion was largely driven by commodities over the last decade. But today, satisfying demand involves more than just pulling ore and minerals from the ground faster. Companies like South Africa’s platinum producer Lonmin are embracing the Industrial Internet and Big Data to get smarter about their jobs.
In 2007, when platinum prices hit an all-time high, Lonmin wanted to maximize its smelter’s production and efficiency and open bottlenecks. But workers and machines were struggling to keep up. Delays in the filtering and drying of the raw material going into the smelting furnaces, for example, led to process interruptions, which increased equipment wear and tear.
The slag plant, which concentrated and recycled the material coming out of the furnaces on the other end of the process, also experienced frequent spillages that wasted precious production time.
Many managers in this situation would start thinking about spending capital on new equipment. But Lonmin invested in software, hoping that better information and understanding of what’s happening at the mill would lead to improvement.
The Industrial Internet helped workers at Lonmin’s platinum smelter in South Africa become more productive. Top Image: Smelter illustration.
Lonmin brought in GE’s Mine Performance system built around its Predix industrial software platform. The company first used the system to monitor and evaluate the filtering and drying process. The gathered data allowed Lonmin to increase throughput in the section that feeds the furnaces with raw material by 10 percent.
The results were so good that Lonmin decided to apply Mine Performance to the slag plant. Today, spillages have been eliminated and platinum recovery from slag is up by 1.5 percent. Although other factors were also involved, the process optimization software played a major role.
Finally, the smelter used the system to bring high sulfur dioxide emissions into allowable range, and to better manage equipment maintenance and reduce downtime. A new software monitoring tool flags inefficiencies and helps the team apply resources where they are most needed. “Once Mine Performance is running and the people are used to it, it’s very difficult to manage without it,” says Percy French, the smelter’s automation manager. He says that without the system, “we would incur additional cost for inefficiencies and we would definitely have equipment damage due to our inability to control the process in the same way as an analytically driven system.”
Lonmin is an important model for Africa’s future. Although the base of Africa’s economic growth has become broader in recent years, the continent still depends heavily on commodities to drive its booming trade with the world and especially China.
Before manufacturing picks up, companies like Lonmin will have to power Africa’s growth. Embracing the latest technologies is a smart way to lead.
At first glance, Air Asia’s Airbus A320 planes look like any other passenger aircraft. But look under the hood and you will find an array of sensors and proprietary technology developed by GE that make their pilots smarter.
That’s because the systems gather performance, weather, flight path and other data and feed it over the Industrial Internet to the cloud, so that it can be crunched by software and analytical engines built and operated by GE Aviation’s Flight Efficiency Services unit. The system looks for hidden patterns and savings opportunities, and allows the airline to cut its annual fuel bill by more than 1 percent. Doesn’t seem like much? Consider that it’s on average about 550 pounds of jet fuel, the equivalent of 11 packed suitcases, per hour of flight.
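The suitcase comparison above implies a rough per-suitcase weight, which makes the savings easy to check. The sketch below uses only the figures from the text plus one inferred assumption (a 50-pound suitcase); both constants are labeled accordingly.

```python
# Checking the arithmetic in the article. savings_lb_per_hour comes
# from the text; the 50-lb suitcase weight is an inferred assumption
# (550 lb described as "11 packed suitcases" implies 50 lb each).
savings_lb_per_hour = 550   # fuel saved per flight hour, per the article
suitcase_lb = 50            # assumed weight of one packed suitcase

suitcases_per_hour = savings_lb_per_hour / suitcase_lb
print(suitcases_per_hour)   # 11.0, matching the article's comparison
```

Over thousands of annual flight hours per aircraft, that "more than 1 percent" compounds into a substantial fuel bill reduction across a fleet.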
Pilots and airline managers see the results on a dashboard and use it to make better informed decisions about how much fuel they need and which path they are going to take. There are dozens of other airlines around the world already using the system, including Air New Zealand, China Airlines, WestJet, and EVA Air.
But the smarts go beyond fuel savings. GE is also working with Air Asia and the Department of Civil Aviation (the regional equivalent of the FAA) to roll out a GPS-based flight path program at 15 Malaysian airports, and another eight in Thailand and Indonesia. The goal is to improve their efficiency and possibly increase capacity.
While GPS does not sound revolutionary in other contexts, keep in mind that most aircraft still use radio beacons to determine their position. The new system, called Required Navigation Performance (RNP), was first designed by Alaska Airlines pilot Steve Fulton after going through many sweat-soaked night landings at the mountain-rimmed airport in Juneau, AK. It was further developed by GE Aviation.
“The inspiration was both frustration and concern,” Fulton said. “As pilots in southeast Alaska, we were regularly operating in difficult weather conditions with limited navigation aids. We understood that there was very little margin for error. We had training, experience, and the best in that generation of ground-based navigation equipment and the associated aircraft instrumentation. But still, even with all of that, there were times when a pilot could be put in a very tight spot.”
The system has since helped open up airports in the Himalayas, mountainous southern New Zealand, hilly downtown Rio de Janeiro and elsewhere around the world.
Subscribe to GE Reports and stay tuned for more aviation coverage from the Farnborough Airshow, which is taking place this week.
There is more to the Internet of Things (IoT) than Fitbits and smartphone-controlled thermostats. While consumer goods are some of the IoT’s most visible applications, they’re just one part of the vast and game-changing phenomenon that could soon encompass 200 billion connected devices and add trillions of dollars to the economy.
In fact, experts estimate that the IoT will resonate strongly in the “invisible” industrial sector, capturing and analyzing data generated by drilling rigs, jet engines, locomotives and other heavy-duty machines.
This network is called the Industrial Internet and it’s already helping companies shave costs and boost performance. Union Pacific, America’s largest railroad company, has improved productivity by wiring its locomotives with sensors that monitor parts and supply data to algorithms that try to predict whether a component might break down and when. “Industrial data is not only big, it’s the most critical and complex type of big data,” says Jeff Immelt, chairman and CEO of GE. “Observing, predicting and changing performance is how the Industrial Internet will help airlines, railroads and power plants operate at peak efficiency.”
GE is developing sensors that could be printed inside machines. Top Image: Maintenance crews can already gather data from jet engines like the GEnx.
GE is betting big on the Industrial Internet. The company believes the network could add between $10 trillion and $15 trillion, the size of today’s U.S. economy, to global GDP over the next 20 years. Its software arm has developed a software platform called Predix that allows Union Pacific, as well as oil drilling companies, wind farms, hospitals and other customers to perform prognostics, reduce downtime and increase efficiency.
Capturing Big Data and transmitting it to dedicated servers presents its own set of technological and logistical challenges. That’s why GE, AT&T, Cisco and IBM teamed up this spring to launch the Industrial Internet Consortium. The goal of this open, not-for-profit group is to break down technology silos, improve machine-to-machine communications and bring the physical and digital worlds closer together.
To do that, member companies will pool their R&D capabilities to develop common server architectures and advanced test beds to standardize key components of the Industrial Internet.
Bill Ruh, vice president of global software at GE, recently told Mike Barlow of the O’Reilly Radar blog that turning data into usable insights will require an industry-wide effort – channeled by organizations like the IIC – to produce standardized infrastructure and processes that are fast, accurate, reliable and scalable.
Massive gas turbines are also getting connected to the Industrial Internet.
While the possibilities of the Industrial Internet are just beginning to be harnessed, companies aren’t waiting around. In a speech to power company executives, Wall Street analysts and investors at the Electrical Products Group Conference this spring, GE’s Immelt said that by the end of the year, he expected GE to launch over 40 “Predictivity” industrial analytical applications, which could generate more than $1 billion in revenue for the company.
The Internet is no longer just about email, e-commerce and Twitter, says Joe Salvo, manager of the Complex Systems Engineering Laboratory at GE Global Research. “We are at an inflection point,” he says. “The next wave of productivity will connect brilliant machines and people with actionable insight.”
GE will partner with the innovative Southern California venture capital firm Frost Data Capital on a business incubator focused on machine data, predictive analytics and the Industrial Internet. “Frost is looking to incubate really big problems that drive revolutionary change,” says Bill Ruh, vice president of GE Software. “That’s how we found them.”
The incubator, called I3 for Industrial Internet Incubator, practices lean methodology techniques that allow its startups to scale quickly. GE and Frost Data executive teams will start by identifying ideas for potential new businesses. “We don’t look at outside business plans like a typical VC,” says John Vigouroux, managing partner and president at Frost Data Capital. “We validate the ideas through the marketplace and with real customers. This is our expertise. If the problems are genuine opportunities for startups, we’ll start a company and hire a CEO.”
Vigouroux says that this approach allows Frost Data to start companies with a small amount of capital and operate them more predictably. “It’s a great model, not a home run-or-bust model,” he says. “Since we don’t need gobs and gobs of money, we can do singles, doubles, triples, whatever works for the customer. We think that seven out of 10 startups could grow up to be successful.”
The partners did not disclose the amount of funding behind I3, but Vigouroux says that the incubator is designed to participate in any financing round. I3 will be incubating and taking to market several companies simultaneously. Big Data pioneer Stuart Frost, who started Frost Data after he sold his company Datallegro to Microsoft for $275 million in 2008, calls this model “parallel entrepreneurship.”
Besides business ideas, GE brings to the incubator its expertise in predictive analytics and experience with “intelligent” machines connected to the Industrial Internet. “We will bring them the problems, the customers and industrial capability,” Ruh says. “We believe that this will move the [analytics] market even faster and give customers what they are looking for. The first mover advantage is also very important.”
Frost Data has started 17 companies so far, with GE investing in three of them. Ruh says that GE and Frost Data have already started five new companies over the last 60 days alone. He said that a big company like GE could not move so fast by itself. “We’ve discovered someone who is really good at this,” Ruh says.
The initiative could have a multiplier effect on the growth of the Industrial Internet. The Wall Street Journal pointed out yesterday that the value of the Industrial Internet will increase as more people and companies get connected to it, following Metcalfe’s law.
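Metcalfe's law, mentioned above, says a network's value grows roughly with the number of possible connections between its participants, which for n parties is n(n-1)/2. A minimal sketch of why growth compounds:

```python
# Metcalfe's law: the value of a network scales with the number of
# distinct pairwise links among its n connected participants.

def metcalfe_value(n: int) -> int:
    """Number of distinct pairwise links among n connected parties."""
    return n * (n - 1) // 2

# Doubling participation more than doubles the potential value:
print(metcalfe_value(100), metcalfe_value(200))  # 4950 19900
```

This is why each new company joining the Industrial Internet makes the network more valuable to everyone already on it, not just to the newcomer.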
"Just like the development of the commercial Internet, that will require investment and innovation from lots of different people and companies," the paper wrote.
Pathologists are doctors and scientists who study tissue, blood and other biological specimens to find the cause of illness and a route to treatment. Many of them still wield a microscope as their main weapon. They load it with a sample slide, analyze the contents through the eyepiece and dictate their findings to a voice recognition system or an administrative assistant who transcribes them into a report.
But do it dozens of times a day and the job can become a pain, literally. “Every time I reach for a new slide, I have to take my eyes off the lens and check the forms for that case,” says Ian Cree, a pathology professor at Warwick Medical School in Coventry, UK. “You can get a sore neck from hours at the microscope.”
This traditional way of studying samples is now making researchers like Cree wonder whether it’s time to put the microscope away for good. Two years ago, Cree and his team started testing a new digital system that allows them to scan in images of tissue slides and patient histories, attach matching barcodes and upload everything to a massive computer database. Cree now views his samples on a computer monitor and controls their flow with his mouse.
The picture above shows how Omnyx’s Image Analysis Application can help with measurements for Dako HercepTest™, a test commonly used by pathologists in assessing treatment options for breast cancer patients. Top image: Skin melanoma showing digital measurements of Breslow’s Depth and distance to margins.
The system is efficient in other ways, too. After biomedical scientists scan the slides, which hold prepared biopsied tissues some 5 microns thick, the samples don’t travel to Cree’s desk as they once did, but instead go straight back to storage. This makes it easier to preserve and keep track of slides while their images are viewed by the pathologist to find the diagnosis.
The technology allows one pathologist to study around 150 slides a day, increasing efficiency in the lab by about 13 percent. “Digital pathology puts everything directly on the screen in front of you, including the paperwork,” Cree says. “Everything is linked and I can even collaborate with my colleagues without stepping out into the corridor. It’s much quicker and better for everyone, including the patient.”
The system in Cree’s office is called Omnyx Integrated Digital Pathology*. It was developed by Omnyx LLC, a joint venture between GE Healthcare and the University of Pittsburgh Medical Center formed in 2008. Omnyx takes advantage of the power of the Industrial Internet, connectivity and data analysis. It could allow doctors to reach beyond hospital walls and create global “pathology networks.” “We can connect doctors in rural and underfunded hospitals with pathology experts,” says Omnyx CEO Mamar Gelaye. “The technology helps eliminate access as a variable in quality of care.”
Similar Big Data systems are already helping doctors in Sweden to analyze X-ray images of rural patients and improving diagnostics in Washington State.
Omnyx first scans samples with a high-resolution camera and stores the images in a digital archive. Pathologists can access the archive in real time and pull up the desired samples.
The system can be easily scaled up from just one lab to a hospital or even an entire healthcare network. Doctors can use it to collaborate with peers and specialists, improve the accuracy and speed of diagnosis, and quickly obtain second opinions.
Once Omnyx has gathered enough data, the system could also help pathologists mine it for hidden correlations. “The human eye is extremely good at looking for patterns on a slide,” Cree says. “But we are very bad at determining how much of something is on it. That’s where digital pathology adds value for the patient.”
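Cree’s point about quantification is easy to illustrate with a toy example: software can count stained pixels on a digitized slide consistently, where an eye can only estimate. The Python sketch below is purely illustrative; the patch, threshold and method are invented for this example and are not Omnyx’s actual image-analysis algorithm.

```python
# Toy illustration of quantifying "how much of something" is on a slide:
# count what fraction of a digitized patch is positively stained.
# The 4x4 grayscale "image" and the threshold are invented for the example.

def stained_fraction(pixels, threshold=128):
    """Return the fraction of pixel intensities at or above threshold."""
    flat = [p for row in pixels for p in row]
    stained = sum(1 for p in flat if p >= threshold)
    return stained / len(flat)

# A made-up grayscale patch (0 = unstained, 255 = heavily stained).
patch = [
    [200, 40, 190, 30],
    [35, 210, 25, 180],
    [220, 20, 45, 195],
    [30, 185, 205, 40],
]

print(stained_fraction(patch))  # 8 of 16 pixels >= 128 -> 0.5
```

A human reviewer would struggle to reproduce the same estimate twice; the count is deterministic, which is the advantage Cree describes.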
There are, for example, nine grades of prostate cancer tumor. The grade a doctor gives a tumor determines whether a patient receives radical surgery or a more conservative drug treatment. But Cree says that pathologists are about 60 percent successful in identifying the right grade when they use only a microscope and their eyes.
Machine vision and software analysis could improve the odds. “It may be that in the future we find the optimum cut for radical surgery is a tumor grade 4.3 and above, rather than 4, so that we are able to refine our advice and offer a more individualized level of care to patients,” says Dr. David Snead, the pathologist who leads the Omnyx implementation at Coventry and Warwickshire Pathology Services.
The Coventry team has used the new system to study samples from 300 patients. They are now in the middle of a 10-month-long validation study that will include 3,000 cases. The team will compare results generated with Omnyx to those obtained using only a microscope to ensure there are no discrepancies.
“We know pathology will evolve, and our solution is committed to stand ready for that transformation,” says Omnyx CEO Gelaye. “We want to use information technology to bring patient pathology cases to the right pathologist for the right diagnosis at the right time.”
* Omnyx Integrated Digital Pathology is a registered trademark.
Blowout preventers, or BOPs, are incredibly complex machines that weigh 750,000 pounds and tower 60 feet above seafloor oil wells. They serve as the last line of defense in case anything goes wrong. It takes workers about 18 months to build one and they serve for as long as 30 years.
Keeping them in shape all those years can get expensive, and the stakes are high: it can cost as much as $16 million just to retrieve a BOP for servicing, and each day a rig is not drilling can cost the offshore operator $3 million in lost productivity. To stay safe, maintenance crews follow a rule of thumb and often replace as many as 20 percent of BOP parts every time one comes to the surface, which means they effectively rebuild an entire BOP every five years.
But there is a more scientific way to service BOPs. GE engineers recently developed a predictive maintenance system called SeaLytics BOP Advisor that collects and analyzes data about pressures, valve positions and flows from underwater sensors attached to the machines. The system, not thumbs, helps customers determine what needs to be fixed and when. “Telling a customer what to fix after it has failed is relatively easy,” says Bob Judge, director of product management at GE Oil & Gas. “Telling them to fix something before it costs them money is the magic.”
SeaLytics is one example of how GE combines the power of data analytics with the Industrial Internet to expand its services offering. People rightly consider GE a builder of “big iron” like jet engines, locomotives and gas turbines, but services already account for a stunning 75 percent of the company’s profits.
GE believes that Big Data will push that share even higher. “Industrial data is not only big, it’s the most critical and complex type of big data,” says Jeff Immelt, GE chairman and CEO. “Observing, predicting and changing performance is how the Industrial Internet will help airlines, railroads and power plants operate at peak efficiency.”
Immelt is speaking today at the Electrical Products Group Conference, an annual gathering of industrial executives, Wall Street analysts and investors in Longboat Key, FL. He will talk about “reinventing services for the future” and the role that the combination of hardware and software will play in reducing unplanned downtime and making customers more productive (see infographic).
Immelt says that GE will have launched over 40 “predictivity” analytical applications by the end of the year. He expects them to generate more than $1 billion in revenues.
Applications like Sealytics for BOPs, Trip Optimizer for locomotives, and Wind Power Up for wind turbines are built on Predix, a cloud-based software platform designed by GE software engineers specifically for the Industrial Internet. Ultimately, workers will use them to access performance data and monitor machines remotely from anywhere and at any time.
The Internet is no longer just about email, ecommerce or Twitter, says Joe Salvo, manager of the Complex Systems Engineering Laboratory at GE Global Research. “We are at an inflection point. The next wave of productivity will connect brilliant machines and people.”
Wind is growing up. A recent survey of the industry found the average size of commercial turbines has grown 10-fold in the last 30 years, from diameters of 50 feet in 1980 to nearly 500 feet today. Turbines with larger rotors harness more wind and generate more power “without proportional increases in their mass or the masses of the tower and the nacelle that houses the generator,” according to the report.
Engineers at GE’s wind business just came up with a new method to lengthen existing wind blades and increase the rotor diameter by 40 percent. The longer blades allow turbines to harness wind moving at lower speeds and boost power production by more than 20 percent.
Images: GE recently sent six Instagram photographers to climb 264 feet up a wind turbine in Cape Cod, Mass., and share their view with the world at #GEInstaWalk2014.
Mark Johnson, engineering leader at GE Renewable Energy, said that his team tapped GE’s expertise in engineering aerodynamics, materials science, structural engineering and controls. They found a way to cut a standard 37-meter [120 feet] blade roughly in half and insert a 7-meter [23 feet] blade extension (see time-lapse video below). The research project has yielded 16 patent applications so far.
The extended blades have gone through tough tests exceeding requirements set by the International Electrotechnical Commission (IEC), including static strength tests and fatigue tests totaling more than 6 million cycles.
GE has invested more than $2 billion in renewable energy R&D. The research includes projects like developing blades covered with “tensioned fabric” that could be assembled on location. The new design would cut moving costs and make it easier to transport large blades, which can already stretch halfway across a football field.
Engineers are also using supercomputers to develop the ideal wind blade profile that’s both quiet and energy-efficient. “If you change the blade design to be quieter, you can spin the rotor faster to produce more power and still meet noise regulation standards,” says Giridhar Jothiprasad, a mechanical engineer with GE Global Research.
E.ON and EDP Renewables have recently started using GE’s “brilliant” wind turbines connected to the Industrial Internet. The technology could allow them to squeeze up to 5 percent in additional energy production from their existing turbine fleets.
Another innovation, the “space-frame” tower, will allow operators to place nacelles as high as 450 feet in locations that were previously hard to reach.
Says Johnson: “At GE, we take big swings to help our customers reach their goals and operate more successfully.”
GE said today it would acquire the cyber security company Wurldtech to expand its digital arsenal for protecting critical infrastructure and operations technology.
GE has started connecting jet engines, power plants, locomotives and other technology to the Industrial Internet, an emerging digital network that links machines, data and software with people.
Wurldtech’s security software is unique because it protects both the information technology (IT) that allows machines to think and the operations technology (OT), the hardware and software that monitors and controls machines and connects them to people. “The world of OT security needs to be foundationally different from traditional IT detection systems,” said Bill Ruh, GE vice president who runs GE Software. “Nine out of the world’s top 10 automation providers use Wurldtech security products.”
GE said in a statement that the acquisition will help the company boost the reliability of its Industrial Internet operations. “Securing connected machines has a unique set of complexities that are very different from protecting a datacenter,” Ruh said. “At GE, we are focused on software platform security, protecting critical infrastructure and helping to ensure the reliability of Industrial Internet operations for our customers and industries.”
Wurldtech is based in Vancouver, British Columbia. Its products like Achilles Test and Achilles Threat Intelligence help customers spot vulnerabilities in products and infrastructure, find their root cause and protect assets. The company’s systems are securing oil and gas operations, electric systems, and medical, nuclear and chemical plants around the world.
Blowout preventers, or BOPs, are incredibly complex machines that sit deep on the sea floor and serve as the last line of defense if something in the oil well goes wrong.
These 250,000-pound, 60-foot steel behemoths have to be regularly pulled up, inspected and serviced. As a rule of thumb, workers often replace as many as 20 percent of their parts to keep them safe, effectively rebuilding the entire machine every five years. “That’s one way to do it,” says Bob Judge, director of product management at GE Oil & Gas.
The other, smarter way involves connecting BOPs to the Industrial Internet, and collecting and analyzing the data they send up. “We need to move from the ‘break-fix’ model to a predictive maintenance model,” Judge says. “What if you had a technology gathering BOP data so that the next time you pull it out you know exactly what needs to be replaced and have the replacement parts available on the drilling rig?”
Judge says that it costs between $10 and $16 million to surface a BOP. Predictive maintenance could save drilling companies millions in unplanned downtime.
He compares such systems to the oil life indicator in a 2014 model car. “Not so long ago, you changed your oil every 3,000 miles,” he says. “But a new Chevy Tahoe will tell you that you have 27 percent of oil life remaining, based on mileage and other engine conditions that affect the oil. That’s predictive analytics.”
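The oil-life idea Judge describes can be sketched in a few lines of code: instead of a fixed 3,000-mile interval, remaining life is estimated from mileage adjusted by operating conditions. The Python function below is a hypothetical illustration; the formula, the base interval and the severity factor are invented for the example and are not GM’s actual algorithm.

```python
# Hedged sketch of condition-based "oil life": blend mileage with an
# engine-severity factor instead of using a fixed change interval.
# All numbers here are illustrative, not a real automaker's model.

def oil_life_remaining(miles_since_change, severity=1.0, base_interval=7500):
    """Percent of oil life left. severity > 1 models harsher conditions
    (short trips, towing, extreme temperatures) that consume life faster."""
    used = miles_since_change * severity / base_interval
    return max(0.0, round((1.0 - used) * 100, 1))

print(oil_life_remaining(4000, severity=1.37))  # prints 26.9
```

The same shift, from fixed schedules to estimates driven by measured conditions, is what predictive maintenance brings to a BOP.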
Judge and his team spent the last couple of years studying BOPs and data about wellbore and hydraulic system pressures, solenoid current draws, valve positions and other information. The data is already being supplied by sensors on BOPs and stored in a database called “datalogger.”
They used the data to develop a system called SeaLytics BOP Advisor that allows crews to monitor the health of BOP components and determine how many cycles they have gone through, what needs to be fixed and when. “When there is a problem, the drilling contractor will know within seconds,” Judge says.
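The cycle-counting logic Judge describes can be illustrated with a simple condition-monitoring check: compare each component’s accumulated cycles against a rated life and flag what to replace before the next pull. Everything in the sketch below (component names, rated lives, observed counts, the caution threshold) is invented for illustration and is not SeaLytics’ actual logic.

```python
# Illustrative cycle-based condition monitoring in the spirit of a
# predictive maintenance advisor. All names and numbers are hypothetical.

RATED_CYCLES = {"shear_ram": 500, "annular_seal": 300, "solenoid_valve": 1000}

def replacement_advice(cycle_counts, caution=0.8):
    """Return components at or past the `caution` fraction of rated life,
    mapped to their current usage ratio."""
    flagged = {}
    for part, cycles in cycle_counts.items():
        usage = cycles / RATED_CYCLES[part]
        if usage >= caution:
            flagged[part] = round(usage, 2)
    return flagged

observed = {"shear_ram": 460, "annular_seal": 150, "solenoid_valve": 990}
print(replacement_advice(observed))  # flags shear_ram and solenoid_valve
```

With an output like this in hand before a BOP surfaces, a crew could stage only the parts that actually need replacing, which is the savings case Judge makes.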
The technology is similar to GE systems that already monitor jet engines and locomotives. It is built on Predix, the first industrial-strength platform built by GE for machine analytics.
Judge said that he had the idea for SeaLytics when he saw a demonstration of myEngines, which allows airlines to remotely monitor the status of their jet engine fleets and streamline maintenance and repair. “I thought, what if we could take the same model that’s been proven to work and use it to benefit the drilling industry,” Judge says.
Two customers, the drilling contractors Atwood Oceanics and Brazil’s Queiroz Galvão Óleo e Gás (QGOG), already ordered the technology. “Telling a customer what to fix after it has failed is relatively easy,” Judge says. “Telling them to fix something before it costs them money is the magic.”
Last fall, the Kadlec Health System in Washington State started testing a new cloud-based technology that mashes up professional networking and diagnostics. The system allows doctors to create a professional profile, store patient images and data together in one place, view them from anywhere and access intuitive analytics. “It’s like LinkedIn professional networking meets diagnostic imaging,” said Jeanine Banks, general manager of Commercial Cloud Solutions at GE Healthcare IT, which developed the technology. “There is a lot of waste in the system. We want to help rein in the costs and make the system far more efficient.”
A recent study published in the Journal of the American Medical Association found that almost 40 percent of patients are misdiagnosed in primary care1. Another report by the American College of Physicians discovered that unnecessary testing and medical procedures, and extra days in the hospital caused by wrong diagnoses, could add up to $800 billion per year2, close to one-third of all U.S. healthcare costs.
On Monday, John Dineen, president and CEO of GE Healthcare, Bill Ruh, who runs GE’s Global Software Center, and Michael Leavitt, the former secretary of U.S. Health and Human Services, discussed the state of American healthcare and the ways to improve it with technology. Their panel, which was moderated by technology investor and philanthropist Esther Dyson, was part of GE’s Centricity LIVE conference focused on IT in healthcare.
Ruh and Dineen reminded everyone that over the last two decades many consumer-facing industries got thoroughly remade, and that healthcare won’t be any different. “There was an architectural shift of technology,” Ruh said. “We changed how we deliver and interact with music and books.”
Dineen said that the healthcare landscape was now also changing “from cost plus to profit and loss. The consumer will start making buying decisions,” Dineen said. “There’s going to be transparency. There is going to be a real focus on productivity and customer satisfaction and that’s going to require tremendous investment…The industry will pivot over the next few years.”
Industrial Internet systems like the GE technology that’s now working at Kadlec will be one driver of change. But former Sec. Leavitt said collaborative tools that bring together patients, insurers and providers will help distribute the risk associated with healthcare costs. “Exchanges will allow consumers to make trade-offs,” Leavitt said. “If you stay with me and get your body in a better shape, I’ll give you a better [insurance] price.”
Next-generation healthcare will also focus on outcomes. Dineen said that engineers used to be concerned chiefly with building better machines and “taking the technology to the next level.” But medical systems in the future will have to combine high quality and lower costs with results.
Dineen and Ruh stressed the need to focus on predictive analytics, which has started empowering other industries. Dineen said that in aviation, Industrial Internet systems can already see “a signature of a problem and get it fixed when [the aircraft] comes to a shop and not on a mountain top.”
“It’s not that you get this magic answer that something is going to break,” Ruh said. “You get early indicators. You still need to have experts in the loop.”
Dineen said that right now, the healthcare industry was going through “this clumsy period when the incentives have not kicked in” yet. He listed three stages of the IT revolution in healthcare that need to take place. They include connecting machines and digitizing data, getting data from siloes like primary care providers, as well as the “rich stage,” which involves analysis and learning from the data.
Researchers estimate that the majority of healthcare costs stem from treating preventable chronic conditions rather than from disease prevention and early detection. Dineen called the status quo “unproductive.” The new system will have the rewards and the incentives to change that, he said.
Top image: The Revolution CT scanner is one of many devices designed by GE engineers to help doctors improve healthcare, advance medical knowledge and fix “broken” hearts.
1 Journal of the American Medical Association, 2012
2 Reuters, citing a study by the American College of Physicians