Google begins to stake out its place in healthcare

It’s no secret that Alphabet is knee-deep in healthcare projects.

Through acquisitions, new hires, partnerships and research efforts, Alphabet, parent company of Google, is using its expertise to organize and analyze enormous volumes of healthcare data.

In mid-November, DeepMind—a London-based Alphabet company focused on AI research—handed off its health team to a newly formed Google division, called Google Health. Former Geisinger President and CEO David Feinberg, MD, is overseeing the new division, which he officially joined this month.

Feinberg also will collaborate with leaders at other Alphabet companies working in healthcare, such as Verily Life Sciences, Jeff Dean, a senior fellow at Google, explained on Twitter.

The decision to hire Feinberg followed other prominent hires at Alphabet last year. This past summer, former Cleveland Clinic CEO Toby Cosgrove, MD, became an executive adviser to the Google Cloud healthcare and life sciences team. Meanwhile, another Alphabet company, Verily Life Sciences, hired Vivian Lee, a radiologist and former CEO of University of Utah Health, in May as president of health platforms.

“If you look at the key people they are hiring, those are people who not only understand data but also are influential and have connections. It doesn’t take much of a jump to say: You have to have the people who are networked and who understand the healthcare industry to help drive out this new focus on the healthcare arena,” says Bob Fuller, managing partner for healthcare at Clarity Insights, an IT consulting firm focused on data analytics. Analysts expect Alphabet’s IT prowess to have a significant impact on the work of healthcare providers in numerous ways, such as:

· Pushing data standards and, thus, increasing interoperability between disparate systems, such as electronic medical records and PACS.
· Providing a cloud-based platform, so that organizations can store, organize and analyze data using artificial intelligence.
· Using advanced data analysis techniques, including AI, to provide insights into how to detect, treat and prevent diseases, such as diabetes or cancer.
· Automating documentation processes to ease the burden on providers.

Arielle Trzcinski, senior analyst at Forrester Research, says that Alphabet is responding to societal pressure to reduce costs and improve quality in healthcare. “There is a need. I think we are seeing a need coming from consumers, coming from providers, coming from health plans, and vendors are rising to try to meet that need,” she says.

Alphabet—which was created in a 2015 restructuring of Google—is approaching the healthcare sector through various segments of the company, including Google Cloud; Google Brain Team, an AI research organization; and Verily Life Sciences, which uses technology to improve healthcare outcomes.

Interoperability
Google Cloud’s aim is to provide storage, computing and AI resources to customers, which now include the Cleveland Clinic and Stanford Medicine.

“It really gets to the heart of the power of data to advance healthcare, with all that data exponentially increasing now,” says Greg Moore, MD, a neuroradiologist and vice president for healthcare and life sciences at Google Cloud, referring to data from sources such as electronic medical records, PACS and genomic sequencing.

That is why Google Cloud is promoting interoperability through application programming interfaces.

For example, in 2016 Google purchased Apigee, an API management company whose platform now runs on Google Cloud. Apigee has several marquee healthcare customers, including Walgreens, the Cleveland Clinic and Rush University Medical Center. In addition, the company launched Apigee Healthcare APIx, a tool that includes 16 preconfigured APIs based on HL7’s Fast Healthcare Interoperability Resources (FHIR) data standard.

Fuller says that the work of Google and other large tech firms to increase interoperability is “helping to propel the adoption of FHIR and is going to be a key piece of allowing us to have better access to data. The other thing this is going to drive is more open architectures with some of the large EMR vendors, like Epic and Cerner, where they have been somewhat closed in the past. They are starting to adopt this strategy of opening up for APIs.”

Google also is touting its Cloud Healthcare API, currently in alpha, as a standards-based “bridge” between applications built in Google Cloud and data not only from electronic health records but also from other sources, such as PACS.
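To make the interoperability idea concrete, here is a minimal sketch of the RESTful pattern FHIR defines: a client searches for resources at `{base}/{resourceType}?{params}` and the server answers with a JSON “Bundle” of matches. The server URL below is hypothetical, and the helper functions are illustrative, not part of any Google or HL7 library.

```python
import json

# Hypothetical FHIR server base URL for illustration only.
FHIR_BASE = "https://fhir.example-hospital.org/r4"

def build_search_url(resource_type, **params):
    """Build a FHIR search URL, e.g. for all Patients with family name 'Smith'."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"{FHIR_BASE}/{resource_type}?{query}"

# A FHIR server responds to a search with a JSON "Bundle" of matching
# resources; this sample mirrors that shape for a one-patient result.
sample_bundle = json.loads("""{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [
    {"resource": {"resourceType": "Patient",
                  "name": [{"family": "Smith", "given": ["Jan"]}]}}
  ]
}""")

def patient_family_names(bundle):
    """Extract family names from the Patient resources in a search Bundle."""
    return [entry["resource"]["name"][0]["family"]
            for entry in bundle.get("entry", [])
            if entry["resource"]["resourceType"] == "Patient"]

print(build_search_url("Patient", family="Smith"))
print(patient_family_names(sample_bundle))
```

Because every conformant server exposes this same resource-and-search shape, an application written against one FHIR endpoint can, in principle, read from another—which is the interoperability argument the vendors above are making.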

Euan Ashley, MD, a cardiologist and co-director of the Stanford Medicine Clinical Genomics Program, a Google Cloud customer, says, “I think Google’s approach is to play to its strengths. The cloud allows them an infrastructure to store and organize data but also the computing resources to run AI algorithms.”

Artificial intelligence
In addition to selling Google Cloud’s technology to external customers, Google and other Alphabet companies have been using those computing resources combined with expertise in AI to uncover new insights into detecting and predicting the onset of diseases.

For example, the former DeepMind health team, now part of Google Health, built a mobile phone app, called Streams, which is designed to help physicians and nurses spot life-threatening conditions in hospitalized patients. The app currently is being used at some hospitals in the United Kingdom’s National Health Service to detect and treat acute kidney injury. But the DeepMind/Google team would like to upgrade that app with alerts for other serious conditions, such as sepsis, and to allow clinicians to review patients’ vital signs and record that information directly into the app, according to DeepMind’s website.

The app’s algorithm currently is not based on machine learning, but that also is an avenue for future improvement, according to the website.

Google also is partnering with other organizations to improve medical care.

Researchers at Google, the University of Chicago, the University of California, San Francisco, and Stanford University used deep learning methods to predict in-hospital mortality rates, 30-day unplanned readmission rates, prolonged length of stay and discharge diagnoses. The deep learning models—which analyzed a total of 46,864,534,945 data points from de-identified patient records—outperformed traditional analytic methods, the researchers said in a May 2018 article in the journal npj Digital Medicine.

Google also has been using machine learning to diagnose diseases. In ophthalmology, Google researchers used machine learning to detect signs of diabetic retinopathy in retinal scans, while in pathology, they used machine learning to develop an algorithm to detect metastatic breast cancer from digital images.

Moore says these projects demonstrate that it is possible for machine learning to analyze digital images and diagnose disease on par with medical specialists, such as pathologists and ophthalmologists.

In the case of ophthalmology specifically, he adds, machine learning could provide patients with access to an evaluation for diabetic retinopathy in geographic areas where there are shortages of trained specialists, such as India.

Google is using AI not only in disease detection, treatment and prevention, but also to automate the process of creating progress notes in electronic health records. In a pilot project, called digital-scribe, Google and Stanford Medicine are using machine learning and voice recognition to create electronic notes by analyzing recordings of conversations between clinicians and patients during office visits.

The goal is to “relieve doctors of a lot of the documentation burden that they have today. And you would also then probably get richer and more interesting information into the medical record,” Dean, who leads Google Brain Team, told an audience at Google Cloud Next 18 in July.

Commercial potential
When asked about the potential commercial applications of the healthcare-specific research at Google, Moore says, “We really don’t comment on specific products in the road map.” Moore also noted that research projects “sometimes become products and sometimes they don’t.”

However, analysts envision commercial products emanating from this work. “When we think about natural language processing or machine learning, there is a lot of applicability to do predictive analytics or prescriptive analytics. We can help identify those patients who are at high risk or rising risk and get that patient the right care at the right time,” Trzcinski says. “I think that is one of the key value propositions of the work that they are doing, and why I think they are going to do this.”

Trzcinski says the predictive and prescriptive models could be “purchased by vendors, providers or health plans. They could license some of these models that are being developed, or you could see them become embedded in the vendors’ solutions.”

Fuller notes that Alphabet needs access to its own sources of patient data to develop AI-powered algorithms that solve specific problems in the delivery of healthcare services. “Deriving these deep insights is going to be all about access to data,” he says.

Verily Life Sciences is addressing this problem by partnering with Stanford Medicine and Duke University School of Medicine on Project Baseline, which is designed to discover new insights about human health and the onset of diseases.

The first step is a study to collect and analyze a comprehensive set of health data on a diverse group of 10,000 people over a multi-year period. The data—which is hosted on Google Cloud—includes information from clinical office visits, imaging studies, self-reports, sensor readings and biospecimen collection.

As the Project Baseline partners explain on a public-facing website, the project will use participants’ de-identified data to develop a database and advanced tools for […]
