Business Report
IBM Aims to Make Medical Expertise a Commodity
Big Blue thinks its Jeopardy! champion Watson can make money by giving health-care providers access to new expertise without their having to hire new staff.
U.S. cancer care is headed for a crisis, warned the American Society of Clinical Oncology in March. Cancer cases are projected to soar 42 percent by 2025 as America’s population ages, but the number of oncologists trained to treat them will grow by only 28 percent. That mismatch is likely to exacerbate existing inequalities in care between the fraction of patients treated by specialists at major academic centers and the many more who get care at community clinics or hospitals, mainly from general oncologists.
Enter a game show champion to save the day.
An attempt to transform cancer care is a major part of IBM’s efforts to make money from its Jeopardy!-winning Watson software. The company aims to offer health-care organizations a cheaper way to improve care by turning oncology expertise into a commodity.
This effort to break humans’ monopoly on cancer expertise is the advance guard of a model that IBM hopes it can eventually roll out across many areas of medicine. It is also the first real test of the company’s claims that Watson can move beyond Jeopardy! and earn money.
Whether Watson passes the test could be critical to IBM. The company’s revenue has declined for two years as technology’s shift to the cloud has left some of its core products behind. CEO Ginni Rometty’s promise to spend $1 billion on a new business group dedicated to commercializing Watson is just about the only turnaround prospect in sight.
IBM and collaborators are building two versions of Watson trained in oncology. Memorial Sloan Kettering Cancer Center, in New York, is beta-testing a version for lung, colorectal, and breast cancer. The University of Texas MD Anderson Cancer Center, in Houston, will use one this summer that advises its new fellows on treatments for leukemia. Both help oncologists decide on a treatment plan by ingesting the patient’s medical records and pairing that information with knowledge from medical journals, textbooks, and treatment guidelines.
Lynda Chin, a professor of genomic medicine at MD Anderson and a leader of the center’s Watson project, anticipates that in the future that kind of product will be highly valued by general oncologists and regional cancer practices. “Physicians are too burdened with paperwork and squeezed on revenue to keep up with the latest literature,” she says. That limits the care physicians can deliver, and it has financial consequences: “If you can’t make a decision based on your own knowledge, you have to refer the patient out, and that’s going to hurt your bottom line.”
A version of Watson to be tested this year with brain tumor patients from the New York Genome Center aims to provide oncologists with deep expertise in the new field of genomic medicine that would otherwise be expensive to obtain. This incarnation of Watson suggests treatment options based on details of the mutations detected in a person’s tumor by genomic sequencing. Using genome sequencing to direct cancer treatment is just becoming feasible thanks to the plummeting cost of the technology (see “Cancer Genomics”). But in practice, the challenges of interpreting genomic data keep it beyond the reach of most oncologists and clinics.
“It requires a heroic level of expertise and is entirely manual,” says Ajay Royyuru, director of the computational biology center at IBM’s Yorktown Heights lab. Doctors must chase down relevant research papers for the mutations they find in a patient’s tumor, try to understand how the mutations change the cancer cells’ physiology, and then work out which treatments could target the malfunctioning processes. Getting from a genome sequence to a treatment decision can take five to 10 months, says Royyuru—time that cancer patients can ill afford.
Using Watson, it takes minutes. Doctors need only load in the genomic data. A schematic is then generated showing which of the molecular processes inside a cell have been altered. An oncologist can explore those findings and click a button to see a list of possible treatments that would target the problem pathways.
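The workflow Watson automates can be pictured, in greatly simplified form, as a chain of lookups from mutated genes to altered cellular pathways to candidate therapies. The Python sketch below is only an illustration of that idea: the gene, pathway, and drug mappings are placeholder assumptions, and nothing here describes IBM’s actual implementation.

```python
# Illustrative sketch only: a toy lookup-based pipeline mimicking the workflow
# described above (mutations -> altered pathways -> candidate treatments).
# The mappings are hypothetical placeholders, not a clinical knowledge base.

# Hypothetical gene-to-pathway annotations (placeholder data).
GENE_TO_PATHWAY = {
    "BRAF": "MAPK signaling",
    "EGFR": "EGFR signaling",
    "PIK3CA": "PI3K/AKT signaling",
}

# Hypothetical pathway-to-therapy options (placeholder data).
PATHWAY_TO_THERAPIES = {
    "MAPK signaling": ["BRAF inhibitor", "MEK inhibitor"],
    "EGFR signaling": ["EGFR tyrosine kinase inhibitor"],
    "PI3K/AKT signaling": ["PI3K inhibitor (investigational)"],
}


def suggest_treatments(mutated_genes):
    """Map a patient's mutated genes to altered pathways and candidate therapies."""
    report = {}
    for gene in mutated_genes:
        pathway = GENE_TO_PATHWAY.get(gene)
        if pathway is None:
            continue  # unannotated gene: a real system would flag it for expert review
        report.setdefault(pathway, set()).update(PATHWAY_TO_THERAPIES.get(pathway, []))
    return report


if __name__ == "__main__":
    # Example: a tumor with BRAF and PIK3CA mutations detected by sequencing.
    for pathway, therapies in suggest_treatments(["BRAF", "PIK3CA", "TP53"]).items():
        print(f"{pathway}: {sorted(therapies)}")
```

The hard part, of course, is not the lookup but building and maintaining the knowledge behind it, which is exactly the expertise Royyuru describes as heroic and manual today.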
Though technologically impressive, the Watson cancer projects are not yet delivering material returns to IBM shareholders or helping many cancer patients. Although the deals with medical centers are intended to lead to marketable products, they are for now R&D investments, says Michael Karasick, who leads R&D for the Watson group and was previously director of the company’s research lab in Almaden, California. “Revenue comes when the product hits the market,” he says.
Some already have. For example, a Watson-based system for the medical insurer WellPoint helps preauthorize requests for medical procedures. But Watson-based medical products haven’t been hitting the market at the rate IBM seems to have expected. A document leaked to the Wall Street Journal in January said that the Watson unit was falling behind on a projection that it would bring in $1 billion in revenue by 2018.
One problem is that Watson has struggled to accurately understand technical information (see “IBM Expands Plans for Watson”). It’s been flummoxed by medical jargon, the different ways researchers refer to the same thing in journal articles, and sloppy grammar in doctors’ jottings in patient files. Clinicians have had to spend more time than anticipated teaming up with IBM software developers to chase down the misunderstood acronyms or wrongly parsed sentences that caused Watson to misinterpret medical records or suggest incorrect treatments.
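Part of the remedy is mundane preprocessing: expanding clinicians’ shorthand and acronyms before a note is matched against the knowledge base. The sketch below, which uses a hypothetical abbreviation table, illustrates the kind of normalization step this work involves; it is an assumption for illustration, not IBM’s code.

```python
import re

# Illustrative sketch only: expand clinician shorthand before text matching,
# the kind of preprocessing the misread acronyms described above call for.
# The abbreviation table is a hypothetical placeholder, not IBM's vocabulary.
ABBREVIATIONS = {
    "pt": "patient",
    "hx": "history",
    "ca": "cancer",
    "mets": "metastases",
    "tx": "treatment",
}


def normalize_note(text):
    """Lower-case a clinical note and expand known abbreviations token by token."""
    tokens = re.findall(r"[a-zA-Z]+|\S", text.lower())
    return " ".join(ABBREVIATIONS.get(tok, tok) for tok in tokens)


print(normalize_note("Pt hx of breast ca, new liver mets, started tx last month"))
# -> patient history of breast cancer , new liver metastases , started treatment last month
```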
Michael Witbrock, vice president of research at the artificial-intelligence company Cycorp, says that IBM’s Jeopardy! winner was always going to need significant engineering to become an expert in any specific area. The game show calls for a mastery of general knowledge at a shallow level, not the kind of deep, layered expertise needed to treat cancer. “They went after industrial scope, not industrial depth,” says Witbrock.
Eric Brown, director of Watson technologies at IBM’s Yorktown Heights lab, says major changes to Watson, informed in part by feedback from the cancer projects, have helped it adjust to its new work. Although there is still a human training process, improved machine learning means Watson now requires less training to get good results, he says.
A company getting started with Watson today can make use of interfaces including one that involves clicking thumbs up or down next to its answers to test questions. In addition, a new team within IBM’s technical assistance group is dedicated to helping customers prepare data and use it to train Watson. Late last year the company launched a cloud-based platform where products can be built without having to bring IBM technology on site.
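One plausible way such thumbs-up/down judgments could feed back into the system is as labeled examples for re-ranking candidate answers. The sketch below illustrates that idea with a simple smoothed approval score; it is an assumption offered for illustration, not a description of Watson’s actual training pipeline.

```python
from collections import defaultdict

# Illustrative sketch only: fold explicit thumbs-up/down feedback into answer
# ranking via a smoothed approval rate. Not IBM's actual training method.

# Running tally of [up, down] votes per (question topic, answer) pair.
feedback = defaultdict(lambda: [0, 0])


def record_vote(topic, answer, thumbs_up):
    """Store a trainer's thumbs-up/down judgment for a candidate answer."""
    up, down = feedback[(topic, answer)]
    feedback[(topic, answer)] = [up + 1, down] if thumbs_up else [up, down + 1]


def rerank(topic, candidates):
    """Re-order candidate answers by their smoothed approval rate."""
    def approval(answer):
        up, down = feedback[(topic, answer)]
        return (up + 1) / (up + down + 2)  # Laplace smoothing for unseen answers
    return sorted(candidates, key=approval, reverse=True)


# Example: trainers approve answer B and reject answer A for one test question.
record_vote("lung-cancer-staging", "Answer B", thumbs_up=True)
record_vote("lung-cancer-staging", "Answer A", thumbs_up=False)
print(rerank("lung-cancer-staging", ["Answer A", "Answer B", "Answer C"]))
# -> ['Answer B', 'Answer C', 'Answer A']
```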
One thing those technical improvements haven’t done is shed any more light on whether renting out software that acts like a medical specialist can be a big business. Some people in the health-care industry are unsure.
The most successful products built on advanced data processing historically have been focused on managing costs and efficiency in populations of many patients, not improving what doctors do with individuals, says Russell Richmond, a board member for the health-care data company Explorys and previously CEO of McKinsey’s health-care division, Objective Health.
That kind of product speaks directly to profit margins and is in fact explicitly encouraged by the Affordable Care Act, which is reshaping the U.S. health-care industry. How products like the Watson-powered cancer advisors will make money is less clear. As Richmond puts it: “Helping a cancer patient get the best treatment is really good for humankind, but it may not generate a lot of profit.”