Want a Diagnosis Tomorrow, Not Next Year? Turn to AI

With medical knowledge from thousands of doctors pooled into a collective AI, your GP can order tests or prescribe medications they’d normally outsource.

Inside a red-brick building on the north side of Washington DC, internist Shantanu Nundy rushes from one examining room to the next, trying to see all 30 patients on his schedule. Most days, five of them will need to follow up with some kind of specialist. And odds are they never will. Year-long waits, hundred-mile drives, and huge out-of-pocket costs mean 90 percent of America’s most needy citizens can’t follow through on a specialist referral from their primary care doc.

But Nundy’s patients are different. They have access to something most people don’t: a digital braintrust of more than 6,000 doctors, with expert insights neatly collected, curated, and delivered back to Nundy through an artificial intelligence platform. The online system, known as the Human Diagnosis Project, allows primary care doctors to plug into a collective medical superintelligence, helping them order tests or prescribe medications they’d otherwise have to outsource. Which means most of the time, Nundy’s patients wait days, not months, to get answers and get on with their lives.

In the not-too-distant future, that could be the standard of care for all 30 million people currently uninsured or on Medicaid. On Thursday, Human Dx announced a partnership with seven of the country’s top medical institutions to scale up the project, aiming to recruit 100,000 specialists—and their expert assessments—in the next five years. Their goal: close the specialty care gap for 3 million Americans by 2022.

In January, a single mom in her 30s came to see Nundy about pain and joint stiffness in her hands. It had gotten so bad that she had to stop working as a housekeeper, and she was growing desperate. When Nundy pulled up her chart, he realized she had seen another doctor at his clinic a few months prior, who had referred her to a specialist. But once the patient realized she’d have to pay a few hundred dollars out of pocket for the visit, she didn’t go. Instead, she tried to get on a wait list at the public hospital, where she couldn’t navigate the paperwork—English wasn’t her first language.

Now, back where she started, Nundy examined the patient’s hands, which were angrily inflamed. He thought it was probably rheumatoid arthritis, but because the standard treatment can be pretty toxic, he was hesitant to prescribe drugs on his own. So he opened up the Human Dx portal and created a new case description: “35F with pain and joint stiffness in L/R hands x 6 months, suspected RA.” Then he uploaded a picture of her hands and sent out the query.

Within a few hours a few rheumatologists had weighed in, and by the next day they’d confirmed his diagnosis. They’d even suggested a few follow-up tests just to be sure, and offered advice about a course of treatment. “I wouldn’t have had the expertise or confidence to be able to do that on my own,” he says.

Nundy joined Human Dx in 2015, after founder Jayanth Komarneni recruited him to pilot the platform’s core technologies. But the goal was always to go big. Komarneni likens the network to Wikipedia and Linux, but instead of contributors donating encyclopedia entries or code, they donate medical expertise. When a primary care doc gets a perplexing patient, they describe the patient’s background, medical history, and presenting symptoms—maybe adding an image of an X-ray, a photo of a rash, or an audio recording of lung sounds. Human Dx’s natural language processing algorithms mine each case entry for keywords to funnel it to specialists who can create a list of likely diagnoses and recommend treatment.
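To get a feel for what that routing step involves, here is a minimal, purely illustrative sketch in Python. It assumes a simple hand-built map from clinical keywords to specialties; Human Dx’s actual natural language processing is proprietary and far more sophisticated than a keyword lookup.

```python
# Hypothetical sketch of keyword-based case routing. The specialty/keyword
# map and the route_case function are assumptions for illustration only.
import re

SPECIALTY_KEYWORDS = {
    "rheumatology": {"joint stiffness", "joint pain", "swelling", "ra"},
    "cardiology": {"chest pain", "palpitations", "murmur"},
    "dermatology": {"rash", "lesion", "pruritus"},
}

def route_case(case_text: str) -> list[str]:
    """Return the specialties whose keywords appear in the case description."""
    text = case_text.lower()
    matches = []
    for specialty, keywords in SPECIALTY_KEYWORDS.items():
        if any(re.search(r"\b" + re.escape(kw) + r"\b", text) for kw in keywords):
            matches.append(specialty)
    return matches

print(route_case("35F with pain and joint stiffness in L/R hands x 6 months, suspected RA"))
# -> ['rheumatology']
```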

Now, getting back 10 or 20 different doctors’ takes on a single patient is about as useful as having 20 friends respond individually via email to a potluck invitation. So Human Dx’s machine learning algorithms comb through all the responses, checking them against the project’s previously stored case reports. The network uses that history to validate each specialist’s finding, weight it according to confidence level, and combine it with the others into a single suggested diagnosis. And with every solved case, Human Dx gets a little bit smarter. “With other online tools if you help one patient you help one patient,” Komarneni says. “What’s different here is that the insights gained for one patient can help so many others. Instead of using AI to replace jobs or make things cheaper, we’re using it to provide capacity where none exists.”
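In spirit, that aggregation step resembles confidence-weighted voting. Here is a small, illustrative sketch under that assumption—each specialist response carries a diagnosis and a self-reported confidence score—rather than a description of the platform’s actual models, which are proprietary.

```python
# Hypothetical sketch of combining specialists' assessments into one ranked
# differential via confidence-weighted voting; an assumption for illustration,
# not Human Dx's published method.
from collections import defaultdict

def aggregate_diagnoses(responses):
    """responses: list of (diagnosis, confidence in [0, 1]) tuples."""
    scores = defaultdict(float)
    for diagnosis, confidence in responses:
        scores[diagnosis] += confidence
    total = sum(scores.values()) or 1.0
    # Normalize so the combined scores read like relative likelihoods.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(dx, round(score / total, 2)) for dx, score in ranked]

responses = [
    ("rheumatoid arthritis", 0.9),
    ("rheumatoid arthritis", 0.7),
    ("psoriatic arthritis", 0.4),
]
print(aggregate_diagnoses(responses))
# -> [('rheumatoid arthritis', 0.8), ('psoriatic arthritis', 0.2)]
```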

Komarneni estimates that those electronic consults can handle 35 to 40 percent of specialist visits, leaving more time for people who really need to get into the office. That’s based on other models implemented around the country at places such as San Francisco General Hospital, UCLA Health System, and Brigham and Women’s Hospital. SFGH’s eReferral system cut the average waiting time for an initial consult from 112 days to 49 within its first year.

That system, which is now the default for every SFGH specialty, relies on dedicated reviewers who get paid to respond to cases in a timely way. But Human Dx doesn’t have those financial incentives—its service is free. Now, though, by partnering with the American Board of Medical Specialties, Human Dx can offer continuing education and improvement credits to satisfy at least some of the 200 hours doctors are required to complete every four years. And the American Medical Association, the nation’s largest physician group, has committed to getting its members to volunteer, as well as supporting program integrity by verifying physicians on the platform.

It’s a big deal to have the AMA on board. Physicians have historically been wary of attempts to supplant or complement their jobs with AI-enabled tools. But it’s important not to mistake the organization’s participation in the alliance for a formal pro-artificial intelligence stance. The AMA doesn’t yet have an official AI policy, and it doesn’t endorse any specific companies, products, or technologies, including Human Dx’s proprietary algorithms. The medical AI field is still young, with plenty of potential for unintended consequences.

Like discrepancies in quality of care. Alice Chen, the chief medical officer for the San Francisco Health Network and co-director of SFGH’s Center for Innovation in Access and Quality, worries that something like Human Dx might create a two-tiered medical system, where some people get to actually see specialists and some people just get a computerized composite of specialist opinions. “This is the edge of medicine right now,” Chen says. “You just have to find the sweet spot where you can leverage expertise and experience beyond traditional channels and at the same time ensure quality care.”

Researchers at Johns Hopkins, Harvard, and UCSF have been assessing the platform for accuracy and recently submitted results for peer review. The next big hurdle is money. The project is currently one of eight organizations in contention for a $100 million John D. and Catherine T. MacArthur Foundation grant. If Human Dx wins, they’ll spend the money to roll out nationwide. The alliance isn’t contingent on the $100 million award, but it would certainly be a nice way to kickstart the process—especially with specialty visits accounting for more than half of all trips to the doctor’s office.

So it’s possible that the next time you go in for something that stumps your regular physician, instead of seeing a specialist across town, you’ll see five or 10 from around the country. All it takes is a few minutes over lunch or in an elevator to put on a Sherlock Holmes hat, hop into the cloud, and sleuth through your case.