How Apple Is Putting Voices in Users’ Heads—Literally

Streaming calls and music straight to cochlear implants can empower those with hearing loss—and maybe, one day, everybody.

My conversation with Mathias Bahnmueller started as pretty much all my phone interviews do. “Can you hear me?” he asked, and I replied affirmatively. Then I asked him the same question. His answer was yes—he could hear me very clearly. And this was a tiny miracle.

That’s because Bahnmueller suffers from hearing loss so severe that a year ago he underwent surgery to install a cochlear implant—an electronic device in the inner ear that replaces the usual hearing mechanism. Around a million patients have undergone this increasingly mainstream form of treatment, and that’s just a fraction of those who could benefit from it. (Of the 360 million people worldwide with hearing loss, about 10 percent would qualify for the surgery.) “For those who reach a point where hearing aids no longer help, this is the only solution,” says Allison Biever, an audiologist in Englewood, Colorado, who works with implant patients. “It’s like restoring a signal in a radio station.”

Cochlear implants bypass the usual hearing process by embedding a device in the inner ear and connecting it via electrodes to the nerve that sends audio signals to the brain. The implant gets sound from an external microphone and sound processor that usually sits behind the ear. Until now, users have had to deal with balky remote controls to adjust the settings. And dealing with smartphones has required a separate piece of equipment that vexes communication thanks to its low quality and annoying lags. But Bahnmueller, a 49-year-old executive in automotive safety, has recently been testing a new solution. The reason I was coming through so clearly is that his over-the-ear device linked to the implant was streaming directly from his iPhone—essentially putting the conversation in his head.

The system he was using came from a collaboration between Apple and Cochlear, a company that has been involved with implant technology since the treatment’s early days. The firms announced last week that the first product based on this approach, Cochlear’s Nucleus 7 sound processor, won FDA approval in June—the first time that the agency has approved such a link between cochlear implants and phones or tablets. Those using the system can not only get phone calls directly routed inside their skulls, but also stream music, podcasts, audio books, movie soundtracks, and even Siri—all straight to the implant.

“While our devices have been built to support hearing aids for years, we found that the experience of people trying to make a phone call was not always a good one,” says Sarah Herrlinger, Apple’s director of global accessibility policy. “So we brought together a lot of people in different areas around the company to start investigating ways to make the process easier.” As she indicates, Apple’s accessibility team has been working for several years to support conventional hearing aids—an initiative whose results are apparent not only in the dozens of hearing-related products in the App Store but also in a Hearing Aid Mode built into the iOS settings. It connects with hearing aids whose manufacturers have adopted the free Apple protocols, earning them a “Made for iPhone” approval. Apple also has developed a feature called Live Listen that lets hearing aid users employ the iPhone as a microphone—which comes in handy at meetings and restaurants.

Taking on the task of making iPhones work with cochlear implants was harder. “Our goal was to get rid of all those extra things that need batteries and can get in the way, so when a phone call comes in you just hit the button to answer it and that sound is streaming into your hearing aids,” says Herrlinger. It wasn’t an easy process, because this solution required pushing the Bluetooth wireless technology farther than usual. To do this, Apple’s accessibility team—which spans the company’s entire product line—had to tap the talents of its engineering staff in wireless, battery consumption, and UI design. “It’s a different type of device, so we had to do more iteration,” says Eric Seymour, Apple director of accessibility engineering.

To solve the huge problem of streaming high-quality audio without quickly draining the tiny zinc-air batteries in hearing aids, Apple had previously developed a new technology called Bluetooth LEA, or Low Energy Audio. The company released that (but didn’t talk about it) when the first Made for iPhone hearing aids appeared in 2014. Previously, the low-energy standard for Bluetooth—called LE—was used, as its name implies, only for tasks that are parsimonious in sending data, such as getting readings from heart rate monitors and Fitbits. Apple says that LEA is the first use of the low-energy standard to stream high-quality music and voice while preserving LE’s battery-extending properties. “We chose Bluetooth LE technology because that was the lowest power radio we had in our phones,” says Sriram Hariharan, an engineering manager on Apple’s CoreBluetooth team. To make LEA work with cochlear implants, he says, “We spent a lot of time tuning our solution to meet the requirements of the battery technology used in the hearing aids and cochlear implants.” Apple understood that, as with all wireless links, some data packets would be lost in transmission—so the team figured out how to compensate for that and retransmit them as needed. “All those things came together to figure out how to actually do this,” says Hariharan.
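Apple hasn’t published the details of LEA, so the mechanics can only be sketched in general terms. The Swift snippet below is a minimal conceptual sketch of the loss-handling idea Hariharan describes: sequence-numbered audio frames let a receiver spot gaps, request retransmission, and ignore stale duplicates. Every type and property name here is hypothetical; this is not Apple’s protocol.

```swift
// Conceptual sketch only: Bluetooth LEA itself is proprietary.
struct AudioFrame {
    let sequenceNumber: UInt16   // rolls over; lets the receiver spot gaps
    let payload: [UInt8]         // compressed audio samples
}

final class FrameReceiver {
    private var expectedSequence: UInt16 = 0
    private var pending: [UInt16: AudioFrame] = [:]

    /// Sequence numbers the receiver still wants retransmitted.
    private(set) var missing: Set<UInt16> = []

    func receive(_ frame: AudioFrame) {
        // A retransmitted frame fills an earlier gap.
        if missing.remove(frame.sequenceNumber) != nil {
            pending[frame.sequenceNumber] = frame
            return
        }
        // Wraparound-aware check: drop stale duplicates of old frames.
        let distance = frame.sequenceNumber &- expectedSequence
        guard distance < 0x8000 else { return }
        // Any skipped sequence numbers become gaps to request again.
        var seq = expectedSequence
        while seq != frame.sequenceNumber {
            missing.insert(seq)
            seq &+= 1
        }
        pending[frame.sequenceNumber] = frame
        expectedSequence = frame.sequenceNumber &+ 1
    }

    /// Pull a frame for playback. If it never arrived in time, a real
    /// implementation would conceal the loss (repeat or interpolate the
    /// previous frame) rather than stall the audio stream.
    func nextFrame(for sequence: UInt16) -> AudioFrame? {
        pending.removeValue(forKey: sequence)
    }
}
```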


The user, of course, doesn’t have to deal with concepts like packet transmission—the controls are all in standard-looking settings, beginning with activation. Herrlinger says that an iPhone or iPod touch pairs with hearing aids—cochlear and conventional—the same way that it finds AirPods or nearby Bluetooth speakers. I wondered whether the process of making iPhones work with hearing aids might have actually affected the development of AirPods themselves. At that point, a wired-in allegiance to Apple secrecy—maybe employees have that implanted in their heads, like cochlear devices—kicked in, and none of the five Apple people on the call with me would shed light on the matter. (After the conversation, I did get some clarity: the Made for iPhone work was focused on accessibility, and AirPods do not use Bluetooth LEA.)
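For the curious, the low-energy discovery that underlies this kind of pairing resembles, in generic form, an ordinary CoreBluetooth scan. The sketch below uses only public CoreBluetooth APIs; the actual Made for iPhone hearing aid pairing happens inside the Settings app and isn’t driven by third-party code like this.

```swift
import CoreBluetooth

// A generic Bluetooth LE scan, not Apple's hearing aid pairing code.
final class PeripheralScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Scan for nearby LE peripherals; pass service UUIDs to filter.
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("Found \(peripheral.name ?? "unknown device"), RSSI \(RSSI)")
    }
}
```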

Another aspect of the system announced last week is that it supports what’s known as a “bimodal” setup, where hearing on one side comes from an implant and hearing on the other side comes from a conventional hearing aid. This is fairly common, as hearing loss is often worse on one side. (In other cases, our friends in the health insurance industry decide that, despite medical indications, one cochlear implant is plenty for a customer.) A third partner in the collaboration, ReSound, worked with Apple and Cochlear to make the integration seamless—and the setup can sometimes work to the user’s advantage. (In 2014, ReSound became the first company to sell a Made for iPhone hearing aid.) Herrlinger describes a situation in which a bimodal user is in a loud restaurant, dining with a person seated to her left. That user could drop the volume on the right-hand side, blocking out the din, and concentrate on the conversation.

Oh, and the Nucleus 7 uses Apple’s location technology to implement a Find My Processor feature, which will help parents of kids with implants locate the units after they’re jarred loose on the playground.

Though Cochlear—which, according to its senior VP for research and development Jan Janssen, has about half of the implant market—is the first to use the system, Apple will offer the technology free to qualified manufacturers. Right now it’s the only option of its kind: Google, whose Android system is Apple’s chief mobile operating system competitor, says that its accessibility team’s hearing efforts have so far focused on captioning. Hearing aid support, the company says, is on the roadmap, but there’s no public timeline for now.

Merging medical devices with consumer technology like Apple’s is a clear benefit to those needing hearing help. But I’m intrigued by some observations that Dr. Biever, the audiologist who’s worked with hearing loss patients for two decades, shared with me. She says that with this system, patients can control their sound environment in a way that those with good hearing cannot—so much so that she is sometimes envious. How cool would it be to listen to a song without anyone else in the room hearing it? “When I’m in the noisiest of rooms and take a call on my iPhone, I can’t hold my phone to my ear and do a call,” she says. “But my recipient can do this.”

This paradox reminds me of the approach I’m seeing in the early commercial efforts to develop a brain-machine interface: an initial focus on those with cognitive challenges with a long-term goal of supercharging everyone’s brain. We’re already sort of cyborgs, working in a partnership of dependency with those palm-size slabs of glass and silicon that we carry in our pockets and purses. The next few decades may well see them integrated subcutaneously.

I’m not suggesting that we all might undergo surgery to make use of the tools that Apple has developed. But I do see a future where our senses are augmented less invasively. Pulling out a smartphone to fine-tune one’s aural environment (or even sending vibes to a brain-controlled successor to the iPhone) might one day be as common as tweaking bass and treble on a stereo system.

For now, the implant and Apple’s new technology are more than enough for Mathias Bahnmueller. Before he had the surgery, Bahnmueller’s hearing difficulties were getting in the way of his job—he was unable to follow presentations at board meetings, for instance. They cut him off from his loved ones, too. He would ask his 10-year-old daughter to repeat what she’d said, and when she’d answer, “Never mind, it wasn’t important,” he’d be devastated.

Now that he has the implant, he can hear his daughter the first time she speaks. Using his new device, he listens to audiobooks streamed directly to his skull. And when he recently went to a noisy brewpub on date night with his wife, he pulled out his phone, changed the settings, and focused only on what she said.

Everyone else in the place was probably shouting to be heard. But the guy with the implant could hear his wife’s voice very clearly.

