It’s too loud for me to hear inside the Cupertino coffee bar, but Achin Bhowmik says it doesn’t bother him. He’s got a superpower, he says. If I look closely—very closely—I can see the tiny plastic tubes reaching from his ear canals to small devices hidden behind his ears. The hearing aids run machine-learning algorithms that continuously monitor his “acoustic environment” to help him hear what he wants to hear. In the coffee shop, the devices classify this as a “speech in noise” situation: they automatically dampen the background chatter and espresso machines while steering four directional microphones (two in each device) to amplify my voice instead.