Apple Training Siri to Better Understand People With Atypical Speech


Apple is researching how to improve Siri to better understand people who talk with a stutter, according to new details shared by The Wall Street Journal in a piece on how companies train voice assistants to handle atypical speech.

Apple has built a bank of 28,000 audio clips from podcasts featuring people who stutter, which could be used to train ‌Siri‌. The data that Apple has collected will improve voice recognition systems for atypical speech patterns, according to an Apple spokesperson.

Along with improving how ‌Siri‌ understands people with atypical speech patterns, Apple has also implemented a Hold to Talk feature for ‌Siri‌ that allows users to control how long they want ‌Siri‌ to listen for. This prevents ‌Siri‌ from interrupting users with a stutter before they’re finished speaking.

‌Siri‌ can also be used without voice altogether, through a Type to Siri feature that was first introduced in iOS 11.

Apple plans to outline its work to improve ‌Siri‌ in a research paper set to be published this week, which will provide more details on the company’s efforts.

Google and Amazon are also working to train Google Assistant and Alexa to better understand all users, including those who have trouble using their voices. Google is collecting atypical speech data, and in December, Amazon's Alexa Fund backed an effort to let people with speech impairments train an algorithm to recognize their unique vocal patterns.
