
A bug in the iPhone’s dictation feature has raised concerns after it inadvertently linked the word ‘racist’ with references to President Donald Trump.
Reports from The New York Times and other outlets on Tuesday revealed that when users attempt to transcribe the word ‘racist’ using the iPhone’s dictation feature, it temporarily displays ‘Trump’ before correcting to the intended word. Similar issues occurred with words like ‘rampant’ and ‘rampage,’ where ‘Trump’ would appear briefly before being corrected. The glitch gained attention on TikTok, with one user remarking, “This is crazy. When you say ‘racist,’ ‘Trump’ pops up.”
Apple acknowledged the issue with its speech recognition model, noting that it occasionally misidentifies phonetically similar words. The company confirmed it is aware of the problem within the dictation system and announced that it is rolling out a fix. However, some experts believe this may not be just a technical glitch. John Burkey, founder of AI startup WonderRush Ai and a former member of Apple’s Siri team, speculates that the problem may have originated with a recent Apple server update. While he believes the data collected for AI improvements is unlikely to be the cause, Burkey suggests that code somewhere within Apple’s system may have inadvertently been set to convert ‘racist’ to ‘Trump.’
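Burkey’s hypothesis is, in effect, that a word-level substitution rule sits somewhere in the transcription pipeline and fires before a slower correction pass catches it. The snippet below is a minimal, purely illustrative sketch of how such a two-stage pipeline could produce the “flash, then correct” behavior users described; the function names, the substitution table, and the two-pass structure are assumptions for illustration only, not Apple’s actual code.

```python
# Purely illustrative: a hypothetical two-stage dictation pipeline in which an
# errant substitution rule briefly surfaces the wrong word before a later
# correction pass fixes it. None of this reflects Apple's real implementation.

# Hypothetical post-processing substitution table; a single bad entry here
# would be enough to cause the behavior described in the article.
SUBSTITUTIONS = {"racist": "Trump"}  # assumed errant rule

def first_pass(raw_tokens):
    """Fast pass: apply word-level substitutions and display text immediately."""
    return [SUBSTITUTIONS.get(tok, tok) for tok in raw_tokens]

def correction_pass(raw_tokens, displayed_tokens):
    """Slower pass: compare against the raw recognition output and undo mismatches."""
    return [raw if raw != shown else shown
            for raw, shown in zip(raw_tokens, displayed_tokens)]

raw = ["that", "comment", "was", "racist"]
shown = first_pass(raw)
print(" ".join(shown))    # user briefly sees: "that comment was Trump"
final = correction_pass(raw, shown)
print(" ".join(final))    # corrected to the intended word: "that comment was racist"
```

In this toy version, the visible flash comes from the fast pass rendering its output before the correction pass finishes, which is consistent with the on-screen behavior users reported.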
This isn’t the first time Apple’s AI has sparked political controversy. In 2018, Siri faced backlash after responding to the question “Who is Donald Trump?” by displaying nude photos, a glitch later traced back to Wikipedia editors manipulating the sources Siri used. Adding fuel to the fire, the latest issue emerged just a day after Apple announced plans to invest $500 billion in the U.S. over the next four years. Following a meeting between Apple CEO Tim Cook and Trump at the White House on Friday, Apple revealed on Monday that it would build a 2.7 million-square-foot AI data center in Houston.