Researchers at Check Point found a vulnerability in Amazon’s Alexa voice assistant that could give attackers access to a user’s entire voice history and the personal data associated with their Alexa account. Attackers could exploit the newly discovered Amazon Alexa bug by tricking Alexa users into clicking a malicious Amazon link. By doing so, an attacker could impersonate the user to access their audio recordings, the list of installed Alexa Skills, and personal information. Alexa Skills are voice-driven apps that enable various functions on Alexa. The vulnerability affected over 200 million devices shipped worldwide. Amazon acknowledged the vulnerability but denies that hackers could access bank information.
Nature and origin of the Alexa bug
The Alexa bug originates from a Cross-Origin Resource Sharing (CORS) misconfiguration, which enables Cross-Site Scripting (XSS) attacks on Alexa domains. Hackers could execute the attacks by sending Ajax requests to Alexa subdomains from vulnerable Amazon domains. Such attacks allow hackers to acquire CSRF tokens that can then be used to impersonate the legitimate user. Check Point also found that SSL pinning in the Alexa app prevented traffic inspection, a protection the researchers bypassed using a Frida SSL-pinning script.
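To see why a CORS misconfiguration matters, consider a hypothetical sketch of the kind of origin-validation flaw that typically underlies such bugs (this is illustrative logic, not Amazon’s actual code): a server that reflects any Origin merely containing “amazon.com” while also allowing credentials lets a script on an attacker-controlled origin read authenticated, cookie-bearing responses.

```python
# Hypothetical sketch of an overly permissive CORS origin check.
# A substring test accepts "track.amazon.com" (intended) but also
# "amazon.com.evil.example" (attacker-controlled).
def cors_headers(origin: str) -> dict:
    """Return CORS response headers for a request from `origin` (flawed version)."""
    if "amazon.com" in origin:  # flawed: substring match, not an allow-list
        return {
            "Access-Control-Allow-Origin": origin,       # reflects the caller's origin
            "Access-Control-Allow-Credentials": "true",  # cookies sent, response readable
        }
    return {}

# A safer check pins origins to an explicit allow-list (example domains only).
ALLOWED = {"https://track.amazon.com", "https://skillsstore.amazon.com"}

def cors_headers_safe(origin: str) -> dict:
    """Return CORS headers only for explicitly allow-listed origins."""
    if origin in ALLOWED:
        return {
            "Access-Control-Allow-Origin": origin,
            "Access-Control-Allow-Credentials": "true",
        }
    return {}
```

The flawed version grants `https://amazon.com.evil.example` full credentialed access; the allow-list version rejects it while still serving the legitimate subdomains.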
An attack scenario involves a hacker creating a malicious Amazon link to redirect users to track.amazon.com – the vulnerable domain used for tracking packages. This subdomain allows the hacker to inject code and send special requests using the targeted user’s cookies to skillsstore.amazon.com/app/secure/your-skills-page. The system would then mistake the attacker for the legitimate user, thus granting them access to the list of installed Alexa skills and account details.
Risk posed by Alexa bug
According to Check Point, the attacker could access personal information such as usernames, phone numbers, home addresses, and banking data histories. The attacker could also retrieve the victim’s entire voice history, as well as the list of Alexa Skills installed on the user’s account. Using these privileges, hackers could silently add a malicious skill or remove installed skills. When the user then interacts with the maliciously installed skill, the attacker would gain access to the account.
Many users are accustomed to using virtual assistants to accomplish various activities around their houses. However, very few understand the security implications of using the technology for sensitive activities. The recently discovered Alexa bug is a reminder that your voice history could open a new attack surface for criminals to invade your private life through smart speakers. With virtual assistants performing advanced functions such as controlling other smart devices in home automation systems, the risk posed by such devices is chilling.
Javvad Malik, Security Awareness Advocate at KnowBe4, blames the attack on the increasing complexity of social engineering tactics.
“This attack, like the majority of cyber attacks, relies on social engineering the victims to click on a phishing link. Therefore, security awareness and training can help to prevent this, and many other attacks which utilise phishing from being successful.”
Response from Amazon
Amazon said it appreciated the efforts of the independent researchers in uncovering potential security risks in its products. However, the company noted that there was no evidence the Alexa bug had been exploited in the wild. It also denied that hackers could access banking data history, saying that this information is always redacted in Alexa responses. Amazon has since patched the Alexa bug and promised to keep strengthening its products’ defenses.
Oded Vanunu, Check Point’s Head of Products Vulnerability Research, said that the multilayered nature of the Alexa bug prevented Amazon from detecting the vulnerability. He lauded Amazon’s quick response in protecting the more than 200 million devices affected by the Alexa bug, and advised users to control the type and amount of data on their Alexa accounts.
How to delete your voice history from Alexa
Vanunu pointed out that Alexa stores users’ voice histories indefinitely, a practice that exposes this data to possible future attacks. Additionally, given Amazon’s previous controversial practice of using human transcribers to process users’ audio commands, deleting one’s voice history is highly recommended.
To delete your voice history, open the Alexa app and navigate to Settings > History. From this interface, you can delete voice entries individually. To delete multiple voice history records, visit the Amazon website, navigate to Alexa Privacy Settings, and select Review Voice History. You can also delete audio data through voice commands: saying “Alexa, delete what I just said” deletes the last voice record, while “Alexa, delete everything I said today” deletes that day’s voice records.