Powered by artificial intelligence, technologies such as Siri, Alexa, Google Assistant, and Cortana have become ubiquitous in our culture.
With people across the world embracing smart home assistants, the question of their security has never been more relevant.

Download Infographic

Download the infographic for free and share it with employees.

Major vulnerabilities and threats

  • Voice squatting 
    Smart assistants can easily mistake one invocation phrase for another that sounds similar. Voice squatting occurs when a threat actor abuses this flaw, using a similar-sounding invocation to hijack the way a legitimate action is invoked (see the sketch after this list).  
  • Voice masquerading
    A threat actor could run a malicious function in the background that impersonates a legitimate one, tricking users into disclosing sensitive information or eavesdropping on their conversations without their knowledge.
  • Third-party users
    A malicious third party could potentially activate a home assistant and give it commands, because the majority of devices cannot reliably distinguish between people’s voices.
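To make the voice-squatting risk concrete, here is a minimal Python sketch that compares two hypothetical invocation phrases using a simplified Soundex-style phonetic fingerprint. The skill names and the matching rule are illustrative assumptions only; real assistants use far more sophisticated speech matching.

```python
def phonetic_code(phrase: str) -> str:
    """Return a rough, simplified Soundex-style fingerprint of a phrase."""
    groups = {"bfpv": "1", "cgjkqsxz": "2", "dt": "3",
              "l": "4", "mn": "5", "r": "6"}
    code = []
    for ch in phrase.lower():
        if not ch.isalpha():
            continue  # ignore spaces and punctuation
        digit = next((d for letters, d in groups.items() if ch in letters), "")
        # Vowels and h/w/y map to "", so they are skipped; repeated digits collapse.
        if digit and (not code or code[-1] != digit):
            code.append(digit)
    return "".join(code)

legit = "Capital One"      # hypothetical legitimate skill name
squatter = "Capital Won"   # hypothetical squatting skill name
print(phonetic_code(legit), phonetic_code(squatter))  # both print "21345"
```

Because both phrases reduce to the same fingerprint, a spoken request meant for the legitimate skill could just as easily be routed to the squatter, which is exactly the ambiguity a voice-squatting attack exploits.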

Best practices when using a smart assistant 

  • Turn off the smart assistant when it is not in use.  
  • Create a strong, randomly generated password for the app or online account that controls your voice assistant (see the sketch after this list), and enable MFA if it is available. 
  • Activate alerts that tell you when your voice assistant is actively listening. 
  • Review your default settings, and periodically check the recording history and delete old recordings you no longer need.
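As a small illustration of the password advice above, the sketch below uses Python's standard `secrets` module to generate a random password. The length and character set are arbitrary assumptions; a reputable password manager that generates passwords for you works just as well.

```python
import secrets
import string

def random_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Example: print one 20-character password for the assistant's account.
print(random_password())
```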