There is a clear advantage to having devices that are always listening for voice commands: you can use them from a distance without fiddling with buttons, which is definitely more efficient. However, there is also a downside, and that is that the device is always listening.
Of course, companies always deny that they are eavesdropping on or recording our conversations, but the fact that our devices are constantly listening out for trigger words can be disconcerting for some. In fact, researchers have recently found that malicious commands for digital assistants can actually be hidden inside songs.
This is according to researchers at Berkeley, who recently published a paper on the concept, dubbed CommanderSong, which shows how voice commands can be hidden inside music. In addition, researchers at Princeton and China’s Zhejiang University have demonstrated what they are calling the “DolphinAttack”.
This is where malicious commands are transmitted at ultrasonic frequencies that can’t be heard by the human ear, which in turn can trigger Siri and get it to do certain things, such as taking photos, making phone calls, and so on. So far there haven’t been any reports of these exploits being used in the real world, but this is something that companies such as Apple, Amazon, and Google should probably look into.
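To give a rough sense of the signal-processing trick behind the DolphinAttack, here is a minimal Python sketch of the general idea: an ordinary voice command is amplitude-modulated onto an ultrasonic carrier, which humans cannot hear but which nonlinearities in a microphone can demodulate back into the audible range. The sample rate, carrier frequency, and stand-in "voice" signal here are illustrative assumptions, not the researchers' exact setup.

```python
# Toy sketch of the ultrasonic-modulation idea behind "DolphinAttack":
# a baseband voice command is amplitude-modulated onto a carrier above
# the range of human hearing. All parameter values are assumptions
# chosen for illustration only.
import numpy as np

SAMPLE_RATE = 96_000   # Hz; high enough to represent ultrasound
CARRIER_HZ = 25_000    # ultrasonic carrier, above human hearing (~20 kHz)

def modulate_ultrasonic(voice: np.ndarray) -> np.ndarray:
    """Amplitude-modulate a baseband voice signal onto an ultrasonic carrier."""
    t = np.arange(len(voice)) / SAMPLE_RATE
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # Standard AM: shifts the voice spectrum up to sit around the carrier,
    # so none of the transmitted energy remains in the audible band.
    return (1.0 + voice) * carrier

# Stand-in for a recorded command (e.g. "Hey Siri, take a photo"):
# a plain 300 Hz tone is used here so the script stays self-contained.
duration_s = 1.0
t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
fake_voice = 0.5 * np.sin(2 * np.pi * 300 * t)

inaudible = modulate_ultrasonic(fake_voice)
print(f"{len(inaudible)} samples, energy centered near {CARRIER_HZ} Hz")
```

The attack works because a microphone's amplifier is not perfectly linear, and that nonlinearity acts like an unintentional AM demodulator, recovering the original command even though a human standing nearby hears nothing.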