For years, smartphone users have suspected that their devices were eavesdropping on their conversations to serve targeted ads. Now, recent revelations suggest these concerns could be justified.
A leaked pitch deck from CMG Local Solutions, a subsidiary of Cox Media Group (CMG), details a method it calls “active listening.” This method uses AI to combine voice data with online behavioral data to deliver hyper-targeted advertising.
The deck, obtained by 404 Media, states, “Advertisers can pair this voice-data with behavioral data to target in-market consumers.” It goes on to say that the technology can identify “ready-to-buy” consumers and create ad lists based on their spoken intentions.
A spokesperson for CMG told Newsweek that “CMG businesses have never listened to any conversations nor had access to anything beyond third-party aggregated, anonymized, and fully encrypted data sets that can be used for ad placement.”
In other words, CMG acquired existing voice datasets from third-party providers to combine with other sources. The company referred to the slide deck as “outdated materials for a product that CMG Local Solutions no longer sells,” adding that “although the product never listened to customers, it has long been discontinued to avoid misperception.”
While tech firms like Google, Meta (Facebook’s parent company), and Amazon were listed as CMG clients in the deck, all three have denied involvement with the active listening program.
A spokesperson for Amazon told Newsweek: “Amazon Ads has never worked with CMG on this program and has no plans to do so.”
A Google spokesperson told Newsweek: “All advertisers must comply with all applicable laws and regulations as well as our Google Ads policies, and when we identify ads or advertisers that violate these policies, we will take appropriate action.”
According to Google, Cox Media Group has been removed from the Google Partners Program as part of that review process.
A Meta spokesperson told Newsweek: “Meta does not use your phone’s microphone for ads and we’ve been public about this for years. We are reaching out to CMG to get them to clarify that their program is not based on Meta data.”
Meta is investigating whether CMG violated its terms and conditions and says it will take appropriate action as necessary.
The third-party data CMG refers to often comes from smartphone apps that capture data (voice or otherwise) once the end user agrees to the terms and conditions. However, research has shown that 91 percent of people agree to T&Cs without reading them, a figure that jumps to 97 percent among those aged 18 to 24.
For end users who have already agreed to T&Cs, the first step is to review the permissions those apps have been granted.
“For an app to perform active listening, it would need microphone access permissions. On both Android and iOS devices, this permission is explicitly requested when the app is installed or updated,” said Luis Corrons, researcher and Norton Security Evangelist.
“Apps may also request background access, which allows them to continue listening even when not actively in use. You can detect microphone usage in several ways: iOS now shows an orange or green dot in the status bar when the microphone or camera is being used. Android also has visual indicators that alert users when the microphone is actively being accessed,” added Corrons.
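As a practical illustration, the Kotlin sketch below shows what that explicit request looks like on Android, where since Android 6.0 the prompt appears at runtime, the first time the app tries to use the microphone, rather than at install. The class and method names here are illustrative, not taken from any real app:

```kotlin
// A minimal sketch, using standard AndroidX APIs, of the explicit microphone
// permission request Corrons describes. The manifest must also declare:
//   <uses-permission android:name="android.permission.RECORD_AUDIO" />
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class MicPermissionActivity : AppCompatActivity() {

    // Launcher that shows the system's microphone permission dialog and
    // reports whether the user tapped "Allow".
    private val requestMic = registerForActivityResult(
        ActivityResultContracts.RequestPermission()
    ) { granted ->
        if (granted) startListening() else explainWhyMicIsNeeded()
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // An app cannot open the microphone until this check passes.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            == PackageManager.PERMISSION_GRANTED
        ) {
            startListening()
        } else {
            requestMic.launch(Manifest.permission.RECORD_AUDIO)
        }
    }

    private fun startListening() { /* open AudioRecord here */ }
    private fun explainWhyMicIsNeeded() { /* feature degrades gracefully */ }
}
```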
Reviewing app permissions regularly can help identify whether an app has unnecessary access, said the security researcher. But what are the technical differences between how virtual assistants like Siri or Alexa listen versus how other apps might conduct “active listening” for advertising purposes?
“Assistants like Siri, Alexa and Google do listen because they need to listen for trigger words, such as ‘Hey Siri.’ Once they hear the activation, they show that the microphone is activated, letting the user know they are being ‘listened to.’ If an application would like to listen, they also need the microphone permission and use would, again, trigger the icon,” Corrons said.
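In code terms, the gating Corrons describes amounts to a loop like the conceptual sketch below. This is not any vendor's actual implementation; detectWakeWord() stands in for a real on-device keyword-spotting model, and all three helper functions are hypothetical:

```kotlin
// Conceptual sketch of wake-word gating. Nothing is captured for processing
// until the trigger phrase is detected; only then does the assistant show
// the microphone indicator and start recording the actual command.
fun detectWakeWord(frame: ShortArray): Boolean = false   // hypothetical model stub
fun captureCommand(frame: ShortArray) { /* buffer the spoken request */ }
fun showMicIndicator() { /* the moment the user sees they are being "listened to" */ }

fun assistantLoop(frames: Sequence<ShortArray>) {
    var awake = false
    for (frame in frames) {                // short audio frames, e.g. ~20 ms each
        if (!awake) {
            // Pre-activation: each frame is matched locally, then discarded.
            if (detectWakeWord(frame)) {
                awake = true
                showMicIndicator()         // "Hey Siri" heard: indicator turns on
            }
        } else {
            captureCommand(frame)          // post-activation: the command itself
        }
    }
}
```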
There are best practices when it comes to eliminating unwanted background listening, added Corrons: “Always check the permissions for your voice assistants and use the most limited permissions. For example, only allow Siri or Alexa to activate when using the wake word and disable listening when the device is locked.”
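For readers comfortable going beyond the Settings screen, the fragment below uses real Android PackageManager APIs to list every installed app currently granted microphone access, the kind of periodic audit Corrons recommends. One caveat: on Android 11 and later, package-visibility rules limit which apps are visible to the caller, so results may be incomplete:

```kotlin
// List installed apps that requested RECORD_AUDIO and currently hold the grant.
import android.Manifest
import android.content.Context
import android.content.pm.PackageInfo
import android.content.pm.PackageManager

fun appsWithMicAccess(context: Context): List<String> =
    context.packageManager
        .getInstalledPackages(PackageManager.GET_PERMISSIONS)
        .filter { pkg ->
            val requested = pkg.requestedPermissions ?: return@filter false
            val idx = requested.indexOf(Manifest.permission.RECORD_AUDIO)
            // requestedPermissionsFlags says whether the grant is currently active.
            idx >= 0 && (pkg.requestedPermissionsFlags[idx] and
                    PackageInfo.REQUESTED_PERMISSION_GRANTED) != 0
        }
        .map { it.packageName }
```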