Multiple healthcare companies have announced HIPAA-compliant Alexa skills, promising adequate protection for consumers who use Echo devices for health and wellness.
These new voice skills let users manage prescriptions, find clinics, schedule or cancel appointments, and get blood glucose readings and health tips.
HIPAA, or the Health Insurance Portability and Accountability Act of 1996, protects personal information that is stored or transferred electronically by health plans, healthcare clearinghouses and healthcare providers. It covers an individual’s physical or mental health or condition, medical services provided, payment for services rendered and anything that identifies the individual or could be used to identify the individual.
These healthcare providers say Alexa skills make them more accessible and make it easier for consumers to manage their health, which, in turn, may head off the costly problems that arise when patients don’t follow doctors’ orders. The skills also fit Amazon’s broader ambitions in healthcare, from its acquisition of online pharmacy PillPack last year to reports that it is building a health and wellness team within Alexa and has hired doctors to support healthcare projects.
And yet these new skills are three of a reported 56,750 in the U.S. overall, or about 0.005%. So what about the other 56,747 skills and the data they access? Should they be subject to higher security standards, too?
Amazon did not respond to a request for comment.
A Google representative said the company’s policies don’t allow “actions” that involve the transmission of information that could be considered protected health information.
Both Amazon and Google have dedicated pages that detail the data their devices and assistants collect and how users can manage it. They say the devices are not always recording but rather waiting to be summoned via wake words, and they light up when recording and/or sending data to the cloud. Alexa users can also configure “certain Echo devices” to play a specific tone whenever audio is sent to the cloud and review their voice recordings and delete them in Alexa privacy settings in the app or online.
Meanwhile, Google says it collects data that is “meant to make our services faster, smarter, more relevant and more useful,” including search history for users who opt in. It saves conversations on Google servers until users delete them, but notes third parties may share information with Google Home based on their own privacy policies.
And mistakes happen. Like, say, when Amazon accidentally sent 1,700 recordings to the wrong user. Or when Amazon sent a recording of a Portland family to a contact in one of their phones without authorization to do so. Or when Google Home Mini recorded everything in a user’s home and sent those recordings to Google without the user’s knowledge.
Even HIPAA might not be enough in 2019
When President Clinton signed HIPAA into law on August 21, 1996, state-of-the-art technology included digital cameras, home computers and DVD players. Flatscreen TVs were still a year away.
Of course, HIPAA has been updated since then. For example, the Department of Health and Human Services issued guidance on cloud computing in 2016. And in December 2018, it asked for public input on how HIPAA could be modified.
But these new skills indicate that another broader update may be in order, as gaps remain in healthcare and beyond.
“The fundamental issue is that Alexa doesn’t identify who is activating the skill so that they can protect their privacy,” said Rosco Schock, CTO of mobile checkout company Powch. “Like a lot of things, you can have security or convenience but not both.”
Dan Linton, data privacy practice lead at marketing company W2O Group, agreed that little thought seems to have gone into controlling smart speaker access.
“It’s well known that similar voices can access Alexa, and while I don’t mind if a roommate turns off the lights or plays music, I’d rather them not have access to my health data or doctors’ appointments,” he said.
Linton also noted that Amazon puts the responsibility for compliance and security on the independent developers behind its skills.
And in the healthcare industry specifically, Fouad Khalil, vice president of compliance at cybersecurity company SecurityScorecard, said there was a record number of reported breaches in 2018, nearly half of which were a result of uncontrolled third-party access.
“Business associates [like developers] are contractually held accountable for HIPAA compliance, yet they are the major cause of data breaches,” he added.
But beyond questions surrounding developer compliance are concerns about giving hackers a new way to steal data, as well as what Linton called “voice squatting,” which “could cause users to inadvertently enable malicious skills to which unwitting users could reveal sensitive health data to bad actors.”
And, according to Khalil, questions remain about how Amazon and its developers will ensure privacy and security of protected health information in this new format.
“Healthcare using voice technology is new and no standards or laws exist to protect patients beyond the standard HIPAA requirements,” Khalil added. “The Office for Civil Rights along with Health and Human Services must act quickly to develop strict guidelines around the use of this new technology before it proliferates everywhere.”
Sloan Gaon, CEO of programmatic health company PulsePoint, said federal regulation is needed to address how consumer data is and will be treated moving forward.
A 2018 story in the Berkeley Technology Law Journal made a similar argument, saying policymakers can’t assume market forces will curb abuses of consumer privacy with digital assistants. It also noted that the potential for abuse increases exponentially when the digital assistant is connected to other devices, such as televisions, computers, appliances, security cameras, phones and cars.
As a result, the story said that regulators and legislators “must take steps to minimize the risks and protect consumers’ interests and freedom” with a balanced policy that promotes competition and innovation along with social welfare.