What a 'backdoor' to encrypted devices would mean for health data security

Topics: Health Information Technology, Data Privacy & Security, Politics and Policy, Federal Government, Regulatory, Mobile Health

The FBI has found a way to access a heavily encrypted iPhone -- a short-term win for the bureau.

But in the longer term, experts say, the bureau's access method and proposed policies mandating backdoors in encrypted devices could create new vulnerabilities that hackers could exploit to access the health data stored on millions of users' devices.

BACKGROUND >>>
For months, the FBI tried every technique in its arsenal to access a single iPhone 5c: the phone used by Syed Rizwan Farook, who, along with his wife, killed 14 people at a San Bernardino County Department of Public Health event in San Bernardino, California.

After the attack, the FBI contacted Apple, seeking help bypassing encryption on Rizwan Farook's phone. Apple refused.

In explaining the refusal, Apple CEO Tim Cook said the FBI had asked for "something we consider too dangerous to create." Cook said that a unique operating system built to bypass the phone's security features could, "in the wrong hands," threaten the privacy and security of Apple's customers.

"This software -- which does not exist today -- would have the potential to unlock any iPhone in someone's physical possession," Cook said, adding that it would be similar to "a master key, capable of opening hundreds of millions of locks."

The controversy appeared to be on track for resolution in court. But last week the FBI said it had gained access to the iPhone without Apple's help, prompting the government to drop its case against Apple.

The FBI so far has declined to publicly disclose how it unlocked the iPhone, and it appears poised to test the unknown method on other iPhone models. This "ability to now unlock an iPhone through an alternative method raises new uncertainties, including questions about the strength of security in Apple devices," Katie Benner and Eric Lichtblau wrote in the New York Times.

Threats to mobile health data

These new uncertainties surrounding the privacy and security of iPhones have shined a light on the vulnerability of the devices, which have become troves of health data in recent years.

Consider, for example, Apple's flagship Health app. With it, people can count steps, track nutrition, measure sleep, monitor reproductive health and more. There's also ResearchKit, a framework for developing medical research apps, which partner universities and health systems are using to recruit participants and conduct studies on diabetes and epilepsy, as well as to collect genetic data. This spring, developers also will have access to CareKit, a more consumer-focused open source software platform that Apple hopes will make it easier for patients to monitor their health and coordinate with providers.
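To give a sense of how much of this data is within an app's reach, here is a minimal Swift sketch that reads a user's recent step counts through Apple's HealthKit framework, the framework behind the Health app. It is illustrative only: it assumes an iOS app that has been granted the HealthKit entitlement, and the sample limit and printed output are choices made for the example, not anything described in the article.

```swift
import HealthKit

// A minimal sketch: reading recent step-count samples via HealthKit.
// Assumes an iOS app with the HealthKit entitlement enabled.
let healthStore = HKHealthStore()

// Force-unwrap is acceptable here only because .stepCount is a known,
// valid identifier; production code should handle the optional.
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!

// The user must explicitly grant read access to each data type.
healthStore.requestAuthorization(toShare: nil, read: [stepType]) { granted, _ in
    guard granted else { return }

    // Fetch the 10 most recent step-count samples, newest first.
    let sort = NSSortDescriptor(key: HKSampleSortIdentifierEndDate, ascending: false)
    let query = HKSampleQuery(sampleType: stepType,
                              predicate: nil,
                              limit: 10,
                              sortDescriptors: [sort]) { _, samples, _ in
        for case let sample as HKQuantitySample in samples ?? [] {
            let steps = sample.quantity.doubleValue(for: .count())
            print("\(sample.endDate): \(Int(steps)) steps")
        }
    }
    healthStore.execute(query)
}
```

The point is simply that once access is granted -- to a legitimate app, or to anyone able to impersonate one on a compromised device -- this kind of granular health data is a few lines of code away.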

Then there are the third-party apps, like Fitbit and MyFitnessPal, which also record information about users' health.

"There are lots of things that we store on our phone that are personal and that we expect to be private and secure," David Harlow, a health care attorney and author of the HealthBlawg, said in an interview with American Health Line.

Data stored on smartphones may already be exposed through software errors and other security flaws. A study published March 8 found that the majority of the 211 diabetes apps examined leaked data, including more than just condition-specific information. And "this is hardly limited to diabetes apps," Eric Boodman wrote for STAT News.

But a technique that could unlock any iPhone, such as the one apparently used by the FBI in the San Bernardino case, would represent a new vulnerability that hackers could potentially exploit.

And some lawmakers are proposing that all technology vendors -- not just Apple -- be required to deliberately build such "backdoors" into all encrypted devices.

The backdoor debate

The concept of requiring backdoors in encrypted devices has been championed by some lawmakers for nearly a decade and has gained renewed interest in light of the FBI vs. Apple case.

In late February, members of the House Homeland Security Committee proposed legislation (HR 4651) to establish a commission tasked with recommending a new encryption policy for the government. Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.), the top-ranking Republican and Democrat on the Senate Intelligence Committee, also have signaled an interest in requiring backdoors.

Supporters of backdoors, including FBI Director James Comey, say that law enforcement needs a way to bypass encryption in case of a national security threat.

However, technology and data experts have raised concerns. "Backdoors make it possible to virtually track your every movement and to know your every thought," Patrick G. Eddington, a homeland security and civil liberties policy analyst at the Cato Institute, told American Health Line. By creating a backdoor, "You're making things that much easier for any malicious actor" who wishes to hack into a device, Eddington said.
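Eddington's point can be made concrete. One commonly discussed form of backdoor is key escrow, in which data is encrypted so that a second, government-held key can also decrypt it. The toy Swift sketch below, using Apple's CryptoKit framework, is not any real or proposed design -- the escrow key and the sealWithEscrow helper are invented for illustration -- but it makes the structural weakness visible: whoever obtains the escrow key can read everything.

```swift
import Foundation
import CryptoKit

// Toy illustration of key escrow (NOT a real or proposed design):
// every message is sealed under the user's key AND under a single
// "escrow" key that law enforcement would hold.
let userKey = SymmetricKey(size: .bits256)
let escrowKey = SymmetricKey(size: .bits256)   // the "backdoor" key

func sealWithEscrow(_ message: Data) throws -> (forUser: AES.GCM.SealedBox,
                                                forEscrow: AES.GCM.SealedBox) {
    // Two ciphertexts of the same plaintext: one per key.
    let forUser = try AES.GCM.seal(message, using: userKey)
    let forEscrow = try AES.GCM.seal(message, using: escrowKey)
    return (forUser, forEscrow)
}

do {
    let secret = Data("blood glucose: 140 mg/dL".utf8)
    let sealed = try sealWithEscrow(secret)

    // The problem in one line: anyone holding escrowKey -- an agency,
    // or a hacker who stole it -- can decrypt every message.
    let leaked = try AES.GCM.open(sealed.forEscrow, using: escrowKey)
    print(String(decoding: leaked, as: UTF8.self))
} catch {
    print("crypto error: \(error)")
}
```

In this scheme the escrow key is a single point of failure for every device that uses it, which is why critics describe a mandated backdoor as a master key waiting to be stolen rather than a narrowly scoped law enforcement tool.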

In the health care space, a backdoor would create another potential route for hackers to access exploitable health information.

"It's possible ... to use this information to impersonate someone who has really good health insurance in order to get some big-ticket medical procedure," Harlow said. "And that's disastrous for the person who's been hacked because we want those records to be accurate."

Once a smartphone is hacked, "You can't put the genie back in the bottle; the information is out," Harlow said. "That's why the first line of defense," in this case preventing the use of a backdoor, "is so important here."

Implications for health privacy laws

Another unanswered question relates to liability. If a technology company builds a backdoor into its devices to help law enforcement, and hackers later figure out how to use that backdoor illicitly, who bears liability for the resulting data loss?

If -- and that's a big if, according to Eddington -- Congress were to mandate backdoors into all encrypted devices, "There's no question that every manufacturer of software would be concerned about" liability under HIPAA.

Harlow said the privacy law probably would not kick in every time an encrypted device is lost or stolen because "the FBI could argue that the backdoor is" -- at least theoretically -- "only available to law enforcement, and therefore loss of an encrypted device would not be a breach."

But if hackers used a backdoor for unauthorized access to a health care app -- and the app developer noticed -- then HIPAA could come into play.

And Harlow said, "If you're talking about an app developer or promoter that is a covered entity or business associate under HIPAA, then there are very prescribed protocols that you go through."

That often means:

  • Breach notification requirements;
  • Exposure to fines; and
  • The obligation to fix the gap in security.

However, both Harlow and Eddington agreed that a legislative mandate for backdoors into encrypted devices is unlikely any time soon.

Neither Burr nor Feinstein has introduced specific legislation yet. And the Homeland Security Committee's bill appears to be viewed "more as a kind of a cooling mechanism" to temporarily abate stakeholders' concerns, according to Eddington.

But even if lawmakers never require manufacturers to create deliberate backdoors into their devices, hackers will likely keep finding new ways to access iPhones and their apps -- so health care data will never be fully safe.

-- by Joe Infantino, senior staff writer