Artificial Intelligence Advances Threaten Privacy of Health Data

Study finds current laws and regulations do not safeguard individuals' confidential health information

January 4, 2019 — Advances in artificial intelligence (AI) have created new threats to the privacy of people's health data, a new University of California, Berkeley study shows.

Led by University of California, Berkeley engineer Anil Aswani, the study suggests current laws and regulations are nowhere near sufficient to keep an individual's health status private in the face of AI development. The research was published Dec. 21, 2018, in JAMA Network Open.1

The findings show that by using artificial intelligence, it is possible to identify individuals by learning daily patterns in step data, such as that collected by activity trackers, smartwatches and smartphones, and correlating it to demographic data.

The mining of two years' worth of data covering more than 15,000 Americans led to the conclusion that the privacy standards associated with 1996's HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked.
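The study itself applied machine learning to NHANES accelerometer data, but the underlying idea is a linkage attack: matching "de-identified" activity patterns against a second dataset that still carries identities. The sketch below is a rough illustration only, not the authors' code; it uses simulated step data and a simple nearest-neighbor match (assuming NumPy and scikit-learn) to show how often such linkage can succeed.

```python
# Hypothetical illustration of a linkage-style re-identification attack.
# Simulated data only; this is not the study's method or dataset.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n_people, n_days = 1000, 14

# Each simulated person has a characteristic daily step habit plus day-to-day noise.
habits = rng.normal(8000, 2500, size=(n_people, 1))
daily_variation = rng.normal(1.0, 0.2, size=(n_people, n_days))
true_steps = np.clip(habits * daily_variation, 0, None)

# Dataset A: "de-identified" step records with all identifiers stripped.
anonymized = true_steps + rng.normal(0, 300, size=true_steps.shape)

# Dataset B: the same individuals' step data held elsewhere (e.g., an app), with names attached.
identified = true_steps + rng.normal(0, 300, size=true_steps.shape)
names = [f"person_{i}" for i in range(n_people)]

# Linkage: for each anonymized trace, find the closest identified trace.
nn = NearestNeighbors(n_neighbors=1).fit(identified)
_, idx = nn.kneighbors(anonymized)
reidentified = np.mean(idx.ravel() == np.arange(n_people))
print(f"Correctly re-identified {reidentified:.0%} of 'anonymous' individuals")
```

Because daily activity patterns are highly individual, even this naive matching links most records correctly in the simulation, which is the general vulnerability the study points to.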

"We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the U.S.," said Aswani. "The results point out a major problem. If you strip all the identifying information, it doesn't protect you as much as you'd think. Someone else can come back and put it all back together if they have the right kind of information."

"In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying healthcare data from another company and matching the two," he added. "Now they would have healthcare data that's matched to names, and they could either start selling advertising based on that or they could sell the data to others."

According to Aswani, the problem isn't with the devices, but with how the information the devices capture can be misused and potentially sold on the open market.

"I'm not saying we should abandon these devices," he said. "But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it's a net positive."

Though the study specifically looked at step data, the results suggest a broader threat to the privacy of health data.

"HIPAA regulations make your healthcare private, but they don't cover as much as you think," Aswani said. "Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It's supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it."

Aswani said that as advances in AI make it easier for companies to gain access to health data, the temptation to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance.

"Ideally, what I'd like to see from this are new regulations or rules that protect health data," he said. "But there is actually a big push to even weaken the regulations right now. For instance, the rule-making group for HIPAA has requested comments on increasing data sharing. The risk is that if people are not aware of what's happening, the rules we have will be weakened. And the fact is the risks of us losing control of our privacy when it comes to health care are actually increasing and not decreasing."

For more information: www.jamanetwork.com/journals/jamanetworkopen

Reference

1. Na L., Yang C., Lo C., et al. Feasibility of Reidentifying Individuals in Large National Physical Activity Data Sets From Which Protected Health Information Has Been Removed With Use of Machine Learning. JAMA Network Open, Dec. 21, 2018. doi:10.1001/jamanetworkopen.2018.6040

