I saw a thought-provoking article in Gizmodo the other day about a clinical research app GlaxoSmithKline (GSK) has developed using Apple’s open-source ResearchKit framework. The article featured this rather alarming headline: “Apple’s Health Experiment is Riddled with Privacy Problems.”
GSK’s Patient Rheumatoid Arthritis Data from the Real World (PARADE) study will use the app to track the mobility of several hundred participants suffering from rheumatoid arthritis, along with certain other data about the subjects. It’s not testing a drug; it’s collecting data to help GSK design better clinical trials in the future.
Before participating in the study, subjects must review a substantial volume of disclosure materials provided through the app. Data collected from participating subjects is anonymized by a private, for-profit service.
Apple doesn’t receive the anonymized data; GSK does. GSK’s use of the data was reviewed by an Institutional Review Board (IRB), but GSK uses another private, for-profit service to perform that function.
Despite the headline, the article identifies only two privacy risks: the risk of re-identification and the risk of inadequate disclosure about how the collected data will be used. Re-identification risk isn’t really explored; the article simply notes that there’s always a possibility that anonymized data will be re-identified. The article does critique the app’s presentation of materials when seeking informed consent, pointing out that after a few screens of explanation, the app directs potential subjects to a nine-page PDF in 12-point type. There are legitimate questions about whether this method of seeking consent should ever be used in a mobile app environment.
The article doesn’t live up to its headline, though, because it identifies only two risks, neither of which is particular to a ResearchKit app. More importantly, it fails to consider whether the re-identification risk associated with a ResearchKit app differs from that of any 21st-century study, or whether GSK’s app leaves potential subjects better or less well informed than participants in studies that don’t use such an app. Still, the concerns discussed in the article raise two points for developers building these apps:
(1) It’s probably worth time, money, and a careful IRB review to make sure you can demonstrate that you have minimized re-identification risk and disclosed the risk that’s left. There are statistical techniques for assessing re-identification risk, and using them might give study participants more confidence about participating in studies that are conducted using a ResearchKit or similar app.
(2) Using a PDF for any sort of disclosure in a mobile app is questionable, but the article does discuss other ResearchKit apps where disclosure is done right: multiple short disclosures in readable type, using plain English and explanatory graphics, presented on a succession of screens and followed by a quiz. Developers should look to these examples in formulating their own disclosures and consents.
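On point (1), one commonly used statistical measure of re-identification risk is k-anonymity: the size of the smallest group of records that share the same combination of quasi-identifiers (attributes like age band, partial ZIP code, and sex that could be matched against outside data). The sketch below is purely illustrative; the field and record names are hypothetical, not drawn from the PARADE study.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level of a dataset: the size of the
    smallest group of records sharing identical quasi-identifier
    values. A low k means some participants are easier to re-identify."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(groups.values())

# Hypothetical "anonymized" study records, for illustration only.
records = [
    {"age_band": "40-49", "zip3": "021", "sex": "F", "mobility_score": 62},
    {"age_band": "40-49", "zip3": "021", "sex": "F", "mobility_score": 58},
    {"age_band": "50-59", "zip3": "021", "sex": "M", "mobility_score": 44},
]

k = k_anonymity(records, ["age_band", "zip3", "sex"])
print(k)  # here k == 1: one participant is uniquely identifiable
```

A result of k = 1 means at least one participant’s quasi-identifier combination is unique, so that record could potentially be matched to a named individual. Reporting a measure like this, alongside the steps taken to raise k (coarser age bands, suppressed fields), is one concrete way to demonstrate minimized risk to an IRB and to participants.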
Paying attention to these “Privacy by Design” considerations on the front end should inspire greater confidence in potential study participants and reassure nervous investors who would otherwise shy away from apps that might entail considerable privacy risk.