Sensors that measure human health are now commonplace. From fitness to home health care to the humble mobile phone, there is a wealth of data that could not only improve our health, but save lives.
Some of this data is being collected and presented to us as “daily goals”, relegating these advanced devices to glorified pedometers. True health interventions rely on much more sophisticated but potentially more invasive analysis. For mHealth to deliver on its promise, the privacy of this analysis must be addressed.
Tozny is seeking partners with capabilities in innovative sensor analysis for mHealth. As a team, we will tackle core research questions, supported by critical engineering objectives, in pursuit of a research funding opportunity:
- Based on the data from commodity sensors, what innovative health analyses can be performed?
- Can this analysis drive timely, recognized interventions that improve health outcomes and can be facilitated with technology?
- How can this be accomplished in a privacy-preserving way, so that users give informed consent for both the analysis and the intervention, and in a secure way, so that sensitive data stored about users is appropriately protected?
The process is performed in three phases: Sensing, Analysis, and Intervention. Each phase is enabled by privacy-enhancing technology: A Personal Data Service (PDS), developed under funding from the NSTIC program at NIST, with an emphasis on privacy, security, and strong cryptography.
Sensing: Commodity devices
The mobile phone is a vast sensor array, augmented by external sensors like wearables, Bluetooth blood pressure monitors, video cameras, bed sensors, and many others. The data collected from these devices is usually delivered to the cloud via the user’s phone, an “IoT hub”, or a direct Wi-Fi connection.
- Example: A fitness band can sense an elderly patient’s motion and activity via the accelerometer. The patient consents to the collection, analysis, and intervention by a family member.
- Challenges: Sensor data from commodity devices is not highly accurate, which could lead to bogus analyses and undesirable interventions. For instance, fitness-band heart rate monitors worn on the wrist are not as accurate as those worn on the torso. Sensor accuracy should be increased where possible, remaining inaccuracies should be systematically captured with some concept of confidence, and the analysis engine should take that confidence into account when it triggers interventions.
- Privacy considerations: Sensor data can include extremely private information about users. Data must be made available to the systems in the analysis phase, but should be cryptographically access controlled so that only the relevant analysis systems have the data. Personal information like the user’s name should be stripped from the data in a manner that makes re-identification difficult. For instance, GPS data and video camera data should not be combined in the same analysis system (unless relevant to that specific analysis), to avoid identifying users.
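To make the sensing phase concrete, here is a minimal sketch of how a de-identified, confidence-tagged sensor reading and per-stream routing could look. The field names, the routing table, and the analysis system names are all illustrative assumptions, not part of Tozny’s actual PDS API; note that GPS and camera streams are deliberately routed to different systems so no single engine can combine them.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """A single de-identified reading from a commodity sensor.

    No name or other direct identifier is attached; the pseudonymous
    stream_id is the only link back to the user, resolvable only by
    the PDS.
    """
    stream_id: str      # pseudonym, held by the PDS
    sensor: str         # e.g. "accelerometer", "heart_rate"
    value: float
    timestamp: float
    confidence: float   # 0.0-1.0, reflects known device inaccuracy

# Hypothetical policy: which analysis systems may receive which sensor
# streams. GPS and camera data go to different systems so no single
# analysis engine can re-identify the user by combining them.
ROUTING = {
    "accelerometer": {"fall-detection", "activity-tracking"},
    "heart_rate": {"activity-tracking"},
    "gps": {"route-planning"},
    "camera": {"gait-analysis"},
}

def authorized_systems(reading: SensorReading) -> set[str]:
    """Return the analysis systems allowed to receive this reading."""
    return ROUTING.get(reading.sensor, set())
```

In a real deployment the routing decision would be enforced cryptographically (encrypting each stream to the keys of its authorized systems), not by a plaintext lookup table.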
Analysis: The core research problem
The sensor data is analyzed to compute simple metrics, like the number of steps the user has taken or their sleep patterns, or advanced metrics, like the user’s real identity. Advanced metric analysis is the core of the research problem. It can be computed on the device itself or in the cloud, and can take in data from multiple sensors. Critically, the analysis can indicate an urgent health situation or progress on a plan of care.
- Example: An analysis engine could determine that an elderly patient has had a fall and is not getting back up based on the accelerometer data from a fitness band.
- Challenges: While a great deal of data is available, it is difficult to interpret. Analyses need to be tied to useful interventions, and confidence in the analysis has to be high enough (or the risk of the intervention low enough) that the intervention can be triggered. Data from multiple sensors needs to be correlated and aligned in time. Often, baseline data needs to be collected over a long period of time, giving rise to the privacy considerations outlined below. Deviation from baseline data might or might not imply that something has gone wrong.
- Privacy considerations: Analysis may include sensitive health information, but the analysis engine should not have access to any information it does not need, including the user’s identity or the intervention methods. The output from analysis should not include sensor data and only be available to the intervention system.
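The fall-detection example above can be sketched as a simple heuristic: an impact spike in accelerometer magnitude followed by sustained inactivity. The thresholds below are placeholders, not clinically validated values, and the confidence gate reflects the challenge noted earlier that low-confidence sensor streams should not trigger interventions on their own.

```python
# Illustrative fall-detection heuristic over accelerometer magnitudes
# (in g). All thresholds are assumptions for demonstration only.
IMPACT_G = 3.0        # sudden spike suggesting an impact
STILLNESS_G = 0.15    # allowed deviation from 1 g while lying still
STILL_SAMPLES = 30    # consecutive low-motion samples after the spike

def detect_fall(magnitudes: list[float], confidence: float,
                min_confidence: float = 0.8) -> bool:
    """Flag a possible fall: an impact spike followed by sustained
    inactivity. Streams below min_confidence never trigger."""
    if confidence < min_confidence:
        return False
    for i, g in enumerate(magnitudes):
        if g >= IMPACT_G:
            after = magnitudes[i + 1:i + 1 + STILL_SAMPLES]
            if len(after) == STILL_SAMPLES and all(
                    abs(x - 1.0) <= STILLNESS_G for x in after):
                return True
    return False
```

A production analysis engine would use validated models rather than fixed thresholds, but the shape is the same: sensor stream in, confidence-gated event out, with no identity data involved.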
Intervention: The payoff
The outcome of the analysis can indicate that an urgent intervention is necessary or that a more minor course correction is warranted. The system could alert the user directly, alert their healthcare provider, or alert a trusted family member. Such immediate interventions could vastly improve health outcomes over less timely approaches.
- Example: The elderly patient’s fitness band could alert a trusted party within their family that they may have fallen. The patient may be given the opportunity (by the PDS) to say that they have not fallen, and so the intervention should not happen.
- Challenges: Accepted interventions that can be triggered with technology and tied to automated analysis will be difficult to identify. While some conditions may be easy to detect with sensors, there might not be an intervention. The “feedback loop” of the intervention may interfere with future analysis.
- Privacy considerations: Intervention data must include enough personal information for the intervention to be effective. It may be re-combined with some aspects of the analysis (but not the raw sensor data). For instance, the user’s identity and GPS location should be shared to support the intervention, even though that was never transmitted to the analysis engine.
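The intervention flow described above, where the patient gets a chance to override the alert before identifying data is recombined, can be sketched as follows. The three callables are hypothetical hooks into the PDS, and the confirmation window length is an assumption.

```python
import time

CONFIRM_WINDOW_S = 30  # seconds the patient has to cancel the alert

def intervene(patient_ack, alert_contact, resolve_identity,
              window_s=CONFIRM_WINDOW_S):
    """Run an intervention with a patient cancellation window.

    patient_ack()       -> True if the patient says "I'm fine" in time
    alert_contact(info) -> notifies the trusted family member
    resolve_identity()  -> re-attaches identity and location via the
                           PDS; this data was never sent to the
                           analysis engine
    All three callables are hypothetical PDS hooks.
    """
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        if patient_ack():
            return "cancelled"   # patient overrode the alert
        time.sleep(1)
    # Only now is identifying data recombined, and only for this
    # specific, consented purpose.
    alert_contact(resolve_identity())
    return "alerted"
```

The key design point is that `resolve_identity()` runs only after the window expires, so identity and location are released solely when the intervention actually fires.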
Platform: Collection, Sharing, and Privacy
Tozny’s platform facilitates the interaction between the different components and addresses the privacy and security concerns inherent in this type of data collection and analysis. Without such a platform, this data collection will raise privacy concerns among regulators and will be a barrier to entry for the individuals whose data is being collected.
Tozny will allow data collection from all project participants’ sensors and analysis systems. Through fine-grained access control, participants can request access to data that is relevant to their analysis, and the end user can choose to release or not release data for that purpose.
Tozny’s PDS supports a triple-blind model:
- The analysis engine cannot see identity data or irrelevant sensor data
- The intervention agent cannot see the results of analysis that they are not involved in
- Other sensors, analysis engines, and intervention agents can only see the data that is relevant to them
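The triple-blind model above amounts to per-party field grants: each party may read exactly the fields named in its grant, and nothing else. Here is a minimal sketch; the party names, field names, and plaintext filtering are illustrative stand-ins for the PDS's cryptographic access control.

```python
# Hypothetical record in the PDS, with fields labeled by sensitivity.
RECORD = {
    "identity": {"name": "A. Patient", "phone": "555-0100"},
    "accelerometer": [1.0, 1.1, 3.6],
    "analysis_result": {"event": "possible_fall", "confidence": 0.92},
}

# Each party's grant names exactly the fields it may read: the
# analysis engine never sees identity; the intervention agent never
# sees raw sensor data.
GRANTS = {
    "fall-analysis-engine": {"accelerometer"},
    "family-intervention-agent": {"identity", "analysis_result"},
}

def release(record: dict, party: str) -> dict:
    """Return only the fields the party is authorized to read."""
    allowed = GRANTS.get(party, set())
    return {k: v for k, v in record.items() if k in allowed}
```

In the real PDS this filtering is enforced with encryption (each field readable only with the grantee's key) rather than by a trusted filter function.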
Privacy and Trust Principles
The PDS supports the following principles, derived from the White House’s Precision Medicine Initiative Privacy and Trust Principles.
- Data Quality and Integrity: Data quality and integrity should be maintained at all stages -- collection, maintenance, use, and dissemination. Standards of accuracy, relevance, and completeness should be kept appropriately up to date. Users can directly report and correct data inaccuracies.
- Data Sharing, Access, and Use: Data access, use, and sharing should be permitted for authorized purposes only. Certain activities should be expressly prohibited, including sale or use of the data for targeted advertising. The PDS is privacy preserving: data can be accessed without identifying the user. The system has fine-grained access control to permit multiple tiers of data access.
- Governance: The project should include representation at all levels of program oversight, design, implementation, and evaluation. The PDS addresses this using NIST’s new Privacy Risk Management Framework (PRMF). Tozny is among the first to build a system from the ground up using this process.
- Transparency: A dynamic information-sharing process should be developed to ensure all participants remain informed through all stages of participation. The PDS can facilitate the collection of aggregate data to allow for transparent public reporting, without identifying specific users.
- Respecting Participant Preferences: The project should be inclusive, engaging individuals from communities with varied preferences and risk tolerances about data collection and sharing. The PDS includes software services for management of information sharing and consent.
- Participant Empowerment through Access to Information: Participants should be able to access the information they contribute in consumer-friendly and innovative ways. The PDS offers users the ability to access their own data and receive reports and information about their data.
Apply for the Pilot
We’re funded by NIST to bring security and privacy to select organizations.