Accelerating Responsible Use of De-identified Data in Algorithm Development
Digitization of health care continues to yield ever-growing data sources that, together with advancements in analytics and machine learning, offer significant opportunities for breakthroughs in care delivery and biomedical discovery. While there are detailed regulatory and industry guidelines for handling individually identifiable data, de-identified data is typically not subject to privacy laws.
Organizations struggle to balance privacy and security safeguards with innovative use of de-identified data in algorithm development. Striking this balance requires industry stakeholders to engage in good faith to establish trust between institutions that produce data and those that use it. To that end, the Trust Framework for Accelerating Responsible Use of De-identified Data in Algorithm Development is a necessary first step toward collaborating with industry stakeholders to establish fair and achievable guidelines.
Review the Trust Framework
Health Evolution Forum is engaging health care and technology leaders to ensure that the Trust Framework represents a comprehensive understanding of the challenges and that the potential standards being explored are well informed. This effort will proceed in an iterative fashion to develop and refine guidelines within each of the Trust Framework’s six principles.
Click here to download the draft Trust Framework
Submit Your Feedback
Please email Ye Hoffman, Director, Forum at YeH@healthevolution.com with your feedback on:
• Whether the Trust Framework principles are sufficient to address the overarching goal
• What key questions the Trust Framework must answer within the six principles
• Which industry organizations and subject matter experts to prioritize when soliciting input
Health Evolution Forum is grateful to the Work Group on Governance and Use of Patient Data in Health IT Products for their ongoing commitment, expertise, and contributions to this effort.