The Communications of the ACM (a leading journal in computer science) recently published my perspective on the proposed Common Rule revisions in the quarterly Computing Ethics column. You can also find the full text here as a PDF.
Big Data Analytics and Revision of the Common Rule
By Jacob Metcalf
Communications of the ACM, Vol. 59 No. 7, Pages 31-33
“Big data” is a major technical advance in terms of computing expense, speed, and capacity. But it is also an epistemic shift wherein data is seen as infinitely networkable, indefinitely reusable, and significantly divorced from the context of collection.1,7 The statutory definitions of “human subjects” and “research” are not easily applicable to big data research involving sensitive human data. Many of the familiar norms and regulations of research ethics were formulated for prior paradigms of research risks and harms, and thus the formal triggers for ethics review are miscalibrated. We need to reevaluate long-standing assumptions of research ethics in light of the emergence of “big data” analytics.6,10,13
The U.S. Department of Health and Human Services (HHS) released a Notice of Proposed Rule-Making (NPRM) in September 2015 regarding proposed major revisions (the first in three decades) to the research ethics regulations known as the Common Rule.a The proposed changes grapple with the consequences of big data, such as informed consent for bio-banking and universal standards for privacy protection. The Common Rule does not apply to industry research, and some big data science in universities might not fall under its purview, but the Common Rule addresses the burgeoning uses of big data by setting the tone and agenda for research ethics in many spheres.