
02/21/2025
Professor Harnett
There once was a young doctor who aspired to be a clinical researcher. For this example story, we refer to him as “the young doc”. The young doc is a relatively new clinical researcher at a large academic research center. His research focuses on outcomes analysis and primarily involves retrospective data. The young doc is seeking promotion to associate professor.
After receiving privacy board approval for his investigator-initiated study, and assuming his department would fund the cost of acquiring data from the institutional data broker, he submitted his data request through the established channels. Because the research was unfunded and the department faced budget constraints, the business director declined the request.
Armed with an approved IRB protocol, he was still eager to proceed. He discovered that he could obtain the data for his analysis from the hospital’s electronic health record (EHR) reporting tools. He then downloaded patient data, also known as protected health information (PHI), to his laptop.
While EHR data that has been properly processed for research, with appropriate approvals such as an IRB-granted ‘waiver of consent’, generally falls within what the federal Privacy Rule (HIPAA) permits, improperly accessed data does not. The data the young doc downloaded contained direct identifiers. Circumventing the established process was a significant error.
A few weeks later, the young doc’s car was burglarized and the laptop was stolen. He reported the incident to the institutional Privacy Director, as required by federal law. Soon after, the hospital contacted the young doc and presented a report showing he had accessed patient records outside of clinical operations. The young doc acknowledged the access was for research purposes. The system logs showed that data had been downloaded for approximately 2,000 patients.
The young researcher’s situation worsened. He was then questioned by the Privacy Directors at both the university and the health system. The young doc admitted he had no protected time for research and had used his personal laptop to access the data because he conducted most of the analysis from home on his personal time. After all, he had an IRB-approved protocol.
The young doc was now being investigated by the hospital for inappropriate access to clinical data and by the university information security team for how the data was stored. The U.S. Department of Health and Human Services (HHS) final Security Rule (45 CFR § 164.312(a)(2)(iv) and (e)(2)(ii)) makes encryption an addressable implementation specification, meaning a covered entity must implement it where reasonable and appropriate or document an equivalent alternative safeguard. Accordingly, the institutional policy governing data classification and minimum safeguards requires PHI to be encrypted.
However, the young doc believed he had complied with the requirement to encrypt sensitive data. Because his personal computer did not have full-disk encryption, he used the operating system’s built-in settings to create an encrypted volume on the laptop’s hard drive before downloading the data, and that is where he stored it.
The Privacy Director requested documentation of his encryption methodology. Unfortunately, the young doc had none: no screenshots of the configuration, no confirmation prompts, no system log files. Events such as encrypted-directory activity, status changes, unlock attempts, and policy settings are typically logged only on the local machine, and that machine was gone.
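This is the kind of evidence the young doc could not produce. As an illustration only, here is a minimal Python sketch of one way a researcher might capture and preserve the operating system’s own report of encryption status before placing sensitive data on a device. The commands themselves exist (manage-bde for BitLocker on Windows, fdesetup for FileVault on macOS), but the drive letter, output directory, and function name are illustrative assumptions, and the commands may require an administrator shell.

"""
Minimal sketch: capture point-in-time evidence that a drive or volume is
encrypted, written to a timestamped report that can be copied off the laptop.
Assumes BitLocker on Windows ("manage-bde -status") or FileVault on macOS
("fdesetup status"); paths and names are illustrative.
"""
import datetime
import pathlib
import platform
import subprocess

def capture_encryption_status(evidence_dir: str = "encryption_evidence") -> pathlib.Path:
    """Run the OS encryption-status tool and write a timestamped report."""
    system = platform.system()
    if system == "Windows":
        # BitLocker status for the system drive (drive letter is an assumption).
        cmd = ["manage-bde", "-status", "C:"]
    elif system == "Darwin":
        # FileVault full-disk encryption status on macOS.
        cmd = ["fdesetup", "status"]
    else:
        raise RuntimeError(f"No encryption-status command configured for {system}")

    # May require elevated privileges; check=True raises if the command fails.
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)

    out_dir = pathlib.Path(evidence_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    timestamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    report = out_dir / f"encryption-status-{timestamp}.txt"
    report.write_text(
        f"Captured: {timestamp}\nHost: {platform.node()}\nCommand: {' '.join(cmd)}\n\n"
        + result.stdout
    )
    return report

if __name__ == "__main__":
    path = capture_encryption_status()
    print(f"Encryption status evidence written to {path}")
    print("Copy this report to an institution-managed location, not just the laptop.")

The point of a step like this is that the evidence must be stored somewhere other than the device itself, so that it survives the loss or theft of the laptop.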
The forensics team therefore determined there was no evidence that the patients’ data was encrypted. It likely was, but without documented evidence it had to be treated as unencrypted, and the incident was reported to HHS as a data breach.
Following the breach notification, the hospital was required to post a notice on its website and notify local media. Letters were sent to every affected patient, explaining and apologizing for the potential data loss, and the health system paid for one year of credit protection for each person. The incident eroded public good faith toward the hospital and university, and the financial costs incurred by the health system ran into the tens of thousands of dollars.
The young doc was terminated from his clinical position for violating the data use policy and reprimanded by the university for inappropriate use of personal devices and improper data management. He was required to retake HIPAA and data management training.
It got worse. A group of patients whose data was breached filed a class-action lawsuit. The young doc hired a lawyer to argue that his conduct was not willful neglect, because he claimed he had encrypted the volume on his laptop but simply could not prove it. Civil monetary penalties differ significantly depending on whether a violation involves willful neglect. This will have to play out in the legal system.
What went wrong?
Despite an established process, the young doc acquired research data inappropriately and illegally; all activity in an EHR is granularly logged. The data was improperly stored because he used his personal laptop instead of a university-issued device, which comes fully encrypted and documented. The department failed to properly oversee the actions of a young researcher. Promotion to associate professor is now unlikely, and he will likely be terminated from the university, since he no longer performs clinical duties and lacks external funding.
What should have been done?
He should have found a way to fund the cost of acquiring the data from the Honest Broker through the compliant, established process, including any necessary reclassification of the request. The data should have been stored on enterprise data servers or on an institution-provided encrypted laptop. He should have better understood the compliance requirements for patient data used in research.
Moral of the story…
Researchers need to be properly trained in managing clinical data for research, especially if they have clinical access to the systems.