Healthcare IT security expert Mac McMillan, president of CynergisTek, pulled no punches as he explained the responsibilities that come with attesting to meaningful use (MU), the federal government financial incentive program to motivate healthcare providers to adopt electronic medical records. Based on his firm's experiences and those of peers in this field, a worrisome proportion of meaningful users may already have committed fraud.
McMillan believes that healthcare providers do not take the need to implement security measures as seriously as they should. Maintaining consistently high levels of security has not been part of the healthcare industry's culture.
Because HIPAA rules have been law for a decade, complying with stage 1 and 2 meaningful use privacy and security requirements should require minimal additional effort. "They are exactly the same," he said. "If you are complying with the HIPAA rules, you are complying with the MU rules."
But many are not, if the results of the first federal audit of meaningful use attestees are representative. Forty-seven (80%) of 59 providers failed audits conducted in April 2012 by U.S. Centers for Medicare and Medicaid Services' (CMS) contractor Figliozzi and Company.
Risk analysis and assessment
Before any hospital or eligible provider formally joins the meaningful use program, a risk analysis and assessment needs to be undertaken or reviewed. The risk analysis identifies any risks that could compromise patient data and records; the risk assessment is the process of conducting that analysis. Although the two terms tend to be used interchangeably, the end result is the same: identify points of vulnerability and develop a remediation plan to correct them.
"This is nothing new," McMillan said. "Anyone complying with the HIPAA technical security rule has already done this, on at least an annual basis or whenever there is a major healthcare IT change, such as acquisition of new software. If you are already doing your risk assessments under HIPAA like you are supposed to be doing, you don't have to do another one."
If the potential attestee has not been complying with this HIPAA privacy and security rule, a risk analysis needs to be conducted in accordance with 45 CFR 164.308(a)(1). Technical controls within the certified electronic health record (EHR) system or modules also need to be implemented.
From the day the MU applicant attests, it is necessary to retain copies of all system-related information, tests, self-audits, screen shots of controls, configuration files, and subsequent risk analyses so that this documentation will be available for a federal audit. Auditors require tangible evidence of compliance.
If the MU attestee has not done any of this, the attestee has committed fraud, based upon the statement preceding the signature at the end of the attestation:
I certify that the foregoing information is true, accurate, and complete. I understand that the Medicare/Medicaid EHR incentive program payment I requested will be paid from Federal Funds, and the use of any false claims, statements, or documents, or the concealment of a material fact used to obtain Medicare/Medicaid EHR incentive program payment may be prosecuted under Federal or State criminal laws and may also be subject to civil penalties.
What's a user to do?
"Not conducting a risk assessment becomes a fraud and financial issue," McMillan said. "Money can be taken back, additional fines can be added, and if somebody decides that someone has done something nefarious, you can go to jail. You need to realize this."
"Also, you cannot falsify documents with respect to the date that you conducted your risk assessment," he added. "I had one client who recently asked me if my company would do this. I told that client that I don't look good wearing orange."
An assessment conducted after attestation could be an audit issue; however, late is better than not at all.
McMillan also pointed out a difference between HIPAA and MU remediation requirements: The U.S. Office for Civil Rights (OCR) will give a healthcare provider "credit" for being in the process of implementing remediation; CMS offers only a 12-month grace period.
He also reminded the attendees that it is the purchaser, not the vendor, who must turn on the audit functions that demonstrate a certified EHR system's or module's compliance. Each certified EHR system or module has a core set of technical security functions, but certification does not guarantee that they are optimally configured or deployed.
The purchaser will need to ensure these functions are enabled correctly and demonstrate the ability to use, review, and report on them. McMillan reminded attendees that their risk analysis should specifically measure the maturity of these controls and their adequacy to protect electronic health information. These criteria are defined in the U.S. Office of the National Coordinator for Health IT (ONC) meaningful use core measures (measure 15) dated November 2010. ONC also explains these in its published guide to privacy and security of health information.
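The ability to review and report on audit controls can be demonstrated programmatically. As an illustrative sketch only (the export format, column names, and required fields here are hypothetical, not drawn from any particular certified EHR product), a script might verify that an exported audit log contains the fields needed to show who accessed which patient record and when:

```python
import csv
import io

# Hypothetical required audit-log fields; a real certified EHR export
# and its column names will differ by vendor.
REQUIRED_FIELDS = {"user_id", "timestamp", "action", "patient_record_id"}

def check_audit_export(csv_text):
    """Return the file line numbers of rows missing any required field value."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing_cols = REQUIRED_FIELDS - set(reader.fieldnames or [])
    if missing_cols:
        raise ValueError("export lacks required columns: %s" % sorted(missing_cols))
    incomplete = []
    for n, row in enumerate(reader, start=2):  # line 1 is the header
        if any(not row[field].strip() for field in REQUIRED_FIELDS):
            incomplete.append(n)
    return incomplete

sample = (
    "user_id,timestamp,action,patient_record_id\n"
    "jdoe,2013-01-05T09:12:00,view,MRN1001\n"
    ",2013-01-05T09:15:00,edit,MRN1002\n"
)
print(check_audit_export(sample))  # [3] -- the row with an empty user_id
```

A check like this would not by itself satisfy an auditor, but it illustrates the kind of self-audit evidence (reviewable, repeatable, documented) the risk analysis is supposed to show the organization can produce.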
The user of certified systems or modules is the party responsible for setting the rules and configuring the software properly; vendors are responsible for providing products that can meet the requirements. A participant asked what to do about an uncooperative vendor whose answer to problems with its software was to turn off the security system. McMillan's answer was blunt: "Fire the vendor. Demand a refund. And report them to the ONC."
What do audits request?
Meaningful use audits request certain categories of documentation:
- Documentation from ONC that shows the provider used a certified EHR system
- Documentation that supports the provider's attestation for the core set of MU criteria
- Documentation that supports the provider's attestation for the required number of menu set MU objectives
- For hospitals, information about the method used to report emergency department admissions
The provider has 14 days to produce the requested documentation. If it is inadequate, auditors may come to the facility to perform onsite reviews.
This may not be all of the information that is requested, however. In 2012, the Office of Inspector General (OIG) of the U.S. Department of Health and Human Services (HHS) conducted a survey on safeguards in EHRs as part of a study on fraud and abuse. The survey focused on how hospitals were implementing the features of EHRs that limit fraud and increase data integrity.
The survey requested data about access controls, outside entity access, audit log and metadata, entry of clinical notes, transmission of data, patient access and identity management, and any additional safeguards. It did not require entities to submit any documentation to substantiate their answers.
The survey's results were published in September; OIG has not stated what it will do with the findings. McMillan speculated this was a way for HHS to sample compliance, and it could also potentially be interpreted as "fair warning" action. It's possible that audits with teeth may follow as a result of the low compliance findings, he said.
McMillan believes that eligible providers, small clinics, pharmacies, and diagnostic testing labs may be operating entirely outside their competency in dealing with federal privacy and security rules. These providers may not have sufficiently knowledgeable staff; they may presume, for example, that password protection is the same as encryption. They think they are complying when they are not.
Small hospitals may also have limited staff IT resources and funds. If a provider doesn't understand what is requested, it's time to outsource the work to an experienced healthcare IT security firm, he suggested.
The money will be well-spent in exchange for the peace of mind of being in compliance, and also the knowledge that patient records and data are secure.
Copyright © 2013 AuntMinnie.com