Improving the Throughput and Transparency of the HITRUST Assurance Program: July 2020 Update

Date: July 28, 2020

By Bimal Sheth, Vice President of Assurance Services

Welcome back for the July update in our series on Improving the Throughput and Transparency of the HITRUST Assurance Program. For those of you who missed the April update, it can be found here. This blog post is the fifth in a series of communications meant to increase transparency as we continuously work to improve our processes.

In this post, I will highlight two new sets of automated quality checks that have been introduced.

MyCSF-Based Quality Checks

We continue to enable MyCSF-based quality checks, with many more expected shortly. After a successful rollout of certain internal quality checks within MyCSF, the first HITRUST Authorized External Assessor-facing quality checks were enabled at the end of June. Further assessor- and assessed entity-facing quality checks will be progressively enabled over the next few weeks. For each quality check, we encourage you to review the result and work to resolve the issue. If you feel a check should not have been triggered, you have the option to override it and provide an accompanying rationale, which is reviewed as part of HITRUST’s quality assurance (QA) processes.

Post-Submission Quality Checks

Through post-submission quality checks, HITRUST now proactively identifies and communicates issues in the assessments we receive much earlier than before. The early identification of issues helps everyone involved:

  • The assessed entity gets an early warning that quality issues may be present in the assessment.
  • The details of the HITRUST assessment are still fresh in everyone’s minds.
  • The External Assessor team likely still has resources assigned to the engagement.
  • The 90-day maximum duration for validated assessment fieldwork likely has not lapsed, meaning there is a strong likelihood that the identified issues can be fixed without impacting the certification date.

For the past month, the team has been piloting “check-in automated checks.” “Check-in” is HITRUST’s longstanding process of evaluating whether assessment submissions are suitable to enter our QA queue; it is performed within three business days of HITRUST’s receipt of an assessment. “Check-in automated checks” were introduced in June and use analytics to identify over 100 possible quality issues in assessment submissions. The results of these checks are reviewed during check-in, and a determination is made about whether the assessment should be returned to the External Assessor. If the checks uncover issues that are significant and/or pervasive, the assessment is returned to the External Assessor. If not, the submission proceeds into the QA queue for QA at a later date. This does not mean there are no QA issues, but rather that any potential issues will be addressed as part of the normal QA process.
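
To make the decision flow concrete, here is a minimal sketch of how a battery of automated checks might be run against a submission and used to decide its next step. Everything here is a hypothetical illustration for this post: the names, the shape of the assessment data, and the pervasiveness threshold are our own assumptions, not HITRUST’s actual implementation.

```python
# A minimal, hypothetical sketch of "check-in automated checks" -- not
# HITRUST's actual implementation. Names and thresholds are invented.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class QualityFinding:
    check_id: str       # e.g., "NA.2"
    significant: bool   # True if the issue calls the submission into question
    occurrences: int    # number of places the issue appears in the assessment

# Each automated check inspects the submission data and returns its findings.
CheckFn = Callable[[dict], List[QualityFinding]]

def run_checkin_checks(assessment: dict,
                       checks: List[CheckFn],
                       pervasive_threshold: int = 10) -> str:
    """Run every automated check and decide the submission's next step."""
    findings: List[QualityFinding] = []
    for check in checks:
        findings.extend(check(assessment))

    # Mirroring the process described above: significant and/or pervasive
    # issues send the assessment back to the External Assessor; otherwise it
    # enters the QA queue and remaining findings are worked during normal QA.
    total_occurrences = sum(f.occurrences for f in findings)
    if any(f.significant for f in findings) or total_occurrences >= pervasive_threshold:
        return "return_to_external_assessor"
    return "enter_qa_queue"
```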

When the “check-in automated checks” result in the assessment being returned, both the assessed entity and the External Assessor are notified. HITRUST returns the assessment object in MyCSF and also provides an Excel workbook containing actionable recommendations, an explanatory narrative, and answers to some commonly asked questions. While implementation of the communicated recommendations is optional, addressing them greatly reduces the duration of HITRUST’s subsequent QA of the assessment.

Figure 1 shows an example of a potential quality issue communicated in this workbook. We can use the example item in Figure 1 to understand the data contained in each column:

  • Column B – Related MyCSF Report – This is the source report that was pulled from MyCSF to run the check. Most quality checks, this example included, are based on data presented in MyCSF’s “External Assessor Report.” Other data sources considered include the External Assessor Timesheet, the Organizational Information pages within MyCSF, documents linked within MyCSF, and the Organizational Overview and Scope document.
  • Column C – Check Category – The type of check that was run, using the categories mentioned previously. For our example, this is a potential issue flagged by a not applicable (N/A) check. Other possible categories include comment issues, linked documentation issues, timesheet issues, and issues with scoping-related information (including issues in the Organizational Overview and Scope document).
  • Column D – Check ID – The unique reference number of the automated quality check (NA.2, in this example). This reference number is used to relate the recommendation communicated on the workbook’s “Recommendations” sheet to the assessment-specific information on the “Reference Data” sheet.
  • Column E – Potential Quality Issue Found via Automated Check – The name of the potential issue the check identified. For our example, the check identified a situation called “topic level mixed applicability,” which occurs when there are multiple requirement statements about a topic, such as electronic signatures, and some are marked as not applicable while others are scored (a simplified sketch of this idea appears after Figure 1).
  • Column G – Amount/Occurrences – The number of times the specific issue was identified in the assessment (one, in our example).
  • Column H – Description of Potential Issue – A detailed explanation of the potential issue identified by the check.
  • Column I – Recommendation – Tells the user how to review the data on the “Reference Data” sheet and what actions may be necessary. Every recommendation ends by noting that no action is required if the External Assessor and assessed entity feel that none is warranted. In our example, the recommendation is to reconsider the applicability of the requirement statements listed in NA.2’s entry on the “Reference Data” sheet to determine whether they should all be scored or all marked not applicable, so that requirements on the same topic are treated consistently.
Figure 1
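
For readers who want a more concrete picture of the example check, below is a simplified sketch of the “topic level mixed applicability” idea behind NA.2. The Requirement class, the topic grouping, and the find_mixed_applicability function are illustrative assumptions made for this post, not the actual logic running inside MyCSF.

```python
# A simplified, hypothetical sketch of a "topic level mixed applicability"
# check -- not MyCSF's actual implementation.
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Requirement:
    statement_id: str
    topic: str            # e.g., "electronic signatures"
    not_applicable: bool  # True if marked N/A, False if scored

def find_mixed_applicability(requirements: List[Requirement]) -> Dict[str, List[Requirement]]:
    """Flag topics whose requirement statements are inconsistently marked.

    A topic is flagged when some of its requirement statements are scored
    while others are marked not applicable, which is the situation the
    workbook asks the External Assessor and assessed entity to reconsider.
    """
    by_topic: Dict[str, List[Requirement]] = defaultdict(list)
    for req in requirements:
        by_topic[req.topic].append(req)

    return {
        topic: reqs
        for topic, reqs in by_topic.items()
        if any(r.not_applicable for r in reqs)
        and any(not r.not_applicable for r in reqs)
    }
```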

Figure 2 shows the “Reference Data” sheet filtered for the NA.2 entry. For this check, column D shows that seven requirements dealing with the topic of electronic signatures were included in the assessment. Of these seven, five were scored while two were marked as not applicable. The External Assessor and assessed entity should review the seven identified requirements dealing with electronic signatures to determine if the applicability of any should be adjusted for consistency.

Figure 2

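Continuing the sketch above, hypothetical data mirroring the Figure 2 scenario (seven electronic-signature requirements, five scored and two marked not applicable, with invented statement IDs) would be flagged like this:

```python
# Invented data mirroring Figure 2: seven electronic-signature requirements,
# five scored (ES.1-ES.5) and two marked N/A (ES.6, ES.7).
reqs = [Requirement(f"ES.{i}", "electronic signatures", not_applicable=(i > 5))
        for i in range(1, 8)]

flagged = find_mixed_applicability(reqs)
print({topic: len(items) for topic, items in flagged.items()})
# -> {'electronic signatures': 7}: all seven requirements would be listed on
#    the "Reference Data" sheet for the assessor to review for consistency.
```
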
If the External Assessor, working with the assessed entity, determines that changes should be made, they should make them in MyCSF. HITRUST asks that External Assessors use the workbook as a tool and not respond to the concerns in it. The workbook should also not be uploaded to MyCSF.

After the External Assessor has reviewed the workbook and made any necessary adjustments within MyCSF, the assessment is then re-submitted to HITRUST. At this point HITRUST will complete the check-in process and the assessment will enter QA review.

We are working to eliminate the offline Excel workbook by moving the “check-in automated checks” into MyCSF. We plan to make these checks available to External Assessors and assessed entities as an “assessment health check” that can be run on demand before submission to HITRUST. Thanks for your patience while we continue development on this exciting MyCSF enhancement.

Closing Thoughts

We appreciate all of the feedback we have received over the past several months, and we look forward to receiving more of your questions and thoughts. As a reminder, you can submit feedback through your Customer Success Managers, through our UserVoice page, or at feedback@hitrustalliance.net.

