Did VA Provide a Complete, Honest Answer to Senator Wicker’s Questions About PTSD C&P Exam Quality?

I have had this post in draft for over one year. I have hesitated to publish it because I have feared retribution. However, I am publishing it today because I decided to trust VA Secretary McDonald’s emphasis on honesty and transparency, and because our nation’s military veterans deserve to receive high quality (reliable and valid) compensation and pension examinations for PTSD and other mental disorders.

(This post should not be construed as representing the views or opinions of the U.S. Department of Veterans Affairs or the federal government.)

Senator Wicker Asks VA to Describe How the Department Conducts Quality Control of PTSD C&P Exams

In a 2012 hearing of the U.S. Senate Committee on Veterans' Affairs,1 Senator Roger F. Wicker (R – Mississippi) asked the VA to:

Describe how the VA conducts quality control of PTSD C&P Exam Results, and C&P Examiner Performances.

This post reviews VA's answer to the Senator's questions and offers an analysis of that response, which I will break down into four parts. But first, here is the relevant text of VA's statement, given in response to Sen. Wicker's Question 5 on pages 21-22 of the hearing transcript:

VHA’s Office of Disability and Medical Assessment (DMA) conducts quality reviews of VA Compensation and Pension (C&P) examination requests made by VBA and examinations completed by VHA clinicians. The Quality Management section, an integral component of DMA’s quality and timeliness mission, is responsible for the collection and evaluation of VA disability examination data to support recommendations for improvement throughout the VHA and VBA examination process. The quality review program incorporates a three-dimensional approach consisting of an audit review process to assess medical-legal completeness, performance measures, and a review process to assess clinical examination reporting competence.

A mix of staff knowledgeable in both the clinical protocol/practices of the C&P examination process and staff with VBA rating experience perform the reviews. This monthly random sample can include all potential exam types. This quality review process started in October 2011, replacing the former C&P Examination Program that was discontinued in October 2010. Ongoing enhancements to data collection will provide VBA and VHA with detail data to support process improvement.

DMA is charged with improving the disability examination process by monitoring the quality of examinations conducted. Quality is monitored monthly using an audit review tool and the results are reported on a quarterly basis. This intense audit is conducted on all types of disability examinations, assessing consistency between the medical evidence and the examination report.

[Note: Please see the hearing transcript for the first paragraph of VA’s response concerning VBA’s STAR program, and the final paragraph, which discusses DMA’s disability examiner registration and certification process.]

Analysis

In the following sections I will analyze the constituent parts of VA’s statements and discuss concerns with each.

Quote #1 from VA’s Response:

The quality review program incorporates a three-dimensional approach consisting of an audit review process to assess medical-legal completeness, performance measures, and a review process to assess clinical examination reporting competence.

Problem: Ill-defined terminology.

    • The term, medical-legal completeness, is not defined in any VA documentation, as far as I could discover. It is not a term of art in veterans law, disability medicine, forensic psychiatry, or forensic psychology. Therefore, it is not clear what the phrase means.
    • The phrase audit review process presumably refers to the Audit Review Tool (recently renamed the Audit Review Criteria), which is addressed below.
    • Performance measures are not relevant to the Senator’s questions, as they assess productivity, e.g., the percentage of exams completed within the required 30-day time frame. While important, productivity measures do not assess quality.
    • The phrase, a review process to assess clinical examination reporting competence, is vague. Does it refer to the competence of the reporting process? Or the competence of the disability examination itself?

Quote #2 from VA’s response:

A mix of staff knowledgeable in both the clinical protocol/practices of the C&P examination process and staff with VBA rating experience perform the reviews.

Problem A: Misleading.

One would think that the “knowledgeable” staff would be experienced C&P psychologists and psychiatrists. Unfortunately, that is not the case. None of the DMA staff who conduct these reviews are psychologists or psychiatrists. In fact, none of them are mental health professionals.

Problem B: Limited Relevance.

Including “staff with VBA rating experience” might sound good on the surface. Unfortunately, such a background is not that relevant when it comes to reviewing the quality of a C&P exam. Sure, a VBA staffer can help identify problems with the ‘rateability’ of the exam report, but how would they know if the evaluation methods, knowledge, integration of data, and reasoning of the examiner were consistent with professional standards?

A VBA Rater (Rating Veterans Service Representative, or RVSR) does not conduct C&P examinations and is rarely a medical professional. Raters know how to obtain information from an exam report for rating purposes, but they do not possess the requisite education, training, and experience to evaluate the overall quality of a C&P exam for PTSD or other mental disorders.

Quote #3 from VA’s response:

DMA is charged with improving the disability examination process by monitoring the quality of examinations conducted. Quality is monitored monthly using an audit review tool and the results are reported on a quarterly basis.

Problem: The Audit Review Tool does not measure the quality of a C&P examination.

The Audit Review Tool2 is used to determine if the report is sufficient for rating purposes only, and the review is not conducted by psychologists or psychiatrists.

Unfortunately, VA does not make the Audit Review Tool available to the public, so I cannot reproduce it verbatim here. (One could obtain a copy of the Audit Review Tool via a Freedom of Information Act (FOIA) request.3)

But I will attempt to give you an idea of the type of items on the Audit Review Tool without revealing the exact wording.

Overall, it is worth noting that eight of the nine questions consist of simple yes/no responses that all correspond to specific items on the Disability Benefits Questionnaires (DBQs), the forms C&P examiners must use to document their exam findings. For example, examiners must check a box indicating that they reviewed the veteran’s VBA claims file. Therefore, the DMA quality reviewer simply looks to see if the examiner checked “Yes” or “No”.

Listed below is information about the questions on the FY2012 Audit Review Tool that pertain to the C&P examination. (The FY2012 Audit Review Tool is the version that was in effect at the time of the Senate hearing.) Note that Questions 1-8 of the Audit Review Tool are used to assess the VBA’s examination request; thus, those questions are not relevant to PTSD C&P exam quality determinations.

Question 9:  The examiner indicates the type of examination in a checkbox format. Comment: It is an okay question to ask, but it is not a major factor in the overall quality of the exam.

Question 10:  This question asks about nonpsychiatric medical exams, so it is not relevant to C&P exams for PTSD and other mental disorders. Comment:  n/a

Question 11:  Essentially asks if the C&P examiner filled in all the required fields on the DBQ. Comment:  Hardly an in-depth assessment of quality.

Question 12:  Asks if the examiner addressed any contradictions between sources of information. Comment:  Since the reviewers are not experienced C&P psychologists or psychiatrists, they will not recognize all such contradictions; spotting many of these problems would require extensive knowledge of clinical and forensic psychology and psychiatry.

Questions 13 & 14:  Ask about the accuracy of the diagnosis the examiner provided. Comment: Since the reviewers are not psychologists or psychiatrists with experience conducting C&P exams, how would they be able to discern with confidence whether or not a diagnosis is accurate? Sure, they could consult DSM-IV (in use in 2012) or DSM-5 (the current edition of the diagnostic manual), but do they truly possess the education, training, and experience to assess a psychologist’s or psychiatrist’s diagnostic accuracy?

Question 15: Examiners must check boxes on the DBQ indicating whether or not they reviewed the claims file, which includes military records, and relevant post-discharge medical records. Comment: This is yet another “Were the boxes checked?” item.

Questions 16, 17, & 18:  If VBA requested certain information, did the examiner provide it? Comment: This is another simple ‘Yes or No’ item that has nothing to do with the quality of the information provided.

Summary of Audit Review Analysis

Thus, it is not much of an exaggeration to say that the current review process consists of glancing at the DBQ to see if the examiner checked all the right boxes. 

Granted, the information checked via the Audit Review process is important with regard to the ‘rateability’ of the exam report. But the Audit Review process does nothing to evaluate the quality of the conclusions, including medical opinions and their rationale, upon which a veteran’s disability rating is based.

In other words, a computer program could fill out a DBQ with random but coherent-sounding information, checking all the appropriate boxes and filling every field with text. Even though that information would have nothing to do with the veteran’s actual symptoms or functional impairment, the Audit Review process would find the report acceptable.
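To make that point concrete, here is a minimal sketch in Python of a hypothetical completeness-only audit. Everything in it (the field names, fill_dbq_randomly, audit_passes) is my own invention for illustration; none of it comes from VA’s actual Audit Review Tool, which, as noted above, is not available to the public.

```python
# Hypothetical illustration only: a completeness-only "audit" of a DBQ-style record.
# All field names are invented; the real Audit Review Tool is not public.

import random

REQUIRED_CHECKBOXES = ["exam_type_indicated", "claims_file_reviewed",
                       "requested_opinion_provided"]
REQUIRED_TEXT_FIELDS = ["diagnosis", "symptoms", "functional_impact", "rationale"]


def fill_dbq_randomly():
    """Check every box and fill every field with plausible-sounding but meaningless text."""
    record = {box: True for box in REQUIRED_CHECKBOXES}
    filler = ["Veteran reports symptoms.", "Records were reviewed.", "Opinion provided."]
    record.update({field: random.choice(filler) for field in REQUIRED_TEXT_FIELDS})
    return record


def audit_passes(record):
    """A completeness-only review: were the boxes checked and the fields non-empty?"""
    boxes_ok = all(record.get(box) is True for box in REQUIRED_CHECKBOXES)
    fields_ok = all(record.get(field, "").strip() for field in REQUIRED_TEXT_FIELDS)
    return boxes_ok and fields_ok


if __name__ == "__main__":
    dbq = fill_dbq_randomly()
    print("Audit result:", "acceptable" if audit_passes(dbq) else "deficient")
    # Prints "acceptable" even though the content says nothing about the actual veteran.
```

The point of the sketch is not that anyone would do such a thing; it is that a review which only checks for the presence of entries cannot, by design, tell whether those entries are clinically sound.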

Quote #4 from VA’s response:

This intense audit is conducted on all types of disability examinations, assessing consistency between the medical evidence and the examination report.

Problem: The statement does not correspond with what VA actually does.

How is consistency between the medical evidence and the examination report assessed? It is not. Since the reviewers are not experienced C&P psychologists or psychiatrists, they do not possess the requisite education, training, and experience to understand and assess the relevant medical evidence. Consequently, they cannot assess the consistency between the medical evidence and the examination report.

Conclusion

VA provided an eloquent response to Senator Wicker that slides across the page so gracefully, you almost want to believe everything it claims, even if you know better.

Since you do know better, and since no one wants to let someone’s silver-tongued prose cast a spell over them, you can decide for yourself if VA responded completely and honestly to Senator Wicker’s astute query.

My conclusion: VA’s response to Senator Wicker epitomizes well-crafted obscurantism.4, 5

What do you think? Please comment (reply) below.

 

Footnotes
1. S. Hrg. 112-508, VA MENTAL HEALTH CARE: EVALUATING ACCESS AND ASSESSING CARE, 25 April 2012. Text | PDF

2. The name of the Audit Review Tool was recently changed to Audit Review Criteria.

3. Since the Audit Review Tool and Audit Review Criteria are produced by the Veterans Health Administration (VHA), as opposed to the Veterans Benefits Administration (VBA), send a FOIA request via email to:

vhafoia2@va.gov

Or, if you mail your request, send it to:

FOIA REQUEST
CENTRAL OFFICE
DEPARTMENT OF VETERANS AFFAIRS
810 VERMONT AVE
WASHINGTON DC 20420

Be sure to specify which version of the Audit Review Tool and/or Audit Review Criteria you want. I suggest asking for every version, regardless of the name (Tool vs. Criteria).

For more information about FOIA requests, see this web page on the VA website:

http://www.oprm.va.gov/foia/howto_file_foia_request.aspx

4. obscurantism – … 2. deliberate obscurity or evasion of clarity. – Random House Kernerman Webster’s College Dictionary (2010).

5. Buekens & Boudry (2015) describe obscurantism in this way: “[when] the speaker…[sets] up a game of verbal smoke and mirrors to suggest depth and insight where none exists.” Reference: Buekens, F., & Boudry, M. (2015). The dark side of the loon: Explaining the temptations of obscurantism. Theoria, 81, 126–142.