
Witness Testimony of John L. Wilson, Disabled American Veterans, Assistant National Legislative Director

Mr. Chairman and Members of the Committee:

The Disabled American Veterans (DAV) has consistently stated that the keys to successfully reforming the veterans’ benefits claims process are training and accountability, two elements central to producing quality results for veterans.  However, the Veterans Benefits Administration (VBA) today remains production driven, from the Monday morning workload reports to personnel awards.

The DAV strongly believes that quality should be rewarded at least on a par with production. However, for this to occur, VBA must first implement and then inculcate a comprehensive quality control program to complement its quality assurance program.

VBA’s primary quality assurance program is the Systematic Technical Accuracy Review (STAR) program.  The STAR program can identify three types of errors: benefit entitlement, decision documentation/notification, and administrative.  STAR looks at whether a proper pre-decision notice under the Veterans Claims Assistance Act (VCAA) was provided and whether the rating decision was warranted by the available evidence.

Under the STAR program, VA reviews a sampling of decisions from regional offices and bases its national accuracy measures on the percentage with errors that affect entitlement, benefit amount, and effective date.  This is a technical review to ensure a variety of transactions are properly carried out.  Inconsistency in technical accuracy may signal several types of processing problems: uneven or insufficient understanding of governing criteria; operating rules so vague that they permit overly broad interpretation and, in turn, significant variance; or outright arbitrariness in decision-making. Obviously, VA must detect inconsistencies before the cause or causes can be determined and remedied.

The STAR program was implemented in the late 1990s.  It was intended to be a national quality assurance program that would assist VBA in identifying processing vulnerabilities and error trends.  It was designed to draw statistically significant case samples from all regional offices, using stratified random sampling algorithms.  Using this tool, VBA could review a statistically valid case sample each year and calculate its national error rates.  The STAR program was also intended to identify major national error trends so the Compensation and Pension (C&P) program could initiate corrective measures.  Such corrective measures could include training, improved procedural guidance, or automated system improvements.
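For illustration only, the stratified sampling design described above can be sketched in a few lines of Python. The sample size, office names, and error test here are hypothetical placeholders, not VBA’s actual parameters:

    import random

    def stratified_sample(cases_by_office, per_office_n, seed=2009):
        # Draw a fixed-size random sample from each regional office (stratum),
        # so every office is represented regardless of its share of the workload.
        rng = random.Random(seed)
        return {office: rng.sample(cases, min(per_office_n, len(cases)))
                for office, cases in cases_by_office.items()}

    def national_error_rate(sample, has_error):
        # Pool the per-office samples into one national error rate.
        # (A production estimator would weight each stratum by its workload share.)
        reviewed = [case for cases in sample.values() for case in cases]
        return sum(1 for case in reviewed if has_error(case)) / len(reviewed)

Pooled across all offices, such a sample can support a statistically valid national error rate even when no single office’s sample is large enough to judge that office, or its employees, on its own.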

The STAR program was not designed to provide evaluative data at the working unit or individual level.  There are two major reasons for this.  First, the sample sizes used in STAR are small.  While totals may be significant at the national level, breaking the data out by regional office may not yield numbers large enough to identify a trend for an individual employee.  Second, the STAR program essentially assesses the outcome, not the process of getting there.  So, a claim that took two years to process because of piecemeal development would not be charged with an error if the resulting decision was correct and all pre- and post-decision notification requirements were met.  Such processing delays would fall under the purview of quality control in-process reviews and not quality assurance programs such as STAR.
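The sample-size point can be made concrete with the standard 95 percent margin of error for a sampled accuracy rate p; the figures below are illustrative assumptions, not STAR’s actual sample sizes:

\[ \mathrm{ME} = 1.96\sqrt{\frac{p(1-p)}{n}} \]

With p = 0.87, a pooled national sample of n = 10,000 cases carries a margin of roughly ±0.7 percentage points, while n = 20 cases for a single employee carries roughly ±14.7 points, far too wide to distinguish a weak performer from an average one.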

Quality control findings from local in-process reviews would assist a station in assessing overall performance toward the goal of timely and accurate decisions on veterans’ claims.  VBA recognized the importance of such quality control when the STAR program was created to replace the ineffective Statistical Quality Control program.  At that time, VBA requested funding to implement a local C&P quality control program.  That program – called Systematic Individual Performance Assessment (SIPA) – was announced in 2000 as a new initiative to monitor individual performance.  Under this program, the VA would review an annual sample of 100 decisions for each adjudicator to identify individual deficiencies, ensure maintenance of skills, promote accuracy and consistency of claims adjudication, and restore credibility to the system. The reviewers would also have performed related administrative functions, such as providing feedback on reviews, maintaining reports, and playing a role in employee development and ongoing training. Unfortunately, the VA abandoned this initiative in 2002, possibly because of inadequate resources, and proficiency is now apparently assessed subjectively by supervisors based on their day-to-day perceptions of employee performance. Without any quality review at the individual level, the VA is unlikely to impose effective accountability down to the individual adjudicator, where it must reach if optimum quality is expected.

VBA elected instead to install a much-reduced quality control procedure in which coaches review several cases per month for Veterans Service Representatives (VSRs) and Rating Veterans Service Representatives (RVSRs).  This hybrid process has provided neither adequate individual accountability nor sufficiently robust data to identify local process improvements.  Coaches, who must manage day-to-day operations, typically do not have the time to pull case files for ad hoc reviews of employees.

With significant attention on the C&P claims backlog, it is understandable that VBA wants to direct the maximum share of its personnel and resources to claims decision-making.  However, technical accuracy is arguably the most important component of the C&P claims process.  Pushing cases through faster will not help if significant numbers of them are decided incorrectly.

VBA needs to elevate quality to the highest priority.  This means dedicating adequate resources to both quality assurance and quality control programs.  VSRs, RVSRs, and local management teams need to understand that high-quality work will be recognized and rewarded.  Further, they need to understand that there will be clear thresholds for individual quality, that repeated errors will be identified and traced to the responsible processor, and that there will be appropriate consequences for insufficient accuracy rates.

The STAR program was evaluated by the VA Office of Inspector General (OIG) as part of its review of compensation rating accuracy in March 2009, in the report titled “Audit of Veterans Benefits Administration Compensation Rating Accuracy and Consistency Reviews.” The OIG determined that VBA’s STAR program does not provide a complete assessment of rating accuracy.

During the 12-month period ending in February 2008, VBA’s STAR process did not effectively identify and report all errors in compensation claim rating decisions. Of the approximately 882,000 compensation claims measured under STAR, VBA estimated that about 87 percent were technically accurate.  The OIG, on the other hand, reviewed a random sampling of cases that had also been reviewed by STAR reviewers and found additional errors; it projected an accuracy rate of only 78 percent. The OIG also audited brokered cases (claims transferred to another regional office for processing) and found an accuracy rate of 69 percent in that sampling.  Combining the audit of brokered claims with the STAR-reviewed claims results in a projected accuracy rate of about 77 percent.  The OIG determined that this equates to approximately 203,000 claims in that one year alone in which veterans’ monthly benefits may be incorrect.
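The OIG’s 203,000-claim figure follows directly from the arithmetic: applying its combined 77 percent accuracy rate, that is, a 23 percent error rate, to the year’s workload gives

\[ 882{,}000 \times (1 - 0.77) \approx 203{,}000 \]

claims in which veterans’ monthly benefits may be incorrect.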

The OIG found that STAR reviewers missed some of these errors because they did not thoroughly review available medical and non-medical evidence, did not identify the absence of necessary medical information, or misclassified benefit entitlement errors as comments.

On their surface, these findings result from STAR reviewers not finding all the errors in the cases they reviewed.  They also point to the need for greater management oversight and an effective formal training program for the STAR reviewers.  STAR reviewers could benefit from formal training; however, STAR managers have said that current workload requirements do not allow for the necessary training time.  This is a common theme for VBA, one that underlies even STAR reviews: quantity over quality, even in the very program that is supposed to ensure quality, at least of a technical nature.

The need for a quality control program as an adjunct to the STAR program can also be seen through a review of the Board of Veterans’ Appeals’ Summary of Remands.  The summary represents a statistically large and reliable sample of certain measurable trends. The examples must be viewed in the context of the VA (1) deciding over 880,000 cases per year; (2) receiving over 133,000 Notices of Disagreement; and (3) receiving over 49,000 appeals to the Board. The examples below are from fiscal year (FY) 2009:

  1. The Board remanded 801 cases because no “notice” under section 5103 was ever provided to the claimant. In addition, 4,048 cases were remanded for inadequate or incorrect notice, some of which may result from the generic notice letters currently sent by VBA. The DAV continues to call for changes to these letters to include more specific information, which could help lower the incidence of this error.
  2. VA failed to request service medical records in 1,739 cases and failed to request personnel records in 1,511 cases. These numbers are disturbing because requesting a veteran’s service records at the outset is the foundation of every compensation claim.
  3. The Board remanded 7,814 cases for failure to request VA medical records. The disturbing factor here is that a VA employee can usually obtain VA medical records without ever leaving his or her computer screen.
  4. Another 3,187 cases were remanded because the claimant had requested a travel board hearing or video-conference hearing. Again, there is a disturbing factor here: the checklist used before sending an appeal to the Board contains a section that specifically asks whether the claimant has requested such a hearing.

The examples above total 19,100 cases, or 34 percent of appeals reaching the Board, all of which cleared the local rating board and the local appeals board with errors that are elementary in nature. Yet these errors were either not detected or they were ignored. Many more cases were returned for more complex errors. Regardless, a 34 percent error rate on such basic elements of the claims process, involving VBA’s most senior rating specialists, is simply unacceptable.
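The 19,100 figure is simply the sum of the remand counts in the four examples above:

\[ 801 + 4{,}048 + 1{,}739 + 1{,}511 + 7{,}814 + 3{,}187 = 19{,}100 \]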

The problem with the current accountability system is that VBA employees who commit such errors are usually not held responsible. With no incentive to prevent such errors and a constant focus on production, quality will continue to decline.

DAV agrees with the VA OIG that VBA could improve the STAR program by establishing:

  1. a mechanism to ensure STAR reviewers evaluate all documentation related to the claim selected for review;
  2. a requirement that all STAR reviewer comments receive a second review to make sure the reviewer appropriately recorded a comment rather than a benefit entitlement error;
  3. procedures to review brokered claims as part of the STAR program; and
  4. minimum annual training requirements for each STAR reviewer that are comparable to regional office rating staff training requirements.

In addition, DAV recommends that VBA establish a quality control program that examines claims in process in order to determine not just whether a proper decision was made, but how it was arrived at, so as to identify ways to improve the system.  Combining results from such quality control reviews with STAR’s quality assurance results and the data from remands by the Board of Veterans’ Appeals and the Court of Appeals for Veterans Claims could yield valuable information on trends and causes of errors.  If the data from all such reviews could be incorporated into a robust IT system, proper analysis would provide management and employees important insights into processes and decisions.  This in turn would lead to quicker and more accurate decisions on benefits claims and, most importantly, to the timely delivery of all earned benefits to veterans, particularly disabled veterans.

That concludes my testimony.  I would be pleased to answer any questions the Committee may have.