
Witness Testimony of Kerry Baker, Disabled American Veterans, Assistant National Legislative Director

Mr. Chairman and Members of the Subcommittee:

On behalf of the 1.3 million members of the Disabled American Veterans (DAV), I am honored to appear before you today to discuss the effectiveness of the Veterans Benefits Administration’s (VBA’s) training, performance management, and accountability requirements.  In accordance with our congressional charter, the DAV’s mission is to “advance the interests, and work for the betterment, of all wounded, injured, and disabled American veterans.” 

TRAINING

VBA has a standard training curriculum for new claims processors and an 80-hour annual training requirement for all claims processors.  VBA’s training program is essentially a three-stage system.  First, VBA policy requires new staff to complete orientation training, which is provided in their home offices.  Second, they are required to attend a two- to three-week centralized training course that provides a basic introduction to job responsibilities.  Third, new staff are required to spend several more months in training at their home offices, which includes on-the-job training and/or instructor-led training that follows a required curriculum delivered through an online learning tool called the Training and Performance Support System (TPSS).  VBA policy states that all claims processors are required to complete a minimum of 80 hours of training annually.  VA Regional Offices (ROs) have some discretion over what training they provide to meet this requirement.

The VA’s three-phased program for new claims processors is designed to deliver standardized training, regardless of training location or individual instructors.  Each topic in the training program includes a lesson plan with review exercises, student handouts, and copies of the slides used during the instructor’s presentation.  The VBA also has an annual 80-hour training requirement for new and experienced rating veterans service representatives (RVSRs) and veterans service representatives (VSRs).

The first phase of training for new VSRs and RVSRs is prerequisite training.  It begins at new employees’ home regional offices when they start working.  The prerequisite training is designed to lay the foundation for future training by introducing new employees to topics such as the software applications used to process and track claims, medical terminology, the system for maintaining and filing a case folder, and the process for requesting medical records.  VBA specifies the topics that must be covered during prerequisite training; however, ROs can choose the format and time frame for the training.  New VSRs and RVSRs typically spend two to three weeks completing prerequisite training in their home office before they begin the second program phase.  Nonetheless, VA personnel informed the DAV that many new employees are allowed only approximately seven days to complete the training.

The second phase of training is known as centralized training, wherein new VSRs and RVSRs spend approximately three weeks in classroom training.  The three-week time frame is misleading, however, because portions of the first and last weeks are used for travel.  Therefore, the actual training time is closer to two weeks.

Participants from multiple ROs are typically brought together in centralized training sessions, usually held at the Veterans Benefits Academy in Baltimore, Maryland.  Centralized training provides an overview of the technical aspects of the VSR and RVSR positions.  These training classes usually have at least three instructors, but the actual number can vary depending on the size of the group.  VBA’s goal is to maintain a minimum ratio of instructors to students.

To practice processing different types of claims, VSRs work on either real claims or hypothetical claims designed specifically for training.  Centralized training for new RVSRs focuses on topics such as the systems of the human body, how to review medical records, and how to interpret medical exams.  To provide instructors for centralized training, VBA relies on RO staff who have been trained as instructors.  Centralized training instructors may be VSRs, RVSRs, supervisors, or other staff identified by RO managers as having the capability to be effective instructors.

The VBA has increased the number of training sessions because of the influx of new staff.  In fiscal year 2007, VBA increased the frequency of centralized training and its student capacity at the Veterans Benefits Academy.  During fiscal year 2007, VBA held 67 centralized training sessions for 1,458 new VSRs and RVSRs; those sessions were conducted at 26 different ROs in addition to the Veterans Benefits Academy.  By comparison, during fiscal year 2006, VBA held 27 centralized training sessions for 678 new claims processors.  Nonetheless, VBA has not run the Veterans Benefits Academy near full capacity in 2008, and the reasons for this are unclear.

When new VSRs and RVSRs return to their home offices after centralized training, they are required to begin the third phase of training, which is supposed to include on-the-job training, classroom training, and computer-based training modules that are part of VBA’s Training and Performance Support System (TPSS), all conducted by and at the RO.  New VSRs and RVSRs typically take about six to twelve months after they return from centralized training to complete all the training requirements for new staff.

In addition to the foregoing three-phase training program, VBA also requires 80 hours of annual training for all VSRs and RVSRs.  The training is divided into two parts.  At least 60 hours must come from a list of core technical training topics identified by the Compensation and Pension (C&P) Service.  VBA specifies more core topics than are necessary to meet the 60-hour requirement, so regional offices can choose the topics most relevant to their needs.  They can also choose the training method used to address each topic, such as classroom or TPSS training.  RO managers decide the content of the remaining 20 hours.

Analysis and Recommendations

The DAV has consistently maintained that VA should invest more in training adjudicators and decision makers, and that it should hold them accountable to higher standards of accuracy.  Nonetheless, such training has not been a high priority in VBA.  We have further consistently stated that proper training leads to better quality, and that quality is the key to timeliness.  Timeliness follows from quality because omissions in record development, failures to afford due process, and erroneous decisions require duplicative work, which adds to the load of an already overburdened system.  The VBA will only achieve such quality when it devotes adequate resources to comprehensive and ongoing training, devotes sufficient time to each case, and imposes and enforces quality standards through effective quality assurance methods and accountability mechanisms.

One of the most essential resources is experienced and knowledgeable personnel devoted to training.  Greater management devotion to training and quality requires a break from the status quo of production goals above all else.  In a 2005 report from VA’s Office of Inspector General, VBA employees were quoted as stating: “Although management wants to meet quality goals, they are much more concerned with quantity.  An RVSR is much more likely to be disciplined for failure to meet production standards than for failing to meet quality standards,” and that “there is a lot of pressure to make your production standard.  In fact, your performance standard centers around production and a lot of awards are based on it.  Those who don’t produce could miss out on individual bonuses, etc.”[1]  Little, if anything, has changed since the Inspector General issued this report.

A review of VBA’s training programs described above reveals that the problems caused by a lack of accountability do not begin in the claims development and rating process; they begin in the training program.  Essentially, we can find little, if any, measurable accountability in VBA’s training program.

For example, despite VBA’s requirement that new hires complete phase-one training before advancing to phase-two centralized training, some VA employees anonymously informed the DAV that many candidates begin centralized training without having had the opportunity to participate in or complete phase-one training.  Additionally, candidates are not held accountable through formal testing on the subjects taught during phase-one training.  Oversight of this portion of training may or may not exist; the DAV could find none.

Without critiquing the substance of the subject matter VBA teaches during phase-two training, or any other phase for that matter, we again limit our analysis to accountability.  As in phase one, VBA refuses to test participants in phase-two training.  The obvious goal is simply to ensure that employees attend the required course; ensuring that employees achieve VBA’s learning objectives appears to have no priority whatsoever.

By now, a new employee has approximately one month of training and is supposedly prepared for phase-three training.  Keep in mind that during phase three, new employees work on real-world cases whose outcomes affect the lives and livelihoods of disabled veterans and their families.  Real cases notwithstanding, again there is no accountability, no testing, and no oversight beyond that provided locally; and again, that oversight is not measured nationally.

The result of such an unsupervised and unaccountable training system is that no distinction exists between unsatisfactory performance and outstanding performance.  This lack of accountability during training further reduces, or even eliminates, employee motivation to excel.  This institutional mindset is further epitomized in VBA’s day-to-day operations, where employees throughout VBA are reminded daily that maximizing work output is far more important than quality performance and accurate work.

The effect of VBA’s lack of accountability in its training program was demonstrated when it began offering skills certification tests to support certain promotions.  Beginning in late 2002, VSR job announcements began identifying positions at the GS-11 level as contingent upon successful completion of a certification test.  The test consisted of 100 open-book, multiple-choice questions.  The VA allowed participants to use online references and any other reference material, including individually prepared notes, while taking the test.

The first validation test was administered in August 2003.  Of the 298 participants, 75 passed, a pass rate of 25 percent.  VBA conducted a second test in April 2004; of 650 participants, 188 passed, a pass rate of 29 percent.  Because of the low pass rates on the first two tests, a 20-hour VSR “readiness” training curriculum was developed to prepare VSRs for the test.  A third test was administered on May 3, 2006, to 934 VSRs nationwide.  Still, the pass rate was only 42 percent.  Keep in mind that these tests were not for training; they determined promotions from GS-10 to GS-11.
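For clarity, the arithmetic behind these pass rates is shown in the short Python sketch below; the 2006 pass count is not stated in this testimony and is back-calculated from the reported 42-percent rate, so it is approximate.

```python
# Pass-rate arithmetic for the VSR skills certification tests cited above.
# The 2003 and 2004 figures are from the testimony; the 2006 pass count is
# not stated and is estimated from the reported 42-percent pass rate.
tests = {
    "August 2003": (298, 75),   # (participants, passed)
    "April 2004": (650, 188),
}

for session, (participants, passed) in tests.items():
    rate = passed / participants * 100
    print(f"{session}: {passed} of {participants} passed ({rate:.0f} percent)")

# May 2006: 934 participants, reported pass rate of 42 percent.
print(f"May 2006: roughly {round(934 * 0.42)} of 934 passed (42 percent)")
```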

The VBA recently began similar testing of RVSRs.  The DAV was unable to obtain those test results.  VA employees nonetheless informed us that VBA’s test results for RVSRs were no better than the test results for VSRs.

These results reveal a certain irony, in that VBA will offer a skills certification test for promotion purposes, but does not require comprehensive testing throughout its training curriculum.  The following “accountability” portion of this testimony further illustrates the product of inadequate training. 

Mandatory and comprehensive testing designed cumulatively from one subject area to the next, for which VBA then holds trainees accountable, should be the number one priority of any plan to improve VBA’s training program.  Further, VBA should not allow trainees to advance to subsequent stages of training until they have successfully completed such testing. 

To be fair, the DAV understands that VBA is not solely at fault on this subject.  The VA employees’ union has objected to the type of testing mentioned herein.  We do not expect such objections to cease.  In fact, we feel the only way to moot these objections is for Congress to mandate such testing through statutory change.  Section 105 of H.R. 5892 mandates some testing for claims processors and VBA managers, which is an improvement over current practices; however, it does not mandate the type of testing during the training process explained herein.  Measurable improvement in the quality of and accountability for training will not occur until such mandates exist.

ACCOUNTABILITY

The DAV has consistently stated that, in addition to training, accountability is the key to quality, and therefore to timeliness as well.  As it currently stands, almost everything in VBA is production driven.  Personnel awards are based on production; the DAV strongly believes that quality should be rewarded at least on a par with production.  However, for this to occur, VBA must implement stronger accountability measures for quality assurance.

VA’s quality assurance tool for compensation and pension claims is the Systematic Technical Accuracy Review (STAR) program.  Under the STAR program, VA reviews a sampling of decisions from regional offices and bases its national accuracy measures on the percentage with errors that affect entitlement, benefit amount, and effective date.

Inconsistency signals outright arbitrariness in decision making, uneven or insufficient understanding of governing criteria, or decision rules so vague or overly broad that they can be applied according to the prevailing mindset of a particular group of decision makers.  Obviously, VA must detect inconsistencies before their cause or causes can be determined and remedied.

Simply put, there is a gap in quality assurance for purposes of individual accountability in quality decision making.  In the STAR program, a sample is drawn each month from a regional office workload divided among rating, authorization, and fiduciary end products.  For example, a monthly sample of “rating”-related cases generally requires a STAR review of 10 rating-related end products.[2]  Reviewing 10 rating-related cases per month for an average-size regional office, an office that would easily employ more than three times that number of raters, is undeniable evidence of a total void in individual accountability.  If an average-size regional office produced only 1,000 decisions per month, which we feel is quite conservative, the STAR program would review only one percent of the total cases decided by that regional office.  Those figures leave no room for trend analysis, much less personal accountability.

To put this in better perspective, according to VA’s 2007 performance and accountability report, the STAR program reviewed 11,056 compensation and pension (C&P) cases in 2006 for improper payments.  While this number appears significant, the total number of C&P cases available for review was 1,540,211.  Therefore, the percentage of cases reviewed was approximately seven tenths of one percent, or 0.72 percent. 
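The review fractions cited above can be illustrated with the short sketch below; the monthly figures of 10 reviewed cases and 1,000 decisions are the illustrative, conservative numbers used in this testimony, not official VBA workload data.

```python
# Arithmetic behind the STAR review fractions discussed above.

# Illustrative monthly figures for an average-size regional office
# (taken from the conservative example in this testimony):
monthly_sample = 10        # rating-related cases pulled for STAR review
monthly_decisions = 1_000  # conservative estimate of decisions produced
print(f"Monthly review rate: {monthly_sample / monthly_decisions:.1%}")  # 1.0%

# National figures from VA's 2007 performance and accountability report:
reviewed_2006 = 11_056
available_2006 = 1_540_211
print(f"National review rate: {reviewed_2006 / available_2006:.2%}")     # 0.72%
```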

Another method of measuring the VA’s need for more accountability is an analysis of the Board’s Summary of Remands, keeping in mind that the summary represents a statistically large and reliable sample of certain measurable trends.  The examples must be viewed in the context of the VA (1) deciding 700,000 to 800,000 cases per year; (2) receiving over 100,000 notices of disagreement (NODs); and (3) submitting 40,000 appeals to the Board.  The examples below are from October 2006 to October 2007.

The Board remanded 998 cases because no “notice” under section 5103 was ever provided to the claimant.  The remand rate was much higher for inadequate or incorrect notice; however, considering the confusing (and evolving) nature of the law concerning “notice,” we can only fault the VA when it fails to provide any notice.

VA failed to make initial requests for service medical records (SMRs) in 667 cases and failed to make initial requests for personnel records in 578 cases.  The numbers were higher still for failures to make follow-up requests after the first request.  These figures are disturbing because the initial request for a veteran’s service records is the foundation of every compensation claim.  It is claims development 101.

The Board remanded 2,594 cases for initial requests for VA medical records and 3,393 cases for additional requests for VA medical records.  The disturbing factor here is that a VA employee can usually obtain VA medical records without ever leaving his or her computer screen.

Another 2,461 cases were remanded because the claimant had requested a travel board hearing or video-conference hearing.  Again, there is a disturbing factor here: the checklist used before sending an appeal to the Board contains a section that specifically asks whether the claimant has requested such a hearing.

The examples above totaled 7,298 cases, or nearly 20 percent of appeals reaching the Board, all of which cleared the local rating board and the local appeals board with errors that are elementary in nature.  Yet those errors were either not detected or they were ignored.  Many more cases were returned for more complex errors.  A nearly 20-percent error rate on such basic elements of the claims process, in cases that passed through VBA’s most senior rating specialists, is simply unacceptable.
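A minimal sketch of this arithmetic follows; the grouping is our assumption, since the 7,298 total matches the categories listed above only if the 3,393 remands for additional medical-record requests are excluded.

```python
# Arithmetic behind the remand totals cited above. The grouping is an
# assumption: these categories sum to 7,298 only when the 3,393 remands
# for additional VA medical-record requests are left out.
remands = {
    "no section 5103 notice provided": 998,
    "initial SMR requests not made": 667,
    "initial personnel-record requests not made": 578,
    "initial VA medical-record requests not made": 2_594,
    "requested hearing not provided": 2_461,
}

total = sum(remands.values())
appeals_to_board = 40_000  # approximate annual appeals reaching the Board
print(f"Total elementary-error remands: {total}")                        # 7298
print(f"Share of appeals to the Board: {total / appeals_to_board:.1%}")  # 18.2%
```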

The problem with the VA’s current system of accountability is that whether VBA employees ignored these errors hardly matters, because those who commit such errors are usually not held responsible.  They therefore have no incentive to concern themselves with the quality of their work.  Above all else, these figures show that the VA’s quality assurance and accountability systems require significant enhancement.

Recommendation

Congress should require the Secretary to report on how the Department could establish a quality assurance and accountability program that will detect, track, and hold responsible those VA employees who commit egregious errors.  Such a report should be prepared in consultation with the veterans’ service organizations most experienced in the VBA claims process.

The DAV believes that effective accountability can be engineered in a manner that holds each VBA employee responsible for his or her own work as a claim moves through the system, while also holding every employee who handled the claim responsible together.  As errors are discovered (with the definition of such errors to be determined, but tied to individual responsibility), the employees responsible for those errors must be held accountable through forfeiture of a percentage of work credit.

For example, if a Decision Review Officer (DRO) reverses a decision from a front-line rating specialist because of error, as opposed to a difference of opinion or receipt of new evidence, then the front-line employee should forfeit a portion of the work credit that is normally used to track production standards applicable to performance bonuses.  In turn, if a case proceeds to the Board of Veterans’ Appeals (BVA) and is reversed or remanded on similar grounds, then both the front-line rater and the DRO should forfeit work credit, and so on.  The same should apply to Veterans’ Law Judges at the BVA when the Court of Appeals for Veterans Claims finds error in a BVA decision.
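As an illustration only, the sketch below models how such cumulative forfeiture might work; the reviewer roles, the 10-percent forfeiture figure, and the function name are hypothetical assumptions for this example, not features of any existing VBA system.

```python
# Hypothetical sketch of the cumulative work-credit forfeiture idea described
# above. The roles, the 10-percent forfeiture figure, and all names here are
# illustrative assumptions, not an existing VBA system.

FORFEIT_RATE = 0.10  # fraction of work credit forfeited per confirmed error


def apply_forfeiture(work_credit, roles_in_error):
    """Reduce work credit for every employee who decided the claim in error.

    work_credit maps a role (front-line rater, DRO, Veterans' Law Judge) to
    the production credit counted toward performance standards and bonuses.
    roles_in_error lists every role that decided the claim before the error
    was caught; all of them forfeit credit, so no one gains by passing an
    error along.
    """
    adjusted = dict(work_credit)
    for role in roles_in_error:
        adjusted[role] -= adjusted[role] * FORFEIT_RATE
    return adjusted


# Example: the Board finds error in a case decided by a front-line rater and
# sustained by a DRO, so both forfeit a portion of their work credit.
credit = {"front-line rater": 100.0, "DRO": 100.0}
print(apply_forfeiture(credit, ["front-line rater", "DRO"]))
# {'front-line rater': 90.0, 'DRO': 90.0}
```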

Such a cumulative accountability system would effectively eliminate potential abuse of the system through the proverbial good-old-boy club.  One employee would be far less likely to cover for errors or look the other way from errors committed by a fellow employee if they knew their performance standards were equally at risk.  This type of system would ensure personal accountability at every stage in the claims process without seriously disrupting or dismantling VBA’s current performance measurement system. 

PERFORMANCE MEASUREMENT

VA’s benefits delivery system has become increasingly complex, especially considering the various types of claims a beneficiary may file, the various stages of development and decision making within each claim, and the changes that can occur at any particular stage of a claim.  Currently, VA uses over 50 pending end-product codes[3] for a multitude of actions.  The number of end-product codes can be further expanded by “modifiers” that designate specific “issues” for types of claims within a broader category.

The VA’s end-product codes are used in conjunction with its productivity and work measurement systems.  The productivity system is the basic system of work measurement used by the C&P Service, but it is also used for reporting and tracking.  VA’s end-product codes are likewise used in the STAR program.  The end-product code system is also a tool for quantitative measurement, for preparing budget forecasts, and for distributing available staffing.

Quantitative measurement, productivity measurement, and work measurement are tools available to management for comparing and tracking the employment of resources.  Quantitative measurement also allows Central Office and Area Offices to compare stations and to track both local and national trends.  Productivity measurement and work measurement are complementary systems that each depend, in part, on VA’s end-product code system.  The end-product code system is further used to determine the work credit given to VA employees and is therefore vital to measuring employee production goals and awarding performance awards.  Changes should not be mandated that would cause VA to lose the ability to manage and track its day-to-day functions.

The DAV finds no measurable flaws in VBA’s overall measurement systems.  In fact, our foregoing recommendations concerning improvements in VBA’s accountability would draw on the strengths in its measurement systems, thereby allowing easier and less disruptive implementation of stronger and more effective accountability. 

We hope the Subcommittee will review these recommendations and give them consideration for inclusion in your legislative plans.  Mr. Chairman, thank you for inviting the DAV to testify before you today. 


[1] Department of Veterans Affairs Office of Inspector General, Rep. No. 05-00765-137, Review of State Variances in VA Disability Compensation Payments 61 (May 19, 2005). 

[2] See M21-4, Ch. 3, § 3.02.

[3] M21-4, App. A, Glossary of Terms and Definitions. Manpower Control and Utilization in Adjudication Divisions (Pending End Product: “A claim or issue on which final action has not been completed.  The classification code identified refers to the end product work unit to be recorded when final disposition action has been taken.”).