Hearing Transcript on Examination of VA Regional Office Disability Claims Quality Review Methods – Is VBA’s Systematic Technical Accuracy Review (STAR) Making the Grade?
EXAMINATION OF THE U.S. DEPARTMENT OF VETERANS AFFAIRS REGIONAL OFFICE DISABILITY CLAIMS QUALITY REVIEW METHODS
HEARING BEFORE THE SUBCOMMITTEE ON DISABILITY ASSISTANCE AND MEMORIAL AFFAIRS OF THE COMMITTEE ON VETERANS' AFFAIRS
U.S. HOUSE OF REPRESENTATIVES
ONE HUNDRED ELEVENTH CONGRESS
SECOND SESSION
MARCH 24, 2010
SERIAL No. 111-68
Printed for the use of the Committee on Veterans' Affairs
U.S. GOVERNMENT PRINTING OFFICE For sale by the Superintendent of Documents, U.S. Government Printing Office
COMMITTEE ON VETERANS' AFFAIRS
CORRINE BROWN, Florida
STEVE BUYER, Indiana, Ranking
Malcom A. Shorter, Staff Director
SUBCOMMITTEE ON DISABILITY ASSISTANCE AND MEMORIAL AFFAIRS
Pursuant to clause 2(e)(4) of Rule XI of the Rules of the House, public hearing records of the Committee on Veterans' Affairs are also published in electronic form. The printed hearing record remains the official version. Because electronic submissions are used to prepare both printed and electronic versions of the hearing record, the process of converting between various electronic formats may introduce unintentional errors or omissions. Such occurrences are inherent in the current publication process and should diminish as the process is further refined.
C O N T E N T S
March 24, 2010
Examination of the U.S. Department of Veterans Affairs Regional Office Disability Claims Quality Review Methods
OPENING STATEMENTS
Chairman John J. Hall
Prepared statement of Chairman Hall
Hon. Doug Lamborn, Ranking Republican Member
Prepared statement of Congressman Lamborn
WITNESSES
U.S. Department of Veterans Affairs:
Belinda J. Finn, Assistant Inspector General for Audits and Evaluations, Office of Inspector General
Prepared statement of Ms. Finn
Bradley G. Mayes, Director, Compensation and Pension Service, Veterans Benefits Administration
Prepared statement of Mr. Mayes
U.S. Government Accountability Office, Daniel Bertoni, Director, Education, Workforce, and Income Security
Prepared statement of Mr. Bertoni
American Legion, Ian C. de Planque, Assistant Director, Veterans Affairs and Rehabilitation Commission
Prepared statement of Mr. de Planque
American Veterans (AMVETS), Raymond C. Kelley, National Legislative Director
Prepared statement of Mr. Kelley
Disabled American Veterans, John L. Wilson, Assistant National Legislative Director
Prepared statement of Mr. Wilson
National Veterans Legal Services Program, Ronald B. Abrams, Joint Executive Director
Prepared statement of Mr. Abrams
MATERIAL SUBMITTED FOR THE RECORD
Follow-up Letter:
Post-Hearing Questions and Responses for the Record:
EXAMINATION OF THE U.S. DEPARTMENT OF VETERANS AFFAIRS REGIONAL OFFICE DISABILITY CLAIMS QUALITY REVIEW METHODS
Wednesday, March 24, 2010
U. S. House of Representatives,
Subcommittee on Disability Assistance and Memorial Affairs,
Committee on Veterans' Affairs,
Washington, DC.
The Subcommittee met, pursuant to notice, at 2:28 p.m., in Room 334, Cannon House Office Building, Hon. John Hall [Chairman of the Subcommittee] presiding.
Present: Representatives Hall, Donnelly, and Lamborn.
Mr. HALL. Good afternoon, everybody. Thank you for your patience.
Would we all please rise for the Pledge of Allegiance.
[Pledge of Allegiance.]
OPENING STATEMENT OF CHAIRMAN HALL
Mr. HALL. Welcome to the Subcommittee on Disability Assistance and Memorial Affairs' hearing entitled, Examination of the U.S. Department of Veterans Affairs (VA) Regional Office Disability Claims Quality Review Methods—Is the Veterans Benefits Administration's (VBA's) Systematic Technical Accuracy Review or STAR Making the Grade?
We are going to try to make abbreviated opening statements, by myself and Ranking Member Lamborn, as we understand votes will be called at any time.
That said, I welcome you all here in what has been a profoundly historic and important week for the Nation and for our veterans. Over the last 7 days, the full Committee convened a successful Summit, which brought many of you and dozens of other top veteran stakeholders together to aid us in fixing the VA compensation and pension claims process.
From the Summit came a lot of very useful information which we welcome and look forward to using to solve the problems that VA faces in processing disability claims of our Nation's veterans.
Next in a rare Sunday session, Congress passed and the President signed a sweeping health care reform package. And I am pleased that Secretary Shinseki as well as the Chairman of the full VA Committee and the Armed Services Committee have signed a letter and sent it to the veterans service organizations (VSOs) stating unequivocally that TRICARE and VA care will not be adversely affected by the national health care reforms.
Also in the past few days, we passed the End Veteran Homelessness Act of 2010 to provide funding to help Secretary Shinseki's goal of ending homelessness for America's warriors, the Help Heroes Keep Their Homes Act, the COLA, the cost of living increase for veterans, and the National Guard Employment Protection Act.
This particular hearing will be about the Systematic Technical Accuracy Review (STAR) technical review system and we will look at the accuracy of assessing disability compensation and pension claims rating and the disparity between accuracy in the different regional offices (ROs).
Using this quality control tool, VBA should be able to focus attention on poorly-performing ROs and help the agency direct additional staff and training to the problem offices and at the same time look at those who are the highest-performing ROs and find out what they are doing right.
The STAR system was implemented in October of 1998. Since fiscal year 2007, VBA has set for itself a goal of completing compensation claims ratings without error 90 percent of the time.
Its long-term strategy goal is 98 percent. Unfortunately, we are still far from achieving that goal. And until the STAR system provides an accurate accounting of the error rate at VBA, it is difficult to envision a path to meeting this goal.
So we are honored to have today the Office of the Inspector General (OIG) and the U.S. Government Accountability Office (GAO), both of which have produced studies revealing issues that may be impeding the efficiency and consistency of the STAR system. We are looking forward to hearing their testimony.
I am personally troubled by GAO's finding that the VBA claims processing accuracy rate is particularly low in cases pertaining to post-traumatic stress disorder (PTSD) and traumatic brain injury (TBI). Today we hope to analyze these studies and hear testimony regarding those issues from OIG and GAO.
There have been improvements made by the VBA to the claims rating system and the STAR system. We look forward to hearing about these improvements and how we, the Subcommittee, the full Committee, and Congress, can help.
To fully understand STAR, it is important to review the compensation and pension (C&P) rating system itself. To assess a claim, the application for disability assistance must be developed, a process that involves obtaining all necessary evidence to support the veteran's claim. After development, the claims go to a rating veterans service representative (RVSR).
The RVSRs determine if that disability is service-connected and assign a percentage rating that is intended to represent the average earning reduction a veteran with that condition would experience in a civilian occupation. The veteran is then notified of that decision.
For reopened claims, the assigned diagnostic codes affect the veteran's overall combined percentage of disability. Regional office (RO) personnel apply the appropriate diagnostic code percentages to determine the combined level of disability.
Once that claim is completed, it is declared an end product and the result of that claim is cleared and a work credit is given to the regional office. So a completed claim and corresponding cleared end product is then subject to review by STAR reviewers.
In the 110th Congress, I introduced H.R. 5892, which sought to improve VBA's quality control measures. This bill was incorporated into an omnibus veterans' package, which was signed into law as Public Law 110-389. The Veterans Disability Benefits Claims Modernization Act of 2008, H.R. 5892, was part of that. So our hearing should also provide a chance to gauge how well these quality control measures are working.
I thank you for being here, in advance for your testimony, and now I would like to recognize Ranking Member Lamborn for his opening statement.
[The prepared statement of Chairman Hall appears in the Appendix.]
OPENING STATEMENT OF HON. DOUG LAMBORN
Mr. LAMBORN. Well, thank you, Mr. Chairman.
And thank you for waiting, although I would have been happy for you to go ahead and start. But I know that, as a courtesy and in a spirit of bipartisanship, you wanted to wait. So I thank you for that.
And I do apologize. There was a misunderstanding between my staff and me. I thought that we were going to start after this first series of votes because I specifically asked for that. And once they found the mistake, they notified me immediately, but I was two buildings away. So I got here as quickly as I could.
But I know everyone's time here is extremely valuable and so I apologize. It is totally the fault of me and my office.
Mr. Chairman, I want to, like you, welcome everyone to this hearing on Department of Veterans Affairs' STAR Program. Throughout my tenure on this Committee, my fellow Members and I have called for stronger accountability within the VA claims system. For too long, the primary focus has been on production and this has led to an error rate that is unacceptable.
I believe that the VA's greatest challenge, the claims backlog, is largely attributable to hasty decisions made without proper regard for accuracy. The ramifications of this approach can be seen throughout the entire system.
Therefore, VA employee performance awards cannot be based entirely on production. There must also be a valid measure of quality.
Under the STAR Program, a statistically valid sample of rating decisions from various regional offices is reviewed for accuracy. While this method may be useful from a macro perspective, it is not sufficient for ensuring individual accountability.
VA must be able to identify employees in need of individualized remedial training. Without this essential component of the quality assurance process, VA will have perpetual problems in its claims system.
In the 110th Congress, this Committee passed a provision that was included in the Veterans Benefits Improvement Act of 2008 that required VA to conduct a study on the effectiveness of the current employee work credit system.
I believe the upcoming report, along with the testimony that we will hear today, will provide valuable feedback for the Department to improve its Quality Assurance and Accountability Program.
I look forward to hearing from our witnesses today, and I thank you all for your participation.
And, Mr. Chairman, I thank you also and I yield back.
[The prepared statement of Congressman Lamborn appears in the Appendix.]
Mr. HALL. Thank you, Mr. Lamborn.
We are moving at lightning speed now. We would like to ask our first panel to join us at the witness table. We have Belinda J. Finn, Assistant Inspector General for Audits and Evaluations, Office of Inspector General, U.S. Department of Veterans Affairs, and Daniel Bertoni, Director of Education, Workforce, and Income Security with the Government Accountability Office.
Welcome to you both, and your full written statements are entered in the record. So we will recognize you for 5 minutes each for however much of that you would like to give to us directly.
Ms. Finn, welcome, and you are now recognized for 5 minutes.
STATEMENTS OF BELINDA J. FINN, ASSISTANT INSPECTOR GENERAL FOR AUDITS AND EVALUATIONS, OFFICE OF INSPECTOR GENERAL, U.S. DEPARTMENT OF VETERANS AFFAIRS; ACCOMPANIED BY LARRY REINKEMEYER, DIRECTOR, KANSAS CITY AUDIT OPERATIONS DIVISION, OFFICE OF INSPECTOR GENERAL, U.S. DEPARTMENT OF VETERANS AFFAIRS; AND DANIEL BERTONI, DIRECTOR, EDUCATION, WORKFORCE, AND INCOME SECURITY, U.S. GOVERNMENT ACCOUNTABILITY OFFICE
Ms. FINN. Thank you, Chairman Hall.
I am fighting some allergies that sometimes cause coughing fits, so I am trying to avoid that.
Mr. HALL. You are excused.
Ms. FINN. Thank you for having us here today. I am pleased to be here to discuss our review of VBA's Quality Assurance Program and how VBA can improve the programs to better serve our veterans.
I am joined today by Larry Reinkemeyer, who is the Director of the Kansas City Audit Operations Division.
The OIG is committed to proactively reviewing the Department's key internal controls to identify weaknesses before they escalate into significant problems.
Over the past 2 years, my office has audited three components of VBA's Quality Assurance Program, rating accuracy, rating consistency, and VBA's Site Visit Program.
Although our written statement covers our work in all three areas, my comments today are only addressing our audit of the Systematic Technical Accuracy Review or STAR Program.
In March 2009, we issued the audit of VBA's compensation rating accuracy and consistency reviews. We concluded that VBA's STAR process did not effectively identify and report errors in compensation claim rating decisions. We projected that about 77 percent of claims were accurate as opposed to VBA's reported accuracy rate of 87 percent.
This equated to approximately 88,000 additional claims where veterans' monthly benefits may be incorrect.
We identified five areas where VBA needed to improve the STAR Program. VBA agreed with all of our recommendations and just recently reported that all corrective actions were complete.
First, the STAR reviewers did not identify some errors because they either did not thoroughly review available evidence or they misclassified benefit entitlement errors as comments, which did not count against the accuracy rate.
Second, STAR management required regional offices to report quarterly on actions taken to correct errors but did not follow up to ensure that these offices actually took the corrective actions on comments.
Third, VBA excluded brokered claims from STAR reviews. The officials told us that the STAR Program's primary focus was to assess and report rating accuracy for each of the individual regional offices. Since two regional offices are involved in brokered claims, these officials believed it would be difficult to assign responsibility for rating accuracy on a brokered claim.
Fourth, the STAR reviewers did not ensure regional offices submitted all of the selected compensation claim ratings for review. In fact, the offices did not submit about seven percent of the requested ratings for the 12-month period ending February 2008. We reviewed a sample of these unsubmitted claims and identified a benefit entitlement error rate of approximately 22 percent.
Last, the STAR reviewers were not required to complete formal training on an annual basis. In contrast, the regional office staff that prepare and complete ratings are required to complete 80 hours of training per year to stay current on laws, policies, and processes that affect rating claims.
The STAR management relies on the regional offices to take corrective actions on the issues identified by the STAR team. Since April 2009, the OIG Benefits Inspection Division has issued eight reports where we looked at regional office procedures to ensure the accurate and timely correction of errors identified by STAR.
Our analysis of 148 errors found that regional office staff had not corrected 27 percent of these and in some cases had erroneously reported to STAR that the errors had been corrected.
In closing, VBA is under tremendous pressure to process claims and reduce the growing backlog. Without an effective and reliable Quality Assurance Program, VBA leadership cannot adequately monitor performance to make necessary program improvements and ensure veterans receive accurate and consistent ratings.
Mr. Chairman, thank you again for the opportunity to be here today. Mr. Reinkemeyer and I would be pleased to answer any questions you may have.
[The prepared statement of Ms. Finn appears in the Appendix.]
Mr. HALL. Thank you, Ms. Finn. You had 4 seconds remaining. Good job.
Ms. FINN. Thank you.
Mr. HALL. Mr. Bertoni, welcome. You are now recognized for 5 minutes.
Mr. BERTONI. Mr. Chairman, Members of the Subcommittee, good afternoon. I am pleased to be here to discuss the Department of Veterans Affairs' efforts to improve the quality of disability decisions.
For years, we have noted that VA's claims processing challenges include not only making quicker decisions and reducing its claims backlog but also improving accuracy and consistency.
My statement today focuses on steps VA has taken in response to recommendations from us and others to enhance its quality assurance tools, namely the Systematic Technical Accuracy Review or STAR Program, as well as other programs designed to address decisional consistency.
Since STAR was first implemented, we have made numerous recommendations for improvement. For example, very early on, we noted that STAR reviewers lacked organizational independence because they also had claims processing duties and reported directly to regional managers whose claims they may review.
Per our recommendation, VA moved to require organizationally independent STAR reviewers who are precluded from making claims decisions.
In subsequent work, we found that STAR sampling was insufficient to ensure the accuracy of disability pension decisions.
VA addressed our findings by consolidating pension processing into three locations and establishing a separate STAR review for pension claims.
More recently we reported that VA is not using STAR to separately assess the accuracy of the benefits delivery at discharge and quick-start claims, alternative processes for fast tracking VA disability compensation claims for active-duty servicemembers.
To date, the Agency has opted not to evaluate the extent to which staff are accurately developing and rating these claims, although such information could better inform training and focus program monitoring efforts.
VA's Office of Inspector General has also recommended changes to the STAR Program which VA has begun to address, including establishing minimum annual training requirements for reviewers, mandating an additional supervisory review of STAR reports, sampling brokered claims for accuracy, and implementing more stringent procedures for conducting STAR reviews.
Finally, the Agency has begun to take steps to address deficiencies that both we and VA's Inspector General have identified with its consistency review programs, which assess the extent to which regional offices and individual raters make consistent decisions on the same claim.
We recommended that VA conduct systematic studies of impairments identified as having potentially inconsistent decisions. And in fiscal year 2008, VA did initiate this effort.
However, last year, the Inspector General reported that VA had not followed through on its plans to conduct additional reviews, which was attributed in part to insufficient STAR staffing resources.
The Agency has since developed a consistency review strategy and is in the process of conducting fiscal year 2010 reviews. However, these efforts have only recently begun and it is too early to assess their impact.
And despite various recommendations for improvement and actions to address them, VA has struggled over the years to improve the accuracy rate for disability compensation decisions which was 84 percent in fiscal year 2009 and well short of VA's stated goal of 90 percent.
VA has attributed its inability to meet accuracy goals in part to the large numbers of newly-hired personnel conducting claims development work and their general lack of training and experience. We have also noted that human capital challenges associated with providing training to help new staff become more proficient will likely continue into the near future and could impact quality.
Thus, it is important that VA continue to improve and maintain a robust quality assurance framework that not only supports staff in their understanding of the very complex business of making timely, accurate, and consistent disability decisions but also ensures that all veterans receive the benefits they are legally entitled to.
Over time, VA's newly-implemented quality assurance initiatives have the potential to improve decisional accuracy and consistency if compliance with enhanced protocols, procedures, and standards is sustained.
However, it is imperative that VA remain proactive in its quality assurance efforts going forward, especially as aging veterans and more veterans from current conflicts add to VA's already substantial claims workloads.
And, Mr. Chairman, this does conclude my statement. I am happy to answer any questions you may have. Thank you.
[The prepared statement of Mr. Bertoni appears in the Appendix.]
Mr. HALL. Thank you, Mr. Bertoni.
I am just going to yield to Mr. Lamborn to make a comment about his questions.
Mr. LAMBORN. Mr. Chairman, I am going to waive my questions for the sake of time and just follow up in writing to the degree that we need further explanations.
So thank you.
Mr. HALL. Thank you, Mr. Lamborn.
And I will try to squeeze a few questions in here before we have to go across the street to vote.
Ms. Finn, your OIG report indicates that there are about 203,000 claims where a veteran is receiving incorrect monthly benefits.
Does this number reflect only veterans who are being underpaid? I ask because it is important to know whether there is a systematic bias toward underpaying rather than overpaying veterans. What does your data suggest?
Ms. FINN. No. We found errors are reported both for overpayments and underpayments.
Mr. HALL. Can you give us a ratio or is that roughly equal?
Mr. REINKEMEYER. No. I think most of the errors are underpayments where the claims examiner did not identify all the issues. A veteran may have filed his claim and had eight or nine issues and a couple of them were omitted in the rating decision. So I would say most, but we have no numbers on that. And the 203,000 is the projected number based on our sample.
Mr. HALL. Okay. Well, that would be a good thing if you can cull that from your data. That would be good for us to know if it is tilted toward underpayment and how much.
Mr. REINKEMEYER. Sure.
[The VA Inspector General George Opfer, subsequently followed up in a letter dated April 29, 2010, which appears in the Appendix.]
Mr. HALL. Ms. Finn, your testimony also indicates that VA reported that it has completed actions to implement the recommendations of the OIG report of March 2009.
Are there plans at OIG to follow up on this?
Ms. FINN. We do not have immediate plans for follow-up right now. We do, over time, though, follow up on selected reports to actually assess the effectiveness of corrective actions.
We always follow up with management to ensure that they can report to us what they have done.
Mr. HALL. Thank you.
Your study observed that STAR reviewers do not identify some errors because they either do not thoroughly review available evidence, fail to identify the absence of necessary medical information, or misclassify errors in a way that results in errors not being counted against the accuracy rate.
What do you believe can cure these deficiencies? Is it a training issue alone?
Ms. FINN. I believe our recommendation in the report related to increased supervisory reviews on those issues, certainly to ensure that items classified as comments that should have been counted as errors are, in fact, counted.
Mr. HALL. Thank you.
And you noted in your testimony that STAR management required regional offices to report quarterly on actions taken to correct benefit entitlement errors but did not follow up to ensure regional offices were actually taking corrective actions on the comments made by the STAR reviewers.
Do you have any indication that VA has remedied this concern, and what assurance is there that the corrective action VA purports to have taken has actually been implemented in a satisfactory manner?
Ms. FINN. The only assurance we have is VBA's response to our recommendation. And as I alluded to in my oral and written testimony, we look at this issue when we go on our regional office inspections and we do not always find that STAR errors have been actually corrected.
Mr. HALL. And lastly for you, Ms. Finn, your review indicates that brokered claims are experiencing a 69 percent accuracy rate.
What impact do you believe brokered claims will have on the overall VBA claims accuracy rate in the short term and long term or can you tell yet?
Ms. FINN. No, I cannot tell you. We are conducting a national audit looking at the brokering or redistribution of claims program. And we are assessing the impact on the timeliness of claims processing and also how this could possibly impact accuracy or how the brokering program is being implemented in respect to accuracy.
So I do not at this point have any real solid projections. Our sampling that we did a year ago was based on a sample of the brokered claims to determine our error rate at that time.
Mr. HALL. I would ask you, Ms. Finn, and then Mr. Bertoni also, the following question.
In the list of the 2009 STAR accuracy ratings for all regional offices that we were given, the most accurate RO was Des Moines, with an accuracy rating of 92.34 percent, ranging down to Baltimore, with a 69.34 percent accuracy rate. The Washington Regional Office is not rated, but we understand its quality level may be lower than Baltimore's. Have you already tried, or do you have plans to try, to identify what it is that the Des Moines office is doing that others maybe should be emulating, or what it is that Baltimore is doing that others should avoid? Or is it not that simple? Are there other factors?
Ms. FINN. I do not know that it is that simple. We have not yet visited the Des Moines office on our benefit inspection. We have, however, gone to Baltimore. And we do a selected review of claims. We are looking at specifically claims for PTSD, TBI. We have looked at diabetes and also we look at brokered claims.
So our review of claims is not consistent necessarily with the STAR methodology and, therefore, the numbers are not directly comparable.
We did find when we looked at those selected claims in Baltimore that they had about 38 percent inaccuracy on those types of selected claims. We also found a number of management issues in Baltimore in that they had had leadership vacancies and they also had had staff removed from the regular regional office operations to work on a disability pilot.
Mr. HALL. Right.
Ms. FINN. So we felt that impacted their ability to do quality work.
Mr. HALL. So the virtual RO pilot project and DES pilot may have been hurting, at least temporarily, the overall accuracy rating?
Ms. FINN. We felt it was an issue that needed to be addressed. We could not directly quantify the impact.
Mr. HALL. Okay. Well, thank you very much.
Mr. Bertoni, would you care to address that same question?
Mr. BERTONI. Yes. In terms of the variation in quality, I think the OIG testimony bears out some interesting issues with the accuracy rate itself.
I think one question might be whether the variation is due to the quality of the review across the regions. I mean, we see lack of training, folks who may not completely know the rules of the game reviewing these claims. In some regions, they may be doing a more thorough job; in other regions, they may not.
The other part is that there is some blame to lay on the regions in terms of quality. I think you cannot underrate regional management. There are good managers out there who really embrace quality and it is ingrained in the staff. They are doing innovative things in terms of training.
And we have a report we are about to issue to this Committee in regard to training and some actually best practice type things that are going on out there.
There is also the ability to hire and keep staff and supervisors, which I think is more difficult in certain areas than others, especially some of the larger urban areas.
So I think it is a range of things. It is a combination of things that sort of lead to this variance.
But I would also, again, hearken back to the quality of the reviews. Is it true that these two regions are entirely different? It may be that reviewers are making mistakes, and that might lead to some of this disparity.
Mr. HALL. Mr. Bertoni, in our last hearing examining the benefit delivery at discharge (BDD) and quick-start programs, you noted that VA lacks sufficient and specific performance measures for assessing the accuracy of decisions on BDD claims. And you recommended that VA consider options for separately estimating the accuracy of such decisions.
However, VA has asserted that the cost of sampling pre-discharge claims as part of STAR would outweigh the benefits.
If VA subjected BDD and quick-start claims to STAR review, could we make these programs better, even better than they apparently are? If so, what staffing, training, or other resources do you think would be needed?
Mr. BERTONI. I think the answer is yes. There were 250,000 initial claims that year and BDD represented a little over 50,000. So that is about 20 percent of the new claims coming into the pipeline.
So if you can make a difference there, you can make a difference. And the goal is to make BDD claims an even larger part of the initial claims.
The idea that it is not cost effective, I do not know that I have seen enough to agree with that. VA did do some analysis for us, but it was only on the cost side: full-time equivalents, space, modifications to STAR. They never pulled a sample of BDD claims to say what we would get from this additional review. So what the return on investment is on the other side is not clear.
And I continue to believe that VA is processing more and more claims outside the traditional process. They are reengineering processes; they are reengineering, you know, how they do work. And I think in many cases they are using STAR as more of a blunt instrument rather than using it to really drill down into these specialized new business processes to see how they are doing.
And I think at the end of the day, they need to do more of that to find out where they are getting the return on investment and where they need to walk away from things.
Mr. HALL. You asserted in your testimony that the accuracy rate was 86 percent in fiscal year 2008 and 84 percent for fiscal year 2009, well short of VBA's goal of 90 percent.
So at this point, what additional recommendations do you have for improving the STAR Program?
Mr. BERTONI. I think there is a mixture here again. There are program design issues that they have to address, but there is also management and oversight. And I think the OIG really hit on many of those.
I was extremely surprised to hear that you have such an important program and had no training requirements. I was very surprised. The fact that they have not done that yet is something that they really need to move on.
A supervisory review, you know, having another look at these cases before they are blessed or deemed to be correct is very important. It is part of their quality assurance process. They need to build in some management controls and certainly training.
We have a lot of new staff and integrating those staff and putting them in a position to be able to accurately assess claims is going to take good management, good supervisory review to sort of teach people as to, you know, what needs to be done. So I think that is very important.
Mr. HALL. Thank you, Mr. Bertoni.
You also note that due to increased Congressional support, VBA has increased its compensation and pension staffing. Specifically you point out that VBA more than doubled the size of the quality assurance staff, allowing it to increase the scope of quality assurance reviews.
Do you believe this additional staff has made a difference or are you of the opinion that staffing alone is insufficient to address deficiencies? I think you started to answer this question, but—
Mr. BERTONI. Staffing never hurts. And if you staff up a program and you give folks the tools that they need to do the job they need to do, it could make a difference. And that means again training them up and putting them in a position to make a difference.
So the numbers will help. But, again, it is going to take time. There is a learning curve. But over time, I think it will be effective, but you also have to build in the management controls to make sure that the reviews do have the authority and the oversight teeth that they should have.
Mr. HALL. Thank you.
Some VSOs point to the high rate of remand ordered by the Board of Veterans' Appeals (BVA) or the U.S. Court of Appeals for Veterans Claims (CAVC) as indications that the accuracy rate is a lot lower than reported by VBA.
What is your assessment of how the errors found by the Board and the Court impact VBA's reported claims processing accuracy rate?
And maybe, Ms. Finn, if you could comment on that as well.
Ms. FINN. Mr. Chairman, we have not done any specific work on the appeals process, so I would prefer not to comment.
I would note, however, that remands involve other issues other than just whether the rating decision was accurate. There could be procedural reasons for a remand. But other than that, I will leave that one to Mr. Bertoni.
Mr. BERTONI. We had worked in that area not very recently, but I would say again there are a range of reasons why a case would be remanded back. Sometimes it is a matter of it has been so long that the impairments change, new impairments are introduced, other evidence has not been considered. So there are a lot of reasons.
But I think overall in hearing what I have heard today with the OIG report, what we know and work that we have done in the past in terms of missing claims and how that could affect error rates, I would say the error rate—I do not have a whole lot of faith in the error rate or, I am sorry, the accuracy rate of 84 percent. It is probably some other lower figure, but I do not know what that is.
Mr. HALL. And, lastly, Mr. Bertoni, before the changes in the STAR Program were mandated by Public Law 110-389, only ten cases per month for each RO were being reviewed by the STAR system. Today VA staff advises us that twice as many cases are being reviewed, 240 cases per RO per year.
How many total cases were decided by the regional offices last year and is a sample of 20 cases per month per RO sufficient or should that sample be larger, particularly for ROs to process a larger number of claims?
Mr. BERTONI. Unfortunately, I do not have the number of total cases. But any time you can ramp up the numbers is going to make the projectability of what you do more rigorous.
Our concern again with the current system is STAR tends to be used more as a blunt instrument to account for accuracy of all claims regardless of varied business processes.
More and more, we see these claims being processed through these alternative or re-engineered processes in order to expedite things. And the Agency could be using STAR more effectively to assess how they are doing with these alternative processes. They are not. And BDD again is a good example of that.
Every re-engineered process should include some kind of quality assurance whether it is continuous or a snapshot or a limited investment. Without doing that, you waste a lot of time. You do not know what you are getting for your money and you might end up where you do not want to be at the end of the day.
I think failure to build any regular or limited investment quality assurance into any of these alternative processes is a failure of management. And we have noted along the way these varied again re-engineered or alternative means by which cases are being decided where the quality assurance, the STAR review was not sufficient in our view to get a good sense of how accurate and consistent many claims are.
Mr. HALL. Well, thank you, Mr. Bertoni and Ms. Finn and Mr. Reinkemeyer. I am grateful for your testimony. And we may have further questions for you that we will send you in writing.
We now have a series of votes called on the floor, approximately 30 minutes we are told, so we will ask our next panels to be patient, please, as you are used to, I am sure, by now.
And the first panel is excused. Thank you again.
The Subcommittee is recessed for votes.
[Recess.]
Mr. HALL. Welcome again, and thank you for your patience. The Subcommittee on Disability Assistance and Memorial Affairs will now resume its hearing on the STAR Review, Making the Grade. And our second panel, thank you for joining us again. Ronald B. Abrams, Joint Executive Director, National Veterans Legal Services Program (NVLSP); John L. Wilson, Assistant National Legislative Director, Disabled American Veterans (DAV); Raymond C. Kelley, National Legislative Director of AMVETS; and Ian C. de Planque, Assistant Director, Veterans Affairs and Rehabilitation Commission of the American Legion.
Gentlemen, thank you so much for being here today. And your full statement, written statement, of course is entered into the record. So we will give you each 5 minutes to expound upon it, and starting with Mr. Abrams. You are now recognized.
STATEMENTS OF RONALD B. ABRAMS, JOINT EXECUTIVE DIRECTOR, NATIONAL VETERANS LEGAL SERVICES PROGRAM; JOHN L. WILSON, ASSISTANT NATIONAL LEGISLATIVE DIRECTOR, DISABLED AMERICAN VETERANS; RAYMOND C. KELLEY, NATIONAL LEGISLATIVE DIRECTOR, AMERICAN VETERANS (AMVETS); AND IAN C. DE PLANQUE, ASSISTANT DIRECTOR, VETERANS AFFAIRS AND REHABILITATION COMMISSION, AMERICAN LEGION
Mr. ABRAMS. Thank you, Mr. Chairman. I have some good news. We have just met with hundreds of service officers for Legion and Purple Heart. And I am happy to tell you that when they talked to me they said STAR does catch errors, they do order the VA to fix them, and there is some training based on the errors caught by STAR. However, based on the same comments of these service officers, and our own personal experience, we find that the training based on the errors caught by STAR is not generally doing the job. The errors still continue and the emphasis is still on production in the regional offices.
NVLSP believes that the quality of the regional office adjudications is much worse than what is reported by the VA. And all you have to do if you want to look at it in an independent check is look at the statistics produced by the Board and the Court of Appeals for Veterans Claims. And I can tell you as far as consistency goes, I defy almost anyone to look at 20 evaluations of mental conditions and make rhyme or reason out of them. The rating schedule is not applied on a consistent basis.
I am not going to go over the BVA statistics. You all have them. I will say that the results of what we do with the American Legion, and we have done over forty quality checks in regional offices, we find that the quality is much worse. In many instances claims are denied in a premature manner or benefits are not paid at the proper rate because the regional office was more concerned about claiming work credit and reducing the backlog than taking the time to develop and analyze the claim properly. STAR shows about a 16 percent error rate. We are finding over a 30 percent error rate, in general.
Here are some of the errors that we have found that you should know about. Assignment of erroneous low ratings for service-connected mental conditions. Erroneous denial of claims for service-connection for mental conditions. Failure to consider 38 USC 1154(b), the Combat Veteran Statute. Erroneous denials of claims for individual unemployability. Failure to consider presumptive service-connection. And inadequate requests for medical opinions. Sometimes doctors are not asked the right questions and that leads to a litany of problems.
Based on that we ask that the VA work measurement system be altered so that quality, production, and timeliness are concepts that really drive the system. That is the most important thing. You need to change that to make any effective change. We would also like a quality control with teeth and an independent quality check.
I will leave it at that except to say that what the VA is doing now is obviously not working. We need to have things changed. Thank you.
[The prepared statement of Mr. Abrams appears in the Appendix.]
Mr. HALL. Thank you, Mr. Abrams. Mr. Wilson, you are now recognized.
Mr. WILSON. Thank you, sir. Mr. Chairman and Members of the Subcommittee, I am pleased to appear before you on behalf of Disabled American Veterans to address VBA’s Systematic Technical Accuracy Review program.
Successfully reforming the veterans benefits claims process will require training and accountability, two elements central to producing quality results for veterans. VBA continues to struggle with the quantity of claims, with quality, training, and accountability taking a backseat. VBA’s primary quality assurance program is the STAR program. The STAR program can identify three types of errors: benefit entitlement, decision documentation and notification, and administrative.
The STAR program was evaluated by the VA OIG in the March 2009 report, which determined the program did not provide a complete assessment of rating accuracy. The OIG found STAR processors did not effectively identify and report all errors in compensation claim rating decisions. While VBA stated STAR reviewers achieved an 87 percent technical accuracy rate, the OIG projected an accuracy rate of only 78 percent based on its review of STAR reviewed cases. The OIG determined that this equates to approximately 203,000 claims in that 1 year alone in which veterans' monthly benefits may therefore be incorrect. The OIG determined the STAR reviewers did not identify some of the missed errors because they either did not thoroughly review available medical and non-medical evidence, did not identify the absence of necessary medical information, or inappropriately misclassified benefit entitlement errors in the comments section. These findings point to the need for greater management oversight and an effective formal training program for the STAR reviewers.
We have heard reports that while STAR reviewers could benefit from formal training, current workload requirements do not allow for the necessary training time. This common theme for VBA underlies even STAR reviews: quantity over quality, even in the area that is supposed to ensure quality of at least a technical nature.
The need for a quality control program as an adjunct to the STAR quality assurance program can also be seen when considered through a review of the Board of Veterans' Appeals summary of remands. Nineteen thousand one hundred cases, or 34 percent of appeals reaching the BVA in fiscal year 2009, were remanded because notice was not provided to claimants, requests for service medical records and personnel records failed, or travel Board requests were ignored. These elementary errors were either undetected or ignored. A 34 percent error rate on such basic elements in the claims process is simply unacceptable.
With no incentive to prevent such errors, and a constant focus on production, quality will continue to decline. DAV agrees with the VA OIG recommendations to improve the STAR program. In addition, we recommend VBA establish a quality control program that looks at claims in process in order to determine not just whether a proper decision was made but how it was arrived at in order to identify ways to improve the system. Combining results from such quality control reviews with STAR’s quality assurance results, and the data from remands from the Board of Veterans' Appeals, and the Court of Appeals for Veterans Claims, could yield valuable information on trends and causes of errors. If this data could be incorporated into a robust IT system, proper analysis of such data would provide management and employees insights into processes and decisions. With a modern IT system, VBA would be able to do quality control in real time, not just after the fact. This in turn would lead to quicker and more accurate decisions on benefits claims and more importantly to the delivery of all benefits earned by the veteran, particularly disabled veterans, in a timely manner.
That concludes my testimony. I would be happy to answer any questions.
[The prepared statement of Mr. Wilson appears in the Appendix.]
Mr. HALL. Thank you, Mr. Wilson. Mr. Kelley? You are recognized.
STATEMENT OF RAYMOND C. KELLEY
Mr. KELLEY. Thank you for giving AMVETS the opportunity to present our views on the STAR program. AMVETS agrees with VA's Office of Inspector General's March 2009 report that identified eight issues that will improve the process of reviewing claims for errors, and AMVETS is pleased to see VBA is taking action to correct these issues. AMVETS is concerned, however, with what is done with the information that is gleaned from STAR.
For the STAR program to truly be effective AMVETS believes three things must be done. First, STAR must be enhanced so trends and errors can be easily identified by regional offices. With this information, VBA must hold ROs accountable for failures in accuracy and insist that the ROs develop improvement strategies and include training for these accuracy issues.
Second, VBA must change its culture of timeliness and strive for a culture of accuracy. Whether or not STAR is completely accurate in its review is important but not nearly as important as what is done to ensure the same mistakes are not made again.
Third, OIG must conduct periodic reviews of the STAR program to ensure that its accuracy ratings are within a 3 percent error margin.
Even though AMVETS believes the STAR program is effective, we believe it could be expanded to ensure that specific programs such as BDD and specific conditions can be tracked for anomalies that occur so improvement strategies and specific training can be implemented. After the Veterans Claims Assistance Act (VCAA) was introduced in the first quarter of fiscal year 2002, nearly half of all errors were VCAA related. VBA had the ROs retrain their claims processors and in the last 8 months of the fiscal year these types of errors were reduced by one-third. This type of accountability needs to be the rule and not the exception.
As you know, the STAR program identifies errors in claims decisions each month through random sampling of each regional office. The results of the reviews are sent to the ROs. Errors that are found are to be corrected by the RO who made the decision. The corrections are made, but all too often the ROs do not implement strategies to ensure that claims processors do not continually repeat the same mistakes.
AMVETS believes the reason these strategies are not developed is that the culture within the regional offices is one of timeliness and not one of accuracy. STAR has consistently found that nearly 20 percent of claims are in error over the past decade, but VBA has done little to ensure that mistakes that are made in the regional offices are understood and that strategies for improvements are put in place. On the other hand, VBA does require ROs to develop corrective action plans if they do not reach strategic goals for production, inventory, and timeliness. This paradigm must be flipped.
The March, 2009 OIG report clearly defines the gaps in the STAR program that have caused the 10 percent disparity in compensation and pension rating accuracy. AMVETS believes VBA is taking action to close these gaps, however we believe it is important to have this accuracy fall within a 3 percent margin of error. Therefore, AMVETS requests that OIG conduct a follow up to the 2009 report to ensure VBA’s gap solutions are productive and that OIG continue to conduct periodic reviews of STAR to be sure the program reaches and maintains that 3 percent margin of error.
Mr. Chairman, this concludes my remarks and I will be happy to answer any questions that you have.
[The prepared statement of Mr. Kelley appears in the Appendix.]
Mr. HALL. Thank you, Mr. Kelley. Mr. de Planque?
STATEMENT OF IAN C. DE PLANQUE
Mr. DE PLANQUE. Thank you Mr. Chairman and Members of the Committee. On behalf of the American Legion I would like to say that the STAR program is a potentially effective tool that if used in a more effective manner could really help VA deal with the accuracy issues. It is refreshing to see that in the OIG report from March 2009, that the issues that OIG identified VA concurred with and has recently reported that they are making the changes to correct those.
The American Legion would recommend three points which could help VA enhance the efficiency of STAR and the ability for it to be effective. The first point is to create an aggregate record of all of the errors that are reported through STAR so that it can be analyzed. The second point would be to use the collected data to develop a targeted training program. And the third point would be to have regular, outside, impartial oversight of the process.
Going back to elaborate on the first part. The errors are being reported back to the ROs. However, to the best of our knowledge there is no one consolidated effort to aggregate these mistakes, common errors, and deficiencies that are being noted. If you combine this with common errors and deficiencies noted from the Board of Veterans' Appeals, the Appeals Management Center (AMC), and the Court of Appeals for Veterans Claims, you can develop an overall picture both on a regional office level, perhaps even on a team level, but more importantly on a national level of what the common errors within VA are. And you can use that to determine where to devote the resources to improve accuracy within VA.
That leads directly into point number two, targeted training. We have reports, and when we have spoken to VA employees on the American Legion quality review visits to regional offices, it is noted and VA notes that when STAR identifies problems they are reported back to the teams and the coaches involved for correction. However, if you can notice a national trend, if you can notice that consistently VA is experiencing problems with, say, rating mental health disorders, or improperly asking for exams, then this can be set into a targeted training program so that VA is getting the most use possible out of their 80 to 85 hours of mandated training, that it is going to correct the areas that you need corrected. If you take a math test and you realize that you are having problems with binomial equations, then you need to go back and do some extra work on binomial equations so that the next time you take that math test you get it right. This is exactly the same sort of thing that VA could use this aggregate data to accomplish.
And the third point that I want to make is about outside oversight, third party oversight. In the recent report this morning that was published in the Federal Register on the OIG investigation of Togus, Maine, one of the things that was pointed out, 25 percent of the STAR errors were not corrected in accordance with VBA policy. And two examples that are listed here. First, STAR instructed the regional office to inform a widow that her child could be entitled to Dependency and Indemnity Compensation (DIC) benefits. There is no evidence in the claims folder showing staff informed the widow of this potential entitlement. Furthermore, the RO staff erroneously informed STAR that they corrected the error. Second, STAR instructed the RO to send a notification letter for a burial claim to the proper claimant. While the staff informed STAR that they corrected the error, they did not send the revised letter to the proper claimant.
So clearly, and this is not new that we have seen in the reports of the OIG of the various specific regional offices, even if STAR is capturing the errors they are not necessarily being followed up on, which is why you need third party oversight. You need somebody to go in and double check that they are dotting the i's, crossing the t's, and correcting these errors.
VA has potentially a very effective tool here and we want them to be able to use it effectively. And with the three points that we believe will be effective to create an aggregate of that information that can be searched for trends, to use those trends to target the training, and to add outside oversight, we believe that this can be an effective tool for VA.
I thank you very much, Mr. Chairman and the Committee, and we would be happy to answer any questions.
[The prepared statement of Mr. de Planque appears in the Appendix.]
Mr. HALL. Thank you, is it Mr. de Planque or de Planque, just—
Mr. DE PLANQUE. It is de Planque, technically.
Mr. HALL. De Planque, okay, thank you. Just like to try to get the correct pronunciation. You mentioned that training should be more specialized. Who do you think should oversee this training? Should it be standardized at either the RO or national levels, or individualized for each local VA office?
Mr. DE PLANQUE. What I am proposing here, what the American Legion is proposing here, is to have s