|
Post by Banjo on Feb 22, 2012 18:18:12 GMT 7
This is from the Ombudsman's report on Centrelink 2010.
PART 3 – ACCESS TO REVIEW
3.1 The right to have decisions reviewed, even if not exercised, is an important contributor to customer, citizen and government confidence in the system of administrative decision making. It follows that awareness of that right is valuable in itself.
3.2 Centrelink’s customers have a more specific need to be aware of their right to have decisions reviewed. Centrelink advises its customers of this right through its customer service charter, on the reverse of its decision letters, on its internet site and in a variety of customer-oriented publications. In our experience the right of review appears to be well-known.
3.3 Centrelink’s procedures manual, e-Reference, says that where a customer expresses dissatisfaction with a decision, a CSA should provide an explanation of that decision and then seek clarification about whether the person is seeking a review.
3.4 There is no single method or avenue by which a customer must seek review, nor any form of ‘magic words’ which must be used. Centrelink’s customers can access the agency through a large variety of channels. Some of these are particularly targeted at ‘doing business’ with Centrelink, such as call centres, and CSCs. Others provide an avenue for feedback, such as the ‘email Hank’, ‘send a message to the CEO’ or Customer Relations Unit facilities. Centrelink’s anecdotal view is that most review requests are made verbally either by phone or in person, but customers can also seek review by sending a message via the internet-accessible ‘secure messaging service’, by completing the review request form, by letter or by email.
3.5 This ‘no wrong door’ approach means that staff in contact with customers need to be aware of the review process. Moreover they must be alert to the fact that a comment or complaint may be a request for review and should be acted upon. Despite this, many Centrelink customers complain to this office about their difficulties in having a request for review recognised. We have seen cases where a customer has clearly expressed their dissatisfaction, often on several occasions. Nonetheless, their objections were not responded to appropriately.
3.6 Although Centrelink has conducted extensive staff training in review practices, it is clear from cases such as Ms P’s that more training is needed to assist staff to distinguish between a complaint and a request for review. Training should ensure that all staff act on any possible request for review and provide clear advice on review options in the event of any doubt. From the case studies, it appears that more emphasis is needed on ensuring compliance with the requirement to record requests for review on the APL (mandated by internal procedures since 2001), to support active monitoring of reviews.
|
|
|
Post by Banjo on Feb 22, 2012 18:27:35 GMT 7
Prioritising requests for review
3.7 Centrelink’s multi-layered, multi-model review process has distinct benefits for the agency, in that it allows the more numerous and less senior ODM staff to carry the bulk of the internal review load. Sometimes a specialist team of CSAs conducts all reviews of a particular type. Wherever ODMs and CSAs are used for reviews there will be greater capacity for the more experienced (and more expensive) AROs to focus on the cases which require specialised attention.
3.8 It is our view that reviews should be conducted by AROs in the first instance, and the cases in which this does not occur should be an exception based on benefit to, and the informed consent of, the customer, not solely on administrative expedience. However, as not all review cases are seen by an ARO in the first instance - or at all - access to ARO consideration must be guided by some form of ‘review triage’. Therefore Centrelink must consider how to prioritise reviews.
3.9 Centrelink already prioritises some reviews. ‘No income’ cases have a 14 day (95%) timeliness standard, but the countdown commences only when the case is with an ARO, not at the point at which the review request is received and possibly considered initially by an ODM. In cases where possible eight week non-payment penalties may result, the case is reviewed by a CSA in a specialist team. To deal with the large volume of appeal requests arising from the Economic Security Strategy Payment (ESSP) scheme and the fact that the payment was granted automatically based on eligibility for another payment, specialist CSA review teams were created to consider each request and explain the decisions to their customers. This also provided an opportunity to establish whether the customer wanted further review by an ARO.
3.10 Priority for and access to ARO reviews should be determined by factors which consider the nature of the review and potential impacts on customers, as well as administrative efficiency. This office suggests that Centrelink develop criteria against which a request for review should be assessed by an ARO in the first instance. These should include:
• vulnerability: cases where a customer is known to be vulnerable - due to having no income, being homeless, or suffering from a mental illness
• complexity: where it is clear from the outset that a case is complex
• consequences: taking into account the significance to a customer of the decisions being reconsidered, and the capacity or otherwise to mitigate the impact of those decisions while under review, for example, payment pending review (PPR), discussed at 6.6
• consent: cases in which the customer has not given, or been in a position to give, informed consent to something other than an ARO review.
|
|
|
Post by Banjo on Feb 22, 2012 18:34:23 GMT 7
Likelihood of a successful outcome
4.29 While customers may have many motivations for seeking review, the most obvious is that they want the original decision changed. Centrelink refers to the rate at which decisions are either set aside or varied as the ‘change rate’. Some cases are also simply withdrawn.
Change rates for 2009-10 were as follows:
• of the decisions reviewed in an ODM review of any type, 55% were affirmed by the ODM and 39% were changed
• in abbreviated ODMs, the change rate was 69%
• 21% of ODM reviews went on to be considered by an ARO, during which 38% were changed
• 34% of decisions were changed by AROs in direct-to-ARO reviews
• at the SSAT, some 26% of appeals resulted in a change to Centrelink’s decision.
4.30 While it is clear from this data that many customers are having decisions changed at each level of review, there are also significant but unexplained trends in this data which raise questions about how reviews are conducted under each model.
These trends include the:
• relatively high number of decisions being varied or set aside at the ODM stage
• relatively constant and significant percentage of decisions being changed at the ARO stage, regardless of whether the case had been through ODM review
• much greater proportion of abbreviated ODMs, compared to standard ODMs, being overturned (although the number of abbreviated ODMs is itself small)
• significant proportion of those who have received an ODM review going on to request an ARO review.
4.31 As a rough guide, using these 2009-10 figures, if 100 appeals were conducted through an ODM review, 39 would result in a changed decision. Twenty-one of the original group would go on to have an ARO review, and the ARO would change the decision under review in about eight of those cases. In all, 47 of the original 100 decisions would be changed in the internal review process. This figure is significant. While a change does not necessarily indicate that a previous decision was wrong, the ultimate change rate again reinforces the importance of supporting quality decision making at first instance, and of having ready access to an efficient and effective internal review process.
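The rough arithmetic in 4.31 can be reproduced from the quoted 2009-10 rates. This is an illustrative sketch only, using the percentages stated in the report and a hypothetical cohort of 100 appeals, as the report itself does:

```python
# Reproducing the report's rough guide (paragraph 4.31) from the
# 2009-10 change rates quoted above. The cohort of 100 appeals is
# illustrative, exactly as in the report.
cohort = 100
odm_change_rate = 0.39   # 39% of decisions changed at ODM review
aro_uptake = 0.21        # 21% of ODM-reviewed cases go on to an ARO
aro_change_rate = 0.38   # 38% of those are changed by the ARO

changed_at_odm = round(cohort * odm_change_rate)    # 39 decisions
to_aro = round(cohort * aro_uptake)                 # 21 cases proceed to ARO
changed_at_aro = round(to_aro * aro_change_rate)    # about 8 more changed
total_changed = changed_at_odm + changed_at_aro     # 47 of the original 100

print(changed_at_odm, to_aro, changed_at_aro, total_changed)
```

Running this gives 39, 21, 8 and 47, matching the figures in the paragraph above.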
|
|
|
Post by Banjo on Feb 22, 2012 18:40:59 GMT 7
CAUSES OF DELAY
Time allowed, and time taken, to complete reviews
5.1 Complaints to the Ombudsman frequently involve delays in the review process; at the extreme, we have investigated one case of a two year delay.
5.2 The Administrative Review Council (ARC) has noted that an agency that chooses to make internal review mandatory should ensure the review system is as worthwhile as possible for the applicant and does not operate as a potential barrier to effective merits review.
5.3 In practice, review by an ARO is mandatory before a customer can appeal to the SSAT. Where customers are not going to be satisfied until they have had an independent review, an internal process is likely to be perceived as an obstacle on the path to independent review.
5.4 Even where a customer might otherwise be content to have a decision reviewed internally, the time taken to conduct an internal review can undermine confidence in the internal process, such that the applicant will not be content to accept the result. Conversely, the customer may become a victim of ‘appellant fatigue’, as in the case of Mr P in ‘Tired and unhappy’, and may lack the emotional resources to continue to fight their case.
5.5 Centrelink commits to advising customers of the outcomes of their reviews, in writing, within 28 days. It sets an internal goal for standard ODM reconsiderations to be completed within seven calendar days, and a target of 75% of cases meeting that standard. The figures in the table below show the results for the 2009-10 year.
|
|
|
Post by Banjo on Feb 22, 2012 18:53:32 GMT 7
5.6 In these figures, ‘ODM Reviews’ includes standard ODM reviews and only those abbreviated ODMs which do not go on to AROs. ‘Direct to ARO’ figures include decisions not previously considered by any ODM.
5.7 It is possible that a customer who has been given a standard ODM reconsideration will not ask for further review, even where the decision is unchanged. There are many reasons for not seeking review: for instance, they may be content that they have had a second consideration; understand the decision better; not see any value in pursuing it further; not feel that they have the strength or willingness to persist; or even have failed to understand that further review is possible. They may wish to avoid the possibility that a further review decision would worsen, rather than improve, the outcome for them.
5.8 Reviews which go directly to an ARO generally take longer than those that go only to an ODM. Centrelink’s intention that the ODM process be a quick check may explain this.
5.9 It needs to be stressed that the customer is often quite vulnerable and that an adverse administrative decision, or delay in conducting a review, can impact greatly on their livelihood. The cases below are instructive, but more needs to be understood about the causes of delay.
5.10 Further analysis needs to be conducted to identify the reasons underpinning the timeframes for the various combinations of review, as an important factor informing the triage of reviews (as discussed at paragraph 3.8) and informing customers of the impacts of choosing one form of review over another.
Administrative breakdown
5.11 The ARC has noted: It is preferable to have a simplified structure consisting of one layer of review by a senior officer uninvolved in the primary decision. Agencies should concentrate on making this single layer of review as effective as possible, to ensure that in most cases it is final.
5.12 The complexity of a multi-layered review process can trigger a variety of delays, which might be considered under the twin headings of ‘administrative breakdown’ and ‘administrative drift’. Risks increase with complexity, for example the risks and consequences of documents going astray while being moved from one reviewing officer to another, as illustrated in case studies in this report. Centrelink has advised that, to address these pitfalls, it is increasing its use of electronic file transfer and workload management.
5.13 The Ombudsman’s office sees many delays caused by administrative breakdowns which are not unique to the review process, such as lost documents, delays in gathering documents, inadequate handover procedures for staff on leave, flawed or non-existent follow-up procedures and failure to manage workloads. Three cases succinctly illustrate administrative breakdowns in the review context.
5.14 We have investigated complaints where the review request has not been recorded, or has been recorded incorrectly, on the APL. As a consequence, no review proceeded, sometimes after many contacts by the customer, and it has taken the intervention of this office for the review to commence. Although guidelines and procedures exist for following up outstanding ODM and ARO reviews, those practices do not address this issue. It is our experience that if a review request is not correctly recorded, there is no way for Centrelink to monitor the progress of the review, and without this mechanism the risk of delay is higher, as in the ‘Waiting by the fax’ case.
5.15 Review process models should be analysed to detect single points of failure which may result in cases stalling and falling outside current APL monitoring processes.
Administrative drift
5.16 Some delays are caused by what this office has labelled ‘administrative drift’. Delay often results from a matter drifting far beyond anyone’s expectation. Some of the reasons are familiar and pervasive: a file being given a lower priority than other matters, or being put aside in the ‘too hard’ basket to be looked at later; responsibility for a decision passing from one officer to another; or one aspect of a case being reconsidered or referred for advice before a final decision on the whole case is made.
5.17 The importance of being able to provide further information for a review was discussed above at 4.18. This office understands that, ultimately, the need for certainty in decision making dictates that there must be a cut-off date for providing information. The following cases illustrate the lengthy delays which can result when a process is designed such that, if a particular trigger event does not occur, or does not appear to have occurred (in this case the receipt of a fax), the workflow will cease indefinitely.
|
|
|