A benchmark too far: findings from a national survey of surgical site infection surveillance

J. Tanner a,*, W. Padley a, M. Kiernan b, D. Leaper c, P. Norrie a, R. Baggott a

a De Montfort University, Leicester, UK
b Southport and Ormskirk Hospitals NHS Trust, Southport, UK
c University of Newcastle upon Tyne, Newcastle upon Tyne, UK

* Corresponding author. Address: Clinical Nursing Research, De Montfort University, The Gateway, Leicester LE1 9BH, UK. Tel.: +44 (0)116 2013885; fax: +44 (0)116 2013821. E-mail address: jtanner@dmu.ac.uk (J. Tanner).

Journal of Hospital Infection 83 (2013) 87–91. http://dx.doi.org/10.1016/j.jhin.2012.11.010

Article history: Received 6 July 2012; Accepted 8 November 2012; Available online 15 January 2013.
Keywords: Surgical site infection; Surveillance.

SUMMARY

Background: The national surgical site infection (SSI) surveillance service in England collates and publishes SSI rates that are used for benchmarking and to identify the prevalence of SSIs. However, research studies using high-quality SSI surveillance report rates that are much higher than those published by the national surveillance service. This variance questions the validity of data collected through the national service.

Aim: To audit SSI definitions and data collection methods used by hospital trusts in England.

Method: All 156 hospital trusts in England were sent questionnaires that focused on aspects of SSI definitions and data collection methods.

Findings: Completed questionnaires were received from 106 hospital trusts. There were considerable differences in data collection methods and data quality that caused wide variation in reported SSI rates. For example, the SSI rate for knee replacement surgery was 4.1% for trusts that used high-quality postdischarge surveillance (PDS) and 1.5% for trusts that used low-quality PDS. Contrary to national protocols and definitions, 10% of trusts did not provide data on superficial infections, 15% of trusts did not use the recommended SSI definition, and 8% of trusts used inpatient data alone. Thirty trusts did not submit a complete set of their data to the national surveillance service. Unsubmitted data included non-mandatory data, PDS data and continuous data.

Conclusion: The national surveillance service underestimates the prevalence of SSIs and is not appropriate for benchmarking. Hospitals that conduct high-quality SSI surveillance will be penalized within the current surveillance service.

© 2012 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

Introduction

The national surgical site infection (SSI) surveillance service in England was established in 1997 to enable hospitals to compare their SSI rates against a benchmark, and to use SSI data to improve the quality of patient care.[1] This service is administered and run by the Health Protection Agency (HPA). Participating hospitals undertake surveillance in at least one of 17 categories of surgical procedures that are encompassed in the surveillance system.
In 2004, the Department of Health mandated that acute hospital trusts which perform orthopaedic surgery should undertake a minimum of three months of surveillance each year in at least one specified category. In 2008, all hospitals were required to have systems in place to identify re-admissions and, additionally, optional postdischarge follow-up was introduced using either outpatient clinic review or direct patient contact via questionnaire. Participating hospitals are required to follow the national surveillance protocol, which outlines follow-up methods and definitions.

Similar national surveillance programmes exist across Europe, and there is some state-level surveillance in the USA.[2,3] SSI rates published by the English programme are similar to those from other European countries which are collated and published by the European Centre for Disease Control.[2]

However, an anomaly exists, as SSI rate data derived from national programmes are not comparable with data obtained from high-quality surveillance studies or research trials of SSI interventions. For example, the national SSI rate for large bowel surgery in England is 10%, but studies with high-quality surveillance report rates of 19%, 20%, 22%, 26% and 27%.[4-8] This discrepancy brings into question the accuracy and validity of national surveillance data.

These concerns have been expressed at government level. In 2009, the Public Accounts Committee reviewed the national SSI surveillance service, and concluded that the Department of Health did not have an understanding of the true scale of SSIs in England because of a 'lack of decent data'.[9]

The purpose of this study was to audit SSI definitions, postdischarge surveillance (PDS) and data collection methods used by acute hospital trusts in England.

Methods

All 156 trusts in England that provide secondary care within hospitals were invited to participate in the study. Letters were sent to named lead surveillance staff or lead infection prevention nurses in each trust asking them to complete a paper-based or online questionnaire. Reminders were sent out two weeks after the initial mailing.

The questionnaire used a tick-box format to collect SSI surveillance data such as: collection methods, definitions, identification of SSIs, staff resources and reporting mechanisms. The questionnaire was initially piloted amongst a small number of surveillance nurses and amended to address their comments. Data from the returned paper and online questionnaires were transferred on to a database and analysed using descriptive statistics. Exact Pearson chi-squared tests and Mann–Whitney U-tests were used to test nominal level data and interval level data, respectively, for statistical differences.
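For illustration, the sketch below runs the two tests named above in Python with SciPy. The group labels and values are invented placeholders rather than survey data, and the asymptotic chi-squared test shown here stands in for the exact Pearson chi-squared test reported in the paper.

```python
# A minimal, illustrative sketch (not the authors' analysis code) of the two
# tests named in Methods, using SciPy. All counts and rates are placeholders.
from scipy.stats import chi2_contingency, mannwhitneyu

# Nominal data: e.g. a 2 x 2 table of follow-up method by procedure type.
# chi2_contingency gives the asymptotic Pearson chi-squared test; the paper
# reports exact P-values, which require an exact or permutation implementation
# rather than this asymptotic version.
table = [[29, 9],     # hypothetical counts: shorter follow-up
         [56, 43]]    # hypothetical counts: inpatient + re-admission + PDS
chi2, p_chi2, dof, expected = chi2_contingency(table)
print(f"Pearson chi-squared = {chi2:.2f}, P = {p_chi2:.3f}")

# Interval data: e.g. SSI rates (%) reported by two groups of trusts.
rates_without_pds = [0.0, 0.1, 0.2, 0.5, 1.0]    # placeholder values
rates_with_pds = [1.5, 1.8, 2.4, 3.1, 4.2]       # placeholder values
u_stat, p_mw = mannwhitneyu(rates_without_pds, rates_with_pds,
                            alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_mw:.3f}")
```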
Results

In total, 106 questionnaires (response rate 68%) were returned and included in the analysis. Sixty-two of the 106 respondents also provided SSI rates.

The findings are presented under the following headings: variations in patient follow-up (inpatient, re-admission and postdischarge); variations in data collection methods; variations in SSI definitions; and reporting data to the national programme.

Variations in patient follow-up: inpatient, re-admission and postdischarge

Patient follow-up methods varied, with most hospital trusts using PDS rather than inpatient and re-admission surveillance alone. Nine trusts (8%) used inpatient follow-up alone, 24 trusts (22%) used inpatient and re-admission follow-up alone, and 73 trusts (68%) used inpatient, re-admission and postdischarge follow-up. Longer patient follow-up was related to higher SSI rates, as demonstrated in Table I, which shows SSI rates for knee replacement surgery by follow-up method, comparing inpatient and re-admission follow-up with inpatient, re-admission and postdischarge follow-up. Trusts that undertook PDS for knee surgery in 2010 or 2011 had significantly higher SSI rates than trusts that collected inpatient and re-admission data alone (P = 0.013 and P = 0.045, respectively).

Table I. Knee surgical site infection (SSI) rate by patient follow-up
(each cell: no. of hospital trusts / mean SSI rate / median SSI rate)

Follow-up method | Knee replacement 2010–2011 | Knee replacement 2011–2012
Inpatient alone or inpatient and re-admission alone | 10 / 0.45 / 0.10 (a) | 8 / 0.68 / 0.35 (b)
Inpatient, re-admission and postdischarge | 14 / 2.47 / 1.46 (a) | 24 / 3.09 / 1.80 (b)

(a) Mann–Whitney U, P = 0.013. (b) Mann–Whitney U, P = 0.045.

Where surveillance was mandatory, patient follow-up was shorter. Table II shows the distribution of patient follow-up for mandatory (e.g. orthopaedic) surgical procedures compared with that for non-mandatory surgical procedures (e.g. breast, cardiac or large bowel). Shorter patient follow-up (inpatient alone, inpatient and re-admission alone) was significantly more likely after mandatory procedures (P = 0.048).

Table II. Follow-up for mandatory and non-mandatory procedures by participating hospital trusts

Follow-up method | Mandatory procedures (N = 85 trusts) | Non-mandatory procedures (N = 52 trusts)
Inpatient alone | 7 (8.2%) | 1 (1.9%)
Inpatient and re-admission alone | 22 (26%) | 8 (15%)
Inpatient, re-admission and postdischarge | 56 (66%) | 43 (83%)

Variations in data collection methods

Direct patient contact was the most common method of surveillance among the 73 hospital trusts that undertook PDS. Seventeen, 16 and 15 trusts, respectively, used patient self-assessment questionnaires, telephone calls from surveillance staff, or a combination of both (Table III). Only 10 trusts did not use any direct patient contact, choosing instead to use 'passive'[10] methods of review via outpatient clinics or primary care liaison.

Table III. Surveillance data collection methods (figures are numbers of hospital trusts)

Inpatient surveillance
  Microbiology/case note review/patient contact by surveillance staff: 28
  Microbiology/case note review: 21
  Case note review: 15
  Case note review/patient contact by surveillance staff: 9
  Patient contact by surveillance staff: 6
  Microbiology: 6
  Case note review/microbiology/patient contact by surveillance staff/computer flag: 4
  Patient contact by surveillance staff/microbiology: 5
  Computer flag alone: 2
  Combination of above: 7

Re-admission surveillance
  Microbiology/case note review: 20
  Microbiology/case note review/patient contact by surveillance staff: 16
  Patient contact by surveillance staff: 12
  Microbiology: 11
  Liaising with ward staff re: re-admissions: 9
  Microbiology/patient contact by surveillance staff: 8
  Computer flag: 7
  Case note review: 6
  Combination of above: 6

Postdischarge surveillance
  Patient self-assessment questionnaire alone: 17
  Telephone call alone: 16
  Patient self-assessment questionnaire plus telephone call: 15
  Telephone call plus information from primary care and/or outpatient clinic: 8
  Patient self-assessment questionnaire plus information from primary care and/or outpatient clinic: 7
  Outpatient clinic alone: 6
  Outpatient clinic plus information from primary care: 4

Whilst the majority of hospital trusts undertook direct patient contact for PDS, there was considerable variation in the quality of patient contact. The 39 trusts that used patient self-assessment questionnaires reported response rates ranging from 100% to 2%, with five trusts reporting response rates below 10%. The average response rate was 55%, although 15 trusts telephoned non-responders to increase patient contact. Seven trusts with response rates below 50% did not use any additional follow-up methods, including two trusts with response rates of 10%.

The level of patient contact affected SSI rates. Hospital trusts with high levels of patient contact had consistently higher mean and median SSI rates than trusts that had low levels of patient contact (Table IV). Table IV shows SSI rates after hip and knee replacement surgery for trusts that had high patient contact (questionnaire response rates >50%, all patients contacted via telephone, or low response to questionnaires but supplemented with telephone calls) compared with trusts that had low patient contact (questionnaire response rate ≤50% or outpatient review alone). The numbers of trusts were too small for meaningful statistical analysis.

Table IV. Surgical site infection (SSI) rates for hip and knee replacement surgery by high- and low-quality postdischarge surveillance
(each cell: no. of hospital trusts / mean SSI rate / median SSI rate)

Procedure | Low-quality postdischarge surveillance (a) | High-quality postdischarge surveillance (b)
Hip replacement 2010–2011 | 6 / 0.70 / 0.55 | 9 / 1.32 / 0.90
Hip replacement 2011–2012 | 7 / 0.65 / 0.90 | 11 / 1.60 / 1.60
Knee replacement 2010–2011 | 4 / 1.98 / 2.27 | 6 / 3.49 / 2.5
Knee replacement 2011–2012 | 5 / 1.57 / 1.7 | 13 / 4.19 / 2.4

(a) Includes patient questionnaires with response rates of 50% or less and no supplementary telephone calls, and outpatient review only. (b) Includes direct patient questionnaires with response rates above 50%.
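The grouping of trusts into high- and low-quality PDS used in Table IV follows the criteria given in the text and the table footnotes. Purely as an illustration, the sketch below encodes that rule; the function name, field names and example records are hypothetical, not survey data.

```python
# Illustrative only: encodes the Table IV grouping rule for postdischarge
# surveillance (PDS) quality. Field names and example records are hypothetical.
def pds_quality(questionnaire_response_rate, all_patients_telephoned,
                non_responders_telephoned, outpatient_review_only):
    """Classify a trust's PDS as 'high' or 'low' quality, following Table IV:
    high = questionnaire response rate >50%, or all patients telephoned,
           or a low response rate supplemented with telephone calls;
    low  = response rate of 50% or less with no supplementary calls,
           or outpatient review alone."""
    if outpatient_review_only:
        return "low"
    if all_patients_telephoned:
        return "high"
    if questionnaire_response_rate is not None:
        if questionnaire_response_rate > 50 or non_responders_telephoned:
            return "high"
        return "low"
    return "low"

# Hypothetical example trusts
print(pds_quality(72, False, False, False))   # high: response rate >50%
print(pds_quality(35, False, True, False))    # high: low response, plus calls
print(pds_quality(40, False, False, False))   # low: response <=50%, no calls
print(pds_quality(None, False, False, True))  # low: outpatient review alone
```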
Variations in SSI definitions

The HPA definition of an SSI is a modified version of that given by the US Centers for Disease Control and Prevention (CDC). The HPA definition was used by 89 trusts (85%). Nine trusts (9%) reported using the CDC definition, with three trusts (3%) using both the HPA and CDC definitions, and three trusts (3%) using the ASEPSIS scoring system. ASEPSIS is a scoring system based on additional treatment, serous discharge, erythema, purulent exudate, separation of deep tissues, isolation of bacteria and duration of inpatient stay.

Unlike the CDC definition, the HPA definition requires that a clinician's diagnosis of an SSI must be supported by evidence of clinical signs and symptoms. Ten hospital trusts (10%) stated that additional evidence was not required to support a clinician's diagnosis of an infection. Additionally, four trusts (3%) reported that the prescription of antibiotics in primary care was sufficient evidence to diagnose an SSI.

Eleven hospital trusts (10%) reported omission of superficial infections in their SSI data. Eight out of 85 trusts (10%) that collected data on mandatory orthopaedic procedures discounted superficial SSIs. Two out of 16 trusts (12%) that collected SSI data on colorectal procedures discounted superficial SSIs, and one of 13 trusts (8%) that collected SSI data on cardiac procedures discounted superficial SSIs.
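For readers unfamiliar with the ASEPSIS score mentioned above, the sketch below shows how such a score is assembled from its components. The weightings and interpretation bands are commonly cited values assumed here for illustration, not figures taken from this paper, and should be verified against the original ASEPSIS publication before use.

```python
# Illustrative sketch of an ASEPSIS-style wound score. The point values and
# thresholds are commonly cited weightings (assumed, not from this paper);
# verify against the original ASEPSIS publication before relying on them.
def asepsis_score(daily_wound_points, antibiotics, pus_drained_local,
                  debridement_general, bacteria_isolated, stay_over_14_days):
    """daily_wound_points: sum of the daily scores for serous discharge,
    erythema, purulent exudate and separation of deep tissues, each graded
    by the proportion of the wound affected."""
    score = daily_wound_points
    if antibiotics:
        score += 10   # additional treatment: antibiotics
    if pus_drained_local:
        score += 5    # additional treatment: drainage of pus (local anaesthetic)
    if debridement_general:
        score += 10   # additional treatment: debridement (general anaesthetic)
    if bacteria_isolated:
        score += 10   # isolation of bacteria from the wound
    if stay_over_14_days:
        score += 5    # inpatient stay prolonged beyond 14 days
    return score

def asepsis_category(score):
    # Commonly cited interpretation bands (assumed, not from this paper).
    if score <= 10:
        return "satisfactory healing"
    if score <= 20:
        return "disturbance of healing"
    if score <= 30:
        return "minor wound infection"
    if score <= 40:
        return "moderate wound infection"
    return "severe wound infection"

# Hypothetical wound: some exudate and erythema, antibiotics prescribed,
# bacteria isolated from the wound.
s = asepsis_score(daily_wound_points=8, antibiotics=True, pus_drained_local=False,
                  debridement_general=False, bacteria_isolated=True,
                  stay_over_14_days=False)
print(s, asepsis_category(s))   # 28 -> "minor wound infection"
```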
Reporting data to the national programme

Thirty of the 106 hospital trusts (28%) reported that they did not send all of the SSI data they collected to the national surveillance service. These trusts said that they only submitted mandatory data, they did not submit postdischarge data, or they only sent three months' worth of data even if they had undertaken continuous surveillance. The wide variation in quarterly data (three-month period) is exemplified in large bowel surgery, where one trust presented consecutive quarterly SSI rates of 12%, 22%, 13% and 25%. Only eight trusts provided reasons for non-submission of data to the national programme. The reasons were: 'entering data on HPA system is laborious', 'lack of resources to enter data', 'feedback is too slow', 'feedback has limited use', and 'clinicians have reservations about service, think it is not sufficiently robust'.

Forty hospital trusts (38%) provided their commissioners with regular SSI data reports, and SSI surveillance was included as a target within the Commissioning for Quality and Innovation (CQUIN) payment framework for 23 trusts (22%).

Discussion

Using surveillance data for benchmarking

In 2008, the HPA stated that 'valid benchmarks must be based on standardised definitions and monitoring systems'.[11] This study found that SSI data collection methods are not standardized and there is non-compliance with protocol methods and definitions. For example, contrary to the national protocol methods and definitions, 10% of hospital trusts did not provide data on superficial infections, 15% of trusts did not use the HPA definition of infection, and 8% of trusts only collected data from inpatients. The effect of these deviations from the guidelines on SSI rates has been demonstrated previously in other studies.[12]

Variations in the methods and the quality of data collected appear to cause further discrepancies. Supporting the findings of many other studies,[10] hospital trusts that used PDS had higher SSI rates than trusts that did not use PDS. However, this study also found variations within PDS; direct patient contact appeared to be a more robust method of PDS than outpatient review, although this finding should be treated with caution as the number of trusts that used outpatient review alone was small. Additionally, within PDS, trusts with high levels of patient contact had higher SSI rates than trusts with low levels of patient contact. Both these findings are new. Whilst the national programme recommends the use of direct patient contact or outpatient review for PDS, this study suggests that there may not be parity between these methods. It is clear that ensuring a high patient response rate is paramount, and it cannot be assumed that non-responders do not have SSIs. This is supported by a recent qualitative study of patients with deep SSIs, which found that nine out of 17 patients were not aware that they had SSIs and would not have described their wounds as infected.[13]
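A back-of-the-envelope calculation, with invented figures rather than survey data, illustrates why incomplete case ascertainment deflates reported rates: if only a fraction of true SSIs are detected, the published rate shrinks by that fraction.

```python
# Invented figures for illustration only (not from the survey): how the
# reported SSI rate falls when post-discharge infections go undetected.
procedures = 1000          # hypothetical number of operations
true_ssi_rate = 0.04       # hypothetical true SSI rate (4%)
detected_fraction = 0.35   # hypothetical share of SSIs detected without PDS

true_ssis = procedures * true_ssi_rate
reported_ssis = true_ssis * detected_fraction
reported_rate = reported_ssis / procedures
print(f"True rate {true_ssi_rate:.1%}, reported rate {reported_rate:.1%}")
# -> True rate 4.0%, reported rate 1.4%
```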
For the reasons outlined earlier (non-compliance with protocol, different data collection methods and variations in data quality), it is suggested that it is not appropriate to use the current national SSI surveillance service for benchmarking. For accurate benchmarking at a national level, definitions and methods should be standardized and systems must be introduced to ensure compliance with protocols and definitions. The service should also incorporate some validation methodologies for quality assurance purposes.

Currently, hospital trusts that have high SSI rates are labelled as 'outliers' by the national surveillance service and are investigated by the HPA.[14] This study found that trusts which undertake high-quality surveillance (inclusion of all types of SSIs, use of the recommended SSI definition and PDS using direct contact with all patients) have higher SSI rates. As a direct consequence, trusts that conduct high-quality surveillance may be penalized.

There are financial implications for having 'high' SSI rates. The opportunity to generate income may be reduced, as commissioners may choose to commission surgery from hospital trusts with 'lower' SSI rates. This is already a possibility, as 40 trusts in this survey were required to provide their commissioners with SSI reports. Additionally, income may be lost from failure to meet quality targets within the CQUIN payment framework. Twenty-three trusts in this survey had CQUIN targets that included SSI surveillance.

The possibility of financial penalties may shape the way that hospital trusts collect or submit SSI data. In a study of over 300 US hospitals to explore unintended consequences resulting from Medicare's decision to stop financing SSIs, infection prevention staff reported changes in the way that SSIs were coded and documented by hospital staff to minimize financial losses.[15] It is interesting to note that 30 trusts in the current study did not submit all their data to the national programme. Several reasons were given for this, ranging from lack of resources to lack of confidence in the national programme.

A more positive interpretation of hospital trusts choosing not to submit all their data to the national programme could be that these trusts see the value in collecting continuous surveillance data or postdischarge data, and are using these data for their own purposes rather than for benchmarking with other trusts.

Implications for surveillance services

Some findings in this study have raised issues that should be considered when designing national surveillance services, such as the inclusion of superficial infections, mandatory surveillance and PDS.

Superficial infections

Ten percent of hospital trusts did not report superficial SSIs to the national programme, contrary to the HPA SSI definition and the HPA reporting guidelines. This raises an interesting point about the perception of superficial infections, and whether they should be included in SSI rates. Superficial infections are perceived to be less important than deep SSIs as they cause fewer complications and are less costly.[16] However, there is an argument for capturing superficial SSI data, as there is evidence that superficial infections can develop into deep infections.[17] Presenting superficial infections separately from