This book traces the changing political and social roles of classical education in late antique Gaul. It argues that the collapse of Roman political power in Gaul changed the way education was practiced and perceived by Gallo-Romans. Neither the barbarian kingdoms nor the Church directly caused the decline of classical schools, but these new structures of power did not encourage or support a cultural and political climate in which classical education mattered; while Latin remained the language of the Church, and literacy and knowledge of law were valued by barbarian courts, training in classical grammar and rhetoric was no longer seen as a prerequisite for political power and cultural prestige. This study demonstrates that these fundamental shifts in what education meant to individuals and power brokers resulted in the eventual end of the classical schools of grammar and rhetoric that had once defined Roman aristocratic public and private life.
Emergency department (ED) visits for epilepsy are common, costly, and often clinically unnecessary. Care pathways (CPs) configured to divert people away from the ED offer an alternative. The aim was to measure patient and carer preferences for alternative CPs and to explore, with a wider group of stakeholders, the feasibility of implementing the preferred CPs in the National Health Service (NHS) in England.
Methods
Formative work (provider survey, service-user interviews, knowledge exchange, and think-aloud piloting) informed a discrete choice experiment (DCE) with six attributes: access to care plan, conveyance, time, epilepsy specialist today, general practitioner (GP) notification, and epilepsy specialist follow-up. This was hosted online with random assignment to two of three scenarios (home, public, or atypical). Logistic regression generated preference weights that were used to calculate the utility of CPs. The highest ranked CPs plus a status quo were discussed at three online knowledge exchange workshops. The nominal group technique was used to ascertain stakeholder views on preference evidence and to seek group consensus on optimal feasible alternatives.
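To illustrate how preference weights from a DCE are combined into pathway utilities, a minimal Python sketch follows. The attribute levels and coefficient values are hypothetical placeholders, not the study's estimates; in the study itself the weights came from the logistic regression described above.

```python
# Illustrative sketch (hypothetical coefficients): summing DCE preference
# weights to score alternative care pathways (CPs). Values are placeholders,
# not estimates from the study.

# Hypothetical preference weights for each attribute level (relative to a
# reference level coded as 0).
weights = {
    "care_plan_access": 0.9,      # paramedic has access to care plan
    "non_conveyance": 0.7,        # not conveyed to ED
    "time_1_3_hours": 0.4,        # episode managed within 1-3 hours
    "specialist_today": 0.8,      # epilepsy specialist contacted same day
    "gp_notified": 0.3,           # GP notified of the episode
    "specialist_follow_up": 0.6,  # specialist follow-up within 2-3 weeks
}

def pathway_utility(attributes: list[str]) -> float:
    """Sum the preference weights of the attribute levels a pathway offers."""
    return sum(weights[a] for a in attributes)

# Compare a candidate optimal pathway against a status-quo-like pathway.
optimal = ["care_plan_access", "non_conveyance", "time_1_3_hours",
           "specialist_today", "gp_notified", "specialist_follow_up"]
status_quo = ["gp_notified"]  # e.g. conveyance to ED with little follow-up

print("Optimal CP utility:   ", pathway_utility(optimal))
print("Status quo CP utility:", pathway_utility(status_quo))
```

Ranking candidate CPs by this kind of summed utility is what allowed the highest ranked pathways, plus the status quo, to be carried forward to the knowledge exchange workshops.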
Results
A sample of 427 people with epilepsy and 167 friends or family members completed the survey. People with epilepsy preferred paramedics to have access to their care plan, non-conveyance, a time of one to three hours, an epilepsy specialist today, GP notification, and specialist follow-up within two to three weeks. Family and friends differed when considering atypical seizures, favoring conveyance to urgent treatment centers and a shorter time. The optimal configuration of services from service users' perspectives outranked current practice. Knowledge exchange (n=27 participants) deemed the optimal CP feasible but identified two scenarios for resource reallocation: a care plan substituting for specialist advice on the same day, and times of strain on NHS resources.
Conclusions
Preferences differed from current practice but showed minimal variation by seizure type or stakeholder group. This study clearly identified optimal and feasible alternative CPs. The mixed-methods approach allowed robust measurement of preferences, whilst knowledge exchange examined feasibility to enhance implementation of optimal alternative CPs in the future.
Advances in artificial intelligence (AI) have great potential to help address societal challenges that are both collective in nature and present at national or transnational scale. Pressing challenges in healthcare, finance, infrastructure and sustainability, for instance, might all be productively addressed by leveraging and amplifying AI for national-scale collective intelligence. The development and deployment of this kind of AI faces distinctive challenges, both technical and socio-technical. Here, a research strategy for mobilising inter-disciplinary research to address these challenges is detailed and some of the key issues that must be faced are outlined.
Campylobacter spp. are leading bacterial gastroenteritis pathogens. Infections are largely underreported, and the burden of outbreaks may be underestimated. Current strategies of testing as few as one isolate per sample can affect attribution of cases to epidemiologically important sources with high Campylobacter diversity, such as chicken meat. Multiple culture method combinations were utilized to recover and sequence Campylobacter from 45 retail chicken samples purchased across Norwich, UK, selecting up to 48 isolates per sample. Simulations based on resampling were used to assess the impact of Campylobacter sequence type (ST) diversity on outbreak detection. Campylobacter was recovered from 39 samples (87%), although only one sample was positive through all broth, temperature, and plate combinations. Three species were identified (Campylobacter jejuni, Campylobacter coli, and Campylobacter lari), and 33% of samples contained two species. Positive samples contained 1–8 STs. Simulation revealed that up to 87 isolates per sample would be required to detect 95% of the observed ST diversity, and 26 isolates would be required for the average probability of detecting a random theoretical outbreak ST to reach 95%. An optimized culture approach and selecting multiple isolates per sample are essential for more complete Campylobacter recovery to support outbreak investigation and source attribution.
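As a rough illustration of the resampling approach described (with invented ST frequencies rather than the study's data), the following Python sketch estimates how the probability of detecting a given ST rises with the number of isolates selected per sample.

```python
# Minimal resampling sketch (illustrative data): estimate the probability of
# detecting a target Campylobacter sequence type (ST) when only n isolates
# per chicken sample are picked. ST labels and frequencies are hypothetical.
import random

# Hypothetical mix of STs among 48 isolates from one retail chicken sample.
isolates = ["ST-257"] * 20 + ["ST-45"] * 15 + ["ST-21"] * 10 + ["ST-828"] * 3

def detection_probability(target_st: str, n_isolates: int,
                          n_simulations: int = 10_000) -> float:
    """Fraction of simulations in which the target ST appears among a random
    subsample of n_isolates isolates (sampled without replacement)."""
    hits = 0
    for _ in range(n_simulations):
        subsample = random.sample(isolates, n_isolates)
        if target_st in subsample:
            hits += 1
    return hits / n_simulations

random.seed(1)
for n in (1, 5, 10, 26):
    p = detection_probability("ST-828", n)
    print(f"{n:>2} isolates -> P(detect ST-828) ~ {p:.2f}")
```

Repeating this over many samples and many theoretical outbreak STs gives the kind of detection curves used to justify selecting multiple isolates per sample.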
The purpose of this study was to explore overall recovery time and Post-Concussion Symptom Scale (PCSS) scores of pediatric concussion patients who were referred to a specialty concussion clinic after enduring a protracted recovery (>28 days). This included patients who self-deferred care or received management from another provider until recovery became complicated. It was hypothesized that patients with protracted recovery who initiated care within a specialty concussion clinic would have recovery outcomes similar to those of patients with typical acute concussion injuries (i.e., recovery within 3 weeks).
Participants and Methods:
Retrospective data were gathered from electronic medical records of concussion patients aged 6-19 years. Demographic data were examined based on age, gender, race, concussion history, and comorbid psychiatric diagnosis. Concussion injury data included days from injury to initial clinic visit, total visits, PCSS scores, days from injury to recovery, and days from initiating care with a specialty clinic to recovery. All participants were provided standard return-to-learn and return-to-play protocols, aerobic exercise recommendations, behavioral health recommendations, personalized vestibular/ocular motor rehabilitation exercises, and psychoeducation on the expected recovery trajectory of concussion.
Results:
52 patients were included in this exploratory analysis (Mean age 14.6, SD ±2.7; 57.7% female; 55.7% White, 21.2% Black or African American, 21.2% Hispanic). Two percent of our sample did not disclose their race or ethnicity. Prior concussion history was present in 36.5% of patients and 23.1% had a comorbid psychiatric diagnosis. The patient referral distribution included emergency departments (36%), local pediatricians (26%), neurologists (10%), other concussion clinics (4%), and self-referrals (24%).
Given the nature of our specialty concussion clinic sample, the data were not normally distributed and were more likely to be skewed by outliers. As such, the median and interquartile range (IQR) were used to describe the results. Regarding recovery variables, the median time from initial injury to the first clinic visit was 50.0 days (IQR=33.5-75.5), the median PCSS score at the initial visit was 26.0 (IQR=10.0-53.0), and the median overall recovery time was 81.0 days (IQR=57.0-143.3).
After initiating care within our specialty concussion clinic, the median recovery time was 21.0 additional days (IQR=14.0-58.0), the median number of total visits was 2.0 (IQR=2.0-3.0), and the median PCSS score at the follow-up visit was 7.0 (IQR=1.0-17.3).
Conclusions:
Research has shown that early referral to specialty concussion clinics may reduce recovery time and the risk of protracted recovery. Our results extend these findings to suggest that patients with protracted recovery returned to baseline similarly to those with an acute concussion injury after initiating specialty clinic care. This may be due to the vast number of resources within specialty concussion clinics including tailored return-to-learn and return-to-play protocols, rehabilitation recommendations consistent with research, and home exercises that supplement recovery. Future studies should compare outcomes of protracted recovery patients receiving care from a specialty concussion clinic against those who sought other forms of treatment. Further, evaluating the influence of comorbid factors (e.g., psychiatric and/or concussion history) on pediatric concussion recovery trajectories may be useful for future research.
The COVID-19 pandemic increased food insufficiency: a severe form of food insecurity. Drawing on an ecological framework, we aimed to understand factors that contributed to changes in food insufficiency from April to December 2020, in a large urban population hard hit by the pandemic.
Design:
We conducted internet surveys every 2 weeks in April–December 2020, including a subset of items from the Food Insecurity Experience Scale. Longitudinal analysis identified predictors of food insufficiency, using fixed effects models.
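For readers unfamiliar with fixed effects models, the sketch below shows one plausible specification in Python; the file name, variable names, and use of a linear probability model are assumptions for illustration, not the authors' exact code or estimator.

```python
# Sketch of a fixed effects specification (hypothetical variable names and
# data layout; a linear probability model is used here for simplicity rather
# than the authors' exact estimator).
import pandas as pd
import statsmodels.formula.api as smf

# Long-format panel: one row per participant per biweekly survey wave.
df = pd.read_csv("food_insufficiency_panel.csv")  # hypothetical file
# Expected columns: person_id, wave, food_insufficient (0/1),
# snap_receipt (0/1), family_help (0/1), stimulus_funds (0/1)

# Person fixed effects absorb stable characteristics (e.g. poverty status,
# household size); wave fixed effects absorb common time shocks.
model = smf.ols(
    "food_insufficient ~ snap_receipt + family_help + stimulus_funds"
    " + C(person_id) + C(wave)",
    data=df,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["person_id"]})
print(result.params[["snap_receipt", "family_help", "stimulus_funds"]])
```

The fixed effects design means each participant serves as their own control, so the coefficients reflect within-person changes in food insufficiency as assistance receipt changes over time.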
Setting:
Los Angeles County, which has a diverse population of 10 million residents.
Participants:
A representative sample of 1535 adults in Los Angeles County who are participants in the Understanding Coronavirus in America tracking survey.
Results:
Rates of food insufficiency spiked in the first year of the pandemic, especially among participants living in poverty, in middle adulthood and with larger households. Government food assistance from the Supplemental Nutrition Assistance Program was significantly associated with reduced food insufficiency over time, while other forms of assistance such as help from family and friends or stimulus funds were not.
Conclusions:
The findings highlight that during a crisis, there is value in rapidly monitoring food insufficiency and investing in government food benefits.
Bacterial superinfection and antibiotic prescribing in the setting of the current mpox outbreak are not well described in the literature. This retrospective observational study revealed low prevalence (11%) of outpatient antibiotic prescribing for bacterial superinfection of mpox lesions; at least 3 prescriptions (23%) were unnecessary.
Individuals living with severe mental illness can have significant emotional, physical and social challenges. Collaborative care combines clinical and organisational components.
Aims
We tested whether a primary care-based collaborative care model (PARTNERS) would improve quality of life for people with diagnoses of schizophrenia, bipolar disorder or other psychoses, compared with usual care.
Method
We conducted a general practice-based, cluster randomised controlled superiority trial. Practices were recruited from four English regions and allocated (1:1) to intervention or control. Individuals receiving limited input in secondary care or who were under primary care only were eligible. The 12-month PARTNERS intervention incorporated person-centred coaching support and liaison work. The primary outcome was quality of life as measured by the Manchester Short Assessment of Quality of Life (MANSA).
Results
We allocated 39 general practices, with 198 participants, to the PARTNERS intervention (20 practices, 116 participants) or control (19 practices, 82 participants). Primary outcome data were available for 99 (85.3%) intervention and 71 (86.6%) control participants. Mean change in overall MANSA score did not differ between the groups (intervention: 0.25, s.d. 0.73; control: 0.21, s.d. 0.86; estimated fully adjusted between-group difference 0.03, 95% CI −0.25 to 0.31; P = 0.819). Acute mental health episodes (safety outcome) included three crises in the intervention group and four in the control group.
Conclusions
There was no evidence of a difference in quality of life, as measured with the MANSA, between those receiving the PARTNERS intervention and usual care. Shifting care to primary care was not associated with increased adverse outcomes.
Healthcare workers (HCWs) in long-term care facilities (LTCFs) are disproportionately affected by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the virus that causes coronavirus disease 2019 (COVID-19). To characterize factors associated with SARS-CoV-2 positivity among LTCF HCWs, we performed a retrospective cohort study among HCWs in 32 LTCFs in the Minneapolis–St Paul region.
Methods:
We analyzed the outcome of SARS-CoV-2 polymerase chain reaction (PCR) positivity among LTCF HCWs during weeks 34–52 of 2020. LTCF and HCW-level characteristics, including facility size, facility risk score for resident-HCW contact, and resident-facing job role, were modeled in univariable and multivariable generalized linear regressions to determine their association with SARS-CoV-2 positivity.
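A minimal sketch of this kind of modelling in Python is given below; the data file, column names, and exact specification are hypothetical and simplified relative to the study's generalized linear regressions.

```python
# Sketch of univariable and multivariable logistic regression for HCW
# SARS-CoV-2 positivity (hypothetical column names and data file).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

hcw = pd.read_csv("ltcf_hcw.csv")  # hypothetical: one row per HCW
# Expected columns: positive (0/1), resident_facing (0/1),
# facility_risk ("low"/"medium"/"high"), facility_size ("small"/"large")

# Univariable model: job role only.
uni = smf.logit("positive ~ resident_facing", data=hcw).fit(disp=False)

# Multivariable model: adjust for facility-level characteristics.
multi = smf.logit(
    "positive ~ resident_facing + C(facility_risk, Treatment('low'))"
    " + C(facility_size)",
    data=hcw,
).fit(disp=False)

# Odds ratios with 95% confidence intervals.
or_table = np.exp(pd.concat([multi.params, multi.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```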
Results:
Between weeks 34 and 52, 440 (20.7%) of 2,130 unique HCWs tested positive for SARS-CoV-2 at least once. In the univariable model, non–resident-facing HCWs had lower odds of infection (odds ratio [OR], 0.50; 95% confidence interval [CI], 0.36–0.70). In the multivariable model, the odds remained lower for non–resident-facing HCWs (OR, 0.50; 95% CI, 0.36–0.71), and those in medium- versus low-risk facilities experienced higher odds of testing positive for SARS-CoV-2 (OR, 1.47; 95% CI, 1.08–2.02).
Conclusions:
Our findings suggest that COVID-19 cases are related to contact between HCW and residents in LTCFs. This association should be considered when formulating infection prevention and control policies to mitigate the spread of SARS-CoV-2 in LTCFs.
Optimum nutrition plays a major role in the achievement and maintenance of good health. The Nutrition Society of the UK and Ireland and the Sabri Ülker Foundation, a charity based in Türkiye and focused on improving public health, combined forces to highlight this important subject. A hybrid conference was held in Istanbul, with over 4000 delegates from sixty-two countries joining the proceedings live online in addition to those attending in person. The primary purpose was to inspire healthcare professionals and nutrition policy makers to better consider the role of nutrition in their interactions with patients and the public at large to reduce the prevalence of non-communicable diseases such as obesity and type 2 diabetes. The event provided an opportunity to share and learn from different approaches in the UK, Türkiye and Finland, highlighting initiatives to strengthen research in the nutritional sciences and translation of that research into nutrition policy. The presenters provided evidence of the links between nutrition and disease risk and emphasised the importance of minimising risk and implementing early treatment of diet-related disease. Suggestions were made including improving health literacy and strengthening policies to improve the quality of food production and dietary behaviour. A multidisciplinary approach is needed whereby Governments, the food industry, non-governmental groups and consumer groups collaborate to develop evidence-based recommendations and appropriate joined-up policies that do not widen inequalities. This summary of the proceedings will serve as a gateway for those seeking to access additional information on nutrition and health across the globe.
Across a remarkable forty-year career that has continued apace since his retirement in 2014, Allan Macinnes has made a hugely significant contribution to Scottish, British and early modern historical scholarship. Across five monographs, nine edited books and innumerable essays and articles, Macinnes honed a distinctive style as a historian, renowned for his unrivalled coverage of Scottish and international archives and a take-no-prisoners approach to intellectual inquiry that recalled his days as a shinty player. This was later captured in the nickname he acquired among British historians in the early 1990s – Ice Pick – on account of his ‘direct’ style of engagement. Yet despite this fearsome reputation, Macinnes was also a committed teacher and advisor. As an early adopter of what is now accepted as an essential requirement of all professional historians – research-led teaching – he had incomparable success in producing a new generation of historians, many of whom remain active scholars today.
Macinnes was integral to the transformation of Scottish History as a discipline in the late twentieth and early twenty-first centuries. In his own words, he sought to avoid what he perceived to be the excessive insularity and introspection of his immediate predecessors and contemporaries by emphasising comparative history and the promotion of Scottish History internationally. Much like his historical sparring partner Archibald Campbell, Marquess of Argyll, Macinnes has been comfortable operating in Scottish, British and overseas theatres. This has been reflected similarly in his many collaborations with early modernists in Europe and North America and in the wide-ranging contributions to this collection.
Evaluating Macinnes's wide-ranging scholarship is no easy task: any attempt to do so is liable to fall short when trying to fully capture its breadth and depth. Rather than auditing a prolific and ongoing publication record, then, we have opted instead to trace three fundamental elements of his work that also serve to illuminate the broader historiographical landscape of the last half-century, and which remain major interventions in Scottish and British historiography. These elements are: (i) the study of Scottish history in a comparative international context and a rejection of Anglocentrism in British history; (ii) the study of the Scottish Highlands without recourse either to sentimentalism or the over-privileging of governmental perspectives; and (iii) the study of political economy and Scottish commerce before and after the Treaty of Union.
The involvement of citizens in the production and creation of public services has become a central tenet for administrations internationally. In Scotland, co-production has underpinned the integration of health and social care via the Public Bodies (Joint Working) (Scotland) Act 2014. We report on a qualitative study that examined the experiences and perspectives of local and national leaders in Scotland on undertaking and sustaining co-production in public services. Adopting a meso and macro perspective, we interviewed senior planning officers from eight health and social care partnership areas in Scotland and key actors in national agencies. The findings suggest that an overly complex Scottish governance landscape undermines the sustainability of co-production efforts. As part of a COVID-19 recovery, both the implementation of meaningful co-production and coordinated leadership for health and social care in Scotland need to be addressed, as should the development of the evaluation capacities of those working across health and social care boundaries, so that co-production can be evaluated and reported on to inform the future of the integration agenda.
Sudden onset severe headache is usually caused by a primary headache disorder but may be secondary to a more serious problem, such as subarachnoid hemorrhage (SAH). Very few patients who present to hospital with headache have suffered a SAH, but early identification is important to improve patient outcomes. A systematic review was undertaken to assess the clinical effectiveness of different care pathways for the management of headache, suspicious for SAH, in the Emergency Department. Capturing the perspective of patients was an important part of the research.
Methods
The project team included a patient collaborator with experience of presenting to the Emergency Department with sudden onset severe headache. Three additional patients were recruited to our advisory group. Patient perspectives were collected at various points throughout the project, including at team meetings, during protocol development, and when interpreting the results of the systematic review and drawing conclusions.
Results
Patients were reassured by the very high diagnostic accuracy of computed tomography (CT) for detecting SAH. Patients and clinicians emphasized the importance of shared decision making about whether to undergo additional tests to rule out SAH, after a negative CT result. When lumbar puncture was necessary, patients expressed a preference to have it on an ambulatory basis; further research on the safety and acceptability of ambulatory lumbar puncture was recommended.
Conclusions
Patient input at the protocol development stage helped researchers understand the patient experience and highlighted important outcomes for assessment. Patient involvement added context to the review findings and highlighted the preferences of patients regarding the management of headache.
Sudden onset severe headache is usually caused by a primary headache disorder but occasionally is secondary to a more serious problem, such as subarachnoid hemorrhage (SAH). Guidelines recommend non-contrast brain computed tomography (CT) followed by lumbar puncture (LP) to exclude SAH. However, guidelines pre-date the introduction of more sensitive modern CT scanners. A systematic review was undertaken to assess the clinical effectiveness of different care pathways for the management of headache in the Emergency Department.
Methods
Eighteen databases (including MEDLINE and Embase) were searched to February 2020. Studies were quality assessed using criteria relevant to the study design; most studies were assessed using the QUADAS-2 tool for diagnostic accuracy studies. Where sufficient information was reported, diagnostic accuracy data were extracted into 2 × 2 tables to calculate sensitivity, specificity, false-positive and false-negative rates. Where possible, hierarchical bivariate meta-analysis was used to synthesize results, otherwise studies were synthesized narratively.
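As a brief illustration of how sensitivity, specificity, and false-positive/false-negative rates are derived from a 2 × 2 table, the following Python sketch uses invented counts rather than data extracted for the review.

```python
# Diagnostic accuracy from a 2x2 table (illustrative counts, not study data).
# Rows: index test result; columns: SAH present / absent on reference standard.
tp, fp = 95, 230   # test positive: true positives, false positives
fn, tn = 1, 720    # test negative: false negatives, true negatives

sensitivity = tp / (tp + fn)           # proportion of SAH cases detected
specificity = tn / (tn + fp)           # proportion of non-SAH correctly ruled out
false_positive_rate = 1 - specificity  # non-SAH patients flagged for more tests
false_negative_rate = 1 - sensitivity  # SAH cases missed by the test

print(f"Sensitivity:          {sensitivity:.1%}")
print(f"Specificity:          {specificity:.1%}")
print(f"False-positive rate:  {false_positive_rate:.1%}")
print(f"False-negative rate:  {false_negative_rate:.1%}")
```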
Results
Fifty-one studies were included in the review. Eight studies assessing the accuracy of the Ottawa SAH clinical decision rule were pooled; sensitivity was 99.5 percent and specificity was 23.7 percent. The high false-positive rate suggests that 76.3 percent of SAH-negative patients would undergo further investigation unnecessarily. Four studies assessing the accuracy of CT within six hours of headache onset were pooled; sensitivity was 98.7 percent and specificity was 100 percent. CT sensitivity beyond six hours was considerably lower (≤90%; 2 studies). Three studies assessing LP following negative CT were pooled; sensitivity was 100 percent and specificity was 95.2 percent. LP-related adverse events were reported in 5.3–9.5 percent of patients.
Conclusions
The evidence suggests that the Ottawa SAH Rule is not sufficiently accurate for ruling out SAH and does little to aid clinical decision making. Modern CT within six hours of headache onset (with images assessed by a neuroradiologist) is highly accurate, but sensitivity reduces considerably over time. The CT-LP pathway is highly sensitive for detecting SAH, although LP resulted in some false-positives and adverse events.
The impact of dietary phosphorus on chronic renal disease in cats, humans and other species is receiving increasing attention. As Ca and P metabolism are linked, the ratio of Ca:P is an important factor for consideration when formulating diets for cats and other animals. Here, we describe a fully randomised crossover study including twenty-four healthy, neutered adult cats, investigating postprandial responses in plasma P, ionised Ca and parathyroid hormone (PTH) following one meal (50 % of individual metabolic energy requirement) of each of six experimental diets. Diets were formulated to provide P at either 0·75 or 1·5 g/1000 kcal (4184 kJ) from the soluble phosphorus salt sodium tripolyphosphate (STPP, Na5P3O10), variable levels of organic Ca and P sources, and an intended total Ca:P of about 1·0, 1·5 or 2·0. For each experimental diet, baseline fasted blood samples were collected prior to the meal, and serial blood samples were collected hourly for 6 h thereafter. For all diets, a significant increase from baseline was observed at 120 min in plasma PTH (P < 0·001). The diet containing the highest STPP inclusion level and lowest Ca:P induced the highest peaks in postprandial plasma P and PTH (1·8 mmol/l and 27·2 pg/ml, respectively) and the longest durations of concentrations raised above baseline (3 h for P and 6 h for PTH). Data indicate that Ca:P modulates postprandial plasma P and PTH. Therefore, when formulating diets containing soluble P salts for cats, increasing the Ca:P ratio should be considered.
The impacts of COVID-19 for people with obsessive-compulsive disorder (OCD) may be considerable. Online cognitive behavioural therapy (iCBT) programmes provide scalable access to psychological interventions, although the effectiveness of iCBT for OCD during COVID-19 has not been evaluated.
Aim:
This study investigated the uptake and effectiveness of iCBT for OCD (both self- and clinician-guided courses) during the first 8 months of the pandemic in Australia (March to October 2020) and compared outcomes with the previous year.
Method:
1,343 adults (824/1,343 (61.4%) female; mean age 33.54 years, SD = 12.00) commenced iCBT for OCD (1,061 during the pandemic and 282 in the year before) and completed measures of OCD (Dimensional Obsessive-Compulsive Scale) and depression (Patient Health Questionnaire-9) symptom severity, psychological distress (Kessler-10), and disability (WHO Disability Assessment Schedule) pre- and post-treatment.
Results:
During COVID-19, there was a 522% increase in monthly course registrations compared with the previous year, with peak uptake observed between April and June 2020 (a 1191% increase compared with April to June 2019). OCD and depression symptom severity were similar for the COVID and pre-COVID groups, although COVID-19 participants were more likely to enrol in self-guided courses (versus clinician-guided). In both pre- and during-COVID groups, the OCD iCBT course was associated with medium effect size reductions in OCD (g = 0.65–0.68) and depression symptom severity (g = 0.56–0.65), medium to large reductions in psychological distress (g = 0.77–0.83) and small reductions in disability (g = 0.35–0.50).
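The effect sizes reported are standardized mean differences; the short Python sketch below shows one common way a pre–post Hedges' g is computed, using invented scores and a pooled-SD formulation that may differ from the authors' exact method.

```python
# Sketch: pre-post Hedges' g as a bias-corrected standardized mean difference.
# Scores are invented for illustration; this mirrors one common formulation
# (pooled SD of pre and post scores), not necessarily the authors' method.
import numpy as np

pre = np.array([18, 22, 15, 30, 25, 19, 27, 21])    # hypothetical DOCS scores
post = np.array([12, 16, 10, 24, 17, 14, 20, 15])   # hypothetical DOCS scores

n1, n2 = len(pre), len(post)
pooled_sd = np.sqrt(((n1 - 1) * pre.var(ddof=1) + (n2 - 1) * post.var(ddof=1))
                    / (n1 + n2 - 2))
cohens_d = (pre.mean() - post.mean()) / pooled_sd

# Small-sample bias correction turns Cohen's d into Hedges' g.
correction = 1 - 3 / (4 * (n1 + n2) - 9)
hedges_g = cohens_d * correction
print(f"Hedges' g = {hedges_g:.2f}")
```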
Conclusion:
Results demonstrate the considerable uptake of online psychological services for those experiencing symptoms of OCD during COVID-19 and highlight the scalability of effective digital mental health services.
Greek had held an important place in Roman society and culture since the Late Republican period, and educated Romans were expected to be bilingual and well versed in both Greek and Latin literature. The Roman school ‘curriculum’ was based on Hellenistic educational culture, and in the De grammaticis et rhetoribus Suetonius says that the earliest teachers in Rome, Livius and Ennius, were ‘poets and half Greeks’ (poetae et semigraeci), who taught both Latin and Greek ‘publicly and privately’ (domi forisque docuisse) and ‘merely clarified the meaning of Greek authors or gave exemplary readings from their own Latin compositions’ (nihil amplius quam Graecos interpretabantur aut si quid ipsi Latine composuissent praelegebant, Gram. et rhet. 1–2). Cicero, the Latin neoteric poets and Horace are obvious examples of bilingual educated Roman aristocrats, and throughout the Imperial period a properly educated Roman would be learned in utraque lingua. The place of Greek in Quintilian's Institutio Oratoria reveals the importance and prevalence of Greek in Roman education and literature in the late first century a.d. Quintilian argues that children should learn both Greek and Latin but that it is best to begin with Greek. Famously, in the second century a.d. the Roman author Apuleius gave speeches in Greek to audiences in Carthage, and in his Apologia mocked his accusers for their ignorance of Greek.
Background:
In an effort to reduce inappropriate testing for hospital-onset Clostridioides difficile infection (HO-CDI), we sequentially implemented 2 strategies: an electronic health record–based clinical decision support tool that alerted ordering physicians about potentially inappropriate testing without a hard stop (intervention period 1), replaced by mandatory infectious diseases (ID) attending physician approval for any HO-CDI test order (intervention period 2). We analyzed appropriate HO-CDI testing rates for both intervention periods.
Methods:
We performed a retrospective study of patients 18 years or older who had an HO-CDI test (performed after hospital day 3) during 3 periods: baseline (no intervention, September 2014–February 2015), intervention 1 (clinical decision support tool only, April 2015–September 2015), and intervention 2 (ID approval only, December 2017–September 2018). From each of the 3 periods, we randomly selected 150 patients who received HO-CDI testing (450 patients total). We restricted the study to the general medicine, bone marrow transplant, medical intensive care, and neurosurgical intensive care units. We assessed each HO-CDI test for appropriateness (see Table 1 for criteria), and we compared rates of appropriateness using the χ2 test or the Kruskal–Wallis test, where appropriate.
Results:
In our cohort of 450 patients, the median age was 61 years, and the median hospital length of stay was 20 days. The median hospital day on which HO-CDI testing was performed differed among the 3 groups: 12 days at baseline, 10 days during intervention 1, and 8.5 days during intervention 2 (P < .001). Appropriateness of HO-CDI testing increased from baseline with both interventions, but mandatory ID approval was associated with the highest rate of testing appropriateness (Fig. 1). Reasons for inappropriate ordering did not differ among the periods, with <3 documented stools being the most common criterion for inappropriateness. During intervention 2, among the 33 inappropriate tests, 8 (24%) occurred with no approval from an ID attending recorded. HO-CDI test positivity rates during the 3 time periods were 12%, 11%, and 21%, respectively (P = .03).
Conclusions:
Both the clinical decision support tool and mandatory ID attending physician approval improved the appropriateness of HO-CDI testing, with mandatory ID attending physician approval leading to the highest appropriateness rate. Even with mandatory ID attending physician approval, some tests continued to be ordered inappropriately per retrospective chart review; we suspect that this is partly explained by underdocumentation of criteria such as stool frequency. In healthcare settings where the appropriateness of HO-CDI testing is not optimal, mandatory ID attending physician approval may provide an option beyond clinical decision support tools.
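To illustrate the χ2 comparison of appropriateness rates across the three periods, here is a brief Python sketch using scipy; the baseline and intervention 1 counts are placeholders, as only the 33 inappropriate tests during intervention 2 are reported above.

```python
# Chi-square test of HO-CDI testing appropriateness across the three periods.
# Intervention 2 counts reflect the abstract (33 of 150 inappropriate); the
# baseline and intervention 1 counts are placeholders for illustration only.
from scipy.stats import chi2_contingency

#          appropriate, inappropriate  (per 150 sampled tests)
counts = [
    [ 75,  75],   # baseline (placeholder)
    [ 95,  55],   # intervention 1: decision support tool (placeholder)
    [117,  33],   # intervention 2: mandatory ID approval (from abstract)
]

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.4f}")
```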
To evaluate the effect of the burden of Staphylococcus aureus colonization of nursing home residents on the risk of S. aureus transmission to healthcare worker (HCW) gowns and gloves.
Design:
Multicenter prospective cohort study.
Setting and participants:
Residents and HCWs from 13 community-based nursing homes in Maryland and Michigan.
Methods:
Residents were cultured for S. aureus at the anterior nares and perianal skin. The S. aureus burden was estimated by quantitative polymerase chain reaction detecting the nuc gene. HCWs wore gowns and gloves during usual care activities; gowns and gloves were swabbed and then cultured for the presence of S. aureus.
Results:
In total, 403 residents were enrolled; 169 were colonized with methicillin-resistant S. aureus (MRSA) or methicillin-sensitive S. aureus (MSSA) and comprised the study population; 232 were not colonized and thus were excluded from this analysis; and 2 were withdrawn prior to being swabbed. After multivariable analysis, perianal colonization with S. aureus conferred the greatest odds of transmission to HCW gowns and gloves, and the odds increased with increasing burden of colonization: adjusted odds ratio (aOR), 2.1 (95% CI, 1.3–3.5) for low-level colonization and aOR, 5.2 (95% CI, 3.1–8.7) for high-level colonization.
Conclusions:
Among nursing home patients colonized with S. aureus, the risk of transmission to HCW gowns and gloves was greater from those colonized with greater quantities of S. aureus on the perianal skin. Our findings inform future infection control practices for both MRSA and MSSA in nursing homes.