49.13 The Effect of Hospital Choice on Surgical Outcomes for Medicaid Beneficiaries

J. Claflin1, J. B. Dimick1, D. A. Campbell1, M. J. Englesbe1, K. H. Sheetz1  1University Of Michigan, Department Of Surgery, Ann Arbor, MI, USA

Introduction:

Few studies have evaluated whether poor outcomes for Medicaid beneficiaries are driven by patients’ choice of hospital. We sought to evaluate the association between hospital choice and surgical outcomes in Medicaid patients.

Methods:

We identified 139,566 non-elderly Medicaid and private insurance beneficiaries undergoing general, vascular, or gynecological surgery between 2012 and 2017 using a statewide clinical registry in Michigan, a Medicaid expansion state. We calculated risk-adjusted rates of complications, readmissions, emergency department (ED) visits, and post-acute care utilization using multivariable logistic regression, accounting for patient and procedural factors. We then evaluated whether, and to what extent, the choice of hospital influenced outcome disparities between Medicaid and private insurance beneficiaries.
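As a concrete illustration of the risk-adjustment step described above, the sketch below fits one multivariable logistic model (here for post-discharge ED visits) and averages its predictions by insurance group. It is a minimal sketch assuming a flat analytic file; the file name and column names (ed_visit, medicaid, asa_class, procedure) are hypothetical, not the registry's actual schema.

```python
# Hypothetical sketch of risk adjustment with multivariable logistic regression.
# Column names are illustrative stand-ins, not the registry's real variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("registry_cohort.csv")  # assumed flat extract of the clinical registry

# Outcome ~ insurance type + patient and procedural factors
model = smf.logit("ed_visit ~ medicaid + age + C(asa_class) + C(procedure)", data=df).fit()
print(model.summary())

# Adjusted (model-predicted) ED-visit rates by insurance group
df["p_hat"] = model.predict(df)
print(df.groupby("medicaid")["p_hat"].mean())
```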

Results:

Adjusted rates for all outcomes were higher in Medicaid patients. For example, 14.7% of Medicaid patients returned to the ED after discharge compared to 7.5% of private insurance patients (P<0.01). Hospital choice explained <1.0% of the observed difference in complication rates between Medicaid and private insurance patients. In contrast, hospital choice explained a significant proportion of the disparities in readmissions (29.7%), ED visits (15.2%), and post-acute care utilization (41.2%). While differences in outcomes were similar before and after Medicaid expansion, hospitals’ Medicaid populations changed by -39.9% to +460.0% over the same period.

Conclusion:

Hospital choice accounts for a significant proportion of the disparities in post-discharge resource utilization between Medicaid and privately insured patients. Policies aiming to improve the quality and equity of surgical care for Medicaid beneficiaries should focus on the post-acute care period.

49.14 Identifying Superusers of Postoperative Inpatient Care in an Emergency General Surgery Population

S. A. Brownlee1, H. Ton-That2, M. Anstadt2, R. Gonzalez2, A. N. Cobb1,2, P. C. Kuo1,2, A. N. Kothari1,2  1Loyola University Medical Center, One:MAP Division Of Clinical Informatics And Analytics, Maywood, IL, USA 2Loyola University Medical Center, Department Of Surgery, Maywood, IL, USA

Introduction: Recent attention has focused on the very small percentage of Americans who drive the majority of healthcare spending in this country. Included in this group are patients with prolonged inpatient lengths of stay – potentially accruing months' or even years' worth of hospital charges. The aim of this study was to characterize these superusers of postoperative care in an emergency general surgery population.

Methods: The Healthcare Cost and Utilization Project’s State Inpatient Databases for FL, NY, IA, and WA from 2009-2013 were queried using ICD-9 codes for patients admitted with an emergency general surgery diagnosis and a related operating room procedure. Superusers were defined as patients with a length of stay in the top 0.06% of the population (characterizing them as having a rare condition based on the Rare Diseases Act of 2002). Superusers were compared to patients with an average length of stay to determine baseline differences in demographic, comorbid, and clinical characteristics. Multivariable models were used to identify independent risk factors for classification as a superuser.
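The superuser definition above reduces to a quantile cutoff on length of stay; a minimal sketch, assuming a flat SID extract with a hypothetical los column, is shown below.

```python
# Illustrative sketch of the superuser definition: length of stay in the top 0.06%.
# File and column names are hypothetical, not the actual HCUP SID layout.
import pandas as pd

sid = pd.read_csv("sid_egs_cohort.csv")

cutoff = sid["los"].quantile(1 - 0.0006)   # top 0.06% of length of stay
sid["superuser"] = sid["los"] >= cutoff

print(f"LOS cutoff: {cutoff:.0f} days")
print(sid.groupby("superuser")["los"].agg(["count", "mean", "std", "max"]))
```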

Results: 403,188 patients met our inclusion criteria, with 264 patients qualifying as superusers (0.06%). Superusers had a length of stay ≥131 days and a mean length of stay of 174 days (SD=44.5). The maximum length of stay for a superuser was 359 days. Comparatively, the population mean length of stay was 7 days (SD=9.2), and 20,767 patients ("average users") had a 7-day length of stay. Male sex (OR 1.4, P=0.04), Medicaid insurance (OR 2.0, P=.003), and African American race (OR 2.4, P<.001) were independently associated with classification as a superuser.

Conclusion: Patient factors including sex, insurance, and race are associated with prolonged utilization of inpatient care in an emergency general surgery population. Though superusers account for a small subset of this surgical population, they confer a disproportionate cost burden on the healthcare system. Early identification of these patients can allow targeted interventions to improve the efficiency of care delivery in emergency general surgery.

 

49.12 MELD-Na Score as a Predictor of Postoperative Complications in Elective Hernia Repair

K. A. Schlosser1, A. M. Kao1, M. R. Arnold1, J. Otero1, T. Prasad1, A. E. Lincourt1, K. R. Kasten1, V. A. Augenstein1, B. R. Davis1, B. T. Heniford1, P. D. Colavita1  1Carolinas Medical Center, Division Of Gastrointestinal And Minimally Invasive Surgery, Charlotte, NC, USA

Introduction:
In patients with cirrhosis, the Model for End-Stage Liver Disease Sodium (MELD-Na) score is a validated predictor of transplant and non-transplant surgical outcomes. MELD-Na may also predict outcomes in non-cirrhotic surgical patients, and it has been demonstrated to predict postoperative morbidity and mortality after elective colectomy, including anastomotic leak. The aim of this study was to apply MELD-Na to predict postoperative complications following elective ventral hernia repair.

Methods:
The ACS NSQIP database was queried (2005-2014) for all elective ventral hernia procedures in patients without ascites or esophageal varices. Postoperative complications and outcomes were compared by MELD-Na score using Chi-square tests and multivariate logistic regression analysis, controlling for age, gender, smoking, steroid use, wound class, and other comorbidities.
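For context, MELD-Na is computed from serum creatinine, bilirubin, INR, and sodium. The sketch below implements one widely used formulation (the OPTN/UNOS variant with the usual lab bounds) together with the score categories compared in this study; the abstract does not state which exact variant was applied, so treat the formula as an assumption.

```python
# Hedged sketch of one common MELD-Na formulation (OPTN/UNOS variant); the study's
# exact variant is not stated in the abstract, so this is illustrative only.
import math

def meld_na(creatinine, bilirubin, inr, sodium):
    cr = min(max(creatinine, 1.0), 4.0)      # labs floored at 1.0, creatinine capped at 4.0
    bili = max(bilirubin, 1.0)
    inr = max(inr, 1.0)
    na = min(max(sodium, 125.0), 137.0)      # sodium bounded to 125-137 mmol/L

    meld = 10 * (0.957 * math.log(cr) + 0.378 * math.log(bili)
                 + 1.120 * math.log(inr) + 0.643)
    if meld > 11:
        meld = meld + 1.32 * (137 - na) - 0.033 * meld * (137 - na)
    return round(meld)

def meld_na_category(score):
    # Categories used in the comparisons reported here: <10, 10-14, 15-19, >=20
    if score < 10:
        return "<10"
    if score < 15:
        return "10-14"
    return "15-19" if score < 20 else ">=20"

print(meld_na(creatinine=1.4, bilirubin=1.1, inr=1.2, sodium=131))
```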

Results:
A total of 36,267 elective hernia repairs were identified, with mean age 57.9 ± 13.7 years, mean BMI 33.0 ± 8.7 kg/m2, and 43.7% performed in men. A preoperative MELD-Na score between 10 and 20 was present in 29.8% of patients. The overall major, minor, and wound complication rates were 9%, 10.9%, and 6%, respectively. 70.2% of repairs were performed open (OVHR). In multivariate analysis of OVHR, incremental increases in MELD-Na score (10-14, 15-19, and ≥20) were independently associated with worse outcomes when compared to MELD-Na <10. MELD-Na 10-14 predicted increased 30-day mortality (OR 1.663; CI 1.10-2.52), return to the operating room (OR 1.20; CI 1.00-1.44), major complications (OR 1.27; CI 1.12-1.44), and minor complications (OR 1.30; CI 1.15-1.46). OVHR with MELD-Na 15-19 had higher odds of 30-day mortality (OR 2.89; CI 1.73-4.81), return to the OR (OR 1.38; CI 1.03-1.85), major complications (OR 1.59; CI 1.31-1.92), and minor complications (OR 1.49; CI 1.23-1.79). In laparoscopic repair (LVHR), major complications were increased in patients with MELD-Na 15-19 (OR 2.38; CI 1.41-4.04). With OVHR and LVHR grouped, elevated MELD-Na scores were associated with increases in 30-day mortality, return to the OR, and major and minor complications. Higher MELD-Na scores were associated with increased rates of several poor outcomes (see table); major complications increased by 28% for MELD-Na 10-14 (OR 1.28; CI 1.138-1.144), 59% for MELD-Na 15-19 (OR 1.59; CI 1.31-1.94), and over 100% for MELD-Na ≥20 (OR 2.13; CI 1.51-3.00).

Conclusion:
MELD-Na is independently associated with increased postoperative complications and 30-day mortality in elective laparoscopic and open ventral hernia repair. Preoperative MELD-Na screening in non-cirrhotic patients should be considered.
 

49.09 Utilization of Ultrasound for Primary Inguinal Hernia: Framing the Need for De-Implementation

J. B. Melendez2, S. Manz4, M. Cramer3, A. Kanters1, J. B. Dimick1, D. A. Telem1  1University Of Michigan, Department Of Surgery, Ann Arbor, MI, USA 2University Of Michigan, Medical School, Ann Arbor, MI, USA 3Cornell University, Ithaca, NY, USA 4University Of Michigan, Ann Arbor, MI, USA

Introduction: Numerous studies demonstrate that ultrasound use in diagnosing inguinal hernias is unwarranted. Despite this, ultrasonography is liberally utilized as part of the workup for this condition. To address this practice gap, we sought to understand provider practice patterns and the impact of ultrasound on patient management.

Methods: A single-center, retrospective chart review of 522 patients who presented to the division of minimally invasive surgery from 1/2014-4/2017 was performed. Following the exclusion of patients with a recurrent inguinal hernia, 358 charts were evaluated. Data for patients with and without ultrasound were compared via univariate analysis. Significant variables were then analyzed through multivariate regression models.

Results: Of the 358 patients evaluated for an inguinal hernia, 131 (36.6%) had an ultrasound examination, of which 121 (93.1%) were positive for hernia. Women (p=0.002) and persons younger than 40 (p=0.01) were significantly more likely to undergo ultrasound. Ultrasounds were ordered most frequently by primary care providers in internal and family medicine (76.5%); only 2.3% were ordered by surgeons. The physical exam performed by the ordering provider was available for 87% of patients who underwent an ultrasound, and a positive hernia exam was documented in 44.7%. For patients who did not undergo ultrasound, 82.9% had a positive pre-surgical referral clinical exam (data available for 72.9%). Hernia detection rates on physical exam did not differ between surgeons and referring/ordering providers. A total of 187 patients underwent surgery, of whom 54 had an ultrasound and 134 did not. All but 1 patient had a hernia per the operative records. For patients with positive physical exams, recommendation for operative intervention did not differ between those with and without an ultrasound (75.3% vs. 79.1%, p=0.5). Similarly, among patients with a negative clinical exam, recommendation for operative intervention did not differ between those with and without an ultrasound (14.8% vs. 7.7%, p=0.37). Of the 8 (14.8%) patients with an ultrasound and a negative physical exam who went to surgery, 6 had ultrasounds positive and 2 negative for hernia.

Conclusion: Ultrasound is used as a diagnostic tool in nearly 40% of patients with a primary inguinal hernia, without a clear impact on patient care. Presence of a positive ultrasound did not change patient management, and decisions for operative intervention were predominantly guided by physical exam. The majority of ultrasounds were ordered by providers in a primary care setting. The decision to order an ultrasound appears multifactorial, particularly as nearly half of patients who underwent ultrasound had a documented positive physical exam by the ordering provider.  Further work to understand individual provider behavior is needed to develop effective strategies aimed at de-implementation of ultrasound use for inguinal hernia diagnosis.

 

49.10 Quantifying the Impact of Diabetes Mellitus on Breast Cancer Surgery Outcomes

B. N. Tran1, M. Singh1, A. Doval1, E. C. Levine1, D. Singhal1, B. T. Lee1  1Beth Israel Deaconess Medical Center, Plastic And Reconstructive Surgery, Boston, MA, USA

Introduction:  Diabetes has been identified as one of the comorbidities that negatively affect surgical outcomes. This study aimed to quantify the impact of diabetes on outcomes of breast cancer surgery using the American College of Surgeons National Surgical Quality Improvement Program. 

Methods: Patients with a breast cancer diagnosis were identified from 2005 to 2015. Those who underwent lumpectomy, mastectomy, and breast reconstruction were identified. Complication rates of patients with and without diabetes were compared for each group using Student's t-test. Binary logistic regression was performed to identify risk factors associated with postoperative complications.
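A minimal sketch of the kind of group comparison described above (complication rates with versus without diabetes within one procedure group), here using a chi-square test on a 2x2 table; the counts are placeholders, not NSQIP data.

```python
# Placeholder 2x2 comparison of complication rates by diabetes status.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: diabetes yes/no; columns: complication yes/no (hypothetical counts)
table = np.array([[90, 1910],
                  [450, 17550]])

chi2, p, dof, expected = chi2_contingency(table)
rates = table[:, 0] / table.sum(axis=1)
print(f"complication rate with DM {rates[0]:.1%} vs. without DM {rates[1]:.1%}, p={p:.3g}")
```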

Results: There were 36,001 lumpectomies, 41,539 mastectomies, and 36,740 delayed breast reconstructions. About 2.6% of lumpectomy patients underwent immediate reconstruction, compared to 33.5% of mastectomy patients. The presence of diabetes increased the rate of complications in all groups, in most cases roughly doubling it: lumpectomy without reconstruction (2.7% vs. 4.6%), lumpectomy with reconstruction (4.6% vs. 8.1%), mastectomy without reconstruction (7.1% vs. 10%), mastectomy with immediate reconstruction (8.6% vs. 16.8%), implant-based delayed reconstruction (5% vs. 8.9%), autologous delayed reconstruction (17.5% vs. 26.9%), and autologous with implant delayed reconstruction (8% vs. 16.4%). Diabetic patients on insulin had a higher incidence of complications than those on oral agents only in the mastectomy group (9.8% vs. 13.6%). Independent risk factors for postoperative complications were DM (OR 1.3, p<0.001), smoking (OR 1.3, p<0.001), and ASA class 3 or 4 (OR 1.5, p<0.001). Any level of dependence was an independent risk factor in oncologic resection (OR 2.3, p<0.001).

Conclusion: Diabetes is a significant comorbidity in breast cancer surgery, both in oncologic resection and in reconstruction. Complications associated with implant-based reconstruction may be undercounted because of the 30-day follow-up window of this database. Patients should be counseled about the risks associated with diabetes in breast cancer surgery.

 

49.11 Adoption and Outcomes of Primary Anastomoses with Proximal Diversion for Diverticulitis

B. Resio1, K. Y. Pei1, J. Liang1, Y. Zhang1  1Yale University School Of Medicine, New Haven, CT, USA

Introduction: Complicated diverticulitis is a common indication for colon surgery in Western society. Traditionally, colectomy with end colostomy (Hartmann's procedure, HP) has been the surgical procedure of choice. However, there is accumulating evidence that primary anastomosis with proximal small bowel diversion (PAPD) is safe, even in complex cases. This study seeks to clarify current adoption of PAPD for diverticulitis and compare outcomes between PAPD and HP.

 

Methods: We performed a retrospective review of data from American College of Surgeons National Surgical Quality Improvement Program (NSQIP) participating hospitals. All patients who underwent open HP (CPT 44143) or PAPD (CPT 44140 and 44310) for a primary diagnosis of diverticulitis (ICD-9 562.11, 562.13) between 2005 and 2015 were identified. Primary outcome measures included overall complications, length of stay, and mortality. Outcomes were compared with logistic regression and presented as odds ratios, risk-adjusted for patient and operative characteristics.
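The cohort definition above maps directly onto a filter over diagnosis and procedure codes. Below is a hedged sketch assuming a flattened NSQIP-style file; the column names (PODIAG, CPT, OTHERCPT1) are modeled on the PUF layout but should be treated as assumptions.

```python
# Illustrative cohort definition using the ICD-9 and CPT codes listed above.
import numpy as np
import pandas as pd

puf = pd.read_csv("nsqip_2005_2015.csv", dtype=str)   # hypothetical combined PUF extract

diverticulitis = puf["PODIAG"].isin(["562.11", "562.13"])
hp = diverticulitis & (puf["CPT"] == "44143")                      # Hartmann's procedure
papd = (diverticulitis & (puf["CPT"] == "44140")
        & (puf["OTHERCPT1"] == "44310"))                           # plus the proximal diversion code

cohort = puf.loc[hp | papd].copy()
cohort["procedure"] = np.where(hp.loc[cohort.index], "HP", "PAPD")
print(cohort["procedure"].value_counts())
```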

 

Results: The proportion of PAPD decreased from 64.94% in 2005 to 43.79% in 2015. Most emergency cases underwent HP, and the percentage of HP for emergency cases increased over time from 67.44% in 2005 to 82.77% in 2015. When adjusted for patient characteristics, PAPD was associated with decreased risk of overall complications (OR=0.71, 95%CI: 0.65-0.78) in non-emergency cases (OR=0.65, 95%CI: 0.58-0.72) but not in emergency cases (OR=0.87, 95%CI: 0.75-1.01). PAPD was associated with decreased length of hospital stay (OR=0.41, 95%CI: 0.37-0.46 for 7-10 days of stay; OR=0.23, 95%CI: 0.20-0.25 for longer than 10 days of stay). Decreased mortality (OR=0.76, 95%CI: 0.58-0.99) was seen in non-emergency cases (OR=0.50, 95%CI: 0.32-0.77) but not in emergency cases (OR=0.90, 95%CI: 0.65-1.25). PAPD was associated with a higher risk of return to the operating room in emergency cases (OR=1.28, 95%CI: 1.00-1.64).

 

Conclusion: The adoption of PAPD among NSQIP hospitals was initially rapid but appears to be decreasing. PAPD was associated with decreased complications, length of stay, and 30-day mortality compared to HP in non-emergent cases only. Outcomes were comparable to HP during emergencies; however, there was an increased need to return to the operating room. Caution must be exercised in selecting PAPD during emergent procedures.

49.07 Blood Transfusions Increase Risk of Venous Thromboembolism Following Ventral Hernia Repair

J. H. Helm1, M. C. Helm1, J. C. Gould1  1Medical College Of Wisconsin, General Surgery, Milwaukee, WI, USA

Introduction: Blood transfusions are known to affect the clotting cascade and inflammatory pathways of the circulatory system. This may further increase the risk of venous thromboembolism (VTE), particularly in the setting of a separate prothrombotic stimulus such as major abdominal surgery. VTE, defined as deep vein thrombosis (DVT) or pulmonary embolism (PE), is a serious complication affecting morbidity and mortality outcomes in general surgery patients. The aim of this study was to evaluate the association between bleeding requiring a perioperative blood transfusion and the incidence of postoperative VTE in ventral hernia patients.

Methods: The American College of Surgeons National Surgical Quality Improvement Program (NSQIP) was queried for non-emergent open (n=37,591) and laparoscopic (n=11,895) ventral/incisional hernia repairs (VHR) performed between 2013 and 2015. Univariate and multivariate regression analyses were used to determine factors predictive of VTE development within 30 days of surgery. Additional analyses were conducted to determine the impact of bleeding requiring transfusion on the development of VTE, by surgical approach.
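A hedged sketch of how an adjusted odds ratio such as the transfusion-VTE association reported below might be extracted from a multivariable logistic model; the analytic file and column names are hypothetical.

```python
# Illustrative extraction of odds ratios and 95% CIs from a logistic model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

vhr = pd.read_csv("nsqip_vhr_2013_2015.csv")   # assumed analytic file

fit = smf.logit("vte_30d ~ transfusion + age + optime + C(asa_class) + copd",
                data=vhr).fit()

ors = pd.DataFrame({"OR": np.exp(fit.params),
                    "CI_low": np.exp(fit.conf_int()[0]),
                    "CI_high": np.exp(fit.conf_int()[1])})
print(ors.loc["transfusion"])
```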

Results: The overall rate of bleeding requiring transfusion was 1.5% (1.3% open, 1.9% laparoscopic; p<0.0001). DVT was diagnosed an average of 12.7 days and PE 10.2 days after surgery. There was no difference in the rate of VTE between groups (both 0.4%). Of the patients who had data available, 48.5% (n=176) experienced the VTE after discharge. There was no difference in length of stay between groups; however, patients who underwent open repair were younger (56.3 vs. 57.4 years; p<0.0001) and had shorter operating room times (100.6 vs. 114.5 minutes; p<0.0001). Increased age, operative time, length of stay, and comorbidities including ASA class III/IV, metabolic syndrome, chronic obstructive pulmonary disease (COPD), dialysis, CHF, disseminated cancer, chronic steroid use, bleeding disorder, and transfusion prior to surgery increased perioperative bleeding risk (p<0.05). A single blood transfusion following open VHR increased the risk of venous thromboembolism 8.9-fold (p<0.0001). Predictive risk factors for VTE following transfusion included age, perioperative complication, ASA class III/IV, bleeding disorder, and COPD.

Conclusion: To our knowledge, this is the first study exploring the risk of venous thromboembolism following perioperative blood transfusion in ventral hernia patients. Bleeding and blood transfusion may lead to an increased rate of VTE after surgery through a hypercoagulable state induced by the transfusion, through withholding of chemoprophylaxis in patients who bleed, through a combination of these factors, or through other mechanisms. As nearly half of these thrombotic events are diagnosed after discharge from the hospital, close clinical follow-up and selective outpatient VTE chemoprophylaxis are highly recommended.

 

49.08 Epidural Versus Spinal Block Analgesia After Major Abdominal Operations

D. M. Hall1, S. Christian1, C. Moore1, J. Deneve1, P. Dickson1, R. S. Daugherty1, S. W. Behrman1, M. Kent2, B. Bicknell2, D. Shibata1, L. Douthitt2, E. Glazer1  1University Of Tennessee Health Science Center, Department Of Surgery, Memphis, TN, USA 2Medical Anesthesia Group, Anesthesia, Memphis, TN, USA

Introduction: Minimizing opiate use after major abdominal operations has shown promise as part of enhanced recovery after surgery (ERAS) programs. Alternatives to epidural (EP) analgesia such as intrathecal/spinal (IS) blocks or transversus abdominis plane (TAP) injections often allow a reduction in opioid usage. At our institution, intrathecal/spinal block plus TAP (S-TAP) had not previously been compared to EP in patients undergoing major abdominal operations. We hypothesized that S-TAP injections might provide similar post-operative analgesia, similar pain scores, and a similar amount of opiate use compared to EP and multimodality therapy in patients undergoing major abdominal operations.

Methods: We retrospectively reviewed all patients treated under our ERAS protocol over a 6-month period who underwent major gastrointestinal and hepatopancreatobiliary operations. Patients were included if they received an EP, an S-TAP block, or if the ERAS service was consulted. Pain scores (PS) (scored 0-10 for the first 5 post-operative days), daily opioid use for the first 5 post-operative days, and total length of stay (LOS) were compared among patients with epidurals (bupivacaine), S-TAP (dilaudid and bupivacaine), and a group receiving multimodality oral/intravenous analgesia (acetaminophen, opioids, ketamine, gabapentin, and/or NSAIDs). Patients in the EP and S-TAP groups also received multimodality analgesics. Means were compared with ANOVA and Student's t-test. Multivariate analysis was performed with linear regression.
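A minimal sketch of the ANOVA-plus-t-test comparison described above, applied to POD0 pain scores across the three groups; the score arrays are placeholders, not study data.

```python
# Placeholder comparison of POD0 pain scores across the three analgesia groups.
import numpy as np
from scipy.stats import f_oneway, ttest_ind

pod0_ep = np.array([7, 6, 5, 8, 6, 7])
pod0_stap = np.array([6, 7, 5, 6, 7, 6])
pod0_multi = np.array([9, 8, 8, 9, 7, 10])

f_stat, p_anova = f_oneway(pod0_ep, pod0_stap, pod0_multi)
t_stat, p_pair = ttest_ind(pod0_ep, pod0_multi)
print(f"ANOVA p={p_anova:.3f}; EP vs. multimodality t-test p={p_pair:.3f}")
```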

Results: 120 patients were identified (average age 60 ± 13 years; 53% male). Only those who underwent open operations (n=88) were analyzed, of whom 40 (45%) underwent operations for cancer. There were 27 patients in the EP group, 20 in the S-TAP group, and 41 in the multimodality group. Pain scores were similar among the groups (P>0.12 each day) with the exception of POD0, when the multimodality group PS was 8.5 compared with 6.5 for EP (P=0.03) and 6.36 for S-TAP (P=0.02). The EP group used significantly fewer opiate equivalents (total 70 mg morphine equivalents) compared to the S-TAP group (total 165 mg, P=0.04) and the multimodality group (134 mg, P=0.03). The mean LOS for the S-TAP group (6.75 days) was significantly shorter than for the multimodality group (11 days, P=0.03) but not significantly different from the epidural group (8.5 days, P=0.20).

Conclusion: Patients in the EP, S-TAP, and multimodality groups had similar pain scores and similar lengths of stay but a significant difference in opiate use after major abdominal operations. Based on these results in our institutional population, S-TAP optimization within an ERAS protocol may help provide equivalence to epidural use in select environments and patient populations.

 

49.05 Gastrointestinal Perforations: Examining the Consequences of Antibiotics among Medicare Beneficiaries

V. T. Daniel1, S. Sanders1, D. Ayturk1, B. A. McCormick2, H. P. Santry1  1University Of Massachusetts Medical School, Department Of Surgery, Worcester, MA, USA 2University Of Massachusetts Medical School, Center For Microbiome Research, Worcester, MA, USA

Introduction: Approximately 266 million courses of antibiotics are issued annually in the US, with the growing elderly population consuming a large share of them. Exposure to antibiotics is a modifiable determinant of the microbiota, and nosocomial infections from antibiotic-induced microbiome dysbiosis have been well studied. Alterations in the gut microbiome, possibly through induction of a pro-inflammatory state, are also known to cause gastrointestinal (GI) perforations. However, the effect of antibiotics, as a proxy for microbiome modulation, on the incidence of GI perforations is unknown.

Methods: For this case-control study, we queried a 5% random sample of Medicare beneficiaries (2009-2011) to identify patients ≥65 years with stomach, small intestine, or large intestine perforations. The exposure was previous outpatient antibiotic use (0-30, 31-60, or 61-90 days prior to admission). Cases had GI perforations, while controls did not and were matched by age and sex. Univariate and multivariable regression analyses were performed.
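One way to implement the age- and sex-matched control selection described above (roughly five controls per case, matching the reported 5929:29,645 ratio) is sketched below; the exact matching procedure and column names are assumptions for illustration only.

```python
# Hedged sketch of 1:5 age- and sex-matched control selection.
import pandas as pd

bene = pd.read_csv("medicare_5pct_2009_2011.csv")   # hypothetical beneficiary-level file
cases = bene[bene["gi_perforation"] == 1]
pool = bene[bene["gi_perforation"] == 0]

matched = []
for _, case in cases.iterrows():
    candidates = pool[(pool["age"] == case["age"]) & (pool["sex"] == case["sex"])]
    # take up to 5 controls per case; a full implementation would also remove
    # selected controls from the pool (matching without replacement)
    matched.append(candidates.sample(n=min(5, len(candidates)), random_state=0))

controls = pd.concat(matched)
print(len(cases), len(controls))
```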

Results: We identified 5929 cases and 29,645 matched controls. Compared to matched controls, antibiotic use 0-30 days prior to admission was higher among cases (27% vs. 8%, p<0.0001), as was use 31-60 days prior (8% vs. 6%, p<0.0001) and 61-90 days prior (5.4% vs. 4.7%, p<0.0001). No antibiotic use in the 90 days prior to admission was less common among cases than controls (60% vs. 81%, p<0.0001). Among those who used antibiotics 0-90 days prior to admission, the odds of developing a GI perforation were highest for use 0-30 days prior to admission. Specifically, among all antibiotic users, use of fluoroquinolones was a significant predictor of GI perforation compared to other antibiotic classes (OR 2.1, 95% CI 1.87, 2.37).

Conclusion: Recent antibiotic use increases the odds of developing a GI perforation, with the most recent use (0-30 days) conferring the greatest odds. Exposure to antibiotics, one of the most modifiable determinants of the microbiota, should be minimized in the outpatient setting, which may lower the incidence of GI perforations.

49.06 Modified Frailty Index Predicts Complications and Death after Non-Bariatric Gastrectomies

K. Zorbas1, A. Di Carlo1, A. Karachristos1  1Temple University, Department Of Surgery, Philadelphia, PA, USA

Introduction:  The aim of the present study was to validate the modified frailty index (mFI) as a preoperative risk predictor for postoperative morbidity and mortality after gastrectomy for a non-bariatric disease.

Methods: The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database was queried for patients who underwent total or partial gastrectomy for malignant or benign disease from 2005 to 2011. The modified frailty index was calculated from 11 variables: functional status; diabetes mellitus (DM); chronic obstructive pulmonary disease (COPD) or pneumonia; congestive heart failure (CHF); previous myocardial infarction (MI); previous percutaneous coronary intervention (PCI), cardiac surgery, or angina; hypertension (HTN) requiring medication; peripheral vascular disease or rest pain; impaired sensorium; history of transient ischemic attack (TIA) or cerebrovascular accident without residual deficit; and history of cerebrovascular accident with residual deficit. The population was then divided into four categories based on the mFI score. Thirty-day mortality and postoperative complications were evaluated. Chi-square tests and logistic regression analyses were performed.
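Because the mFI is simply the count of the 11 items above, it can be computed as a row-wise sum over binary indicators, as in the sketch below; the column names are hypothetical stand-ins for the corresponding NSQIP variables.

```python
# Illustrative 11-item modified frailty index and the four categories used here.
import pandas as pd

MFI_ITEMS = [
    "dependent_status", "diabetes", "copd_or_pneumonia", "chf", "prior_mi",
    "prior_pci_cardiac_surgery_or_angina", "htn_on_meds", "pvd_or_rest_pain",
    "impaired_sensorium", "tia_or_cva_no_deficit", "cva_with_deficit",
]  # hypothetical 0/1 columns, one per mFI item

def mfi_category(df: pd.DataFrame) -> pd.Series:
    score = df[MFI_ITEMS].sum(axis=1)
    return pd.cut(score, bins=[-1, 0, 2, 4, 11],
                  labels=["none (0)", "low (1-2)", "intermediate (3-4)", "frail (>=5)"])

# usage: gastrectomy_df["frailty"] = mfi_category(gastrectomy_df)
```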

Results: Overall, 6832 patients underwent gastrectomy for non-bariatric disease. There were 2320 (36.5%) patients with no frailty (mFI 0), 3309 (52.1%) with low frailty (mFI 1-2), 655 (10.3%) with intermediate frailty (mFI 3-4), and 71 (1.1%) frail (mFI ≥5) patients. A higher mFI score was associated with significantly higher rates of mortality (1.5% vs 21.1%, P<0.001), overall morbidity (23.2% vs 66.2%, P<0.001), Clavien IV complications (6.7% vs 46.5%, P<0.001), surgical site infection (12.7% vs 16.9%, P=0.005), and sepsis-related complications (9.4% vs 23.9%, P<0.001). Multivariate analysis identified mFI as an independent predictor of Clavien IV complications (OR 5.04, 95%CI 2.79-9.08, P<0.001) and 30-day mortality (OR 4.63, 95%CI 2.09-10.3, P<0.001).

Conclusion: An increasing mFI score in patients undergoing non-bariatric gastrectomy is associated with a higher incidence of morbidity and mortality. The mFI can be easily calculated preoperatively from the patient's history and adds valuable insight into postoperative mortality and morbidity.
 

49.03 Variations in Nationwide Readmission Patterns After Umbilical Hernia Repair

S. A. Eidelson1, J. Parreco1, M. B. Mulder1, K. G. Proctor1, R. Rattan1  1University Of Miami, Department Of Surgery, Miami, FL, USA

Introduction:

The purpose of this study was to identify risk factors and costs associated with readmission after umbilical hernia repair (UHR) across the United States.

Methods:

The 2013-2014 Nationwide Readmissions Database was queried for patients admitted for UHR. Multivariate logistic regression identified risk factors for 30-day readmission at the index hospital and at different hospitals. Readmission costs were calculated using cost-to-charge ratios.
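The sketch below illustrates the two mechanics described above: flagging 30-day readmissions at the index versus a different hospital using the NRD's visit-linkage variables, and converting charges to costs with cost-to-charge ratios. The variable names follow the public NRD layout (NRD_VisitLink, NRD_DaysToEvent, HOSP_NRD, TOTCHG), but the file and the pre-merged ccr column are assumptions.

```python
# Hedged sketch of 30-day readmission flagging and cost estimation in an NRD-style file.
import pandas as pd

nrd = pd.read_csv("nrd_uhr_2013_2014.csv")   # hypothetical UHR extract with CCRs merged in
nrd = nrd.sort_values(["NRD_VisitLink", "NRD_DaysToEvent"])

# Treat each patient's first admission as the index UHR admission
index_adm = nrd.groupby("NRD_VisitLink", as_index=False).first()
later = nrd.merge(index_adm, on="NRD_VisitLink", suffixes=("", "_idx"))

# Days from index discharge to each subsequent admission
later["days_out"] = later["NRD_DaysToEvent"] - later["NRD_DaysToEvent_idx"] - later["LOS_idx"]
readmit30 = later[(later["days_out"] > 0) & (later["days_out"] <= 30)].copy()
readmit30["different_hospital"] = readmit30["HOSP_NRD"] != readmit30["HOSP_NRD_idx"]
readmit30["cost"] = readmit30["TOTCHG"] * readmit30["ccr"]   # cost = charge x cost-to-charge ratio

print(readmit30.groupby("different_hospital")["cost"].median())
```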

Results:

Of the 105,608 patients admitted for UHR, 8.9% (n=9,372) were readmitted within 30 days. Of those, 15.6% (n=1,461) were readmitted to a different hospital. Of the 18.5% (n=19,569) of patients readmitted within 1 year, 1.2% (n=225) were readmitted for recurrence. Risk factors unique to readmission to a different hospital included: elective initial admission (OR 1.23, p=0.001), age 18-44 years (OR 1.83, p=0.019), and age 45-64 years (OR 1.38, p=0.006). Protective factors unique to readmission to a different hospital included: bowel resection (OR 0.76, p<0.001), medium hospital size (OR 0.67, p<0.001), and large hospital size (OR 0.56, p<0.001). Other risk factors for readmission to a different hospital included: leaving against medical advice (AMA) (OR 5.32, p<0.001), discharge to a skilled nursing facility (SNF) (OR 2.44, p<0.001), discharge with home health care (OR 1.67, p<0.001), Medicare (OR 2.07, p<0.001), Medicaid (OR 1.48, p<0.001), Charlson Comorbidity Index (CCI) ≥2 (OR 1.49, p<0.001), and length of stay (LOS) >7 days (OR 1.68, p<0.001). Risk factors for readmission to the index hospital included: bowel resection (OR 1.28, p<0.001), leaving AMA (OR 1.65, p=0.013), discharge to SNF (OR 1.41, p<0.001), discharge with home health care (OR 1.56, p<0.001), medium hospital size (OR 1.13, p=0.004), large hospital size (OR 1.18, p<0.001), Medicare (OR 1.49, p<0.001), Medicaid (OR 1.35, p<0.001), CCI ≥2 (OR 1.23, p<0.001), and LOS >7 days (OR 1.93, p<0.001). Risk factors for readmission overall included: bowel resection (OR 1.19, p<0.001), leaving AMA (OR 2.30, p<0.001), discharge to SNF (OR 1.56, p<0.001), initial hospitalization at a for-profit hospital (OR 1.11, p=0.022) or metropolitan teaching hospital (OR 1.26, p<0.001), public insurance (Medicare OR 1.58, p<0.001; Medicaid OR 1.37, p<0.001), CCI ≥2 (OR 1.27, p<0.001), and LOS >7 days (OR 1.92, p<0.001). Laparoscopic surgery was protective against readmission (OR 0.92, p=0.001). The median readmission cost was higher for patients readmitted to a different hospital ($9,817 [$5,560-$19,113] vs. $9,129 [$5,183-$16,776], p<0.001). The median cost of recurrence within 1 year was $12,299 [$6,298-$25,120].

Conclusion:

This is the first UHR readmission study using a nationally representative sample that includes readmissions to a different hospital. A significant proportion of readmissions after UHR occur at a different hospital and are not captured by current benchmarking. Compared to readmission to the index hospital, the risk factors are distinct and the costs are higher.

49.04 Surgical Resection of Small Intestinal Neuroendocrine Tumors in Elderly Patients Improves Survival

E. W. Cytryn1, J. J. Aalberg1, D. Dolan1, C. M. Divino1  1Icahn School Of Medicine At Mount Sinai, Division Of General Surgery, Department Of Surgery, New York, NY, USA

Introduction:  The incidence of small intestinal neuroendocrine tumors (siNETs) in the US has been increasing over the past 20 years. The vast majority of these are diagnosed in older adults with a median age near 68. However, there is little in the current literature that outlines a definitive direction of treatment in these patients. The aim of this study was to determine whether elderly patients with siNETs who undergo surgery have different cancer-specific and overall survival outcomes than those who do not. We hypothesized that patients who underwent surgery would have better survival outcomes.  

Methods: We identified patients 65 years and older diagnosed with siNETs from 1998-2014 in the Surveillance, Epidemiology, and End Results (SEER) database. Chi-square tests were performed to determine the factors associated with overall survival. Kaplan-Meier analysis was used to estimate overall and cancer-specific survival functions for both study cohorts, and log-rank tests were used to compare the curves. The effect of surgical intervention on overall and cancer-specific survival was further examined using a multivariable, inverse probability weighted, propensity score-based Cox proportional hazards model.
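A hedged sketch of that survival workflow using the lifelines and statsmodels libraries: Kaplan-Meier estimates with a log-rank test, then a Cox model weighted by inverse probability of treatment from a propensity model. The column names (months, death, surgery, stage, grade), the file, and the weighting details are illustrative, not the study's exact specification.

```python
# Illustrative Kaplan-Meier, log-rank, and IPTW-weighted Cox analysis.
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

seer = pd.read_csv("seer_sinet_65plus.csv")   # hypothetical SEER extract
surg, no_surg = seer[seer["surgery"] == 1], seer[seer["surgery"] == 0]

km = KaplanMeierFitter().fit(surg["months"], surg["death"], label="surgery")
print(km.median_survival_time_)
print(logrank_test(surg["months"], no_surg["months"],
                   surg["death"], no_surg["death"]).p_value)

# Propensity model for surgery, then inverse probability of treatment weights
ps = smf.logit("surgery ~ age + C(stage) + C(grade)", data=seer).fit().predict(seer)
seer["iptw"] = seer["surgery"] / ps + (1 - seer["surgery"]) / (1 - ps)

cph = CoxPHFitter()
cph.fit(seer[["months", "death", "surgery", "iptw"]],
        duration_col="months", event_col="death", weights_col="iptw", robust=True)
cph.print_summary()
```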

Results: We found 4,732 patients with siNETs, 3,877 (81.93%) of whom underwent cancer-directed surgery. In univariate analysis, surgery was found to be significantly associated with an increase in 1-year (80.99% vs. 68.77%; p<.0001) and 5-year overall survival (38.69% vs. 27.25%; p<.0001). Age, primary site, histologic type, grade, and stage were also found to be significantly associated with increased 1-year and 5-year overall survival. Log-rank tests performed on the Kaplan-Meier survival functions indicated a significant increase in overall survival (p<.0001) and cancer-specific survival (p<.0001) for patients receiving surgery. After propensity scoring, patients who underwent surgical resection were found to have a significantly reduced hazard of cancer-specific death (HR: 0.475; 95% CI: 0.422-0.535; p<.0001) as well as overall death (HR: 0.659; 95% CI: 0.611-0.711; p<.0001).

Conclusion: Elderly patients who underwent surgical resection of siNETs had better cancer-specific and overall survival than patients who did not. Though surgery can be contraindicated in older patients, the results of this study support the surgical management of siNETs and suggest this intervention as the standard of treatment.

 

49.02 Emergent Laparoscopic Ventral Hernia Repairs (LVHR)

A. M. Kao1, C. R. Huntington1, J. Otero1, T. Prasad1, V. Augenstein1, A. E. Lincourt1, P. D. Colavita1, B. T. Heniford1  1Carolinas Medical Center, Gastrointestinal And Minimally Invasive Surgery, Charlotte, NC, USA

Introduction:
Despite its popularity in the elective setting, few studies have examined the feasibility of laparoscopy in emergent VHR, much less its potential benefits. The aim of this study was to examine national quality improvement program data for LVHR in the emergent setting.

Methods:
Patients who underwent emergent VHR from 2005-2014 were identified in the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database. Descriptive and univariate analyses were used to compare demographics, comorbidities, and 30-day outcomes by surgical approach. Multivariate linear and logistic regression models were used to adjust for differences in baseline patient characteristics; the models included variables that were statistically significant (p<0.05) on univariate analysis, including age, preoperative sepsis, and comorbidities.

Results:
A total of 8,826 patients underwent emergency VHR; 953 (10.8%) were performed laparoscopically and 7,873 (89.2%) underwent open repair. Laparoscopic VHR (LVHR) in the emergent setting increased over time, from 10.5% in 2009 to 14.0% in 2014. Additionally, LVHR was performed more often by an attending with a resident than by an attending alone (43.5% vs 37.0%, p<0.0001). Patients who underwent open VHR (OVHR) were older (59.7±14.8 vs 55.5±15.2 years, p<0.0001) and more likely to be female (65.6% vs 60.7%, p<0.002), smokers (20.7% vs 17.7%, p<0.03), and ASA class >3 (69.8% vs 54%, p<0.0001), with more comorbidities including diabetes, COPD, and chronic steroid use. There was no difference in BMI or functional status. Preoperative sepsis was more common in OVHR (28.5% vs 16.6%, p<0.0001) and was associated with a 40% increase in postoperative wound complications (95%CI 1.02-1.65). The 30-day mortality rate after emergent VHR was 2.7% overall: 2.9% after OVHR and 1.0% after LVHR. Patients who underwent LVHR had a readmission rate of 6.9% and a reoperation rate of 2.7%, compared to 10.1% (p<0.0007) and 5.1% (p<0.0005) after OVHR. After multivariate regression to control for confounding factors, LVHR was independently associated with 63% fewer surgical site infections (95%CI 0.35-0.93) and a 56% decrease in wound complications (95%CI 0.24-0.81) compared to OVHR. Patients who underwent OVHR were 1.8 times more likely to have a minor complication (95%CI 1.08-2.86); however, there were no statistically significant differences in 30-day mortality, major complications, septic shock, or reoperation between LVHR and OVHR. Total hospital length of stay was 1.6 days longer (standard error 0.79, p<0.04) in patients who underwent OVHR.

Conclusion:
Emergent LVHR is more often utilized in younger, less comorbid patients. When controlling for these factors, LVHR offers a decrease in wound complications. OVHR otherwise appears to be an appropriate option in the emergent setting. Long-term data concerning recurrence, mesh complications, and other issues are needed.
 

48.20 ACS NSQIP Surgical Risk Calculator: Pilot Analysis on Feasibility in an Academic Safety Net Hospital

A. M. Jensen1, M. L. Crandall1, B. K. Yorkgitis1, J. H. Ra1  1University Of Florida, Department Of Surgery, Jacksonville, FL, USA

Introduction: There is a growing movement in today's health care environment not only to tie reimbursements for patient care to outcomes, but also to publicly report these results. Hospitals are looking for effective, simple methods to track patient outcomes that are risk-adjusted for the characteristics of their patient populations. This is especially relevant for safety net institutions serving high-risk populations. One program with these goals is the American College of Surgeons National Surgical Quality Improvement Program® (ACS NSQIP®). NSQIP data power a preoperative risk calculator that allows clinicians to input an individual patient's risk factors into a statistical model that calculates the likelihood of various outcomes. This is an institution-based, retrospective quality audit in which we determined the presence and consistency of charted data required to compute perioperative risk in the ACS NSQIP risk calculator.

Methods: A retrospective chart review of 30 randomly selected elective major colorectal procedures performed at an urban, academic safety net hospital between January 1, 2015 and December 31, 2015 was conducted. For each case, it was determined in a yes/no format whether the required NSQIP variables were readily available in the pre-operative documentation. The collected data were then analyzed to determine the presence and consistency of charted data required to compute perioperative risk in the ACS NSQIP risk calculator.
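The audit's central measure (what fraction of the calculator's required inputs are documented in each chart) reduces to a row-wise completeness calculation; a minimal sketch under an assumed chart-abstraction layout follows, with an illustrative subset of variables rather than the calculator's full list.

```python
# Illustrative per-chart documentation completeness for risk-calculator variables.
import pandas as pd

REQUIRED_VARS = ["functional_status", "asa_class", "steroid_use", "ascites", "sepsis",
                 "ventilator", "disseminated_cancer", "diabetes", "hypertension",
                 "chf", "dyspnea", "smoker", "copd", "dialysis", "bmi"]  # illustrative subset

charts = pd.read_csv("chart_audit.csv")   # one row per chart, 1 = variable documented
charts["pct_documented"] = charts[REQUIRED_VARS].mean(axis=1) * 100

print((charts["pct_documented"] >= 50).mean())   # share of charts with >=50% documented
```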

Results: Of the 30 reviewed patient charts, none (n=0) contained all of the pre-operative risk documentation required to complete an ACS NSQIP risk analysis. Only 23.3% (n=7) of charts had ≥50% of the required data, while 96.6% (n=29) had ≤55% of the required data. Pre-operative risk variable documentation was scattered throughout patients' charts in a largely non-uniform fashion, performed to varying degrees by multiple providers, and often lacked definitive documentation of pre-operative interventions to modulate risk based on patient risk factors.

Conclusion: Pre-operative risk assessment and charting practices at the safety net hospital reviewed were fragmented and often lacked the data needed to properly risk-assess patients in the pre-operative period. Even when risk was being assessed, the documentation required for outcomes assessment under current reimbursement models, such as the MACRA Quality Payment Program, was lacking. At safety net hospitals especially, where high-risk patients often have multiple comorbidities and socioeconomic barriers, we must implement means to consistently risk-stratify patients so that observed outcomes can be correlated with pre-operative risk. Future research should address the applicability and possible deficiencies of NSQIP in predicting postoperative complications in these safety net institutions.

 

49.01 Admission of Patients with Biliary Disease to a Surgery Service Decreases Length of Stay

R. C. Banning1, N. R. Bruce1, W. C. Beck1, J. R. Taylor1, M. K. Kimbrough1, J. Jensen1, M. J. Sutherland1, R. D. Robertson1, K. W. Sexton1  1University Of Arkansas For Medical Sciences, Department Of Surgery, Little Rock, AR, USA

Introduction:
The prevalence of gallbladder disease in the United States has been estimated at 20.5 million adults. Two major complications of biliary disease are biliary pancreatitis and acute cholangitis, both of which may lead to shock and death. It is therefore important that biliary disease is promptly diagnosed and that definitive treatment – i.e., surgery or endoscopy – is not delayed. Previous studies have shown that patients at intermediate risk for choledocholithiasis who proceeded directly to surgery had a shorter median length of stay than patients who received endoscopic therapy first. Therefore, in patients for whom there is a high clinical index of suspicion for choledocholithiasis, direct admission to a surgical service may further decrease length of stay, reduce the number of imaging and interventional procedures needed, and thereby reduce the cost of the hospital stay. If a patient is admitted to a primary medicine service, however, they may experience unnecessary delay before surgical intervention.

Methods:
A retrospective study was conducted at an academic medical center of all patients admitted with biliary disease from May 2014 to May 2017. Diagnoses were grouped into the following categories: cholecystitis, choledocholithiasis, symptomatic cholelithiasis, cholangitis, congenital abnormalities, biliary fistula, and unspecified biliary disease. As part of a hospital-wide protocol, biliary disease patients were admitted to a primary surgical service beginning 10/1/2015. Outcomes were compared to medical service admissions prior to the institution of this protocol. Data were obtained from the enterprise data warehouse, and an intention-to-treat analysis was performed. Aside from admission to a surgical service, no additional protocols or care pathway changes were implemented. The primary outcome measure was length of stay.
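A minimal sketch of the pre/post length-of-stay comparison implied above; the abstract does not name the statistical test, so the Welch t-test here is an assumption, and the column names are placeholders for the data-warehouse extract.

```python
# Placeholder comparison of length of stay by admitting service.
import pandas as pd
from scipy.stats import ttest_ind

adm = pd.read_csv("biliary_admissions_2014_2017.csv")
surgical = adm.loc[adm["primary_service"] == "surgery", "los_days"]
medical = adm.loc[adm["primary_service"] == "medicine", "los_days"]

t, p = ttest_ind(surgical, medical, equal_var=False)   # Welch's t-test (assumed)
print(surgical.mean(), medical.mean(), p)
```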

Results:
The data set consisted of 12,945 admissions for 4,317 unique patients. Average length of stay was significantly shorter in the primary surgical service admission group (1.9 ± 4.6 days; n=4505) than in the primary medical service admission group (2.4 ± 6.0 days; n=8440), p<0.0001. When evaluated by diagnosis, the decrease in LOS was statistically significant for cholangitis (3.4 ± 6.9 days; n=636 pre-protocol, n=378 post-protocol; p=0.0191), cholecystitis (1.8 ± 4.5 days; n=4380 pre-protocol, n=2141 post-protocol; p=0.0015), and symptomatic cholelithiasis (3.5 ± 8.7 days; n=1206 pre-protocol, n=251 post-protocol; p<0.0001).

Conclusion:
This single-institution study suggests that admission to a primary surgical service may reduce length of stay for patients admitted with biliary disease.
 

48.18 Current Trends of Pilonidal Disease at a Veterans Administration Hospital: A 12-Year Experience

O. Renteria1, H. Cunningham1, V. Jain1, M. S. Sultany1, M. Ruiz1, S. Huerta1,2  1University Of Texas Southwestern Medical Center, Surgery, Dallas, TX, USA 2VA North Texas Health Care System, Surgery, Dallas, TX, USA

Introduction: Pilonidal disease (PD) primarily affects young adult men, with the highest prevalence in the second decade of life. The Veterans Administration (VA) patient population is composed primarily of older men. The aim of this study was to investigate the incidence and outcomes of veteran patients who underwent surgical treatment of PD.

Methods: A retrospective review of all patients treated for pilonidal disease at the VA North Texas Health Care System over the past 12 years (2005-2017) was performed. Univariate and multivariate analyses were undertaken to analyze outcomes.

Results: 122 patients were identified, with mean age 41.4 ± 17.6 years; 95.0% were male, 73.8% were White, mean BMI was 33.1 ± 7.5 kg/m2, and 30.3% were ASA class III-IV. Recurrence and complication rates were 10.7% and 22.1%, respectively. Most patients (65.6%) underwent excision with primary closure. The most common complications were wound dehiscence and infection. No comorbid condition was identified as an independent predictor of complications or recurrence. Longer operative time (OR 1.037, 95% CI: 1.001-1.075) and older age (OR 1.037, 95% CI: 1.000-1.076) emerged as the only independent predictors of complications. No significant difference in complications or recurrence was found between primary closure and healing by secondary intention. There was no difference in the complication rate between colorectal surgeons and general surgeons performing the surgery (p = 0.13), but recurrence was higher for colorectal surgeons (28.0%) than for general surgeons (6.2%; p = 0.005).

 

Conclusion: The incidence of PD in veteran patients is low, but it is associated with substantial morbidity and recurrence. In this small cohort, no comorbid factor was found to predict complications or recurrence, but longer operative time and older age were predictors of complications. Surgical outcomes were similar to those of previous studies; however, no surgical method was identified as an optimal standard, and management of veteran patients presenting with PD should be individualized.
 

48.19 A Feasible and Alternative Approach for the Treatment of Internal Hemorrhoids

N. HAYASHI1  1Kinan Hospital, Surgery, Tanabe, WAKAYAMA, Japan

Introduction:
Sclerotherapy with aluminum potassium sulfate and tannic acid (ALTA) has been widely accepted for the treatment of internal hemorrhoids because of its effectiveness and low invasiveness. However, it is often difficult to maneuver an anoscope for the precise 4-step injection in conventional ALTA injection therapy.

Methods: Sixty-eight consecutive patients with symptomatic Goligher grade III or IV internal hemorrhoids were enrolled and divided into two equal groups: one treated with endoscopic injection (EI) of ALTA and the other with conventional anoscopic injection (AI). In each case, ALTA was injected directly into 4 points of each internal hemorrhoid (4-step injection) to induce sclerosis and remission of the hemorrhoids. Surgical outcomes and postoperative complications were retrospectively compared between the two groups.

Results: The overall median operative time was shorter in the EI group (19±12 min) than in the AI group (27±15 min) (P<0.05). Postoperative pain was significantly less in the EI group than in the AI group during the first two postoperative days. In the AI group, three patients required a longer hospital stay for local pain. No significant difference was observed in other complications between the two groups.

Conclusion: ALTA therapy with an endoscope for the treatment of symptomatic internal hemorrhoids has advantages in terms of operative time and immediate post-operative pain. This method may be a good alternative to conventional ALTA therapy with a rigid anoscope.

 

48.16 Practice Management Guidelines: Treatment of Cardiovascular Implantable Device Pocket Infections

E. Buckarma1, M. Mohan1, L. Baddour2, T. Earnest1, H. Schiller1, E. Loomis1  1Mayo Clinic, Department Of Surgery, Rochester, MN, USA 2Mayo Clinic, Department Of Infectious Diseases, Rochester, MN, USA

Objective: Treatment of cardiovascular implantable device pocket infections (CIDPI) requires a multimodal approach that includes antimicrobials, device explantation, and local wound care. In 2013, our institution implemented a Practice Management Guideline (PMG) to standardize the care of CIDPI and engage our acute care surgeons. Our PMG includes wound culture, complete capsulectomy, pulse lavage, and placement of a negative pressure wound therapy device at the time of device extraction. Forty-eight hours later, wounds are irrigated and closed in a delayed primary fashion over drains. Our objective was to compare the outcomes of patients who underwent cardiovascular device extraction before and after implementation of the PMG for the treatment of CIDPI.

Methods: An IRB-approved retrospective review was performed of 155 patients at our institution who underwent cardiovascular device explantation from 2012-2015. Outcomes evaluated included days from device explant to wound closure and post-operative complications (hematoma, surgical site infection, unplanned return to the OR). Outcomes data were analyzed prior to (Group A) and after (Group B) enactment of the PMG.

Results: 58 patients (Group A: 42 male, 16 female; mean age 68) were managed prior to PMG implementation and 97 (Group B: 72 male, 25 female; mean age 67) were managed after. Mean days from device explantation to wound closure were compared (Group A, 6 ± 3.5; Group B, 2.6 ± 1.8); time to closure was reduced by 3 days in Group B (p<0.05). No increase in surgical site infection, unplanned return to the OR, or hematoma was demonstrated between groups (p<0.05).

Conclusion: The implementation of a PMG was effective in reducing the number of days to pocket wound closure; acute care surgeons are well equipped to participate in this practice and improve patient outcomes.

 

48.17 The Impact of Intravenous Acetaminophen After Abdominal Surgery on Pain: A Meta-Analysis

J. J. Blank1, N. G. Berger1, J. P. Dux1, F. Ali1, K. A. Ludwig2, C. Y. Peterson2  1Medical College Of Wisconsin, General Surgery, Milwaukee, WI, USA 2Medical College Of Wisconsin, General Surgery, Colorectal Division, Milwaukee, WI, USA

Introduction: Pain management after surgery relies heavily on opioids despite known adverse effects. While opioid medications are typically the cornerstone of any pain control plan, they come at a cost. Opioids have a profound impact on postoperative gastrointestinal (GI) motility via activation of mu-opioid receptors in the small and large intestines. Slowed gastrointestinal motility not only causes discomfort and pain but also nausea. In fact, opioid medications have been shown to increase the incidence of postoperative ileus, a disruption in normal intestinal peristalsis, which can result in longer hospital stays, increased incidence of complications, and decreased patient satisfaction. Additionally, according to the American Society of Addiction Medicine, four of five new heroin users started out misusing prescription pain medications. A multimodal strategy for pain management is therefore optimal. There are limited data on the effectiveness of intravenous acetaminophen in comparison to other non-opioid analgesics after abdominal surgery.

Methods: PubMed, Scopus, and Cochrane databases were queried for the keywords acetaminophen, intravenous (IV), and postoperative. Included studies were prospective, had a comparison group receiving an alternate medication, used IV acetaminophen for at least 24 hours, and evaluated adult patients undergoing any trans-abdominal intraperitoneal surgery. Outcomes evaluated were study quality, demographic data, surgical technique, postoperative pain scores, and postoperative narcotic consumption. A random-effects analysis of mean differences (MD) was performed, and heterogeneity was assessed using the I2 statistic.
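For reference, random-effects pooling of mean differences is commonly done with the DerSimonian-Laird estimator; a compact, self-contained sketch (with placeholder inputs rather than the extracted study data) is shown below.

```python
# DerSimonian-Laird random-effects pooling of mean differences with I^2.
import numpy as np

def random_effects(md, var):
    md, var = np.asarray(md, float), np.asarray(var, float)
    w = 1.0 / var                                   # fixed-effect weights
    pooled_fixed = np.sum(w * md) / np.sum(w)
    q = np.sum(w * (md - pooled_fixed) ** 2)        # Cochran's Q
    k = len(md)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)              # between-study variance
    w_star = 1.0 / (var + tau2)                     # random-effects weights
    pooled = np.sum(w_star * md) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

print(random_effects([-0.2, 0.1, -0.4], [0.04, 0.09, 0.05]))   # placeholder inputs
```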

Results: Seventeen articles including 1,595 patients were identified. Overall study quality was moderate (mean Jadad score = 5.6 [±1.9], range 0-8). There was no difference in 24-hour pain scores or narcotic consumption between acetaminophen and the alternative analgesics overall (MD -0.10 [-0.33, 0.14], p=0.42, I2=91%; MD -3.93 [-9.12, 1.25], p=0.14, I2=99%, respectively). Subgroup analysis showed reduced 24-hour narcotic consumption for NSAIDs compared to acetaminophen (MD 11.18 [10.40, 11.96], p<0.001, I2=0%). For open surgery, analysis demonstrated reduced 24-hour narcotic consumption for acetaminophen compared to alternative medications (MD -7.29 [-13.41, -1.16], p=0.02, I2=99%). There were no differences in 24-hour pain scores among subgroups.

Conclusion: Both NSAIDs and acetaminophen show benefit in reducing 24-hour narcotic consumption in patients undergoing abdominal surgery, with moderate-quality evidence and significant heterogeneity among studies. Given the current nationwide opioid addiction crisis, there is great need to investigate alternative methods of pain control, such as NSAIDs and acetaminophen, in the postoperative setting.

 

48.13 Reducing Cost and Improving Operating Room Efficiency: Examination of Surgical Instrument Processing

A. Dyas1, K. Lovell1, C. Balentine1, T. Wang1, J. Porterfield1, H. Chen1, B. Lindeman1  1University Of Alabama at Birmingham, Department Of Surgery, Birmingham, AL, USA

Introduction:  Operating room efficiency can often be compromised due to delays in processing surgical instruments. We observed that many instruments included in head and neck trays were not routinely used during thyroid and parathyroid surgery at our institution, which increases costs and decreases efficiency. Our objective was to create a streamlined instrument tray to optimize operative efficiency and cost.

Methods: Head and neck surgical instrument trays were evaluated by operating room team leaders. Instruments within the tray were identified as either necessary or unnecessary based on their use during thyroidectomies and parathyroidectomies. The operating room preparation time, tray weights, number of trays, and number of instruments were recorded for both the head and neck tray and the new thyroidectomy tray. Cost savings were calculated using Navigant, with an estimated reprocessing cost of $0.51 per instrument.

Results: Three of thirteen existing head and neck trays were converted to thyroidectomy/parathyroidectomy trays at no additional hospital cost. Unnecessary instruments were returned to stock for use on other surgical trays. The starting head and neck surgical set was reduced from two trays with 108 total instruments to one tray with 36 instruments. Each operation using the new tray saved $36.72 in reprocessing costs ($55.08 to $18.36). Projected savings to the hospital were over $27,000 annually for instrument processing alone. Additional unmeasured savings include decreased instrument wear and replacement frequency, quicker operating room setup, and decreased decontamination costs. Tray weight decreased from 27 pounds to 10 pounds, and tray preparation time decreased from 8 minutes to 3 minutes.
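The per-case figures above follow directly from the $0.51-per-instrument reprocessing estimate; the arithmetic below reproduces them, with the annual case volume shown only as a hypothetical round number consistent with the stated ~$27,000 projection.

```python
# Reprocessing-cost arithmetic; annual volume is a hypothetical assumption.
COST_PER_INSTRUMENT = 0.51
old_cost = 108 * COST_PER_INSTRUMENT      # $55.08 for the two original trays
new_cost = 36 * COST_PER_INSTRUMENT       # $18.36 for the streamlined tray
per_case_savings = old_cost - new_cost    # $36.72 per operation

annual_cases = 750                        # hypothetical volume -> ~$27,540 per year
print(per_case_savings, per_case_savings * annual_cases)
```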

Conclusion: Optimizing thyroidectomy/parathyroidectomy trays can reduce cost, physical strain, preparation time, decontamination time, and processing time. Streamlining surgical trays is an effective strategy for hospitals to reduce costs and increase operating room efficiency.