71.09 Trends in Liver Transplantation with Older Liver Donors in the United States

C. E. Haugen1, X. Luo1, C. Holscher1, J. Garonzik-Wang1, M. McAdams-DeMarco1,2, D. Segev1,2  1Johns Hopkins University School Of Medicine,Surgery,Baltimore, MD, USA 2Johns Hopkins University Bloomberg School Of Public Health,Epidemiology,Baltimore, MD, USA

Introduction:  As the United States population ages, older liver donors (OLDs) represent a potential expansion of the donor pool. Historically, grafts from OLDs have been associated with poor outcomes and higher rates of discard, but some recent studies have reported outcomes equivalent to those of grafts from younger donors. We hypothesized that use of grafts from OLDs has increased over time and sought to identify trends in OLD demographics, discard, and outcomes.

Methods: We identified OLDs (aged ≥70) and liver-only OLD graft recipients from 1/1/2003-12/31/2016 in the Scientific Registry of Transplant Recipients. We studied temporal changes in OLD graft characteristics, utilization, and recipient characteristics. The Cuzick test of trend was used to evaluate OLD graft use over time, and Cox proportional hazards models were used to estimate adjusted mortality and graft loss for OLD graft recipients.
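
For illustration, a minimal sketch of the era-effect Cox model is shown below in Python using the lifelines package; the file name, column names (transplant_year, time_years, death, recipient_age, meld_score), and abbreviated covariate set are hypothetical placeholders rather than actual SRTR variables.

```python
# Minimal sketch of the era-effect Cox model; all names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("old_recipients.csv")  # placeholder file, one row per OLD-graft recipient

# Keep the two eras being compared (reference era: 2003-2006).
df = df[df["transplant_year"].between(2003, 2006) | df["transplant_year"].between(2013, 2016)]
df["era_2013_2016"] = (df["transplant_year"] >= 2013).astype(int)

cph = CoxPHFitter()
cph.fit(
    df[["time_years", "death", "era_2013_2016", "recipient_age", "meld_score"]],
    duration_col="time_years",  # follow-up time
    event_col="death",          # 1 = died, 0 = censored
)
cph.print_summary()  # exp(coef) for era_2013_2016 corresponds to the adjusted hazard ratio
```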

Results: Since 2003, 3350 transplants with OLD grafts have been performed. However, the annual percentage of OLD transplants among all adult liver transplants has decreased from 6.0% to 3.2% (p=0.001), and 12-25% of all recovered OLDs were discarded each year. Compared with OLDs in 2003, recently transplanted OLDs were more likely to have shorter cold ischemia times and to come from non-Caucasian donors with higher BMI and anoxia or head trauma as the cause of death. Recent OLD recipients were more likely to be older and less likely to be listed as Status 1 or to receive shared organs. Graft and patient survival for recipients of OLD grafts have improved since 2003 (Figure). For recipients of OLD grafts from 2013-2016, mortality was 60% lower (aHR: 0.40, 95% CI: 0.31-0.52, p<0.001) and all-cause graft loss was 55% lower (aHR: 0.45, 95% CI: 0.36-0.57, p<0.001) than for OLD recipients from 2003-2006.

Conclusion: Up to 25% of OLDs are discarded annually across the US, and the number of OLD transplants performed has been decreasing. However, graft and patient survival for OLD recipients have improved significantly since 2003. Particularly in the setting of an aging population, these improved outcomes can guide OLD use and reduce OLD discard, potentially expanding the donor pool.
 

71.10 Survival after the Introduction of the Lung Allocation Score In Simultaneous Lung-Liver Recipients

K. Freischlag2, M. S. Mulvihill1, P. M. Schroder1, B. Ezekian1, S. Knechtle1  1Duke University Medical Center,Surgery,Durham, NC, USA 2Duke University Medical Center,School Of Medicine,Durham, NC, USA

Introduction:
The optimal management of patients with combined lung and liver failure is uncertain. Simultaneous lung and liver transplantation (LLT) confers a survival benefit over remaining on the waitlist for transplantation. In 2005, the lung allocation score (LAS) was introduced and significantly reduced waitlist and overall mortality in single-organ lung transplantation. The current system for simultaneous LLT generally matches a recipient with a donor based on his or her LAS, which often results in a relatively low MELD score at transplantation compared with liver-alone transplantation. However, the current impact of the LAS on LLT is unknown. To ascertain whether the current lung allocation system has improved survival in this cohort, we studied LLT before and after the introduction of the LAS.

Methods:
The OPTN/UNOS STAR file was queried for adult recipients of simultaneous LLT. Demographic characteristics were subsequently generated and examined. Kaplan-Meier analysis with the log-rank test compared survival between groups. A hazard ratio was generated based on the presence of LAS alone.
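
A minimal sketch of the Kaplan-Meier and log-rank comparison is given below (Python, lifelines); the data file and column names (has_las, years, died) are hypothetical stand-ins for the STAR-file extract.

```python
# Sketch of the pre- vs. post-LAS survival comparison; names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("llt_recipients.csv")  # placeholder path
pre, post = df[df["has_las"] == 0], df[df["has_las"] == 1]

kmf = KaplanMeierFitter()
for label, grp in [("pre-LAS", pre), ("post-LAS", post)]:
    kmf.fit(grp["years"], event_observed=grp["died"], label=label)
    print(label, kmf.median_survival_time_)

# Log-rank test between the two eras.
result = logrank_test(pre["years"], post["years"],
                      event_observed_A=pre["died"], event_observed_B=post["died"])
print(result.p_value)
```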

Results:
A total of 64 recipients of simultaneous LLT were identified as suitable for analysis. Of those, 10 underwent transplant prior to the introduction of the LAS. Comparatively, those without an LAS had a higher mean FEV1 (48.22 vs 29.56, p=0.012), higher mean creatinine at transplant (1.22 vs 0.73, p=0.001), a higher percentage diagnosed with primary pulmonary hypertension (40% vs 0%, p=0.004), and an earlier mean transplant year (1999.4 vs 2011.17, p<0.001). Survival was significantly lower in the LLT cohort before the introduction of the LAS compared with the cohort after the LAS (Figure 1; 1-year: 50.0% vs 83.3%, 5-year: 40.0% vs 67.5%, 10-year: 20.0% vs 55.6%, p=0.0073). Presence of an LAS was a predictor of decreased mortality (OR 0.051, 95% CI 0.006-0.436, p=0.007).

Conclusion:
LLT is a rare procedure, and most national analyses have included patients before and after the introduction of the LAS. Our study shows that survival in combined lung-liver transplantation after the introduction of the lung allocation score was significantly improved and that the presence of an LAS was a predictor of decreased mortality. While many factors contributed to the changes in mortality, the cohorts before and after the introduction of the LAS are significantly different and should be treated as such when conducting future studies in simultaneous thoracic and abdominal organ allocation.
 

71.08 Kidney Paired Donation Programs Don't Become Concentrated with Highly Sensitized Candidates Over Time

C. Holscher1, K. Jackson1, A. Thomas1, C. Purcell2, M. Ronin2, A. Massie1, J. Garonzik Wang1, D. Segev1  1Johns Hopkins University,Baltimore, MD, USA 2National Kidney Registry,New York, NY, USA

Introduction: In order to utilize a willing but incompatible living donor, transplant candidates must either proceed with incompatible living donor kidney transplantation or attempt to find a more compatible match using kidney paired donation (KPD). For the latter, the benefit of a "better" match must be balanced against the morbidity and mortality associated with increased dialysis time while searching for a match. A common criticism of KPD registries is that "easy-to-match" candidates match and leave the registry pool quickly, thereby creating a pool concentrated with difficult-to-match patients and making future KPD matches challenging. We hypothesized that, given alternative treatments such as deceased donor kidney transplant priority and desensitization, this concern would no longer be the case.

Methods: We studied 1894 registrants to the National Kidney Registry (NKR), the largest KPD registry in the United States (US), between 2011 and 2015. Candidates were considered part of the KPD registry pool from the year they registered and remained in the pool until they matched into a KPD transplant or were removed from the registry for other reasons, such as death, receipt of a deceased donor kidney transplant, or incompatible living donor transplant. The prevalent composition of the NKR pool was compared across years by age, gender, race/ethnicity, body mass index (BMI), ABO blood type, and panel reactive antibody (PRA) category. Comparisons were made with chi-square tests, ANOVA, and t-tests, as appropriate.
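
The year-by-year comparison of pool composition could be run roughly as sketched below (Python, scipy); the extract and its column names (pool_year, pra_category) are assumptions, with one row per candidate-year in the prevalent pool.

```python
# Sketch of the composition-by-year comparison; file and column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

pool = pd.read_csv("nkr_pool_years.csv")  # placeholder: one row per candidate-year

# Rows: PRA category (0, 1-79, 80-97, 98+); columns: prevalent pool year.
table = pd.crosstab(pool["pra_category"], pool["pool_year"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```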

Results: Candidates were median age 50 (IQR: 38-60) years, 48% female, 66% white, and had a median BMI of 27 (23-31). Overall, 59% of candidates had blood type O, 24% had blood type A, 15% had blood type B, and 2% had blood type AB. The mean PRA was 53 with 29% having a PRA of 0, 29% having a PRA of 1-79, 18% having a PRA 80-97, and 24% having a PRA 98 or higher. Notably, there were no statistically significant differences by year in age, gender, race, BMI, blood type, or PRA. Further, there were no statistically significant changes by year in the composition of the pool by PRA category (Figure).  

Conclusion: In the largest KPD registry in the US, there is no evidence that KPD registrants have become more difficult to match over time. This should encourage continued enrollment of incompatible donor/recipient pairs in KPD registries to facilitate compatible transplantation.

71.06 Initial Experience with Closed Incision Negative Pressure Wound Therapy in Kidney Transplant Patients

C. J. Lee1, G. Sharma1, C. Blanco1, A. Bhargava1, S. Patil1, F. Weng2, S. Geffner1,2, H. Sun1,2  1Saint Barnabas Medical Center,Department Of Surgery,Livingston, NJ, USA 2Saint Barnabas Medical Center,Renal And Pancreas Transplant Division,Livingston, NJ, USA

Introduction: Although the kidney is the most commonly transplanted organ, renal transplantation is still associated with several complications, one of the most frequent being surgical site infection (SSI). This study examines the effectiveness of closed incision negative pressure wound therapy (NPWT) and antimicrobial foam dressings in preventing SSI in kidney transplant patients.

Methods:  A retrospective chart review was performed on 78 patients who received a kidney transplant from June 2015 to August 2015. Recipients were divided into high risk (body mass index over 30, on hemodialysis over 5 years, or pre-transplant diabetes mellitus) and low risk. High-risk patients were dressed with closed incision NPWT, and low-risk patients with antimicrobial foam. All patients received antibiotics within 30 minutes of incision. The incision was closed in multiple layers and the skin was closed with surgical staples. Both dressings were applied intraoperatively and removed 5 to 7 days postoperatively. Wounds were evaluated until 30 days postoperatively. The primary end point of the study was SSI, as defined by the Centers for Disease Control and Prevention. Univariate analysis was performed using the chi-square test for categorical and continuous variables. A p-value of .05 or less was considered statistically significant. The SPSS 20.0 statistical package was used for data evaluation.

Results: NPWT was used in 39 patients, of whom 17 received a living donor kidney transplant (LDKT) and 22 received a deceased donor kidney transplant (DDKT). Antimicrobial foam dressing was applied in 39 patients, of whom 21 received a LDKT and 18 received a DDKT. Eight patients in the antimicrobial foam group underwent re-operation, none of which was for wound infection. Six patients in the NPWT group underwent re-operation, 2 of which were for wound infection. The SSI rate in high-risk patients at the same institution in 2014 was 22%, which was reduced to 12.8% in patients from June to August 2015, after implementation of the NPWT dressing. The SSI rate in low-risk patients in 2014 was 6%, which was reduced to 0% after implementation of the antimicrobial foam dressing. On univariate analysis, BMI, diabetes, and peri-operative transfusion were significantly different between the two groups. On multivariate analysis, there were no independent factors associated with wound infection (Table 1).

Conclusion: Based on our initial experience, closed incision NPWT and antimicrobial foam dressings, in addition to standard perioperative measures, have shown encouraging results in reducing surgical site infections for high- and low- risk renal transplant recipients.

 

71.07 Early Hypertension, Diabetes, and Proteinuria After Kidney Donation: A National Cohort Analysis

C. Holscher1, S. Bae1, M. Henderson1, S. DiBrito1, C. Haugen1, A. Muzaale1, A. Massie1, J. Garonzik Wang1, D. Segev1  1Johns Hopkins University,Baltimore, MD, USA

Introduction:  Living kidney donors (LKDs) are at greater risk of end stage renal disease (ESRD) than the general population. While late post-donation ESRD is more likely due to hypertension (HTN) or diabetes (DM), early post-donation ESRD is often secondary to glomerulonephritis and is associated with proteinuria. Better understanding of the prevalence of and risk factors for early post-donation proteinuria, HTN, and DM will improve LKD follow-up care.

Methods:  Using SRTR data, we identified 41260 LKDs who underwent donor nephrectomy from 2008-2014 with follow-up data included through 2017. Given the high loss to follow-up (59% missing proteinuria, 33% missing HTN, and 31% missing DM), sensitivity analyses were done using inverse probability weighting (IPW) and multiple imputation by chained equations (MICE). Multiple logistic regression models were used to compare risk factors for proteinuria, HTN, and DM.
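
As a rough illustration of the inverse probability weighting used in the sensitivity analysis, the sketch below models the probability that two-year proteinuria was reported and reweights the observed donors; the column names and covariates are hypothetical examples, not SRTR fields.

```python
# Simplified IPW sketch for missing follow-up; all names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

lkd = pd.read_csv("lkd_followup.csv")  # placeholder path
lkd["observed"] = lkd["proteinuria"].notna().astype(int)

# Model the probability that 2-year proteinuria was reported, using baseline
# donor characteristics assumed to predict loss to follow-up.
fu_model = smf.logit("observed ~ age + bmi + female + donation_year", data=lkd).fit(disp=0)
lkd["ipw"] = 1.0 / fu_model.predict(lkd)

# Weighted prevalence among donors with observed follow-up.
obs = lkd[lkd["observed"] == 1]
weighted_prev = (obs["proteinuria"] * obs["ipw"]).sum() / obs["ipw"].sum()
print(f"IPW-weighted proteinuria prevalence: {weighted_prev:.2%}")
```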

Results: Among LKDs, 1.70% (1.55-1.86) had HTN, 0.09% (0.06-0.13) had DM, and 5.47% (5.11-5.84) had proteinuria at two years post-donation. Sensitivity analyses revealed similar estimates for HTN and DM, but slightly higher estimates of proteinuria [6.44% (6.09-6.84) with IPW and 6.72% (6.35-7.23) with MICE]. HTN was more likely in older (for each 10 years, aOR: 1.49, 1.34-1.66), more obese (for each 5 BMI units, aOR: 1.34, 1.17-1.53), and hypertensive (for each 10 mmHg, aOR: 1.45, 1.35-1.56) LKDs. HTN was less likely in LKDs who had donated more recently (by year, aOR: 0.94, 0.90-1.00), were female (aOR: 0.78, 0.63-0.97), were Hispanic/Latino (reference: white; aOR: 0.64, 0.43-0.94), or were not related to the recipient (aOR: 0.73, 0.58-0.93). DM was more likely in LKDs who were Hispanic/Latino (aOR: 3.85, 1.39-10.64) and had higher BMIs (aOR: 1.93, 1.13-3.28). Proteinuria was more likely in LKDs who had higher BMIs (aOR: 1.23, 1.12-1.36) and in African American LKDs (aOR: 1.85, 1.48-2.32) and Hispanic/Latino LKDs (aOR: 1.51, 1.21-1.88) relative to white LKDs. Proteinuria was less likely in LKDs who were older (aOR: 0.83, 0.77-0.90), female (aOR: 0.76, 0.64-0.89), or not related to their recipient (aOR: 0.83, 0.70-0.99; Table).

Conclusion: The low early post-nephrectomy prevalence of HTN, DM, and proteinuria in LKDs is reassuring and suggests risk of ESRD is limited to a small proportion of LKDs. Improved understanding of which LKDs are at risk for these conditions might improve pre-donation risk stratification and counseling as well as post-donation prevention of ESRD. 

71.05 Arterial, but Not Venous, Reconstruction Increases Morbidity and Mortality in Pancreaticoduodenectomy

S. L. Zettervall1, J. Holzmacher1, T. Ju1, G. Werba1, B. Huysman1, P. Lin1, A. Sidawy1, K. Vaziri1  1George Washington University School Of Medicine And Health Sciences,Surgery,Washington, DC, USA

Introduction:  Vascular reconstruction during pancreaticoduodenectomy is increasingly utilized to improve pancreatic cancer resectability. However, very few multi-institutional studies have evaluated the morbidity and mortality of arterial and venous resection with reconstruction during this procedure.

Methods:  A retrospective analysis of prospectively collected data was performed utilizing the targeted pancreas module of the National Surgical Quality Improvement Program (NSQIP) for patients undergoing pancreaticoduodenectomy from 2012-2014. Demographics, comorbidities, and 30-day outcomes were compared among patients who underwent venous reconstruction, arterial reconstruction, or no vascular reconstruction. Multivariate analysis was utilized to adjust for differences in demographic and operative characteristics.
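
A simplified version of the adjusted 30-day mortality model might look like the sketch below (Python, statsmodels); the extract, covariates, and variable names are hypothetical rather than actual NSQIP field names.

```python
# Sketch of a multivariable logistic model for 30-day mortality; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

whipple = pd.read_csv("nsqip_whipple.csv")  # placeholder path

# recon: "none" (reference), "venous", or "arterial"
model = smf.logit(
    "mortality_30d ~ C(recon, Treatment(reference='none')) + age + chf + copd + preop_chemo",
    data=whipple,
).fit(disp=0)

odds_ratios = np.exp(model.params)   # adjusted odds ratios
conf_int = np.exp(model.conf_int())  # 95% confidence intervals
print(pd.concat([odds_ratios, conf_int], axis=1))
```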

Results: 3002 patients were included in NSQIP over the study period: 384 with venous reconstruction, 52 with arterial reconstruction, and 2566 without. Patients who underwent both venous and arterial reconstruction were excluded (N=81). Compared to patients without reconstruction, those with venous reconstruction had more congestive heart failure (0.2% vs. 1.8%, P<.01), and those with arterial reconstruction had higher rates of pulmonary disease (11.5% vs. 4.5%, P=0.02). Pre-operative chemotherapy was more common in both venous (34% vs 12%, P<.01) and arterial reconstruction (21% vs 12%, P<.04). On multivariate analysis, there was no increase in morbidity or mortality following venous reconstruction compared to no reconstruction. In contrast, arterial reconstruction was associated with increased 30-day mortality (odds ratio [OR]: 6.7, 95% confidence interval [CI]: 1.8-25). Morbidity was also increased, as reflected by return to the operating room (OR: 4.5, 95% CI: 1.5-15), pancreatic fistula (OR: 4.4, 95% CI: 1.7-11), and reintubation (OR: 3.9, 95% CI: 1.1-14).

Conclusion: These findings suggest that venous reconstruction may be considered to improve survival in patients previously thought to be unresectable due to venous involvement. Careful consideration should be given prior to arterial reconstruction, given the significant increase in perioperative complications and death within 30 days of the operation.

 

71.04 Significance of repeat hepatectomy for intrahepatic recurrence of HCC within Milan criteria

T. Gocho1, Y. Saito1, M. Tsunematsu1, R. Maruguchi1, R. Iwase1, J. Yasuda1, F. Suzuki1, S. Onda1, T. Hata1, S. Wakiyama1, Y. Ishida1, K. Yanaga1  1Jikei University School Of Medicine,Department Of Surgery,Minato-ku, TOKYO, Japan

Introduction: The standard treatment strategy for intrahepatic recurrence (IHR) of hepatocellular carcinoma (HCC) within the Milan criteria (MC) after primary hepatic resection differs between Western countries and Japan. In Western countries, salvage liver transplantation (ST) is reported to have good results, while in Japan repeat hepatectomy is usually the treatment of choice for patients with good hepatic reserve. The aim of this study was to evaluate the prognostic impact of IHR of HCC within MC and to identify factors related to IHR within MC.

Methods: Between April 2003 and December 2015, 218 patients were treated with primary resection for HCC at Jikei University Hospital. Of those, 118 patients who developed IHR were retrospectively reviewed, and the significance of the following clinicopathological factors was assessed: patient factors (age, sex, viral status, background liver), primary and recurrent tumor factors (size, number, macroscopic portal vein invasion), treatment modality, and 5-year overall survival after recurrence (5-y OS).

Results: Median age was 68 years (range 29-90), and 107 patients (91%) were male. Sixty-eight patients (58%) developed IHR within MC, and 37 of them (54%) were treated with repeat hepatectomy. With a median follow-up of 64.6 months, IHR within MC showed significantly better 5-y OS (74%) than IHR beyond MC (22%) (p < 0.001). The 5-y OS of patients with IHR within MC treated with repeat hepatectomy was 85%, which compares favorably with the reported 5-y OS of ST. By univariate analysis, patients with IHR within MC had higher rates of HBV positivity (p = 0.034), tumor size greater than 5 cm (p < 0.001), and macroscopic PV invasion (p = 0.041). By multivariate analysis, independent prognostic factors consisted of tumor size greater than 5 cm (p = 0.041), macroscopic PV invasion (p = 0.027), and repeat hepatectomy (p < 0.001).

Conclusion: In selected patients, IHR within MC after primary liver resection for HCC can be treated with repeat hepatectomy with good outcomes as compared with ST.

71.03 So Many Pancreatic Cystic Neoplasms, So Little Known About Their Natural History

F. F. Yang1, M. M. Dua2, P. J. Worth2, G. A. Poultsides3, J. A. Norton3, W. G. Park4, B. C. Visser2  1Stanford University,School Of Medicine,Palo Alto, CA, USA 2Stanford University,Hepatobiliary & Pancreatic Surgery,Palo Alto, CA, USA 3Stanford University,Surgical Oncology,Palo Alto, CA, USA 4Stanford University,Gastroenterology & Hepatology,Palo Alto, CA, USA

Introduction: Pancreatic cystic neoplasms (PCNs) are a frequent incidental finding on imaging performed for indications unrelated to the pancreas. Guidelines for management of PCNs are largely based on surgical series; important aspects of their natural history are still unknown. The purpose of this study was to characterize which PCNs can be safely observed.

Methods: A retrospective study of patients who either underwent immediate resection of a PCN (within 6 weeks of presentation) or observation with at least two imaging studies between 2004-2014 was performed. Descriptive statistics and multiple logistic regression analyses were performed to determine predictors of premalignancy and malignancy.

Results:  Of the 1151 patients in this study, 66 (5.7%) underwent immediate surgery while 1085 patients had surveillance with a median follow-up of 15.5 months (mean 24.7, SD 25.6). Of the observed patients, 183 (16.9%) demonstrated radiographic progression, while the majority (83.1%) did not progress. Eighty-four (7.6%) of the observed patients eventually underwent surgery for concerning features at a median of 8.0 months until resection (mean 18.1, SD 26.1). The risk of malignancy among patients undergoing immediate surgery was 65%. The risk of developing malignancy during the first 12 months of surveillance was 5.3%, while the risk of malignancy decreased with further surveillance time (TABLE).

Multiple logistic regression demonstrated that among all patients, jaundice (OR=36.3, 95% CI=5.96-221, p<0.0001), initial cyst size >3.0cm (OR=5.14, 95% CI=1.13-23.5, p=0.035), solid component (OR=2.96, 95% CI=1.04-8.42, p=0.042), and main pancreatic duct (MPD) dilation >5mm (OR=4.18, 95% CI=1.18-14.9, p=0.27) were independent predictors of premalignancy or malignancy. Among observed patients, jaundice (OR=13.9, 95% CI=1.48-130.3, p=0.021), unintentional weight loss (OR=8.03, 95% CI=1.59-40.5, p=0.012), radiographic progression (OR=3.42, 95% CI=1.28-7.91, p=0.004), and MPD dilation >5mm (OR=4.99, 95% CI=1.24-20.0, p=0.023) were independent predictors of premalignancy or malignancy.

Conclusion: Relatively few pancreatic cystic lesions progress to malignancy during surveillance, especially beyond a time frame of one year. However, the risk of transformation does persist after 5 years of follow-up. This understanding of the natural history, predictors of malignancy, and especially the timeframe of transformation of PCN to either carcinoma-in-situ or invasive adenocarcinoma is important for counseling of patients undergoing surveillance.

71.01 Damage Control Pancreatic Débridement: Salvaging the Most Severely Ill

T. K. Maatman1, A. Roch1, M. House1, A. Nakeeb1, E. Ceppa1, C. Schmidt1, K. Wehlage1, R. Cournoyer1, N. Zyromski1  1Indiana University School Of Medicine,Indianapolis, IN, USA

Introduction:  Damage Control Laparotomy is a widely accepted practice in trauma surgery. We have applied this approach selectively to severely ill patients requiring open pancreatic débridement. Damage Control Débridement (DCD) is a novel, staged approach to pancreatic débridement; we sought to evaluate outcomes associated with this technique.

Methods:  A retrospective review evaluated 75 consecutive patients undergoing open pancreatic débridement between 2006 and 2016. Data were prospectively collected in our institutional Necrotizing Pancreatitis Database. Twelve patients undergoing DCD were compared with 63 undergoing single-stage débridement (SSD). Independent two-group t-tests and Pearson's correlations or Fisher's exact tests were performed to analyze the bivariate relationships between DCD and suspected factors defined pre- and post-operatively. P-values of <0.05 were accepted as statistically significant.
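
The bivariate comparisons could be carried out roughly as in the sketch below (Python, scipy); the data file and column names (approach, albumin, pancreatic_fistula) are hypothetical examples, using a t-test for a continuous factor and Fisher's exact test for a 2x2 categorical comparison.

```python
# Sketch of the DCD vs. SSD bivariate comparisons; names are hypothetical.
import pandas as pd
from scipy.stats import fisher_exact, ttest_ind

np_db = pd.read_csv("necrotizing_pancreatitis.csv")  # placeholder path
dcd = np_db[np_db["approach"] == "DCD"]
ssd = np_db[np_db["approach"] == "SSD"]

# Continuous factor (e.g., preoperative albumin): two-sample t-test.
_, p_albumin = ttest_ind(dcd["albumin"], ssd["albumin"], nan_policy="omit")

# Binary outcome (e.g., pancreatic fistula): Fisher's exact test on the 2x2 table.
table = pd.crosstab(np_db["approach"], np_db["pancreatic_fistula"])  # assumes 2x2
_, p_fistula = fisher_exact(table.values)

print(f"albumin p={p_albumin:.3f}, fistula p={p_fistula:.3f}")
```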

Results: Patients treated by DCD were more severely ill globally. DCD patients had a higher incidence of preoperative organ failure, need for ICU admission, and higher APACHE II scores (table), as well as more profound malnutrition (albumin DCD=1.9 g/dL, SSD=2.5 g/dL; p=0.03). Indications for DCD included hemodynamic compromise (n=4), medical coagulopathy (n=4), or a combination (n=4). Six of 12 DCD patients required more than one subsequent débridement prior to definitive abdominal closure (mean number of total débridements=2.6; range 2-4). Length of stay (DCD=43.8, SSD=17.1, p<0.01) and ICU stay (DCD=20.8, SSD=5.9, p<0.01) were longer in DCD patients. However, no difference was seen in the rate of readmission (DCD=42%, SSD=41%, p=0.90) or repeat intervention (any: DCD=58%, SSD=33%, p=0.10; endoscopic: DCD=17%, SSD=11%, p=0.59; percutaneous drain: DCD=42%, SSD=19%, p=0.09; return to OR after abdominal closure: DCD=0%, SSD=13%, p=0.20). The DCD group had a decreased rate of pancreatic fistula (DCD=33%, SSD=65%, p=0.04). Overall mortality was 2.7%; no significant difference in mortality was observed between DCD (8%) and SSD (2%), p=0.19.

Conclusion: Despite having substantially more severe acute illness, necrotizing pancreatitis patients treated with damage control débridement had morbidity and mortality equivalent to those of patients undergoing elective single-stage pancreatic débridement. Damage control débridement is an effective technique with which to salvage severely ill necrotizing pancreatitis patients.

 

71.02 Rapid Growth Speed of Cyst was a Predictive Factor for Malignant Intraductal Papillary Mucinous Neoplasms

K. Akahoshi1, N. Chiyonobu1, H. Ono1, Y. Mitsunori1, T. Ogura1, K. Ogawa1, D. Ban1, A. Kudo1, M. Tanabe1  1Tokyo Medical And Dental University,Hepato-Biliary-Pancreatic Surgery,Bunkyo-ku, Tokyo, Japan

Introduction:
Intraductal papillary mucinous neoplasms (IPMN) are cystic tumors of the pancreas that can progress to invasive cancer and are being discovered with increasing frequency. Currently, the timing of surgical treatment is determined based on the international consensus guideline. However, pre-operative risk stratification for malignant IPMN is still difficult, and novel predictors of the malignant potential of IPMN are needed.

Methods:
This was a retrospective, single-center study of IPMN patients who underwent surgical resection between 2005 and 2015; 81 patients were enrolled. Clinical and pathological data were collected and analyzed, and differences between benign and malignant IPMN were compared. Malignant IPMN was defined as the presence of high-grade dysplasia or invasive cancer on pathological diagnosis of the resected specimen.

Results:
Of the 81 patients, 46 showed benign (low- to intermediate-grade dysplasia) and 35 showed malignant IPMN. Malignant IPMN was present in 28% of patients with branch duct type (10/36), 55% with combined duct type (17/31), and 57% with main duct type (8/14). Fifty-nine percent (24/41) of patients with high-risk stigmata and 27% (10/37) with worrisome features exhibited malignant IPMN. High-risk stigmata significantly correlated with malignant potential (p=0.006). Next, the cyst growth speed of branch duct type and combined type patients with at least 2 contrast-enhanced imaging studies was measured. The average cyst growth speed of benign and malignant IPMN patients was 0.979 ± 1.796 mm/year and 6.933 ± 2.958 mm/year, respectively (p<0.001).
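
For clarity, the growth-speed calculation from serial imaging can be illustrated as below; the patients, dates, and sizes are made-up examples, and the rate is simply the change in cyst diameter divided by the years between the first and last scan.

```python
# Illustration of cyst growth speed (mm/year) from serial imaging; data are invented.
import pandas as pd

scans = pd.DataFrame({
    "patient_id": [1, 1, 2, 2],
    "scan_date": pd.to_datetime(["2010-01-15", "2012-07-15", "2011-03-01", "2013-03-01"]),
    "cyst_size_mm": [18.0, 21.0, 25.0, 39.0],
})

def growth_speed(group: pd.DataFrame) -> float:
    """Change in cyst size between first and last scan, in mm per year."""
    group = group.sort_values("scan_date")
    years = (group["scan_date"].iloc[-1] - group["scan_date"].iloc[0]).days / 365.25
    return (group["cyst_size_mm"].iloc[-1] - group["cyst_size_mm"].iloc[0]) / years

print(scans.groupby("patient_id").apply(growth_speed))
```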

Conclusion:
Rapid cyst growth, along with high-risk stigmata, was a predictive factor for malignant IPMN. Evaluation of cyst growth speed could help optimize the treatment strategy for IPMN patients.
 

70.10 A Decade of Components Separation Technique: An Increasing Trend in Open Ventral Hernia Repair

M. R. Arnold1, J. Otero1, K. A. Schlosser1, A. M. Kao1, T. Prasad1, A. Lincourt1, P. D. Colavita1, B. T. Heniford1  1Carolinas Medical Center,General Surgery,Charlotte, NC, USA

Introduction:

Components separation technique (CST), a complex surgical adjunct to ventral hernia repair (VHR), was originally described by Ramirez et al. more than 25 years ago. Reports of CST have increased over the last several years, but no studies to date have examined trends in CST utilization and associated complications over time. The purpose of this study is to examine trends in CST use over the past 10 years.

Methods:

Open VHRs with components separation performed from 2005 to 2014 were identified in the ACS-NSQIP database. Preoperative risk factors, operative characteristics, outcomes, and morbidity trends were compared. Univariate analysis of outcomes and morbidity was performed. Multivariate analysis was performed to control for potential confounding variables.

Results:

A total of 129,532 patients underwent open VHR during the study period. CST was performed as part of 8,317 ventral hernia repairs. Use of CST increased from 39 cases in 2005 to 2,275 cases in 2014 (2.6% vs 10.2%; p<0.0001). Over the past decade, preoperative smoking and dyspnea significantly decreased (p<0.05). Decreases were also seen in superficial wound infection (10.3% vs 5.7%; p<0.05), deep wound infection (7.7% vs 3.4%; p<0.05), all wound-related complications (18.0% vs 10.2%; p<0.05), minor complications (18.0% vs 13.4%; p<0.0001), and major complications (25.6% vs 12.8%; p<0.0001). Hospital length of stay decreased (11.0 vs 6.3; p<0.05), and hospital readmissions decreased from 2011 to 2014 (14.4% vs 11.1%; p<0.05). There was no significant change in thirty-day mortality (range 0-1.42%; p=0.28). Multivariate regression was performed to control for pre-operative comorbidities; there was an overall decrease in wound dehiscence for 2011 (OR 0.3, 95% CI 0.1-0.9), 2012 (OR 0.2, 95% CI 0.1-0.7), 2013 (OR 0.2, 95% CI 0.0-0.6), and 2014 (OR 0.2, 95% CI 0.1-0.7). There was no significant change in major or minor complications, wound infection, or mortality.

Conclusion:

CST in VHR has significantly increased in frequency over the past 10 years. As experience with CST has increased, there has been a significant decrease in the rate of associated complications. When adjusted for preoperative risk factors, the risk of wound dehiscence decreased, whereas the rate of other complications remained unchanged. This suggests that preoperative patient optimization and improvements in patient selection and modifiable risk factors, such as smoking, rather than changes in surgical technique, have led to improved outcomes. Due to the limitations of the NSQIP database, changes in chronic disease management, such as of diabetes, may be overlooked. Additional studies are needed to further elucidate the reasons for this decrease in complications.

 

70.08 Don’t Get Stuck. A Quality Improvement Project to Reduce Perioperative Blood-Borne Pathogen Exposure

J. P. Gurria1,3, H. Nolan1, S. Polites1, K. M. Arata4, A. Muth5, L. Phipps4, R. A. Falcone1,2,3  1Cincinnati Children’s Hospital Medical Center,General And Thoracic Surgery,Cincinnati, OH, USA 2University Of Cincinnati,Surgery,Cincinnati, OH, USA 3Cincinnati Children’s Hospital Medical Center,Trauma Surgery,Cincinnati, OH, USA 4Cincinnati Children’s Hospital Medical Center,Operating Room,Cincinnati, OH, USA 5Cincinnati Children’s Hospital Medical Center,Occupational Safety And Environmental Health,Cincinnati, OH, USA

Introduction:
Blood-borne pathogen exposure (BBPE) among health care employees represents a significant safety and resource burden, with over 380,000 events reported annually across hospitals in the United States. The perioperative environment is a high-risk area for BBPE, and efforts to reduce exposures are not well defined. The prevalence of blood-borne pathogen infection among patients is often under-appreciated, and the risk of BBPE is therefore underestimated. A multi-disciplinary group of nurses, surgical techs, surgeons, and employee health specialists worked collaboratively to develop and implement a BBPE prevention bundle to reduce exposures and therefore the overall number of Occupational Safety and Health Administration (OSHA) recordable cases.

Methods:
A BBPE prevention bundle including mandatory double gloving, a safety zone on the sterile field, engineered sharps injury prevention devices on all needles, and clear communication around passing of sharps was created in an evidence-based, collaborative fashion at our institution and implemented for all perioperative staff. After an initial pilot period from January through July 2016 with one surgical division, the bundle was spread throughout our entire perioperative system. We monitored exposures both as days between exposures and as the total number of exposures, comparing the baseline period of fiscal year (FY) 2015 to the post-implementation periods of FY 2016 and FY 2017. Analysis by specialty, role, location, type of injury, and timing was performed.
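
The "days between exposures" metric can be computed as in the brief sketch below (Python, pandas); the event dates are invented examples, not actual OSHA case data.

```python
# Days between consecutive reported exposures; dates are invented examples.
import pandas as pd

events = pd.Series(pd.to_datetime([
    "2016-07-03", "2016-07-20", "2016-09-02", "2016-11-15",
])).sort_values()

days_between = events.diff().dt.days.dropna()
print(days_between.tolist())   # gaps between consecutive exposures
print(days_between.mean())     # mean days between exposures
```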

Results:
During the study period, the number of surgical procedures remained relatively constant (35,000/FY). During the pre-implementation period, 45 OSHA recordable cases were reported. During the implementation year (FY16), cases decreased to 38 (a 15% decrease); in the late post-implementation period (FY17), only 22 cases were reported (an additional 42% decrease), for a total decrease of 52% (p<0.022). The mean number of days between injuries increased significantly, from 2.5 to 16.3, over the study period (FIGURE 1). For FY17, the main cause and type of BBPE was a needle stick while suturing (63%). By role, clinical fellows and attendings combined had the most injuries (54.6%). Among divisions, pediatric surgery (18%), operating room staff (18%), and orthopedics (13.6%) had the most events.

Conclusion:
A comprehensive and multi-disciplinary approach to employee safety, focused on reduction of BBPE, resulted in a significant, progressive annual decrease in injuries among perioperative staff. Efforts to change the culture of safety and implement a successful bundle into a complex environment benefited from the support and diversity of a widely representative team.
 

70.09 The Association Between Travel Distance, Institutional Volume, and Outcomes for Rectal Cancer Patients

M. Cerullo1, M. Turner1, M. A. Adam1, Z. Sun1, J. Migaly1, C. Mantyh1  1Duke University Medical Center,Department Of Surgery,Durham, NC, USA

Introduction: Although the relationship between surgical volume and outcomes has been well studied, travel to higher-volume centers remains a significant barrier for patients. While travel to higher-volume centers is associated with improved outcomes for other cancers, it remains unclear whether it is associated with improved 30- and 90-day mortality, better long-term survival, or a higher likelihood of undergoing appropriate surgery in patients with rectal cancer.

Methods: Patients with rectal adenocarcinoma (stages I-III) with a single primary tumor were identified in the 2011-2014 National Cancer Database (NCDB). Patients were further categorized into quartiles by distance traveled, while centers were categorized into volume quartiles. Multivariable logistic regression and Cox proportional hazards regression models were used to compare outcomes between patients in the highest quartile of travel distance treated at the highest volume centers (travel) with patients in the lowest quartile of travel distance treated at the lowest volume centers (local).
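
The quartile categorization might be implemented roughly as sketched below (Python, pandas); the NCDB extract and column names (travel_distance_miles, facility_id) are hypothetical, and facility volume here is simply each center's case count in the extract.

```python
# Sketch of distance and volume quartiles; file and column names are hypothetical.
import pandas as pd

ncdb = pd.read_csv("ncdb_rectal.csv")  # placeholder path

# Patient-level quartiles of travel distance.
ncdb["distance_q"] = pd.qcut(ncdb["travel_distance_miles"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Center-level quartiles of case volume over the study period.
volume = ncdb.groupby("facility_id").size()
volume_q = pd.qcut(volume, 4, labels=["Q1", "Q2", "Q3", "Q4"]).rename("volume_q")
ncdb = ncdb.join(volume_q, on="facility_id")

# "Travel": highest distance quartile at highest-volume centers;
# "local": lowest distance quartile at lowest-volume centers.
travel = ncdb[(ncdb["distance_q"] == "Q4") & (ncdb["volume_q"] == "Q4")]
local = ncdb[(ncdb["distance_q"] == "Q1") & (ncdb["volume_q"] == "Q1")]
print(len(travel), len(local))
```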

Results: Exactly 3,088 patients in the lowest quartile of travel distance treated at low-volume centers and 3,071 patients in the highest quartile of travel distance treated at high-volume centers were identified. Overall, 38.2% of patients had stage III disease (35.3% of short-distance/low-volume patients vs. 41.1% of greater-distance/high-volume patients, p<0.001), and 63.6% received neoadjuvant radiation (57.7% of short-distance/low-volume patients vs. 69.6% of greater-distance/high-volume patients, p<0.001). After adjustment for disease severity and receipt of adjuvant therapy, patients who traveled greater distances to high-volume centers had a 71% lower 30- and 61% lower 90-day mortality (30-day: OR=0.29, 95% CI: 0.15–0.57, p<0.001; 90-day: OR=0.39, 95% CI: 0.24–0.62, p<0.001), as well as lower risk of overall mortality (Hazard ratio=0.78, 95% CI: 0.68–0.88, p<0.001). These patients also were more likely to have adequate lymph node harvest (OR=1.83, 95% CI: 1.64–2.05, p<0.001) and less likely to have positive margins (OR=0.76, 95% CI: 0.59–0.96, p=0.02). However, these patients also had 42% greater odds of being readmitted after surgery (OR: 1.42, 95% CI: 1.14–1.75, p=0.001).

Conclusion: Traveling greater distances to high-volume centers reduces 30- and 90-day mortality and overall risk of death, and improves pathologic surrogates for appropriate surgery in rectal cancer patients. As patients may often be faced with choosing between obtaining care at local lower-volume centers and traveling to higher-volume centers, these findings provide an impetus for facilitating travel for patients to higher-volume centers.

70.06 Neoadjuvant Radiation for Locally Advanced Colon Cancer: A Good Idea for a Bad Problem?

A. T. Hawkins1, T. M. Geiger1, M. Ford1, R. L. Muldoon1, B. Hopkins1, L. A. Kachnic1, S. Glasgow2  1Vanderbilt University Medical Center,Nashville, TN, USA 2Washington University,Colon And Rectal Surgery,St. Louis, MO, USA

Introduction: Compared with lower tumor stages, resection of locally advanced colon cancer (LACC) is associated with poor survival, and few strategies are available to address this disparity. Data on the ability of neoadjuvant radiation therapy (NRT) to improve resectability and survival are lacking. We hypothesized that NRT would result in increased R0 resection rates, decreased multivisceral resection rates, and improved overall survival in patients with LACC.

Methods: The National Cancer Database (NCDB 2004-2014) was queried for adults with clinical evidence of LACC (defined as clinically evident T4 disease prior to surgery) who underwent curative resection. Patients with metastatic disease or in whom clinical staging data was unavailable were excluded.  Patients were divided into two groups – those who underwent NRT and those that did not.  Bivariate and multivariable analyses were used to examine the association between NRT and R0 resection rate, multivisceral resection and overall survival.  

Results: 15,207 patients with clinical T4 disease who underwent resection were identified over the study period; 195 (1.3%) underwent NRT. The majority of patients in the NRT group received 4500 cGy of radiation in 25 fractions over five weeks (range: 3900-5040 cGy). The rate of NRT utilization did not change by year. Factors associated with the use of NRT included younger age, male gender, private insurance, lower Charlson comorbidity score, and treatment at an academic/research program. NRT was associated with superior R0 resection rates (NRT 87.2% vs. no NRT 79.8%; p=0.009) but not with lower multivisceral resection rates (NRT 45.6% vs. no NRT 21.5%; p<0.001). Five-year overall survival was increased in the NRT group (NRT 62.0% vs. no NRT 45.7%; p<0.001; see Figure). The benefit of NRT persisted in a Cox proportional hazards multivariable model containing a number of confounding variables including comorbidity, multivisceral resection, and postoperative chemotherapy (OR 1.30; 95% CI 1.01-1.69; p=0.04).

Conclusion: Younger age, male gender, private insurance, lower Charlson comorbidity score, and treatment at an academic/research program were associated with increased rates of NRT utilization. Although radiation is rarely used in locally advanced colon cancer, this NCDB analysis suggests that the use of neoadjuvant radiation for clinical T4 disease may be associated with superior R0 resection rates and improved overall survival.  NRT should be considered on a case-by-case basis in locally advanced colon cancer. Further research is necessary to identify patients that will benefit the most from this approach. 

 

70.07 The Effect of Trainee Involvement on Patient Outcomes in General Surgery Cases over Time

T. Feeney2, J. Havens1  1Brigham And Women’s Hospital,Trauma,Boston, MA, USA 2Harvard School Of Public Health,Boston, MA, USA

Introduction:  Resident duty hour reform was implemented in 2003 and further modified in 2011. The effect of these changes on patient outcomes remains unclear. We investigated the effect on outcomes of resident involvement in surgical procedures over time since these changes have been implemented. We hypothesized that there has been no change in outcomes since implementation of the resident work hours restrictions.

Methods:  We utilized the ACS-NSQIP database (2005-2012). General surgery procedures were identified by Current Procedural Terminology code and were restricted to patients aged ≥18. Using 2005/2006 as the reference period, logistic and linear regression analyses were performed.

Results: There were 422,733 procedures analyzed. In the attending-only group, there was no difference in 2012 in the odds of major morbidity (odds ratio [OR] 0.67, 95% confidence interval [CI] 0.35-1.31; p=0.247), overall morbidity (OR 0.86, 0.63-1.18; p=0.354), or all-cause mortality (OR 0.40, 0.09-1.87; p=0.246). In cases that included a trainee, there was no change in the odds of major morbidity in 2012 (OR 1.42, 0.77-2.62; p=0.264), overall morbidity (OR 1.12, 0.81-1.53; p=0.495), or all-cause mortality (OR 2.20, 0.46-10.49; p=0.322) over the same time period. There was an increase in mean operative time in the attending-only cases from 2005 to 2012 (14.7 min, 10.8-18.6; p<0.001), but a decrease of 7.48 min (3.9-11.0; p<0.001) in mean operative time in cases that included a trainee.

Conclusion: Between 2005 and 2012, there were no changes in the odds of overall complications, major complications, or all-cause mortality in surgeries involving attending surgeons only or in those involving trainees. There was, however, a significant change in mean operative time in both groups: attending surgeons operating alone had an increase in operative time, while cases that included a trainee had a decrease. These data suggest that while operative time has changed, surgical outcomes for patients did not change between 2005 and 2012.

 

70.04 Enhanced Recovery Pathway for Colorectal Surgery Improves Outcomes in Private and Safety Net Settings

T. J. Roberts1, J. L. Anandam1,3, P. K. Brown1, J. R. Lysikowski1, J. L. Rabaglia1,2,3  1University Of Texas Southwestern Medical Center,Dallas, TX, USA 2VA North Texas Health Care System,Dallas, TX, USA 3Parkland Health & Hospital System,Dallas, TX, USA

Introduction:  Although it is known that Enhanced Recovery Pathways (ERP) decrease length of stay (LOS) and improve outcomes in colorectal surgery, these studies predominantly represent the private health care setting. There is a paucity of information regarding the effectiveness of ERP in the public arena, composed of the under- or uninsured, who may have different social determinants of health. This study aims to compare the effect of an ERP on LOS and readmission for colorectal surgery across the private and safety net settings in a large urban academic medical center.

Methods:  A multidisciplinary panel of experts utilized professionally recognized standards and evidence-based best practice to create a comprehensive ERP for elective colorectal surgery. The ERP included standardization of patient education, optimization of co-morbidities, multimodal analgesia, carbohydrate loading, intraoperative goal-directed fluid therapy, minimization of opioids, early ambulation, early removal of the urinary catheter, and early resumption of diet. There were no social interventions. The ERP was implemented in the safety net hospital (SNH) in 9/2014 and in the private University hospital (PUH) in 12/2014. Process and outcome metrics from 100 consecutive patients having surgery in the 18 months prior to ERP implementation at each institution were compared to a similar group post ERP. Surgeons and discharge criteria remained constant. Primary end points were LOS and readmissions.

Results: Patients in the post-ERP cohorts at both facilities were significantly older than pre-ERP (p=0.047, 0.034), with no significant difference in gender or BMI. The rate of open versus minimally invasive surgery was similar at the SNH (p=0.067), while more post-ERP patients at the PUH underwent open surgery (p=0.002). 96% of PUH patients were funded through private insurance or Medicare, versus only 6% at the SNH. ERP implementation reduced total LOS at both facilities, while readmission and reoperation remained constant. LOS at the PUH fell from 8.1 to 5.9 days (p=0.028), and at the SNH from 7.0 to 5.1 days (p=0.004). Thirty-day all-cause readmission (PUH p=0.634; SNH p=1.0) and return to surgery (PUH p=0.610; SNH p=0.066) were stable. The surgical site infection rate was unchanged at the PUH (p=0.485) and significantly reduced at the SNH (p=0.021, OR 0.39). Mean time to ambulation and mean time to first bowel movement were reduced at the SNH (p=0.002, 0.001). Mean time to resumption of solids was reduced at both the PUH and the SNH (p<0.001).

Conclusion: Implementation of ERP is similarly effective across private and safety net settings, without interventions to address social determinants of health. Both cohorts experienced reduced LOS without increasing readmission or reoperation. The data suggest ERP may have a more dramatic impact on outcomes in the safety net setting, perhaps through standardization in a group with more varied baseline health status. Utilization of ERP appears to be advantageous for all populations regardless of funding.

 

70.05 Path to the OR: When Are the Delays and How Do They Impact Outcomes in Emergency Abdominal Surgery?

C. M. Dickinson1, N. A. Coppersmith1, H. Huber1, A. Stephen1, D. T. Harrington1  1Brown University School Of Medicine,Surgery,Providence, RI, USA

Introduction: Current recommendations state that patients with peritonitis should be operated on within 1-2 hours. However, there is limited literature that supports time-based recommendations or identifies where delays occur between the emergency room (ER) and the operating room (OR). We investigated the time course for patients who needed emergency abdominal surgery and evaluated whether time to operation impacted outcomes.

Methods: A retrospective review was done of all non-transferred adult patients over a 5-year period who were admitted from the ER and underwent a non-trauma exploratory laparotomy within 24 hours of admission. To limit the study group to patients with clear emergent indications for surgery, small bowel obstructions without perforations, appendicitis, cholecystitis, GI bleeds, and malignant obstructions were excluded. Demographics, comorbidities, vitals, labs, and operative details were reviewed. Times were noted for presentation (PR) to ER, time of ER physician evaluation (EREval), timing of diagnostic imaging (SCAN), time of signed surgical consult note (SC), and time of case start (OR). Adverse outcomes were identified using ICD-9 codes for infectious complications, wound complications, kidney injury, ileus, cardiovascular complications, and respiratory failure. Chi-square, t-tests, ANOVA and discriminant function analysis were used.
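
The interval calculations between these time points can be illustrated with the short sketch below (Python, pandas); the timestamps are invented examples.

```python
# Elapsed minutes between care milestones; timestamps are invented examples.
import pandas as pd

times = pd.DataFrame({
    "pr": pd.to_datetime(["2016-05-01 02:10"]),       # presentation to ER
    "er_eval": pd.to_datetime(["2016-05-01 03:35"]),  # ER physician evaluation
    "scan": pd.to_datetime(["2016-05-01 06:05"]),     # diagnostic imaging
    "sc": pd.to_datetime(["2016-05-01 08:30"]),       # signed surgical consult
    "or_start": pd.to_datetime(["2016-05-01 11:55"]), # case start
})

def minutes(start: str, end: str) -> pd.Series:
    """Elapsed minutes between two timestamp columns."""
    return (times[end] - times[start]).dt.total_seconds() / 60

times["pr_to_or"] = minutes("pr", "or_start")
times["er_to_scan"] = minutes("er_eval", "scan")
print(times[["pr_to_or", "er_to_scan"]])
```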

Results: One hundred forty-one patients were reviewed. Mean age was 60.8 years, 55.3% were male, and mean APACHE II score was 8.5. Mean time from PR to OR was 597 minutes, PR to EREval 91 minutes, EREval to SCAN 156 minutes, SCAN to SC 147 minutes, and SC to OR 205 minutes. Patients who did not develop a complication had a shorter time from EREval to SCAN than those who developed complications (113.8 vs 176.8 minutes, p<0.05). Shorter total time to OR (543.7 vs 702.3 minutes, p<0.05) was also associated with lower rates of complications. There was no significant difference in time to EREval based on the shift on which the patient presented; however, those who had an image obtained during the first shift (7AM-3PM) had longer delays to SCAN (1st shift 204 minutes, 2nd shift 152 minutes, 3rd shift 130 minutes, p<0.05). There were no significant differences by shift in time from SCAN to OR. However, those whose case started during the first shift experienced significantly longer total delays to operation (1st shift 779 min, 2nd shift 527 min, 3rd shift 491 min from arrival to OR, p<0.05).

Conclusion: Increased time to the OR was associated with a higher number of complications in patients undergoing emergency abdominal surgery. These delays are spread across a patient's course, from arrival in the ER to obtaining imaging and surgical team evaluation. Interestingly, patients appear to experience the greatest delays during the first shift. Further investigation into the causes of these delays is critical to expediting care for patients who need emergent abdominal surgery.

70.03 From Procedure to Poverty: Out-of-Pocket and Catastrophic Spending for Pediatric Surgery in Uganda

A. Yap3, M. Cheung2, N. Kakembo1, P. Kisa1, A. Muzira1, J. Sekabira1, D. Ozgediz2  1Makerere University,Department Of Surgery,Kampala, , Uganda 2Yale University School Of Medicine,Department Of Surgery,New Haven, CT, USA 3Yale University School Of Medicine,New Haven, CT, USA

Background: Universal health coverage with financial risk protection is a target of the United Nations Sustainable Development Goals (SDGs). Catastrophic health expenditure (CHE) is usually defined as patient healthcare spending of >10% of total household expenditure, or >40% of household expenditure after food expenses. Ninety percent of global CHE occurs in low-income countries; however, CHE data for pediatric surgical conditions in low-income countries are limited. While public hospitals provide free care, patients accrue substantial expenses. Our prior focus group discussions with families in Uganda have highlighted an alarming economic burden, but this burden has not been studied further. We sought to calculate the out-of-pocket (OOP) expenditure accrued by families with children undergoing surgery at the national referral hospital in Kampala, Uganda, one of two Ugandan centers with a dedicated pediatric surgery unit.
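
The CHE definition above can be expressed directly in code; in the sketch below the thresholds come from that definition, while the numbers in the example call are purely illustrative.

```python
# Catastrophic health expenditure (CHE) check based on the definition above.
def is_catastrophic(health_spend: float,
                    household_expenditure: float,
                    food_expenditure: float) -> bool:
    """CHE: health spending >10% of total household expenditure,
    or >40% of household expenditure net of food."""
    over_total = health_spend > 0.10 * household_expenditure
    over_nonfood = health_spend > 0.40 * (household_expenditure - food_expenditure)
    return over_total or over_nonfood

# Illustrative call: $300 of surgery-related spending against the $2694 average
# annual household expenditure cited in the Results (food spending is invented).
print(is_catastrophic(300.0, 2694.0, 1500.0))  # True: crosses the 10% threshold
```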

Methods: In this cross-sectional study, 132 families receiving inpatient care at Mulago Hospital Department of Pediatric Surgery were included through prospective convenience sampling from November 2016 to April 2017. From prior work, a survey tool was designed including expenses for transport, diagnostic tests, medications, food and lodging, and productivity losses. The tool was pilot tested, revised, and validated. Loans and pawned possessions were also collected from a subset of participants. Ugandan shillings were converted into US dollars using purchasing power parity.

Results: Median transportation cost was $26.63. Median OOP medication cost was $18.37, and median cost for diagnostic tests was $27.55. The median amount spent on food and lodging per visit per family was $32.60. Eighty-eight participants (67%) responded to questions on loans and sold items: 19% borrowed money and 9% sold items to fund the hospital visit. For families with an employed member staying in the hospital, median household productivity loss was $95.52. The total median expense per visit per family was $150.62. Using the average annual household expenditure in Uganda ($2694) as a reference, 31.8% of households face catastrophic spending for a single inpatient hospital stay.

Conclusions: Though pediatric surgical services in Uganda are formally provided for free by the public sector, families pay for transportation, lodging, and medical costs due to public resource shortages. Households also accrue productivity loss from workdays spent in the hospital and almost a third of households incur catastrophic expenditure for a single pediatric surgical procedure. This study suggests that broader financial protection must be established to meet SDG targets.

70.01 Cost Analysis of the Mongolian ATLS Program: a Model for Low- and Middle-Income Countries

J. E. Kornfeld1, M. Katz6, J. R. Cardinal7, B. Bat-Erdene3, G. Jargalsaikhan3, J. Nellermoe2, L. A. Dunstall8, M. Holland8, A. Zorigtbaatar9, H. Pioli7, S. Orgoi3,5, J. Nunez6, R. Price2,6  1Dartmouth Medical School,Lebanon, NH, USA 2University Of Utah,Center For Global Surgery,Salt Lake City, UT, USA 3Mongolian National University Of Medical Sciences,Department Of Surgery,Ulaanbaatar, , Mongolia 5WHO Collaborating Center For Essential Emergency And Surgical Care,Ulaanbaatar, , Mongolia 6University Of Utah,Department Of General Surgery,Salt Lake City, UT, USA 7University Of Utah,Salt Lake City, UT, USA 8Westmead Hospital,Sydney, NSW, Australia 9McGill University,Montreal, QC, Canada

Introduction: In the last two decades the burden of trauma has increased in Mongolia; trauma is now the number one cause of death amongst Mongolians aged 24-44. In 2015, the Mongolian National University of Medical Sciences (MNUMS), in partnership with the American College of Surgeons, implemented an Advanced Trauma Life Support (ATLS) training program. According to the Disease Control Priorities, organized trauma systems have been associated with a decrease in mortality and the economic burden of trauma. ATLS in Mongolia has been shown to have a positive impact on confidence and self-reported clinical competencies in the context of trauma care. The cost of ATLS continues to be a challenge for promulgating ATLS in low- and middle-income countries (LMICs).  A cost-analysis for continued development and expansion of ATLS in Mongolia, a LMIC, might provide a framework for expanding self-sustaining ATLS programs in other LMICs.

Methods: All costs associated with a Mongolian ATLS program were identified and quantified based on discussions with the Mongolian government, MNUMS, and ATLS Australasia headquarters, and on existing pricing data. Costs were then classified as either essential costs or contingencies, and a basic minimum budget was constructed. Costs were considered contingencies if they represented components of the course that had yet to be established (training a Mongolian educator, supporting ATLS directors and coordinators to attend regional and international ATLS meetings, ATLS updates/translations). Budget scenarios were developed with various combinations of contingencies added to the basic minimum budget. Savings projections were calculated for enacting contingencies that included training Mongolian instructors and educators.

Results: The modeling shows the minimum annual cost of ATLS in Mongolia to be $10,700 (three ATLS student courses). A more comprehensive budget of $19,900 includes additional contingencies. Since beginning the program in 2015, an initial investment of $85,000 to train Mongolian instructors reduced instructor costs by $49,600 per year for a cost reduction of 79.6% and will be paid back within two years. An initial investment of $4,050 to train a Mongolian educator will reduce educator costs by $1,750 per year; this initial investment will be paid back within 2.1 years.
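
The payback-period arithmetic reported above reduces to a one-line calculation; the sketch below uses the instructor-training figures from the Results, and the function itself is a generic illustration rather than part of the study's methods.

```python
def payback_years(initial_investment: float, annual_savings: float) -> float:
    """Years of annual savings needed to recoup a one-time up-front cost."""
    return initial_investment / annual_savings

# Instructor training: $85,000 up front, $49,600 saved per year.
print(round(payback_years(85_000, 49_600), 1))  # ~1.7, i.e. paid back within two years
```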

Conclusion: The cost-analysis of ATLS in Mongolia demonstrates that the initial investment to train Mongolian instructors led to substantial savings. Further investment to train a Mongolian educator will also lead to lower long-term costs. A minimal cost for the ATLS course was determined, which can be scaled up with different contingencies. We believe this is the first cost-analysis performed for a government-supported ATLS program.

 

70.02 Laparoscopic vs. open cholecystectomy in Mongolia: comparison of clinical outcomes and costs

S. Lombardo1, J. S. Rosenberg1, S. Erdene2, J. Kim1, E. Sandang2, J. Nellermore1, R. Price1  1University Of Utah,Center For Global Surgery,Salt Lake City, UT, USA 2Mongolian National University Of Medical Sciences,Department Of Surgery,Ulaanbaatar, ULAANBAATAR, Mongolia

Introduction:  Laparoscopic cholecystectomy (LC) is the surgical standard of care for operable uncomplicated biliary disease in developed countries, with shorter hospital length of stay (HLOS), reduced pain, and earlier return to work when compared to open surgery (OC).  Use of LC in resource-limited, low and middle income (LMIC) countries, such as Mongolia, is increasingly common. This prospective, observational study evaluates costs, clinical outcomes, and quality of life (QoL) associated with LC vs OC in Mongolia.

Methods:  Patient surveys and chart review were used to capture patient demographics, clinical outcomes, and out-of-pocket and insurance costs associated with cholecystectomy. QoL was assessed pre-operatively and at 30 days post-operatively using the 5-level EuroQol (EQ-5D-5L) questionnaire. Patient demographics, intra- and post-operative complications, surgical and hospital fees, and QoL results were collected by researchers through verbal interview and chart review from March 2016 through February 2017. Four of the seven participating sites were tertiary centers in Ulaanbaatar; the remaining three were rural secondary-level facilities. Student's t-test and chi-squared tests were used for univariate analysis. Multivariate logistic and linear regressions were generated using variables with a p-value of 0.2 or less on univariate analysis. Outcomes were analyzed on an intent-to-treat basis.
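
The univariate screening at a p-value of 0.2 or less followed by multivariable modeling could be sketched as below (Python, statsmodels); the data file, outcome, and candidate variables are hypothetical placeholders, and 0/1 numeric coding is assumed so that parameter names match variable names.

```python
# Sketch of univariate screening (p <= 0.2) then multivariable logistic regression.
import pandas as pd
import statsmodels.formula.api as smf

chole = pd.read_csv("mongolia_chole.csv")  # placeholder path
candidates = ["laparoscopic", "age", "female", "college", "insured", "rural_site"]

selected = []
for var in candidates:
    uni = smf.logit(f"complication ~ {var}", data=chole).fit(disp=0)
    if uni.pvalues[var] <= 0.2:   # retain predictors with univariate p <= 0.2
        selected.append(var)

formula = "complication ~ " + (" + ".join(selected) if selected else "1")
multi = smf.logit(formula, data=chole).fit(disp=0)
print(multi.summary())
```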

Results: In total, 215 cholecystectomies were captured, with 122 (56.7%) starting laparoscopically. Two were converted to OC (1.6%). Patients undergoing LC were more likely to have attended college and to have insurance, though overall insurance rates were low (10.3%). Pre-operative symptoms were comparable between groups. No deaths were reported. The total complication rate was 21.8%, with no difference between groups; however, LC patients were less likely to have superficial infections (Table 1). Mean HLOS and mean days to return to work (DRTW) were significantly shorter for LC. QoL was significantly improved after surgery for both groups, with no difference between groups. Mean total costs (out-of-pocket + insurance) were higher for LC, but this difference was not significant (555,000 vs. 477,000 tugriks, p=0.126). After adjustment, LC was associated with significantly lower rates of complication, shorter HLOS, fewer DRTW, greater improvement in QoL scores, and no increase in cost when compared to OC (Table 1).

Conclusion: LC is a safe surgical treatment for patients with biliary disease in Mongolia. LC is comparable in expense to OC and is associated with improved outcomes. Reduced HLOS, shorter time off work, fewer complications, and improved QoL after LC are likely associated with greater cost savings, but further investigation is required.