66.09 Preoperative Depression is Associated With Adverse Post-Surgical Outcomes in General Surgery Patients

D. S. Lee1,2, L. Marsh3,4, M. Garcia-Altieri3,4, K. Makris1,2, L. Chiu1,2, N. Becker1,2, S. S. Awad1,2  1Baylor College Of Medicine,General Surgery,Houston, TX, USA 2Michael E. DeBakey VA Medical Center,General Surgery,Houston, TX, USA 3Baylor College Of Medicine,Psychiatry,Houston, TX, USA 4Michael E. DeBakey VA Medical Center,Psychiatry,Houston, TX, USA

Introduction:  Depression is a common comorbidity in surgical patients and is associated with adverse post-operative outcomes for a variety of surgical procedures.  However, this association is not well studied for general surgery procedures, and the use of retrospective data to identify patients with depression has been shown to be highly unreliable.  

Methods:  202 consecutive patients who were scheduled for a variety of elective general surgery procedures were prospectively screened for depression using the Patient Health Questionnaire – 8 item scale (PHQ-8) during their pre-operative visit.  The PHQ-8 defines major depression by a score of 10 or greater and severe major depression by a score of 20 or greater.  Demographics, comorbidities, body mass index (BMI), serum albumin, wound classification, and American Society of Anesthesiologists (ASA) score were collected.  Operative time was used as a surrogate marker of operative complexity.  Outcomes of interest were surgical site infection (SSI), readmission (RA), and emergency room visits occurring within 30 days of the index operation; these complications were combined into one composite outcome variable.  The t-test and chi-square test were used to compare continuous and categorical variables, respectively.

Results: Of the 202 patients screened, 171 underwent surgery as scheduled (42 depressed, 129 not depressed).  No significant differences were found in age, comorbidities, serum albumin, wound class, ASA score, or operative time.  Patients with major depression (PHQ-8 score ≥ 10) had a higher composite complication rate than non-depressed patients, although this difference did not reach statistical significance (21.4% vs. 13.2%, p=0.195).  When the threshold was raised to a PHQ-8 score of 20 or greater (severe major depression), these patients had a significantly higher complication rate than patients with a PHQ-8 score less than 20 (44.4% vs. 14.2%, p=0.012).

Conclusion: Depression is an under-recognized comorbidity and is associated with adverse outcomes after general surgery procedures.  Patients with more severe depressive symptoms may need to be referred to a mental health professional prior to surgery in order to optimize management of depression and decrease the chance of an adverse event.

 

66.08 Variations in HIDA Scan-based Gallbladder Ejection Fractions Over Time in Suspected Biliary Dyskinesia

E. Wiesner1,2, L. Martin1,2, W. Peche1,2, J. Langell1,2  1University Of Utah,Department Of Surgery,Salt Lake City, UT, USA 2VA Salt Lake City Health Care System,Center Of Innovation,Salt Lake City, UT, USA

Introduction:  The diagnosis of biliary dyskinesia (BD) remains somewhat controversial.  However, the Rome IV Criteria for Functional Gastrointestinal Disorders outline the clinical diagnostic requirements for BD. Many authors also recommend supportive confirmation of the diagnosis with cholecystokinin (CCK)-stimulated cholescintigraphy (hepatobiliary iminodiacetic acid [HIDA] scan) and calculation of the gallbladder ejection fraction (GBEF).  No prior studies have evaluated the consistency and utility of repeat HIDA-GBEF imaging in patients with suspected BD.  Here, we conducted a retrospective study to evaluate HIDA-GBEF consistency over time among patients with suspected BD.

Methods:  We queried the Veterans Health Administration's national Corporate Data Warehouse from January 2005 to July 2016 for patients who underwent more than one HIDA scan. Patients undergoing HIDA for a suspected diagnosis of BD were included. Radiology reports were reviewed and the GBEF for each study was abstracted. The data were analyzed for changes in GBEF over time, specifically evaluating differences between studies and crossover from abnormal-to-normal and normal-to-abnormal diagnostic criteria.

Results: We identified 546 patients who underwent more than one HIDA scan during the study period.  522 underwent two HIDA scans, 23 underwent three HIDA scans, and 1 underwent four HIDA scans.  The initial EF was reported as normal in 365 patients (mean GBEF 68 ± 19%) and reduced in 181 patients (mean GBEF 17 ± 10%).  Of the patients with an initially normal GBEF, 97 patients (27%) had a reduced EF on subsequent imaging (average GBEF 64 ± 19% versus 19.7 ± 14%) with a mean time between studies of 33.7 months.  Of the patients with an initially low GBEF, 81 patients (45%) had a normal EF on subsequent imaging (average GBEF 18 ± 11% versus 73 ± 44%) with a mean time of 26.53 months between studies.

Conclusion: We found substantial variation in repeat HIDA scan results over time, with about one-third of patients crossing over between diagnostic categories.  These data suggest that the HIDA scan GBEF may have low precision, calling into question its clinical value in the evaluation of BD.  Additional studies are necessary to determine the utility of HIDA-GBEF in the evaluation of patients with suspected BD.

 

66.07 Does Practice Make Perfect: The Impact of Resident Participation in Cholecystectomy at VA Hospitals

L. Martin1,2, C. Zhang1, A. Presson1, R. Nirula1, W. Peche1,2, B. Brooke1,2  1University Of Utah,Salt Lake City, UT, USA 2VA Salt Lake City Health Care System,Salt Lake City, UT, USA

Introduction:  Resident participation in operative cases within Veterans Affairs (VA) hospitals is often assumed to be associated with worse surgical outcomes.  While recent studies have evaluated the association between resident post-graduate year (PGY) and perioperative morbidity, this metric fails to capture resident participation as a function of case-level involvement.  We designed this study to examine the effect of resident participation, defined as a function of both training level and case-level involvement, on outcomes of cholecystectomy performed at all PGY levels.

Methods:  We identified all cholecystectomies performed at nationwide VA hospitals from 2005 to 2014 using surgical CPT codes and then requested the corresponding Veterans Affairs Surgical Quality Improvement Program (VASQIP) dataset.  Resident participation was categorized (Levels 0-3) as a function of involvement as well as year of training (Table).  We performed multivariate regression analyses to examine the effect of resident participation on operative time and on a composite metric of peri-operative complications (intra-op transfusion, return to OR, or organ space infection) after adjustment for surgical approach (laparoscopic vs. open), diagnosis (cholelithiasis vs. cholecystitis), patient comorbidities, perioperative physiology, and preoperative laboratory values.

Results: A total of 32,833 patients were identified as having undergone either laparoscopic (82%) or open (17%) cholecystectomy for either acute or chronic cholecystitis (74%) or symptomatic biliary disease (26%).  Mean operative time was 102 min and was significantly increased for residents at each participation level when compared to an attending operating alone in multivariate models (Table).  The peri-operative complication metric occurred in 1436 cases (4%) and, in univariate analysis, was significantly increased for resident participation at level 2 (OR 1.24; p<0.05) and level 3 (OR 1.35; p<0.05) as compared to an attending operating alone.  However, after adjusting for patient-level confounders in the multivariate model, resident participation level was not significantly associated with a higher likelihood of peri-operative complications (Table).

Conclusion: While resident participation in cholecystectomy within VA hospitals is associated with increased operative time, there is no adverse effect on the rate of perioperative complications.  These findings suggest that resident involvement achieves education and training objectives without sacrificing quality of care.

66.06 Population-Based Evaluation of Enhanced Recovery Protocol Implementation in Michigan

E. George1, G. Krapohl3, S. E. Regenbogen2,3  1University Of Michigan,Health Science Scholars Program,Ann Arbor, MI, USA 2University Of Michigan,Department Of Surgery,Ann Arbor, MI, USA 3University Of Michigan,Michigan Surgical Quality Collaborative,Ann Arbor, MI, USA

Introduction:  Enhanced Recovery Protocols (ERP) are widely demonstrated to improve perioperative outcomes after colectomy, yet it remains unknown to what extent ERPs have been successfully implemented outside the high-volume and highly specialized institutions that pioneered them. Thus, we sought to quantify the extent of ERP uptake within a representative, population-based, statewide hospital collaborative, and to understand obstacles to further dissemination.

Methods:  We conducted a statewide survey among 70 member hospitals of the Michigan Surgical Quality Collaborative. Through interviews with key stakeholders, we identified hospitals with full ERPs and those in the process of implementation, and described the time course of their development. Respondents named key obstacles to ERP implementation and detailed specific practices included in their protocols. Hospital characteristics were obtained from the American Hospital Association Annual Survey and compared using chi square tests for proportions.

Results: Interim results from 46 respondent hospitals (66% interim response rate) revealed that between 2010 and 2016, 13 (28%) hospitals fully implemented an ERP, while 22 hospitals (48%) did not. The time course of uptake is detailed in the Figure. At present, 11 (24%) hospitals are still in development but have not yet fully implemented their ERP. Hospitals with ERPs identified coordination of time and logistics of development and implementation (54%) as the most common obstacle, followed by disagreement on standard practices (15%) and nursing preferences (8%). For those without ERPs, the most common obstacles were surgeon engagement (52%), disagreement on standard practices (15%), coordination of time and logistics for development and implementation (15%), and anesthesiology preferences (12%). ERP hospitals were no more likely than non-ERP hospitals to be either teaching institutions (77% vs. 61%, p=0.50) or large hospitals with more than 300 beds (54% vs. 42%, p=0.53).

Conclusion: Despite increasing consensus around the value of ERPs for colectomy and years of emphasis within our statewide collaborative, implementation continues to be a challenge. Administrative support, logistical burden, and surgeon engagement are the most commonly reported challenges to more widespread ERP adoption. Interestingly, the likelihood of ERP implementation is no different in large academic hospitals than in small non-academic ones. These findings suggest that broader implementation of ERP will require a three-pronged approach: improved dissemination of evidence-based standardized protocols to foster wider consensus, administrative support to offset the time and logistical burden of implementation, and opportunities to educate and engage surgeon leaders.

 

66.05 Utility of the 10 Hounsfield Unit Threshold for Identifying Adrenal Adenomas: Can We Improve?

M. Kohli1, R. Randle1, S. Pitt1, D. Schneider1, R. Sippel1  1University Of Wisconsin,Department Of Surgery,Madison, WI, USA

Introduction: Adrenal incidentalomas are identified on up to 5% of abdominal CT scans. Assessing such lesions for malignancy is essential for establishing appropriate patient follow up. A threshold of 10 Hounsfield units (HU) is currently recommended for differentiating benign adenomas from non-adenomas. Our study aims to evaluate the utility of the 10 HU threshold and to determine whether additional CT imaging features can be used to identify adenomas. 

Methods:  We performed a retrospective review of a single institution’s prospective endocrine surgery database. Our cohort included 192 patients who underwent an adrenalectomy between 2001 and 2015 due to a unilateral adrenal mass (excluding pheochromocytoma). All masses that were non-adenomatous via surgical histology (adrenal cortical carcinomas, ganglioneuromas, metastases, etc.) were in the non-adenoma group. Imaging characteristics of adenomas (n=128) and non-adenomas (n=64) were compared. Sensitivity and specificity for detection of adenomas were calculated over a range of unenhanced HU values and using absolute washout >60%. Multivariate analysis was performed to identify predictors of adenomas.  

Results: Unenhanced HU values <10 were more common in adenomas compared to non-adenomas (47.6% vs. 6.7%, p<0.001), but less than half of the adenomas resected met this criterion. Two non-adenomas (1 lymphangioma and 1 metastasis) measured <3 HU. Non-adenomas were more likely to measure ≥4cm (p=0.001), have irregular borders (p<0.001), have a non-homogeneous appearance (p=0.006), and contain calcifications (p=0.028). These suspicious imaging features were also present in 12-39% of benign adenomas. Multivariate analysis revealed that HU ≤16 (OR 15.9, 95% CI 3.1-81.7, p=0.001) and smooth borders (OR 6.4, 95% CI 2.1-20.0, p=0.001) were both independent predictors of adenomas. The 10 HU cutoff had a sensitivity of 47.6% and a specificity of 93.3% (AUC=0.71, p<0.001). Raising the cutoff to 16 HU improved the sensitivity to 65.9% without detriment to specificity, which remained 93.3% (AUC=0.79, p<0.001). Absolute contrast washout of >60% had a sensitivity and specificity of 53.8% and 100%, respectively (AUC=0.61, p=0.011). In the cohort of patients with washout values available (n=33), if a lesion was <16 HU and/or had >60% absolute washout, the sensitivity and specificity increased to 96% and 100% (AUC=0.98, p<0.001). 
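For reference, the threshold-dependent sensitivity and specificity reported above follow the standard definitions, under the assumption (consistent with the <10 HU and <16 HU criteria) that a lesion is classified as an adenoma when its unenhanced attenuation falls below the cutoff t; this notation is ours, not the authors':

\mathrm{Sens}(t) = \frac{\#\{\text{adenomas with HU} < t\}}{\#\{\text{adenomas}\}}, \qquad \mathrm{Spec}(t) = \frac{\#\{\text{non-adenomas with HU} \ge t\}}{\#\{\text{non-adenomas}\}}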

Conclusion: The traditional 10 HU threshold has a high specificity for identifying adrenal adenomas but is limited by poor sensitivity. Increasing the threshold to 16 HU has the potential to improve sensitivity without sacrificing specificity. A combined criterion of <16 HU and/or >60% absolute washout yielded both high sensitivity and high specificity and can thus be used to accurately identify adrenal adenomas and allow for appropriate selection of patients for non-operative management.

 

66.04 Impact of Complications after Pancreatectomy in the ACS NSQIP Procedure-Targeted Database

J. A. Mirrielees1, S. M. Weber1, C. C. Greenberg1, J. R. Schumacher1, J. E. Scarborough1  1University Of Wisconsin,Surgery,Madison, WI, USA

Introduction:  Existing federal quality initiatives primarily target a number of nonspecific postoperative complications that are easy to measure, such as surgical site infection, venous thromboembolism, adverse cardiac events, and respiratory complications, without regard for their relative impact.  The effect of these and other procedure-specific complications on the clinical and resource utilization outcomes of pancreatectomy patients is not currently known. We employed an empirical approach to examine the potential impact of a series of complications following pancreatectomy on mortality and resource utilization in order to identify the highest value targets for quality improvement interventions.

Methods:  Patients from the 2014 ACS-NSQIP Pancreatectomy-Targeted Participant Use File were included for analysis.  The frequencies of 2 procedure-specific and 7 non-specific postoperative complications were determined.  Multivariable Poisson regression with a log link and robust error variance was used to determine the independent associations between individual complications and subsequent 30-day clinical (mortality, end-organ dysfunction) and resource utilization (prolonged hospitalization, hospital readmission) outcomes.  Adjusted relative risk estimates from these models were used to calculate adjusted population attributable fractions (PAFs) as a measure of complication impact.  The PAF describes the estimated reduction in the incidence of an adverse outcome that would be anticipated if exposure to a specific postoperative complication had been completely avoided in the study population.
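For reference, the adjusted PAF for a given complication is commonly computed from the proportion of patients with the adverse outcome who experienced that complication (p_c) and the adjusted relative risk (RR_adj) estimated by the regression model. This is a standard formulation; the abstract does not state which exact variant the authors applied:

\mathrm{PAF} = p_c \cdot \frac{RR_{\mathrm{adj}} - 1}{RR_{\mathrm{adj}}}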

Results: There were 5,047 patients who underwent pancreatectomy in the study period. The most frequent complications included bleeding (18.3%), pancreatic fistula (18.1%), organ/space surgical site infection (11.4%), and delayed gastric emptying (11.3%). Bleeding and pneumonia were the complications with the largest overall impact on 30-day mortality in our study population (see Table).  Complete prevention of these complications would have been expected to reduce mortality by 29.7% and 26.4%, respectively.

Conclusion: Bleeding, pneumonia, pancreatic fistula, delayed gastric emptying, and organ/space surgical site infection have relatively large impacts on the clinical and resource utilization outcomes of patients who undergo pancreatectomy.  Most of the complications that are targeted by existing federal quality initiatives (urinary tract infection, venous thromboembolism, and surgical site infection) have comparatively small impacts on this patient population.  Redirecting initiatives towards the postoperative complications which matter the most would likely improve their effectiveness.

66.03 Increased Adoption of Bundled Measures Decreases Surgical Site Infection Rate for Colectomy

L. Ly1, J. Cedarbaum1, Y. Chen1, A. Hjelmaas1, R. Anand1, S. Collins2, S. Regenbogen2  1University Of Michigan,Medical School,Ann Arbor, MI, USA 2University Of Michigan,Department Of Surgery,Ann Arbor, MI, USA

Introduction:  With the advent of value-based purchasing for preventing healthcare-acquired infections, there is increasing interest in bundled interventions to reduce the rates of surgical site infections (SSI). In our statewide Michigan Surgical Quality Collaborative (MSQC), we previously found that compliance with six preventative measures was associated with decreased SSI rates for individual colectomy patients. We now seek to evaluate the effect of hospital-level implementation of these bundled preventative measures on overall SSI rates.

Methods:  This retrospective cohort study included all elective colectomies in 59 MSQC hospitals from 2012 to 2015. In accordance with our previously published method, a “bundle score” was assigned to each case, with one point given for each SSI preventative measure followed: 1) postoperative normothermia (temperature of >98.6°F); 2) SCIP-2-compliant IV prophylactic antibiotics; 3) postoperative glycemic control (day 1 glucose ≤140 mg/dL); 4) minimally-invasive surgery; 5) oral antibiotics with mechanical bowel preparation, if used; and 6) short operations (<100 minutes). We computed Pearson correlation coefficients to assess the association between the change in each hospital's average “bundle score” over time and the change in its risk- and reliability-adjusted incidence of postoperative SSI.
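Concretely, the hospital-level correlation reported in the Results is the standard Pearson coefficient; in our notation (not the authors'), x_i is the change in hospital i's average bundle score between time periods and y_i is the corresponding change in its adjusted SSI rate:

r = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}}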

Results: Among the study population of 4,784 cases, 298 patients developed SSIs (6.2%). Overall, 91% of patients had postoperative normothermia; 87% had appropriate IV prophylactic antibiotics; 57% had postoperative glycemic control; 57% had minimally-invasive surgeries; 49% had oral antibiotics with mechanical bowel preparation, if used; and 28% had short operative duration. The year-to-year change in hospitals’ average bundle score ranged from -0.82 to +0.87, with an average of +0.07. The change in SSI incidence ranged from -9.0% to +5.3%, with an average of -0.1%. There was a small but statistically significant negative correlation between the change in “bundle score” and the change in SSI rate at the hospital level (Pearson’s r=-0.18, p=0.02, see Figure).

Conclusion: Among MSQC hospitals, there was wide variability in the adoption of the six SSI preventative measures. Hospitals that increased compliance with this bundle of interventions for SSI prevention in colectomy were significantly more likely to experience a decrease in the incidence of postoperative SSI. These findings suggest that efforts to further increase adoption of these preventative measures are warranted.

66.02 Decreased Inpatient Mortality after Hepatic Resection in a State Population

D. A. Hashimoto1, Y. J. Bababekov2, S. M. Stapleton2, I. H. Marks2, K. D. Lillemoe1, D. C. Chang2, P. A. Vagefi1  1Massachusetts General Hospital,Department Of Surgery,Boston, MA, USA 2Massachusetts General Hospital,Codman Center For Clinical Effectiveness In Surgery,Boston, MA, USA

Introduction:  There have been considerable improvements in surgical technique and perioperative care in the last decade with respect to hepatic resection for hepatobiliary diseases.  As a result, decreased post-operative mortality has been described at the institutional level. However, inpatient mortality trends following hepatic resection have yet to be assessed on a population level.

Methods:
The New York (NY) Statewide Planning and Research Cooperative System (SPARCS) inpatient database was utilized. All patients over the age of 18 years who underwent wedge hepatectomy or lobectomy from 2000-2014 were included. Trauma and recipient hepatectomy were excluded. Adjusted analysis accounted for age, race, payer status, Charlson Comorbidity Index (CCI), cirrhosis, viral/alcoholic hepatitis, hepatic malignancy (primary vs. secondary tumor), need for biliary-enteric reconstruction, and hospital hepatectomy volume.

Results:

A total of 13,467 hepatectomies were performed from 2000-2014 in the state of NY with a mean inpatient mortality of 2.35% (± 15.1% SD). Of these, 86.6% of hepatectomies were performed at academic centers (hospitals with a surgical residency). Inpatient mortality decreased from a rate of 3.69% in 2000 to 1.98% in 2014 (p<0.0001). Adjusted analysis demonstrated a decreasing trend in mortality from 2000 to 2014 with sustained significance reached in 2009 (OR 0.29, p=0.001) (Figure 1).

Subset analysis revealed similar findings for patients in academic centers, with secondary tumors, or with CCI>3 (all p<0.001). Independent predictors of mortality included age>70 years, male gender, Medicare payer status, primary liver tumor, and need for biliary-enteric reconstruction. Hepatectomy at an academic center (OR 0.62, p=0.002) and female gender (OR 0.67, p=0.001) were protective against mortality.

Conclusion:
This study demonstrates at the state population level that inpatient mortality after hepatectomy improved over the period 2000-2014. Increased survival may be due to a combination of advancements in operative and perioperative care. In-depth analyses of surgical care at hospitals in NY may reveal statewide quality improvement practices that led to reduced inpatient mortality after hepatic resections. Such measures could serve as a model for other health systems.

66.01 Bariatric Surgery in the Land of the Long White Cloud (Aotearoa/New Zealand)

A. D. MacCormick1  1University Of Auckland,Surgery,Auckland, AUCKLAND, New Zealand

Introduction:
New Zealand has some of the highest obesity rates in the world, with indigenous and Pacific populations most significantly affected.  Counties Manukau Health (CMH) provides government-funded healthcare to South Auckland, a young, ethnically diverse and deprived population. CMH Bariatric Service has performed approximately 150 bariatric procedures since 2007, the vast majority being sleeve gastrectomy, and prioritizes the optimization of patient care.

Methods:
We performed a systematic review of the literature to determine the components of a bariatric enhanced recovery after surgery (ERAS) program. Subsequently, we performed one of the first randomized controlled trials of ERAS versus standard care in patients undergoing laparoscopic sleeve gastrectomy. Review of our five-year outcomes indicated a propensity for weight regain at 18 months. We therefore conducted a systematic review to identify reasons why this may occur and to determine a standard definition of weight regain.

Results:

ERAS programs are well recognized as providing important improvements in post-operative outcomes in colorectal surgery. The effects of ERAS following bariatric surgery, however, were until recently unknown.  A randomized trial conducted at CMH compared ERAS to standard care and found a significant reduction in length of stay and cost in the ERAS group with no increase in complication or admission rates.

 

Medium- and long-term outcomes following sleeve gastrectomy have only been reported in recent years.  Review of the five-year outcomes at CMH revealed excess weight loss (%EWL) of 60% at 18 months and 40% at five years, indicating a significant trend towards weight regain.  Furthermore, the onset of weight regain was noted to occur at the time patients were discharged from the CMH Bariatric Service, suggesting that a lack of follow-up support may be associated with weight regain.

 

Systematic review of weight regain specifically following sleeve gastrectomy identified a lack of follow-up support as a potential contributor to weight regain.  To investigate this further, focus group discussions were conducted with almost 40 CMH sleeve gastrectomy patients who had experienced weight regain.  Participants likewise identified a lack of follow-up support as a contributing factor to weight regain and expressed a desire for more long-term support.  Based on these findings, a one-year text-message support intervention was designed and is currently being evaluated in a randomized trial.

Conclusion:
Although only a relatively small center at the bottom of the world, the CMH Bariatric Service is committed to optimizing post-operative outcomes.  From this body of work, contributions have been made to consensus guidelines for bariatric ERAS programs as well as to the topical issue of weight regain.

48.20 Donor Biliary Anatomy Should Not Be a Contraindication for Right Liver Donation

K. S. Chok1, A. C. Chan1, J. W. Dai1, J. Y. Fung2, T. Cheung1, S. Sin1, T. Wong1, K. Ma1, C. Lo1  1The University Of Hong Kong,Division Of HBP Surgery And Liver Transplantation, Department Of Surgery,Hong Kong, NA, Hong Kong 2The University Of Hong Kong,Division Of Gastroenterology And Hepatology, Department Of Medicine,Hong Kong, NA, Hong Kong

Introduction: There are scarce data on the impact of donor biliary anatomy on the incidence of biliary complications in donors and in recipients after liver transplantation. This study sought to establish the relationship between donor biliary anatomy and the incidence of biliary complications after right-lobe living donor liver transplantation (RLDLT), and to determine whether donor biliary anatomy should be a contraindication to right liver donation.

Patients and Methods: A retrospective study was performed on all adult recipients of RLDLT at our center from January 2011 to December 2014. Recipients were divided into the Stricture group and the Non-stricture group according to whether biliary anastomotic stricture developed. Donor biliary anatomy was classified according to Huang’s classification, and all cholangiograms were reviewed by the first author.

Results: There were 125 RLDLTs performed during the study period. Twenty-six recipients had biliary anastomotic stricture (the Stricture group). One donor in the Stricture group and one in the Non-stricture group had biliary stricture; both of them had type-A anatomy. Bile leakage was not seen in any donor or recipient. The most common donor biliary anatomy was type A (96/125; 76.8%), followed by type B (13/125; 10.4%) and type D (10/125; 8%). Univariate analysis found no correlation between type of donor biliary anatomy and incidence of biliary anastomotic stricture in recipients (p=0.49).

Conclusions: The type of donor biliary anatomy was not related to the incidence of recipient or donor biliary complications, and therefore donor biliary anatomy should not be a contraindication to right liver donation.

48.19 The True Cost of Recurrent Hepatitis C

J. Campsen1, H. Thiesset1, R. Kim1  1University Of Utah,Transplantation And Advanced Hepatobiliary Surgery,Salt Lake City, UT, USA

Introduction:
It is commonly recognized that recurrence of hepatitis C after transplantation is universal and immediate (1). The disease process within the liver is accelerated after transplantation, and approximately 30% of transplant recipients develop cirrhosis within five years (2-4). Changes in antiviral treatment before transplantation show promising results but have yet to eliminate the threat to the newly implanted organ. Furthermore, recurrent hepatitis C after liver transplantation has a large financial impact on patients and the healthcare system. A national estimate for liver transplantation and first-year costs ranges from $300,000-$500,000 without significant complications such as recurrent hepatitis C (5). We aimed to examine the financial impact of recurrent hepatitis C after transplantation and to report one center’s experience.

Methods:
After approval from the University of Utah Institutional Review Board and as part of a clinical trial (6), patients underwent standard-of-care antiviral therapy with sofosbuvir and ribavirin before liver transplantation. Patients 1 and 2 had non-detectable levels of hepatitis C before inclusion in the clinical trial. The first patient was randomized to control, and the second was randomized to 300 mg of Civacir® Polyclonal Immune Globulin (IgG) and received 16 doses.

Results:
Patient 1, who was randomized to control, had recurrent hepatitis C detected on day six after liver transplantation. This patient had significant complications after surgery due to the recurrent hepatitis C, including multiple hospitalizations at an estimated cost of $140,408. Furthermore, this patient required subsequent re-treatment with antiviral therapy at a national average cost of $169,000 (7) (Table) for 24 weeks of treatment. The patient who received Civacir® did not have recurrent hepatitis C within the first six months of follow-up and study completion.

Conclusion:
Two patients with similar standard-of-care protocols were enrolled in a clinical trial to prevent recurrent hepatitis C. In our experience, Civacir® proved effective in preventing recurrent hepatitis C in the transplanted liver and thereby prevented the subsequent costs associated with treatment and patient care. Civacir® could potentially eliminate the need for re-transplantation and save the healthcare system hundreds of thousands of dollars.

 

48.18 Postoperative Complications Associated With Parathyroid Autotransplantation After Thyroidectomy

Z. F. Khan1, G. A. Rubio1, A. R. Marcadis1, T. M. Vaghaiwalla1, J. C. Farra1, J. I. Lew1  1University Of Miami Miller School Of Medicine,Division Of Endocrine Surgery, DeWitt Daughtry Family Department Of Surgery,Miami, FL, USA

Introduction: Permanent hypoparathyroidism is a well-recognized complication of total thyroidectomy that may acutely manifest postoperatively with muscle spasms/tetany, paresthesias, and seizures. An established procedure, parathyroid autotransplantation (PAT) can successfully prevent permanent hypoparathyroidism due to inadvertently resected or devascularized parathyroid tissue. This study examines the independent patient characteristics and postoperative complications associated with those undergoing PAT after total thyroidectomy.

Methods: A retrospective cross-sectional analysis was performed using the Nationwide Inpatient Sample (NIS) database from 2006-2011 to identify surgical patients hospitalized for total thyroidectomy who did or did not undergo PAT. Characteristics including co-morbidities and postoperative complications were measured. Univariate and logistic regression analyses were conducted to identify characteristics that were independently associated with patients who underwent PAT. Data were analyzed using two-tailed chi-square and t-tests.

Results: Of 219,584 admitted patients who had total thyroidectomy, 14,521 (6.7%) also underwent PAT. Patients in the PAT group had fewer comorbidities including DM, HTN, CHF, and chronic lung disease (12.5% vs 15.1%, 37.1% vs 39.9%, 1.5% vs 2.1%, 11% vs 12%, respectively, p<0.01) and fewer cardiac complications including stroke and MI (0% vs 0.2% and 0.1% vs 0.2%, respectively, p<0.01). However, the autotransplanted group had higher rates of renal failure (2.7% vs 2.1%, p<0.01) and thyroid malignancy (55.4% vs 43.1%, p<0.01) compared to those not autotransplanted. The PAT group also had a higher incidence of wound complications including SSI and seroma (2.6% vs 2.1%; 0.2% vs 0.1%; 0.2% vs 0.1%, p<0.01, respectively), unilateral vocal cord paralysis (2.4% vs 1.6%, p<0.01), substernal thyroidectomy (8.7% vs 7.5%, p<0.01), and in-hospital death (1.6% vs 0.3%, p<0.01). Immediate hypoparathyroidism (3.2% vs 1.3%, p<0.01), hypocalcemia (15% vs 8.6%, p<0.01), and tetany (0.3% vs 0.1%, p<0.01) were all associated with PAT patients as well. On multivariate analysis, renal failure (OR 2.246; 95% CI 1.448-3.485) and elective procedures (OR 1.744; 95% CI 1.422-2.129) were associated with increased odds of undergoing PAT during hospitalization for total thyroidectomy.

Conclusion: Although a known preventative measure for permanent hypoparathyroidism, PAT is associated with higher rates of postoperative complications. Patients with fewer comorbidities who undergo PAT experience higher rates of wound complications, hypoparathyroidism, hypocalcemia, and tetany. The acute severity of postoperative hypoparathyroidism may further contribute to the higher rate of in-hospital death in PAT patients. PAT should not be routinely performed and should be utilized only in select patients with suspected compromised parathyroid function after total thyroidectomy.

 

48.17 Subcutaneous Granular Cell Tumor: Analysis of 19 Cases Treated at a Dedicated Cancer Center

A. S. Moten3, S. Movva2, M. Von Mehren2, N. F. Esnaola1, S. Reddy1, J. M. Farma1  1Fox Chase Cancer Center,Department Of Surgery,Philadelphia, PA, USA 2Fox Chase Cancer Center,Department Of Hematology/Oncology,Philadelphia, PA, USA 3Temple University Hospital,Department Of Surgery,Philadelphia, PA, USA

Introduction:  Granular cell tumors (GCT) are rare lesions that can occur in almost any location in the body, and there have been no large-scale studies regarding GCT located in the subcutaneous tissue.   The aim of this study was to define patient characteristics, treatment patterns and outcomes of patients with subcutaneous GCT.

Methods:  A retrospective chart review was performed of patients with subcutaneous GCT treated at a dedicated cancer center.  Descriptive statistics were obtained, bivariate and multivariate regression performed, and survival rates calculated using Stata software.  

Results: A total of 19 patients were treated for subcutaneous GCT at our institution between 1992 and 2015, 79% female and 63.2% white.  Mean age was 48.2 years.  Most (68.4%) had comorbidities, and some (31.6%) had a history of cancer.  Mean tumor size was 2.37cm.   Most patients underwent primary excision of their tumors without undergoing prior biopsy (73.7%).  Men were more likely to undergo re-excision for positive margins than women (75.0% versus 13.3%, respectively, p-value 0.01).   No patient received adjuvant therapy.  Three patients (15.8%) had multifocal tumors, and they were significantly more likely to experience recurrence than patients with solitary tumors (33.3% versus 6.25%, respectively, p-value 0.02).  Patients with multifocal tumors were also more likely to undergo repeat surgery (33.0% versus 0%, respectively, p-value 0.02).  A total of 2 patients (10.5%) experienced recurrence, with a median time to recurrence of 23.5 months.  Overall cancer-specific 5-year survival was 88.0%.   There was no increased risk of death based on gender, race or recurrence status.

Conclusion: Patients with subcutaneous GCT treated with excision fare well without adjuvant treatment.  However, patients with multifocal tumors are more likely to experience recurrence and to require repeat surgery.

 

48.16 Effect of Hospital Safety Net Status on Treatment and Outcomes in Hepatocellular Carcinoma

A. A. Mokdad1, A. G. Singal2, J. A. Marrero2, A. C. Yopp1  1University Of Texas Southwestern Medical Center,Surgery,Dallas, TX, USA 2University Of Texas Southwestern Medical Center,Internal Medicine,Dallas, TX, USA

Introduction:  Safety net hospitals play an integral role in the care of “vulnerable” patients with cancer. Following the institution of the Affordable Care Act (ACA), the fate of safety net hospitals is unclear. Hepatocellular carcinoma (HCC) is a leading cause of cancer deaths and the fastest growing cancer in the United States. The role of safety net hospitals in the management of this resource-intensive cancer has not been investigated. This study explores the presentation, treatment, and outcomes of patients with HCC at safety net hospitals in an effort to guide resource allocation within an evolving healthcare platform.

Methods:  A total of 17,551 patients with HCC were identified in the Texas Cancer Registry between 2001 and 2012. Hospitals in the highest quartile of the disproportionate share hospital index were classified as safety net hospitals. Patient demographics, tumor presentation, treatment, and overall survival were compared among patients managed at safety net hospital(s), non-safety net hospital(s), or both. Risk-adjusted treatment utilization and overall survival were examined using multivariable analysis. The proportion of patients presenting at safety net hospitals over time was explored using time trend analysis. Transfer patterns between safety net and non-safety net hospitals were examined.

Results: A total of 328 short-term acute care hospitals were identified, of which 74 (23%) were designated safety net. Safety net hospitals were more likely to be teaching hospitals compared to non-safety net hospitals; oncology and radiology resources were comparable. Forty-three percent of HCC patients sought care at a safety net hospital (33% exclusively at safety net hospital(s) and 10% at both safety net and non-safety net hospitals). The proportion of HCC patients presenting at safety net hospitals did not significantly change over the study period. Patients at safety net hospitals were mostly Hispanic (58%) and poor (61%). Tumor stage was comparable between hospital categories. Overall treatment utilization was lower at safety net hospitals (adjusted odds ratio [OR]=0.85, 95% confidence interval [CI]=0.78-0.92), which was largely related to lower chemotherapy use (26% vs. 34%, P < 0.01). Overall survival was comparable (adjusted hazard ratio [HR]=1.03, 95% CI=0.99-1.08). In patients managed at both hospital groups, diagnosis and management of disease recurrence/persistence were more common at non-safety net hospitals, while first-course treatment of HCC was more common at safety net hospitals.

Conclusion: Almost one in two patients with HCC seek care at safety net hospitals. While the fate of safety net hospitals remains uncertain under the ACA, monitoring the redistribution of HCC patients and anticipating resource allocation will be key in an evolving healthcare platform.
 

48.15 Peritoneal Dialysis is Feasible as a Bridge to Simultaneous Liver-Kidney Transplant

R. Jones1, R. Saxena2, C. Hwang1, M. MacConmara1  1University Of Texas Southwestern Medical Center,Department Of Surgery, Division Of Surgical Transplantation,Dallas, TX, USA 2University Of Texas Southwestern Medical Center,Department of Internal Medicine, Division Of Nephrology,Dallas, TX, USA

Introduction: Combined chronic liver and kidney failure is a life-threatening condition that requires complex multiorgan support given complications related to fluid shifts, bleeding and infection. Simultaneous liver-kidney transplant (SLKT) is the best therapeutic option, but scarcity of organs leaves the majority of patients on dialysis for months or years while awaiting SLKT. Hemodialysis (HD) exacerbates pre-existing intravascular instability. Peritoneal dialysis (PD) is an alternative strategy which causes less hemodynamic instability and may assist in the management of large volume ascites. However, PD has been avoided in cirrhotics due to concerns regarding elevated risk of bacterial peritonitis, treatment failure or impairment of transplant candidacy. We describe our outcomes using PD in a group of 12 patients with combined liver and kidney failure, demonstrating that PD is a preferential bridging option in patients awaiting SLKT. 

Methods: Patients with advanced liver and kidney failure who were initiated on PD between January 2006 and December 2015 were identified by review of our institution’s medical record.  The hospital electronic medical record and dialysis center records were used to retrieve demographic and clinical data.  Outcomes included mortality, complications of catheter insertion, dialysis treatments and need for large volume paracentesis.

Results: Twelve patients with combined liver and kidney failure were initiated on PD during this period. Ten of the 12 patients were male, with an average age of 56 years. No deaths had occurred by the completion of the study, with a mean follow-up of 4.5 years. With an average MELD score of 21, the expected three-month mortality would be 19.6 percent. These patients accrued a total of 480 months of peritoneal dialysis.  Three patients received SLKT, while the 9 remaining patients continue on PD. Within this group of 9, 4 are actively listed for SLKT. There was 1 operative complication.  The rate of peritonitis was 1 episode every 44 months of PD. The need for large volume paracentesis was entirely eliminated given the ability to drain fluid daily. After initiating PD, patients were hospitalized a mean of 4 times and a median of 2 times (one patient was admitted 20 times for recurrent gastrointestinal bleeding).

Conclusions: Our data suggest that PD is a useful option for the management of combined liver and kidney failure and can provide a durable treatment for selected patients.  Peritoneal dialysis can bridge patients to transplant or serve as a long-term treatment option for patients not suitable for transplantation. We promote PD as first-line renal replacement therapy in patients with combined liver-kidney failure given its excellent outcomes and usefulness in managing ascites.

48.14 Warming During Implantation: An Overlooked Opportunity for Improvement in Kidney Transplantation?

Y. GoldenMerry1, H. Piristine1, P. Prabhakar1, J. Parekh1, C. Hwang1, M. MacConmara1  1UT Southwestern Medical Center,Dallas, TX, USA

Introduction:
Many factors impact the outcomes of transplanted kidney allografts.  There has been renewed interest in studying the effect of warm ischemia time during implantation on allograft outcomes.  Longer anastomotic time leads to warming of the allograft, and an organ temperature above 15 degrees Celsius at reperfusion has been shown to increase the risk of delayed graft function (DGF).  DGF is a risk factor for allograft loss and has been associated with a reduction in five-year allograft survival of up to 50%. We sought to investigate the perceived importance of anastomotic time amongst practicing kidney transplant surgeons and attitudes toward the potential need for improvement in this component of the transplant process.
 

Methods:
Transplant surgeons were invited to complete an anonymous electronic questionnaire on their kidney transplant operative practices. Self-reported data on cold and warm ischemia time, percent of organ imports, total length of operation, percent of complex anastomotic procedures and anastomosis time were gathered. Opinions regarding the effect of warm ischemia time on DGF, current methods to combat organ warming, and receptiveness to new technology were also collected.
 

Results:
Surgeons at seven transplant centers across the US completed the survey.  Average cold ischemia time (time from cross-clamp until the organ was taken off ice) at centers was 13.3 ± 3.7 hours, import kidneys accounted for 12 ± 7% of transplants, and 26 ± 25% of all kidneys were placed on pulsatile perfusion prior to transplantation.  The average operative time was 174 ± 37 minutes, with anastomoses taking 30 ± 5 minutes.  Surgeons perceived that warm time exceeded their goal in 25 ± 21% of anastomoses. Seventy percent of surgeons agreed that warming during implantation contributes to negative graft outcomes (DGF); however, 30% of respondents did not employ any cooling, and the remainder used an icy wrap or tried to irrigate with cold slush while performing the anastomosis.  Eighty percent of surgeons indicated they would utilize a specific cooling device if available.

 

Conclusions:
Longer anastomotic time has been demonstrated to negatively affect kidney allograft function. Our data show that surgeons recognize the negative impact of warming during anastomosis, especially in complex cases.  Most surgeons do not use a specific strategy to keep the organ cool and would be very receptive to a dedicated cooling device.

 

48.13 ABO Incompatible Renal Transplant Reduces Waitlist Times: Analysis of the UNOS Database (1995-2015)

C. S. Lau1,2, K. Malik2, S. Mulgaonkar3, R. S. Chamberlain1,2,4  1Saint Barnabas Medical Center,Surgery,Livingston, NJ, USA 2St. George’s University School Of Medicine,St. George’s, St. George’s, Grenada 3Saint Barnabas Medical Center,Medicine,Livingston, NJ, USA 4Rutgers University,Surgery,Newark, NJ, USA

Introduction:   Renal transplants significantly improve quality of life for patients with end-stage renal disease (ESRD) who would otherwise rely on lifetime dialysis.  However, the demand for organs exceeds the number of available organs, increasing the need for ABO-incompatible (ABOi) renal transplantation.  Current knowledge regarding the clinical outcomes of ABOi transplantation is limited and derived mainly from case reports and small cohort studies. This study examines a large cohort of kidney transplant patients undergoing ABOi and ABO-compatible (ABOc) transplants, in an effort to identify the demographic and clinical factors associated with graft survival outcomes and transplant waitlist times.

Methods:   Demographic and clinical data for 102,084 patients undergoing renal transplant were abstracted from the United Network for Organ Sharing (UNOS) database (1995-2015).  Patients were grouped into ABOc (N=101,237) and ABOi (N=847) renal transplants.  Endpoints examined included waitlist times and graft survival time.  Standard statistical methodology was used. 

Results:  A total of 102,084 patients received a renal transplant, of whom 847 (0.83%) received ABOi transplants and 101,237 (99.17%) received ABOc transplants.  The mean age of transplant recipients was similar for ABOc (49.86 ± 15.8 years) and ABOi (50.6 ± 14.2 years) transplants.  Although there were more male transplant recipients (ABOc: 64.0% and ABOi: 61.0%) than female recipients, a similar male-to-female ratio was observed among both ABOc and ABOi transplants (1.78:1 and 1.57:1, p>0.05). While a majority of ABOc transplants were from cadaveric donors (66.4% vs. 34.7% living donors, p<0.01), a significantly greater proportion of ABOi transplants were from living donors (65.3% vs. 34.7% cadaveric, p<0.01). The mean waitlist time to transplant was significantly shorter for ABOi transplants compared to ABOc transplants (585.8 vs. 739.3 days, p<0.01). Graft survival time remained similar between ABOi and ABOc transplants (770.0 vs. 821.5 days, p=0.197).

Conclusions:  Advances in kidney transplantation have significantly improved the prognosis of patients with ESRD.  In comparison to ABOc, ABOi renal transplant significantly shortens waitlist times, while maintaining similar graft survival times. Where adequate immunosuppression is available, ABOi renal transplants should be considered when ABOc transplants are not available. Further studies comparing the safety and efficacy of ABOi transplants, including long-term follow-up and required immunosuppression, are required.

48.12 Pancreas Retransplantation Is Risky For Patients With A History Of Transplant Pancreatectomy

M. Barrett1, Y. Lu2, D. M. Cibrik2, R. S. Sung1, K. J. Woodside1  1University Of Michigan,General Surgery,Ann Arbor, MI, USA 2University Of Michigan,Internal Medicine,Ann Arbor, MI, USA

Introduction:   Despite improvements in pancreas transplant outcomes, a small but significant subset of patients experience catastrophic graft failure, often due to allograft thrombosis, necessitating transplant pancreatectomy. It is unclear how this subset of patients fares when retransplanted.  We sought to review our institution’s experience with second pancreas transplant after previous transplant pancreatectomy.

Methods: Patient encounters in which transplant pancreatectomies were performed were identified using associated billing codes.  Chart review of these encounters through both the Organ Transplant Information System and the hospital EMR system was used to collect demographic and outcomes data.  Further investigation of discharge paperwork, clinic notes, and outside records was performed on patients who underwent second pancreas transplant to analyze allograft function.

Results: Between January 1990 and July 2016, 402 pancreas transplants were performed—293 simultaneous kidney-pancreas (KP) transplants, 99 pancreas-after-kidney (PAK) transplants, and 10 pancreas transplants alone (PTA).  Amongst this cohort, 87 pancreatectomies were performed in 78 patients. Of these, 15 (19%) patients underwent a second pancreas transplant after transplant pancreatectomy. The study population consisted of 5 women and 10 men. Median age at initial pancreas transplant was 37 years (range 27 – 57 years); 8 patients initially underwent PAK transplant, 6 underwent a simultaneous KP transplant, and 1 underwent PTA.  The indication for initial pancreatectomy was thrombosis in 12 patients, all of whom had their graft removed within one month of transplant (median 1 day, range 0 – 31 days).  Another 3 patients developed an intraabdominal infection requiring pancreas allograft explantation (median 26 months, range 6 – 108 months).

Median time from pancreatectomy to second transplant was 18 months (range 7-94 months). For the second pancreas transplants, one patient underwent KP transplant, while all others underwent pancreas-only transplant. Median time from second transplant to last documented follow-up was 10 years (36 days – 19 years).   Four pancreas allografts are still functioning—3 of which have been functional for 10 years.  Of the remaining patients, 7 required transplant pancreatectomy for allograft thrombosis in the immediate postoperative period (4 at 1 day, with the rest within a week).  Four others failed after the perioperative period (at 1, 2, 2, and 6 years, respectively). The 8 patients with graft function for a year or more were younger at second transplant (median 38 vs. 45 years old), although this difference was not statistically significant.

Conclusion: Pancreas retransplantation after previous transplant pancreatectomy is feasible, although it is associated with a high initial failure rate, suggesting that these patients require additional consideration before retransplantation—over and above that given to patients with intact failed pancreas allografts.

48.11 Surgical Techniques of Concomitant Coronary Artery Bypass Grafting and Lung Transplantation

M. Hamms1, M. A. Kashem2, B. O’Murchu4, R. Bashir4, J. Gomez-Abraham2, S. Keshavamurthy2, E. Leotta2, T. Yoshizumi2, K. Shenoy3, A. J. Mamary3, G. Criner3, F. Cordova3, Y. Toyoda2,3  1Temple University Hospital,Philadelphia, PA, USA 2Temple University,Cardiovascular Surgery,Philadelphia, PA, USA 3Temple University,Division Of Thoracic And Pulmonary Medicine,Philadelphia, PA, USA 4Temple University,Section Of Cardiology,Philadelphia, PA, USA

Introduction: Significant coronary artery disease is a relative contraindication for lung transplantation. However, recent single center studies suggest concomitant coronary artery bypass grafting (CABG) can be performed at the time of lung transplantation. The purpose of this study was to show our excellent outcomes with these concomitant procedures, and to describe our surgical techniques.

Methods: A retrospective review of 240 consecutive lung transplants performed from March 2012 to August 2016 was conducted. Lung transplantation with CABG (n=17) and without CABG (n=223) was compared for statistical significance using SAS software.

Results:

Recipient age was significantly higher in lung transplantation with CABG than without CABG (66 ± 5 years, range 52-74, vs. 62 ± 10 years, range 21-78; p=0.009), whereas the lung allocation score (60 ± 21 vs. 53 ± 21) and donor age (35 ± 10 vs. 33 ± 11 years) were similar.

All CABGs (1-3 bypass grafts) were performed on a beating heart without cardioplegic cardiac arrest: off pump (n=7), on cardiopulmonary bypass (n=7), or on veno-arterial extracorporeal membrane oxygenation (n=3). On-pump vs. off-pump technique was determined based on the need to safely perform the lung transplant portion of the procedure.

Surgical approaches were determined based on the exposure required for the lung and coronary arteries, and consisted of median sternotomy (n=7), anterior thoracotomy (n=7), and clamshell incision (n=3).

When the left anterior descending coronary artery required revascularization, the left internal mammary artery (LIMA) was used in 92% (11 out of 12 patients). The LIMA was harvested through median sternotomy (n=6) or left anterior thoracotomy (n=5).

When the saphenous vein grafts were used (n=15), the inflow was the ascending aorta (n=12), the descending aorta (n=2) and the LIMA (n=1).

The median hospital stay was similar for lung transplantation with CABG (18 days) and without CABG (18 days).

Two patients died after concomitant lung transplantation and CABG, on postoperative days 414 and 642, both due to infection, resulting in 1-year and 3-year survival rates of 100% and 80%, respectively, compared with 85% and 76% for lung transplantation without CABG (p=0.397).

Conclusion: Excellent outcomes can be achieved in lung transplantation with concomitant CABG through carefully conducted surgical strategies, including the choice of off-pump vs. on-pump technique, a variety of surgical approaches, and the choice of conduits.
 

48.10 Hospital Readmissions Following Discharge After Orthotopic Liver Transplantation (OLT)

E. J. Minja1, T. L. Pruett1  1University Of Minnesota,Division Of Solid Organ Transplantation,Minneapolis, MN, USA

Introduction:

Preventing early hospital readmissions is key to reducing medical care costs.  Our objective was to determine the incidence and causes of readmissions within 30 days of discharge following orthotopic liver transplantation (OLT).

Methods:

A total of 1028 patients underwent OLT between 1/1/1997 and 12/31/2014 at our institution. Electronic medical records were reviewed after IRB approval. Causes of readmissions were analyzed. Patients ≤ 18 years of age were excluded from analysis. Student’s t-test was used to compare groups. A value of p < 0.05 was considered significant.

Results:

Between 1/1/1997 and 12/31/2014, 1028 OLTs were performed, of which 155 (15.1%) were from living donors (LD) and 873 (84.9%) from deceased donors (DD). 931 patients (90.7%) underwent liver-alone transplants and 96 patients (9.3%) had simultaneous liver and kidney transplants [Table 1].

A total of 473 patients (46%) were readmitted within 30 days of discharge. Complete data for analysis were available for 225 patients who received OLTs between 9/2004 and 12/2014. Of this pool of readmitted patients, 188 (83.6%) had received DD OLTs vs. 37 (16.4%) who had received LD OLTs.

The most common cause of hospital readmission following OLT was biliary complications. Of the LD OLT recipients readmitted within 30 days of discharge, 43% had biliary complications compared to 13% of DD recipients (p<0.05). The most common biliary complications amongst LD OLT recipients were bile leaks (50%) and bilomas (31%) [Figures 1-3].

Conclusion:
46% (473/1028) of our patients were readmitted within 30 days of discharge after OLT, representing a significant health care burden. Our data suggest the need to develop predictive models for readmission following OLT and perhaps the need to change our surgical approaches, with the goal of reducing preventable hospital readmissions.