17.19 Risk Factors Necessitating Early Ophthalmologic Evaluation of Facial Burns

S. J. Day1, A. Patel1, D. E. Bell1  1University Of Rochester,School Of Medicine And Dentistry,Rochester, NY, USA

Introduction:  Facial burns with ocular involvement often lead to significant morbidity. This study aimed to identify risk factors for short- and long-term ophthalmic complications in facial burn patients. Once validated, these risk factors could help identify patients who require ophthalmologic evaluation and prompt intervention.

Methods:  Retrospective case review was conducted of facial burn patients presenting to an American Burn Association-verified regional burn center from June 2007 to May 2016. Demographic, injury-related, and hospitalization-related variables were assessed for correlation with short- and long-term ophthalmic complications. Short-term complications included visual loss on presentation, lagophthalmos, ectropion, chemosis, ocular hypertension, chalasis, conjunctival necrosis, and orbital or periorbital infection. Long-term complications included lagophthalmos, cicatricial ectropion, exposure keratopathy, scleritis, and corneal stem cell deficiency. 

Results: From June 2007 to May 2016, 1126 facial burn patients presented to a regional burn center’s inpatient and outpatient settings. One hundred thirty-seven (12.2%) had associated periorbital and orbital injuries. Of the patients with ocular burns, 66.4% were male and 75.9% were Caucasian. Average total body surface area (TBSA) burned was 9.72% (range, 0.02 – 75.13%), and average facial surface area burned was 1.64% (range, 0.02 – 6.65%). One hundred twenty patients (87.6%) received an ophthalmologic consult. Sixty patients (43.8%) developed short-term ophthalmic complications, with the most common being chemosis (n = 36, 26.3%). Eight patients (5.8%) developed long-term complications, with the most common being lagophthalmos (n = 4, 2.9%). Two flash burn patients (1.5%) developed cicatricial ectropion and underwent full-thickness skin grafts to the eyelids. One scald burn caused a localized corneal stem cell deficiency, which led to chronic keratitis requiring long-term steroid treatment.

 

Statistically significant risk factors (p < 0.05) for both short- and long-term ophthalmic complications included inhalation injury, higher percentage of body with 3rd degree burns, and presence of corneal injury.

 

Presence of short-term complications was significantly associated (p < 0.05) with advanced age, higher TBSA burns, and higher percentage of body with 2nd degree burns. Significant associations (p < 0.05) with the development of long-term complications included active smoker status, 3rd degree eyelid burns, periorbital edema, need for intubation, longer duration of mechanical ventilation, visual loss on presentation, chemosis, lagophthalmos, ectropion, bloodstream infection, and longer length of hospitalization. 

Conclusion: Providers should obtain early ophthalmologic evaluation for facial burn patients who present with advanced age, active smoking status, inhalation injury, 2nd and 3rd degree burns, or need for intubation. 
 

17.18 Prognostic effect of lymph node ratio after curative-intent resection for distal cholangiocarcinoma

A. Y. Son1, R. Shenoy1, C. G. Ethun2, G. Poultsides3, K. Idrees4, R. C. Fields5, S. M. Weber6, R. C. Martin7, P. Shen11, C. Schmidt9, S. K. Maithel2, T. M. Pawlik10, M. Melis1, E. Newman1, I. Hatzaras1  1New York University School Of Medicine,Surgery,New York, NY, USA 2Emory University School Of Medicine,Surgery,Atlanta, GA, USA 3Stanford University,Surgery,Palo Alto, CA, USA 4Vanderbilt University Medical Center,Surgery,Nashville, TN, USA 5Washington University,Surgery,St. Louis, MO, USA 6University Of Wisconsin,Surgery,Madison, WI, USA 7University Of Louisville,Surgery,Louisville, KY, USA 9Ohio State University,Surgery,Columbus, OH, USA 10Johns Hopkins University School Of Medicine,Surgery,Baltimore, MD, USA 11Wake Forest University School Of Medicine,Surgery,Winston-Salem, NC, USA

Introduction:
The ratio of metastatic to total harvested lymph nodes (LNR) is an important prognostic factor following resection of gastrointestinal malignancies. We assessed the prognostic value of LNR in patients undergoing resection for distal cholangiocarcinoma (DCC).

Methods:
Patients who underwent curative-intent resection of DCC in 10 institutions of the US Extrahepatic Biliary Malignancy Collaborative were included. Descriptive statistics were used to summarize demographic characteristics. Multivariate proportional hazards regression was used to identify factors associated with recurrence-free survival (RFS) and overall survival (OS).
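
For illustration only, the sketch below shows how a multivariable proportional hazards model of this kind could be fit with the Python lifelines package; the file name and column names are hypothetical placeholders, not the collaborative's actual variables.

```python
# Sketch of a multivariable Cox proportional hazards model for recurrence-free
# survival; the data file and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("dcc_cohort.csv")  # hypothetical analytic file

cols = [
    "rfs_months",               # time to recurrence or censoring
    "recurrence",               # 1 = recurred, 0 = censored
    "high_lnr",                 # 1 = high lymph node ratio
    "margin_positive",
    "poor_differentiation",
    "lymphovascular_invasion",
    "perineural_invasion",
    "age",
]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="rfs_months", event_col="recurrence")
cph.print_summary()  # hazard ratios, 95% CIs, and p-values for each covariate
```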

Results:
A total of 265 patients were included (median age 67 years; 63.4% male) and stratified into low-, moderate-, and high-LNR groups (199 with low LNR). The high-LNR group was less likely to have undergone a Whipple procedure (85.4% vs. 82.9% vs. 60.0%, p<0.01) and had a higher proportion of margin-positive resection (19.6% vs. 19.5% vs. 45.8%, p<0.05), poor differentiation (26.2% vs. 36.6% vs. 52.2%, p<0.05), lymphovascular invasion (44.3% vs. 74.3% vs. 88.2%, p<0.001), and perineural invasion (81.0% vs. 69.2% vs. 91.3%, p>0.05). Multivariate analysis showed high LNR to be an independent predictor of poor RFS (HR 4.6, 95%CI 1.8-11.8, p=0.001) and OS (HR 2.2, 95%CI 1.0-4.6, p<0.05) (Table 1). Rates of adjuvant chemoradiation in the low-moderate-LNR and high-LNR groups were 61.9% and 82.6%, respectively (p=0.07). Nevertheless, stratification by LNR showed no improvement in RFS or OS with adjuvant chemoradiation in either group.

Conclusion:
LNR can be used as a prognostic factor for recurrence and survival in patients undergoing curative-intent resection for DCC. Every effort should be made to perform an oncologic resection, with negative margins and adequate lymph node harvest, as adjuvant chemoradiation does not appear to provide LNR-specific improvements in long-term prognosis.
 

17.17 Cluster Analysis of Acute Type A Aortic Dissection Results in a Modified DeBakey Classification

J. L. Philip1, B. Rademacher1, C. B. Goodavish2, P. D. DiMusto3, N. C. De Oliveira2, P. C. Tang2  1University Of Wisconsin,General Surgery,Madison, WI, USA 2University Of Wisconsin,Cardiothoracic Surgery,Madison, WI, USA 3University Of Wisconsin,Vascular Surgery,Madison, WI, USA

Introduction: Acute type A aortic dissection is a surgical emergency. Traditional classification systems group dissections in ways that reflect an evolving therapeutic approach. Using statistical groupings based on dissection morphology, we examined the impact of morphology on presentation and clinical outcomes.

Methods: We retrospectively reviewed 108 patients who underwent acute type A dissection repair from 2000-2016. Dissection morphology was characterized using 3-dimensional reconstructions of computed tomography scan images based on the true lumen area as a fraction of the total aortic area along the aorta. Two-step cluster analysis was performed to group the dissections.
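
The two-step cluster analysis described here is an SPSS procedure; as a rough, hedged analog, the sketch below clusters per-patient true-lumen-fraction profiles with k-means and reports an average silhouette coefficient using scikit-learn. The input file is a hypothetical placeholder.

```python
# Rough analog of the morphology clustering: each row is one patient, each column
# the true lumen area as a fraction of total aortic area at one level along the
# aorta. The input file is hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X = np.load("true_lumen_fractions.npy")        # shape (n_patients, n_levels)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

# The mean silhouette coefficient plays the role of the "silhouette measure of
# cohesion and separation" reported in the Results.
print("silhouette:", round(silhouette_score(X, labels), 2))
print("cluster sizes:", np.bincount(labels))
```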

Results: Cluster analysis resulted in two distinct clusters (silhouette measure of cohesion and separation = 0.6). Cluster 1 (n=71, 65.7%) was characterized by a dissection extending from the ascending aorta into the abdominal aorta and iliac arteries (table 1). Cluster 2 dissections (n=37, 34.3%) extended from the ascending aorta to the aortic arch with limited extension into the distal arch and descending thoracic aorta. Cluster 2 extends the traditional DeBakey type II definition to include dissections propagating into the arch. Cluster 1 patients had more malperfusion (P=0.002) as well as lower extremity and abdominal pain on presentation (P<0.05). Cluster 2 patients had a greater number of diseased coronary vessels (P<0.05) and more commonly had previous percutaneous coronary intervention (P<0.05). No differences in age, gender, or other major comorbidities were noted. Cluster 1 was characterized by a smaller primary tear area (3.7 vs 6.6 cm2, P=0.009), a greater number of secondary tears (1.9 vs 0.5, P<0.001), more dissected non-coronary branches (2.9 vs 0.6, P<0.001), and a greater degree of aortic valve insufficiency (P<0.05). Operative variables, including cardiopulmonary bypass and cross-clamp time as well as extent of arch repair and type of proximal operation, were similar. There were no differences in post-operative complications or survival. Cluster 1 had a significantly higher rate of intervention for distal dissection complications (10% vs 0%, P=0.048).

Conclusions: This study examines clinical presentation and outcomes in acute type A dissection based on morphology using statistical categorization. Cluster 2 acute type A dissections had much less distal aortic dissection involvement and need for distal aortic intervention. Therefore, it is likely reasonable to extend the definition of DeBakey type II dissection to involvement of the distal arch and proximal descending thoracic aorta. The greater area of the primary aortic tear in Cluster 2 with rapid decompression of the false lumen may explain the lesser degree of distal aortic dissection.

17.16 Outcomes Following Cholecystectomy in Kidney Transplant Recipients

S. R. DiBrito1,2, I. Olorundare1,2, C. Holscher1,2, C. Haugen1,2, Y. Alimi2,3, D. Segev1,2  1Johns Hopkins University School Of Medicine,Transplant Surgery,Baltimore, MD, USA 2Johns Hopkins School Of Public Health,Baltimore, MD, USA 3Georgetown University Medical Center,Washington, DC, USA

Introduction:  Cholecystectomy is one of the most commonly performed operations, and kidney transplant (KT) recipients are among those who require surgical management for cholelithiasis and cholecystitis. Limited studies suggest that KT recipients have higher complication rates and longer length of stay (LOS) following general surgical procedures than non-transplant patients. This study investigates differences in complication rates, LOS, and cost between KT recipients and non-transplant patients undergoing cholecystectomy. Differences in outcomes following surgery at transplant centers vs non-transplant centers were also evaluated.

Methods:  The Nationwide Inpatient Sample was used to study 7318 adult KT recipients and 5.3 million non-transplant patients who underwent cholecystectomy between 2000-2011. Postoperative complications were defined by ICD-9 code. Complication rates, LOS, and cost were compared using hierarchical logistic regression, hierarchical negative binomial regression, and mixed effects log-linear models, respectively. Hospitals were categorized as transplant centers if at least one transplant was performed during the follow-up period.
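
As a simplified, hypothetical sketch of two of the model families named above, the code below fits a random-intercept (hospital-level) log-linear model for cost and a negative binomial model for LOS with statsmodels; the column names are placeholders, and the study's actual hierarchical specifications may differ.

```python
# Simplified sketches of the cost and LOS models; column names are hypothetical,
# and only the cost model carries a hospital-level random intercept here.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("nis_cholecystectomy.csv")     # hypothetical extract
df["log_cost"] = np.log(df["cost"])

# Mixed-effects log-linear model for cost with a random intercept per hospital.
cost_model = smf.mixedlm("log_cost ~ kt_recipient + age + female + elective",
                         data=df, groups=df["hospital_id"]).fit()
print(cost_model.summary())

# Negative binomial regression for length of stay (no random effect in this sketch).
los_model = smf.glm("los ~ kt_recipient + age + female + elective",
                    data=df, family=sm.families.NegativeBinomial()).fit()
print("LOS ratio for KT recipients:", np.exp(los_model.params["kt_recipient"]))
```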

Results: On primary admission, the mortality rate for KT recipients was significantly higher than for non-transplant patients (2.7 vs 1.2%, p<0.001). The rate of any postoperative complication was also higher (18.8 vs 13.9%, p<0.001). Following adjustment for patient- and hospital-level factors, KT recipients had increased odds of both mortality (OR 2.39, 95%CI 1.66-3.44) and complications (OR 1.30, 95%CI 1.12-1.51). Median length of stay was significantly longer in KT recipients than in non-transplant patients (5 vs 3 days, p<0.001). After adjustment, the LOS ratio in KT recipients remained higher (ratio 1.23, 95%CI 1.18-1.28). Median hospital costs were significantly higher for KT recipients ($12,077 vs $9,002, p<0.001), and the higher cost ratio persisted after adjustment (ratio 1.14, 95%CI 1.10-1.18). There was no significant difference in any of the outcomes when comparing cholecystectomy performed at transplant centers vs non-transplant centers (Table 1).

Conclusion: KT recipients have higher mortality, more frequent postoperative complications, longer length of stay, and higher hospital costs than non-transplant patients undergoing cholecystectomy. Undergoing surgery at a transplant center made no difference in outcomes for either KT recipients or non-transplant patients.

 

17.15 Impact of Deceased Donor Cardiac Arrest Time on Post-Liver Transplant Graft Function and Survival

J. R. Schroering1, C. A. Kubal1, B. Ekser1, J. A. Fridell1, R. S. Mangus1  1Indiana University School Of Medicine,Indianapolis, IN, USA

Introduction:
Transplantation of liver grafts from donors who have experienced cardiopulmonary arrest is common practice. The impact of so-called donor “downtime,” the cumulative time of donor asystole, has not been well described. This study reviews a large number of liver transplants at a single center to quantitate the frequency and length of deceased donor arrest. Post-transplant clinical outcomes include delayed graft function, early graft loss, peak post-transplant transaminase levels, recipient length of hospital stay, and short- and long-term graft survival.

Methods:
The records of all liver transplants performed at a single center over a 15-year period were reviewed. Donor arrest included any reported asystole, and the time period of asystole was calculated from the pre-hospital reports as recorded by the onsite organ procurement coordinator.  Peak donor transaminase levels were extracted from the original on-site records. Post-transplant graft function was assessed using measured laboratory values including alanine aminotransferase (ALT), total bilirubin (TB), and international normalized ratio (INR). Cox regression was employed to assess graft survival, with a direct entry method for covariates with p<0.10.

Results:
The records for 1830 deceased donor liver transplants were reviewed. There were 521 donors who experienced cardiopulmonary arrest (28%). The median arrest time was 21 minutes (mean 25, range 1 to 120, SD 18). The median peak pre-procurement ALT for donors with arrest was 127 U/L, compared to 39 U/L for donors with no arrest (p<0.001). Post-transplant, the peak ALT for liver grafts from donors with arrest was 436 U/L, compared to 513 U/L for grafts from donors with no arrest (p=0.09). Early allograft dysfunction occurred in 27% and 29% of recipients of grafts from arrest and no-arrest donors, respectively (p=0.70). Comparing recipients with donor arrest and no arrest, there was no difference in risk of early graft loss (3% vs 3%, p=0.84), recipient length of hospital stay (10 vs 10 days, p=0.76), or 30-day graft survival (94% vs 95%, p=0.60). Cox regression comparing four groups of patients (no arrest, <20 minutes of arrest, 20 to 40 minutes of arrest, and >40 minutes of arrest) demonstrated no statistical difference in survival at 10 years.

Conclusion:
These results support the routine use of deceased liver donors who experience cardiopulmonary arrest in the period immediately prior to donation. Prolonged periods of asystole were associated with higher peak elevations in ALT in the donor, but lower peak ALT elevations in the recipient. Peak TB levels were similar in the donors, but significantly lower with increasing arrest time in the recipient. There were no differences in early and late survival.  
 

17.14 Surgical Management of Adolescents and Young Adults with GIST: A Population-Based Analysis

K. E. Fero1, T. M. Coe5, J. D. Murphy4, J. K. Sicklick2  1University Of California – San Diego,School Of Medicine,San Diego, CA, USA 2University Of California – San Diego,Division Of Surgical Oncology And Department Of Surgery,San Diego, CA, USA 3University Of California – San Diego,Division Of Medical Oncology And Department Of Medicine,San Diego, CA, USA 4University Of California – San Diego,Radiation Medicine And Applied Sciences,San Diego, CA, USA 5Massachusetts General Hospital,Department Of Surgery,Boston, MA, USA

Introduction:  There is a dearth of population-based evidence regarding outcomes of the adolescent and young adult (AYA) population with gastrointestinal stromal tumors (GIST). The aim of this study is to describe a large cohort of AYA patients with GIST and investigate the impact of surgery on GIST-specific and overall survival.

Methods:  This is a retrospective cohort study of patients in the Surveillance, Epidemiology and End Results (SEER) database with histologically-diagnosed GIST from 2001-2013, with follow-up through 2015. SEER is a population-based cancer registry with 18 sites covering approximately 30% of the United States. We identified 392 AYA patients among 5,765 patients with GIST; the main exposure variable identified was tumor resection and the primary outcome measure was mortality. Baseline characteristics were compared between AYA (13-39 years old) and older adult (OA; ≥40 years old) patients and among AYA patients stratified by operative management. Kaplan-Meier estimates were used for overall survival (OS) analyses. Cumulative incidence functions were used for GIST-specific survival (GSS) analyses. The impact of surgery on survival was evaluated with a multivariable Fine-Gray regression model.
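
For illustration, the sketch below estimates the cumulative incidence of GIST-specific death in the presence of competing risks using the lifelines AalenJohansenFitter; it covers only the nonparametric step, not the Fine-Gray regression itself, and the file and column names are hypothetical.

```python
# Nonparametric cumulative incidence of death from GIST (event code 1), treating
# death from other causes (code 2) as a competing risk; names are hypothetical.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("seer_aya_gist.csv")   # hypothetical extract
# event: 0 = censored/alive, 1 = died of GIST, 2 = died of other cause

ajf = AalenJohansenFitter()
for managed, grp in df.groupby("operative_management"):
    ajf.fit(grp["months"], grp["event"], event_of_interest=1,
            label=f"operative={managed}")
    # Cumulative incidence of GIST-specific death at the end of follow-up.
    print(ajf.cumulative_density_.tail(1))
```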

Results: There was no significant difference between AYA and OA patients with regard to sex, race distribution, tumor size, or stage. Compared to OA, more AYA patients had small intestine GISTs (35.5% vs 27.3%, P <0.01) and were managed operatively (84.7% vs 78.4%, P < 0.01). Multivariable analysis of AYA patients demonstrated that non-operative management was associated with over a 2-fold increased risk of death from GIST (SDHR 2.271; 95% CI: 1.214-2.249). On subset analysis of AYA patients with tumors of the stomach and small intestine (n=349), small intestine location was associated with improved survival (OS: 91.1% vs 77.2%, P=0.01; GSS: 91.8% vs 78.0%, P<0.01). On subset analysis of AYA patients with metastatic disease (n=91), operative management was associated with improved survival (OS: 69.5% vs 53.7%, P=0.04; GSS: 71.5% vs 56.7%, P=0.03).

Conclusion: We report the first population-based analysis of GIST outcomes in the AYA population. These patients are more likely to undergo surgical management than patients in the OA cohort. Operative management is associated with improved overall and GIST-specific survival in AYA patients, including those with metastatic disease.
 

17.13 Distal Perfusion Catheters Reduce Vascular Complications Associated with Veno-Arterial ECMO Therapy.

Y. Sanaiha1, G. R. Ramos1, Y. Juo1, J. C. Jimenez1, P. B. Benharash1  1David Geffen School Of Medicine,Division Of General Surgery,Los Angeles, CA, USA

Introduction: Despite advances in cannulation technique, venoarterial (VA) extracorporeal membrane oxygenation (ECMO) is still associated with a nearly 15% incidence of acute limb ischemia (ALI) requiring interventions such as amputation. Distal perfusion catheters (DPCs) have been used to provide antegrade flow to the distal extremity and can be placed preemptively or in response to signs of limb ischemia. However, the efficacy of DPCs in reducing the incidence of vascular complications is not well established. The present study aims to evaluate the impact of DPCs on ALI incidence and mortality.

Methods: The institutional ECMO database was used to identify all adult patients who underwent VA-ECMO between January 2013 and June 2016. Demographic, technical, and clinical outcomes data were collected. Acute limb ischemia was defined as skin mottling on exam, loss of pulses, or tissue loss. Interventions included thrombectomy, fasciotomy, or amputation.

Results: During the study period, 103 adult ECMO patients met inclusion criteria and were included for analysis. Indications for ECMO were cardiogenic shock in 46.6%, cardiac arrest in 24.3%, post-cardiotomy syndrome in 16.5%, transplant rejection in 4.5%, and severe respiratory failure in 7.8%. Fifty-one patients received DPCs as a preemptive measure, and 1 patient received a DPC as a therapeutic measure. Overall, 28 (34.1%) patients experienced ALI, with 15 (18.2%) patients requiring surgical intervention. Patients who received DPCs had similar ALI rates compared to those without a DPC (28.8% vs 29.4%, p=NS). Overall mortality with VA-ECMO cannulation was 55% over the study period, while 53% of patients with ALI did not survive to 30 days post-decannulation.

Conclusion: This study is one of the largest retrospective cohort studies examining the efficacy of DPCs in reducing the incidence of ALI. Patients receiving DPCs were found to have a similar incidence of ALI. Development of ALI was not significantly associated with increased mortality. Methods to continuously assess distal blood flow are needed even in the presence of DPCs. Establishment of selection criteria for the use of DPCs may improve outcomes.

 

 

17.12 Long-Term Follow Up of Activity Levels and Lifestyle Choices After Esophagectomy

Z. El-Zein1, S. Barnett1, J. Lin1, W. Lynch1, R. Reddy1, A. Chang1, M. Orringer1, P. Carrott1  1University Of Michigan,Section Of Thoracic Surgery,Ann Arbor, MI, USA

Introduction: We aimed to determine if preoperative counseling for esophagectomy patients led to durable smoking cessation, routine exercise for years after surgery, and lower postoperative complication rates. 

Methods:  Of 789 patients identified as long-term survivors from a prospectively-collected esophagectomy database, 393 (49.8%) were contacted and agreed to receive a survey of long-term lifestyle changes. Of 294/393 (74.8%) patients who returned the completed survey, 239 (81.3%) underwent esophagectomy for cancer (median follow-up 5.5 years) and 55 (18.7%) for benign disease (median follow-up 9.2 years). Each group was analyzed using descriptive statistics and Chi-square where appropriate.
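
As an illustration of the chi-square comparisons described above, the sketch below tests pneumonia rates between the exercise-counseled and non-counseled cancer patients with SciPy; the cell counts are hypothetical reconstructions from the reported percentages, not the study data.

```python
# Chi-square test of pneumonia by exercise-counseling status; the counts below
# are hypothetical reconstructions from the reported rates (3.2% of 177 vs
# 11.5% of 62), not the actual study data.
from scipy.stats import chi2_contingency

#                pneumonia, no pneumonia
counseled      = [6, 171]
non_counseled  = [7, 55]

chi2, p_value, dof, expected = chi2_contingency([counseled, non_counseled])
print(f"chi2={chi2:.2f}, p={p_value:.3f}")
```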

Results: In the cancer population, 35/239 (14.6%) were smokers at preoperative counseling and 14/35 (40%) smoked following surgery (P<0.01). With regard to exercise, 177/239 (74.1%) reported counseling preoperatively and 62/239 (25.9%) reported no counseling. The median exercise frequency for the counseled group was 5 vs. 2 days/week for the “non-counseled” group preoperatively (p<0.0001) and 4 vs. 3 days/week postoperatively (P=0.02). Currently, 117/177 (66.1%) in the counseled group exercise regularly (2+ days/week for 30+ minutes) compared to 31/62 (50%) from the “non-counseled” group (p=0.02). In the cancer population, preoperative smokers had higher pneumonia rates than non-smokers (11.4% vs. 3.9%, P=0.06). The documented exercise-counseled group had a significantly lower pneumonia rate than the non-counseled group (3.2% vs. 11.5%, P=0.01). In the benign population, there was no significant reduction in smoking rates and no significant difference in exercise frequency or complication rates between the counseled and non-counseled groups. 

Conclusion: Preoperative interventions before a major operation such as esophagectomy for cancer led to an increase in activity level, permanent changes in lifestyle habits, and lower postoperative complication rates.

 

17.11 Survival outcomes in octogenarians with early stage, non-small cell lung cancer in the recent era

W. M. Whited1, E. Schumer1, J. Trivedi1, M. Bousamra1, V. Van Berkel1  1University Of Louisville,Department Of Cardiovascular And Thoracic Surgery,Louisville, KY, USA

Introduction:
This study aims to examine survival differences in elderly patients with early stage (I and II), non-small cell lung carcinoma (NSCLC) undergoing pulmonary resection.

Methods:
The national Surveillance, Epidemiology, and End Results database was queried for lung cancer patients between 1998 and 2010.  Age at diagnosis, for all patients and for those undergoing surgical resection, was compared by year between patients <80 and >80 years.  Using Kaplan-Meier analysis, survival was compared between age subgroups (<70, 70-79, and >80 years) stratified by cancer stage for patients with early stage NSCLC who underwent surgical resection.
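
For illustration, the sketch below computes stage-stratified Kaplan-Meier survival by age subgroup and a log-rank test with the lifelines package; the file and column names are hypothetical placeholders for a SEER extract.

```python
# Kaplan-Meier survival by age subgroup within one stage stratum; the data file
# and column names are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("seer_nsclc_resected.csv")
stage1 = df[df["stage"] == "I"]

kmf = KaplanMeierFitter()
for age_group, grp in stage1.groupby("age_group"):     # "<70", "70-79", ">80"
    kmf.fit(grp["months"], grp["death"], label=age_group)
    five_year = float(kmf.survival_function_at_times(60).iloc[0])
    print(age_group, "5-year survival:", round(five_year, 2))

result = multivariate_logrank_test(stage1["months"], stage1["age_group"],
                                   stage1["death"])
print("log-rank p-value:", result.p_value)
```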

Results:

41,680 patients age 18 years or greater were identified with early stage NSCLC.  Of these, 29,580 patients underwent pulmonary resection.  The proportion of patients older than 80 among all patients who underwent resection for early stage NSCLC has trended upward since 1998, from 180 patients (9.1%) to 278 patients (11.4%).  Survival comparison stratified by stage for patients <70, 70-79, and >80 years is shown in Figure 1.  Five-year survival for patients with stage I NSCLC, ages <70, 70-79, and >80 years, respectively, was 62%, 45%, and 28% (p<0.001).  Five-year survival for patients with stage II NSCLC, ages <70, 70-79, and >80 years, respectively, was 43%, 23%, and 17% (p<0.001). There were similar trends in survival when stratified by sex and histology.

Conclusion:

The number of elderly patients diagnosed with NSCLC is increasing, particularly those >80 years; therefore, there is an increasing number of older patients undergoing surgical resection.  While surgical resection in octogenarians for stage I, NSCLC is feasible, elderly patients have poor overall survival and should be fully informed of alternatives to surgical intervention. 

 

17.10 Living Kidney Donor-Recipient Relationship and Development of Post-donation Comorbidities

J. McLeod1, C. Carroll1, W. Summers1, C. Baxter1, R. Deierhoi1, P. MacLennan1, J. E. Locke1  1University Of Alabama,Birmingham, Alabama, USA

Introduction: Living donor selection practices aim to quantify lifetime risk of comorbid disease development (e.g. hypertension, diabetes, kidney disease) based on candidates’ pre-donation demographic and health characteristics at the time of evaluation. Studies aimed at predicting this risk have been limited by lack of information on donor-recipient relationship or family history. The goal of this study was to better understand the association between donor-recipient relationship and risk of post-donation comorbid disease development.

Methods: Participants enrolled in an IRB approved study agreed to survey examination and consented for medical record abstraction, allowing for capture of baseline health characteristics and post-donation outcomes. We used descriptive statistics and logistic regression to examine the odds of comorbid disease by donor-recipient relationship (adjusted for age, ethnicity, gender). 
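
The adjusted odds reported in the Results could be obtained with a logistic regression along the lines of the sketch below (statsmodels, hypothetical file and column names); with only 59 donors and few events, a penalized or exact method may be preferable in practice.

```python
# Adjusted logistic regression for post-donation comorbid disease; the data file
# and column names are hypothetical. With few events, consider exact or
# penalized alternatives.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

donors = pd.read_csv("living_donors.csv")   # hypothetical survey + chart data

model = smf.logit(
    "comorbidity ~ related + age_at_donation + female + african_american",
    data=donors,
).fit()

odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print("OR for related donors:", round(odds_ratios["related"], 2))
print("95% CI:", conf_int.loc["related"].round(2).tolist())
```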

Results: Fifty-nine adult living kidney donors were studied: median age at donation 43.3 years (IQR: 38.9-56.7), median age at survey 64.5 years (IQR: 51.2-69.7), 54 European American and 5 African American, with median follow-up of 6.6 years (IQR: 4.3-29.2). More than half of the cohort was related to their recipient (Related: 67% vs. Unrelated: 33%). Twenty living kidney donors developed comorbid disease over the course of the study. Hypertension was the most common post-donation comorbidity. Interestingly, 19 of 20 post-donation comorbidities developed in related donors. Related donors had a 17-fold higher odds of developing a post-donation comorbidity compared to their unrelated donor counterparts (adjusted odds ratio: 17.00, 95%CI: 1.84-157.08, p=0.01).

Conclusion: Donor-recipient relationship was strongly correlated with development of post-donation comorbidities. This finding suggests the potential for some underlying genetic susceptibility carried by family members and warrants further study.

 

17.09 Overall Survival Following Salvage Liver Transplant for Hepatocellular Carcinoma

J. P. Silva1, N. G. Berger1, T. C. Gamblin1  1Medical College Of Wisconsin,Surgical Oncology,Milwaukee, WI, USA

Introduction:  Shortage of available organs has led to the method of salvage liver transplantation (SLT). Primary hepatic resection is carried out, and then transplantation may occur in the setting of hepatocellular carcinoma (HCC) recurrence or liver function deterioration. The survival outcomes of SLT compared with primary liver transplantation (PLT) have not been described in a nationally representative population. The present study sought to evaluate the differences in overall survival (OS) between SLT and PLT in hepatocellular carcinoma patients, using prior upper abdominal surgery (UAS) as a proxy for prior hepatic resection or ablation.

Methods:  HCC patients undergoing liver transplantation were identified using the Organ Procurement and Transplantation Network (OPTN) database (1987-2015). The patients were separated by presence of prior UAS into two cohorts, PLT and UAS. To focus on patients with prior surgery related to HCC, patients with prior UAS but without a corresponding record of prior HCC treatment were excluded. OS was analyzed by the log-rank test and graphed using the Kaplan-Meier method. Recipient and donor demographic and clinical characteristics were also studied using Cox univariate and multivariate analysis.

Results: A total of 15,070 patients were identified, with a median age of 58 years. 78.5% of all patients were male. 6,220 patients (41.3%) composed the UAS group, and 8,850 (58.7%) composed the PLT group. Compared to the UAS cohort, PLT patients were more likely to be older (p<0.001), female (p<0.001), and diabetic (p<0.001). OS was improved in the PLT group compared to the UAS group, with median OS of 131.4 and 122.4 months, respectively (p<0.001). UAS was associated with an increased hazard ratio (HR) on univariate analysis (HR 1.14, 95% CI 1.07-1.22; p<0.001). However, upon multivariate analysis, UAS was not associated with a significantly increased HR (HR 1.15, 95% CI 0.92-1.42; p=0.21). Independent factors associated with increased HR included increased recipient age (HR 1.02, 95% CI 1.01-1.04; p=0.005), African American race (HR 1.43, 95% CI 1.03-2.00; p=0.034), and number of tumors (HR 1.09, 95% CI 1.04-1.15; p<0.001).

Conclusion: Primary liver transplantation for hepatocellular carcinoma is associated with improved overall survival compared with transplantation in patients with prior UAS. On multivariate analysis, however, the increase in hazard was not significant, likely due to differences in patient demographics and disease characteristics. With thoughtful patient selection, SLT is a feasible alternative to PLT and a deserving focus for further investigation.

17.08 Association between donor hemoglobin A1c and recipient liver transplant outcomes: a national analysis

B. A. Yerokun1, M. S. Mulvihill1, R. P. Davis1, M. G. Hartwig1, A. S. Barbas1  1Duke University Medical Center,Department Of Surgery,Durham, NC, USA

Introduction:
While the association between donor diabetes mellitus and liver transplant recipient outcomes is well described, limited data exist on the association between donor glycated hemoglobin (Hgb A1c) and recipient outcomes after liver transplantation. The objective of the analysis was to evaluate liver transplant recipient outcomes associated with donor Hgb A1c levels in non-diabetic donors in the United States. We tested the hypothesis that the use of allografts from non-diabetic donors with an elevated donor Hgb A1c would be associated with decreased allograft and overall survival.

Methods:
The Scientific Registry of Transplant Recipients was used to identify adult patients who underwent liver transplantation (2010-2015). Liver transplant recipients were stratified into two groups based on the Hgb A1c level of the donor: Hgb A1c <6.5 (euglycemic) vs ≥6.5 (hyperglycemic). Recipients of donors with a missing Hgb A1c were excluded. Propensity score matching (10:1) was used to adjust for donor age, donor gender, donor ethnicity, donor serum creatinine, extended criteria allografts, recipient age, recipient gender, recipient ethnicity, and recipient MELD score. Kaplan-Meier analysis was used to assess overall survival.
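
For illustration, the sketch below estimates a propensity score by logistic regression and performs a simplified 10:1 nearest-neighbor match with scikit-learn; the file and column names are hypothetical, and matching here is with replacement, which may differ from the study's algorithm.

```python
# Simplified 10:1 propensity score match of hyperglycemic-allograft recipients to
# euglycemic-allograft recipients; file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("srtr_liver.csv")          # hypothetical extract
covars = ["donor_age", "donor_female", "donor_creatinine", "ecd",
          "recipient_age", "recipient_female", "recipient_meld"]

# Propensity score: probability of receiving a hyperglycemic allograft.
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["hyperglycemic"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

treated = df[df["hyperglycemic"] == 1]
control = df[df["hyperglycemic"] == 0]

# Ten nearest euglycemic controls (by propensity score) per treated recipient.
nn = NearestNeighbors(n_neighbors=10).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = control.iloc[idx.ravel()]
matched = pd.concat([treated, matched_controls])    # matched analytic cohort
```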

Results:
A total of 10,491 liver transplant recipients were included: 10,156 (96.8%) received euglycemic allografts and 335 (3.2%) received hyperglycemic allografts. Hyperglycemic donors were older (49 vs 36 years), had a higher BMI (29.8 vs 26.2), had a higher serum creatinine (1.3 vs 1.0 mg/dL), and were more likely to be an extended criteria donor (35.8 vs 14.4%). Recipients of hyperglycemic allografts were more often female (74 vs 66%) and had a lower MELD score at time of transplantation (20 vs 23). Recipients of hyperglycemic and euglycemic allografts were similar in age, BMI, diabetes, and time on waitlist. After adjustment, overall survival was not statistically different between the two groups (p=0.065), but allograft survival was significantly increased in the recipients of euglycemic allografts (p=0.035).

Conclusion:
In this nationally representative study of liver transplant recipients, patients who received hyperglycemic allografts had decreased allograft survival compared to those patients who received euglycemic allografts. This analysis demonstrates potential utility in the measurement of Hgb A1c for assessment of liver allografts.
 

17.07 Infections During the First Year Post-transplant in Adult Intestine Transplant Recipients

J. W. Clouse1, C. A. Kubal1, B. Ekser1, J. A. Fridell1, R. S. Mangus1  1Indiana University School Of Medicine,Department Of Surgery, Transplant Division,Indianapolis, IN, USA

Introduction:
The risk for post-intestine transplant infection is greatest in the first year, primarily related to the need for higher doses of immunosuppression compared with other transplanted organs. This study reports the infection rate, location of infection, and pathogens causing bacterial, fungal, or viral infections in intestine transplant recipients at an active transplant center. Additional risk for infection was assessed based on simultaneous inclusion of the liver or colon as a part of the intestine transplant.

Methods:
Records from a single transplant center were reviewed for adult (≥18 years old) patients receiving an intestine transplant. Positive cultures and pathology reports were used to diagnose bacterial, fungal, and viral infections and also to determine location and infectious agent.

Results:
During the study period, 184 intestine transplants were performed on 168 adult patients. One-year bacterial, fungal, and viral infection rates were 91%, 47%, and 29%, respectively. The most commonly infected sites were the urinary tract and bloodstream. Coagulase-negative Staphylococcus was the most commonly isolated pathogen, found in 48% of patients. Antibiotic-resistant strains were frequently observed, with VRE infecting 45% of patients and MRSA infecting 15%. The majority of VRE cases occurred after 2008. Klebsiella and E. coli had the highest incidence of infection among gram-negative bacteria, at 42% and 35%, respectively. Candida species were the most common fungal pathogens, seen in 42% of all patients and in 89% of patients who developed a fungal infection. Candida glabrata was the most prominent Candida species and was present in over 75% of Candida-infected patients. Cytomegalovirus infections were present in 15% of transplant recipients, with 41% (6% overall) of those patients developing tissue-invasive disease. Bacterial and fungal bloodstream infections (BSI) developed in 72% of patients, with a median time to first infection of 30 days. Age, gender, race, liver inclusion, and colon inclusion did not have a significant impact on the development of a BSI or median time to first BSI.

Conclusion:
Overall, 91% of intestine transplant patients had a bacterial infection in the first year post-transplant. Opportunistic fungal and viral infections were also very common. Inclusion of the colon with the small intestine significantly (p≤0.01) increased the risk for fungal bloodstream infections and for infections caused by C. difficile, Bacteroides species, Epstein-Barr virus, and upper respiratory viruses (p<0.01 for each). In contrast, simultaneous transplant of the liver significantly (p=0.01) reduced the risk of developing a bacterial urinary tract infection.
 

17.06 “Patterns of Liver Discard in Texas – Why Are Older Donors Not Used?”

S. Gotewal1, C. Hwang1, J. Reese1, J. Parekh1, M. MacConmara1  1University Of Texas Southwestern Medical Center,Transplant,Dallas, TEXAS, USA

Introduction:  Utilization patterns of marginal donor livers vary throughout the different donation service areas (DSA) and regions in the US.  In addition, aggressive utilization is not uniform across all types of marginal donors.  The aim of this study was to examine the procurement, utilization and discard of marginal livers by the organ procurement organization (OPO) in the North Texas DSA and to compare this with the national trend in order to identify areas for potential increase in organ utilization. 

Methods:  This retrospective study identified all donors between January 1, 2013 and September 30, 2015 in the North Texas DSA in whom the intent at the time of donor procurement was to use the liver for transplant. Donor age, type, and BMI were collected together with serologies, biochemistry, and macrovesicular fat content.  In addition, use or discard of the liver was confirmed through OPO records.  In the event of discard, further data regarding the underlying reason were obtained.  Marginal organ utilization rates were then compared with national data available through the SRTR database.

Results: A total of 703 donor liver procurements took place during the study period; of these, 628 livers were transplanted and 75 (10.7%) were discarded.  Older age (>65 years), donation after cardiac death (DCD) status, and macrovesicular fat content (>30%) were all associated with statistically higher rates of discard.  Interestingly, older age had a higher discard rate when compared with national trends (23% vs. 17%).  Within this subgroup, macrovesicular fat content (6% vs. 16.5% nationally), average BMI (27.7 vs. 28.4), and DCD status (0/36 vs. 28/175 nationally) would all suggest that these organs would be of better quality. OPO records noted severe calcification within visceral arteries as the reason for organ decline and discard in 7 of the cases.  Importantly, when a second transplant surgeon was present at the procurement, none of the organs were discarded. Of the 44 livers from donors older than 65 years, a second surgeon was present for 8 procurements, with 0 discards. There were 10 discards over age 65 when a second surgeon was not present (p=0.045).

Conclusion: Marginal livers from donors of advanced age are less frequently used in North Texas.  These data suggest that many were discarded for subjective causes.  Importantly, the presence of another surgeon may act to increase utilization.

 

17.05 Minimalist Approach to Sedation in TAVR May Decrease Incidence of Post-Procedural Dysphagia

L. Mukdad1, W. Toppen1,3, K. Kim1, S. Barajas1, R. Shemin1, A. Mendelsohn2, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles,Cardiac Surgery,Los Angeles, CA, USA 2David Geffen School Of Medicine, University Of California At Los Angeles,Head And Neck Surgery,Los Angeles, CA, USA 3David Geffen School Of Medicine, University Of California At Los Angeles,Internal Medicine,Los Angeles, CA, USA

Introduction:  Transcatheter aortic valve replacement (TAVR) has become the preferred therapy for severe aortic stenosis in patients with high surgical risk. While general anesthesia has been the standard during these procedures, recent evidence suggests that a minimalistic approach utilizing conscious sedation may have similar if not improved clinical outcomes. The incidence and impact of post-TAVR dysphagia has yet to be fully elucidated. The purpose of this study was to compare the incidence of postoperative dysphagia and aspiration pneumonia in patients undergoing TAVR procedures with either conscious sedation (CS) or general anesthesia (GA).

Methods:  This was a retrospective single-center study involving all adult patients undergoing TAVR between September 2013 and May 2016. The diagnosis of postoperative dysphagia was confirmed by a fiberoptic endoscopic evaluation of swallowing test. Propensity score matching was used to control for intergroup differences and account for potential selection biases. The Society of Thoracic Surgeons predicted risk of mortality score, a previously validated composite risk stratification tool, was used as our propensity score. Categorical variables were analyzed by Fisher’s exact test, and continuous variables were analyzed by the independent-samples t-test for unequal sample sizes. An alpha of < 0.05 was considered statistically significant. All data were analyzed using STATA 13.0 statistical software (StataCorp, College Station, TX).
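
The categorical and continuous comparisons described above can be sketched with SciPy as below; the 2x2 counts and the time vectors are hypothetical placeholders, and Welch's t-test is shown as one reasonable reading of the unequal-groups comparison.

```python
# Fisher's exact test and Welch's t-test as rough stand-ins for the comparisons
# described above; all numbers here are hypothetical placeholders.
from scipy import stats

# Rows = CS/GA, columns = dysphagia yes/no.
table = [[0, 58],
         [10, 119]]
odds_ratio, p_dysphagia = stats.fisher_exact(table)

# Procedure room time, CS vs GA, compared with Welch's (unequal-variance) t-test.
cs_minutes = [90, 85, 110, 70, 95]       # hypothetical
ga_minutes = [150, 120, 200, 140, 160]   # hypothetical
t_stat, p_time = stats.ttest_ind(cs_minutes, ga_minutes, equal_var=False)

print("dysphagia p:", round(p_dysphagia, 3), " room-time p:", round(p_time, 3))
```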

Results: A total of 200 patients were included in this study (CS=58 and GA=142). After propensity score matching and exclusion of unmatched patients, 187 patients remained (CS=58 and GA=129). There were no differences in baseline comorbidities between the matched groups (Table 1). Postoperative dysphagia was significantly higher in the GA group (0% CS vs 8% GA, p=0.03) with a trend towards more aspiration pneumonia in the GA group (0% CS vs 3% GA, p=0.31). Additionally, procedure room time (95 ± 44 min CS vs. 148 ± 109 min GA, p < 0.001), total ICU time (27 ± 32 h CS vs. 87 ± 98 h GA, p < 0.001), and total length of hospital stay (3 days CS vs. 7 days GA, p < 0.001) were significantly less in the conscious sedation TAVR group.

Conclusion: We found that the use of conscious sedation in TAVR was associated with a significant decrease in the incidence of postoperative dysphagia and a trend towards fewer instances of aspiration pneumonia. Additionally, CS was associated with lower operative times and shorter ICU and hospital length of stay at our institution. Given the increase in resource utilization with dysphagia, our results suggest significant improvements with the use of CS for TAVR procedures.

17.04 Use of Enhanced Recovery Protocol in Kidney Transplant to Improve Postoperative Length of Stay

C. Schaidle-Blackburn1, A. Kothari1, P. Kuo1, D. DiSabato1, J. Almario-Alvarez1, R. Batra1, R. Garcia-Roca1, A. Desai1, A. Lu1  1Loyola University Chicago Stritch School Of Medicine,Intra-abdominal Transplant,Maywood, IL, USA

Introduction: Implementation of enhanced recovery (ER) protocols has been demonstrated to improve perioperative outcomes in multiple surgical specialties. Limited data exist on using ER protocols in patients undergoing kidney transplantation. The objectives of this study were to develop a multidisciplinary ER protocol for patients undergoing single organ kidney transplants and measure the impact of a pilot implementation on postoperative length of stay (LOS).

Methods: A retrospective review of pre- and post-intervention postoperative outcomes was conducted at a single institution between January 2015 and July 2016. All patients over the age of 18 receiving a single organ kidney transplant were included. Dual organ recipients and pediatric patients (≤18 years) were excluded. The project was conducted in two stages. Stage 1 was a multidisciplinary gap analysis using a modified Delphi approach to define a consensus ER protocol. Stage 2 was a 9-month pilot implementation. A prospective observational analysis was performed on the cohort, with postoperative LOS identified as the primary outcome. Means are reported ± standard deviation; tests of significance were conducted with Student’s t-test (alpha=0.05).

Results: A total of 66 patients were included for study, 18 prior to intervention and 48 following intervention. Mean age of the study population was 49.8±12.9 years (49.5±16.6 years pre-ER and 50.0±11.4 years post-ER, P=0.899). Gap analysis led to the development of an ER protocol through multidisciplinary engagement based on 3 major domains: (1) standardization of postoperative pathways, (2) early mobilization, and (3) patient engagement. Standardization of postoperative pathways was achieved through updating perioperative electronic order sets and operating room protocols; early mobilization was achieved through elimination of central lines, early Foley catheter removal, and physical/occupational therapy consults; patient engagement was achieved through creation of patient education materials and patient-driven diet advancement. Following implementation, postoperative length of stay decreased from 7.5±4.0 days to 4.9±2.0 days (P<0.001).

Conclusion: Implementation of a locally-developed, consensus-based ER protocol can reduce LOS in patients undergoing kidney transplant. This study also demonstrates the feasibility of implementing ER in this complex patient population. Based on pilot data and institutional transplant volume, this ER protocol could generate cost savings that exceed $950,000 annually.

 

17.03 Medial Row Perforators: Higher Rates of Fat Necrosis in Bilateral DIEP Breast Reconstruction

B. Tran1, P. Kamali1, M. Lee1, B. E. Becherer1, W. Wu1, D. Curiel1, A. Tobias1, S. J. Lin1, B. T. Lee1  1Beth Israel Deaconess Medical Center,Surgery/Plastic And Reconstructive Surgery,Boston, MA, USA

Introduction:
The purpose of this study is to evaluate perfusion related complications in bilateral deep inferior epigastric perforator (DIEP) flap reconstruction of the breast based on perforator selection over a decade.

Methods:
A retrospective review of a prospectively maintained DIEP database was performed on all patients undergoing bilateral DIEP reconstruction at a single institution between 2004-2014. The flaps were divided into three cohorts based on perforator location: lateral row only perforator group, medial row only perforator group, and medial plus lateral row perforator group. Postoperative flap related complications were compared and analyzed.

Results:
Between 2004 and 2014, 818 bilateral DIEP flap reconstructions were performed. Seven hundred twenty-eight flaps met the study criteria. Within the study group, 263 (36.1%) flaps had perforators based only on the lateral row, 225 (30.9%) had perforators based only on the medial row, and 240 (33.0%) had perforators based on both the medial and lateral rows. The groups were well matched in terms of perforator number and flap weight. Fat necrosis occurrence was significantly higher in flaps based solely on the medial row versus flaps based only on the lateral row perforators (24.5% versus 8.2%, p< 0.001). There was no statistically significant difference in fat necrosis between flaps based only on the lateral row versus flaps based on both the medial and lateral row (8.2% versus 11.6%, p=0.203). Generally, within the same row, increasing the number of perforators decreased the incidence of fat necrosis.

Conclusion:
Perforator selection is critical to minimizing perfusion-related flap complications. In bilateral DIEP flaps, lateral row based perforators result in significantly less fat necrosis than medial row based perforators. Our data suggest that the addition of a lateral row perforator to a dominant medial row perforator will decrease the risk of fat necrosis.
 

17.02 Improving Outcomes in use of Extended Criteria Donors of Liver Allografts: A National Analysis

R. Davis1, M. S. Mulvihill1, B. A. Yerokun1, M. G. Hartwig1, A. S. Barbas1  1Duke University Medical Center,Department Of Surgery,Durham, NC, USA

Introduction:  Orthotopic liver transplantation (OLT) remains the gold standard for patients with end-stage liver disease.  However, the scarcity of ideal donor allografts has led to the use of extended criteria donors (ECD). While there is no precise definition of an ECD, donor age > 60 years is thought to negatively impact survival following OLT. Despite this, use of allografts from donors > 60 years old has steadily increased, with reports of improved outcomes. Here, we hypothesized that the risk to recipients of allografts from donors > 60 years old has decreased since the inception of ECD liver donation.

Methods:  The OPTN/UNOS STAR (Standard Transplant Analysis and Research) file was queried for all first-time isolated adult liver transplants. From 247,723 records in the registry, this yielded 121,399 recipients suitable for analysis, representing transplants from September 1987 to June 2015. Cohorts were then developed based on donor age greater than 60 years and 5-year increments of transplant date. Kaplan-Meier analysis with the log-rank test compared survival between recipients of donor allografts > 60 years across each five-year interval.

Results: Utilization of allografts from donors > 60 years old for OLT has steadily increased. Before 1990, no donors over age 60 were accepted for transplant. From 1990 to 1995, 210 recipients received allografts from donors over age 60; this number peaked at 3,945 recipients between 2005 and 2010, the last complete era available for analysis. Across all eras, recipients of allografts from donors > 60 years old tended to be older (53 vs 56 years) and to be on the waitlist longer (82 vs 100 days). Recipient INR, albumin, medical condition at transplant, and creatinine were similar between groups. Donors > 60 years old had similar serum creatinine and percent steatosis. Unadjusted 5-year survival varied significantly by era of transplant, increasing from 54.7% (1990-1995) to 69% (2010-2015).

Conclusion: This study demonstrates improved long-term survival of recipients receiving donor allografts > 60 years old in the current era compared with recipients in previous eras. These data support the continued use of ECD liver allografts > 60 years old. Further refinement of donor management, surgical techniques, and the potential use of ex vivo liver perfusion should serve to continue the trend toward improved recipient outcomes.

 

17.01 Duct-to-Duct Bile Duct Reconstruction in Liver Transplant for Primary Sclerosing Cholangitis (PSC)

S. Chanthongthip1, C. A. Kubal1, B. Ekser1, J. A. Fridell1, R. S. Mangus1  1Indiana University School Of Medicine,Transplant,Indianapolis, IN, USA

Introduction:
Reconstruction of the bile duct after liver transplantation in patients with primary sclerosing cholangitis (PSC) is controversial. Historically, these patients have all undergone Roux-en-Y choledochojejunostomy (RY). More recently, several centers have published results suggesting equivalent outcomes using duct-to-duct reconstruction (DD). Recently, a meta-analysis of 10 single center studies demonstrated similar post-transplant incidence of bile duct complications, and similar 1-year graft survival. This study presents a similar single center analysis comparing RY and DD reconstruction of the bile duct, with 10-year Cox regression analysis of graft survival.

Methods:
The records of all liver transplants at a single center between 2001 and 2015 were reviewed. Patients with primary sclerosing cholangitis were identified. The primary operative reports for these patients were reviewed, and reconstruction of the bile duct was coded as RY or DD. The decision to perform DD was based upon the appearance of the remaining bile duct after hepatectomy, with a healthy-appearing native duct being reconstructed with DD and a diseased duct with RY.  All bile duct imaging and interventions were recorded to assess for strictures and leaks. Survival data were extracted from the transplant database, which is compared on a regular basis with national death databases to ensure accuracy. Cox regression analysis was performed with a direct entry method.

Results:
There were 1722 liver transplants during the study period, with 164 patients having a diagnosis of PSC (10%). Bile duct reconstruction included 93 RY and 71 DD. The DD patients had a higher MELD score and older age, while RY patients were more likely to be undergoing retransplantation and to have a longer warm ischemia time. Among the PSC patients, 8% had a bile duct leak (13% RY and 3% DD, p=0.03).  Strictures were seen in 29% of patients (5% RY and 39% DD, p<0.001), though this is a result of nearly all patients undergoing ERCP receiving a diagnosis of stricture and having a stent placed. Only one patient in either group required operative intervention for stricture, and another patient required operative intervention for leak. Length of hospital stay was equivalent in both groups (10 days, p=0.51). Cox regression 10-year graft survival showed no difference between RY and DD (p=0.47).

Conclusion:
DD reconstruction of the bile duct after liver transplantation for PSC results in similar post-operative outcomes, and equivalent long-term survival, compared to RY. These results were achieved using surgeon judgement of bile duct quality to determine the method of reconstruction.
 

16.19 Analytic Morphomics And Geriatric Assessment Predict Pancreatic Fistula After Pancreaticoduodenectomy

A. J. Benjamin1, A. Schneider1, M. M. Buschmann2, B. A. Derstine4, S. C. Wang3,4, W. Dale2, K. Roggin1  1University Of Chicago,Surgery,Chicago, IL, USA 2University Of Chicago,Geriatrics & Palliative Medicine,Chicago, IL, USA 3University Of Michigan,Surgery,Ann Arbor, MI, USA 4University Of Michigan,Morphomic Analysis Group,Ann Arbor, MI, USA

Introduction:

Following pancreaticoduodenectomy (PD), pancreatic fistula (PF) remains a significant cause of morbidity, and current models used to predict PF rely on measures that are only available at the time of operation.  Body imaging analysis, such as analytic morphomics (AM), and pre-operative geriatric assessment (GA) have been shown to forecast significant adverse outcomes following PD.  We hypothesized that preoperative AM and GA can accurately predict PF.

Methods:

An IRB-approved review identified patients (n=63) undergoing PD by experienced pancreatic surgeons who had a non-contrast computed tomography scan (CT), pre-operative geriatric assessments, and prospectively tracked postoperative 90-day outcomes collected between 10/2007 and 3/2016.  PF were graded according to the International Study Group for PF (ISGPF) criteria.  Pre-operative GA included the Short Physical Performance Battery, self-reported exhaustion on the Center for Epidemiologic Studies Depression Scale (CES-D exhaustion; one of the five criteria of Fried’s frailty), and the Vulnerable Elders Survey (VES-13).  CT scans were processed to measure morphomic variables which included measures of psoas muscle area and Hounsfield units (HU), subcutaneous fat measures, visceral fat measures, and total body dimensions.  Correlations with the development of a PF were obtained using univariate analysis and multivariate elastic net regression models.  
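
For illustration, the sketch below fits an elastic net logistic regression on combined demographic, morphomic, and GA features and estimates a cross-validated AUC with scikit-learn; the feature names are hypothetical, and with n=63 the AUC should be cross-validated to limit optimism.

```python
# Elastic net logistic regression for pancreatic fistula using hypothetical
# demographic, morphomic, and geriatric-assessment feature names.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("pd_cohort.csv")            # hypothetical analytic file
features = ["age", "bmi", "ces_d_exhaustion", "ves13",
            "subcutaneous_fat_hu", "visceral_fat_area", "visceral_fat_hu",
            "psoas_mean_hu", "psoas_low_density_area"]

model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="elasticnet", solver="saga",
                       l1_ratio=0.5, C=1.0, max_iter=5000),
)
pred = cross_val_predict(model, df[features], df["fistula"],
                         cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC:", round(roc_auc_score(df["fistula"], pred), 3))
```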

Results:

The median patient age was 67 (range 37-85) years and the median BMI was 27.0 (range 18.9-50.5).  In total, 15/63 patients (23%) had a documented PF: 8 patients had an ISGPF grade A PF (53%), 6 had a grade B PF (40%), and one had a grade C PF (7%). On univariate analysis, PF was associated with CES-D exhaustion (p=0.005), VES-13 (p=0.038), subcutaneous fat HU (p=0.009), visceral fat area (p=0.035), visceral fat HU (p=0.001), average psoas HU (p=0.040), and psoas low density muscle area (p=0.049).  A predictive model based on demographics, analytic morphomics, and GA had a high AUC for predicting PF (AUC=0.915) when compared to a clinical “base model” including age, BMI, ASA class, and Charlson comorbidity index (AUC=0.685).

Conclusion:

Preoperatively measured AM, in combination with GA, can accurately predict the likelihood of PF following PD.  Validation of this model on a larger cohort would provide surgeons with a practical tool to more accurately risk-stratify patients prior to PD.