39.10 Improvements in Long-Term Survival of Kidney Transplant Patients Over the Past Three Decades

C. Y. Wu1, A. C. Thorsen1, T. Galvan1, A. Rana1  1Baylor College of Medicine, Division of Abdominal Transplantation, Houston, TX, USA

Introduction: Kidney transplant is the most common solid organ transplant and an important life-prolonging treatment. Over the past three decades, there have been many advances in kidney transplant patient selection and management. However, there has been no research on whether these gains have translated into improved overall patient survival. This project examined all kidney transplant data over the past 30 years to determine whether long-term outcomes (patients surviving >1 year) and short-term outcomes (survival <1 year) have improved over time, allowing a fuller evaluation of the progress in survival outcomes in recent kidney transplant history.

Methods: This project retrospectively analyzed 370,959 adult (>18 years) patients who received kidney transplants between 1987 and 2017 using the United Network for Organ Sharing (UNOS) database. The collected data were analyzed using Stata statistical software. The Kaplan-Meier method for time-to-event analysis and multivariable Cox regression were employed to estimate survival. Additional variables and a cause-of-death analysis were also included.
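
For readers who want to reproduce this style of analysis, a minimal Python sketch using the lifelines library is shown below. It is an illustration only, not the authors' code (the study used Stata); the file name, era labels, and column names (survival_days, died, era, and the covariates) are hypothetical.

```python
# Illustrative sketch, not the authors' pipeline: era-stratified
# Kaplan-Meier curves plus a multivariable Cox model, as described
# in the Methods. All file and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("unos_kidney.csv")          # hypothetical UNOS extract

# The long-term analysis is conditional on surviving the first year
longterm = df[df["survival_days"] > 365]

# One Kaplan-Meier curve per transplant era
kmf = KaplanMeierFitter()
for era, grp in longterm.groupby("era"):     # e.g. "1987-1990", ...
    kmf.fit(grp["survival_days"], grp["died"], label=era)
    kmf.plot_survival_function()
plt.show()

# Multivariable Cox regression with 2011-2017 as the reference era:
# drop that dummy so the other eras' hazard ratios are relative to it
era_dummies = pd.get_dummies(longterm["era"]).drop(columns="2011-2017")
covs = longterm[["survival_days", "died", "recipient_age", "bmi",
                 "male", "donor_age"]].join(era_dummies.astype(float))

cph = CoxPHFitter()
cph.fit(covs, duration_col="survival_days", event_col="died")
cph.print_summary()   # hazard ratios with 95% CIs per era and covariate
```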

Results: Unadjusted long-term outcomes among 1-year survivors have improved steadily over the past 30 years. Compared with the reference period 2011-2017, earlier time periods dating back to 1987 had elevated hazard ratios (p<.001), indicating that a transplant performed today carries a greater likelihood of long-term survival than one performed 30 years ago. The risk of failure relative to the current period declined from 1987-1990 (HR 1.52, CI 1.47-1.57) and 1991-1995 (HR 1.18, CI 1.17-1.20) to 2001-2005 (HR 1.08, CI 1.08-1.09) and 2006-2010 (HR 1.05, CI 1.04-1.05). Other covariates that influenced long-term patient survival included recipient age, BMI, male sex, dialysis, previous transplant, hospital status, recipient GFR, and donor age. The most frequently reported causes of death in these patients were meningitis, overdose, and spontaneous cranial bleed.

Conclusion: Over the last 30 years there have been appreciable gains in the long-term survival of kidney transplant recipients. The UNOS database proved to be a powerful tool for understanding trends in kidney transplantation. Further research should be conducted to determine which specific factors have driven the improvement in long-term survival in recent decades.
 

39.09 High Rate of Rejection in Pediatric Liver Transplantation: Is Induction Immunosuppression Needed?

C. S. Hwang1, S. Levea2, M. MacConmara1  1UT Southwestern Medical Center, Surgery, Dallas, TX, USA 2UT Southwestern Medical Center, Medicine, Dallas, TX, USA

Introduction:
The relationship of donor and recipient age to rejection after liver transplantation has not been well studied. We investigated whether donor and recipient age affect the rates of rejection in liver transplant recipients.

Methods:

We queried the United Network for Organ Sharing database for all patients who underwent liver transplantation from 1996 to 2017. Patients were divided by adult and pediatric (<18 years) status. Donor and recipient demographic data were examined, rejection episodes within the first year (early rejection) were identified, and patient and allograft survival were calculated. Continuous variables were compared using the unpaired Student's t-test and categorical variables using either the chi-square or Fisher's exact test. A p-value of <0.05 was considered significant.
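
As a sketch of the named categorical tests (not the authors' code), the 2x2 comparison of early rejection rates can be checked in SciPy using the counts reported in the Results below:

```python
# Sketch using counts from the Results: adults had 16,121 early
# rejections of 137,758 (154,113 total minus 16,355 pediatric);
# pediatric recipients had 3,121 of 16,355.
from scipy.stats import chi2_contingency, fisher_exact

# rows: adult, pediatric; columns: early rejection, no early rejection
table = [[16121, 137758 - 16121],
         [3121, 16355 - 3121]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square p = {p:.3g}")             # p < 0.001, as reported

# Fisher's exact test is the usual fallback when expected counts are small
odds_ratio, p_exact = fisher_exact(table)
print(f"Fisher's exact p = {p_exact:.3g}")
```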

Results:
154,113 patients underwent liver transplantation during this time period, and 16,355 were pediatric recipients.  16,121 (11.7%) adult patients had an early rejection episode while 3,121 (19.1%) pediatric patients had an early rejection episode (p<0.001).  When examined based upon donor age, patients receiving liver allografts from pediatric donors (15.6%) had a significantly higher rejection rate than those who received allografts from adult donors (11.8%) (p<0.001).  Pediatric patients who received pediatric allografts with rejection had significantly poorer allograft survival than similar patients without rejection (p<0.001, Graph 1).

Conclusion:
Pediatric patients who receive organs from pediatric donors are at higher risk of developing early rejection episodes.  This transplant recipient population should undergo induction immunosuppression to decrease the risk of rejection in the early post-operative period and improve allograft survival.

39.08 Elderly Patients May Benefit from Single Lung Transplantation: An Analysis of UNOS Database

M. A. Kashem1, J. Levy1, H. Zhao1, Y. Toyoda1  1Temple University, Cardiovascular Surgery, Philadelphia, PA, USA

Introduction:  The number of elderly patients requiring lung transplantation (LTx) continues to grow. However, potential recipient age discrimination remains a medical concern. We investigated the survival outcomes of elderly primary single and double LTx patients by analyzing the UNOS database.

Methods:  We analyzed the UNOS database for single and double LTx recipients from 1987 to 2014, divided into three elderly age groups by recipient age: <69 years old, 70-74 years old, and >75 years old. The groups were compared on variables such as age, gender, ethnicity, BMI, length of stay (LOS), ECMO and inhaled NO usage, blood group, and type of procedure (significance level 0.05). Survival was compared using Kaplan-Meier curves. Data were analyzed in SAS and expressed as mean ± standard deviation.
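
The pairwise survival comparisons reported below are log-rank tests; a minimal lifelines sketch (the study used SAS, and the file and column names here are hypothetical) looks like this:

```python
# Illustrative log-rank comparison between two age cohorts of single
# LTx recipients. Not the authors' code; names are hypothetical.
import pandas as pd
from lifelines.statistics import logrank_test

df = pd.read_csv("unos_lung.csv")            # hypothetical UNOS extract
single = df[df["procedure"] == "single"]

a = single[single["age_group"] == "70-74"]
b = single[single["age_group"] == ">75"]

result = logrank_test(a["survival_days"], b["survival_days"],
                      event_observed_A=a["died"],
                      event_observed_B=b["died"])
print(result.p_value)    # the abstract reports p=0.49 for this contrast
```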

Results: We reviewed 27,980 patients who received LTx, of whom 16,015 had double and 11,888 had single LTx. Within the age cohorts, 27,076 patients were <69 years old, 729 were 70-74, and 98 were >75. Demographics were 54% male, 86% white, 7% black, 5% Latino, and 2% other; recipient age 55±15 years; BMI 24±4 kg/m2; median LOS 15 days; blood groups O 45%, A 40%, B 11%, AB 4%; ECMO 1% and inhaled NO 0.21% usage. Log-rank tests for equality of survival showed significant differences by age among both single and double LTx recipients. Among single LTx age cohorts, only the 70-74 vs. >75 comparison was not significant (p=0.49); <69 vs. 70-74 (p=0.0001) and <69 vs. >75 (p=0.012) were. Double LTx recipients showed a significant difference between 70-74 vs. >75 years (p=0.01).

Conclusion: Among single LTx recipients above 70 years of age, survival did not differ by age group, whereas younger patients below 69 years of age tended to do better with double LTx. The survival benefit of single LTx therefore extends to an older age than that of double LTx.

 

39.07 Liver transplantation for cholangiocarcinoma: A single center three decade experience

F. M. Kaldas1, T. Ito1, D. S. Graham1, S. M. Younan1, V. G. Agopian1, J. DiNorcia1, H. Yersiz1, D. G. Farmer1, R. W. Busuttil1  1University of California – Los Angeles, Surgery, Los Angeles, CA, USA

Introduction: Liver transplantation (LT) for cholangiocarcinoma (CCA) remains limited to a small number of centers with mixed practice patterns and results. While the role of neoadjuvant therapy (NT) has been explored, an in-depth analysis including the type of CCA, NT, and outcomes over time is lacking.

Methods: We retrospectively reviewed a prospectively collected database of 50 consecutive adult patients who underwent LT for CCA at a single academic center (1985-2018). Multivariate Cox regression was used to examine factors impacting survival. Hilar CCA (HC) was compared to intrahepatic CCA (IC), and NT type and impact were sub-analyzed.

Results: Among 50 patients (HC 20, IC 30), 33 patients (66%) presented with large tumors (hilar ≥3cm, intrahepatic ≥5cm). Five-year overall survival (OS) after LT for the entire cohort was 35%. However, patients who underwent LT in the recent era (2008-2018) tended to have better survival than those in the older era (1985-2007) (5y OS: 41% vs 32%, p=0.290). Patients with IC tended to have better outcomes than those with HC (5y OS: 42% vs 20%, p=0.151). Furthermore, IC patients receiving LT in the recent era showed a trend toward improved 5-year survival compared to the older era (75% vs 37%, p=0.313). 22/50 patients (44%) received NT; NT type was unknown in 4 patients, excluding them from the NT sub-analysis. NT was divided into four sub-groups (no NT n=28, chemotherapy n=4, local n=3, combined chemotherapy + local n=11). All patients undergoing local NT alone and 82% of patients undergoing combined NT were in the recent era, while most patients in the no-NT group (89%) or chemotherapy-alone group (75%) were in the older era (p<0.001). Five-year OS from time of initial treatment in the combined NT group was 78%, significantly superior to all other groups (p=0.024) (Fig 1). Tumor multifocality (HR: 2.749, p=0.009) and perineural invasion (HR: 2.669, p=0.019) were significant independent predictors of poor prognosis. The absence of combined therapy also trended toward predicting a poorer outcome (HR: 3.795, p=0.075). Tumor size had no impact on survival outcomes.

Conclusion: Outcomes of LT for IC and HC have improved considerably over time. While combined (chemotherapy + local) NT followed by LT offers the best long-term outcomes, tumor size did not impact survival regardless of the NT modality used. These findings suggest that careful patient selection and management offer acceptable outcomes in LT for CCA.

 

39.06 Predicted End-Stage Renal Disease Risk in Living Kidney Donors at an Academic Medical Center

J. Jadi1, A. Nayyar2, C. Gaber3, P. D. Strassle3, D. Gerber2, A. Toledo2, P. Serrano2  1University of North Carolina at Chapel Hill, School of Medicine, Chapel Hill, NC, USA 2University of North Carolina at Chapel Hill, Division of Abdominal Transplant, Chapel Hill, NC, USA 3University of North Carolina at Chapel Hill, School of Public Health, Chapel Hill, NC, USA

Introduction:  Allogeneic kidney transplantation is the definitive treatment for patients with dialysis-dependent end-stage renal disease (ESRD). While past research has shown that the overall risk to healthy kidney donors is negligible, a growing body of evidence demonstrates that certain donor characteristics may be associated with an increased risk of developing ESRD. The goal of this study was to assess how donor characteristics, including gender, race, smoking history, and BMI, affect the long-term risk of ESRD in kidney donors at an academic medical center.

Methods: Predicted pre-donation 15-year and lifetime risks of ESRD were calculated for kidney donors between 2006 and 2016 at the University of North Carolina Medical Center, using the recently released ESRD Risk Tool for Kidney Donor Candidates (http://www.transplantmodels.com/esrdrisk/). Student's t-tests were used to compare 15-year and lifetime predicted risk across sex, race, smoking status, and obesity status.
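
As a sketch of the comparison being run, shown here in Welch's unequal-variance form of the t-test (the risk values below are simulated, not study data):

```python
# Sketch: comparing mean predicted lifetime ESRD risk between two donor
# groups with a t-test. The risk values are simulated, not study data.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
risk_male = rng.gamma(shape=2.0, scale=0.8, size=60)    # lifetime risk, %
risk_female = rng.gamma(shape=2.0, scale=0.36, size=80)

# Welch's variant (equal_var=False) avoids assuming equal variances
t, p = ttest_ind(risk_male, risk_female, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3g}")
```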

Results: Overall, the 15-year and lifetime predicted risks of ESRD were 0.19% and 1.04%, respectively. Compared to females, males had a higher mean 15-year (0.29% vs. 0.14%, p<0.0001) and lifetime (1.60% vs. 0.72%, p<0.0001) predicted risk of ESRD. Black donors, compared to whites, also had a higher 15-year (0.45% vs. 0.14%, p<0.0001) and lifetime (2.38% vs. 0.76%, p<0.0001) predicted risk of ESRD. Compared to non-smokers, smokers had a higher 15-year (0.28% vs. 0.16%, p<0.0001) and lifetime (1.50% vs. 0.86%, p<0.001) predicted risk of ESRD. No significant increase in ESRD risk was seen among obese patients compared to non-obese patients (15-year risk: 0.24% vs. 0.18%, p=0.08, and lifetime risk: 1.21% vs. 0.99%, p=0.26).

Conclusion: The overall 15-year and lifetime risks of ESRD in kidney donors were minimal (<1.5%). While overall predicted risks were low, certain characteristics were associated with higher predicted risk: male gender, current smoking, and race had a significant impact on the 15-year and lifetime predicted risk of ESRD in our patient population. This study highlights the applicability of the novel ESRD risk assessment tool as an aid to the preoperative evaluation of kidney donors. Before an individual is accepted as a donor, these characteristics should be considered as part of an individualized screening process to ensure that the benefit to the recipient is balanced against the risk to the donor.

 

39.05 Surgeon crossover between pediatric and adult centers is associated with decreased loss to follow-up

Y. Hung1,2, Y. J. Bababekov1,2, C. G. Rickert1,2, J. E. Williams1,3, D. C. Chang1,2, H. Yeh1,2  1Massachusetts General Hospital, Surgery, Boston, MA, USA 2Harvard School of Medicine, Brookline, MA, USA 3Tufts Medical Center, Boston, MA, USA

Introduction:  The risk of adverse outcomes for pediatric renal transplant patients is highest during the transition period (TP) from pediatric to adult care, attributed in part to non-compliance. While most studies focus on graft failure and death, loss to follow-up likely plays a large role in both compliance and survival. We hypothesized that many pediatric patients are lost to follow-up between ages 16 and 22, and that patients transplanted at pediatric centers with a closely affiliated adult center (AFFs) are less likely to suffer fragmentation of care and loss to follow-up.

Methods:  Patients undergoing renal transplantation at <18 years of age who had data for the entire TP, defined as ages 16 to 22, were included. The main outcome was loss to follow-up during the TP, after excluding patients who died or lost their grafts. AFFs were defined as pediatric centers whose transplant surgeons were also on staff at an adult center, identified using transplant center websites. Logistic regression was performed on Scientific Registry of Transplant Recipients data from 1979-2017, adjusting for patient and system factors.
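
A minimal sketch of this kind of adjusted logistic model in Python's statsmodels is shown below; the file and variable names are hypothetical, not actual SRTR fields.

```python
# Sketch only: adjusted logistic regression for loss to follow-up.
# File and column names are hypothetical, not actual SRTR fields.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("srtr_transition.csv")      # hypothetical analytic file

model = smf.logit(
    "lost_to_followup ~ non_aff + C(insurance) + recipient_age + C(race)",
    data=df,
).fit()

print(model.summary())
print(np.exp(model.params))   # odds ratios, e.g. ~1.5 for non-AFF centers
```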

Results: 6,762 patients were included, 92.3% of whom (n=6,243) were transplanted at one of 95 AFFs. Overall graft failure and death rates were 13.6% and 5.2%, respectively (14.1% and 5.2% at non-AFFs vs 13.6% and 5.1% at AFFs, p>0.05). However, the overall loss to follow-up rate was 32% and varied significantly between non-AFFs and AFFs (38.3% vs 31.4%, p<0.01). On adjusted analysis, patients transplanted at non-AFFs were 50% more likely to be lost to follow-up than those from AFFs (OR 1.5, p<0.01). In addition, patients with Medicaid or Medicare had higher risks of loss to follow-up than patients with private insurance (OR 1.3 and OR 1.5, respectively; all p<0.05) (Table 1).

Conclusion: The risk of loss to follow-up during the TP is alarmingly high, and is significantly higher among recipients transplanted at non-AFFs. Graft and patient survival among patients lost to follow-up are likely quite low and the reported graft and patient survival for the transition period are likely overestimated. Poor follow-up may be mitigated by improving the integration of care when transferring patients from pediatric to adult transplant centers. 

 

39.04 Post-Transplant Cancer Following Live Donor HLA-Incompatible Kidney Transplantation

J. Motter1, K. Jackson1, L. Kucirka1, A. Massie1, J. Garonzik-Wang1, S. Bae1, X. Luo1, B. Orandi2, J. Long1, M. Waldram1, A. Muzaale1, J. Coresh1, D. Segev1  1The Johns Hopkins University School of Medicine, Baltimore, MD, USA 2University of Alabama at Birmingham, Birmingham, AL, USA

Introduction: Incompatible living donor kidney transplant (ILDKT) recipients require increased immunosuppression compared to compatible living donor kidney transplant (LDKT) recipients in order to facilitate transplantation (via desensitization) and prevent acute rejection. This continued exposure to higher levels of immunosuppression may increase their risk of post-transplant cancer.

Methods: Using USRDS data on Medicare beneficiaries, we compared 3,351 LDKT to 423 ILDKT recipients from 22 US transplant centers between 1999 and 2011. Recipients were determined to have developed post-transplant cancer if they had 1 inpatient (Part A) or ≥2 outpatient (Part B) claims for one of 33 different cancers. We estimated the cumulative incidence of post-transplant cancer using the Kaplan-Meier method, and used weighted Cox regression to determine the risk of developing cancer in ILDKT compared to LDKT recipients, adjusting for recipient and transplant-related factors.

Results: The three-year cumulative incidence of post-transplant cancer was 1.4% for ILDKT and 2.0% for LDKT recipients. After adjusting for recipient and transplant-related factors, there was no difference in the risk of post-transplant cancer between ILDKT and LDKT recipients (weighted hazard ratio: 0.86, 95% CI 0.24-3.14, p=0.8). Additionally, there were no cancer-specific differences in risk for ILDKT compared to LDKT recipients after adjustment for multiple comparisons.

Conclusion: Despite exposure to higher levels of immunosuppression, ILDKT recipients are not at increased risk of developing post-transplant cancer. While ILDKT recipients are at increased risk for a variety of other post-transplant complications compared to LDKT recipients, cancer does not appear to be among them. While careful monitoring is warranted, this finding supports current immunosuppression protocols for ILDKT recipients.

 

39.03 Donor, Recipient Indices and Frailty Predict Outcomes after Heart-Kidney Transplantation

V. B. Dioguardi1, C. Biefeld1, A. Di Carlo1, K. Lau1, S. Karhadkar1  1Temple University, Surgery / Abdominal Organ Transplantation, Philadelphia, PA, USA

Introduction:
Simultaneous heart-kidney transplantation (SHKT) is increasing in frequency; however, there is a lack of consensus regarding patient selection criteria for this life-saving procedure. Frailty as measured by the mFI-5 frailty index is a new method for quantifying patient characteristics as they relate to adverse surgical outcomes, but the frailty variables defined within this index fall short in their applicability to the transplant field. For patients undergoing SHKT, symptom-based variables are inherently alleviated by transplantation itself, rendering the score unusable for long-term outcomes. Here, we sought to identify donor and recipient variables that could be used to predict outcomes after SHKT.

Methods:
Using the United Network for Organ Sharing (UNOS) database, data from patients who underwent SHKT from 2006 to 2018 were analyzed. A total of 1,061 patients were included. Recipient and donor characteristics were studied and the mFI-5 frailty index was calculated for all patients. Additional variables included pre-transplant dialysis status, functional status, kidney donor profile index (KDPI), donor age, recipient age, and time on the waitlist. The primary outcome was overall patient survival. Results were analyzed using Kaplan-Meier survival analysis and univariate and multivariate Cox regression.

Results:
A total of 1,061 consecutive SHKT were analyzed. The mean recipient age was 54 years; 80% were male. The average time on the waitlist was 198 days. The mean mFI-5 score was 2.93 with a standard deviation of 0.735. On Kaplan-Meier analysis, five variables were predictive of overall patient survival: recipient age, time on the waitlist, pre-transplant dialysis status (p<0.001), functional status (p<0.001), and donor age (p=0.011). These variables maintained their significance for predicting patient survival when combined (p<0.001). While KDPI (p<0.001) and time on the waitlist (p=0.031) were significant as isolated variables, they were no longer significant in combination.

Conclusion:
Pre-transplant dialysis and poor recipient functional status predict poor overall survival after SHKT. These variables, in combination with donor age, can be used to build an index to predict patient survival after SHKT. Such a scoring tool for SHKT candidates would support more informed and precise decision-making and maximize equity and utility in heart and kidney allocation.
 

39.02 Survival Benefit of Deceased Donor Kidney Transplantation for Recipients With Long Dialysis Vintage

K. R. Jackson1, A. Massie1, T. Purnell1, C. Holscher1, C. Haugen1, J. Long1, J. Garonzik-Wang1, D. Segev1  1Johns Hopkins University School of Medicine, Baltimore, MD, USA

Introduction:  Since implementation of the new Kidney Allocation System (KAS), deceased donor kidney transplant (DDKT) waitlist registrants with long dialysis vintage have high allocation priority. However, longer dialysis vintage is associated with worse post-transplant survival. As such, the survival benefit of DDKT compared to remaining on dialysis for these candidates is unknown.

Methods:  Using Scientific Registry of Transplant Recipients data from 2002-2016 on 13,581 DDKT waitlist registrants who were active on the waitlist >10y and >15y past dialysis initiation, we studied the survival benefit of DDKT using Cox regression, adjusting for candidate characteristics. The hazard associated with DDKT was partitioned into time intervals, so that the model separately estimated the change in hazard in the first 30 days post-transplant, days 31-90, 91-180, 181-365, 366-1095 (1-3 years post-transplant), 1096-1826 (3-5 years post-transplant), and after day 1826 (beyond 5 years post-transplant). The hazard associated with DDKT was used to compare post-transplant mortality for DDKT recipients with mortality for candidates who remained on dialysis.
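
One common way to set up such a time-partitioned hazard is to split each subject's follow-up at the interval cutpoints and interact the DDKT indicator with each window. The sketch below (Python/lifelines, hypothetical file and column names) treats DDKT as a baseline indicator for brevity; the actual analysis would handle transplantation as a time-dependent exposure.

```python
# Sketch of a time-partitioned Cox model: follow-up is split at the
# stated cutpoints and DDKT is interacted with each interval, yielding
# a separate hazard ratio per window. All names are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

CUTS = [30, 90, 180, 365, 1095, 1826]        # days, from the Methods

def split_episodes(row, cuts=CUTS):
    """Expand one subject into (start, stop] episodes at the cutpoints."""
    edges = [0] + [c for c in cuts if c < row.time] + [row.time]
    out = []
    for k, (start, stop) in enumerate(zip(edges[:-1], edges[1:])):
        out.append({"id": row.id, "start": start, "stop": stop,
                    "interval": k, "ddkt": row.ddkt,
                    "event": row.event if stop == row.time else 0})
    return out

df = pd.read_csv("srtr_waitlist.csv")        # hypothetical cohort file
episodes = pd.DataFrame(
    [ep for row in df.itertuples() for ep in split_episodes(row)])

# One DDKT indicator per interval -> interval-specific hazard ratios
for k in range(len(CUTS) + 1):
    episodes[f"ddkt_x_int{k}"] = (
        episodes["ddkt"] * (episodes["interval"] == k)).astype(float)

ctv = CoxTimeVaryingFitter()
ctv.fit(episodes.drop(columns=["ddkt", "interval"]),
        id_col="id", start_col="start", stop_col="stop",
        event_col="event")
ctv.print_summary()   # exp(coef) of ddkt_x_int6 ~ aHR beyond 5 years
```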

Results: Candidates with >10y dialysis time had a median (IQR) age of 41.1 (31.8-49.7) years at dialysis initiation; 45.5% were female, 54.5% were black, 6.1% had a calculated panel reactive antibody exceeding 80%, and the most common cause of end-stage renal disease was hypertension (39.4%). Among patients alive and waitlisted at 10y after dialysis initiation, 11/13/15-year survival past dialysis initiation was 91.5%/74.1%/60.3%. Among DDKT recipients, 1/3/5-year survival post-DDKT was 95.3%/89.4%/82.4%. DDKT was associated with substantially reduced mortality risk (adjusted hazard ratio [aHR]: 0.47, 95% CI 0.40-0.55, at >5y post-DDKT, p<0.001, Table). Among 2,450 patients waitlisted at 15y after dialysis initiation, 16/18/20-year survival past dialysis initiation was 90.9%/77.1%/61.8%. Of those who received DDKT, 1/3/5-year survival post-DDKT was 94.0%/89.1%/80.4%. DDKT was again associated with substantially reduced mortality risk (aHR: 0.39, 95% CI 0.27-0.58, at >5y post-DDKT, p<0.001, Table).

Conclusion: Patients with long dialysis vintage derive substantial survival benefit from DDKT. Since they receive high allocation priority under KAS, DDKT should be the preferred management strategy even for patients who were never previously listed for DDKT.

 

39.01 Donor-specific cell-free DNA as an emerging non-invasive biomarker of organ rejection after liver transplantation.

S. K. Goh1,2, H. Do2,3, A. Testro6, J. Pavlovic6, A. Vago6, J. Lokan4, R. M. Jones1,5,6, C. Christophi1,5,6, A. Dobrovic1,2,3,7, V. Muralidharan1,5,6  1Department of Surgery, The University of Melbourne, Austin Health, Victoria, Australia 2Translational Genomics and Epigenomics Laboratory, Olivia Newton-John Cancer Research Institute, Victoria, Australia 3School of Cancer Medicine, La Trobe University, Victoria, Australia 4Department of Anatomical Pathology, Austin Health, Victoria, Australia 5Hepato-Pancreato-Biliary & Transplant Surgery Unit, Austin Health, Australia 6Victorian Liver Transplant Unit, Austin Health, Victoria, Australia 7Department of Clinical Pathology, The University of Melbourne, Victoria, Australia

Introduction:  Assessment of donor-specific cell-free DNA (dscfDNA) in the recipient is emerging as a non-invasive biomarker of organ rejection after transplantation. We previously developed a digital PCR-based approach that readily measures dscfDNA within clinically relevant turnaround times. Using this approach, we characterised the dynamics and evaluated the clinical utility of dscfDNA after liver transplantation (LT).

Methods:  Deletion/insertion polymorphisms (DIPs) were used to distinguish donor-specific DNA from recipient-specific DNA. Post-transplant dscfDNA was measured in the plasma of the recipients. In the longitudinal cohort, dscfDNA was serially measured at days 3, 7, 14, 28 and 42 in 20 recipients. In the cross-sectional cohort, dscfDNA was measured in four clinically stable recipients (>1-year post-transplant) and 16 recipients (>1-month post-transplant) who were undergoing liver biopsies.

Results: Recipients who underwent LT without complications demonstrated an exponential decline in dscfDNA. Median levels at days 3, 7, 14, 28 and 42 were 1936, 1015, 247, 90 and 66 copies/mL, respectively. dscfDNA was higher in recipients with treated biopsy-proven acute rejection (tBPAR) than in those without. The area under the receiver operating characteristic curve of dscfDNA for tBPAR was higher than that of the liver function tests (dscfDNA: 98.8%, 95% CI 95.8-100%; ALT: 85.7%; ALP: 66.4%; GGT: 80.1%; bilirubin: 35.4%).
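
As an illustration of the AUC comparison reported here (hypothetical file and column names; not the authors' pipeline):

```python
# Sketch: per-marker AUCs for discriminating tBPAR. The file and
# column names are hypothetical, not the study's actual data.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("biopsy_cohort.csv")   # one row per biopsied recipient

for marker in ["dscfdna", "alt", "alp", "ggt", "bilirubin"]:
    auc = roc_auc_score(df["tbpar"], df[marker])   # tbpar: 1 = rejection
    print(f"{marker}: AUC = {auc:.3f}")

# A 95% CI for each AUC can be attached by bootstrapping patients
```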

Conclusion: dscfDNA as measured by probe-free droplet digital PCR methodology was reflective of organ integrity after LT. Our findings affirm the utility of dscfDNA as a promising diagnostic tool for tBPAR.

38.10 Opioid Prescribing Practices in Pediatric Surgeons: Changing in Response to the Opioid Epidemic?

K. T. Anderson1,7, M. A. Bartz-Kurycki1,7, D. M. Ferguson1,7, M. Raval5,7, D. Wakeman4,7, D. Rothstein6,7, E. Huang2,7, K. Lally1,7, K. Tsao1,7  1McGovern Medical School at UTHealth and Children’s Memorial Hermann Hospital, Pediatric Surgery, Houston, TX, USA 2University of Tennessee Health Sciences Center, Le Bonheur Children’s Hospital, Pediatric Surgery, Memphis, TN, USA 4University of Rochester School of Medicine and Dentistry, Surgery, Rochester, NY, USA 5Northwestern University, Feinberg School of Medicine, Pediatric Surgery, Chicago, IL, USA 6University of Buffalo, Pediatric Surgery, Buffalo, NY, USA 7Pediatric Surgical Research Collaborative, USA

Introduction: The crisis of opioid misuse in the United States has led healthcare providers to re-evaluate their prescribing practices and pain management strategies. This study aimed to describe pediatric surgeons' perceptions of the epidemic and their self-reported prescribing practices for common general pediatric surgical procedures.

Methods: Pediatric surgeons in the Pediatric Surgical Research Collaborative and one non-member group were surveyed. Respondents were asked about their usual (>50% of the time) pain management practices perioperatively (during or immediately after surgery) and at discharge for four common pediatric surgical scenarios: an infant after inguinal hernia repair, a young child after umbilical hernia repair, a school-aged child after laparoscopic appendectomy, and a teenager after laparoscopic cholecystectomy. Descriptive statistics and logistic regression were used for analysis.

Results: There were 171 respondents (61% response rate) with a median of 10 years in practice (IQR 4.5-20). The majority of pediatric surgeons responded that the opioid epidemic is an issue in pediatric surgery (61%), that their prescribing practices matter (79%), and that they have changed their opioid prescribing patterns (80%). Almost one-quarter of surgeons had witnessed opioid abuse problems in their practice, with 17% reporting treating pediatric patients with opioid abuse problems. Most surgeons prescribed opioids perioperatively and at discharge for a school-aged child undergoing laparoscopic appendectomy or a teenager undergoing laparoscopic cholecystectomy (Table). Opioid prescribing was less common for younger children. Presence or use of a hospital or state prescription monitoring system was not associated with opioid prescribing. Increasing years in practice, however, was associated with greater odds of prescribing opioids at discharge for infants (OR 1.07, 95% CI 1.02-1.12).

Conclusions: Most pediatric surgeons believe that opioid misuse is an important issue and have changed their practices to address it. Nevertheless, a majority of surgeons prescribe opioids to school-aged and older children after common surgical procedures.
 

38.09 National Perspective Of Firearm Violence In Pediatric Population: Does The Type Of Firearm Matter?

M. Zeeshan1, M. Hamidi1, A. Tang1, E. Zakaria1, L. Gries1, T. O’Keeffe1, N. Kulvatunyou1, A. Northcutt1, B. Joseph1  1University of Arizona, Trauma and Acute Care Surgery, Tucson, AZ, USA

Introduction:
Firearm injuries are the second leading cause of death in the US, yet there is a paucity of data regarding firearm injuries in the pediatric population. The aim of our study was to assess the prevalence of firearm injuries in pediatric trauma patients, the types of firearm used, and their impact on mortality.

Methods:
All patients aged <18 years who were admitted secondary to a firearm injury in the ACS Pediatric-TQIP (2014-16) were included. Data were recorded on demographics, ED vitals, injury parameters, intent of injury, type of firearm used, and in-hospital mortality. We performed Cochran-Armitage trend analysis and regression analysis.
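
The Cochran-Armitage test for trend in proportions across ordered groups has no single-call SciPy function; a hand-rolled version of the standard statistic, with made-up counts purely for illustration, looks like this:

```python
# Hand-rolled Cochran-Armitage trend test (standard formula); the
# event/total counts below are made up for illustration, not TQIP data.
import math

def cochran_armitage(events, totals, scores=None):
    """Two-sided z-test for a linear trend in proportions across
    ordered groups (e.g. deaths per admission year)."""
    if scores is None:
        scores = list(range(len(totals)))           # 0, 1, 2, ...
    N, R = sum(totals), sum(events)
    pbar = R / N
    t = sum(s * (r - n * pbar)
            for s, r, n in zip(scores, events, totals))
    var = pbar * (1 - pbar) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(n * s for s, n in zip(scores, totals)) ** 2 / N)
    z = t / math.sqrt(var)
    p = math.erfc(abs(z) / math.sqrt(2))            # two-sided p-value
    return z, p

# Hypothetical deaths / firearm admissions for 2014, 2015, 2016
z, p = cochran_armitage(events=[100, 130, 160],
                        totals=[1000, 1020, 1080])
print(f"z = {z:.2f}, p = {p:.4f}")
```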

Results:
We analyzed 123,835 pediatric trauma patients, of whom 3,221 (2.6%) were admitted secondary to a firearm injury. Mean age was 13±4 years, 83.3% were male, and 58% were African-American. A total of 64% of patients were shot in an act of violence, 23% had self-inflicted injuries, and 7.7% had accidental injuries. There was no difference in the incidence of violent injury between males and females; however, accidental injuries were more common in females (30% vs 20%, p=0.01) while self-inflicted injuries were more common in males (10% vs 5%, p=0.04). The overall mortality rate was 12.8% (<2y: 21.6%, 2-12y: 11.4%, 13-17y: 10.8%). Both firearm-related admissions (200/10,000 to 300/10,000 trauma admissions, p<0.001) and mortality (10% to 14%, p=0.03) increased over the study period. The most common weapon was a handgun (45%), followed by machine gun (21%) and shotgun (16%). On sub-analysis by type of firearm, patients shot by a machine gun were more likely to die (p=0.03; Table 1). On regression analysis, self-inflicted gunshot wounds (OR: 2.7 [1.8-3.9]), use of a machine gun (OR: 3.1 [1.9-5.1]), and head injury (OR: 4.1 [3.7-6.5]) were independent predictors of mortality after firearm injury.

Conclusion:
Firearm injuries are becoming more prevalent among pediatric trauma patients, and handguns are the most commonly used weapon. Self-inflicted gunshots and the use of automatic weapons such as machine guns are associated with higher mortality. Injury prevention should focus on males and females equally. Stricter gun laws and a focused national intervention targeting both handguns and automatic weapons may prevent firearm injuries in children.
 

38.08 Impact of Hydroxyurea Therapy on Surgical Splenectomy Rates in Sickle Cell Disease Patients

M. Verla1, G. Airewele2, C. Style1, H. Sriraman1, O. Olutoye1  1Baylor College of Medicine, Michael E. DeBakey Department of Surgery, Division of Pediatric Surgery, Texas Children’s Hospital, Houston, TX, USA 2Baylor College of Medicine, Department of Pediatrics, Section of Hematology-Oncology, Texas Children’s Hospital, Houston, TX, USA

Introduction:  Sickle cell disease (SCD) is typically associated with auto-splenectomy from splenic infarction; surgical splenectomy is reserved for those with sequestration crises or hypersplenism. Hydroxyurea therapy decreases the frequency and severity of sickle cell crises and thus the auto-splenectomy associated with the disease. The purpose of this study was to determine whether hydroxyurea therapy is associated with 1) an increase in the incidence of surgical splenectomy and 2) a later age at surgical splenectomy.

Methods:  We performed a retrospective review of children with SCD who underwent a surgical splenectomy at our children’s hospital between January 1990 and December 2017. Patient demographics, type of SCD, hydroxyurea use, and peri-operative data were collected. Patients were further stratified into two groups, pre-2005 and post-2005, based on the year when hydroxyurea use steadily increased at our institution. Data were analyzed using chi-square analysis and two-way multivariate analysis of variance. A p-value < 0.05 was considered statistically significant.
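
For illustration, a two-way MANOVA of this general shape can be run in Python with statsmodels; the outcome variables, factors, and data file below are hypothetical stand-ins, not the study's actual model specification:

```python
# Sketch of a two-way MANOVA: a vector of outcomes (e.g. age at
# splenectomy, length of stay) modeled against era and hydroxyurea
# use. Column names and data file are hypothetical.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("splenectomy_cohort.csv")   # hypothetical extract

mv = MANOVA.from_formula(
    "age_at_splenectomy + length_of_stay ~ era * hydroxyurea",
    data=df)
print(mv.mv_test())    # Wilks' lambda, Pillai's trace, etc. per term
```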

Results: Over the 27-year period, a total of 2,910 patients with SCD were identified and 125 children had a splenectomy. Of these, 20% (n=21) received hydroxyurea for at least 6 months prior to surgical splenectomy. Splenic sequestration and hypersplenism were the most common indications (96%) for splenectomy at a median age of 5 years (IQR: 2.6 – 9.9). The cumulative incidence of splenectomy was 4.9% pre-2005 versus 3.5% post-2005. Ninety-four children (78%) had HbSS, of whom 18 had hydroxyurea therapy for at least 6 months. Those who had a long-term history of hydroxyurea therapy had a splenectomy at a median age of 6 years (IQR: 3.4–8.9) versus 3 years (IQR: 2.2–6.4) for those who did not have a long-term history of hydroxyurea use (p=0.03, Figure 1). Regardless of the pre- or post-2005 stratification, all HbSS patients on hydroxyurea therapy had their splenectomy at a later age.

Conclusion: Although the incidence of surgical splenectomy does not appear to have increased with the introduction of hydroxyurea therapy, patients receiving hydroxyurea long-term are undergoing surgical splenectomy at an older age.
 

38.07 Pediatric Colorectal Surgery: A Collaborative Approach from a Single Institution

C. Pisano1, I. Sapci1, P. Karam1, M. M. Costedio1, A. L. DeRoss1  1Cleveland Clinic, Digestive Disease Institute/Department of Pediatric Surgery and Colorectal Surgery, Cleveland, OH, USA

Introduction: Inflammatory bowel diseases (IBD) such as Crohn’s disease and ulcerative colitis are relapsing gastrointestinal disorders commonly presenting in pediatric patients. Due to the chronic nature of these diseases, children with IBD need life-long follow-up, often requiring surgical management. While presenting symptoms are similar, the needs and expectations of treatment may differ between adult and pediatric patients. Patients initially require operations performed by pediatric surgeons, but are then followed by adult colorectal surgeons after the age of 18. The varied age of this population may cause difficulties in surgical management and continuity of care is not always well established. This may create frustration for patients and healthcare providers. There have been models in other fields establishing transitional care from the pediatric to the adult patient. However, there has been little mention of similar efforts in surgery. A collaborative system involving both pediatric and colorectal surgeons may add expertise and improve the overall experiences for pediatric colorectal patients.  We hypothesized that surgeries performed in partnership with both pediatric and adult colorectal surgeons may lead to better outcomes for these patients.

Methods: Data were gathered retrospectively from patients 18 years old or younger who underwent colorectal resections for inflammatory bowel disease between 2010 and 2017 at a single institution. Data included patient demographics (age, gender, BMI, disease, steroid or biologic agent use), type of procedure, surgical approach, specimen extraction site, surgeon involvement (pediatric, colorectal, or collaboration), operative time, and estimated blood loss. We analyzed days until passage of flatus and first bowel movement, length of stay, type of surgical procedure, and surgical complications.

Results: A total of 117 patients were included in our study. Days until flatus (2.27±0.47, p=0.049), days until first bowel movement (2.64±0.67, p=0.006), and length of stay (4.45±1.51, p=0.006) were lowest in the collaboration group. Single-incision laparoscopic surgery (SILS), compared to other laparoscopic techniques, was utilized most commonly in the collaboration group (77.8%, p=0.002). We did not see differences in surgical complication rates between any of the groups.

Conclusion: Our results show improved outcomes in pediatric patients with inflammatory bowel disease when there was collaboration between pediatric and colorectal surgeons in comparison to surgeries performed by pediatric surgeons or adult colorectal surgeons alone. Such structured cooperation may benefit transition of care and other aspects of long-term management in this patient population.

 

38.06 The Limited Utility of Routine Culture in Pediatric Pilonidal, Gluteal, and Perianal Abscesses.

M. P. Shaughnessy1, C. J. Park1, L. Zhang1, R. A. Cowles1  1Yale University School of Medicine, Department of Pediatric Surgery, New Haven, CT, USA

Introduction:

Pilonidal, gluteal, and perianal abscesses are common reasons for surgical consultation in the pediatric emergency department. When an abscess is clearly present, bedside incision and drainage (I&D) typically includes a culture swab of the abscess fluid, and patients are often discharged home with oral antibiotics. To fill a clear gap in the literature regarding culture utility and add to the existing data on antibiotic stewardship, we studied abscess culture results, examining the impact of culture data on changes in management and on outcomes. We hypothesized that in the majority of cases management is unaffected by culture data, and that fluid culture from simple pilonidal, gluteal, and perianal abscesses in the pediatric population may therefore represent an unnecessary laboratory test and cost.

Methods:

With institutional review board approval, a single-institution electronic medical record was searched to identify pediatric patients with a diagnosis of abscess who underwent I&D between February 1, 2013 and August 1, 2017. Two separate searches were conducted using ICD-10 and CPT codes. Patients from these searches were merged, duplicates were removed, and any patients with abscesses outside the gluteal region were excluded. Of the resulting 317 patient encounters, 68 were excluded due to improper coding or procedures performed outside the pediatric emergency department, leaving a final count of 249 encounters. Patients were divided into two comparison schemes for analysis: culture versus no culture, and recurrence versus no recurrence. Data were analyzed in SPSS Version 24.0 using the chi-squared test or Fisher's exact test where applicable.

Results:

Patient age distribution was bimodal, with median ages of 1 and 16 years. Abscesses were more likely to occur in females (63.1%) than in males (36.9%). The most common abscess location was the gluteal cleft (46.6%), the most frequently cultured organism was MRSA (26.1%), and the overall recurrence rate was 10.8%. Antibiotics were prescribed 80.3% of the time, most commonly Bactrim (34.5%), followed by clindamycin (30.9%). In total, culture results directly altered management in only 5 patient encounters (2.7%). When comparing culture versus no culture, no statistically significant difference in recurrence rate (p=0.4) was noted. When comparing recurrence versus no recurrence, we found no statistically significant difference by sex (p=0.68), age (p=0.11), resident type (p=0.28), vessel loop use (p=0.2), packing use (p=0.28), or antibiotic use (p=0.17).

Conclusion:

We conclude that microbiological culture results are of limited utility in the management of pediatric pilonidal, gluteal, and perianal abscesses as they do not appear to alter treatment plans and omission of culture is not associated with failure of surgical management. 

38.05 Practitioner Perceptions Surrounding the Desire of Families to Participate in Fertility Preservation

J. Vaught1,2, K. S. Corkum2,3, C. J. Harris2,3, E. E. Rowell2,3  1Feinberg School of Medicine – Northwestern University, Chicago, IL, USA 2Ann & Robert H. Lurie Children’s Hospital of Chicago, Pediatric Surgery, Chicago, IL, USA 3Feinberg School of Medicine – Northwestern University, Surgery, Chicago, IL, USA

Introduction:

With increasing survival of childhood cancer patients, the long-term consequences of medical therapy, including impaired fertility from gonadotoxic therapy, have become an important consideration in a child’s treatment plan. Given this concern, there is growing interest in fertility preservation (FP) consultation for high-risk patients, yet referral can be inconsistent even in centers with a FP program. This study aimed to assess practitioner perceptions surrounding families’ interest in FP and the desire of parents to discuss FP if their child was deemed to be at significant risk of infertility from their planned therapy.

Methods:

A survey was administered to parents of non-oncology patients and to practitioners across all medical and surgical subspecialties. Parent surveys were administered in outpatient clinics, and practitioner surveys were distributed via email. Questions focused on demographics, family history, and FP, specifically willingness to participate, cost, and attitudes toward the consultation.

Results:

A total of 164 practitioners (95.9%) and 101 parents (96.2%) completed the full survey. Compared to practitioners, parents were younger (44 years vs. 37 years, p<0.001), non-white (79.5% vs. 51.4%, p<0.001), single parents (6.2% vs. 18.1%, p<0.001), and had a total household income less than $131,201 (11.8% vs. 68.6%, p<0.001). There was no difference between parents' actual willingness to participate in FP and practitioners' perception of that willingness (90.7% vs. 96.8%, p=0.07). Practitioners grossly overestimated the distress an FP discussion would cause parents (77.5% vs. 32.7%, p<0.001) and underestimated the feeling of hope it would provide (67.1% vs. 84.7%, p<0.001). Practitioners also significantly underestimated parents' willingness to pay for FP (median [IQR] $500 [200-1000] vs. $1000 [300-5000] per year, p=0.03). In consultation, practitioners incorrectly assumed that parents would want to learn about infertility risk from a third-party fertility preservation consultant (62.6% vs. 39%, p<0.001), when in fact parents preferred discussion with their pediatrician (14.6% vs. 45.7%, p<0.001) or oncologist (73.1% vs. 64.8%, p=0.183).

Conclusion:

Parents feel less distress and more hope from an FP consultation than practitioners perceive. Moreover, they are willing to pay more for these services than presumed. These misperceptions could hinder referral for FP consultation and suggest that a standardized process for evaluating infertility risk may best serve patients and families.

38.04 Decolonization Protocols Do Not Decrease Recurrence of MRSA Abscesses in Pediatric Patients

S. T. Papastefan1,3, C. T. Buonpane1,2, G. Ares1,2,3, B. Benyamen1, I. Helenowski1,2, C. J. Hunter1,2  1Ann & Robert H. Lurie Children’s Hospital of Chicago, Pediatric Surgery, Chicago, IL, USA 2Feinberg School of Medicine – Northwestern University, Surgery, Chicago, IL, USA 3University of Illinois at Chicago, Surgery, Chicago, IL, USA

Introduction: Methicillin-resistant Staphylococcus aureus (MRSA) nasal colonization is associated with the development of future skin and soft tissue infections in children. While MRSA decolonization protocols are effective in eradicating colonization, they have not been shown to prevent recurrent MRSA infections. This study analyzed the prescription of decolonization protocols, rates of MRSA abscess recurrence, and factors associated with recurrence. We hypothesized that decolonization would decrease MRSA abscess recurrence after incision and drainage.

 

Methods: This study is a single institution retrospective review of patients ≤ 18 years of age diagnosed with MRSA culture-positive abscesses who underwent incision and drainage from January 2007 to December 2017 at a tertiary care children’s hospital. The primary outcome was MRSA abscess recurrence. MRSA decolonization protocols (for the patient and all family members) included daily mupirocin nasal ointment and sodium hypochlorite baths or chlorhexidine towel washes two to three times per week for two weeks.

 

Results: Three hundred ninety-nine patients with MRSA culture-positive abscesses who underwent incision and drainage were identified. Mean age was 3.44 ± 4.45 years, 45% were male and 94.5% had community acquired MRSA infections.  119 (29.8%) patients were prescribed the MRSA decolonization protocol. Patients with prior history of abscesses or cellulitis (32% vs 17%, p=0.002), prior MRSA infection (17.6% vs 4.6%, p<0.0001), groin/genital region abscesses (30% vs 18%, p=0.01), and incision and drainage by a pediatric surgeon (34.0% vs. 10.0%, p<0.0001) were more likely to be prescribed decolonization. Additionally, patients with a higher number of family members with a history of abscess/cellulitis (0.45 vs. 0.20, p<0.0001) or MRSA infection (0.27 vs 0.05, p<0.0001) were more likely to be prescribed decolonization. 62 patients (15.6%) had a MRSA abscess recurrence. Decolonized patients did not have lower rates of recurrence (18.5% vs 14.3%, p=0.29).  Recurrence was more likely to occur in patients with prior abscesses (p=0.004), prior MRSA infection (p=0.04), family history of abscesses (p=0.002), family history of MRSA infection (p=0.0003), Hispanic ethnicity (p=0.018), and those with fever on admission (p=0.047). In a subgroup analysis of patients with these significant risk factors, decolonization did not decrease the rate of recurrence. 

 

Conclusion: Contrary to our hypothesis, MRSA decolonization did not decrease the rate of recurrence of MRSA abscesses in our patient cohort. We found significant variability in decolonization prescription between practitioners. Patients at high risk for MRSA recurrence, such as those with a personal or family history of abscess or MRSA infection, Hispanic ethnicity, or fever on admission, did not benefit from decolonization. Future study of methods to reduce recurrence in high-risk patients is indicated.

38.03 Optimal Predictor of Gonadal Viability in Testicular Torsion: Time to Treat vs Duration of Symptoms

O. A. Morin1, M. G. Carr1, S. D. Bhattacharya1  1University of Tennessee Chattanooga, General Surgery, Chattanooga, TN, USA

Introduction:  Little published evidence shows that drastically minimizing the “time-to-treat” in testicular torsion results in fewer orchiectomies; nonetheless, the current ACS NSQIP benchmark is a time-to-treat of <2h. Duration of symptoms (DoS) may serve as a more accurate predictor. We evaluated testicular salvage rates for patients presenting with <24h versus >24h total DoS, hypothesizing that time-to-treat has little impact on salvage rates and that DoS better predicts testicular viability.

Methods:  Medical records of all male pediatric patients treated in the emergency department for suspected testicular torsion from January 1, 2016 to September 30, 2017 were retrospectively evaluated. Twenty-three patients met inclusion criteria. Statistical analysis compared testicular viability by time-to-treat, by DoS, and between patients originating in our system and those transferred from outside hospitals.

Results:

The testicular salvage rate for patients presenting directly to our ED was 50%, with an average time-to-treat of 2.6h. The salvage rate for patients transferred from an outside ED was 88.9%, with an average time-to-treat of 5.1h. Overall testicular survival was not statistically impacted by decreasing the time-to-treat by an average of 3h (p=0.189).

When comparing DoS, patients presenting directly to the ED showed a 77.8% testicular salvage rate with DoS <24 hours versus a 16% salvage rate with DoS >24 hours (p=0.041). Within the total population (N=23), overall salvage rates also differed significantly between patients presenting with <24h versus >24h total DoS (p=0.023).

Conclusion: In this case series, a duration of symptoms <24h appears to be a better predictor of ultimate outcome and testicular salvage than a shortened time-to-treat. This is a meaningful metric for accurate pre-operative counselling of parents and may better focus quality improvement efforts on this topic.

38.02 Household Chronic Opioid Use Increases Risk of Persistent Opioid Use after Surgery among Teens

C. M. Harbaugh1,5, J. S. Lee1, K. Chua3,6, B. Kenney5, T. J. Iwashyna2,5, M. J. Englesbe1,5, C. M. Brummett5,7, A. S. Bohnert4,5, J. F. Waljee1,5  1University of Michigan, Department of Surgery, Ann Arbor, MI, USA 2University of Michigan, Department of Internal Medicine, Division of Pulmonary Critical Care Medicine, Ann Arbor, MI, USA 3University of Michigan, Department of Pediatrics and Communicable Diseases, Ann Arbor, MI, USA 4University of Michigan, Department of Psychiatry, Ann Arbor, MI, USA 5University of Michigan, Institute for Healthcare Policy and Innovation, Ann Arbor, MI, USA 6University of Michigan, Child’s Health Evaluation and Research Center, Ann Arbor, MI, USA 7University of Michigan, Department of Anesthesia, Division of Pain Medicine, Ann Arbor, MI, USA

Introduction:  Opioid-naïve teens prescribed opioids after surgery have a substantial risk of developing prolonged opioid use. Prior work has shown that this risk is associated with individual-level factors such as mental health disorders and prior substance use. It is unknown whether the risk of persistent use varies with exposure to chronic opioid use among family members.

Methods:  We performed a retrospective analysis of 2010-2016 Truven MarketScan Commercial Claims, including opioid-naïve patients aged 13–21 years who underwent surgery without a subsequent procedure or anesthesia, filled a perioperative opioid prescription, and had dependent status with ≥1 family member on the same insurance plan (N=257,550). The dependent variable was new persistent opioid use (NPOU, ≥1 prescription 91–180 days after surgery). The main independent variable was household chronic opioid use (≥120 days or ≥3 prescriptions within 90 days in the year before surgery among any family member). We used generalized estimating equations with robust standard errors clustered at the family level to model NPOU as a function of household chronic use, controlling for patient demographics, patient mental health and chronic pain conditions, and household mental health and chronic pain conditions. Sensitivity analysis evaluated the association with the number of adult (primary insurance holder/spouse) and youth (other dependent) family members with chronic opioid use. The average marginal effect for the primary outcome was calculated using observed values.
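
A minimal sketch of a family-clustered logistic GEE of this shape, in Python's statsmodels, is shown below; all file and column names are hypothetical, since the claims data are not reproduced here.

```python
# Sketch only: logistic GEE with family-level clustering, mirroring
# the model described above. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("claims_cohort.csv")        # hypothetical analytic file

model = smf.gee(
    "npou ~ household_chronic_use + age + female"
    " + mental_health_dx + chronic_pain_dx",
    groups="family_id",                      # cluster on family
    data=df,
    family=sm.families.Binomial(),           # logistic link
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

print(model.summary())   # GEE reports robust (sandwich) SEs by default
# Average marginal effects can be computed by contrasting mean
# predicted probabilities with household_chronic_use set to 1 vs 0.
```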

Results: In this cohort, 4.3% of patients (11,087) had a household member with chronic opioid use; the unadjusted NPOU rate was 2.5% overall. Patients and family members in households with chronic opioid use were more likely to have chronic pain, mental health, and opioid use disorders than families without chronic opioid use. The adjusted odds of NPOU were 57% higher among patients with any household chronic use (aOR 1.57, 95% CI 1.42-1.74). The adjusted rate of NPOU was 2.4% for patients with no household chronic use versus 4.1% for those with household chronic use (Figure). On sensitivity analysis, NPOU was significantly associated with chronic opioid use among adults in a dose-dependent manner (one adult: aOR 1.55, 95% CI 1.40-1.72; two adults: aOR 1.63, 95% CI 1.12-2.37), but not with chronic use among other youths (aOR 1.49, 95% CI 0.89-2.49).

Conclusion: The risk of persistent opioid use after surgery was significantly higher among adolescents who had family members with chronic opioid use. This suggests that household opioid use patterns should be assessed to determine which patients might require closer monitoring and heightened anticipatory guidance when opioids are prescribed.

 

37.10 Social Determinants of Psychological Distress in GI Cancer Patients

K. J. Lafaro1, A. Li2, C. J. LaRocca1, J. Rodriguez3, K. Clark3, M. Lozcalzo3, F. L. Wong2, S. Warner1  1City of Hope National Medical Center, Department of Surgery, Duarte, CA, USA 2City of Hope National Medical Center, Department of Population Sciences, Duarte, CA, USA 3City of Hope National Medical Center, Department of Supportive Care Medicine, Duarte, CA, USA

Introduction: The diagnosis and treatment of cancer can cause patients significant psychological distress, which can negatively impact treatment morbidity, recovery, and survival. Multiple factors may place patients at increased risk for psychological distress, and programs that identify these high-risk patients and pre-emptively intervene may improve care and outcomes. Herein, we investigate the association of biopsychosocial factors with psychological distress in patients with gastrointestinal malignancies.

Methods: A validated forty-eight-item electronic distress screen was administered to new patients in the medical and surgical oncology clinics of a large cancer center from 2009-2015. Responses were recorded on a five-point Likert scale from 1, indicating no problem, to 5, indicating a very severe problem. Univariate and multivariate logistic regression analyses were performed to identify factors that impacted patients' psychological distress.

Results: A total of 1027 patients participated in biopsychosocial screening at a single institution from 2009-2015. The majority of patients had either colorectal (50%) or hepatobiliary (31%) malignancies. The median age at screening was 63; 48% of patients were female and 61% were Caucasian. Half of the patients had at least a college degree. On univariate analysis, not having a partner (OR 0.67, 95% CI 0.49-0.93, p=0.02), income <$40,000 (OR 0.52, 95% CI 0.36-0.76, p=0.01), and tobacco use (OR 0.7, 95% CI 0.52-0.93, p=0.01) were significantly associated with overall psychological distress. Obesity was specifically associated with feeling anxious and fearful (OR 0.55, 95% CI 0.34-0.87, p<0.001). Age, gender, race, non-English primary language, and level of education were not associated with increased psychological distress. On multivariate analysis, not being partnered, income <$40,000, and obesity remained significantly associated with increased psychological distress, fearfulness, and anxiety (p<0.05).

Conclusion: Psychological distress can have a significant impact on cancer care and patient survival. It is feasible to identify GI cancer patients at high risk for psychological distress, including those who are not partnered, are obese, or have incomes <$40,000. Identifying these patients will allow for early intervention and potentially improved cancer recovery and survival.