99.20 Impact of Donor Pre-Procurement Cardiac Arrest on Outcomes in Pediatric Liver Transplantation

J. Schroering1, T. Hathaway1, C. A. Kubal1, R. S. Mangus1  1Indiana University School Of Medicine,Surgery / Transplant,Indianapolis, IN, USA

Introduction:
Donor pre-procurement cardiac arrest (PPCA) has previously been considered detrimental to overall donor quality in liver transplantation. However, these liver grafts are now in widespread use, with research in adult populations demonstrating the safety of using these organs. The outcomes associated with the utilization of these liver grafts specifically in a pediatric population have not been described. Our study aims to evaluate the routine use of donors with a history of PPCA in the pediatric population at our center.

Methods:
This study is a single-center retrospective analysis of all pediatric liver transplants performed over a 17-year period. Donor records were reviewed for the incidence and duration of PPCA events. Donors were then stratified into four groups: no arrest time, 15 minutes or less of arrest time, 16 to 40 minutes of arrest time, or more than 40 minutes of arrest time. Pre-transplant donor and post-transplant recipient laboratory values were collected to assess the degree of liver injury associated with each donor group. Comparative survival between the groups was assessed with Cox regression analysis.
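
As a rough illustration of this design, the sketch below (Python; column and file names are hypothetical, not the study's schema) stratifies donors by arrest duration with pandas and fits a Cox proportional hazards model with lifelines. The bin edges approximate the groups described above.

```python
# Illustrative sketch only; column and file names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("donors.csv")  # one row per donor/recipient pair

# 0 minutes = no arrest; remaining bins approximate the groups above
df["ppca_group"] = pd.cut(
    df["arrest_minutes"],
    bins=[-1, 0, 15, 40, float("inf")],
    labels=["none", "<=15 min", "16-40 min", ">40 min"],
)

# Dummy-code the arrest groups ("none" is the reference category)
dummies = pd.get_dummies(df["ppca_group"], drop_first=True, dtype=float)
model_df = pd.concat(
    [df[["graft_survival_months", "graft_loss"]], dummies], axis=1
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="graft_survival_months", event_col="graft_loss")
cph.print_summary()  # hazard ratios for each arrest group vs. no arrest
```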

Results:
The records for 116 pediatric liver transplant donors and recipients were reviewed. There were 63 (54%) donors who had no cardiac arrest time prior to procurement, while 53 (46%) donors had some degree of arrest time (median=35 minutes, range=4 to 90 minutes). Donors who experienced an arrest event demonstrated a higher median pre-transplant peak alanine aminotransferase (ALT) level (p<0.001). In contrast, the cardiac arrest groups had lower or comparable post-transplant median peak ALT (p=0.05) and day 3 ALT (p=0.24) levels in the recipient compared with the non-arrest group. Rates of early graft loss were similar between the groups, including equivalent graft survival at 1 year. Differences in graft survival at 10 years did not reach statistical significance, though the group with donor arrest time over 40 minutes (n=24) had markedly lower survival on Cox regression analysis.

Conclusion:
Liver grafts from donors with or without PPCA demonstrated no statistically significant differences in function or survival. Donor PPCA alone should not be used as an exclusionary donor criterion in pediatric liver transplantation.
 

99.19 Evolving Minimally Invasive Practice for Liver Surgery – an Institutional Experience

C. C. Taylor1, C. Y. Chai1, N. N. Massarweh1, S. S. Awad1, H. S. Tran Cao1  1Baylor College of Medicine,Surgery,Houston, TX, USA

Introduction:   Liver surgery has historically been considered technically challenging and poorly suited to minimally invasive approaches.  In recent years, both laparoscopy and robotic surgery have been advocated as good, or even preferable, alternatives to open liver surgery in certain situations.  At the same time, an emphasis on parenchymal preservation has reduced the need for major hepatectomies.  The goal of this study was to evaluate our institutional experience with minimally invasive liver surgery.

Methods:   This was a retrospective review of our institutional hepatobiliary database.  Patients undergoing liver surgery between 2010 and 2016 were included in the study.  Surgical approaches were classified as open, laparoscopic (including hand-assisted laparoscopic), or robotic.  Operations included ablations, non-anatomic partial hepatectomies, minor anatomic hepatectomies (fewer than 3 segments), and major hepatectomies (3 or more segments).

Results:  During the study period, 101 liver operations with complete data available were performed, of which 44 were done in a minimally invasive manner.  Indications for surgery included hepatocellular carcinoma (85, 84.2%), intrahepatic cholangiocarcinoma (11, 10.9%), metastatic disease (4, 4.0%), and benign pathology (1, 1.0%).  All but one patient were male, and mean age was 63.4 years.  All major hepatectomies were performed via an open approach.  Of the first 50 cases, 24 (48.0%) were done laparoscopically; of these, 16 (66.7%) were ablations, 2 (8.3%) non-anatomic hepatectomies, and 5 (20.8%) minor hepatectomies, all left lateral sectionectomies.  By comparison, of the last 51 cases, 20 (39.2%) were performed in a minimally invasive manner, including 3 robotically; of these 20 cases, 4 (20%) were ablations and 15 (75%) were non-anatomic partial or minor hepatectomies.  In both groups, one minimally invasive operation was aborted after intraoperative ultrasound found the tumor to have progressed beyond the point of resectability.  The most common reason for open surgery in patients whose tumor did not require a major hepatectomy was prior abdominal surgery.

Conclusion:  With increasing experience, minimally invasive techniques can be successfully applied to liver surgery, with a clear role for resection rather than simply for ablative purposes.  Although no major hepatectomy was performed via minimally invasive approaches during this study period, laparoscopic and robotic approaches to deep or posterior lesions were achievable and safe.

 

99.18 A Longitudinal Study of Periampullary Cancer: Improved Outcomes and Increased Use of Adjuvant Therapy

C. H. Hester1, E. A. Dogeas1, M. A. Augustine1, J. A. Mansour1, P. A. Polanco1,2, M. A. Porembka1, S. A. Wang1, H. A. Zeh1, A. A. Yopp1  1University Of Texas Southwestern Medical Center,Surgical Oncology,Dallas, TX, USA 2Department of Veterans Affairs North Texas Health Care System,Surgical Oncology,Dallas, TX, USA

Introduction: Periampullary adenocarcinoma (PAC) is stratified anatomically into ampullary adenocarcinoma (AA), distal cholangiocarcinoma (DCC), duodenal adenocarcinoma (DA), and pancreatic ductal adenocarcinoma (PDAC).  We aimed to determine differences in incidence, prognosis, and treatment in stage-matched PAC patients in a longitudinal study.

Methods:  PAC patients were identified in the National Cancer Database (NCDB) from 2004-2012.  Clinicopathological variables were compared between subtypes.  Covariate-adjusted treatment use and overall survival (OS) were compared.

Results: 116,705 patients with PAC were identified: 10,320 (9%) AA, 3,732 (3%) DCC, 7,142 (6%) DA, and 95,511 (82%) PDAC.  DA, DCC, and PDAC were associated with worse survival compared to AA (HR 1.10, 95% CI 1.1-1.1; HR 1.50, 95% CI 1.4-1.6; and HR 1.90, 95% CI 1.8-1.9, respectively).  Among resected patients, DA was associated with improved survival compared to AA (HR 0.70, 95% CI 0.67-0.75); DCC and PDAC were associated with worse survival (HR 1.41, 95% CI 1.31-1.53 and HR 2.04, 95% CI 1.07-2.12).  Resected AA, PDAC, and DA, but not DCC, demonstrated significantly improved survival over the study period.  While receipt of adjuvant therapy (AT) increased over time in all patients (p<0.001), only patients with PDAC had increased receipt of neoadjuvant therapy (NAT) (p<0.001).

Conclusion: Resected PDAC, AA, and DA were associated with clinically significant improvements in survival over time, mirroring a concurrent increase in receipt of AT.

 

99.17 Pancreatic Adenocarcinoma Causing Necrotizing Pancreatitis: Not as Rare as You Think?

K. A. Lewellen1, T. K. Maatman1, M. A. Heimberger1, E. P. Ceppa1, M. G. House1, A. Nakeeb1, C. M. Schmidt1, N. J. Zyromski1  1Indiana University School Of Medicine,General Surgery,Indianapolis, IN, USA

Introduction:
Necrotizing pancreatitis presents a unique clinical challenge due to its lengthy disease course, therapeutic complexity, and numerous associated complications. Pancreatic necrosis occurs in 15-20% of acute pancreatitis patients and may result from any etiology of acute pancreatitis. Scattered reports describe pancreatic tumors causing necrotizing pancreatitis; however, the relationship between these disease processes has not yet been fully elucidated. We have treated a number of patients whose necrotizing pancreatitis was caused by pancreatic adenocarcinoma and therefore sought to clarify the clinical outcomes of these patients.

Methods:
Review of an institutional database identified necrotizing pancreatitis patients treated between 2008 and 2018. Medical record analysis of those with both necrotizing pancreatitis and pancreatic adenocarcinoma included patient demographics, date of diagnosis of necrotizing pancreatitis, date of diagnosis of pancreatic adenocarcinoma, management details, and date of last follow-up or death.

Results:
Among 647 patients treated for necrotizing pancreatitis during this time frame, seven patients (1.1%; two females, five males) had pancreatic adenocarcinoma contemporaneous with necrotizing pancreatitis. The mean age at diagnosis of necrotizing pancreatitis was 60.6 years (range 49-66). Two patients had post-procedural pancreatitis after cancer diagnosis; the remaining five had pancreatitis caused by the adenocarcinoma itself. The average interval between diagnosis of necrotizing pancreatitis and diagnosis of pancreatic adenocarcinoma was 245 days (range 23-655). For pancreatic adenocarcinoma treatment, three patients received chemotherapy alone, one received palliative radiation therapy, and one died without oncologic management. One patient’s treatment plan was unknown. Only one patient underwent operative resection of pancreatic adenocarcinoma. Median survival was 12.0 months (range 0.4-49.9).

Conclusion:
These data suggest that pancreatic adenocarcinoma may be a more common cause of necrotizing pancreatitis than previously considered. The long duration between diagnosis of necrotizing pancreatitis and pancreatic adenocarcinoma highlights the diagnostic and therefore therapeutic delay in this patient population. Pancreatic adenocarcinoma should be considered in necrotizing pancreatitis patients of this age group in whom etiology is otherwise unclear. Prompt diagnosis of pancreatic adenocarcinoma facilitates optimal treatment in this extremely challenging clinical situation.
 

99.16 Receipt of Adjuvant Therapy Among Pancreatic Cancer Patients with a High Area of Deprivation Index

J. Mora1, A. N. Krepline1, I. Akinola1, K. K. Christians1, C. N. Clarke1, B. George2, P. S. Ritch2, W. A. Hall3, B. A. Erickson3, K. S. Dua4, M. O. Griffin5, M. R. Holt5, D. B. Evans1, S. Tsai1  1Medical College Of Wisconsin,Division Of Surgical Oncology,Milwaukee, WI, USA 2Medical College Of Wisconsin,Division Of Hematology And Oncology,Milwaukee, WI, USA 3Medical College Of Wisconsin,Department Of Radiation Oncology,Milwaukee, WI, USA 4Medical College Of Wisconsin,Division Of Gastroenterology And Hepatology,Milwaukee, WI, USA 5Medical College Of Wisconsin,Department Of Radiology,Milwaukee, WI, USA

Introduction: The area of deprivation index (ADI) is a geographically based measure of socioeconomic deprivation that has been used to study the relationship between social determinants and healthcare quality and outcomes. The impact of ADI on the delivery of postoperative (adjuvant) therapy to patients with pancreatic cancer (PC) is unknown.

Methods:  Patients with localized PC who completed all neoadjuvant therapy and surgery were identified from a prospective database at the Medical College of Wisconsin.  ADI for each patient was obtained using the ZIP+4 code.  Patients were dichotomized into high and low ADI categories based on the median ADI. Clinicopathologic data, preoperative (neoadjuvant) therapy, surgical outcomes, and receipt of adjuvant therapy were abstracted.

Results: From 2009-2018, 310 patients with localized, operable PC completed all neoadjuvant therapy and surgery. Patients with missing adjuvant therapy data (n=12) were excluded. Of the remaining 298 patients, the median ADI was 97.50 (IQR 17.3); 149 patients (50%) fell into each of the high and low ADI groups.  There was no difference between groups in age, gender, clinical stage, carbohydrate antigen 19-9 at diagnosis, type of neoadjuvant therapy, or type of operation performed.  Pancreaticoduodenectomy was the most common operation (n=238; 80%). Grade 3 or higher Clavien complications occurred in 39 (13%) of the 298 patients: 19 (49%) in the low ADI group and 20 (51%) in the high ADI group (p=0.86). Adjuvant therapy was received by 167 (56%) of the 298 patients: 95 (64%) of the 149 patients with low ADI and 72 (48%) of the 149 patients with high ADI (p=0.007). In a multivariable logistic regression, high ADI was associated with 54% decreased odds of receiving any adjuvant therapy (aOR 0.46, 95% CI 0.27-0.79, p=0.005).  Age >65, neoadjuvant therapy consisting of chemotherapy and chemoradiation, and postoperative Grade 3+ complications were also independently associated with decreased odds of receiving adjuvant therapy (Table 1).
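
The abstract does not publish its model code; as a hedged sketch of this kind of multivariable analysis, the Python below (statsmodels; column names are hypothetical) dichotomizes ADI at the median and reports adjusted odds ratios by exponentiating the logistic regression coefficients. Note that an aOR of 0.46 corresponds to the reported 54% decrease in odds.

```python
# Illustrative only; column names are hypothetical, not the study's schema.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pc_cohort.csv")
df["high_adi"] = (df["adi"] > df["adi"].median()).astype(int)
df["age_over_65"] = (df["age"] > 65).astype(int)

# Adjusted model: adjuvant therapy receipt vs. ADI and other risk factors
model = smf.logit(
    "received_adjuvant ~ high_adi + age_over_65 + grade3_complication",
    data=df,
).fit(disp=0)

# Exponentiate coefficients to obtain adjusted odds ratios with 95% CIs
summary = pd.concat(
    [np.exp(model.params).rename("aOR"), np.exp(model.conf_int())], axis=1
)
print(summary)
```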

Conclusion: Following neoadjuvant therapy and surgery, 56% of patients received adjuvant therapy.  Patients from high ADI neighborhoods were significantly less likely to receive adjuvant therapy independent of other well-described risk factors, including older age, previous multimodality neoadjuvant therapy, and Grade 3+ postoperative complications. Future studies should examine the challenges of delivering care in high ADI neighborhoods.

 

99.15 Risk Factor Analysis of Pancreatic Cancer Incidence in Kentucky

M. Lin1, M. Gao1, Y. Yang1, J. Kim1  1University of Kentucky,Surgical Oncology,Lexington, KY, USA

Introduction: Pancreatic cancer is one of the most common cancers in the US; in the state of Kentucky (KY), it is the fourth most common GI cancer. The high incidence of pancreatic cancer may be attributed to high-risk behaviors such as cigarette smoking, which remains prevalent in KY. The objective of this study was to investigate the contribution of smoking behavior to the high incidence of pancreatic cancer in KY.

Methods: The Kentucky Cancer Registry (KCR), a publicly available database, was audited for cancer data from 2011 to 2014. The pancreatic cancer incidence rate in KY was compared to the national incidence rate, and pancreatic cancer incidence within KY was examined by county. Next, since smoking is a known risk factor for pancreatic cancer, we investigated the effect of smoking on pancreatic cancer incidence rates. Using the KCR, we recorded incidence rates for lung cancer, which were used as a quantitative surrogate for the distribution of smoking throughout the state.
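
Since the analysis leans on rates per 100,000, here is a minimal sketch of the crude-rate arithmetic (registry figures such as KCR's are typically age-adjusted, so this illustrates only the basic calculation; the inputs below are made up, not KCR data):

```python
def crude_incidence_per_100k(cases: int, population: int, years: int) -> float:
    """Crude incidence rate: cases per 100,000 person-years."""
    person_years = population * years
    return cases / person_years * 100_000

# Hypothetical county: 44 cases over 4 years in a population of 80,000
print(crude_incidence_per_100k(cases=44, population=80_000, years=4))  # 13.75
```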

Results: From 2011 to 2014, the incidence rate for pancreatic cancer in the US was 12.5 per 100,000 people. In KY, 3,312 people were diagnosed with pancreatic cancer over the same period, yielding an incidence rate of 13.2 per 100,000 people. County data revealed an incidence rate of 11.4 for Fayette county, which contains the second largest city in KY. The counties immediately surrounding Fayette had incidence rates of 13.8 (Woodford), 16.0 (Jessamine), and 14.1 (Clark). The largest city in KY, Louisville, is located in Jefferson county, which had a pancreatic cancer incidence rate of 14.1 per 100,000, while its surrounding counties included some of the highest rates: 16.7 (Oldham), 13.9 (Shelby), and 12.8 (Bullitt).

To investigate causes of the high incidence of pancreatic cancer, we analyzed lung cancer incidence from 2011 to 2014, which was 94.7 per 100,000 for the state. County-level analysis revealed the highest lung cancer incidence rate in the state (179.9) in Owsley county, located in the southeast region of KY. Comparatively, Fayette county had one of the lowest lung cancer incidence rates, at 71.8 per 100,000. Similarly, its three surrounding counties also had relatively low incidence rates (69.2 to 98.8) compared to Owsley county. Jefferson county had a lung cancer incidence rate of 89.2 per 100,000, and its three surrounding counties had rates ranging from 72.4 to 97.4.

Conclusion: While the pancreatic cancer incidence rate for KY was similar to that of the national incidence rate, data for each county revealed high incidence of pancreatic cancer in the counties immediately surrounding the two largest cities. Investigation of smoking behavior as a risk factor, measured by lung cancer incidence, revealed no significant correlation between smoking and pancreatic cancer incidence. Our future studies will investigate other possible risk factors contributing to the high incidence of pancreatic cancer in KY.

99.14 Use of Deceased Allografts with Nephrolithiasis in Renal Transplantation

T. Bharani1, J. A. Cagliani1, L. W. Teperman1, E. P. Molmenti1  1North Shore University And Long Island Jewish Medical Center,Surgery,Manhasset, NY, USA

Introduction:  Kidney transplantation is the preferred treatment for end-stage renal disease. Over the past decade, the number of end-stage renal disease patients requiring transplant has increased without a corresponding increase in the availability of allograft kidneys. Donor nephrolithiasis has been considered a relative contraindication due to increased morbidity and graft failure. However, the shortage of kidney allografts has led many centers to consider treating donor nephrolithiasis before transplant.

Methods:  We conducted an observational retrospective chart review of patients who underwent renal transplantation to assess the prevalence of ureteral and kidney stones in the donor graft. Kidney stones were removed on the back table before transplantation. Patients older than 18 years were included if they received a kidney from a deceased donor. Immediate and long-term complications in the transplant recipients were recorded. Donors were followed with yearly ultrasonography of the remaining kidney in addition to the standard follow-up protocol.

Results: A total of 200 adult deceased-donor kidney transplant patients were included in this study. Nephrolithiasis was found in two donor kidneys, for an incidence of 1%. This report outlines the clinical management of allograft nephrolithiasis through stone removal before transplantation. There have been no reported post-transplant complications or recurrence of nephrolithiasis in any of the recipients to date.

Conclusion: This study highlights the clinical manifestations of complications due to allograft nephrolithiasis and the treatment options available for stone removal. Kidneys with donor-gifted lithiasis should be considered for transplantation given the increasing shortage of the allograft pool. Continued long-term follow-up of recipients is still required to ensure the safety of this approach.

 

99.13 A Case of Post-Renal Transplant Sirolimus-Induced Colitis in a Pediatric Patient with Cystinosis

D. I. Garcia1, P. Murty1, A. Fernandes1, D. N. Lewin3, K. E. Twombley2, S. N. Nadig1  1Medical University Of South Carolina,Surgery,Charleston, Sc, USA 2Medical University Of South Carolina,Pediatrics,Charleston, Sc, USA 3Medical University Of South Carolina,Pathology And Laboratory Medicine,Charleston, Sc, USA

Introduction:

Cystinosis is a rare autosomal recessive lysosomal storage disorder that results in the buildup of cystine in multiple cell types. This disease particularly affects proximal tubule cells in the kidney, resulting in the development of Fanconi syndrome and subsequent renal failure. More than 90% of patients with this disease progress to end-stage renal disease by age 20 and require transplantation. Sirolimus is an anti-proliferative agent often used in post-transplantation immunosuppression regimens; its mechanism of action is inhibition of the mammalian target of rapamycin (mTOR), thereby blocking cell cycle progression from G1 to S phase. It has fallen out of favor as a first-line agent over the years due to its significant side effect profile, resulting in discontinuation rates as high as 50%. Frequently reported adverse effects include hypercholesterolemia requiring lipid-lowering agents, increased incidence of wound dehiscence and lymphoceles, and a predisposition to the development of focal segmental glomerulosclerosis. We describe the first reported case of pediatric sirolimus-induced colitis following renal transplant for nephropathic cystinosis.

Methods:

The patient underwent colonoscopy following onset of symptoms, during which several colonic biopsies were taken. Histology with hematoxylin and eosin staining was performed and evaluated by fellowship-trained, board-certified GI pathologists at our institution. In addition to testing for inflammatory, autoimmune, and infectious etiologies, donor-derived DNA and chimerism testing was performed to assess for graft-versus-host disease (GVHD). Repeat colonic biopsies were performed following resolution of symptoms.

Results:

The patient was a twelve-year-old female who, three months post-transplant, developed symptoms of colitis following a change in immunosuppressive regimen to sirolimus and prednisone after adverse reactions to mycophenolate mofetil. Histology from colonic biopsies showed gland dropout with crypt atrophy and focal apoptosis, initially concerning for a gastrointestinal manifestation of GVHD. Donor-derived DNA and chimerism testing, however, ruled out GVHD. Following sirolimus discontinuation, the patient’s symptoms completely resolved and repeat colonic biopsy showed normal histology.

Conclusion:

Serologic and histologic testing proved to be negative for other etiologies of the patient's colitis symptoms. The temporal relationship of colitis symptoms following commencement of sirolimus as well as the resolution of symptoms following discontinuation of sirolimus led us to conclude that the patient's colitis was drug-related.

99.12 FGF19 Copy Number as a Novel Theranostic Marker of Aggressive Pediatric Hepatoblastoma

A. P. Huynh1, M. L. Kueht2, A. Rana2, D. Lopez-Terrada3, J. Goss2  1Baylor College Of Medicine,Houston, TX, USA 2Baylor College Of Medicine,Michael E. DeBakey Department Of Surgery, Division Of Abdominal Transplantation,Houston, TX, USA 3Baylor College Of Medicine,Department Of Pathology And Immunology; Department Of Pediatrics,Houston, TX, USA

Introduction:  Hepatoblastoma (HB), the most common liver malignancy in children, generally presents in children under 3 years of age. Current standards for profiling risk in pediatric HB are based on tumor histologic appearance, coarse patient demographics (age at time of diagnosis), and circulating alpha fetoprotein (AFP) levels. In the era of personalized medicine, identification of a tumor’s genetic markers can reveal valuable therapeutic targets. FGF19 functions as a growth factor for liver cells and FGF19 gene amplification has been shown to act as a driver for adult hepatocellular carcinoma (HCC), but its role in HB is unknown. Here, we describe the first reported discovery of increased FGF19 copy number in a pediatric patient with aggressive HB.

Methods:  The patient’s medical records were reviewed to obtain the clinical course, and whole genome sequencing of the patient’s tumor was performed to further understand the aggressive nature of this HB.

Results: A 21-month-old male presented to our hospital with abdominal pain and elevated AFP levels. Liver biopsy confirmed an epithelial HB, and MRI demonstrated a large multifocal tumor involving multiple liver segments (PRETEXT IV, COG stage III). Despite initiation of neoadjuvant chemotherapy and two different regimens, tumor burden and AFP continued to rise. The patient ultimately underwent orthotopic liver transplantation.

Explant histology revealed 70% tumor viability. Less than one month after transplantation, rising transaminases and elevated AFP levels prompted imaging that revealed pulmonary metastases and tumor thrombi in the superior mesenteric and portal veins. The patient died shortly after recurrence.

A mutation panel revealed a CTNNB1 missense mutation consistent with embryonal HB, corroborating the histopathological diagnosis. However, whole genome amplification analysis identified amplifications in a region of chromosome 11 corresponding to FGF19, an oncogene associated with adult HCC but not previously tied to pediatric HB.

Conclusion: Increasing utilization of tumor genetics and experience with tumor histopathology have identified variants of HB with characteristics resembling HCC, and indeed, the concept of a spectrum between these two diagnoses has been posited. This case is noteworthy because it demonstrates the utility of tumor genetics in identifying prognostic markers specific to an individual case of HB. More importantly, the identification of increased FGF19 copy number implicates a potential role for FGF receptor kinase inhibitors in difficult-to-control forms of pediatric HB.

99.11 Good Outcomes after Intraperitoneal Kidney Transplantation in Small Pediatric Patients

E. A. Gerzina1, A. Huynh1, A. Thorsen1, D. O’Conor1, G. Tan1, T. Malik1, K. Bhakta2, C. O’Mahony3, E. Brewer4, T. Galván3  1Baylor College Of Medicine,Houston, TX, USA 2Texas Children’s Hospital,Renal,Houston, TX, USA 3Baylor College Of Medicine,Division Of Abdominal Transplant,Houston, TX, USA 4Baylor College Of Medicine,Department Of Pediatrics-Renal,Houston, TX, USA

Introduction:  Renal transplantation in small children weighing <30 kg is technically demanding, often due to patient size, donor-recipient size mismatch, and the congenital structural abnormalities that frequently cause ESRD in children. Though patient and allograft survival in this population is on par with adult outcomes, some studies have reported a higher incidence of certain complications. Our study reviewed kidney transplantation in small children at a high-volume surgical center in Houston, TX, in order to assess post-operative complications and outcomes and determine what risks, if any, this patient population may continue to face.

Methods: We conducted a retrospective chart review of all patients receiving intraperitoneal renal transplants at our institution from April 2011 to March 2018.  There were 48 intraperitoneal transplants in patients weighing <30 kg. We excluded patients who had a retransplant or multiple organ transplants. We assessed patient outcomes including major surgical complications, episodes of acute rejection, and patient and allograft survival.

Results:
Out of 168 renal transplant patients, 49 weighed <30 kg, and 48 of the 49 received an intraperitoneal transplant. Of these, 28 (58.3%) received a deceased donor kidney and 20 (41.7%) received a living donor kidney. Mean body weight was 19.1 ± 5.2 kg. Eight patients (16.7%) had postoperative complications: renal vein thrombosis occurred in 2 patients (4.2%), postoperative hematoma in 2 (4.2%), and primary graft dysfunction, renal artery thrombosis, urinary leak, and urinary stricture in 1 patient each (2.1%). Acute rejection was treated in 8 patients (16%) within 6 months of transplant and in 3 patients (6%) at 6 months or later post-transplant. At the time of last follow-up (maximum 85 months), patient survival was 97.9% and graft survival was 93.75%. Two patients lost allografts to chronic rejection. One patient had an episode of acute cellular rejection on POD 19 that did not resolve; he is currently on chronic peritoneal dialysis and awaiting retransplant.

Conclusion:
Although renal transplantation in pediatric patients weighing <30 kg confers greater technical challenges, it did not result in a significantly increased risk of graft failure or overall complications compared with pediatric or adult transplant recipients at large, though we did see a larger proportion of vascular complications. Interestingly, while vascular complications are responsible for about 10% of graft failures in children, no patient in this study experienced graft loss from this cause. Rates of urinary complications were also on par with adult rates, in spite of frequently abnormal urological anatomy. Overall, our findings indicate that despite the inherent technical challenges of renal transplantation in children weighing <30 kg, their outcomes and risks of complications do not differ significantly from those of the general transplant population.
 

99.10 Effect of Donor Hypernatremia on Pediatric Liver Transplant Outcomes

T. J. Hathaway1, J. R. Schroering1, C. A. Kubal1, G. S. Rao2, J. P. Molleston2, R. S. Mangus1  1Indiana University School Of Medicine,Department Of Surgery, Transplant Division,Indianapolis, IN, USA 2Indiana University School Of Medicine,Department Of Pediatric Gastroenterology,Indianapolis, IN, USA

Introduction:
Donor hypernatremia remains a concern for many transplant centers when appraising the usability of a potential liver graft. This study examines the routine use of liver grafts from deceased donors with severe hypernatremia in the pediatric population at our center.

Methods:
This is a single-center retrospective review of all pediatric liver transplants over a 17-year period. Donors were grouped by both peak and terminal serum sodium levels (level of hypernatremia) obtained from organ procurement records: normal/mild, less than 160 mEq/L; moderate, 160-169 mEq/L; and severe, 170 mEq/L or greater. Outcomes measured included post-transplant peak alanine aminotransferase (ALT) and total bilirubin levels, recipient length of hospital stay (LOS), patient death within 24 hours of transplant, and graft survival at 7 days, 30 days, and 1 year post-transplant.

Results:
There were 118 pediatric liver transplants performed during the study period. By peak donor sodium level, 20 donors (19%) had severe hypernatremia (≥170 mEq/L) and 31 (29%) had moderate hypernatremia (160-169 mEq/L). By terminal donor sodium level, only 4 donors (4%) remained severely hypernatremic and 14 (13%) moderately hypernatremic at the time of procurement. Of the entire cohort, 3 patients died within 24 hours of transplant. These three grafts had severely hypernatremic peak sodium levels; however, each had been corrected to normal levels prior to organ procurement. Graft survival at 7 days, 30 days, and 1 year did not differ significantly among the groups for either peak or terminal measures.

Conclusion:
Pediatric liver transplant recipients receiving grafts from donors with moderate or severe hypernatremia had post-transplant clinical outcomes similar to those receiving grafts from donors with serum sodium less than 160 mEq/L. This was true for both peak and terminal levels. Most donors are clearly managed actively to decrease serum sodium: 49 donors had moderate to severe peak sodium levels, but only 18 retained these high levels at terminal measurement.
 

99.09 Post-Transplant Vascular Complications in Isolated Intestine and Multivisceral Transplant Patients

A. E. Cabrales1, T. Nikumbh1, R. S. Mangus1  1Indiana University School Of Medicine,Indianapolis, IN, USA

Introduction:

Vascular complications such as aortic pseudoaneurysm, AV fistula formation, vascular graft leak, vascular graft thrombosis, and venous outflow obstruction can result in significant morbidity and mortality in abdominal organ transplant patients. In this study, we review our experience with such complications in isolated intestine and multivisceral transplant patients.

Methods:

All records for isolated intestine, multivisceral, and modified multivisceral transplants over a 15-year period at a single center were reviewed. All cases of aortic pseudoaneurysm, AV fistula formation, vascular graft leak, vascular graft thrombosis, and venous outflow obstruction were included.

Results:

Of 263 transplants, 16 major post-transplant vascular complications were identified. There were five cases of venous outflow obstruction, three of which required revision of the venous anastomosis. One case resulted in colonic necrosis necessitating colectomy, and another required venotomy and thrombectomy. Four patients developed vascular graft thrombosis, one of which was found to have splenic artery thrombosis for which a distal pancreatectomy was eventually required. The second patient developed distal arterial thrombi of the intestinal graft resulting in small bowel necrosis that required resection of the distal ileum. The third patient developed hepatic artery thrombosis which was successfully treated with intra-arterial tPA infusion and anastomotic revision; this patient also had poor portal flow that was successfully re-established with anastomotic revision. The fourth patient was found to have minimal flow in the aortic jump graft with diffuse necrosis of the transplanted organs and died shortly thereafter from complications of severe acidosis. Four cases of vascular graft leak were identified, all of which involved the aortic graft and resulted in exsanguination and death. There were two instances of aortic pseudoaneurysm, both of which were successfully treated with stent graft placement. One case of AV fistula formation was identified involving the hepatic artery and portal vein, and was successfully treated with coil embolization of the hepatic artery.

Conclusion:

Vascular complications can result in significant morbidity and mortality in intestine and multivisceral transplant patients. Post-transplant clinical assessment, including continuous hemodynamic monitoring, trending of hemoglobin, and imaging modalities such as ultrasound with duplex Doppler and CT angiography, can help identify vascular complications at an early stage and aid surgical decision-making. In cases of pseudoaneurysm and AV fistula formation, involvement of interventional radiology or vascular surgery can be beneficial.

99.08 Delayed Graft Function: It’s Not Just About the Donor Kidney

M. Gunder1, S. Kumar1, K. Lau1, A. Di Carlo1, S. Karhadkar1  1Temple University,Surgery,Philadelphia, PA, USA

Introduction:

Delayed graft function (DGF) is defined as the need for dialysis within seven days of kidney transplant. DGF has long been associated with increased risk of graft failure and decreased overall survival. Several studies have identified risk factors for DGF, focusing on donor characteristics, but there are few data on the role of recipient characteristics in predicting DGF.

Methods:

A retrospective review of consecutive kidney transplant recipients at a single institution from 2013-2018 was conducted. Data were collected on recipient variables, on donor factors including KDPI, and on the incidence of delayed graft function. Paired t-tests or Fisher exact tests were used to assess the association between each variable and DGF.
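
As a sketch of one such univariate screen, the snippet below applies scipy's Fisher exact test to a 2x2 table of KDPI category against DGF; the counts are placeholders, not the study's data:

```python
from scipy.stats import fisher_exact

#                  DGF   no DGF   (placeholder counts)
table = [[30, 40],   # KDPI > 50
         [42, 87]]   # KDPI <= 50
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```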

Results:

A total of 199 patients underwent renal allotransplantation at our institution during the study period, of whom 72 (36.18%) had delayed graft function. Factors that correlated with an increased rate of DGF included KDPI >50 (p=0.0340), time on dialysis >1600 days (p=0.0433), recipient peripheral vascular disease (p=0.0336), recipient age >58 (p=0.0138), and recipient independence score <50% (p=0.0033).

Conclusion:

Delayed graft function occurs secondary to a multifactorial process; therefore, a predictive model must include both recipient and donor factors. We identified 5 separate variables that correlated with DGF after kidney transplantation. Identification of risk factors for DGF will help tailor induction immunosuppression protocols to maximize allograft function.

99.07 Imminent Death Donation: A Survey of Organ Donor Families to Assess Public Opinion

L. Washburn1, R. Ackah2, P. Moolchandani1, D. O’Conor1, T. Galván1, A. Rana1, J. A. Goss1  1Baylor College Of Medicine,Division Of Abdominal Transplantation, Michael E. DeBakey Department Of Surgery,Houston, TX, USA 2Ohio State University,Department Of Surgery,Columbus, OH, USA

Introduction:
Imminent death donation (IDD) is a new term in the field of transplant surgery describing recovery of a living donor organ immediately prior to an impending and planned withdrawal of ventilatory support expected to result in the patient’s death. Supporters of IDD theorize that the policy could increase both the quantity and quality of procured organs. The UNOS Ethics Committee recently determined that IDD may be ethically appropriate and justifiable, but further work was not pursued due to potential risks, challenges to implementation, and lack of community support. The impact IDD would have on public trust has not been fully addressed. This study aims to initiate the process of exploring the feelings, attitudes, and values the public holds toward IDD.

Methods:
Participants will include families of donation after cardiac death (DCD) donors who donated more than one year ago. Families will be contacted via LifeGift, the organ procurement organization (OPO) serving north, southeast, and west Texas. An electronic invitation will be sent to 500 participants, and those who complete informed consent will complete an anonymous survey. The survey, reviewed by two ethicists, includes a vignette with associated questions, a comprehension check, the Organ Donation Attitude Scale (ODAS), and demographic questions. After survey collection, a statistical analysis of the data will be performed.

Results:
Results will consist of survey responses from families of DCD donors. Comprehension check questions will distinguish participants who understand the clinical vignette and the definition of IDD. The ODAS will assess support for organ donation, and the clinical vignette will assess views toward IDD. Reasons for or against IDD will be evaluated.

Conclusion:
Despite agreement among multiple ethics committees, including the UNOS Ethics Committee, that IDD is a viable and ethical option for organ procurement, the Dead Donor Rule (DDR) remains a bedrock of the public and transplant psyche: organ donation cannot lead to the death of the donor. This study serves to directly address the notion that IDD violates the public trust. Our preliminary study will carry selection bias, since participants are already familiar with the organ donation process, and will represent community opinions within a single OPO only. We plan to extend our work to OPOs across the United States to build a broader picture of public opinion regarding implementation of IDD.
 

99.06 Intraoperative Blood Loss and Transfusion During Pediatric Liver Transplantation: A Single Center Experience

J. A. Villarreal1, D. Yoeli3, J. K. Yoeli3, R. L. Ackah2, R. R. Sigireddi1, M. L. Kueht1, N. Nguyen Galvan1, R. T. Cotton1, A. Rana1, C. A. O’Mahony1, J. A. Goss1  1Baylor College Of Medicine,Division Of Abdominal Transplantation, Michael E. DeBakey Department Of Surgery,Houston, TX, USA 2The Ohio State University,Department Of Surgery,Columbus, OH, USA 3University Of Colorado Denver,Department Of Surgery,Aurora, CO, USA

Introduction: The aim of this study is to identify risk factors for massive intraoperative blood loss and transfusion in pediatric liver transplant recipients and to describe their impact on graft survival, mortality, and hospital length of stay (LOS).

Methods: We reviewed all primary pediatric liver transplants performed at our institution between September 2007 and September 2016. Data are presented as n (%) or median (interquartile range). Estimated blood loss (EBL) was standardized by weight. Massive intraoperative blood loss and massive intraoperative transfusion were defined as greater than the 85th percentile of the cohort.
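
A minimal sketch of this definition (hypothetical column and file names): weight-standardize EBL, then flag recipients above the cohort's 85th percentile.

```python
import numpy as np
import pandas as pd

df = pd.read_csv("peds_lt.csv")  # hypothetical export of the cohort
df["ebl_cc_per_kg"] = df["ebl_cc"] / df["weight_kg"]

threshold = np.percentile(df["ebl_cc_per_kg"].dropna(), 85)
df["massive_ebl"] = df["ebl_cc_per_kg"] > threshold
print(f"85th percentile = {threshold:.1f} cc/kg; "
      f"{int(df['massive_ebl'].sum())} recipients flagged as massive EBL")
```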

Results: 250 transplants were performed during the study period. Median EBL was 9.8 (5.5-21.5) cc/kg, and the 85th percentile was 34 cc/kg. Thirty-eight (15%) recipients had massive EBL. Median post-transplant LOS among those with massive EBL was 31.5 (15-58) days, compared to 11 (7-21) days among those without (p<0.001). On backwards stepwise regression, technical variant graft (OR 2.71, 95% CI 1.02 to 7.24), operative time (OR 2.77, 95% CI 1.85 to 4.15), and transfusion of FFP, platelets, and/or cryoprecipitate (OR 4.98, 95% CI 1.98 to 12.54) were significant independent risk factors for massive EBL, while admission from home (OR 0.25, 95% CI 0.097 to 0.664) was a significant protective factor. Median transfusion volume was 16 (6.9-28.8) cc/kg, and the 85th percentile was 38 cc/kg. Thirty-seven (15%) recipients had massive transfusion. Patients with massive transfusion also had a greater LOS, with a median of 34 (14-59) days compared to 11 (7-21) days among patients who did not require massive transfusion (p=0.001). On backwards stepwise regression, recipient weight (OR 0.93, 95% CI 0.88 to 0.99), technical variant graft (OR 2.84, 95% CI 1.03 to 7.83), operative time (OR 2.04, 95% CI 1.40-2.96), and transfusion of FFP, platelets, and/or cryoprecipitate (OR 6.63, 95% CI 2.55 to 17.26) were significant independent risk factors for massive transfusion, while admission from home for transplantation (OR 0.22, 95% CI 0.88 to 0.99) was a significant protective factor. Neither massive EBL nor massive transfusion was significantly associated with overall patient survival (HR 1.23, 95% CI 0.47 to 3.2 and HR 1.21, 95% CI 0.46 to 3.1, respectively). Massive transfusion was, however, a significant risk factor for 30-day graft loss (HR 2.995, 95% CI 1.02 to 8.76).
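
The abstract does not specify its exact stepwise procedure; one common reading of "backwards stepwise regression" is the elimination loop sketched below (statsmodels logistic regression, hypothetical column names): start from all candidate predictors and repeatedly drop the least significant until every remaining term clears the threshold.

```python
import statsmodels.formula.api as smf

def backward_select(df, outcome, candidates, p_threshold=0.05):
    """Drop the least significant predictor until all p-values pass."""
    kept = list(candidates)
    while kept:
        formula = f"{outcome} ~ {' + '.join(kept)}"
        model = smf.logit(formula, data=df).fit(disp=0)
        pvals = model.pvalues.drop("Intercept")
        worst = pvals.idxmax()
        if pvals[worst] <= p_threshold:
            return model  # every remaining term is significant
        kept.remove(worst)
    return None  # no predictor survived elimination

# e.g. backward_select(df, "massive_ebl",
#                      ["variant_graft", "op_hours", "ffp_transfused",
#                       "admitted_from_home", "weight_kg"])
```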

Conclusion: Pediatric liver transplant recipients with massive EBL or massive transfusion had significantly longer LOS, and massive transfusion was associated with increased 30-day graft loss. Longer operative time and technical variant grafts were significant independent risk factors for both massive EBL and massive transfusion, while admission from home prior to transplantation was protective. Recipient weight was an independent risk factor for massive transfusion, but not massive EBL.


99.05 Transplant Kidney Graft Arterial And Venous Anatomy And Clinical Outcomes

P. Brady1, W. C. Goggins1, R. S. Mangus1  1Indiana University School Of Medicine,Surgery / Transplant,Indianapolis, IN, USA

Introduction:
There is an increasing shortage of kidney allografts available for transplantation, and each available kidney must be utilized to alleviate it. Transplant kidney allografts with multiple renal arteries or veins add technical complexity to the transplant procedure. These grafts may be at higher risk for non-use or for perioperative complications such as graft thrombosis or hematoma. Occasionally, small accessory vessels are ligated, which may affect the short- and long-term function of the graft. This study reviews the arterial and venous anatomy for all kidney transplants at a single center over a 17-year period. Outcomes include early graft loss, delayed graft function, and long-term graft function.

Methods:
The operative reports for all kidney transplants at a single center between 2001 and 2018 were reviewed. Extracted data included the number of arteries and veins for each graft, as well as the type of reconstruction. Ligated vessels were recorded when reported by the surgeon. Delayed graft function was defined as a need for dialysis within 7 days of transplant. Cox regression with a direct entry method was used to assess long-term graft function. Operative technique and immunosuppression protocol were similar and consistent across all surgeons during the study period.
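
As an illustration of comparing delayed graft function between anatomy groups, a chi-square test on a 2x2 table is one standard choice (the abstract does not name its test; the counts below are placeholders):

```python
from scipy.stats import chi2_contingency

#                       DGF   no DGF   (placeholder counts)
table = [[60, 2680],  # single renal artery
         [17,  747]]  # multiple renal arteries
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```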

Results:
There were 3,504 kidney transplants performed during the study period: 2,161 (62%) from deceased and 1,343 (38%) from living donors. Data from 10 surgeons were included (range 10 to 1,923 transplants per surgeon). Donation after circulatory death (DCD) comprised 4% of all donors and 6% of deceased donors. Complete operative reports were available for 98% of grafts. A single renal artery was present in 78% of grafts and a single renal vein in 97%. There was no difference in delayed graft function between grafts with multiple and single arteries (2.2% vs 2.5%, p=0.61) or multiple and single veins (0% vs 2.5%, p=0.16). Risk of graft loss within 7 days of transplant showed no difference for single versus multiple vessels (arterial 1% vs 1%, p=0.96; venous 1% vs 1%, p=0.80). Cox regression analysis demonstrated no difference in graft survival at 15 years based on vessel anatomy (p=0.58). Median graft survival for single versus multiple vessels was 136 versus 137 months for arteries (p=0.69) and 136 versus 136 months for veins (p=0.95).

Conclusion:
These results demonstrate remarkably consistent outcomes for kidney grafts with standard and with more complex arterial and venous anatomy, and suggest that kidney grafts with complex anatomy should not be excluded from transplant.
 

99.04 Differential Post-Transplant Normalization Rates of Serum Proteins in Pediatric Liver Patients

C. M. Schmidt II1, R. S. Mangus1  1Indiana University School Of Medicine,Surgery/Transplant,Indianapolis, IN, USA

Introduction:
Pediatric patients with liver disease suffer from hypoalbuminemia, loss of muscle mass, and malnutrition. After transplant (TX), serum albumin (ALB) levels normalize at variable rates. This study aims to quantify differential rates of post-TX normalization of nutritional markers (ALB, body mass index (BMI), and body weight (BW)), with subgroup analysis by pre-TX baseline nutritional status (muscle mass (MM), BMI, and ALB) and patient demographics.

Methods:
A retrospective review of a prospectively collected transplant database (2001-2018) at a single center was performed. Weekly ALB values and monthly BW and BMI values were collected post-TX. A select group with pre-TX CT imaging underwent scaled scoring of core muscle mass (psoas muscle cross-sectional area at the L2/L3 intervertebral space divided by height²). ANOVA was used to compare nutritional marker normalization rates between subgroups.
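
The core-muscle score reduces to a simple normalization; a minimal sketch follows (units assumed: cm² for area, meters for height; the example values are hypothetical):

```python
def psoas_muscle_index(psoas_area_cm2: float, height_m: float) -> float:
    """Psoas cross-sectional area at L2/L3 divided by height squared."""
    return psoas_area_cm2 / height_m ** 2

# Hypothetical pediatric example: 6.8 cm^2 psoas area, 1.21 m tall
print(f"{psoas_muscle_index(6.8, 1.21):.2f} cm^2/m^2")  # ~4.64
```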

Results:
A total of 114 patients met study criteria, 82 with CT imaging analysis. Pre-transplant, 45% had severe sarcopenia. ALB normalized at a median of 5 weeks for the whole population. Patients with lower baseline ALB had a lower median time to normalization of ALB (p<0.01). The normalization rate of median ALB lagged in those aged <2 years compared to those aged >2 years (p<0.05) and was faster in patients with BMI >23 than in patients with BMI <23 (p<0.05). BMI and BW were unchanged except in those aged >11 years, who had an acute decrease in both during the first 30 days. Median ALB normalization rate by MM showed no significant difference across the 3 levels of sarcopenia, though there was a paradoxically more robust normalization rate in severe sarcopenia patients when compared solely to those with mild or no sarcopenia.

Conclusion:
Liver TX dramatically reverses hypoalbuminemia, at rates that differ by age and by pre-TX BMI and ALB levels. Despite ALB normalization, liver TX did not change BW or BMI. Patients who struggled to resolve their hypoalbuminemia were the youngest and had the lowest pre-TX BMI and ALB. Paradoxically, median ALB levels in the severe sarcopenia group remained the highest throughout most of the first 7 weeks. This finding was not significant when comparing all 3 groups, but a larger sample size could reveal a significant relationship mirroring that reported in the adult population. We speculate that a sarcopenia-dependent novel “ALB-stimulating factor” may exist at higher levels in severely sarcopenic patients. Understanding differences in post-TX normalization rates according to pre-TX nutritional status and demographics in this population is crucial to optimizing patient care.

 

99.03 Approval for Kidney Donation Among Obese Living Donor Candidates: The Impact of Metabolic Syndrome

T. A. Correya1, M. N. Mustian1, P. A. MacLennan1, B. A. Shelton1, R. D. Reed1, R. Grant1, B. Terry1, S. Smith1, M. Hanaway1, V. Kumar1, J. E. Locke1  1University Of Alabama at Birmingham,Birmingham, Alabama, USA

Introduction:
While the number of kidney transplant waitlist candidates is increasing, access to living kidney donation (LKD) is limited. At the same time, potential donors are becoming less healthy, with an increasing prevalence of isolated medical abnormalities such as obesity. There is no general consensus in the transplant community regarding evaluation of obese potential donors, and the characteristics that affect their approval for donation have not been well described.

Methods:
A single-center retrospective case-control study was performed, analyzing all obese (BMI ≥30 kg/m²) LKD candidates screened at our institution between 1/1/2012 and 12/31/2017. Of the 2,423 obese candidates screened, 412 were evaluated in clinic, of whom 33% were approved. Multivariable logistic regression was used to compare the potential donors who were approved for donation to their counterparts who were not approved.

Results:

There were no statistically significant differences between the two groups in age (p=0.59), gender (p=0.88), race (p=0.10), blood type (p=0.09), or family history of diabetes (p=0.66) or hypertension (p=0.42). However, LKD candidates who were approved for donation had a lower prevalence of metabolic syndrome (p<0.001) compared to those not approved. We also found that black race (aOR 0.51; 95% CI 0.31-0.85, p<0.01), increasing BMI (aOR 0.86; 95% CI 0.79-0.93, p<0.001), and metabolic syndrome (aOR 0.35; 95% CI 0.21-0.61, p<0.001) were associated with decreased odds of approval for donation.

Conclusion:
Among obese LKD candidates, metabolic syndrome was associated with significantly decreased odds of donation approval. Our findings suggest that screening for metabolic syndrome in obese potential LKDs before clinic evaluation may reduce the healthcare costs of LKD evaluations.

99.02 Renal Autotransplantation Offers Pain Relief to Patients with Loin Pain Hematuria Syndrome

N. M. Bath1, T. Al-Qaoud1, H. W. Sollinger1, R. R. Redfield1  1University Of Wisconsin,Transplant,Madison, WI, USA

Introduction: Loin Pain Hematuria Syndrome (LPHS) is a rare clinical entity in which patients typically present with severe loin pain. This pain is often debilitating to the point that patients require narcotics for pain control and are frequently unable to continue employment. Renal autotransplantation is a potential treatment, although results are varied, which highlights the importance of patient selection. To improve patient selection, we developed the UW-LPHS Test. Here we describe our initial results.

Methods: A retrospective chart review identified 15 patients with LPHS who underwent renal autotransplantation at a single institution between January 2017 and May 2018. To identify patients with LPHS likely to benefit from autotransplantation, patients underwent bupivacaine injection into the ureter via cystoscopy, known as the UW-LPHS Test. Patients who reported pain relief following the UW-LPHS Test were deemed likely to benefit from renal autotransplantation. Data points collected included pre-operative duration of pain, pre-operative pain medications, resolution of pain following the UW-LPHS Test, post-operative complications, and return to normal lifestyle.

Results: All patients had previously undergone extensive workup to determine the etiology of their pain. Nutcracker Syndrome (NCS) was identified as the cause of LPHS in 60% of patients (n=9), with 55.5% of NCS patients (n=5) having undergone previous operative interventions that failed to resolve their pain. Of the 15 patients identified, 93.3% (n=14) underwent the UW-LPHS Test, with near or complete resolution of their pain. Two patients not included in this cohort did not have pain relief following the UW-LPHS Test and were later diagnosed with interstitial cystitis. All patients reported near complete or complete resolution of their pain post-operatively. Eleven patients (73.3%) no longer require narcotics for pain control, and 4 (26.7%) are weaning their narcotic usage. Twelve patients (80%) have returned to school or employment, with 1 (6.7%) planning to return in the coming months.

Conclusion: Renal autotransplantation for Loin Pain Hematuria Syndrome has been shown to reliably provide pain relief in this patient population; however, it is imperative to properly identify patients who will benefit from autotransplantation. The UW-LPHS Test appears to be an accurate predictor of successful outcomes following renal autotransplantation. Future studies are needed to further clarify the long-term outcomes in patients with LPHS who have undergone renal autotransplantation.

 

99.01 Expedited Evaluation for Liver Transplant: Does Patient Acuity Predict Outcome?

H. J. Braun1, D. Adelmann1, M. Tavakol1, A. Mello1, C. U. Niemann1, N. L. Ascher1  1University Of California – San Francisco,San Francisco, CA, USA

Introduction:
At our center, the majority of patients listed for liver transplantation (LT) are referred as outpatients by local and regional hepatologists (traditional). However, an increasingly large number of patients are placed on our waiting list after undergoing expedited workup (expedited), whereby they undergo urgent evaluation in the hospital. The purpose of this study was to compare the outcomes of expedited versus traditional patients and to determine whether the method of evaluation and acuity of presentation impact outcome after transplantation.

Methods:
All adult patients who underwent LT at our institution between 6/1/2012 and 12/31/2016 were reviewed. Patients were excluded if they received a transplant 1) from a living donor, 2) as a retransplant, or 3) for acute liver failure. We compared demographic data, intraoperative details, and outcomes between expedited and traditional patients using Wilcoxon rank-sum and chi-squared tests. Survival was analyzed with Kaplan-Meier curves, and Cox regression was used to identify predictors of patient survival.
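
A hedged sketch of this survival comparison (lifelines; column and file names are hypothetical): Kaplan-Meier curves per evaluation group plus a log-rank test.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("lt_cohort.csv")  # hypothetical cohort export
exp = df[df["evaluation"] == "expedited"]
trad = df[df["evaluation"] == "traditional"]

kmf = KaplanMeierFitter()
for label, grp in [("expedited", exp), ("traditional", trad)]:
    kmf.fit(grp["survival_days"], event_observed=grp["died"], label=label)
    kmf.plot_survival_function()  # overlays both curves on one axis

result = logrank_test(
    exp["survival_days"], trad["survival_days"],
    event_observed_A=exp["died"], event_observed_B=trad["died"],
)
print(f"log-rank p = {result.p_value:.3f}")
```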

Results:
549 patients were included: 136 (24.7%) expedited and 413 (75.3%) traditional. Expedited patients were significantly younger (58 vs. 60 years, p<0.001), with a higher median MELD at transplant (35 vs. 14, p<0.001), fewer diagnoses of hepatocellular carcinoma (HCC; 14% vs. 62.2%, p<0.001), and a higher percentage requiring dialysis prior to transplant (47.9% vs. 33.3%, p=0.02). Expedited patients spent a shorter time on the waiting list (13 days versus 249 days, p<0.001), but there were no differences in perioperative or donor variables. There was no significant difference in survival between expedited and traditional patients at 30 days (p=0.77) or 1 year (p=0.26) post-transplant (Figure 1). In univariate regression analysis, the following variables were associated with an increased risk of death: creatinine at listing (HR 1.3, p=0.04), receipt of a liver from a donation after cardiac death donor (DCD; HR 3.27, p=0.006), and expedited evaluation (HR 2.56, p=0.01). In multivariate analysis, only DCD remained statistically significant (HR 3.26, p=0.009).

Conclusion:
Expedited LT evaluation occurs quickly but utilizes inpatient resources. It also affects decision-making during our selection committee meetings, as the acuity of these patients at presentation can be dramatic. In this review of our data, we found no significant difference in post-transplant survival between expedited and traditional patients. Future efforts will focus on examining the cost-effectiveness of inpatient evaluation and analyzing selection committee notes for expedited versus traditional patients.