49.04 Prealbumin Levels In Critically Ill Patients Correlate With CT-Derived Psoas Muscle Characteristics

N. T. Ferguson1, B. A. Warden2, J. L. Thomas3, S. P. Stawicki4  1St. Luke’s University Health Network, Department Of Medicine, Bethlehem, PENNSYLVANIA, USA; 2Temple University, St. Luke’s University Hospital Campus, Bethlehem, PENNSYLVANIA, USA; 3St. Luke’s University Health Network, Department Of Radiology, Bethlehem, PENNSYLVANIA, USA; 4St. Luke’s University Health Network, Department Of Research & Innovation, Bethlehem, PENNSYLVANIA, USA

Introduction: Physiologic changes associated with acute stress may render traditional markers of nutritional status unreliable in the intensive care unit (ICU), creating the need for more objective alternatives. One such alternative to traditional serum laboratory testing is the use of data from computed tomography (CT), including psoas muscle (PM) area and density. Our goal was to determine correlations between prealbumin and CT characteristics (e.g., density-corrected psoas area, or DCPA) in a cohort of ICU patients. We hypothesized that PM area, density, and DCPA would significantly correlate with prealbumin in this population.

Methods: A convenience sample of ICU patients was studied between Jan 2010 and Jul 2015. Data collected included demographics (age, gender, BMI); labs (prealbumin, albumin, total protein, lymphocyte counts); and abdominal CT measurements of PM density (Hounsfield units [HU]) and area (mm²). Psoas data were acquired from axial CT images at the superior aspect of the L4 vertebral body. Using advanced image processing software (GE Healthcare, Chicago, Illinois), the trace tool was used to outline PM borders, and software-generated cross-sectional area and HU were recorded. Bilateral PM data were averaged for cross-sectional area and density. The primary study variable was DCPA (average PM area/average PM density), which was further categorized as “low” (≤28) or “high” (>28) based on the mean dataset value. The permitted interval between the CT and nutritional labs was 72 hrs (based on the 3-day half-life of prealbumin). Clinical data were contrasted between the “high” and “low” DCPA groups. Univariate comparisons utilized the Mann-Whitney U-test, Student’s t-test, and Fisher’s exact test, as appropriate.
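
As an illustration only, the DCPA computation described above reduces to a few lines of arithmetic; the following Python sketch uses hypothetical measurements, and the function name and example values are ours, not the study’s.

```python
# Minimal sketch of the DCPA calculation described in the Methods:
# bilateral psoas measurements are averaged, DCPA = average area / average
# density, and the result is dichotomized at the dataset mean of 28.
def dcpa(left_area_mm2, right_area_mm2, left_hu, right_hu, cutoff=28.0):
    """Return (DCPA, "low"/"high") from bilateral psoas area (mm^2) and density (HU)."""
    avg_area = (left_area_mm2 + right_area_mm2) / 2.0
    avg_density = (left_hu + right_hu) / 2.0
    value = avg_area / avg_density
    return value, ("low" if value <= cutoff else "high")

# Hypothetical patient: psoas areas of 1450 and 1380 mm^2 at 42 and 45 HU.
value, category = dcpa(1450, 1380, 42, 45)
print(f"DCPA = {value:.1f} ({category})")  # DCPA = 32.5 (high)
```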

Results: A total of 59 ICU patients underwent abdominal CT for a variety of indications during the study period. Average patient age was 47.3 yrs, and 41/59 (69.5%) were male. Average ICU stay was 25.2 d. A total of 86 measurement pairs were analyzed. DCPA correlated with patient weight and prealbumin levels. It did not, however, correlate with BMI, lymphocyte count, albumin, or total protein determinations (Table 1). Neither average PM area nor density alone correlated with prealbumin.

Conclusion: Although neither of its constituent variables (psoas density or area) correlated with prealbumin in this pilot study, we found that DCPA ≤28 was associated with lower prealbumin levels. This identifies DCPA as a potential marker of suboptimal nutritional status. The clinical implications of this finding require independent confirmation and further investigation with larger sample sizes and greater data granularity.

49.03 Is Computed Tomography Of The Brain In Mild Traumatic Brain Injury Necessary?

T. J. Herron1, D. M. Davis1, L. Rabach1, Z. Boucher1, D. Ciesla1  1University Of South Florida College Of Medicine, Trauma & Acute Care Surgery/Surgery, Tampa, FL, USA

Introduction: There are guidelines for the use of computed tomography (CT) in the evaluation of patients with mild traumatic brain injury (TBI), defined as a Glasgow Coma Scale (GCS) of 13-15. Our clinical observation suggests that immediate head CT in patients with mild TBI infrequently leads to operative intervention or invasive intracranial pressure monitoring.

Methods: Trauma patients with an initial GCS of 13-15 who presented to a level 1 trauma center were retrospectively reviewed from a prospective database over a 3-year period. All patients underwent an initial head CT scan as part of their trauma workup and were subsequently admitted to the hospital. Patients on antiplatelet agents or anticoagulation, or with known intracranial pathology, were excluded.

Results: A total of 326 patients were reviewed. Of these, 115 met exclusion criteria, for a sample size of 211 patients. Four patients (1.9%) required operative intervention; one presented with a GCS of 15 and one with a GCS of 14, and both underwent operative intervention only after a decline in GCS while under observation. Two patients who presented with a GCS of 13 underwent operation based on initial CT findings. Two patients (0.9%) underwent invasive cranial monitoring; in both cases, ICP monitoring was initiated after a decline in GCS while under observation. There was one mortality in the study; there were no mortalities among the patients who underwent intervention. Overall, 97% of patients were observed without need for operative intervention or ICP monitoring.

Conclusion: In trauma patients presenting with a GCS of 14 or 15, intervention was necessary only if there was a decline in the clinical neurologic examination. Trauma patients with a GCS of 14 or 15 and an indication for hospital admission can be observed for a decline in neurologic status without the need for initial or serial head CTs.

 

49.02 Shedding New Light on Spontaneous Rapidly Resolving Traumatic Acute Subdural Hematomas

M. A. Brooke1, F. Castro-Moure2, A. K. Patel2, G. P. Victorino1  1University Of California – San Francisco East Bay, San Francisco, CA, USA; 2Highland Hospital, Department Of Surgery, Oakland, CALIFORNIA, USA

Introduction: In recent years, the phenomenon of rapidly resolving acute subdural hematoma (RRASDH) has gained recognition, but it remains poorly understood. Previous studies have suggested a contribution of coagulopathy to rapid resolution, as well as the presence of a "low density band" between the ASDH and the skull, representing cerebrospinal fluid washout. Our goal was to investigate the significance of RRASDH and examine potential predictive factors.

Methods: A retrospective analysis was performed of all non-operatively managed ASDHs treated at our trauma center from 2011-2015. Inclusion criteria were ASDH on computed tomography (CT), admission Glasgow Coma Scale (GCS) >7, and repeat CT to evaluate ASDH progression or regression over time. Rapid resolution was defined as a decrease in hematoma thickness of at least 50% within 72 hours. Clinical data, CT findings, and trauma endpoints were collected and analyzed for resolving and non-resolving cases.
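
For concreteness, the rapid-resolution rule above (a ≥50% decrease in hematoma thickness within 72 hours) and the resolution rate reported in the Results can be expressed as a short sketch; the measurements below are hypothetical, not study data.

```python
# Sketch of the RRASDH definition: >=50% decrease in thickness within 72 h.
# Also returns the resolution rate in mm/hr, as reported in the Results.
def classify_asdh(initial_mm, followup_mm, hours_between):
    fractional_change = (followup_mm - initial_mm) / initial_mm
    resolution_rate = (initial_mm - followup_mm) / hours_between  # mm/hr
    is_rapid = hours_between <= 72 and fractional_change <= -0.5
    return is_rapid, resolution_rate

# Hypothetical case: a 12 mm ASDH shrinking to 4 mm over 48 hours.
rapid, rate = classify_asdh(12.0, 4.0, 48.0)
print(rapid, f"{rate:.2f} mm/hr")  # True 0.17 mm/hr
```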

Results: There were 154 patients who met inclusion criteria. Change in hematoma thickness was associated with mortality: patients who died showed a mean hematoma growth of 48%, versus a 9% reduction in patients who survived (p=0.002). There were 29 cases of RRASDH, in which the average resolution rate was 0.23 mm/hr with a mean 78% size reduction. No factors predicted rapid resolution, with no differences between resolving and non-resolving groups in GCS, injury severity score, or initial CT findings (presence of cranial fractures, additional hemorrhage, cerebral edema, midline shift, or location of ASDH). The two groups also demonstrated no differences in endpoints such as mortality, ventilator days, ICU length of stay, or discharge destination. We found no difference between the groups in either the proportion of coagulopathic patients or absolute PT/INR values. Finally, there was no difference in the prevalence of the "low density band" on CT. When compared with a group of patients who experienced rapid growth (>50% increase in axial thickness within 72 hours), the RRASDH group did have lower mortality (3.4% vs. 23.5%, p=0.04).

Conclusion: To our knowledge, this is the largest review of RRASDH. Our findings contradict some of the recent literature, as well as prevailing theories for the mechanism of rapid ASDH resolution. We found no significant relationship between coagulopathy or the presence of a "low density band" and resolution of ASDH, casting doubt on these etiologic theories. However, we also found no relationship between rapid resolution and better standard trauma outcomes, calling into question the significance of rapid resolution itself. What is far clearer is that rapid growth of ASDH is a poor clinical sign.

49.01 Utility of Coagulation and Platelet Function Testing after Traumatic Brain Injury

D. Shah2, D. Hanseman1, C. Zammit3, T. Pritts1, A. Makley1, B. Foreman3, M. Goodman1  1University Of Cincinnati, Division Of Trauma And Critical Care, Cincinnati, OH, USA; 2University Of Cincinnati, College Of Medicine, Cincinnati, OH, USA; 3University Of Cincinnati, Neurology & Rehabilitation Medicine, Cincinnati, OH, USA

Introduction: Acute coagulopathy and platelet dysfunction commonly develop after traumatic brain injury (TBI). Thromboelastography (TEG) and platelet function assays (PFAs) can be performed upon admission to assess the risks of continued intracranial bleeding and injury progression. The role of TEG and PFA in assessing post-TBI coagulopathy has not been investigated. We hypothesized that, compared to blunt injuries, penetrating head injuries would (1) demonstrate greater coagulopathy by TEG, (2) be associated with altered platelet function assay results, and (3) require more blood products after TBI.

Methods: This was a retrospective study of all patients admitted to the neurosurgical intensive care unit (NSICU) of a Level 1 trauma center from January 2013 through December 2015 with a head AIS ≥3. Patients were compared by mechanism of injury (blunt vs penetrating). Admission demographics, injury characteristics, and lab parameters, including blood gas, TEG, and PFAs, were evaluated. VerifyNow® Aspirin and P2Y12 tests were used for platelet function and potential inhibitor analysis. Admission labs were included if performed within 2 hours of hospital admission. Student’s t-tests were used to analyze quantitative differences between labs and groups. Pearson correlation coefficients were calculated between platelet count, PFA results, and TEG results.
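
The univariate statistics named above are standard; a minimal SciPy sketch of the same tests is shown below, with placeholder arrays standing in for the study data.

```python
import numpy as np
from scipy import stats

# Placeholder values only; these are not study data.
blunt_prbc = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 2.0])        # PRBC units, first 4 h
penetrating_prbc = np.array([0.0, 1.0, 2.0, 0.0, 1.0, 3.0])

# Student's t-test for a quantitative difference between groups.
t_stat, p_val = stats.ttest_ind(blunt_prbc, penetrating_prbc)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")

# Pearson correlation between platelet count and a TEG parameter (here MA).
platelet_count = np.array([210, 180, 250, 300, 150, 220])
teg_ma = np.array([60.1, 55.3, 63.8, 66.0, 52.4, 61.2])
r, p = stats.pearsonr(platelet_count, teg_ma)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```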

Results: A total of 534 patients were included in the analysis (485 blunt and 49 penetrating). There were no differences between groups in platelet count or INR upon admission. Patients with penetrating TBI were more coagulopathic by TEG, with all TEG parameters significantly different except R-time (Table). There were no differences in PFA results between blunt and penetrating groups, and the PFAs did not correlate with any TEG parameter in either TBI group. Additionally, the penetrating TBI group received more units of PRBCs in the first 4 hours (mean 0.67 vs 0.17 units, p<0.01) and plasma in the first 24 hours (mean 0.83 vs 0.25 units, p=0.046) than the blunt TBI group. There were no differences in platelet or cryoprecipitate use between TBI groups.

Conclusion: Patients presenting with penetrating TBI demonstrate increased coagulopathy compared to those with blunt TBI. Penetrating TBI patients received more blood products within the first 24 hours. Platelet function testing did not correlate with TEG findings in this population. Routine admission TEG, but not VerifyNow PFA, may be utilized to demonstrate posttraumatic coagulopathy and to determine the need for targeted blood product administration after TBI.

48.20 Donor Biliary Anatomy Should Not Be a Contraindication for Right Liver Donation

K. S. Chok1, A. C. Chan1, J. W. Dai1, J. Y. Fung2, T. Cheung1, S. Sin1, T. Wong1, K. Ma1, C. Lo1  1The University Of Hong Kong, Division Of HBP Surgery And Liver Transplantation, Department Of Surgery, Hong Kong; 2The University Of Hong Kong, Division Of Gastroenterology And Hepatology, Department Of Medicine, Hong Kong

Introduction: There are scarce data on the impact of donor biliary anatomy on the incidence of biliary complications in donors and recipients after liver transplantation. This study sought to establish the relationship between donor biliary anatomy and the incidence of biliary complications after right-lobe living donor liver transplantation (RLDLT), and to determine whether donor biliary anatomy should be a contraindication to right liver donation.

Patients and Methods: A retrospective study was performed on all adult recipients of RLDLT at our center from January 2011 to December 2014. They were divided into the Stricture group and the Non-stricture group. Donor biliary anatomy was classified according to Huang’s classification, and all cholangiograms were reviewed by the first author.

Results: There were 125 RLDLTs performed during the study period. Twenty-six recipients had biliary anastomotic stricture (the Stricture group). One donor in the Stricture group and one in the Non-stricture group had biliary stricture; both of them had type-A anatomy. Bile leakage was not seen in any donor or recipient. The most common donor biliary anatomy was type A (96/125; 76.8%), followed by type B (13/125; 10.4%) and type D (10/125; 8%). Univariate analysis found no correlation between type of donor biliary anatomy and incidence of biliary anastomotic stricture in recipients (p=0.49).

Conclusions: The type of donor biliary anatomy was not related to the incidence of recipient or donor biliary complications; therefore, donor biliary anatomy should not be a contraindication to right liver donation.

48.19 The True Cost of Recurrent Hepatitis C

J. Campsen1, H. Thiesset1, R. Kim1  1University Of Utah, Transplantation And Advanced Hepatobiliary Surgery, Salt Lake City, UT, USA

Introduction:
It is commonly recognized that recurrent hepatitis C after transplantation is universal and immediate (1). The disease process within the liver is accelerated after transplantation, and 30% of transplanted livers develop cirrhosis within five years (2-4). Changes in antiviral treatment before transplantation show promising results but have yet to eliminate the threat to the newly implanted organ. Furthermore, recurrent hepatitis C after liver transplantation has a large financial impact on patients and the healthcare system. National estimates for liver transplantation and first-year costs range from $300,000 to $500,000 without significant complications such as recurrent hepatitis C (5). We aimed to examine the financial impact of recurrent hepatitis C after transplantation and report one center’s experience.

Methods:
After approval from the University of Utah Institutional Review Board and as part of a clinical trial (6), patients underwent standard-of-care antiviral therapy with sofosbuvir and ribavirin before liver transplantation. Patients 1 and 2 showed non-detectable levels of hepatitis C before inclusion in the clinical trial. The first patient was randomized to control; the second was randomized to 300 mg of Civacir® Polyclonal Immune Globulin (IgG) and received 16 doses.

Results:
Patient 1, who was randomized to control, had recurrent hepatitis C detected at day six post liver transplantation. This patient had significant complications after surgery due to the recurrent hepatitis C, including multiple hospitalizations at an estimated cost of $140,408. Furthermore, this patient required subsequent re-treatment with antiviral therapy at a national average cost of $169,000 (7) for 24 weeks of treatment (Table). The patient who received Civacir® did not have recurrent hepatitis C within the first six months of follow-up and study completion.

Conclusion:
Two patients with similar standard-of-care protocols were enrolled in a clinical trial to prevent recurrent hepatitis C. In our experience, Civacir® proved effective in preventing recurrent hepatitis C in the transplanted liver and thereby prevented the subsequent costs associated with treatment and patient care. Civacir® could potentially eliminate the need for re-transplantation and save the healthcare system hundreds of thousands of dollars.

 

48.18 Postoperative Complications Associated With Parathyroid Autotransplantation After Thyroidectomy

Z. F. Khan1, G. A. Rubio1, A. R. Marcadis1, T. M. Vaghaiwalla1, J. C. Farra1, J. I. Lew1  1University Of Miami Miller School Of Medicine, Division Of Endocrine Surgery, DeWitt Daughtry Family Department Of Surgery, Miami, FL, USA

Introduction: Permanent hypoparathyroidism is a well-recognized complication of total thyroidectomy that may acutely manifest postoperatively with muscle spasms/tetany, paresthesias, and seizures. An established procedure, parathyroid autotransplantation (PAT) can successfully prevent permanent hypoparathyroidism due to inadvertently resected or devascularized parathyroid tissue. This study examines the independent patient characteristics and postoperative complications associated with those undergoing PAT after total thyroidectomy.

Methods: A retrospective cross-sectional analysis was performed using the Nationwide Inpatient Sample (NIS) database from 2006-2011 to identify patients hospitalized for total thyroidectomy who did or did not undergo PAT. Characteristics including co-morbidities and postoperative complications were measured. Univariate and logistic regression analyses were conducted to identify characteristics independently associated with undergoing PAT. Data were analyzed using two-tailed chi-square and t-tests.
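
As a sketch of the logistic-regression step described above (not the authors' actual NIS analysis), odds ratios for undergoing PAT can be estimated from binary covariates as follows; the toy arrays and column meanings are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical rows: columns are [renal_failure, elective_procedure];
# outcome y = 1 if the patient underwent PAT. Placeholder data only.
X = np.array([[1, 1], [0, 1], [0, 0], [1, 0], [0, 1], [1, 1], [0, 0], [0, 1]])
y = np.array([1, 0, 1, 0, 1, 1, 0, 0])

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(model.params)  # exponentiated coefficients ~ odds ratios
print(odds_ratios)
```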

Results: Of 219,584 admitted patients who underwent total thyroidectomy, 14,521 (6.7%) also underwent PAT. Patients in the PAT group had fewer comorbidities, including DM, HTN, CHF, and chronic lung disease (12.5% vs 15.1%, 37.1% vs 39.9%, 1.5% vs 2.1%, and 11% vs 12%, respectively, p<0.01), and fewer cardiac complications, including stroke and MI (0% vs 0.2% and 0.1% vs 0.2%, respectively, p<0.01). However, the autotransplanted group had higher rates of renal failure (2.7% vs 2.1%, p<0.01) and thyroid malignancy (55.4% vs 43.1%, p<0.01) compared to those not autotransplanted. The PAT group also had a higher incidence of wound complications, SSI, and seroma (2.6% vs 2.1%, 0.2% vs 0.1%, and 0.2% vs 0.1%, respectively, p<0.01), unilateral vocal cord paralysis (2.4% vs 1.6%, p<0.01), substernal thyroidectomy (8.7% vs 7.5%, p<0.01), and in-hospital death (1.6% vs 0.3%, p<0.01). Immediate hypoparathyroidism (3.2% vs 1.3%, p<0.01), hypocalcemia (15% vs 8.6%, p<0.01), and tetany (0.3% vs 0.1%, p<0.01) were also associated with PAT. On multivariate analysis, renal failure (OR 2.246; 95% CI 1.448-3.485) and elective procedures (OR 1.744; 95% CI 1.422-2.129) were associated with increased odds of undergoing PAT during hospitalization for total thyroidectomy.

Conclusion: Although a known preventive measure for permanent hypoparathyroidism, PAT is associated with higher rates of postoperative complications. Patients with fewer comorbidities who undergo PAT experience higher rates of wound complications, hypoparathyroidism, hypocalcemia, and tetany. The acute severity of postoperative hypoparathyroidism may further contribute to the higher rate of in-hospital death in these PAT patients. PAT should not be performed routinely and should be utilized only in select patients with suspected compromise of parathyroid function after total thyroidectomy.

 

48.17 Subcutaneous Granular Cell Tumor: Analysis of 19 Cases Treated at a Dedicated Cancer Center

A. S. Moten3, S. Movva2, M. Von Mehren2, N. F. Esnaola1, S. Reddy1, J. M. Farma1  1Fox Chase Cancer Center, Department Of Surgery, Philadelphia, PA, USA; 2Fox Chase Cancer Center, Department Of Hematology/Oncology, Philadelphia, PA, USA; 3Temple University Hospital, Department Of Surgery, Philadelphia, PA, USA

Introduction: Granular cell tumors (GCT) are rare lesions that can occur in almost any location in the body, and there have been no large-scale studies of GCT located in the subcutaneous tissue. The aim of this study was to define the patient characteristics, treatment patterns, and outcomes of patients with subcutaneous GCT.

Methods: A retrospective chart review was performed of patients with subcutaneous GCT treated at a dedicated cancer center. Descriptive statistics were obtained, bivariate and multivariate regression were performed, and survival rates were calculated using Stata software.

Results: A total of 19 patients were treated for subcutaneous GCT at our institution between 1992 and 2015; 79% were female and 63.2% white. Mean age was 48.2 years. Most (68.4%) had comorbidities, and some (31.6%) had a history of cancer. Mean tumor size was 2.37 cm. Most patients (73.7%) underwent primary excision of their tumors without prior biopsy. Men were more likely than women to undergo re-excision for positive margins (75.0% versus 13.3%, p=0.01). No patient received adjuvant therapy. Three patients (15.8%) had multifocal tumors; they were significantly more likely to experience recurrence than patients with solitary tumors (33.3% versus 6.25%, p=0.02) and more likely to undergo repeat surgery (33.0% versus 0%, p=0.02). A total of 2 patients (10.5%) experienced recurrence, with a median time to recurrence of 23.5 months. Overall cancer-specific 5-year survival was 88.0%. There was no increased risk of death based on gender, race, or recurrence status.

Conclusion: Patients with subcutaneous GCT treated with excision fare well without adjuvant treatment. However, patients with multifocal tumors are more likely to experience recurrence and should undergo repeat surgery.

 

48.16 Effect of Hospital Safety Net Status on Treatment and Outcomes in Hepatocellular Carcinoma

A. A. Mokdad1, A. G. Singal2, J. A. Marrero2, A. C. Yopp1  1University Of Texas Southwestern Medical Center, Surgery, Dallas, TX, USA; 2University Of Texas Southwestern Medical Center, Internal Medicine, Dallas, TX, USA

Introduction: Safety net hospitals play an integral role in the care of “vulnerable” patients with cancer. Following the institution of the Affordable Care Act (ACA), the fate of safety net hospitals is unclear. Hepatocellular carcinoma (HCC) is a leading cause of cancer death and the fastest growing cancer in the United States. The role of safety net hospitals in the management of this burdensome cancer has not been investigated. This study explores the presentation, treatment, and outcomes of patients with HCC at safety net hospitals in an effort to guide resource allocation in an evolving healthcare landscape.

Methods: A total of 17,551 patients with HCC were identified in the Texas Cancer Registry between 2001 and 2012. Hospitals in the highest quartile of the disproportionate share hospital index were classified as safety net. Patient demographics, tumor presentation, treatment, and overall survival were compared among patients managed at safety net hospitals, non-safety net hospitals, or both. Risk-adjusted treatment utilization and overall survival were examined using multivariable analysis. The proportion of patients presenting at safety net hospitals over time was explored using time trend analysis. Transfer patterns between safety net and non-safety net hospitals were examined.

Results: A total of 328 acute short-term hospitals were identified, of which 74 (23%) were designated safety net. Safety net hospitals were more likely to be teaching hospitals than non-safety net hospitals; oncology and radiology resources were comparable. Forty-three percent of HCC patients sought care at a safety net hospital (33% exclusively at safety net hospitals and 10% at both safety net and non-safety net hospitals). The proportion of HCC patients presenting at safety net hospitals did not change significantly over the study period. Patients at safety net hospitals were mostly Hispanic (58%) and poor (61%). Tumor stage was comparable between hospital categories. Overall treatment utilization was lower at safety net hospitals (adjusted odds ratio [OR]=0.85, 95% confidence interval [CI]=0.78-0.92), largely related to lower chemotherapy use (26% vs. 34%, P<0.01). Overall survival was comparable (adjusted hazard ratio [HR]=1.03, 95% CI=0.99-1.08). In patients managed at both hospital groups, diagnosis and management of disease recurrence/persistence were more common at non-safety net hospitals, while first-course treatment of HCC was more common at safety net hospitals.

Conclusion: Almost one in two patients with HCC seeks care at safety net hospitals. While the fate of safety net hospitals remains uncertain under the ACA, monitoring the redistribution of HCC patients and anticipating resource allocation will be key in an evolving healthcare landscape.
 

48.15 Peritoneal Dialysis is Feasible as a Bridge to Simultaneous Liver-Kidney Transplant

R. Jones1, R. Saxena2, C. Hwang1, M. MacConmara1  1University Of Texas Southwestern Medical Center, Department Of Surgery, Division Of Surgical Transplantation, Dallas, TX, USA; 2University Of Texas Southwestern Medical Center, Department Of Internal Medicine, Division Of Nephrology, Dallas, TX, USA

Introduction: Combined chronic liver and kidney failure is a life-threatening condition that requires complex multiorgan support given complications related to fluid shifts, bleeding, and infection. Simultaneous liver-kidney transplant (SLKT) is the best therapeutic option, but scarcity of organs leaves the majority of patients on dialysis for months or years while awaiting SLKT. Hemodialysis (HD) exacerbates pre-existing intravascular instability. Peritoneal dialysis (PD) is an alternative strategy that causes less hemodynamic instability and may assist in the management of large-volume ascites. However, PD has been avoided in cirrhotics due to concerns about an elevated risk of bacterial peritonitis, treatment failure, or impairment of transplant candidacy. We describe our outcomes using PD in a group of 12 patients with combined liver and kidney failure, suggesting that PD is a preferable bridging option in patients awaiting SLKT.

Methods: Patients with advanced liver and kidney failure who were initiated on PD between January 2006 and December 2015 were identified by review of our institution’s medical records. The hospital electronic medical record and dialysis center records were used to retrieve demographic and clinical data. Outcomes included mortality, complications of catheter insertion, dialysis treatments, and need for large-volume paracentesis.

Results: Twelve patients with combined liver and kidney failure were initiated on PD during this period. Ten of 12 patients were male, with an average age of 56 years. No deaths had occurred by the completion of the study, with a mean follow-up of 4.5 years. With an average MELD score of 21, the expected three-month mortality would be 19.6 percent. These patients accrued a total of 480 months of peritoneal dialysis. Three patients received SLKT, while the 9 remaining patients continue on PD; of these 9, 4 are actively listed for SLKT. There was 1 operative complication. The rate of peritonitis was 1 episode every 44 months of PD. The need for large-volume paracentesis was eliminated entirely, given the ability to drain fluid daily. After initiating PD, patients were hospitalized a mean of 4 times and a median of 2 times (one patient was admitted 20 times for recurrent gastrointestinal bleeding).

Conclusions: Our data suggest that PD is a useful option for the management of combined liver and kidney failure and can provide durable treatment for selected patients. Peritoneal dialysis can bridge patients to transplant or serve as a long-term treatment option for patients not suitable for transplantation. We promote PD as first-line renal replacement therapy in patients with combined liver-kidney failure given its excellent outcomes and usefulness in managing ascites.

48.14 Warming during implantation: an overlooked opportunity for improvement in kidney transplantation?

Y. GoldenMerry1, H. Piristine1, P. Prabhakar1, J. Parekh1, C. Hwang1, M. MacConmara1  1UT Southwestern Medical Center, Dallas, TX, USA

Introduction:
Many factors impact outcomes of transplanted kidney allografts. There has been renewed interest in studying the effect of warm ischemia time during implantation on allograft outcomes. Longer anastomotic time leads to warming of the allograft, and organ temperature above 15 degrees Celsius at reperfusion has been shown to increase the risk of delayed graft function (DGF). DGF is a risk factor for allograft loss and has been associated with a reduction in five-year allograft survival of up to 50%. We sought to investigate the perceived importance of anastomotic time amongst practicing kidney transplant surgeons and attitudes toward the potential need for improvement in this component of the transplant process.
 

Methods:
Transplant surgeons were invited to complete an anonymous electronic questionnaire on their kidney transplant operative practices. Self-reported data on cold and warm ischemia time, percentage of imported organs, total length of operation, percentage of complex anastomotic procedures, and anastomosis time were gathered. Opinions regarding the effect of warm ischemia time on DGF, current methods to combat organ warming, and receptiveness to new technology were also collected.
 

Results:
Surgeons at seven transplant centers across the US completed the survey. Average cold ischemia time (time from cross-clamp until the organ was taken off ice) was 13.3 ± 3.7 hours, imported kidneys accounted for 12 ± 7% of transplants, and 26 ± 25% of all kidneys were placed on pulsatile perfusion prior to transplantation. The average operative time was 174 ± 37 minutes, with anastomoses taking 30 ± 5 minutes. Surgeons perceived that warm time exceeded goal in 25 ± 21% of anastomoses. Seventy percent of surgeons agreed that warming during implantation contributes to negative graft outcomes (DGF); however, 30% of responders did not employ any cooling, and the remainder used an icy wrap or tried to irrigate with cold slush while performing the anastomosis. Eighty percent of surgeons indicated they would utilize a specific cooling device if available.

 

Conclusions:
Longer anastomotic time has been demonstrated to negatively affect kidney allograft function. Our data show that surgeons recognize the negative impact of warming during anastomosis, especially in complex cases. Most surgeons do not use a specific strategy to keep the organ cool and would be very receptive to a dedicated cooling device.

 

48.13 ABO Incompatible Renal Transplant Reduces Waitlist Times: Analysis of the UNOS Database (1995-2015)

C. S. Lau1,2, K. Malik2, S. Mulgaonkar3, R. S. Chamberlain1,2,4  1Saint Barnabas Medical Center, Surgery, Livingston, NJ, USA; 2St. George’s University School Of Medicine, St. George’s, Grenada; 3Saint Barnabas Medical Center, Medicine, Livingston, NJ, USA; 4Rutgers University, Surgery, Newark, NEW JERSEY, USA

Introduction: Renal transplants significantly improve quality of life for patients with end-stage renal disease (ESRD) otherwise reliant on lifetime dialysis. However, the demand for organs exceeds the number available, increasing the need for ABO-incompatible (ABOi) renal transplantation. Current knowledge of the clinical outcomes of ABOi transplantation is limited, derived mainly from case reports and small cohort studies. This study examines a large cohort of kidney transplant patients undergoing ABOi and ABO-compatible (ABOc) transplants, in an effort to identify the demographic and clinical factors associated with graft survival outcomes and transplant waitlist times.

Methods: Demographic and clinical data for 102,084 patients undergoing renal transplant were abstracted from the United Network for Organ Sharing (UNOS) database (1995-2015). Patients were grouped into ABOc (N=101,237) and ABOi (N=847) renal transplants. Endpoints examined included waitlist times and graft survival time. Standard statistical methodology was used.

Results: Of the 102,084 patients who received a renal transplant, 847 (0.83%) received ABOi transplants and 101,237 (99.17%) received ABOc transplants. The mean age of transplant recipients was similar for ABOc (49.86 ± 15.8 years) and ABOi (50.6 ± 14.2 years) transplants. Although there were more male than female transplant recipients (ABOc: 64.0%; ABOi: 61.0%), a similar male-to-female ratio was observed in both groups (1.78:1 and 1.57:1, p>0.05). While a majority of ABOc transplants were from cadaveric donors (66.4% vs. 33.6% living donors, p<0.01), a significantly greater number of ABOi transplants were from living donors (65.3% vs. 34.7% cadaveric, p<0.01). The mean waitlist time to transplant was significantly shorter for ABOi transplants than for ABOc transplants (585.8 vs. 739.3 days, p<0.01). Graft survival time was similar between ABOi and ABOc transplants (770.0 vs. 821.5 days, p=0.197).

Conclusions: Advances in kidney transplantation have significantly improved the prognosis of patients with ESRD. In comparison to ABOc transplantation, ABOi renal transplantation significantly shortens waitlist times while maintaining similar graft survival times. Where adequate immunosuppression is available, ABOi renal transplants should be considered when ABOc transplants are not available. Further studies comparing the safety and efficacy of ABOi transplants, including long-term follow-up and required immunosuppression, are needed.

48.12 Pancreas Retransplantation Is Risky For Patients With A History Of Transplant Pancreatectomy

M. Barrett1, Y. Lu2, D. M. Cibrik2, R. S. Sung1, K. J. Woodside1  1University Of Michigan, General Surgery, Ann Arbor, MI, USA; 2University Of Michigan, Internal Medicine, Ann Arbor, MI, USA

Introduction: Despite improvements in pancreas transplant outcomes, a small but significant subset of patients experiences catastrophic graft failure, often due to allograft thrombosis, necessitating transplant pancreatectomy. It is unclear how this subset fares when retransplanted. We sought to review our institution’s experience with second pancreas transplants after previous transplant pancreatectomy.

Methods: Patient encounters in which transplant pancreatectomies were performed were identified using associated billing codes. Chart review of these encounters through both the Organ Transplant Information System and the hospital EMR was used to collect demographic and outcomes data. Further review of discharge paperwork, clinic notes, and outside records was performed for patients who underwent a second pancreas transplant to analyze allograft function.

Results: Between January 1990 and July 2016, 402 pancreas transplants were performed: 293 simultaneous kidney-pancreas (KP) transplants, 99 pancreas-after-kidney (PAK) transplants, and 10 pancreas transplants alone (PTA). Amongst this cohort, 87 pancreatectomies were performed in 78 patients. Of these, 15 (19%) patients underwent a second pancreas transplant after transplant pancreatectomy. The study population consisted of 5 women and 10 men. Median age at initial pancreas transplant was 37 years (range 27-57 years); 8 patients initially underwent PAK transplant, 6 underwent simultaneous KP transplant, and 1 underwent PTA. The indication for initial pancreatectomy was thrombosis in 12 patients, all of whom had their grafts removed within one month of transplant (median 1 day, range 0-31 days). The other 3 patients developed an intraabdominal infection requiring pancreas allograft explantation (median 26 months, range 6-108 months).

Median time from pancreatectomy to second transplant was 18 months (range 7-94 months). For the second pancreas transplant, one patient underwent KP transplant, while all others underwent pancreas-only transplant. Median time from second transplant to last documented follow-up was 10 years (36 days to 19 years). Four pancreas allografts are still functioning, 3 of which have been functional for 10 years. Of the remaining patients, 7 required transplant pancreatectomy for allograft thrombosis in the immediate postoperative period (4 at 1 day, the rest within a week). Four others failed after the perioperative period (at 1, 2, 2, and 6 years, respectively). The 8 patients with graft function for a year or more were younger at second transplant (median 38 vs. 45 years old), although this difference was not statistically significant.

Conclusion: Pancreas retransplantation after previous transplant pancreatectomy is feasible, although it is associated with a high initial failure rate, suggesting that these patients require additional consideration before retransplantation, over and above that given to patients with intact failed pancreas allografts.

48.11 Surgical Techniques of Concomitant Coronary Artery Bypass Grafting and Lung Transplantation

M. Hamms1, M. A. Kashem2, B. O’Murchu4, R. Bashir4, J. Gomez-Abraham2, S. Keshavamurthy2, E. Leotta2, T. Yoshizumi2, K. Shenoy3, A. J. Mamary3, G. Criner3, F. Cordova3, Y. Toyoda2,3  1Temple University Hospital, Philadelphia, PA, USA; 2Temple University, Cardiovascular Surgery, Philadelphia, PA, USA; 3Temple University, Division Of Thoracic And Pulmonary Medicine, Philadelphia, PA, USA; 4Temple University, Section Of Cardiology, Philadelphia, PA, USA

Introduction: Significant coronary artery disease is a relative contraindication to lung transplantation. However, recent single-center studies suggest concomitant coronary artery bypass grafting (CABG) can be performed at the time of lung transplantation. The purpose of this study was to report our outcomes with these concomitant procedures and to describe our surgical techniques.

Methods: A retrospective review of 240 consecutive lung transplants performed from March 2012 to August 2016 was conducted. Lung transplantation with CABG (n=17) and without CABG (n=223) was compared, with statistical analysis performed using SAS software.

Results:

Recipient age was significantly higher in lung transplant with CABG, 66 ± 5 years (range 52-74), vs. 62 ± 10 years (range 21-78) in lung transplant without CABG (p=0.009), whereas the lung allocation score (60 ± 21 vs. 53 ± 21) and donor age (35 ± 10 vs. 33 ± 11 years) were similar.

All CABGs (1-3 bypass grafts) were performed on a beating heart without cardioplegic cardiac arrest: off pump (n=7), on cardiopulmonary bypass (n=7), or on veno-arterial extracorporeal membrane oxygenation (n=3). On pump vs. off pump was determined based on the need to safely perform the lung transplant portion of the procedure.

Surgical approaches were determined based on the exposure needed for the lungs and coronary arteries, consisting of median sternotomy (n=7), anterior thoracotomy (n=7), and clamshell incision (n=3).

When the left anterior descending coronary artery required revascularization, the left internal mammary artery (LIMA) was used in 92% of cases (11 out of 12 patients). The LIMA was harvested through median sternotomy (n=6) or left anterior thoracotomy (n=5).

When saphenous vein grafts were used (n=15), the inflow was the ascending aorta (n=12), the descending aorta (n=2), or the LIMA (n=1).

The median hospital stay was similar for lung transplant with CABG (18 days) vs. without CABG (18 days).

Two patients died after concomitant lung transplantation and CABG, on postoperative days 414 and 642, both due to infection, resulting in 100% 1-year and 80% 3-year survival rates, whereas lung transplantation without CABG had 85% and 76%, respectively (p=0.397).

Conclusion: Excellent outcomes can be achieved in lung transplantation with concomitant CABG through carefully conducted surgical strategies, including off pump vs. on pump support, a variety of surgical approaches, and choice of conduits.
 

48.10 Hospital Readmissions Following Discharge After Orthotopic Liver Transplantation (OLT)

E. J. Minja1, T. L. Pruett1  1University Of Minnesota, Division Of Solid Organ Transplantation, Minneapolis, MN, USA

Introduction:

Preventing early hospital readmissions is key to reducing medical care costs. Our objective was to determine the incidence and causes of readmission within 30 days of discharge following orthotopic liver transplantation (OLT).

Methods:

A total of 1028 patients underwent OLT between 1/1/1997 and 12/31/2014 at our institution. Electronic medical records were reviewed after IRB approval, and causes of readmission were analyzed. Patients ≤18 years of age were excluded from analysis. Student’s t-test was used to compare groups, with p<0.05 considered significant.

Results:

Between 1/1/1997 and 12/31/2014, 1028 OLTs were performed, of which 155 (15.1%) were from living donors (LD) and 873 (84.9%) from deceased donors (DD). A total of 931 patients (90.7%) underwent liver-alone transplants and 96 patients (9.3%) had simultaneous liver and kidney transplants [Table 1].

A total of 473 patients (46%) were readmitted within 30 days of discharge. Complete data for analysis were available for 225 patients who received OLTs between 9/2004 and 12/2014. Of this pool of readmitted patients, 188 (83.6%) had received DD OLTs vs. 37 (16.4%) who had received LD OLTs.

The most common cause of hospital readmission following OLT was biliary complications. Of the LD OLT recipients readmitted within 30 days of discharge, 43% had biliary complications, compared to 13% of DD recipients (p<0.05). The most common biliary complications amongst LD OLT recipients were bile leaks (50%) and bilomas (31%) [Figures 1-3].

Conclusion:
Overall, 46% (473/1028) of our patients were readmitted within 30 days of discharge after OLT, representing a significant healthcare burden. Our data suggest the need to develop predictive models for readmission following OLT, and perhaps the need to change our surgical approaches, with the goal of reducing preventable hospital readmissions.

48.09 Outcome of Different Induction Therapies in Living Donor Renal Transplant in Indian Population

M. K. Lowther3, M. Khan3, S. Bansal3, V. Kher4, H. Raja5, F. Nwariaku2, J. Parekh2, B. Tanriover1, N. Rajora1  1University Of Texas Southwestern Medical Center, Internal Medicine/Nephrology, Dallas, TX, USA; 2University Of Texas Southwestern Medical Center, Surgery, Dallas, TX, USA; 3University Of Texas Southwestern Medical Center, Dallas, TX, USA; 4Medanta, Transplant Nephrology, Gurgaon, HARYANA, India; 5Baylor University Medical Center, Internal Medicine, Dallas, TX, USA

Introduction: Induction therapy with an interleukin-2 receptor antagonist (IL2-RA) is recommended as a first-line agent in living donor renal transplantation (LRT). However, the comparative outcomes of induction therapies remain controversial in the Indian LRT population.

Methods: A single-center (Medanta Medicity, Gurgaon, India) dataset was retrospectively studied for patients receiving LRT from 2010 to 2014 (N=901) to compare the effectiveness of IL2-RA with other induction options (no induction and rabbit anti-thymocyte globulin [R-ATG]). IL2-RA and no induction were chosen for immunologically low-risk patients; R-ATG was primarily given to recipients with PRA >20% and HLA mismatch >5 of 6 antigens. Patient charts were reviewed for follow-up dates with corresponding creatinine levels (at 3 months, 6 months, 1 year, and last follow-up), date and type of rejection if applicable, graft loss, and death. Analysis was based on each patient’s most recent follow-up. The main outcomes were the risk of acute rejection at one year and overall allograft failure (graft failure or death) post-transplantation through the end of follow-up.

Results: A total of 901 patients were followed: 316 received no induction, 550 received IL2-RA, and 35 received R-ATG. Rejection rates were 26.4%, 22.6%, and 8.2%, respectively (P=0.92). Graft failure rates were 3.3%, 1%, and 0%, respectively (P=0.11). The mean age of recipients was 38.7 years. Kaplan-Meier curves for overall graft survival were similar among induction categories. The rejection rate was higher in the no-induction and IL2-RA groups (~25%) than with R-ATG induction. On univariate Cox analysis, compared to no-induction therapy, overall allograft failure was similar among induction categories.
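
A minimal sketch of these survival analyses (Kaplan-Meier curves plus a univariate Cox model), using the third-party lifelines package, is shown below; the DataFrame, its column names, and all values are hypothetical placeholders, not the study data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy data: follow-up in months, overall allograft failure (graft failure or
# death), and a single induction indicator (1 = R-ATG, 0 = IL2-RA/no induction).
df = pd.DataFrame({
    "months": [12, 30, 45, 6, 60, 24, 18, 40],
    "failed": [0, 1, 1, 0, 0, 1, 1, 0],
    "ratg":   [0, 1, 0, 1, 0, 0, 1, 0],
})

# Kaplan-Meier estimate of overall graft survival.
kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["failed"])
print(kmf.survival_function_)

# Univariate Cox model for allograft failure by induction category.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
cph.print_summary()  # hazard ratio for R-ATG vs. other induction
```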

Conclusion: Compared to no-induction therapy, IL2-RA induction was not associated with better outcomes in Indian LRT recipients. R-ATG appears to be an acceptable, and possibly preferred, alternative to IL2-RA in high-rejection-risk Indian patients, as it offers lower rejection rates and better long-term graft survival.

 

48.08 Pathology and Outcomes of Incidental Hepatocellular Carcinoma following Liver Transplantation

C. Ekeke4, C. Hughes1, A. Humar1, A. Tsung3, S. Ganesh1,2, V. Rachakonda1,2, A. D. Tevar1  1Thomas E. Starzl Transplantation Institute, Dept. Of Surgery, University Of Pittsburgh, Pittsburgh, PA, USA; 2Division Of Gastroenterology, Hepatology & Nutrition, Dept. Of Medicine, University Of Pittsburgh, Pittsburgh, PA, USA; 3UPMC Liver Cancer Center, Dept. Of Surgery, University Of Pittsburgh, Pittsburgh, PA, USA; 4Department Of Surgery, University Of Pittsburgh Medical Center, Pittsburgh, PA, USA

Introduction: Liver transplant (LT) remains the most effective treatment modality for management of hepatocellular carcinoma (HCC) in the end-stage liver disease population. The long-term outcomes of preoperatively known HCC treated with LT have been well characterized. Less is known about the tumor pathology and outcomes of HCC discovered incidentally on hepatic explant pathology review. The aim of this study was to determine the incidence, patient and pathologic characteristics, and outcomes of incidental hepatocellular carcinoma discovered following LT in a large-volume center experience.

Methods: This study retrospectively reviewed patients undergoing liver transplant at the University of Pittsburgh Medical Center from 2002 to 2013. Patient demographics, preoperative radiographic findings, tumor markers, tumor pathologic characteristics, and short- and long-term outcomes were reviewed.

Results: During the study period, 320 patients underwent LT in which HCC was known preoperatively or found on explant. The average follow-up was 2035.6 days. Incidental HCC was detected in 52 of 1886 (2.8%) patients who underwent LT during that time period. The most common indication for liver transplantation was hepatitis C. Patients with incidental HCC and known HCC were similar in age (57.21 vs. 58.09 yrs), sex (78.8% vs. 80.2% male), and lab MELD at transplant (17.27 vs. 15.01). Average peak and pre-transplant alpha-fetoprotein (AFP) levels were 33.5 and 30.48 in the incidental HCC cohort versus 849.2 and 337.45 in the known HCC group. Incidental HCC showed more moderately differentiated tumor pathology (71.2% vs. 58.2%, p<0.05) and similar proportions of well (23.1% vs. 21.3%) and poorly (5.8% vs. 6.3%) differentiated lesions. Absence of vascular invasion was similar between the two groups (73.1% vs. 66.4% in incidental and known HCC, respectively). HCC recurrence was 9.6% in incidental HCC and 12.7% in known HCC.

Conclusion: We present a large-volume experience with incidental HCC found after LT. Patient demographics, recurrence, and survival outcomes were similar between incidental and known HCC LT recipients. Tumor size was comparable, with evidence of more moderately differentiated tumors in the incidental HCC group.

 

48.07 Utility of Pre-Liver Transplant Screening Colonoscopy

R. C. Graham1, O. Afolabi1, J. A. Fridell1, C. A. Kubal1, B. Ekser1, R. S. Mangus1  1Indiana University School Of Medicine, Transplant Division, Department Of Surgery, Indianapolis, IN, USA

Introduction:

Solid organ transplant patients are at increased risk for de novo malignancies. Based upon the existing literature, it is unclear whether these patients have an increased risk of colorectal cancer (CRC) compared to the general population. This study reviews the required pre-transplant colonoscopy reports for a large number of liver transplant patients and then assesses the risk of CRC and other cancers post-transplant.

Methods:

The records of all adult patients undergoing liver transplant (LT) at a single center over a 15-year period were reviewed. The protocol for CRC screening at our center requires a colonoscopy within 3 years of listing for LT. There is no specific post-transplant screening protocol other than the standard of care for the community. Findings of advanced adenomas (polyps with villous histology, serrated histology, or dysplasia) and colon carcinoma are reported as events. Colonoscopy and pathology reports were reviewed for all patients included in the analysis.

Results:

There were 1685 liver transplants performed during the study period, with 1431 (85%) having a pre-transplant colonoscopy report available for review. The median time from colonoscopy to transplant was 9 months. Median follow-up was 69 months (minimum 12 months) post-transplant. Of those with available colonoscopy reports, 608 patients (42%) had a polyp identified, of which 493 (81%) were biopsied with an available pathology report. Of the biopsied polyps, 3 were cancerous (0.2% of all patients, 0.5% of patients with a polyp) and 38 were pre-cancerous (2% of all patients, 6% of patients with a polyp). Among all patients, 9 individuals developed post-LT CRC (0.5%). Of these 9 CRC patients, 3 had an abnormal colonoscopy: one with hyperplastic polyps, one with tubular adenomas, and one with a combination of hyperplastic polyps, tubular adenomas, and tubulovillous adenoma; only the last of these three was considered pre-cancerous. Of the 38 patients with pre-cancerous polyps, one developed CRC in the follow-up period (3%). There were 9 of the 38 (24%) patients with pre-cancerous polyps who developed other cancers post-transplant, including skin (5), breast (1), lung (1), bladder (1), and sarcoma (1). This compares to 16% of all patients developing any non-HCC post-transplant cancer.

Conclusion:

These results suggest that screening colonoscopy prior to transplant is effective for excluding patients at high risk of developing CRC post-transplant. Additionally, patients with pre-cancerous colon lesions appear to be at increased risk of developing other cancers post-transplant, but not CRC.

48.06 Improved Utilization and Sharing of Liver Allografts Using a Dedicated Independent Donor Surgeon

S. Gotewal1, C. Hwang1, J. Reese1, M. MacConmara1  1University Of Texas Southwestern Medical Center, Transplant, Dallas, TX, USA

Introduction: In 2014, a novel approach to organ procurement was initiated by the organ procurement organization (OPO) in North Texas with the hiring of a full-time donor surgeon. The intent was to increase utilization and enhance distribution of organs. The aim of our study was to investigate the impact of the independent OPO surgeon on discard rates and patterns of organ use.

Methods: A retrospective review of the OPO donor database identified all procurement cases from the North Texas DSA between January 1, 2013 and September 30, 2015. Basic donor demographic data, donor serologies, and intraoperative variables were collected. Marginal donor status was determined by identifying age >65 years, macrovesicular fat >30%, cold ischemia time >8 hrs, HBV positivity, HCV positivity, AST >500, sodium >170, liver segment use, or donation after cardiac death (DCD). In addition, we calculated the cumulative number of marginal characteristics associated with each donor (the marginal liver score), as sketched below. The presence of the OPO surgeon as assistant or primary surgeon was identified. Organ disposition and sharing codes were obtained to evaluate patterns of utilization.
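
The sketch below illustrates one way to compute such a score in Python; the field names and example donor are hypothetical, but the thresholds follow the definitions above.

```python
# One point per marginal characteristic, per the Methods above.
def marginal_liver_score(donor):
    criteria = [
        donor["age"] > 65,                      # years
        donor["macrovesicular_fat_pct"] > 30,   # percent
        donor["cold_ischemia_hr"] > 8,          # hours
        donor["hbv_positive"],
        donor["hcv_positive"],
        donor["ast"] > 500,
        donor["sodium"] > 170,
        donor["segment_used"],
        donor["dcd"],                           # donation after cardiac death
    ]
    return sum(criteria)

# Hypothetical donor with four marginal characteristics.
donor = {"age": 70, "macrovesicular_fat_pct": 35, "cold_ischemia_hr": 6,
         "hbv_positive": False, "hcv_positive": True, "ast": 210,
         "sodium": 172, "segment_used": False, "dcd": False}
print(marginal_liver_score(donor))  # 4
```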

Results: There were 711 liver procurements performed during the study period, with a discard rate of 11.7%. There was no difference in the discard rate before vs. after the OPO surgeon’s arrival (12.2% vs. 11.3%). The OPO surgeon was present for 208 donor surgeries (29.3%); however, there was a higher rate of discard when the OPO surgeon was present (13.5% vs. 10.2%, p<0.001), and this was not explained by differences in age, macrovesicular fat content, or cold ischemia time. The OPO surgeon procured livers from more DCD donors, although these represented only a small fraction of the total donor surgeries.

Marginal donors were procured by OPO and non-OPO surgeons at equal frequency; however, the cases at which the OPO surgeon was present had much greater complexity (as determined by the marginal score), and the rate of discard was significantly lower when the OPO surgeon was present at these cases (22% vs. 47%, P<0.01). The OPO surgeon was also associated with a higher number of regionally and nationally shared organs (54% vs. 26%).

Conclusion: The addition of a dedicated full-time OPO surgeon has changed the pattern of utilization of donor livers in North Texas. It has decreased the discard rate of livers from donors with multiple marginal characteristics, which has led to 18 additional livers being transplanted per year since the initiation of the OPO surgeon role.

 

48.05 Gait Speed And Hand Grip Strength Are Independent Predictors Of Liver Transplant Waiting List Dropout

S. Kulkarni2, H. Chen4, D. A. Josbeno5, A. Schmotzer1,6, C. Hughes1, A. Humar1, V. Rachakonda1,3, M. A. Dunn1,3, A. D. Tevar1  1Thomas E. Starzl Transplantation Institute, Dept. Of Surgery, University Of Pittsburgh, Pittsburgh, PA, USA; 2Department Of Surgery, University Of Pittsburgh Medical Center, Pittsburgh, PA, USA; 3University Of Pittsburgh, Division Of Gastroenterology, Hepatology & Nutrition, Dept. Of Medicine, Pittsburgh, PA, USA; 4Department Of Medicine, University Of Pittsburgh Medical Center, Pittsburgh, PA, USA; 5Department Of Physical Therapy, University Of Pittsburgh Medical Center, Pittsburgh, PA, USA; 6Division Of Gastroenterology, Hepatology & Nutrition, University Of Pittsburgh Medical Center, Pittsburgh, PA, USA

Introduction: Frailty scores have been shown to effectively predict perioperative surgical risk. In this light, gait speed has been validated as a reproducible metric of patients’ functional status and facility with activities of daily living; studies have also validated its role in predicting morbidity and long-term survival. The 5-Meter Walk Test (5MWT), which measures a patient’s self-selected gait speed, is a pragmatic and reproducible clinical test that can be easily conducted in an outpatient setting. Grip strength is another practical outpatient test, measured with a hand dynamometer as dominant-hand isometric grip force in pounds. We propose that the 5MWT and grip strength measurement can accurately capture frailty in liver transplant-listed patients and, more specifically, can predict liver transplant waitlist dropout.

Methods: A retrospective analysis was performed of patients undergoing outpatient liver transplant evaluation and successful listing at UPMC between 7/2013 and 7/2016. Each patient had an averaged 5MWT score calculated after performing the test three times with a one-minute rest between walks. In addition, each patient had dominant-arm grip strength measured and recorded with a hydraulic hand dynamometer. Patients with waitlist dropout due to progression of HCC were excluded from analysis. Patient demographics, transplant evaluation data, and outcomes on the waitlist were examined. Statistical analysis was performed using Student’s t-test and chi-square analysis.
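
For illustration, the averaged 5MWT gait speed reduces to simple arithmetic; the sketch below assumes that speeds (distance/time) are averaged across the three walks, which is an implementation assumption on our part, and the trial times are hypothetical.

```python
# Gait speed over a 5-meter course, averaged across three timed walks.
def gait_speed_5mwt(trial_times_sec, distance_m=5.0):
    speeds = [distance_m / t for t in trial_times_sec]  # m/s per walk
    return sum(speeds) / len(speeds)

# Hypothetical trial times near the dropout threshold reported in the Results
# (~0.92 m/s in dropouts vs. ~1.03 m/s in non-dropouts).
print(f"{gait_speed_5mwt([5.2, 5.0, 5.4]):.2f} m/s")  # 0.96 m/s
```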

Results: A total of 197 liver transplant-listed patients evaluated as outpatients were reviewed. The patients had an average age of 57.1 years (range 20 to 74) and were predominantly white (90.4%). The most common etiology of liver disease was HCV; 64 patients (32.5%) had a diagnosis of HCC, 14 (7.1%) had a previous liver transplant, and the average MELD score upon listing was 16.0. Of the 197 patients, 38 (19.3%) were ultimately dropped from the waitlist for non-HCC-related reasons. Grip strength was predictive of waitlist dropout (46.14 lbs vs. 59.6 lbs; p<0.005). In addition, 5MWT gait speed was an independent predictor of waitlist dropout (0.92 m/s vs. 1.03 m/s; p<0.005).

Conclusion: The 5MWT and grip strength have been shown to accurately measure frailty, and we show that both independently predict waitlist dropout among liver transplant-listed patients. The 5MWT and grip strength should be considered valuable tools in the evaluation and maintenance of end-stage liver disease patients listed for transplant.