31.08 Do Prolonged Operative Times Obviate the Potential Benefits Associated with Laparoscopic Colectomy?

P. J. Sweigert1, E. Eguia1, A. N. Kothari1, K. A. Ban1, M. H. Nelson1, M. S. Baker1, M. A. Singer1  1Loyola University Medical Center, Department Of Surgery, Maywood, IL, USA

Introduction:
Prior studies have demonstrated that minimally invasive (MIS) approaches to colectomy are oncologically equivalent to open approaches and are associated with shorter lengths of stay and reduced morbidity. There is also increasing evidence that prolonged operative time, especially beyond 3 hours, is associated with increased rates of postoperative morbidity. Few studies have examined the impact of operative time (OT) on the potential benefits afforded by MIS colectomy. We sought to determine whether the benefits associated with MIS colectomy are maintained in cases where OT is significantly prolonged.

Methods:
The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) procedure targeted colectomy database was queried to identify adult patients who underwent elective left (LC) and right (RC) colectomy with anastomosis between 2011 and 2016. Emergent or converted cases, patients with preoperative infections, and observations with missing OT data were excluded. Forward stepwise multivariable logistic regression adjusting for demographic and clinical risk factors was used to compare outcomes for prolonged (4th quartile) MIS cases to average (2nd-3rd quartile) open cases with 30-day mortality or serious morbidity as the primary outcome of interest. Secondary 30-day outcomes included any morbidity, mortality, anastomotic leak, surgical site infection (SSI), and prolonged length of stay (LOS). 
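
A minimal sketch of the comparison described above (prolonged 4th-quartile MIS cases versus average 2nd-3rd-quartile open cases) is shown below. This is not the authors' code: the dataframe `df` and its columns (op_time, approach, death_or_serious_morbidity, age, asa_class) are hypothetical, operative-time quartiles are computed within each approach because the abstract does not specify the reference distribution, and the forward stepwise covariate selection is omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df: hypothetical NSQIP-style extract, one row per elective colectomy, with
# op_time (minutes), approach ("MIS" or "Open"), death_or_serious_morbidity (0/1),
# and covariates such as age and asa_class.

def build_cohort(df):
    """Keep prolonged (Q4) MIS cases and average (Q2-Q3) open cases."""
    out = df.copy()
    # Operative-time quartile, computed within each approach (an assumption).
    out["ot_q"] = out.groupby("approach")["op_time"].transform(
        lambda s: pd.qcut(s, 4, labels=False) + 1
    )
    prolonged_mis = (out["approach"] == "MIS") & (out["ot_q"] == 4)
    average_open = (out["approach"] == "Open") & (out["ot_q"].isin([2, 3]))
    cohort = out[prolonged_mis | average_open].copy()
    cohort["prolonged_mis"] = prolonged_mis.loc[cohort.index].astype(int)
    return cohort

cohort = build_cohort(df)

# Risk-adjusted comparison; an odds ratio below 1 favors prolonged MIS.
model = smf.logit(
    "death_or_serious_morbidity ~ prolonged_mis + age + C(asa_class)",
    data=cohort,
).fit()
print(np.exp(model.params["prolonged_mis"]))            # adjusted odds ratio
print(np.exp(model.conf_int().loc["prolonged_mis"]))    # 95% confidence interval
```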

Results:
18,274 patients underwent RC and 54,550 underwent LC during the study period. RC was most commonly performed for colon cancer (48.6%). Median OT for open RC was 132 min (IQR 92-189); that for MIS RC was 135 min (IQR 103-175), p=0.010. LC was also most commonly performed for colon cancer (44.9%). Median OT for open LC was 171 min (IQR 119-242); that for MIS LC was 173 min (IQR 129-231), p=0.001. No difference was seen in the adjusted primary outcome when prolonged MIS cases were compared to average open RC (OR 0.818, 95% CI: [0.660, 1.014]). Prolonged MIS cases did, however, show a significant benefit over average-length open procedures for adjusted mortality or serious morbidity in LC (OR 0.824, 95% CI: [0.729, 0.931]). A prolonged MIS approach was also associated with significantly lower morbidity, SSI, and prolonged LOS for both RC and LC relative to average open colectomy (Table 1).

Conclusion:
MIS approaches to colectomy are associated with lower rates of postoperative complications, serious morbidity, and 30-day mortality relative to open approaches. These benefits are maintained even when OT extends beyond 3 hours. Surgeons performing elective MIS colectomy are therefore justified in persisting with a minimally invasive approach even when operative time becomes prolonged.
 

31.07 Postoperative Length of Stay Following Colorectal Surgery Impacts Readmissions

X. L. Baldwin1, P. D. Strassle1, S. Lumpkin1, K. Stitzenberg1  1University Of North Carolina At Chapel Hill, Chapel Hill, NC, USA

Introduction:
Enhanced recovery pathways have led to shorter lengths of stay (LOS) after colorectal surgery. There continues to be controversy about the relationship between LOS and readmission following colorectal surgery. The purpose of this study was to evaluate the association between LOS and readmissions in a nationally representative sample. We hypothesized that shorter LOS would increase readmission rate. 

Methods:
Hospitalizations of adult patients aged 18-85 years who underwent colon and/or rectal resection between January 2010 and August 2015 in the National Readmission Database were eligible for inclusion. Patients who were treated in December, who were not residents of the state in which they underwent surgery, who died, or who had a LOS <1 day were excluded. Multivariable logistic regression was used to assess the effect of LOS on 30-day readmission, adjusting for patient demographics, comorbidities, hospital characteristics, and inpatient complications.
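
The adjusted LOS-category analysis described above can be sketched as follows. This is not the authors' code; the dataframe `df` and its column names (los, readmit_30d, age, female, elixhauser, teaching_hospital) are hypothetical stand-ins for the NRD variables, and the LOS bins mirror those reported in the results (1-4 days as the reference).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df: hypothetical NRD-style extract, one row per eligible hospitalization.
# Bin LOS into the categories used in the abstract; 1-4 days is the reference.
df["los_cat"] = pd.cut(df["los"], bins=[0, 4, 8, 12, np.inf],
                       labels=["1-4", "5-8", "9-12", ">=13"])

model = smf.logit(
    "readmit_30d ~ C(los_cat, Treatment(reference='1-4'))"
    " + age + female + elixhauser + teaching_hospital",
    data=df,
).fit()

# Odds ratios and 95% CIs for each LOS category relative to 1-4 days.
ors = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors)
```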

Results:
We assessed 376,376 hospitalizations. Median LOS was 5 days (IQR 4-8), and 14% of patients (n=51,087) were readmitted within 30 days. As LOS increased, the incidence of readmission also increased (from 7% of patients with a 1-day LOS to 29% of patients with a LOS ≥20 days, p<0.0001; Figure 1). After adjustment for patient demographics, comorbidities, inpatient complications, and hospital characteristics, a 5-8 day LOS was associated with a roughly 50% increase in the odds of 30-day readmission (OR 1.53, 95% CI 1.05, 1.14, p<0.0001), a 9-12 day LOS was associated with a 100% increase in odds (OR 2.06, 95% CI 1.99, 2.13, p<0.0001), and a LOS ≥13 days was associated with an almost 150% increase in odds (OR 2.45, 95% CI 2.36, 2.55, p<0.0001), compared to a 1-4 day LOS.

Conclusion:
Contrary to our hypothesis, we found that increased length of stay was associated with increased readmission rates following colorectal surgery. Shorter LOS was associated with decreased odds of 30-day readmission, regardless of a patient’s pre-existing comorbidities and inpatient complications. Future studies should examine factors associated with prolonged hospitalization and identify possible interventions to decrease readmission in these patients.
 

31.06 Effect of Elective Sigmoidectomy for Diverticulitis on Bowel Function Patient-Reported Outcomes

J. L. Goldwag1,2, R. V. Lyn3, L. R. Wilson1,2, M. Z. Wilson1,2, S. J. Ivatury1,2  1Dartmouth Hitchcock Medical Center, The Department Of Surgery, Lebanon, NH, USA 2Dartmouth Medical School, Lebanon, NH, USA 3Dartmouth College, Hanover, NH, USA

Introduction:
Diverticular disease is common worldwide. A subset of these patients will choose to undergo elective surgical resection due to symptoms or complicated disease. The effect of elective sigmoid colon resection for diverticular disease on bowel function is unclear. The aim of this study is to evaluate changes in bowel function following elective sigmoid resection for diverticular disease.

Methods:
This is a prospective, observational study. We included all patients seen at our institution from May 2015 to July 2018 who underwent elective sigmoid resection for diverticular disease. We used the Colorectal Functional Outcome Questionnaire (COREFO), a validated instrument that assesses bowel function across five domains and yields a global function score. Scores range from 0 to 100, with higher scores indicating worse function. We obtained questionnaire data at baseline as well as at postoperative follow-up. Patients were excluded if they remained in bowel discontinuity or did not complete both questionnaires. A paired t-test was used to compare baseline and post-intervention scores.
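
The paired comparison of baseline and postoperative COREFO scores can be reproduced with a short sketch. The score values below are invented for illustration only; in the study each row would be one of the 49 included patients.

```python
import pandas as pd
from scipy import stats

# Hypothetical paired COREFO totals (0-100; higher = worse bowel function).
scores = pd.DataFrame({
    "corefo_baseline": [22.0, 15.5, 30.1, 8.4, 18.9],
    "corefo_postop":   [20.3, 18.0, 27.5, 9.1, 17.2],
})

t_stat, p_value = stats.ttest_rel(scores["corefo_baseline"], scores["corefo_postop"])
mean_change = (scores["corefo_postop"] - scores["corefo_baseline"]).mean()
print(f"mean change = {mean_change:.1f}, paired t = {t_stat:.2f}, p = {p_value:.3f}")
```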

Results:
49 patients met criteria for inclusion in this study. The median time between questionnaire completion was 70 days (IQR: 56 to 85). The mean age was 60 ± 12 years, and 57% of patients were female. 36 (73%) patients underwent sigmoidectomy alone and 13 (27%) underwent sigmoidectomy with fistula repair. Six patients (12%) had a diverting loop ileostomy in addition to sigmoidectomy and underwent a subsequent reversal. Overall, there was no difference in Total COREFO score from baseline to post-intervention. There were also no differences in any of the five COREFO domains (Figure 1).

Conclusion:
Bowel function does not change in the postoperative period following elective sigmoid resection for diverticular disease. Surgeons should counsel patients, especially symptomatic ones, that bowel function will be no different at the time of postoperative follow-up.
 

31.04 Transversus Abdominis Plane Block vs Intrathecal Analgesia in Colorectal Surgery: A Randomized Trial

D. Colibaseanu1, O. Osagiede2, A. Merchea1, C. Thomas6, E. Bojaxhi3, J. Panchamia5, A. Jacob5, S. Kelley4, K. Mathis4, A. Lightner4, J. Naessens6, D. W. Larson4  1Mayo Clinic – Florida, Section Of Colon And Rectal Surgery, Jacksonville, FL, USA 2Mayo Clinic – Florida, Health Sciences Research, Jacksonville, FL, USA 3Mayo Clinic – Florida, Department Of Anesthesiology, Jacksonville, FL, USA 4Mayo Clinic, Division Of Colon And Rectal Surgery, Rochester, MN, USA 5Mayo Clinic, Department Of Anesthesiology, Rochester, MN, USA 6Mayo Clinic, Health Sciences Research, Rochester, MN, USA

Introduction: Transversus abdominis plane (TAP) block is an effective alternative to neuraxial analgesia in abdominal surgery; however, limited evidence supports its use over traditional analgesic modalities in colorectal surgery. We compared the analgesic efficacy of liposomal bupivacaine TAP block and intrathecal (IT) opioids in a prospective randomized trial. The primary outcomes were the mean pain score and morphine milligram equivalents (MME) used within the first 48 hours post-surgery. Secondary outcomes included length of stay, standardized costs, postoperative ileus, and intravenous patient-controlled analgesia use. 

Methods: Patients were recruited from two campuses of a single institution. Two hundred and nine patients undergoing elective small bowel or colorectal resections were enrolled. They were randomized to receive either bilateral TAP block or single-injection IT analgesia with hydromorphone. Patients were assessed at 4, 8, 16, 24, and 48 hours post-surgery.

Results: Two hundred patients completed the trial (TAP n=102, IT n=98). The TAP group had a mean pain score 1.7 points higher than the IT group at 4 hours post-surgery, a difference that persisted up to 16 hours post-surgery. There was evidence of higher MME use within the first 24 hours post-surgery in the TAP group compared to the IT group (median difference: 10.0 MME, 95% CI 3.0 – 20.5 MME). No difference in MME was observed between the two groups at 24 and 48 hours, or in secondary outcomes.

Conclusions: Intrathecal opioids provided better immediate postoperative pain control compared to liposomal bupivacaine TAP block, lasting up to 16 hours post-surgery. Both modalities provided adequate pain control in patients enrolled in this study, and should be considered as part of a multimodal postoperative analgesic plan for patients undergoing elective colorectal surgery.

 

 

31.05 Opioid Tolerance Impacts Major Abdominal Surgery Outcomes in Patients on an Enhanced Recovery Pathway

M. H. Zaman1, O. P. Owodunni2, M. Ighani2, M. Grant3, D. Bettick4, S. Sateri3, T. Magnuson2, S. Gearhart2  1The Johns Hopkins University School Of Medicine, Urology, Baltimore, MD, USA 2The Johns Hopkins University School Of Medicine, Surgery, Baltimore, MD, USA 3The Johns Hopkins University School Of Medicine, Anesthesia, Baltimore, MD, USA 4The Johns Hopkins Bayview Medical Center, Quality, Baltimore, MD, USA

Introduction: Chronic opioid exposure can lead to a state of tolerance in which increasing doses of opioids are necessary to reduce pain; this can make postoperative management difficult. An Enhanced Recovery Pathway (ERP) is an evidence-based intervention that focuses on optimizing recovery and postoperative outcomes. The effectiveness of an ERP depends on the degree of compliance with the pathway. We sought to determine the effect of opioid tolerance on pathway compliance and postoperative outcomes in patients undergoing major abdominal surgery on an ERP.

Methods: From January 2013 to June 2017, patients undergoing major abdominal surgery prior to and following ERP implementation were included. Patients <18 years of age and those having emergency surgery were excluded. Compliance was measured against 14 perioperative pathway variables, and high compliance was defined as achieving ≥75%. Opioid tolerance was defined as taking a prescribed opioid medication equivalent to 60 mg of morphine per day for one week prior to surgery. CR-POSSUM scores were used for risk-adjusted analyses. Outcomes of interest included length of stay (LOS), major complications (Clavien-Dindo [CD] ≥2), and 30-day readmission rates.

Results: 1,251 patients (605 pre-ERP and 646 ERP patients) were included. A total of 221 patients were opioid tolerant. Opioid-tolerant patients were more likely to be younger (56 vs. 59 years, p=0.002), have disseminated cancer (11% vs. 5%, p=0.003), and have an open procedure (69% vs. 60%, p=0.01) than non-tolerant patients. When comparing opioid-tolerant patients prior to (107 patients) and following ERP implementation (114 patients), there was no difference in demographic or clinical characteristics; however, more opioid-tolerant patients following ERP implementation had a laparoscopic procedure (42% vs. 19%, p<0.001). In a multivariable analysis, opioid tolerance was associated with an increase in major complications (OR 1.24, p=0.032) and in readmissions (OR 1.42, p=0.005). Among the ERP cohort, opioid-tolerant patients were less likely to be highly compliant with ERP variables than non-tolerant patients (35% vs. 54%, p<0.001). Opioid tolerance was associated with a higher median LOS (5 days vs. 4 days, p<0.02) and a higher readmission rate (24% vs. 13%, p<0.01) than in non-tolerant ERP patients. In opioid-tolerant patients, high compliance with the ERP was associated with decreased odds of major complications (OR 0.10, p<0.001) and a reduction in the readmission rate (OR 0.7, p=0.003). Opioid tolerance was an independent predictor of non-compliance with the ERP (OR 0.44, p<0.001).

Conclusion: We provide evidence that opioid tolerance is associated with less favorable outcomes in patients undergoing major abdominal surgery on an ERP; this is likely due to lower pathway compliance. Minimizing opioid use prior to elective major abdominal surgery may improve compliance and postoperative outcomes.

31.03 Postoperative Biologics Reduce the Risk of Recurrent Ileocolectomy

B. P. Kline1, T. Weaver1, A. Berg2, W. Koltun1  1Penn State University College Of Medicine, Department Of Surgery, Division Of Colon And Rectal Surgery, Hershey, PA, USA 2Penn State University College Of Medicine, Department Of Public Health Sciences, Hershey, PA, USA

Introduction:
Biologic medications are often prescribed to patients with Crohn’s disease after ileocolectomy to decrease the incidence of recurrent disease. Previous studies have focused primarily on their effect on recurrence of clinical symptoms or on endoscopic recurrence. There has been relatively little data on whether these biologics actually increase the time to a recurrent ileocolectomy or protect against the need for repeat surgery.

Methods:
A 28-year retrospective chart review was performed on 409 patients with Crohn’s disease who had undergone ileocolectomy and had been prospectively recruited into the Colorectal Diseases Biobank at our institution. The study cohort comprised 241 of these patients who were naive to biologic therapy prior to their initial ileocolectomy. 106 patients received biologics (infliximab, adalimumab, certolizumab pegol, vedolizumab, or ustekinumab) after their initial surgery (ICB group) and were compared to 135 patients who did not receive biologics (IC group). Clinical characteristics including sex, race, family history of IBD, smoking history, age of onset, date of diagnosis, Montreal classification, number of ileocolectomies, date of each ileocolectomy, and date of last visit were documented. A multivariable Cox proportional hazards model was used to model time to recurrent ileocolectomy after the initial surgery. Covariates included receipt of biologics, smoking, sex, family history, and Montreal classification.
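
A minimal sketch of the time-to-event model described above is given below. It is not the authors' code: the dataframe `df`, its column names, and the binary encoding of the Montreal classification are hypothetical, and lifelines is used as one convenient implementation of the Cox proportional hazards model.

```python
from lifelines import CoxPHFitter

# df: hypothetical per-patient table after the index ileocolectomy, with
#   years_to_event  - years from initial ileocolectomy to repeat surgery or last visit
#   repeat_surgery  - 1 if a recurrent ileocolectomy occurred, 0 if censored
#   biologics, smoker, male, family_history, montreal_b2, montreal_b3 - 0/1 covariates

cols = ["years_to_event", "repeat_surgery", "biologics", "smoker",
        "male", "family_history", "montreal_b2", "montreal_b3"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="years_to_event", event_col="repeat_surgery")

# Hazard ratios with 95% CIs; an HR below 1 for 'biologics' corresponds to a
# reduced risk of repeat resection (a 40% reduction would appear as HR ~0.60).
cph.print_summary()
```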

Results:
There was no significant difference in sex, race, family history, smoking history, age of diagnosis, or Montreal classification between the two groups. The mean follow-up time was 11.8 years in the ICB group vs 14.8 years in the IC group. Only 34 of the 106 patients in the ICB group had subsequent surgeries compared to 65 of the 135 in the IC group (32% vs 48%, p = 0.017). On multivariable analysis, receipt of biologics reduced the hazard of a second surgery by 40% (confidence interval: 7%-61%, p = 0.023). No other covariates had a significant impact on the risk of recurrent surgery. The probability of a second ileocolectomy over time is shown in the attached time-to-event figure.

Conclusion:
Patients who were placed on biologics after an initial ileocolectomy had a 40% decreased risk of requiring a second surgery, reflected in both a lower number of subsequent ileocolectomies and an extended interval between surgeries in patients who received biologics. This study confirms the effect of biologics in increasing the interval to a second ileocolectomy in patients with Crohn’s disease.

31.02 Patient Engagement with Mobile Applications Does Not Differ by Level of Health Literacy

K. E. Hudak1, M. F. Gleason1, S. J. Baker1, L. N. Wood2, J. A. Cannon2, M. S. Morris2, G. D. Kennedy2, D. I. Chu2  1University Of Alabama at Birmingham, Birmingham, Alabama, USA 2University Of Alabama at Birmingham, Department Of Gastrointestinal Surgery, Birmingham, Alabama, USA

Introduction: Patient engagement applications provide an electronic platform to guide patients recovering from surgery. From self-monitoring to quality-of-life surveys, these applications promote patient engagement throughout the surgical journey. It is unclear, however, whether patients with low health literacy have similar levels of engagement compared to those with adequate health literacy. Our objective was to characterize utilization of a patient engagement application among patients with low health literacy. We hypothesized that differences in utilization measures would exist based on differences in health literacy.

Methods: Patients undergoing elective colorectal surgery at our institution from January to June 2018 were sequentially enrolled in a patient engagement application. A pre-operative survey was administered that included a validated 3-question self-reported health literacy instrument. Patients were stratified by health literacy score into four levels (high, intermediate high, intermediate, and low). Patients underwent surgery and were followed 30 days post-operatively. The primary outcomes were utilization measures, defined as the number of days using the application, the response rate to five individual surveys, and the days to completion of each survey. Comparisons were made using ANOVA and chi-squared tests.
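
The comparisons named above (ANOVA for continuous utilization measures, chi-squared for categorical ones, across the four health literacy levels) can be sketched as below. The dataframe `df` and its columns (hl_level, days_used, survey1_completed) are hypothetical stand-ins, not the study dataset.

```python
import pandas as pd
from scipy import stats

# df: hypothetical per-patient table with
#   hl_level          - "high", "intermediate high", "intermediate", or "low"
#   days_used         - number of days the application was used
#   survey1_completed - 0/1 indicator for completing the first survey

# One-way ANOVA: days of application use across the four health literacy levels.
groups = [g["days_used"].values for _, g in df.groupby("hl_level")]
f_stat, p_anova = stats.f_oneway(*groups)

# Chi-squared test: survey completion rate by health literacy level.
table = pd.crosstab(df["hl_level"], df["survey1_completed"])
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

print(f"ANOVA p = {p_anova:.3f}, chi-squared p = {p_chi2:.3f}")
```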

Results: A total of 78 patients enrolled in the patient engagement platform and underwent elective colorectal surgery with 30-day follow-up. The mean age was 59 years (IQR 15-83), 43.6% were female (n=34), and 17.9% were African American (n=14), with the remainder being white (82.1%, n=64). On assessment of health literacy level, 43.6% (n=34) of patients had high health literacy, 23.1% (n=18) had intermediate high, 19.2% (n=15) had intermediate, and 14.1% (n=11) had low health literacy. Of low health literacy patients, 18.2% were female (n=2), compared to 53.3% (n=8) of intermediate, 44.4% (n=8) of intermediate high, and 47.1% (n=16) of high health literacy patients (p=0.02). There were no statistical differences in age or race by health literacy. The average number of days spent using the application was 8.5 days, with a range of 7.9 to 9.2 days. The average response rate to the first five individual surveys was 100% for survey 1, 57.7% for survey 2, 52.6% for survey 3, 33.3% for survey 4, and 47.4% for survey 5. The average time to survey completion was 29.8 days, with a range of 3.1 to 59.4 days. On comparison by health literacy level, there was no significant difference in utilization by number of days using the application, survey response rate, or days to survey completion (p>0.05).

Conclusion: Patients with all levels of health literacy, including those with low health literacy, had similar engagement with a patient engagement platform after major surgery. These results suggest a potential role for such technology in caring for postoperative patients of all health literacy levels.

 

31.01 Racial Disparities in Treatment for Rectal Cancer Persist at Minority Serving Hospitals

P. Lu1,2, R. E. Scully1, Q. Trinh2,3, A. C. Fields1, R. Bleday1, J. E. Goldberg1, A. H. Haider1,2, N. Melnitchouk1,2  1Brigham And Women’s Hospital, Department Of Surgery, Boston, MA, USA 2Center for Surgery and Public Health, Brigham And Women’s Hospital, Boston, MA, USA 3Brigham And Women’s Hospital, Department Of Urologic Surgery, Boston, MA, USA

Introduction:

Racial disparities have been shown to exist in the treatment of rectal cancer, with black patients having poorer survival and less adequate treatment compared to white patients. Minority serving hospitals (MSH) provide healthcare to a disproportionately large percentage of minority patients in the United States. To better understand the cause of these disparities, we examined outcomes of rectal cancer patients treated at MSH using the National Cancer Database (NCDB).

Methods:
The NCDB was queried (2004-2014), and patients diagnosed with stage 2 or 3 rectal adenocarcinoma were identified. Racial case-mix distribution was calculated at the institutional level, and MSH were defined as hospitals within the top decile of the proportion of black and Hispanic patients treated. Standard of care (SOC) was defined as undergoing adequate surgery (low anterior resection, abdominoperineal resection, or pelvic exenteration), chemotherapy, and radiation. A Cox proportional hazards model was used to evaluate adjusted risk of death, and an adjusted logistic regression model was created for receipt of SOC. Analyses were clustered by facility.
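
One way to set up the facility-clustered logistic model for receipt of SOC is sketched below. This is not the authors' specification: the dataframe `df` and column names are hypothetical, an MSH-by-race interaction is used here as one way to obtain race-specific MSH effects, and cluster-robust standard errors stand in for the clustering by facility.

```python
import numpy as np
import statsmodels.formula.api as smf

# df: hypothetical NCDB-style extract (complete cases assumed so that the
# cluster groups align with the model rows), one row per patient, with
# received_soc (0/1), msh (0/1), race, age, female, comorbidity, stage,
# insurance, education, income, and facility_id.

model = smf.logit(
    "received_soc ~ msh * C(race) + age + female + comorbidity + C(stage)"
    " + C(insurance) + C(education) + C(income)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["facility_id"]})

# Odds ratios with facility-clustered standard errors; the msh terms capture
# the association between MSH treatment and receipt of SOC.
print(np.exp(model.params))
```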

Results:

60,855 patients were identified with stage 2 or 3 rectal adenocarcinoma. 55,727 (91.6%) patients were treated at non-MSH, and 5,128 (8.4%) were treated at MSH. Adjusting for age, gender, comorbidities, tumor stage, insurance, education, and income, black (OR 0.66, 95% CI 0.55-0.80, p<0.001), white (OR 0.70, 95% CI 0.61-0.80, p<0.001), and Hispanic (OR 0.68, 95% CI 0.53-0.86, p<0.001) individuals were each less likely to receive SOC at MSH vs non-MSH. In unadjusted survival analysis, risk of death was significantly higher at MSH vs non-MSH for black individuals but not for white individuals (Figure 1). When adjusting for receipt of SOC, patient characteristics, and disease-specific variables, this difference was no longer seen (HR 1.03, 95% CI 0.92-1.17, p=0.59). In adjusted analysis of the overall group, black individuals had a significantly higher risk of mortality (HR 1.20, 95% CI 1.14-1.26, p<0.001) compared to white individuals. This persisted despite inclusion of receipt of SOC in the model (HR 1.16, 95% CI 1.10-1.23, p<0.001).

Conclusions:

Treatment at MSH was associated with significantly decreased odds of receipt of SOC for rectal adenocarcinoma across racial groups. Survival was worse for black individuals compared to white individuals in both unadjusted and adjusted analyses. However, in adjusted analysis there was no difference in mortality for black individuals at MSH vs non-MSH when receipt of SOC was included in the model. Further studies are needed to examine the racial disparity that persists in rectal cancer treatment and to address barriers facing MSH in providing rectal cancer SOC to all patients.

30.10 Trends In Utilization Of Left Ventricular Assist Devices Across Medicaid Expansion

A. Ehsan1, A. Zeymo3,4, N. M. Shara3,5, F. W. Sellke1, R. Yousefzai6, W. B. Al-Refaie2,3,4  1Brown University Medical School/Rhode Island Hospital, Division Of Cardiothoracic Surgery, Providence, RI, USA 2MedStar-Georgetown University Medical Center, Department Of Surgery, Washington, DC, USA 3MedStar Health Research Institute, Washington, DC, USA 4MedStar-Georgetown Surgical Outcomes Research Center, Washington, DC, USA 5Georgetown-Howard Universities Center for Clinical and Translational Science, Washington, DC, USA 6Brown University School of Medicine/Rhode Island Hospital, Division Of Cardiology, Providence, RI, USA

Introduction: Continuous-flow left ventricular assist device (CF-LVAD) implantation is a payor-sensitive procedure influenced by preoperative comorbidities and social factors. Whether expansion in insurance coverage will further influence device utilization is unknown. We sought to assess the effects of Medicaid expansion on vulnerable populations (namely, racial/ethnic minorities and those with low income status) undergoing CF-LVAD implantation after the 2014 Medicaid expansion under the Affordable Care Act (ACA).

Methods: The 2012 to Q3 2015 State Inpatient Database (SID) was used to identify patients who received a CF-LVAD in expansion states (Maryland and Kentucky) relative to non-expansion states (North Carolina and Florida). Patients who were over 65 and patients with Medicare were excluded, as were patients who had a heart transplant, heart-lung transplant, or non-CF-LVAD device, resulting in a cohort of 555 patients. To detect disparities across race, insurance, and income strata as a result of the ACA Medicaid expansion, Poisson interrupted time series (ITS) models were used with three-way interactions and changes in slope and intercept parameters at 2014.
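
A simplified version of the Poisson interrupted time series is sketched below. It is not the authors' model: the quarterly count table and its columns are hypothetical, and the abstract's three-way interactions across race, insurance, and income strata are collapsed here to a single expansion-state indicator to keep the example short.

```python
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# quarterly: hypothetical table of CF-LVAD counts per quarter and stratum, with
#   n_lvad    - implants in the quarter
#   time      - quarter index (0, 1, 2, ...)
#   post      - 1 for quarters from Q1 2014 onward (Medicaid expansion)
#   time_post - quarters elapsed since expansion (0 before expansion)
#   expansion - 1 for expansion-state strata, 0 for non-expansion states

its = smf.glm(
    "n_lvad ~ time + post + time_post"
    " + expansion + expansion:post + expansion:time_post",
    data=quarterly,
    family=sm.families.Poisson(),
).fit()

# exp(coef) gives incidence rate ratios: the 'post' terms capture the immediate
# level change at expansion and the 'time_post' terms the change in slope after it.
print(np.exp(its.params))
```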

Results: Poisson ITS models show that within expansion states, the population of Medicaid and uninsured patients saw an increase in the utilization of LVADs immediately after ACA expansion, from 2.8 in Q4 2013 to 6.6 in Q1 2014 (IRR 2.54, p = 0.253). Utilization eventually decayed to pre-ACA levels, however, ending with 2.94 LVADs in Q3 2015 (IRR 0.920, 95% CI 0.759-1.113). Models testing for a racial effect showed no statistically preferential or disparate effects (immediate effect IRR 1.626, p = 0.545; marginal effect IRR 0.774, p = 0.174) (Figure 1).

Conclusion: Despite expanded insurance coverage, these preliminary post-ACA findings demonstrate that utilization of CF-LVADs did not increase in non-elderly racial and ethnic minorities. These preliminary results suggest that insurance coverage alone does not determine patients’ eligibility for CF-LVAD; however, they deserve additional long-term evaluation. Instead, they point toward the importance of further exploring social, medical, and hospital drivers of these disparities.

 

30.09 Is There a Difference in Heart Transplant Survival with Different Cardiac Preservation Solutions?

K. T. Carter3, S. Lirette4, A. Panos1, R. P. Cochran1, L. L. Creswell1, D. Baran2, H. Copeland1  2Sentara Healthcare, Norfolk, VA, USA 3University Of Mississippi, Surgery, Jackson, MS, USA 4University Of Mississippi, Data Science, Jackson, MS, USA 1University Of Mississippi, Cardiothoracic Surgery, Jackson, MS, USA

Introduction: Various solutions are used for donor heart preservation. While studies have compared one or two solutions, no study has directly compared outcomes among the most commonly used preservation solutions in a large cohort. We hypothesized that there is no difference in outcomes among cardiac preservation solutions.

Methods: The United Network for Organ Sharing (UNOS) database was retrospectively reviewed from May 2007 to March 2014 for donor hearts. Of the 141,500 potential donors, 1,240 were excluded for multiorgan transplants and 94,427 went on to heart transplant. The preservation solutions noted in the database and analyzed included saline, University of Wisconsin (UW), "cardioplegia", Celsior, and Custodiol. Collins solution was excluded from the study given its low usage (11 patients). The various solutions were compared against saline. The primary endpoints were recipient survival at 30 days, one year, and long-term. Logistic and Cox models were used to quantify the survival endpoints.

Results: 17,452 patients had cardiac preservation solution data available. Saline was used as the final preservation solution in 3,087 patients (18%), UW in 7,047 (40%), cardioplegia in 1,893 (11%), Celsior in 4,337 (25%), and Custodiol in 1,088 (6%). Donor age ranged from 0 to 73 years (mean=27.7, median=26), 69% were male, and 2% were diabetic. Donor ejection fraction (EF) varied from 1 to 99% (mean=61.8, median=60), and ischemic time ranged from 0.18 to 12 hours (mean=3.09, median=3.03). Among recipients whose donor hearts were procured with saline, 2,946 (96%) survived to 30 days and 2,775 (90%) to one year; UW had 6,743 (96%) 30-day and 6,331 (90%) one-year survival, cardioplegia had 1,795 (95%) 30-day and 1,668 (88%) one-year survival, Celsior had 4,109 (95%) 30-day and 3,836 (89%) one-year survival, and Custodiol had 1,051 (97%) 30-day and 988 (91%) one-year survival. Analysis of Cox models for long-term survival revealed no statistical differences compared to saline for UW (p=0.192) or Custodiol (p=0.528). Cardioplegia (HR=1.16, p=0.016) and Celsior (HR=1.14, p=0.009) were found to have a higher hazard of mortality than saline (Figure 1).

Conclusion: Celsior and cardioplegia solutions for cardiac preservation are associated with higher mortality in heart transplant recipients, while UW and Custodiol solutions are equivalent to saline. Based on the UNOS database, UW and Custodiol solutions offer more favorable outcomes for heart transplant recipients.
 

30.08 The Impact of Opioid Addiction on Cardiac Surgery: An Analysis of 1.7 Million Surgeries

R. M. Shah1, S. A. Hirji1, S. McGurk1, M. P. Pelletier1, P. S. Shekar1, T. Kaneko1  1Brigham And Women’s Hospital, Division Of Cardiac Surgery, Boston, MA, USA

Introduction:  Despite the ongoing epidemic, the clinical impact of opioid addiction on cardiac surgery outcomes is not well described. We evaluated the impact of opioid use on in-hospital outcomes among opioid-addicted patients after cardiac surgery. 

Methods:  Using the National Inpatient Sample, we isolated patients undergoing coronary artery bypass grafting, valve repair or valve replacement from 2009-2014. Patients were stratified by opioid use using ICD-9 codes. Multivariable analysis was performed to evaluate the association between opioid use and in-hospital outcomes. 

Results: 1,743,161 patients underwent cardiac surgery, including 6,960 patients who suffered from opioid abuse or dependence (0.4%). Mean age was 47.2±14.9 and 65.8±12.8 years among opioid users and non-users, respectively. Although in-hospital mortality did not differ between opioid and non-opioid users (2.9% and 2.7%), opioid users had significantly longer hospital LOS (18 vs 10 days) and higher hospitalization costs ($81,238 vs $58,654; all p < 0.05). After adjusting for patient- and hospital-level factors, opioid use was significantly associated with complete heart block (OR 1.9, 95% CI: 1.3-2.6), stroke (OR 1.71, 95% CI: 1.2-2.4), acute kidney injury (OR 1.3, 95% CI: 1.1-1.6), and longer hospital LOS (3.5 days, 95% CI: 2.4-4.6; all p<0.01) compared to non-opioid users (Figure 1).

Conclusion: Cardiac surgery patients who suffer from opioid addiction are at high-risk for developing post-operative in-hospital complications. Strategies to minimize post-operative complications are warranted to improve overall morbidity and mortality in this vulnerable population in the context of the current opioid epidemic. 

 

30.07 Effects of Systemic Complications Outweigh VAD Complications on Post-transplant Mortality

C. Lui1, A. Suarez-Pierre1, X. Zhou1, T. C. Crawford1, C. D. Fraser1, K. Giuliano1, S. Hsu2, R. Higgins1, K. J. Zehr1, G. J. Whitman1, C. W. Choi1, A. Kilic1  1The Johns Hopkins University School Of Medicine, Division Of Cardiac Surgery, Baltimore, MD, USA 2The Johns Hopkins University School Of Medicine, Department Of Cardiology, Baltimore, MD, USA

Introduction:
While the use of LVADs as a bridge to heart transplantation has increased over the last two decades, the physiological changes associated with LVAD support are poorly understood. We aimed to explore the effect of pre-transplant systemic and device-related complications on post-transplant survival for patients bridged with LVADs.

Methods:
The United Network for Organ Sharing (UNOS) database was queried for all adult heart transplant recipients (age ≥18) transplanted from April 1, 2015 to June 30, 2018. Patients were categorized into those without LVAD support and those who were bridged to transplantation with a Heartmate II, Heartmate III, or Heartware HVAD device. Device-related complications were defined as thrombosis, device infection, or device malfunction. Systemic complications were identified as a new dialysis need or ventilator dependence between the time of listing and transplantation, transfusion, or systemic infection requiring treatment with IV antibiotics within two weeks of transplantation.

Results:
4,032 patients underwent OHT without bridging with an LVAD, while 2,131 required LVAD support prior to transplantation. LVAD patients had greater rates of preoperative systemic complications (52.66% vs. 22.79%, p<0.001) compared to patients who were not bridged with an LVAD. Kaplan-Meier analysis revealed a significantly decreased one-year survival for patients who experienced a pre-transplant systemic complication (p<0.001), and this finding persisted in the population of LVAD patients bridged to transplantation despite a smaller sample size (p=0.041). Interestingly, preoperative device-related complications had no effect on one-year post-transplantation survival (p=0.83). These findings suggest that the impact of systemic complications outweighs the effect of device-related complications on one-year post-transplant survival. Multivariable Cox modeling was performed to control for potential confounders, after which systemic complications were found to impart a significantly increased risk of post-transplant mortality for LVAD patients (HR 1.42, p=0.039).

Conclusion:
Our study provides insight into the importance of pre-transplant systemic complications for LVAD patients bridged to transplantation and supports the recent changes to the UNOS allocation system for heart transplantation. These findings may help direct clinical management of LVAD patients waiting for a heart and assist in identifying at-risk recipients.
 

30.06 Predictors of Renal Dysfunction following Hypothermic Circulatory Arrest

C. V. Ghincea1, M. Aftab1, M. Eldeiry1, G. Roda1, M. Bronsert2, J. D. Pal1, J. C. Cleveland1, D. Fullerton1, T. B. Reece1  1University Of Colorado Denver, Cardiothoracic Surgery, Aurora, CO, USA 2University of Colorado, Aurora, CO, USA

Introduction:  Acute kidney injury (AKI) following aortic arch surgery is a frequent complication associated with increased morbidity and mortality. The purpose of this study was to evaluate risk factors for post-operative AKI in patients who underwent open aortic arch surgery utilizing hypothermic circulatory arrest. 

Methods: We evaluated 295 consecutive patients undergoing aortic surgery between January 2011 and March 2018. AKI was defined according to KDIGO (Kidney Disease Improving Global Outcomes) guidelines. Mean age was 58.3 ± 13.9 years (range 20-88), and 29% (79/295) were female. Mean BMI was 28.7 ± 5.9 (range 15.4-52.8), and 33% (96/295) of cases were classified as either urgent or emergent. There were 20% (60/295) reoperations. Chi-squared tests (and Fisher's exact test when necessary) were used for categorical variables, with results expressed as odds ratios. T-tests were used for continuous variables, with results expressed as means ± standard deviation. Multivariable logistic regression analysis was performed using statistically and clinically significant variables from the univariate analyses.
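
The screening-then-modeling workflow described above can be sketched as follows. This is a hypothetical illustration, not the authors' code: the dataframe `df` and its column names are invented, and the rule used to fall back on Fisher's exact test (any observed cell count below 5) is a simple stand-in for "when necessary."

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# df: hypothetical per-patient table with aki (0/1), binary risk factors
# (hypertension, diabetes, reoperation) and continuous ones (cpb_min, transfusions).

# Bivariate screen: chi-squared (Fisher's exact for sparse 2x2 tables) for
# categorical variables, t-test for continuous variables.
for var in ["hypertension", "diabetes", "reoperation"]:
    table = pd.crosstab(df[var], df["aki"])
    if (table.values < 5).any():
        stat, p = stats.fisher_exact(table)
    else:
        stat, p, _, _ = stats.chi2_contingency(table)
    print(var, round(p, 4))

for var in ["cpb_min", "transfusions"]:
    t, p = stats.ttest_ind(df.loc[df["aki"] == 1, var], df.loc[df["aki"] == 0, var])
    print(var, round(p, 4))

# Multivariable model built from the significant / clinically important variables.
mv = smf.logit("aki ~ hypertension + cpb_min + transfusions", data=df).fit()
print(mv.summary())
```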

Results: Of the 295 patients, 93 (32%) developed Stage 1 AKI or greater, 27 (9.2%) developed Stage 2 AKI or greater, and 11 (3.7%) had Stage 3 AKI. In the bivariate analysis, significant predictors of Stage 1 AKI or greater included: history of hypertension (OR 2.78, 95% CI 1.51-5.12, p=0.0008), diabetes (OR 2.36, 95% CI 1.05-5.32, p=0.0337), operative urgency (OR 1.97, 95% CI 1.18-3.29, p=0.0092), cardiopulmonary bypass (CPB) time (p=0.0006), cross-clamp time (p=0.0085), circulatory arrest time (p=0.0062), total post-operative transfusions (p=0.0004), and the need for reoperation during hospitalization (OR 3.58, 95% CI 1.80-7.09, p=0.0001). All of these, except cross-clamp time, remained significant predictors of Stage 2 AKI or greater. In the multivariable analysis, significant predictors of any AKI were history of hypertension (p=0.0101), CPB time (p=0.0363), and total post-operative transfusions (p=0.0155). Operative urgency, circulatory arrest time, nadir operative bladder temperature, and reoperation during hospitalization were not significant in the multivariable analysis.

Conclusion: Hypertension, CPB time, and total post-operative transfusions significantly predicted AKI in cases involving circulatory arrest. Interestingly, circulatory arrest time and nadir temperature were not significantly associated with AKI in the multivariable analysis, but prolonged bypass time was associated with poor renal outcomes. In conclusion, approaches to reducing bypass time should be the focus of efforts to decrease the risk of post-operative AKI in hypothermic circulatory arrest cases.

30.05 Influence of Sarcopenia on Clinical Outcomes in Patients with Advanced Esophageal Cancer

T. Makino1, T. Ishida1, K. Tanaka1, M. Yamasaki1, M. Mori1, Y. Doki1  1Osaka University, Gastroenterological Surgery, Suita, OSAKA, Japan

Introduction: Although some studies have reported an association of sarcopenia with clinical outcomes in multiple types of cancer, the association remains to be elucidated in esophageal cancer (EC). The aim of this study was to clarify the influence of muscle mass measurement on the clinical outcomes of multidisciplinary treatment for patients with EC.

Methods: A total of 165 EC patients who underwent neoadjuvant chemotherapy (NAC) followed by esophagectomy were analyzed. The cross-sectional area of the psoas muscle was measured by computed tomography at the third lumbar vertebra, and the Psoas Muscle Index (PMI) was calculated (adjusted by height). Pre- and post-NAC PMI were evaluated to investigate their associations with response and adverse events of NAC and postoperative complications, in addition to long-term survival. The cut-off values of PMI were set at 6.36 cm2/m2 for males and 3.92 cm2/m2 for females, values that have been reported for "healthy" subjects.
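
The PMI calculation itself is straightforward; a short sketch is below. The measurements are invented for illustration, and the index is computed as the L3 psoas cross-sectional area divided by height squared, which is what the cm2/m2 units above imply.

```python
import pandas as pd

# Hypothetical measurements: psoas cross-sectional area at L3 (cm^2) and height (m).
patients = pd.DataFrame({
    "sex": ["M", "F"],
    "psoas_area_cm2": [21.5, 11.8],
    "height_m": [1.72, 1.58],
})

patients["pmi"] = patients["psoas_area_cm2"] / patients["height_m"] ** 2  # cm2/m2
cutoff = patients["sex"].map({"M": 6.36, "F": 3.92})   # cut-offs cited in the abstract
patients["low_pmi"] = patients["pmi"] < cutoff          # low-PMI (sarcopenia) flag
print(patients[["sex", "pmi", "low_pmi"]])
```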

Results: The PMI significantly decreased after NAC from 7.17 to 6.96 cm2/m2 (p=0.0008), particularly in males (from 7.45 to 7.23 cm2/m2, p=0.0001), while PMI showed no fluctuation in females (from 5.21 to 5.17 cm2/m2, p=0.810). Pre-NAC PMI (low vs high group) was significantly associated with clinical response to NAC (response rate 65.1 vs 80.3%; p=0.0494) and with adverse events of NAC (neutropenia: 93.0 vs 78.7%, p=0.0337; febrile neutropenia: 53.5 vs 34.3%, p=0.0278; and hyponatremia: 51.2 vs 31.2%, p=0.0190). Meanwhile, post-NAC PMI correlated with development of overall postoperative complications (56.9 vs 33.3%; p=0.0046), in particular pneumonia (31.4 vs 9.7%; p=0.0008). Neither pre- nor post-NAC PMI was associated with patient survival.

Conclusion: Sarcopenia determined by PMI measurement via CT before and after NAC could be used to predict tumor response, adverse events of NAC, and postoperative complications in multidisciplinary treatments for EC patients.
 

30.04 Abdominal Operations Following Implantation of Ventricular Assist Devices and Heart Transplantation

H. Xing1, Y. Sanaiha1, B. Kavianpour1, S. E. Rudasill1, A. L. Mardock1, H. Khoury1, R. Morchi2, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles, Cardiothoracic Surgery, Los Angeles, CA, USA 2University Of California – Irvine, General Surgery, Orange, CA, USA

Introduction:
Ventricular assist devices (VAD) are increasingly used to offset the limited number of heart transplants (OHT). Given variable diaphragmatic implantation sites and the potential for low flow and embolism, VAD patients have been shown in small series to require emergency general surgery (EGS). The present study aimed to evaluate the perioperative incidence of EGS after VAD and OHT and to explore its impact on patient outcomes.

Methods:
The 2005-2015 National Inpatient Sample, an all-payer hospitalization database in the U.S., was utilized to identify all adult patients who received a VAD or OHT. Patients receiving both modalities during the same hospitalization were excluded. The primary outcome of interest was the rate of EGS (small and large bowel resection, cholecystectomy, ulcer procedures, and lysis of adhesions) after VAD or OHT during the same hospitalization. We employed univariate analysis to compare VAD and OHT patients who received EGS, considering over 30 comorbidities as well as hospital factors. Logistic regression was used to determine risk factors for EGS as well as the association between EGS and mortality in both the VAD and OHT populations.

Results:
In this study, an estimated 23,440 patients underwent VAD implantation and 19,391 had OHT, with VAD patients having a higher rate of EGS (2.7 vs 1.9%, p=0.012). Among VAD patients, EGS decreased by 0.2% annually (p<0.001), while the OHT group exhibited a steady trend. On average, VAD patients with EGS were older (58.7 vs 53.7 y, p=0.007) but had a similar Elixhauser comorbidity index (4.2 vs 4.0, p=0.361) compared to the OHT/EGS group. In both the VAD and OHT cohorts, requirement for EGS procedures was associated with significant unadjusted mortality (see Figure). Adjusting for patient- and hospital-level factors, VAD implantation was not independently predictive of EGS (OR 1.2, 95% CI 0.9-1.7). Infection, peritonitis, intestinal ischemia, intestinal obstruction, and paralytic ileus were associated with increased odds of EGS in both the VAD and OHT cohorts. EGS was associated with higher odds of mortality in both the VAD (OR 1.8, 95% CI 1.1-3.0) and OHT (OR 2.8, 95% CI 1.3-5.9) cohorts.

Conclusion:
Abdominal complications necessitating EGS after VAD and OHT are associated with increased adjusted odds of mortality. Although EGS rates appear to have decreased for VAD patients, the high mortality of several EGS categories remains concerning. Management strategies that ensure adequate cardiac output, reduce thromboembolic risk, and prevent ileus may mitigate the need for EGS in this vulnerable population.
 

30.03 Effect of Portable, In-Hospital ECMO on Clinical Outcomes

N. Wall1, J. E. Tonna1, A. Koliopoulou1, K. Stoddard1, S. G. Drakos2, C. H. Selzman1, S. H. McKellar1  1University of Utah, Cardiothoracic Surgery, Salt Lake City, UT, USA 2University Of Utah, Cardiovascular Medicine, Salt Lake City, UT, USA

Introduction:
The time between the onset of cardiogenic shock and initiation of mechanical circulatory support is inversely related to patient survival. The delays inherent in transporting a patient to the operating room (OR) for initiation of extracorporeal membrane oxygenation (ECMO) could prove fatal. A primed and portable VA ECMO system would allow initiation of ECMO in various locations within the hospital, including the emergency department for patients with out-of-hospital cardiac arrest (OHCA). We hypothesized that an in-hospital, portable VA ECMO program would improve outcomes for patients in cardiogenic shock.

Methods:
We retrospectively reviewed our institutional experience with VA ECMO based on two periods: the first was from the beginning of our VA ECMO program (2009), and the second from initiation of our primed and portable in-hospital ECMO system (April 2015). The primary end point was patient survival to discharge.

Results:
A total of 137 patients were placed on VA ECMO during the study period: n=66 (48%) and n=71 (52%) before and after program initiation, respectively. The average age was 55 years, and 69% were male. Non-ischemic cardiomyopathy was the etiology of heart failure in 55% of patients. There were no significant differences in demographics between the two groups. In the second era, the proportion of OR ECMO initiations decreased significantly (from 92% to 49%, p<0.01) as more patients received ECMO in other hospital units, including the emergency department for OHCA (p<0.01). Additionally, while the proportion of patients receiving central vs peripheral cannulation did not change, peripherally cannulated patients in the second era received smaller arterial cannulae (21 ± 3.6 vs 17 ± 3.1 French, p<0.01), and a greater proportion of these patients received distal limb perfusion cannulae (21% vs 45%, p=0.02). Survival to ECMO removal was similar for both groups (53% and 52%), while survival to hospital discharge was numerically higher in the current era (30% vs 42%, p=0.1). Finally, we observed a significant increase in clinical volume since initiation of the in-hospital, portable ECMO system, from an average of 10 patients/year to 26 patients/year (p<0.01).

Conclusion:
After developing an in-hospital, primed and portable VA ECMO system, we observed increased clinical volume with more ECMO being initiated in non-OR settings. We conclude that more rapid deployment of VA ECMO may extend the treatment eligibility to more patients and improve patient outcomes.
 

30.02 Predictors of Post-Operative Transfusions for Hemiarch Replacement

M. Eldeiry1, M. Aftab1, J. Pal1, J. C. Cleveland1, D. Fullerton1, T. B. Reece1  1University of Colorado, Cardiothoracic Surgery / General Surgery / School Of Medicine, Aurora, CO, USA

Introduction: Hypothermic circulatory arrest (HCA) with antegrade cerebral perfusion has allowed an evolution in the nadir temperature used for proximal arch replacement. Colder temperatures provide neuronal protection but also compound coagulopathy following bypass. We hypothesized that nadir temperatures during circulatory arrest can predict the need for post-operative transfusions in hemiarch replacements.

Methods: Data on hemiarch replacements performed at a single institution from 2009 to 2018 were analyzed. Univariate logistic regressions were performed on post-operative red blood cell (RBC) transfusions and factor (platelet, plasma, cryoprecipitate) transfusions as a function of 22 variables. These included age, gender, co-morbidities, baseline lab measurements, operative times, and nadir temperature during HCA. Multivariable logistic models were subsequently generated using the variables with significant odds ratios (OR, p < 0.05) in the univariate analysis.
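
The univariate-screen-then-multivariable approach described above can be sketched as follows. This is not the authors' code: the dataframe `df`, the candidate variable names, and the 0.05 screening threshold applied here are illustrative stand-ins.

```python
import statsmodels.formula.api as smf

# df: hypothetical per-case table with factor_transfused (0/1) and candidate
# predictors such as baseline_hgb, cr_cl, female, redo, nadir_temp_c, age.
candidates = ["baseline_hgb", "cr_cl", "female", "redo", "nadir_temp_c", "age"]

def significant_univariate(outcome, predictors, data, alpha=0.05):
    """Return predictors whose univariate logistic p-value is below alpha."""
    keep = []
    for var in predictors:
        fit = smf.logit(f"{outcome} ~ {var}", data=data).fit(disp=0)
        if fit.pvalues[var] < alpha:
            keep.append(var)
    return keep

selected = significant_univariate("factor_transfused", candidates, df)
if selected:  # build the multivariable model only if any predictor survived the screen
    mv = smf.logit("factor_transfused ~ " + " + ".join(selected), data=df).fit(disp=0)
    print(mv.summary())  # exp(coef) for nadir_temp_c gives the per-degree change in odds
```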

Results: A total of 282 cases were performed. Of 10 significant variables, lower baseline hemoglobin (Hgb) and creatinine clearance (Cr Cl), female gender, and redo status were associated with higher odds of requiring an RBC transfusion (Table). Nadir temperatures ranged from 18 to 30 °C and, along with female gender, were the only variables correlating with factor transfusions (Table). In a post-hoc analysis, nadir temperatures were not associated with a difference in neurologic outcomes (p = 0.66).

Conclusions: Overall, female patients tended to require more transfusions. Interestingly, factor transfusions were strongly associated with nadir temperature during HCA, with a 26% drop in the odds of requiring a factor transfusion for each 1 °C increase in temperature. Furthermore, the lack of difference in neurologic outcomes across nadir temperatures suggests that increasing the temperature during HCA may be safe and potentially advantageous.

30.01 An Examination of KRAS Mutations in Primary Lung Adenocarcinomas Metastatic to Brain

S. N. Mazur1, S. Dacic2, J. D. Luketich1, M. J. Schuchert1  1University of Pittsburgh Medical Center, Cardiothoracic Surgery, Pittsburgh, PA, USA 2University of Pittsburgh Medical Center, Pathology, Pittsburgh, PA, USA

Introduction: Brain metastases are arguably one of the most feared and devastating consequences of lung cancer. Previous studies have found a relationship between Epidermal Growth Factor Receptor (EGFR) mutations and brain metastasis, especially in Asian populations, where a higher rate of EGFR mutations has been observed. A potential link between Kirsten rat sarcoma viral oncogene homolog (KRAS) mutations, which are quite prevalent in Western non-small cell lung cancer (NSCLC) patients, and brain metastasis has been largely unexamined. In this study, we evaluated the prevalence of molecular mutations in patients with biopsy-proven brain metastases. We hypothesized that significantly more lung cancer patients with brain metastasis would harbor a KRAS mutation than an EGFR mutation.

Methods: We performed a retrospective review of all patients undergoing anatomic lung resection (segmentectomy or lobectomy) for primary lung adenocarcinoma with biopsy-proven brain metastases from 2002-2017. Molecular testing data were derived from pathology report summaries. Molecular mutations analyzed included KRAS (primarily codon 12/13) and EGFR (primarily exons 19 and 21). Primary outcome variables included molecular expression patterns and overall survival. Significance of molecular expression was assessed with Fisher's exact test. Survival curves were analyzed using the Kaplan-Meier method, with significance assessed by the log-rank test.
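
A minimal sketch of the analyses named above is given below. It is not the authors' code: the 2x2 table mirrors the mutation counts reported in the results that follow, the dataframe `df` and its columns (kras, egfr, months_followup, died) are hypothetical, and lifelines is used as one convenient implementation of Kaplan-Meier curves and the log-rank test.

```python
from scipy import stats
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Fisher's exact test on mutation prevalence among the 56 tested patients
# (22 KRAS-positive vs 4 EGFR-positive, per the results below).
odds_ratio, p = stats.fisher_exact([[22, 56 - 22], [4, 56 - 4]])
print(f"Fisher's exact p = {p:.4g}")

# df: hypothetical per-patient table with kras (0/1), egfr (0/1),
# months_followup, and died (0/1).
kras = df[df["kras"] == 1]
egfr = df[df["egfr"] == 1]

km = KaplanMeierFitter()
km.fit(kras["months_followup"], event_observed=kras["died"], label="KRAS-mutated")
ax = km.plot_survival_function()
km.fit(egfr["months_followup"], event_observed=egfr["died"], label="EGFR-mutated")
km.plot_survival_function(ax=ax)

result = logrank_test(
    kras["months_followup"], egfr["months_followup"],
    event_observed_A=kras["died"], event_observed_B=egfr["died"],
)
print(f"log-rank p = {result.p_value:.2f}")
```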

Results: Seventy patients with biopsy-proven brain metastases were identified. Among these, 17 (24.3%) had brain metastases at the time of clinical presentation prior to lung resection. The remaining 53 patients developed brain metastases subsequent to lung resection (median time to brain metastasis = 19.5 months, range: 1.8 – 99.7 months). The average patient age was 62.1 years, and there were 37 male patients (53.0%). Twenty-two patients received neoadjuvant treatment prior to surgery. Molecular testing was performed in 56 (80.0%) patients. KRAS mutations were identified in 22/56 (39.2%) and EGFR mutations in 4/56 (7.1%) of patients undergoing testing (p<0.0001). The most common KRAS mutation was G12C (59.1% of KRAS-positive patients). The difference in survival time between KRAS-mutated patients and EGFR-mutated patients was not significant (p = 0.67). Likewise, the difference in survival time between KRAS-mutated patients and patients who were wild-type for both KRAS and EGFR was not significant (p = 0.39).

Conclusions: An increased rate of KRAS mutations is noted in patients with brain metastases in the setting of resected lung adenocarcinoma. This may mean that the presence of KRAS mutations could play a larger role in the development of brain metastases in Western populations. The presence of KRAS mutations does not appear to affect overall survival.

 

29.10 Insurance Coverage Trends for Breast Surgery in Cisgender Women, Cisgender Men, and Transgender Men

A. Almazan2, E. Boskey1, O. Ganor1  1Boston Children’s Hospital, Plastic And Oral Surgery, Boston, MA, USA 2Harvard Medical School, Boston, MA, USA

Introduction:  The criteria used to judge the medical necessity of a surgery can vary substantially between insurance providers and related procedures. Despite procedural similarities, insurance policies enforce different requirements for reimbursement of reduction mammoplasty (RM) in cisgender women, gynecomastia excision (GE) in cisgender men, and gender-affirming mastectomy (GAM) in transgender men. In this study, we examine how analogous procedures may be treated differently for patients of different genders, and we identify differences in coverage policies across insurance providers.

Methods: For each procedure, we examined the medical necessity criteria from the websites of the 9 largest national insurance networks that have national coverage guidelines, the 6 federal plans available through the Federal Employees Health Benefits plan, and 5 state plans for a large national network with state-based coverage policies. Plan policies were reviewed to determine coverage and identify standard medical necessity criteria for each procedure. For each plan, we recorded whether each procedure was covered and whether each medical necessity criterion was adopted.

Results: Coverage was highly variable between procedures. None of the plans excluded RM from coverage. Two national networks, two federal plans, and two state plans excluded GE, and two federal plans excluded GAM. Minimum age was the medical necessity criterion with the most variability between procedures. Five of the 14 policies that covered GE explicitly required patients to be over the age of majority, compared to 10 of 20 RM policies and 16 of 18 GAM policies. GAM was the procedure with the most variable criteria between policies, with 10 different combinations of 5 criteria observed.

Conclusion: Insurance coverage and the restrictiveness of medical necessity criteria for breast tissue removal are highly variable. Coverage for GE is fairly limited, and coverage exclusions for GAM exist despite the passage of transgender-specific insurance non-discrimination laws. Medical necessity criteria for RM and GE are somewhat inconsistent across insurers. Criteria for GAM are even more variable, despite the existence of published standards of care for transgender patients. Improving the consistency of insurance coverage for breast tissue removal and standardizing procedure guidelines have the potential to streamline the process of care.

29.09 Postoperative Analgesia after Iliac Crest Bone Graft Harvest using Liposomal Bupivacaine

R. Patel1, M. R. Borrelli1, K. Rustad1, B. Pridgen1, A. Momeni1, H. P. Lorenz1, S. Virk1, D. C. Wan1  1Stanford University, Palo Alto, CA, USA

Introduction: Bone grafting of alveolar clefts is routinely performed using cancellous bone harvested from the iliac crest. Graft site morbidity, however, is common, with many patients experiencing early post-operative pain. Conventional intraoperative use of local anesthetics such as Marcaine is often insufficient and requires additional opioid-based medications to achieve adequate postoperative analgesia. Marketed under the name Exparel®, liposomal bupivacaine has been demonstrated to provide significant improvement in post-operative pain for patients undergoing bunionectomy or hemorrhoidectomy, and this medication may similarly provide relief of donor site pain in patients requiring bone graft harvest. In this study we assessed the efficacy of a single dose of intraoperatively administered liposomal bupivacaine in children undergoing iliac crest bone graft harvest for repair of alveolar clefts.

Methods: Ten patients undergoing iliac crest bone graft harvest from June 2017 to October 2017 were included in the study, which was performed under IRB approval. Five patients underwent open iliac crest bone graft harvest with administration of 0.25% Marcaine in Gelfoam at the hip donor site. The other five patients underwent open iliac crest bone graft harvest with direct infiltration of 1.3% liposomal bupivacaine around the osteotomy site. Post-operative measures included patient-reported pain score, total narcotic use (in oral morphine equivalents) during hospitalization, length of stay, postoperative steps (as measured by a Fitbit Activity Tracker), and thigh numbness.

Results: There were no significant differences in age, weight, or distribution of clefts between the two groups. Patients receiving 0.25% Marcaine were discharged on average 1.4 ± 0.55 days after surgery, and patients receiving Exparel were discharged on average 1.2 ± 0.45 days after surgery. However, differences were noted in average postoperative pain scores (4.25 ± 2.15 vs. 2.50 ± 1.51), oral morphine equivalents administered (7.08 ± 1.05 vs. 4.82 ± 1.55), and postoperative steps (498 ± 32 vs. 786 ± 157) for patients receiving 0.25% Marcaine vs. Exparel, respectively. Of note, two patients receiving liposomal bupivacaine did report transient thigh numbness lasting three days. No other complications were noted in these patients.

Conclusion: Liposomal bupivacaine may provide reliable and long-acting post-operative analgesia, contributing to a reduction in pain scores and in the need for additional narcotic administration. This is also reflected in improved post-operative activity, as measured by patient steps. Importantly, there are no recommendations for pediatric dosing of Exparel, and no studies exist in the literature describing its use in this patient population. Nonetheless, safe use was observed in this study, highlighting the promise of this analgesic to improve postoperative pain management in children undergoing alveolar bone grafting.