63.10 Acid Suppression to Prevent Gastrointestinal Bleeding in Patients with Ventricular Assist Devices

A. W. Hickman1, N. W. Lonardo1, M. C. Mone1, A. P. Presson1, C. Zhang1, R. G. Barton1, S. H. McKellar1, C. H. Selzman1  1University Of Utah, Salt Lake City, UT, USA

Introduction: The high incidence of gastrointestinal bleeding (GIB) in patients with ventricular assist devices (VAD) is well known, but there is limited evidence to support the use of proton pump inhibitors (PPI) or histamine-2 receptor antagonists (H2RA) to prevent GIB in patients whose cardiac disease requires treatment with VAD implantation.

Methods: The institutional Surgical and Cardiovascular ICU and VAD databases of an academic cardiac mechanical support and transplant center were queried for patients who underwent VAD implantation between 2010 and 2014. Devices included the HeartWare, HeartMate II, Jarvik 2000, and SynCardia TAH and could be used for left-, right-, or biventricular failure. An observational cohort study of the final population was conducted to identify which prophylactic acid-suppressing regimen (PPI, H2RA, or no acid-suppressing therapy) was associated with the fewest GIB events within 30 days after VAD implantation. Secondary outcomes included the timing, etiology, and location of all GIB events. Univariate and multivariate regression analyses were performed using clinically important covariates. A combined variable for pre-existing GIB risk was created from history of GIB and previous use of acid-suppressive medication. Given the number of GIB events, the final model included the acid-suppressing treatment and three other covariates.
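
For context, the final model described above corresponds to a standard multivariable logistic regression. A minimal sketch of how such a model yields adjusted odds ratios like those reported below; the file and column names are hypothetical, not the study's actual variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset: one row per patient.
df = pd.read_csv("vad_cohort.csv")  # gib30, ppi, h2ra, apache2, pre_hct, gib_risk

# Final model: acid-suppressing treatment plus three covariates, per Methods.
model = smf.logit("gib30 ~ ppi + h2ra + apache2 + pre_hct + gib_risk",
                  data=df).fit()

# Exponentiate log-odds coefficients into odds ratios with 95% CIs,
# the form in which the Results are reported (e.g., OR 0.18, 0.04-0.79).
ors, ci = np.exp(model.params), np.exp(model.conf_int())
print(pd.DataFrame({"OR": ors, "2.5%": ci[0], "97.5%": ci[1]}))
```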

Results: A total of 138 patients were included for analysis, of whom 19 (13.8%) had a GIB event within the 30-day period. Both H2RA and PPI use were associated with a reduction in GIB events compared with no acid-suppressive therapy. In the logistic regression analysis controlling for ICU admission APACHE II score, preoperative hematocrit, and pre-existing GIB risk, the PPI cohort had a statistically significant reduction in GIB [OR 0.18 (95% CI 0.04-0.79), p=0.026] (see table).

Conclusion: This review of patients with newly implanted VADs revealed that the use of acid-suppressing therapy during the postoperative ICU period was associated with fewer GIB events. When controlling for severity of illness and known bleeding risks, patients treated with a PPI had a statistically significantly lower risk of GIB. Cardiothoracic surgeons and ICU clinicians should consider this treatment option to reduce complications in this high-risk subset of patients.

 

63.07 Outcomes and Readmissions after Orthotopic Heart Transplant vs Left Ventricular Assist Devices

E. Aguayo1, L. Mukdad1, A. Mantha1, A. Iyengar1, R. Hernandez1, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles, Cardiac Surgery, Los Angeles, CA, USA

Introduction: Left ventricular assist devices (LVAD) have significantly expanded the range of options for stabilizing and treating end-stage heart failure. As LVAD technology continues to improve, the morbidity and mortality for patients is expected to approach that of orthotopic heart transplantation (OHT). While others have examined outcomes of individual heart replacement modalities in trials, large-scale comparisons between OHT and LVADs have not been performed. The present study compared perioperative outcomes and 30-day readmissions between LVAD implantation and OHT using a national cohort.

Methods: Patients who underwent either OHT or LVAD implantation from 2010 to 2014 were selected from the National Readmission Database (NRD), an all-payer inpatient database maintained by the Healthcare Cost and Utilization Project that yields estimates of more than 35 million annual U.S. hospitalizations. Mortality, readmission, and GDP-adjusted cost were evaluated using hierarchical linear models adjusting for socioeconomic status, demographics, and comorbidities.
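
The hierarchical models described can be approximated as mixed-effects regressions with a random intercept per hospital; a sketch under that assumption (file and column names hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical NRD extract: one row per index hospitalization.
df = pd.read_csv("nrd_oht_lvad.csv")  # cost, lvad, age, female, elixhauser, hospital_id

# Hierarchical linear model for GDP-adjusted cost: fixed effects for device
# type and case-mix, with a random intercept for each hospital.
m = smf.mixedlm("cost ~ lvad + age + female + elixhauser",
                data=df, groups=df["hospital_id"]).fit()
print(m.summary())
```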

Results: Of the 13,660 patients identified during the study period, 5,806 (43%) received OHT while 7,854 (57%) received LVADs. LVAD patients were on average older (56 vs. 52 years, P<0.001) and had less severe comorbidities based on the Elixhauser Index (5.7 vs. 6.6, P<0.001). LVAD was associated with a shorter adjusted length of stay (37.1 vs. 36.0 days, IRR 0.95, P<0.001), higher adjusted in-hospital mortality (12.3% vs. 7.0%, OR 2.01, P<0.001), higher adjusted costs ($220,052 vs. $184,625, P<0.001), and a longer readmission length of stay (10.0 vs. 6.9 days, IRR 1.28, P<0.001). All-cause readmission at 30 days (27.5% LVAD vs. 24.4% OHT, OR 1.02, P=0.81) and cost of readmission ($28,653 LVAD vs. $22,105 OHT, P=0.73) did not differ significantly between modalities.

Conclusion: In this nationwide analysis of patients who underwent cardiac replacement therapy from 2010 to 2014, patients receiving LVADs had a similar rate and cost of 30-day readmission compared to those undergoing OHT. These results further support recent studies indicating improved outcomes and survival with LVAD implantation. However, the initial cost of implantation and in-hospital mortality remain significantly greater among LVAD recipients after adjusting for demographics, comorbidities, and hospital variation. Given the projected increases in LVAD utilization and the limited donor pool, further emphasis on LVAD cost containment and comparative effectiveness is essential to the viability of this therapy in the era of value-based healthcare delivery.
 

63.08 In-hospital Outcomes and Resource Use for Robotic Mitral Valve Repair: Beyond the Learning Curve

Y. Seo1, Y. Sanaiha1, K. Bailey1, E. Aguayo1, A. Mantha3, V. Dobaria2, A. Chao1, T. Fan1, N. Satou1, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles, Los Angeles, CA, USA 2University Of California – Los Angeles, Los Angeles, CA, USA 3University Of California – Irvine, Orange, CA, USA

Introduction:
The additional cost burden associated with robotic operations has been cited as a major barrier to the wide dissemination of this technology. Robotic valve operations have demonstrated similar safety and efficacy but higher initial costs compared with open surgery. Previous studies have shown that the initial learning curve of robotic procedures is followed by a plateau phase in which operative time and complication rates stabilize. The objective of the present study was to evaluate our institutional experience with robotic mitral valve repair (rMVR) beyond the learning curve and to compare clinical and financial outcomes with the open approach.

Methods:
The prospectively maintained institutional Society of Thoracic Surgeons database was used to identify all adult patients undergoing robotic and open isolated mitral valve repair from January 2008 to December 2016. Patients with concomitant procedures or previous cardiac surgery were excluded. Multivariate regressions were performed to produce risk-adjusted operative times, complications, length of hospitalization, and costs. Financial data were obtained from the hospital database and adjusted for inflation. Categorical variables were analyzed with Fisher's exact test and continuous variables with the independent-sample t-test for unequal sample sizes. An alpha of <0.05 was considered statistically significant.
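
A sketch of the univariate testing described, with illustrative (not study) numbers; Welch's variant of the t-test is assumed here, since it is the common choice when group variances may differ:

```python
from scipy import stats

# Hypothetical 2x2 table: postoperative complication (yes/no) by approach.
table = [[53, 122],   # robotic: complication, no complication (illustrative)
         [119, 140]]  # open
odds_ratio, p_categorical = stats.fisher_exact(table)

# Continuous comparison, e.g., ICU hours per group (illustrative values).
robotic_icu = [70, 85, 90, 60, 110, 95]
open_icu = [120, 150, 160, 130, 170, 140]
t_stat, p_continuous = stats.ttest_ind(robotic_icu, open_icu, equal_var=False)
print(p_categorical, p_continuous)
```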

Results:
During the study period, 175 robotic and 259 open MVR cases were performed. Compared to open, rMVR patients were less likely to be hypertensive (51 vs 41%, p=0.002) or to have chronic lung disease (13 vs 5%, p=0.005), and had higher hematocrit values (36 vs 39%, p<0.001) and ejection fractions (58 vs 60%, p=0.023). With increasing robotic experience, operative times decreased significantly as shown in Figure 1, but rates of complications and hospital and ICU lengths of stay did not change. Compared to open surgery, rMVR was associated with 43% lower cost (p=0.001), fewer cases requiring postoperative blood products (27 vs 15%, OR=0.61, p=0.004), and a lower complication rate (46 vs 30%, OR=0.44, p=0.001). rMVR was also associated with significantly shorter ICU time (84 vs 144 hours, p<0.001) and length of stay (6.5 vs 9.9 days, p<0.001).

Conclusion:
In this longitudinal single-institution experience, an increasing number of rMVRs beyond the initial learning curve was associated with decreasing operative times. Our findings demonstrate comparable short-term outcomes between robotic and open MVR. Interestingly, the robotic approach was more cost-effective, likely owing to shorter hospital and ICU lengths of stay. With increasing experience, robotic MVR can surpass the open technique in cost-effectiveness while providing equivalent short-term outcomes.
 

63.04 Incidence, Costs and Length of Stay for Heparin Induced Thrombocytopenia in Cardiac Surgery Patients

E. Aguayo1, K. L. Bailey1, Y. Seo1, A. Mantha2, V. Dobaria1, Y. Sanaiha1, P. Benharash1  1University Of California At Los Angeles, Department Of Surgery/Division Of Cardiac Surgery, Los Angeles, CA, USA 2University Of California – Irvine, School Of Medicine, Orange, CA, USA

Introduction:
Heparin is routinely used in many cardiovascular procedures to prevent thrombosis. Heparin-induced thrombocytopenia (HIT), an antibody-mediated process, occurs in a small subset of patients exposed to heparin. While hemorrhage is thought to be rare in HIT, the incidences of stroke, pulmonary embolism, and deep vein thrombosis increase dramatically. Although some have suggested a recent increase in the incidence of HIT, data on the impact of HIT on costs and length of stay (LOS) after cardiac surgery are generally lacking. The present study aimed to assess national trends in the incidence of, and resource utilization associated with, HIT in cardiac surgical patients.

Methods:
A retrospective cohort study was performed in which adult cardiac surgery patients (≥18 years) with a diagnosis of HIT were identified in the 2009-2014 National Inpatient Sample (NIS) using International Classification of Diseases, Ninth Revision (ICD-9) codes. In-hospital mortality and GDP-adjusted cost were evaluated using hierarchical linear models adjusting for socioeconomic status, demographics, and comorbidities as measured by the Elixhauser Index.
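
NIS analyses scale the sampled discharges to national estimates with the supplied discharge weights; a minimal sketch of that step (file and column names hypothetical):

```python
import pandas as pd

# Hypothetical NIS extract with HCUP discharge weights.
nis = pd.read_csv("nis_cardiac_2009_2014.csv")  # year, hit_dx (0/1), discwt

# Each sampled record represents `discwt` discharges nationally, so weighted
# sums give national case counts and the HIT rate per year.
by_year = nis.groupby("year").apply(lambda g: pd.Series({
    "national_cases": g["discwt"].sum(),
    "national_hit": (g["hit_dx"] * g["discwt"]).sum(),
}))
by_year["hit_rate_pct"] = 100 * by_year["national_hit"] / by_year["national_cases"]
print(by_year)  # cf. the reported overall rate of 0.42%
```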

Results:
Of the 3,985,878 adult cardiac surgery patients, 16,610 (0.42%) had HIT as a primary diagnosis, with no trend over the study period. Compared to those without the diagnosis, HIT patients were on average older (67.1 vs 65.1 years, p<0.001), more often insured by Medicare (62% vs 52%, p<0.001), and had a higher Elixhauser comorbidity index (4.48 vs. 3.75, p<0.001). HIT was associated with significantly longer index LOS (19.1 vs 10.6 days, p<0.001) and higher hospitalization costs ($91,977 vs $52,090, p<0.001). After adjustment for baseline differences, HIT was independently associated with increased risk of death (OR 2.72, 95% CI: 2.41-3.06), stroke (OR 2.12, 95% CI: 1.72-2.62), deep venous thrombosis (OR 8.63, 95% CI: 7.60-9.80), and pulmonary embolism (OR 5.43, 95% CI: 4.55-6.48).

Conclusions:

Based on this national analysis of adult cardiac surgical patients, HIT disproportionately affected those with government-sponsored health insurance. The presence of HIT was associated with significantly longer LOS, higher costs, and greater comorbidity burden. The incidence of serious complications such as stroke, DVT, and PE more than doubled in HIT patients. These findings have significant implications in the era of value-based healthcare delivery. In addition to reducing unnecessary exposure to heparin, proper diagnosis and treatment are essential for favorable outcomes in these patients.

 

63.05 Immune Cell Alterations after Cardiac Surgery Associated with Increased Risk of Complications and Mortality

D. J. Picone1, N. R. Sodha1, T. C. Geraci1, J. T. Machan1, F. W. Sellke1, W. G. Cioffi1, S. F. Monaghan1  1Brown University School Of Medicine, Surgery, Providence, RI, USA

Introduction: Systemic inflammatory response syndrome (SIRS) frequently occurs following cardiac surgery, a controlled traumatic event. Typically, emphasis is placed on the white blood cell count; however, broader immune cell responses following trauma have been associated with poor outcomes. We hypothesized that lymphocyte loss and lack of recovery after cardiac surgery would predict poor outcomes.

Methods: This is a retrospective review of all adult post-cardiac surgery patients at a single institution from Oct 2008 to Oct 2015. Patients were included if they had more than two complete blood counts (CBC) drawn in the first 7 days postoperatively. Demographic data, complications, hospital and ICU lengths of stay, operative data, and mortality were obtained from the Society of Thoracic Surgeons (STS) database. Laboratory data were obtained from the medical record. Leukocyte, neutrophil, and lymphocyte counts were retained. For each component (leukocyte, neutrophil, lymphocyte), patients were grouped by response pattern: elevation/depression followed by normalization versus failure to normalize. Kaplan-Meier curves and odds ratios were used to analyze associations with 30-day mortality, pneumonia, renal failure, post-operative sepsis, and all complications.
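
A sketch of the Kaplan-Meier comparison described, using the lifelines package; the grouping labels and columns are hypothetical stand-ins for the response patterns defined above:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("postcardiac_cbc.csv")  # days_to_death, died30, wbc_pattern

kmf, ax = KaplanMeierFitter(), None
# wbc_pattern: e.g., "no_leukocytosis", "normalized", "persistent"
for label, grp in df.groupby("wbc_pattern"):
    kmf.fit(grp["days_to_death"], event_observed=grp["died30"], label=label)
    ax = kmf.plot_survival_function(ax=ax)  # 30-day survival by pattern
```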

Results: 2,401 patients were included in the leukocyte group and 1,795 patients in both the neutrophil and lymphocyte groups. Patients who developed leukocytosis that remained elevated through 7 days had an increased risk of mortality (8.7%) compared to both those whose counts normalized (2.9%, p<0.0001) and those who did not develop leukocytosis (1.8%, p<0.0001). There was no difference in mortality for the neutrophil or lymphocyte groups. Patients who did not develop post-operative lymphopenia had decreased risk compared both to those with persistent lymphopenia and to those whose lymphopenia normalized, respectively: pneumonia (OR 0.42 (CI 0.25-0.69), 0.49 (CI 0.24-0.98)); renal failure (OR 0.21 (CI 0.12-0.39), 0.36 (CI 0.15-0.8)); sepsis (OR 0.21 (CI 0.06-0.64), 0.11 (CI 0.03-0.36)); all complications (OR 0.42 (CI 0.30-0.61), 0.37 (CI 0.23-0.58)). Leukocytosis that failed to normalize was associated with increased risk of pneumonia (OR 2.5 (CI 1.2-3.4)) and all complications (OR 3.46 (CI 2.5-4.8)). There was no complication risk associated with neutrophilia.

Conclusion: Failure to normalize leukocytosis after cardiac surgery is associated with a higher risk of mortality. Development of lymphopenia in the post-operative period is associated with an increased risk of post-operative complications. Use of these routinely ordered labs may help identify patients who are at risk for complications and who should not be “fast tracked” for discharge. Future work will compare the predictive value of these laboratory tests against standard predictors from the STS database.

 

63.06 A Nationwide Study of Treatment Modalities for Thoracic Aortic Injury

Y. Seo1, E. Aguayo1, K. Bailey1, Y. Sanaiha1, V. Dobaria2, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles, Los Angeles, CA, USA 2University Of California – Los Angeles, Los Angeles, CA, USA

Introduction:
Thoracic aortic injuries (TAI) have traditionally been associated with high morbidity and mortality. Since its FDA approval in 2005, thoracic endovascular aortic repair (TEVAR) has emerged as a suitable alternative to open repair. However, the use of TEVAR and its impact on other treatment modalities at a national level remain ill-defined. This study analyzed national trends in hospital characteristics, patient characteristics, and resource utilization in the treatment of TAI.

Methods:

Patients admitted with TAI between 2005 and 2014 were identified in the National Inpatient Sample (NIS) and categorized as undergoing TEVAR, open repair, or non-operative management. The primary outcome was in-hospital mortality; secondary outcomes included complications, length of stay, and GDP-adjusted costs. Multivariate logistic regression accounting for comorbidities, concomitant injuries, and other interventions was used to determine predictors of mortality and of receiving a particular treatment.

Results:
Of the 11,257 patients admitted for TAI during the study period, 33% received TEVAR, 2% open surgery, and 12% non-operative management. Trends in the use of the various modalities are shown in Figure 1, with TEVAR showing the largest growth (p<0.001). Compared to open surgery, TEVAR patients had higher rates of concomitant brain injury (17 vs 26%, p=0.01), pulmonary injury (21 vs 33%, p<0.001), and splenic injury (2 vs 4%, p=0.031). Patients were less likely to undergo TEVAR if they were female (OR=0.73, P=0.026), older than 85 (OR=0.29, P=0.019), or had congestive heart failure (OR=0.27, P=0.014) or coronary artery disease (OR=0.34, P=0.035). In-hospital mortality was greater for open surgery (OR=3.06, p=0.003) and non-operative management (OR=4.33, p<0.001) than for TEVAR. Open repair had higher rates of cardiac complications (10 vs 4%, p<0.001). Mortality rates for TEVAR and non-operative management did not change over the study period, but mortality for open surgery increased (p=0.04). Interestingly, the cost of admissions with TEVAR increased from $35K to $95K (p=0.004), while the cost of open surgery steadily declined (p=0.031).
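
The trend claims above (stable mortality for TEVAR and non-operative management, rising mortality for open repair) are the kind of result a logistic regression of death on admission year produces; a hedged sketch with hypothetical column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

tai = pd.read_csv("nis_tai_2005_2014.csv")  # died (0/1), year, modality

# Test for a linear trend in mortality across calendar years within one
# modality; the p-value on `year` is the trend test (cf. p=0.04 for open).
open_repair = tai[tai["modality"] == "open"]
trend = smf.logit("died ~ year", data=open_repair).fit()
print(trend.pvalues["year"])
```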

Conclusion:

Our findings indicate rapid adoption of TEVAR over open surgery for the management of TAI. TEVAR is associated with lower mortality and complication rates but with increased costs not otherwise explained by patient factors. This warrants further study of the change in cost and of socioeconomic barriers to receiving optimal care.

63.03 The Additive Effect of Comorbidity and Complications on Readmission after Pulmonary Lobectomy

R. A. Jean1,2, A. S. Chiu1, J. D. Blasberg3, D. J. Boffa3, F. C. Detterbeck3, A. W. Kim4  1Yale University School Of Medicine, Department Of Surgery, New Haven, CT, USA 2Yale University School Of Medicine, National Clinician Scholars Program, New Haven, CT, USA 3Yale University School Of Medicine, Section Of Thoracic Surgery, Department Of Surgery, New Haven, CT, USA 4Keck School Of Medicine Of USC, Division Of Thoracic Surgery, Department Of Surgery, Los Angeles, CA, USA

Introduction: Hospital readmission after cardiothoracic surgery has a significant effect on healthcare delivery, particularly in the era of value-based reimbursement. Studies have shown that readmission after major surgery is significantly associated with preoperative comorbidity burden and the development of postoperative complications. We sought to investigate the additive impact of comorbidity and postoperative complications on the risk of readmission after thoracic lobectomy, and to compare which of these factors drives this phenomenon.

Methods: The Healthcare Cost and Utilization Project's Nationwide Readmission Database (NRD) for 2010 to 2014 served as the dataset for this study. The NRD was queried for discharges for pulmonary lobectomy with a primary diagnosis of lung cancer. Patients surviving to discharge were followed for 90-day readmission. Readmission rates were first calculated for low-risk patients who had no comorbidities and no postoperative complications. Rates were then compared iteratively by the presence of Elixhauser comorbidities and postoperative complications. Adjusted linear regression, accounting for patient age, sex, insurance status, and income, was used to calculate the mean change in readmission rate by the number of comorbidities and postoperative complications.
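
The adjusted linear regression described treats readmission as a linear probability model, so each coefficient reads directly as an absolute change in readmission probability; a sketch (column names hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

lob = pd.read_csv("nrd_lobectomy.csv")
# Assumed columns: readmit90 (0/1), n_comorbid, n_complications,
# age, female, insurance, income_quartile.

# Linear probability model: the coefficients on the counts correspond to the
# per-comorbidity and per-complication changes reported (2.3% and 2.7%).
m = smf.ols("readmit90 ~ n_comorbid + n_complications + age + female + "
            "C(insurance) + C(income_quartile)", data=lob).fit()
print(m.params[["n_comorbid", "n_complications"]])
```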

Results: A total of 106,262 pulmonary lobectomies were identified over the study period, of which 20,112 (18.9%) were readmitted within 90 days of discharge. The mean age of the cohort was 67.7 ± 0.11 years, with a mean of 2.5 Elixhauser comorbidities and a mean of 0.8 postoperative complications per patient. Of the 5,812 (5.5%) patients with no comorbidities or postoperative complications, 680 (11.7%) were readmitted. At the other extreme, of the 6,121 (5.8%) patients with 3+ comorbidities and 3+ complications, 1,877 (30.7%) were readmitted. After adjusting for age, sex, and insurance status, each additional comorbidity and any postoperative complication were associated with 2.3% (95% CI 2.0%-2.6%) and 2.7% (95% CI 2.3%-3.2%) increases in the probability of readmission, respectively.

Conclusion: Among patients with the lowest risk profile, the 90-day readmission rate was 11.7%. Adjusting for other factors, each additional comorbidity increased this rate by approximately 2.3%, and each postoperative complication by 2.7%. These results demonstrate that even among optimized patients without postoperative complications, a notable risk of rehospitalization remains, indicating that careful patient selection and the avoidance of complications may not fully eliminate readmission risk after pulmonary lobectomy.

 

63.02 Risk Factors for and Outcomes of Conversion to Thoracotomy during Robotic-Assisted Lobectomy

S. Hernandez2, F. Velez-Cubian2, R. Gerard2, C. Moodie1, J. Garrett1, J. Fontaine1,2, E. Toloza1,2  1Moffitt Cancer Center, Thoracic Oncology, Tampa, FL, USA 2University Of South Florida Health Morsani College Of Medicine, Tampa, FL, USA

Introduction: We aimed to identify risk factors for, and outcomes of, conversion to thoracotomy during robotic-assisted video-assisted thoracoscopic (R-VATS) pulmonary lobectomy.

Methods: We retrospectively analyzed all patients (pts) who underwent R-VATS lobectomy for primary lung cancer by one surgeon between September 2010 and August 2016. Patients were grouped into “conversion” versus “non-conversion” to open lobectomy. Demographics, co-morbidities, pulmonary function tests (PFTs), perioperative outcomes, hospital length of stay (LOS), tumor histology, and pathologic stage were compared between groups. Chi-square, analysis of variance, Student's t-test, or the Kruskal-Wallis test was used as appropriate, with p≤0.05 considered significant.
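
A sketch of the nonparametric comparison and the median (IQR) summary format used in the Results; the arrays are illustrative, not study data:

```python
import numpy as np
from scipy import stats

# Illustrative operative times (min), "conversion" vs "non-conversion" pts.
conversion = np.array([298, 260, 340, 410, 250])
non_conversion = np.array([171, 150, 190, 160, 200, 175])

# Kruskal-Wallis test (reduces to a Mann-Whitney-style test for two groups).
h_stat, p = stats.kruskal(conversion, non_conversion)

# Median with interquartile range, as reported below.
q1, med, q3 = np.percentile(conversion, [25, 50, 75])
print(f"median={med:.0f} min, IQR={q3 - q1:.0f}, p={p:.3f}")
```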

Results: Of 380 R-VATS lobectomy pts, 20 (5.3%) required conversion to open lobectomy. “Conversion” pts were similar in age, BMI, smoking history, co-morbidities, and PFTs to “non-conversion” pts. More “conversion” pts received neoadjuvant therapy than “non-conversion” pts (25.0% vs. 3.6%; p<0.001). Estimated blood loss was higher in “conversion” pts (500 mL [interquartile range (IQR)=675] vs 150 mL [IQR=150]; p<0.001), and median operative time was longer (298 min [IQR=157] vs 171 min [IQR=71]; p<0.001). Tumor laterality and having an extended resection or re-do surgery did not differ significantly between groups. Bleeding from a pulmonary vessel occurred in 50% of “conversion” pts versus 0.3% of “non-conversion” pts (p<0.001). Tumor size, histology, grade of differentiation, and lymphovascular invasion were not significant factors for conversion. Patients with pN2 disease had a higher risk of conversion (45.0% vs 16.4%; p<0.001). Pulmonary complications were similar between groups, including prolonged air leak (15.0% vs 21.9%; p=0.46), pneumonia (5.0% vs 6.4%; p=0.80), and respiratory failure (0% vs 1.9%; p=0.53), as was in-hospital mortality (5.0% vs 1.1%; p=0.14). However, “conversion” pts were at higher risk for cardiopulmonary arrest (5% vs 0.6%; p=0.029), cerebrovascular accident (5% vs 0%; p<0.001), and multi-organ failure (10% vs 0.6%; p<0.001). Median chest tube duration was longer for “conversion” pts (5.0 days [IQR=3.8] vs 4.0 days [IQR=4.0]; p=0.022), as was median hospital LOS (6.0 days [IQR=5.5] vs 4.0 days [IQR=4.0]; p=0.026).

Conclusions: Pulmonary lobectomy via the R-VATS approach is associated with a low rate of conversion to thoracotomy. However, pts who received neoadjuvant therapy or who have clinical N2 disease should be counseled about the higher risk of conversion to thoracotomy. Further, preoperative cardiovascular risk assessment and postoperative monitoring for cardiovascular events are important.

62.10 Failure to Rescue is Associated with Delayed Exploratory Laparotomy After Traumatic Injury

A. M. Stey1, T. Bongiovanni1, R. Callcut1  1University Of California San Francisco, Department Of Surgery, San Francisco, CA, USA

Introduction: Failure to rescue is an outcome measure more dependent on care-related factors than other outcome measures, and as such it may be a better means of targeting areas for improvement in quality of care. The aim of this study was to determine whether delayed exploratory laparotomy following traumatic injury was associated with higher rates of failure to rescue postoperatively.

Methods: The National Trauma Data Bank (NTDB) National Sample Program 2008-2012 weighted file was used to identify patients older than 12 years of age who underwent exploratory laparotomy following injury. Delay was defined as greater than one day after presentation. A multi-level logistic model estimated the association between delay and failure to rescue, controlling for emergency room hypotension, Glasgow Coma Scale (GCS), Injury Severity Score, and age at the patient level, and for hospital strata and response weight at the hospital level.
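
One way to realize the multi-level model described is a clustered logistic regression; the sketch below assumes a GEE with exchangeable within-hospital correlation as a stand-in for the authors' exact specification (column names hypothetical):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

ntdb = pd.read_csv("ntdb_explap.csv")
# Assumed columns: ftr (0/1), delayed (0/1), ed_hypotension, gcs, iss,
# age, hospital_id.

# Logistic model for failure to rescue with patient-level covariates,
# accounting for within-hospital correlation.
m = smf.gee("ftr ~ delayed + ed_hypotension + gcs + iss + age",
            groups="hospital_id", data=ntdb,
            family=sm.families.Binomial(),
            cov_struct=sm.cov_struct.Exchangeable()).fit()
print(m.summary())
```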

Results: A total of 2,245 patients underwent exploratory laparotomy following traumatic injury. Of those, 5.5% experienced a delay greater than one day, despite the fact that 9.9% were hypotensive in the emergency room, 9.8% had a GCS of three, and 16.5% had an Injury Severity Score greater than 25. In total, 31.2% of patients had a complication as defined in the NTDB. The overall mortality rate was 9.8%. Failure to rescue occurred in 4.1% of patients. The odds of failure to rescue were on average 2.9 times higher with delayed exploratory laparotomy than without (95% confidence interval 1.3-6.2, p=0.0008) after adjusting for emergency room hypotension, GCS, Injury Severity Score, age, and hospital strata.

Conclusion: Operative delay greater than one day for exploratory laparotomy after traumatic injury was associated with significantly higher rates of failure to rescue during hospitalization. These findings could imply either that injury sets in motion pathophysiologic insults that cannot easily be reversed by delayed exploration, or that patients whose care is delayed initially are also more likely to fail to be rescued from subsequent complications.

63.01 Nationwide Comparison of Cost & Outcomes of Transcatheter vs Surgical Aortic Valve Replacement

M. Lopez1, J. Parreco1, M. Eby1, J. Buicko1, R. Kozol1  1University Of Miami, Palm Beach, General Surgery Residency, Miami, FL, USA

Introduction:

The Healthcare Cost and Utilization Project (HCUP) released the first version of the Nationwide Readmission Database (NRD) in 2015. The NRD is unique in that it allows patients to be tracked across hospitals and supports analysis of national readmission rates. This database was used to compare clinical outcomes and costs of transcatheter aortic valve implantation (TAVI) versus surgical aortic valve replacement (SAVR).

Methods:
The NRD was queried for all patients over the age of 18 undergoing SAVR or TAVI in 2013. Propensity matching was performed 1:1 using age, gender, elective status, and 12 comorbidities. Multivariate logistic regression was then applied to the matched and unmatched datasets for all outcomes of interest. Kaplan-Meier curves were constructed for the mortality endpoint.
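
A minimal sketch of 1:1 propensity matching of the kind described; the study's exact algorithm is not specified, so this greedy nearest-neighbor version (which matches with replacement) is only one plausible reading, with hypothetical column names:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("nrd_avr_2013.csv")  # tavi (0/1), age, female, elective, ...
covariates = ["age", "female", "elective"]  # plus 12 comorbidity flags in practice

# 1) Estimate each patient's propensity of receiving TAVI.
ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["tavi"])
df["ps"] = ps.predict_proba(df[covariates])[:, 1]

# 2) Pair each TAVI patient with the nearest SAVR patient on the score.
tavi, savr = df[df["tavi"] == 1], df[df["tavi"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(savr[["ps"]])
_, idx = nn.kneighbors(tavi[["ps"]])
matched = pd.concat([tavi, savr.iloc[idx.ravel()]])  # 1:1 matched cohort
```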

Results:
There were 39,270 patients who underwent aortic valve procedures (33,191 SAVR and 6,079 TAVI). Baseline analysis revealed that TAVI patients were older and had more comorbidities. Propensity matching yielded 8,774 patients (4,387 per group). The mean total length of stay (LOS), including readmissions, was lower in TAVI patients (unmatched 12.5 vs 13.4 days, p < 0.01; matched 12.0 vs 14.2 days, p < 0.01), and the mean total cost of readmissions was similar in both groups (unmatched $23,114 vs $22,629, p = 0.57; matched $22,805 vs $22,220, p = 0.62).

Conclusion:
Initial admission costs are greater for patients undergoing TAVI than for those undergoing SAVR. However, after factoring in readmissions, the mean LOS for patients undergoing TAVI is lower than for those undergoing SAVR, and the mean cost of readmissions is similar in both groups. Additionally, the excess mortality risk associated with TAVI was eliminated after matching.
 

62.07 Rapid Release of Blood Products Protocol Optimizes Blood Product Utilization Compared to Standard MTP

S. Jammula1, C. Morrison1  1Penn Medicine Lancaster General Health, Trauma Services, Lancaster, PA, USA

Introduction: While massive transfusion protocols (MTPs) are an effective means of expeditiously delivering blood products to patients with exsanguinating hemorrhage, activation often occurs in cases with small blood-volume deficits, leading to blood product wastage and over-transfusion. We sought to determine whether implementation of a rapid release of blood products protocol (RR), which utilizes fewer resources, would change the manner in which MTPs were activated in our level II trauma center, and whether blood product wastage would decrease. We hypothesized that RR would reserve MTPs for sicker patients and that blood product wastage would decrease.

Methods: All MTP activations one year pre-RR (May 2015-2016) and one year post-RR (May 2016-2017) were analyzed. Whereas MTP releases 6 units of packed red blood cells (RBCs) and 6 units of fresh frozen plasma (FFP) per activation, RR releases only 4 RBCs and 1 FFP. MTP resource utilization and blood product wastage were compared pre- to post-RR in both trauma and non-trauma populations. p ≤ 0.05 was considered significant.

Results: A total of 61 MTPs were activated pre-RR (n=24) and post-RR (n=37) (trauma: 36; non-trauma: 25), with 34 RRs activated in the post-RR period. Compared to the pre-RR group, significantly higher transfusion rates were found for FFP (pre: 4.46±7.91 units, post: 9.20±10.5 units; p=0.050) and platelets (pre: 0.79±1.35 units, post: 1.68±1.86 units; p=0.036). Higher, albeit non-significant, transfusion rates were also observed for RBCs and cryoprecipitate in the total population. No difference in wasted blood products was found pre- to post-RR within the trauma population; however, significantly increased FFP wastage was observed post-RR in the non-trauma cohort.

Conclusion: Institution of the RR protocol resulted in more appropriate activation of MTPs but did not decrease waste within a mature trauma center.
 

62.08 Rapid and Efficient: Comparison of Native, Kaolin and Rapid Thrombelastography Assays in Trauma

J. Coleman1, E. E. Moore2, A. Banerjee1, C. C. Silliman1, M. P. Chapman1, H. B. Moore1, A. Ghasabyan2, J. Chandler2, J. Samuels1, A. Sauaia1  1University Of Colorado, Surgery, Denver, CO, USA 2Denver Health, Surgery, Denver, CO, USA

Introduction: Several thrombelastography (TEG) functional assays have been developed to reduce the time to available data to guide transfusion in trauma and hemorrhage. These assays differ in the agonists added: native (no added activator), kaolin (contact activation/intrinsic pathway), and rapid (kaolin plus tissue factor, engaging the extrinsic pathway). How this acceleration of TEG affects its efficiency and accuracy as a predictor of adverse outcomes is unknown. We hypothesized that citrated native TEG, without added activators, would best predict massive transfusion (MT).

Methods: The Trauma Activation Protocol (TAP) study is an ongoing prospective study of trauma activation patients. From April 2015 to March 2017, whole blood samples were collected immediately upon presentation for concomitant citrated native (CN-TEG), citrated kaolin (CK-TEG), and citrated rapid (CR-TEG) TEGs. We assessed predictive performance for MT (defined as >10 RBC units or death within 6 hours postinjury) via the area under the receiver operating characteristic curve (AUROC, derived from a logistic regression model) for ACT/R, maximum amplitude (MA), and angle, and via positive/negative predictive values (PV+, PV-) for fibrinolysis (LY30, stratified as >3%, >5%, or >7%). We excluded 43 (12%) patients who died within 30 minutes postinjury. Data are expressed as median (IQR) or n (%).
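
A sketch of the two performance computations described: AUROC from a logistic model's predicted probabilities, and PV+/PV- for LY30 at a cutoff (data frame and columns hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

tap = pd.read_csv("tap_teg.csv")  # mt (0/1), r_time, ma, angle, ly30

# AUROC for one assay's parameters via logistic regression, as in Methods.
m = smf.logit("mt ~ r_time + ma + angle", data=tap).fit()
print("AUROC:", roc_auc_score(tap["mt"], m.predict(tap)))

# Positive/negative predictive values for LY30 stratified at >7%.
pos = tap["ly30"] > 7
ppv = tap.loc[pos, "mt"].mean()        # P(MT | test positive)
npv = 1 - tap.loc[~pos, "mt"].mean()   # P(no MT | test negative)
print(f"PV+={ppv:.2f}, PV-={npv:.2f}")
```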

Results: Overall, 339 TAP patients had data for all TEGs (age: 32 years, IQR 26-46; 51% blunt injuries; NISS: 21, IQR 9-34). RBC transfusion was required in 140 (41%) patients; 12% (n=41) qualified as MT (>10 RBC units or death within 6 hours). The figure depicts AUROCs for ACT (CR-TEG) or R (CN-TEG, CK-TEG), MA, and angle. Compared to CR-TEG, CK-TEG performed better for ACT/R (p=0.06) and angle (p=0.09), while CN-TEG was better for MA (p=0.16). LY30 >7% produced the best balance between PV+ (CR-TEG: 47%, CK-TEG: 58%, CN-TEG: 67%) and PV- (CR-TEG: 91%, CK-TEG: 92%, CN-TEG: 92%). The 95% confidence intervals of the AUROCs and of the PVs of the different assays overlapped considerably, suggesting CR-TEG's results were not inferior to the other functional assays. When all TEG measurements (ACT/R + MA + angle + LY30>7%) for each assay (CR-, CN-, CK-TEG) were in the model, there was again considerable overlap between the predictive performances of the different assays, and all AUROCs were >0.79.

Conclusion: There was considerable overlap in the performance of the different TEG assays, suggesting that CR-TEG is a rapid and efficient method to guide hemostatic resuscitation in trauma patients.

 

62.09 Readmission for Falls Among Elderly Trauma Patients and the Impact of Anticoagulation Therapy

A. S. Chiu1, R. A. Jean1, M. Flemming1, B. Resio1, K. Y. Pei1  1Yale University School Of Medicine, Surgery, New Haven, CT, USA

Introduction:
Traumatic falls are the leading source of injury and trauma-related hospital admission for adults over 65 in the United States. A strong predictor of future falls is a history of previous falls, making patients hospitalized for a fall a high-risk population, yet exactly how frequently this group is rehospitalized for a repeat fall is unknown. Additionally, debate remains over whether to resume anticoagulation in elderly patients who fall, owing to fears of bleeding complications with repeat falls. We evaluated the rates of readmission after a fall and the frequency of bleeding complications.

Methods:

The National Readmission Database is a nationally representative, all-payer database that tracks patient readmissions. All patients over 65 admitted for a traumatic fall in the first 6 months of 2013 and 2014 were included for analysis; those who died during the index hospitalization were excluded.

The primary outcome was the 6-month readmission rate for a subsequent traumatic fall. Secondary outcomes were the frequency of death and of bleeding complications (intracranial hemorrhage, solid organ bleed, and hemothorax) during readmission. Further analysis was conducted stratifying by anticoagulation use.
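
The stratified readmission comparison reported below is a two-proportion test; a minimal sketch in which the cell counts are illustrative back-calculations, not the study's exact numbers:

```python
from statsmodels.stats.proportion import proportions_ztest

# Repeat-fall readmissions among patients on vs. not on anticoagulation
# (placeholder counts consistent with the reported rates).
readmitted = [1400, 14700]   # on anticoagulation, not on anticoagulation
totals = [31874, 310857]
stat, p = proportions_ztest(count=readmitted, nobs=totals)
print(p)  # cf. the reported p=0.0933
```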

Results:

In the first 6 months of 2013 and 2014, there were 342,731 admissions for a fall. The cohort had a mean age of 80.2 years, and 9.3% were on anticoagulation. The 6-month readmission rate for a repeat fall was 4.7%. Of those readmitted for a fall, 3.9% died during the subsequent admission and 12.6% had a bleeding complication; the mortality rate among those with a bleeding complication was 8.5%. The most common bleeding complication on readmission was intracranial bleed (90.8%), followed by hemothorax (5.8%) and solid organ bleed (3.5%).

The rate of readmission for falls among patients on anticoagulation (4.4%) was not significantly different from that among patients not on anticoagulation (4.7%, p=0.0933). The percentage of readmitted patients with bleeding complications was also not statistically different (12.2% with vs. 12.6% without anticoagulation, p=0.7629). However, the mortality rate was higher among those on anticoagulation (6.0% vs. 3.7%, p=0.0211). Specifically, among patients readmitted with a bleeding complication, those on anticoagulation had a significantly higher mortality rate (24.8% vs. 7.0%, p<0.0001).

Conclusion:

Among patients hospitalized for a fall, nearly 5% will be rehospitalized for a subsequent fall within 6 months. Patients on anticoagulation do not have increased rates of bleeding complications when hospitalized for repeat falls; however, when they do bleed, they have far higher mortality rates than those not on anticoagulation. Given the high rate of repeat falls and the potential for anticoagulation to fatally exacerbate injuries, caution should be exercised when restarting anticoagulation in elderly patients hospitalized for a fall.

62.05 The Sooner the Better: Use of a Real Time Automated Bedside Dashboard to Improve Sepsis Care

A. Jung1, M. D. Goodman1, C. Droege1, V. Nomellini1, J. A. Johannigman1, J. B. Holcomb2, T. A. Pritts1  1University Of Cincinnati, Surgery, Cincinnati, OH, USA 2University Of Texas Health Science Center At Houston, Houston, TX, USA

Introduction: Despite advances in modern critical care, sepsis and septic shock remain major contributors to morbidity and mortality. Recent data indicate that decreasing the interval between the diagnosis of sepsis and the administration of antibiotics is associated with improved patient outcomes. The effect of a visual clinical decision support system on this interval is unknown. We hypothesized that implementation of a commercially available bedside clinical surveillance visualization system would be associated with earlier antibiotic administration and decreased length of stay in surgical intensive care unit (SICU) patients.

Methods: An automated clinical surveillance visualization system was implemented in our SICU beginning in July 2016.  This system was integrated with our electronic medical record and continuously displayed vital signs and laboratory data at the bedside on a dedicated clinical dashboard (42” screen).  A bedside visual sepsis screen bundle (SSB) was added to the dashboard in June 2017.  Among other variables, the clinical dashboard displayed each patient’s calculated sepsis score based on heart rate, body temperature, respiratory rate, and white blood cell count.  The SSB indicator turned red if the sepsis score exceeded four points.  We retrospectively analyzed prospectively collected data from patients with bedside visualization systems before and after implementation of the SSB.  We determined mean sepsis score, maximum sepsis score, time to antibiotic administration, and SICU length of stay.
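
The abstract does not specify the dashboard's scoring weights; the sketch below assumes a simple SIRS-style tally in which every threshold is an illustrative guess, purely to show the flagging logic:

```python
def sepsis_score(hr: float, temp_c: float, rr: float, wbc: float) -> int:
    """Hypothetical SIRS-style score; all cutoffs are assumptions."""
    score = 2 if hr > 120 else (1 if hr > 90 else 0)
    score += 2 if (temp_c > 39 or temp_c < 35) else (
        1 if (temp_c > 38 or temp_c < 36) else 0)
    score += 2 if rr > 30 else (1 if rr > 20 else 0)
    score += 2 if (wbc > 20 or wbc < 2) else (
        1 if (wbc > 12 or wbc < 4) else 0)
    return score

def ssb_flag(hr, temp_c, rr, wbc) -> bool:
    # Per the abstract, the indicator turns red above four points.
    return sepsis_score(hr, temp_c, rr, wbc) > 4

print(ssb_flag(125, 39.2, 28, 18.5))  # True: several deranged parameters
```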

Results: During the study period, data were collected on 232 patients admitted to beds with clinical surveillance visualization systems.  Of these, 37 patients demonstrated elevated sepsis scores and were given antibiotics for clinical evidence of sepsis or septic shock (26 prior to SSB implementation, 11 after).  The mean sepsis score was similar in the pre- and post-SSB groups (1.8 vs 1.6, p=NS), as was the maximum sepsis score (6.0 vs 5.4, p=NS).  Time to antibiotic administration was significantly shorter in the post-SSB patients (pre: 48.9±14 hours vs post: 11.6±6 hours, p<0.05, see figure).  ICU length of stay was also shorter (pre: 16.8 days, post: 6.2 days, p<0.05) following introduction of the SSB.

Conclusion: Implementation of a bedside clinical surveillance visualization decision support system was associated with a decreased interval between the diagnosis of sepsis or septic shock and the administration of antibiotics. Integration of decision support systems in the ICU setting may help providers adhere to Surviving Sepsis guidelines for identification and treatment of surgical patients with infections and improve quality of care.

62.06 ADAMTS13 activity correlates with coagulopathy after trauma: a potential novel marker and mechanism

M. R. Dyer1, W. Plautz1, M. A. Rollins-Raval2, J. S. Raval2, B. S. Zuckerbraun1, M. D. Neal1  1University Of Pittsburgh, Surgery, Pittsburgh, PA, USA 2University Of North Carolina At Chapel Hill, Pathology And Laboratory Medicine, Chapel Hill, NC, USA

Introduction: ADAMTS13 is a metalloprotease that binds and cleaves von Willebrand factor (VWF), a critical regulator of thrombus formation. ADAMTS13 deficiency has been implicated as the cause of idiopathic thrombotic thrombocytopenic purpura (TTP) and is also closely associated with other diseases of microvascular injury. Trauma is a leading cause of mortality worldwide and is characterized by a unique trauma-induced coagulopathy (TIC) whose pathophysiology is incompletely understood. We sought to evaluate a possible link between ADAMTS13 activity and coagulopathy in trauma patients.

Methods: Plasma samples from an ongoing randomized trial in trauma patients were obtained for analysis; plasma from healthy individuals served as controls. A fluorescence resonance energy transfer (FRET) assay was used to determine circulating ADAMTS13 activity. ADAMTS13 antigen and antibody levels were determined by enzyme-linked immunosorbent assays (ELISA). FRET activity, antigen, and antibody levels were compared between trauma patients and healthy controls using Mann-Whitney U tests. FRET activity levels were correlated with laboratory markers of coagulopathy (INR, thromboelastography (TEG) maximum amplitude, TEG activated clotting time, and blood product requirement) using Spearman correlation coefficients.
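
A sketch of the two statistical steps described (group comparison by Mann-Whitney U, correlation by Spearman's coefficient); the arrays are illustrative, not trial data:

```python
from scipy import stats

# ADAMTS13 activity: trauma patients vs. healthy controls (illustrative).
trauma = [66, 45, 70, 52, 80, 61]
controls = [96, 101, 88, 92, 99]
u_stat, p_group = stats.mannwhitneyu(trauma, controls, alternative="two-sided")

# Correlation of activity with a coagulopathy marker, e.g., admission INR;
# a negative rho is expected (cf. the reported r=-0.625).
inr = [1.6, 2.1, 1.4, 1.9, 1.1, 1.7]
rho, p_corr = stats.spearmanr(trauma, inr)
print(p_group, rho, p_corr)
```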

Results: 70% of trauma patients had abnormal ADAMTS13 activity on FRET analysis (normal >80% IU/ml). Compared to healthy controls, plasma ADAMTS13 activity was significantly lower in trauma patients (96% vs. 66% IU/ml, p=0.0013). Correspondingly, 81% of trauma patients displayed depressed levels of circulating ADAMTS13 compared to healthy controls, and overall circulating ADAMTS13 was significantly lower in trauma patients (0.96 vs. 0.47 ug/ml, p=0.0019). There was no difference in circulating antibodies against ADAMTS13 between trauma patients and healthy controls. Interestingly, ADAMTS13 activity correlated significantly with Injury Severity Score (ISS) (r=-0.34, p<0.05). ADAMTS13 activity also correlated closely with admission INR (r=-0.625, p<0.001), admission TEG activated clotting time (r=-0.36, p<0.05), and admission TEG MA (r=0.36, p<0.05). Finally, ADAMTS13 activity correlated significantly with overall blood product transfusion requirements (r=-0.46, p<0.05).

Conclusion: Trauma results in several derangements of the normal clotting process. We present evidence that the circulating level and enzymatic activity of ADAMTS13, a key regulator of normal hemostasis, are decreased following severe trauma. This decreased activity correlated significantly with markers of coagulopathy and may represent a novel insight into potential mechanisms of TIC.
 

62.04 Functional Inclusivity of the Northwest London Trauma Network

J. M. Wohlgemut1, J. Davies2, C. Aylwin2, J. J. Morrison3, E. Cole4, N. Batrick2, S. I. Brundage4, J. O. Jansen5  1Aberdeen Royal Infirmary, Department Of Surgery, Aberdeen, SCOTLAND, United Kingdom 2Imperial College Healthcare NHS Trust, St Mary's Hospital, North West London Trauma Network, London, ENGLAND, United Kingdom 3University Of Maryland, R Adam Cowley Shock Trauma Center, Baltimore, MD, USA 4Queen Mary University Of London, Centre For Trauma Sciences, Blizard Institute, London, ENGLAND, United Kingdom 5University Of Alabama at Birmingham, Division Of Acute Care Surgery, Department Of Surgery, Birmingham, AL, USA

Introduction:
Inclusive trauma systems have been shown to improve survival compared to exclusive systems. Metrics exist to assess and validate trauma system outcomes; however, these are clinically focused and do not evaluate the appropriateness of admission patterns relative to geography and triage category. This is an important consideration in system performance, as the referral of triage-negative, non-severely injured patients to a higher echelon of care can overwhelm capability and dilute clinical experience. We propose the term "functional inclusivity" for this phenomenon, defined as the proportion of triage-negative, non-severely injured patients who were injured in proximity to a Level II/III trauma center but were admitted to a Level I facility. The aim of this study was to evaluate the usefulness of this metric in the context of the Northwest London Trauma Network.

Methods:
This was a retrospective, geospatial, observational analysis of registry data from the Northwest London Trauma Network. We included all adult (≥16 years) patients transported by road directly from the scene to the Level I trauma center between 1/1/13 and 31/12/16. Incident locations were geocoded into longitude/latitude, and drive times from each incident location to each hospital in London's trauma system were calculated using Google Maps and R. We determined the number and proportion of non-severely injured patients and of triage-negative patients who were injured in closer proximity to a Level II/III center but taken to the Level I, and evaluated changes in these metrics over time.
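
A sketch of the core metric: flag patients whose computed drive time to any Level II/III unit was shorter than to the Level I, then take proportions within the non-severely injured and triage-negative strata (file and column names hypothetical):

```python
import pandas as pd

reg = pd.read_csv("nwl_registry_drivetimes.csv")
# Assumed columns: iss, triage_positive (0/1), minutes_to_level1,
# min_minutes_to_level23 (shortest drive time to any Level II/III unit).

reg["closer_to_level23"] = reg["min_minutes_to_level23"] < reg["minutes_to_level1"]

non_severe = reg[reg["iss"] < 15]             # non-severely injured stratum
triage_neg = reg[reg["triage_positive"] == 0]

# Functional inclusivity components, cf. the reported 69% and 47%.
print(non_severe["closer_to_level23"].mean())
print(triage_neg["closer_to_level23"].mean())
```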

Results:
Of 2,051 patients, 907 (44%) were severely injured (ISS ≥15) and 1,144 (56%) non-severely injured (ISS 1-15). Of the 1,144 non-severely injured patients, 795 (69%) were injured in closer proximity to a Level II/III than to the Level I but were nevertheless taken to the Level I. Similarly, 488 patients were triage-negative, and 229 (47%) of these were injured closer to a Level II/III but taken to the Level I. Patient volume increased over time, from 23 to 85 severely injured and from 25 to 85 non-severely injured patients per quarter. However, although the numbers of non-severely injured and triage-negative patients increased over time, the corresponding proportions remained the same.

Conclusion:
This study demonstrates the potential value of the concept of functional inclusivity in characterising trauma system performance. Further work is required to establish what constitutes an acceptable level of functional inclusivity, and what the denominator should be. The Northwest London Trauma Network is a maturing system: although the overall number of trauma patients increased over the study period, system exclusivity did not increase. Future research should focus on validating and further evaluating the concept of functional inclusivity across the London trauma system as a whole and in other settings.
 

62.03 Trauma Patients have Improved Access to Post-Discharge Resources through The Affordable Care Act

A. Y. Williams1, S. N. Davis1, S. K. Hill1, P. Connor1, Y. Lee1, C. C. Butts1, J. Simmons1, S. Brevard1, L. Ding1  1University Of South Alabama Medical Center, Department Of Trauma, Surgical Critical Care And Burn Surgery, Mobile, AL, USA

Introduction: Alabama is one of 19 states that opted out of Medicaid expansion under the Affordable Care Act (ACA). Many injured patients require specialty services after hospital discharge, and it has been demonstrated that the uninsured have less access to such services and are more likely to be discharged directly home. Previous studies in Medicaid-expansion states have demonstrated up to a 20% decrease in the uninsured trauma population. The purpose of this study was to evaluate our trauma population's payer mix before and after implementation of the ACA and its effect on disposition.

Methods: This was a retrospective review of adult trauma patients treated at a level I trauma center with a large rural catchment area between 2011-2012 (pre-ACA) and 2014-2015 (post-ACA). Demographics, payer status, patient outcomes and disposition were recorded. 

Results: We identified 3,716 patients in the pre-ACA period and 3,657 patients in the post-ACA period. Patients in the post-ACA period were more likely to be insured (59.37% vs 55.14%, p<0.001) and to utilize post-discharge services (13.48% vs. 9.28%, p<0.001) compared with the pre-ACA period. Medicaid patients were more likely than uninsured patients to receive home health services (OR 6.24, CI 2.74-14.18) or intermediate nursing care (OR 16.88, CI 7.05-40.43).
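
Odds ratios with confidence intervals like those above can be derived from a 2x2 table with the standard log-OR formula; a sketch with placeholder counts:

```python
import numpy as np

# 2x2 table (placeholder counts): rows = Medicaid vs. uninsured,
# columns = received home health services (yes, no).
a, b = 40, 160    # Medicaid: yes, no
c, d = 12, 300    # uninsured: yes, no

odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```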

Conclusion: After implementation of the ACA, our trauma center experienced a statistically significant decrease in the overall number of uninsured trauma patients, with increased utilization of extended services after discharge. However, in our state without Medicaid expansion, this still leaves 40% of our patient population without health insurance.

 

62.02 Are Trends in Fatal and Non-Fatal Firearm Related Injuries in Miami-Dade County Racially Divergent?

A. B. Padiadpu1, S. A. Eidelson1, J. Stoler2, M. B. Mulder1, R. Rattan1, T. L. Zakrison1  1University Of Miami, Ryder Trauma Center, The DeWitt Daughtry Family Department Of Surgery, Miller School Of Medicine, Miami, FL, USA 2University Of Miami, Department Of Geography And Regional Studies, Department Of Public Health Sciences, Miami, FL, USA

Introduction: Firearm-related injuries are a significant public health problem in Florida and a growing cause of preventable morbidity and mortality, yet the associated sociodemographic factors remain poorly examined. This study addressed this gap by assessing the racial, ethnic, gender, and age demographics of firearm-related injuries in Miami-Dade County over a 20-year period.

 

Methods: We retrospectively reviewed all patients involved in firearm incidents who were treated at the only American College of Surgeons-verified Level 1 trauma center in Miami-Dade County during the study period. We analyzed demographic trends and tested for significance (p<0.05) using chi-square and ANOVA tests. Parametric data are expressed as mean ± SD and nonparametric data as median (interquartile range).

 

Results: During 1992-2012, 11,839 patients were transported to our trauma center after a firearm-related injury. The mean age was 30±13 years; 90% were male, 66% Black, and 34% White. Nineteen percent of the population was Hispanic (91% of whom were White). 97% of the incidents were intentional. Patients had an Injury Severity Score (ISS) of 14±15, a hospital length of stay (LOS) of 2 (0-8) days, and 13% mortality. Injury and mortality rates trended downward until 2000 and then upward through 2012, more so among Blacks (p≤0.001). Black males comprised 60% of the population, with a mean age of 28±11 years, ISS of 14±15, LOS of 3 (0-8) days, and 12% mortality; among these patients, mortality has trended upward since 2000 (p≤0.001). For patients ≤18 years of age (14% of the population), the median age was 17 (16-18) years, with ISS 13±15, LOS 2 (0-7) days, and a 10% mortality rate. In this category, mortality rose over the last 12 years of the study period (p=0.09), significantly so in the Black pediatric population (p≤0.001).

 

Conclusion: This analysis reveals a sustained increase in firearm-related injuries and associated mortality since 2000, an alarming trend driven mainly by the rise in injuries inflicted on younger Black males. These results call for focused evaluation and public health intervention to prevent firearm-related violence in this highly vulnerable population.

61.10 Altered Shear Forces Precipitate Fibrotic Remodeling in Discrete Subaortic Stenosis

E. H. Steen1, M. Fahrenholtz4, M. Kang3, L. Wadhwa4, J. Grande-Allen1,3, S. Balaji1, S. Keswani1,2  1Baylor College Of Medicine, Department Of Surgery, Houston, TX, USA 2Texas Children's Hospital, Department Of Pediatric Surgery, Houston, TX, USA 3Rice University, Department Of Bioengineering, Houston, TX, USA 4Texas Children's Hospital, Department Of Surgery, Congenital Heart Surgery Service, Houston, TX, USA

Introduction: Discrete subaortic stenosis (DSS) is characterized by the formation of fibromembranous tissue in the left ventricular outflow tract (LVOT), leading to cardiac dysfunction. Surgical resection carries a high rate of recurrence, which characterizes an aggressive phenotype. The pathogenesis of DSS is theorized to arise from altered LVOT geometry producing shear forces that stimulate a fibrotic response. We used computational modeling of patient echocardiograms (ECHO) to develop a novel parallel-flow bioreactor that mimics DSS shear forces. We hypothesize that increased shear force on cardiac endothelial cells promotes endothelial-fibroblast crosstalk via CD31, a known mechanosensory molecule, and via inflammatory cytokines to stimulate fibrotic tissue formation.

Methods: Human DSS tissues (n=7; 3 aggressive/recurrent, 4 primary) were compared using special stains and ICC to develop a histology-based scoring system, which was then compared to each patient's preoperative ECHO report to define a correlation between ECHO findings and an aggressive histologic phenotype. In a parallel-plate flow chamber, co-cultures of endocardial endothelial cells (EEC) and cardiac fibroblasts (CF) isolated from porcine LVOT were subjected to static, low-, and high-shear conditions modeled on patient ECHO data, then analyzed by ICC (CD31, VE-cadherin) and PCR. To test the role of CD31 and EEC, CD31 signaling was inhibited with a Src inhibitor under high-shear conditions and similarly analyzed.
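
Fold changes like those reported below are conventionally computed from qPCR data with the 2^-ΔΔCt method; the sketch assumes that convention, and the Ct values are placeholders:

```python
def fold_change(ct_target_exp, ct_ref_exp, ct_target_ctrl, ct_ref_ctrl):
    """2^-ddCt: target-gene fold change normalized to a reference gene."""
    d_ct_exp = ct_target_exp - ct_ref_exp      # e.g., high-shear condition
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl   # e.g., low-shear condition
    return 2 ** -(d_ct_exp - d_ct_ctrl)

# Placeholder Ct values for a chemokine vs. a housekeeping gene.
print(fold_change(24.0, 18.0, 28.0, 18.1))  # ~15-fold increase
```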

Results: Histological analysis reveals notable differences in aggressive membranes, characterized by immature collagen with increased matrix turnover and increased cellularity compared to non-aggressive samples. These parameters were developed into a scoring system in which aggressive samples had higher scores, which were commensurately associated with high preoperative mean LVOT gradients (55±5 mmHg by ECHO), an accepted predictor of DSS recurrence. In vitro, high-shear conditions cause de-localization of mechanosensitive CD31 from VE-cadherin-rich cellular junctions. PCR array reveals upregulation of pro-inflammatory CSF1 and CSF3 under high shear (4.2- and 4.7-fold vs. low shear) and upregulation of the chemoattractants CCL3, CCL4, and CCL5 (15.3-, 15.6-, and 7.6-fold). Inhibition of CD31 signaling attenuates the effect of high shear (3.5-fold decrease in CCL3 and CCL4, 4-fold decrease in CCL5, and 2-fold decrease in CSF3).

Conclusion: Our data suggest that increased shear forces change the expression profile of mechanosensitive proteins and cytokines in EEC, in part through a CD31-dependent signaling mechanism. Additionally, we developed a histology-based scoring system that correlated with increased shear force on ECHO to define an aggressive/recurrent DSS phenotype. This work implicates shear forces in the development of DSS and may help predict patients susceptible to recurrence and elucidate molecular therapeutics for DSS and other disease processes characterized by fibrosis.

62.01 High Value Care through Standardization of Continuous Renal Replacement Therapy

J. Tseng1, N. Minissian2, R. J. Halbert2, P. Hain2, T. Griner2, H. Rodriguez2, S. Barathan2, R. F. Alban1  1Cedars-Sinai Medical Center, Department Of Surgery, Los Angeles, CA, USA 2Cedars-Sinai Medical Center, Los Angeles, CA, USA

Introduction:
Continuous renal replacement therapy (CRRT) is widely used to manage renal failure in critically ill patients. Though CRRT utilization has increased nationwide, it is significantly costlier than intermittent hemodialysis, and the indications for initiation, maintenance, and cessation lack consensus and are not standardized. To target high-value care at our institution, we organized a multidisciplinary task force to evaluate current CRRT utilization patterns, standardize its usage, and assess the outcomes of our intervention.

Methods:
A multidisciplinary task force consisting of intensivists, nursing staff, and nephrologists was established in October 2015 to assess and promote high-value utilization of CRRT. This team created a set of evidence-based guidelines to standardize the initiation, maintenance, and termination of continuous dialysis. Other interventions were implemented to improve transparency regarding CRRT, mandate daily communication between medical teams and ancillary staff, encourage goals-of-care discussions, revise electronic order sets, and curb excess dialysis-related laboratory orders. Patients receiving CRRT from fiscal years 2013 to 2017, before and after the intervention, were compared.

Results:
The total volume of patients on CRRT increased by 104% (216 to 440) from 2013 to 2016, then decreased to 326 patients in 2017. Similarly, the total number of CRRT days increased by 120% (1,490 to 3,285) from 2013 to 2016, then decreased to 1,879 days in 2017. Prior to our intervention, the average duration of CRRT peaked in 2015 at a mean of 7.69±7.46 days (median 8, IQR 3-10). After our intervention, the average duration decreased to a mean of 5.76±4.50 days (median 4, IQR 3-8) in 2017 (p=0.018). Solid organ transplant patients utilized continuous dialysis for longer durations than non-transplant patients. The total direct cost of CRRT per case decreased from $12,167.44 in 2013 to $10,545.96 in 2017, translating to 13% cost savings. Upon termination of CRRT, the proportion of patients who expired on CRRT decreased from 26.4% in 2013 to 5.8% in 2016, while the proportion of patients expected to transition to hospice care increased from 21.7% to 53.1%. An increasing number of patients were enrolled in hospice upon hospital discharge after our intervention, from 0.4% in 2014 to 8.8% in 2017 (p<0.001).

Conclusion:
By establishing a task force to critically review utilization of continuous renal replacement therapy, standardize its usage, and promote daily communication regarding patient progress and goals of care, we significantly reduced the cost and duration of CRRT. In addition, our intervention was associated with fewer patients expiring on continuous dialysis and more patients transitioning to comfort care measures. Solid organ transplant patients utilize CRRT at higher rates than non-transplant patients and may be the focus of further efforts to achieve high-value care.