69.05 Survival Outcome of RNF43 Mutant-type Differs between Right-sided and Left-sided Colorectal Cancer

Y. Shimada1, Y. Tajima1, M. Nagahashi1, H. Ichikawa1, K. Yuza1, Y. Hirose1, T. Katada1, M. Nakano1, J. Sakata1, H. Kameyama1, Y. Takii2, S. Okuda3, K. Takabe4, T. Wakai1  1Niigata University,Digestive And General Surgery,Niigata, NIIGATA, Japan 2Niigata Cancer Center Hospital,Surgery,Niigata, NIIGATA, Japan 3Niigata University,Bioinformatics,Niigata, NIIGATA, Japan 4Roswell Park Cancer Institute,Breast Surgery,Buffalo, NY, USA

Introduction: Right-sided colorectal cancer (CRC) demonstrates worse survival outcomes than left-sided CRC, and its clinicopathological characteristics differ from those of left-sided CRC. Recently, the importance of RNF43 mutation, along with BRAF mutation, has been reported in the serrated neoplasia pathway. We hypothesized that the clinical significance of RNF43 mutation differs between right-sided and left-sided CRCs, and that RNF43 mutation is associated with the tumor biology of right-sided CRC. To test this hypothesis, we investigated the clinicopathological characteristics and survival outcomes of patients with RNF43 mutation in right-sided and left-sided CRCs.

Methods: One hundred nine patients with microsatellite-stable Stage IV CRC were analyzed: 33 with right-sided CRC and 76 with left-sided CRC. We investigated genetic alterations using a 415-gene panel, which includes RNF43 and other genes associated with tumor biology. We compared clinicopathological characteristics between RNF43 wild-type and RNF43 mutant-type using Fisher's exact test. Moreover, we classified RNF43 mutant-type according to primary tumor sidedness, i.e., right-sided or left-sided RNF43 mutant-type, and compared clinicopathological characteristics between the two groups. Overall survival (OS) of RNF43 wild-type, right-sided RNF43 mutant-type, and left-sided RNF43 mutant-type was compared using the log-rank test.
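The two tests named above are standard, and a minimal Python sketch of how they are typically run follows; all counts, durations, and variable names below are hypothetical illustrations, not data from this study.

```python
import numpy as np
from scipy.stats import fisher_exact
from lifelines.statistics import logrank_test

# Fisher's exact test on a hypothetical 2x2 table of RNF43 status vs. a
# clinical feature, e.g. rows = [mutant, wild-type], cols = [age>=65, age<65]
table = np.array([[6, 2],
                  [45, 56]])
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact: OR={odds_ratio:.2f}, p={p_value:.3f}")

# Log-rank test comparing overall survival between two groups
# (durations in months; event = 1 if death was observed)
months_right = np.array([4.0, 7.5, 11.2])              # hypothetical
events_right = np.array([1, 1, 1])
months_left = np.array([20.1, 35.4, 18.0, 40.2, 25.5])
events_left = np.array([1, 0, 1, 0, 1])
result = logrank_test(months_right, months_left,
                      event_observed_A=events_right,
                      event_observed_B=events_left)
print(f"log-rank: p={result.p_value:.3f}")
```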

Results: Comprehensive genomic sequencing (CGS) revealed that 8 of 109 patients (7%) had RNF43 mutation. RNF43 mutation was significantly associated with older age (65 or more) (P = 0.020), presence of BRAF mutation (P = 0.005), and absence of KRAS and PTEN mutations (P = 0.049 and P = 0.026, respectively). RNF43 mutation was observed in 3 of 33 right-sided CRCs (9%) and 5 of 76 left-sided CRCs (7%). Interestingly, RNF43 mutations in right-sided CRC were nonsense (R145X) or frameshift (P192fs, S262fs) mutations, while those in left-sided CRC were missense mutations (T58S, W200C, R221W, R519Q, R519Q). All three right-sided RNF43 mutant-type patients were 65 or older, female, and BRAF V600E mutant-type. Right-sided RNF43 mutant-type showed significantly worse OS than RNF43 wild-type and left-sided RNF43 mutant-type (P = 0.007 and P = 0.046, respectively).

Conclusion: Clinicopathological characteristics and survival outcomes of patients with RNF43 mutation might differ between right-sided and left-sided CRC. In right-sided CRC, RNF43 mutation defines a small but distinct molecular subtype associated with aggressive tumor biology along with BRAF V600E mutation. Future preclinical and clinical studies might need to focus on RNF43 mutation to improve survival outcomes in right-sided CRC.

 

69.04 What Drives Surgeon Workload in Colorectal Surgery?

K. E. Law1,2, B. R. Lowndes1,2,3, S. R. Kelley4, R. C. Blocker1,2, D. W. Larson4, M. Hallbeck1,2,4, H. Nelson4  1Mayo Clinic,Health Sciences Research,Rochester, MN, USA 2Mayo Clinic,Kern Center For The Science Of Health Care Delivery,Rochester, MN, USA 3Nebraska Medical Center,Neurological Sciences,Omaha, NE, USA 4Mayo Clinic,Surgery,Rochester, MN, USA

Introduction: Surgical techniques and technology are continually advancing, making it crucial to understand potential contributors to surgeon workload. Our goal was to measure surgeon workload in abdominopelvic colon and rectal procedures and identify possible contributors.

Methods: Between February and April 2018, surgeons were asked to complete a modified NASA-Task Load Index (NASA-TLX) after each surgical case, which added questions on distractions, fatigue, procedural difficulty, and expectation to the validated NASA-TLX questions. All but the expectation question were rated on a 20-point scale (0=low, 20=high). Expectation was rated on a 3-point scale (i.e., more difficult than expected, as expected, less difficult than expected). Patient and procedural data were analyzed for procedures with completed surveys. Surgical approach was categorized as open, laparoscopic, or robotic.
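As a rough sketch of the kind of analysis reported below (subscale summaries and a one-way ANOVA of difficulty across expectation categories), the following Python fragment uses entirely hypothetical survey responses; it is illustrative only, not the study's analysis code.

```python
import pandas as pd
from scipy.stats import f_oneway

# Hypothetical per-case responses on the 0-20 subscales
surveys = pd.DataFrame({
    "mental_demand":   [12, 8, 15, 9, 14, 6],
    "physical_demand": [10, 7, 14, 8, 13, 5],
    "effort":          [13, 9, 16, 10, 15, 7],
    "difficulty":      [11, 6, 17, 9, 16, 5],
    "expectation": ["as expected", "less difficult", "more difficult",
                    "as expected", "more difficult", "less difficult"],
})

# Mean (M) and SD per subscale, as summarized in the Results
print(surveys[["mental_demand", "physical_demand", "effort"]].agg(["mean", "std"]))

# One-way ANOVA of difficulty ratings across expectation categories
groups = [g["difficulty"].to_numpy() for _, g in surveys.groupby("expectation")]
f_stat, p_value = f_oneway(*groups)
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.3f}")
```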

Results: Seven surgeons (3 female) rated 122 procedures over the research period using the modified NASA-TLX survey. Across the subscales, mean surgeon-reported workload was highest for effort (M=10.83, SD=5.66), followed by mental demand (M=10.18, SD=5.17) and physical demand (M=9.19, SD=5.60). Procedures were rated moderately difficult (M=10.74, SD=5.58). There was no significant difference in procedural difficulty or fatigue by surgical approach.
Fifty-four percent (n=66) of cases were rated as meeting expected difficulty, with 35% (n=43) considered more difficult than expected. Mean surgeon-reported procedural difficulty aligned with expectation with a mean procedural difficulty level of 9.29 (SD=5.11) for as expected, 14.39 (SD=4.49) for more difficult than expected, and 5.92 (SD=4.15) for less difficult than expected (F(2,118)=21.89, p<0.001). Surgeons also reported significantly more fatigue for procedures considered more difficult than expected (F(2,118)=8.13, p<0.001) compared to procedures less difficult than expected.
Self-reported mental demand (r=0.88, p<0.001), physical demand (r=0.87, p<0.001), effort (r=0.90, p<0.001), and surgeon fatigue (r=0.71, p<0.001) were strongly correlated with procedural difficulty. Furthermore, fatigue was strongly correlated with overall workload and the NASA-TLX subscales (r>0.7, p<0.001). Surgeons most frequently reported patient anatomy and body habitus, unexpected adhesions, and unfamiliar team members as contributors to ease or difficulty of cases.

Conclusion: Self-reported mental demand, physical demand, and effort were strongly correlated with procedural difficulty and surgeon fatigue. Surgeons attributed case ease or difficulty levels to patient and intraoperative factors; however, procedural difficulty did not differ across surgical approach. Understanding contributors to surgical workload, especially unexpectedly difficult cases, can help define ways to decrease workload.

 

69.03 Population-based Analysis of Adherence to Extended VTE Prophylaxis after Colorectal Resection

A. Mukkamala1, J. R. Montgomery1, A. De Roo1, S. E. Regenbogen1  1University Of Michigan,Surgery, Center For Healthcare Outcomes And Policy,Ann Arbor, MI, USA

Introduction:  Since 2012, the American College of Chest Physicians (ACCP) has recommended 4 weeks of pharmacologic prophylaxis against venous thromboembolism (VTE) after abdominopelvic cancer surgery. Additionally, there is growing expert consensus favoring extended prophylaxis after surgery for inflammatory bowel disease (IBD). National studies have revealed very low uptake of prophylaxis before adoption of the ACCP guideline, but it remains unclear to what extent it has been adopted in standard practice in recent years. We sought to understand responsiveness to guidelines versus expert opinion by evaluating adherence to extended VTE prophylaxis after colectomy in a statewide registry. 

Methods:  We identified all patients in the Michigan Surgical Quality Collaborative (MSQC) registry who underwent elective colon or rectal resection between October 2015 (when MSQC first began recording post-discharge VTE prophylaxis) and February 2018. MSQC is an audited and validated, statewide population-based surgical registry including all major acute care hospitals in the state. We used descriptive statistics and chi-square tests to compare annual statewide utilization trends for extended VTE prophylaxis with low molecular weight heparin by operative year and by diagnosis among all patients without documented exclusions.
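A minimal sketch of the chi-square comparison of utilization across operative years might look like the following; the counts are hypothetical placeholders, not MSQC registry data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: operative year; columns: [received extended prophylaxis, did not]
counts = np.array([[20, 224],    # 2015 (hypothetical)
                   [55, 556],    # 2016
                   [120, 525]])  # 2017-18
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.4f}")
```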

Results: Of 5722 eligible patients, 373 (6.5%) received extended VTE prophylaxis after discharge. Use of extended prophylaxis was similar between patients with cancer (282/1945, 14.5%) and IBD (31/242, 12.8%), but was significantly increased when compared with patients with other indications (60/3051, 1.97%, p<0.001). Overall use during the study period significantly increased among cancer patients from 8.2% in 2015 to 9.0% in 2016 to 18.6% in 2017-18 (p=0.001). Use among IBD patients also significantly increased from 0% to 6.6% to 17.1% (p=0.03). Use among patients with other diagnoses was rare and did not vary over the study period (1.5 to 2.4%, p=0.50). Annual trends are shown in Figure 1.

Conclusion: Use of extended VTE prophylaxis after discharge is increasing, but remains uncommon despite guidelines recommending its use after colorectal cancer surgery and expert consensus supporting its use in IBD. Improving adherence may require quality implementation initiatives accompanied by payment incentives, in addition to better dissemination of guidelines and recommendations.

 

69.02 Statewide Utilization of Multimodal Analgesia and Length of Stay After Colectomy

A. C. De Roo1,2, J. V. Vu1,2, S. E. Regenbogen1,2,3  1University Of Michigan,Center For Healthcare Outcomes And Policy,Ann Arbor, MI, USA 2University Of Michigan,Department Of General Surgery,Ann Arbor, MI, USA 3University Of Michigan,Division Of Colorectal Surgery,Ann Arbor, MI, USA

Introduction:
Multimodal analgesia is a critical component of both enhanced recovery protocols (ERP) and efforts to reduce opioid misuse after surgery. Postoperative multimodal pain therapy, which uses more than one class of pain medication (opioids, acetaminophen, non-steroidal anti-inflammatories [NSAIDs], gabapentinoids, and regional and epidural anesthesia), has been associated with lower pain scores, decreased opioid use, and avoidance of opioid inhibition of gut motility. Whether multimodal analgesia is widely used in practice remains unknown, and its effect on hospital length of stay has not been evaluated outside of controlled trials.

Methods:
Within the population-based, statewide Michigan Surgical Quality Collaborative (MSQC), we evaluated all adult patients undergoing elective colorectal resection between 2012 and 2015. Colectomy has been a targeted procedure for ERP implementation, and MSQC collects ERP-specific data elements for colectomy, including details of perioperative analgesia. The primary outcome was mean postoperative hospital length of stay (LOS). To reduce bias from rare, extremely prolonged hospitalizations, we winsorized LOS at 30 days, which excluded 27 patients. T-tests were used to evaluate associations between LOS and opioid-only versus multimodal therapy, defined as use of two or more classes of pain medication.
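A minimal sketch of the LOS handling and comparison described above, using hypothetical arrays in place of MSQC records:

```python
import numpy as np
from scipy.stats import ttest_ind

los_multimodal = np.array([4, 5, 6, 5, 7, 35], dtype=float)   # days (hypothetical)
los_opioid_only = np.array([5, 6, 8, 6, 9, 41], dtype=float)

# Winsorize at 30 days: cap extreme stays so rare outliers do not
# dominate the group means
los_multimodal = np.minimum(los_multimodal, 30)
los_opioid_only = np.minimum(los_opioid_only, 30)

t_stat, p_value = ttest_ind(los_multimodal, los_opioid_only)
print(f"mean LOS: {los_multimodal.mean():.1f} vs "
      f"{los_opioid_only.mean():.1f} days, p={p_value:.3f}")
```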

Results:
Among the 7249 patients who underwent elective colectomy, 6746 (93.1%) received opioids, and 2391 (33.0%) received no analgesia other than opioids. Acetaminophen was used in 2701 patients (37.2%), NSAIDs in 2551 (35.2%), and epidural, spinal, or regional anesthesia in 1400 (19.3%). Average LOS for patients receiving multimodal analgesia (5.4 days, 95% CI 5.3-5.5) was significantly shorter than for patients receiving opioids alone (6.0 days, 95% CI 5.8-6.2; p<0.001).

Conclusion:
One third of patients undergoing colectomy in the state of Michigan received opioid analgesia alone. Use of opioid-sparing multimodal analgesia, compared with opioids alone, is associated with a small reduction in hospital LOS, perhaps from improved pain control and lower rates of ileus, and could therefore accrue cost savings at a population level. Multimodal analgesia is also an essential component of efforts to combat opioid use disorders related to surgical encounters, and Michigan hospitals have room for improvement. Ongoing improvement efforts will aim for near-universal use of opioid-sparing pain regimens to reduce opioid-related adverse effects and opioid exposure.
 

69.01 Achieving the High-Value Colectomy: Preventing Complications or Improving Efficiency

J. V. Vu1, J. Li3, D. S. Likosky2, E. C. Norton4,5, D. A. Campbell1, S. E. Regenbogen1  1University Of Michigan,Surgery,Ann Arbor, MI, USA 2University Of Michigan,Cardiac Surgery,Ann Arbor, MI, USA 3University Of Michigan,School Of Public Health,Ann Arbor, MI, USA 4University Of Michigan,Economics,Ann Arbor, MI, USA 5University Of Michigan,Health Management And Policy,Ann Arbor, MI, USA

Introduction:  As payers increasingly tie reimbursement to value, there is increased focus on both outcomes and expenditures for surgical care. One way of measuring hospital value is by comparing episode payments to adverse outcomes. While postoperative complications increase spending and decrease value, it is unknown whether hospitals that achieve the highest value in major surgery also deliver efficient care beyond preventing complications. We aimed to identify the contributions of clinical quality and efficiency of perioperative care to high-value strategies for success in episode-based reimbursement for colectomy.

Methods:  This was a retrospective observational cohort study of elective colectomy patients from 2012 to 2016 from 56 hospitals in the Michigan Surgical Quality Collaborative and Michigan Value Collaborative. Each hospital was assigned a value score (proportion of cases without adverse outcome divided by mean episode payment). Adverse outcomes included postoperative complications, reoperation, or death within 30 days of surgery. Risk-adjusted payments for the total 30-day episode and for components of care were compared between hospital value tertiles using ANOVA.
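The value score and tertile assignment described above could be sketched as follows; the hospital-level DataFrame is hypothetical and stands in for the registry data.

```python
import pandas as pd

hospitals = pd.DataFrame({
    "hospital_id": ["A", "B", "C", "D", "E", "F"],
    "cases": [120, 80, 200, 150, 90, 60],
    "cases_no_adverse_outcome": [95, 55, 170, 120, 60, 50],
    "mean_episode_payment": [21000, 24000, 18000, 19500, 23000, 17500],
})

# Value = proportion of cases without adverse outcome / mean episode payment
hospitals["value_score"] = (
    hospitals["cases_no_adverse_outcome"] / hospitals["cases"]
) / hospitals["mean_episode_payment"]

# Split hospitals into low / medium / high value tertiles
hospitals["value_tertile"] = pd.qcut(hospitals["value_score"], 3,
                                     labels=["low", "medium", "high"])
print(hospitals[["hospital_id", "value_score", "value_tertile"]])
```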

Results: We matched 2,947 patients enrolled in both registries, 646 (22%) of whom experienced adverse outcomes. Mean adjusted complication rate was 31% (±10.7%) at low-value hospitals and 14% (±4.6%) at high-value hospitals (p<0.001). Mean episode payments for all cases were $3,807 (17%) higher in low-value than high-value hospitals ($22,271 vs. $18,464, p<0.001). Among cases without adverse outcomes only, payments were still $2,257 (11%) higher in low-value hospitals ($19,424 vs. $17,167, p=0.04).

Conclusion: In elective colectomy, high-value hospitals achieve lower episode payments than low-value hospitals for cases both with and without complications, indicating mechanisms for increasing value beyond reducing complications alone. High-value hospitals had two-fold lower complication rates, but also achieved 11% savings in uncomplicated cases. Worthwhile targets to optimize value in elective colectomy may include enhanced recovery protocols or other interventions that increase efficiency in all phases of care.

 

68.10 Is Amiodarone Prophylaxis Necessary After Non-Anatomic Lung Resection?

B. R. Beaulieu-Jones2, E. D. Porter1, K. A. Fay1, E. Diakow2, R. M. Hasson1, T. M. Millington1, D. J. Finley1, J. D. Phillips1  1Dartmouth-Hitchcock,Thoracic Surgery,Lebanon, NH, USA 2Geisel School of Medicine at Dartmouth,Hanover, NH, USA

Introduction: Post-operative atrial fibrillation (POAF) after non-cardiac thoracic surgery is a common complication associated with increased morbidity and hospital stay. Most reports review POAF incidence in anatomic resections only; however, the use of non-anatomic (wedge) resections is increasing. Since 2015, our institution has used an amiodarone protocol for patients ≥65 years of age undergoing anatomic resection, resulting in a POAF incidence of 9%. We compared the incidence of POAF in our non-anatomic resection cohort with that in our anatomic resection cohort to assess the need for amiodarone prophylaxis following non-anatomic resection.

Methods: We performed a retrospective cohort study at a single tertiary referral center. All anatomic and wedge lung resections from January 2015 through April 2018 were selected. Anatomic resection patients ≥65 years of age, or at the discretion of the attending surgeon, were eligible to receive our amiodarone protocol: an immediate post-operative IV bolus of 300 mg given over 1 hour, followed by 400 mg PO TID x 3 days. We excluded patients with chronic atrial fibrillation or a contraindication to amiodarone (hypotension or electrical conduction abnormality). The primary outcome was incidence of POAF within 30 days. We compared anatomic and wedge resection patients using the two-sample, two-tailed Student's t-test for continuous data and Pearson's chi-squared test for categorical data.

Results: A total of 355 patients underwent lung resection, with 85% (300) undergoing an anatomic resection and 15% (55) a wedge resection. On comparative analysis, patients undergoing wedge resection were significantly younger (58.1±17.2 vs. 65.2±9.7 years, p<0.001) and had a shorter duration of surgery (141.4±55.8 vs. 271.2±81.4 mins, p<0.001) than those undergoing anatomic resection. There were no differences with regard to sex, comorbidities, preoperative pulmonary function tests, or length of stay (Table 1). Among wedge resection patients, only 3 received the amiodarone protocol, and none developed POAF. Among anatomic resection patients, over 89% of those ≥65 received amiodarone, and POAF occurred in 9% (28). POAF significantly increased the post-operative length of stay (6.9±4.1 vs. 4.2±4.5 days, p=0.003).

Conclusion: POAF continues to be a challenging problem after non-cardiac thoracic surgery. Amiodarone prophylaxis can reduce the incidence of POAF to 9% among anatomic resections. However, our data indicate that POAF following non-anatomic (wedge) resection is rare, and chemoprophylaxis appears unnecessary in this population.

 

68.09 Accuracy of Multidetector Computed Tomography in Preoperative Aortic Valve Annulus Sizing

S. Banerjee1, A. Das1, H. Zimmerman1, R. Jennings1, R. Boova1  1Temple University Hospital,Department Of Cardiothoracic Surgery,Philadelpha, PA, USA

Introduction:

Surgical aortic valve replacement (SAVR) may be associated with unanticipated intraoperative aortic pathology that is not identified by routine preoperative evaluation. Such findings may alter the conduct of SAVR. Preoperative multidetector computed tomography (MDCT) was adopted to mitigate unexpected intraoperative aortic findings.

MDCT is integral to preoperative sizing for transcatheter aortic valve replacement (TAVR). As TAVR emerged as an alternative to SAVR, our institutional TAVR MDCT protocol was implemented in preoperative SAVR assessment to avoid duplicate MDCT should findings reveal pathology more amenable to TAVR than SAVR.

The purpose of this study is to determine whether our institutional TAVR MDCT protocol accurately predicts aortic valve prosthesis size. The secondary objective is to determine whether there is a trend towards over- or under-sizing when MDCT is not consistent with implant size.

Methods:

Between July 2012 and July 2017, 102 patients who underwent SAVR had preoperative aortic valve sizing by MDCT. The aortic annulus diameter calculated using MDCT was compared to intraoperative valve sizing during SAVR. An implanted valve size within 1 mm of the MDCT-calculated size was regarded as an accurate prediction; if the implanted valve was outside the 1 mm range, it was classified as either smaller or larger. This threshold was chosen because valves used in SAVR are manufactured in 2 mm increments. To evaluate whether MDCT accuracy was affected by aortic valve annulus size, we stratified the valve diameters based on MDCT measurements into categories: 17.8-19.9, 20-21.9, 22-23.9, 24-25.9, and >26 mm. Statistical analysis was performed using SPSS software, and the paired t-test was used to evaluate whether the results were statistically significant.
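The within-1 mm classification and paired comparison described above could be sketched as follows; the measurement arrays are hypothetical, not patient data.

```python
import numpy as np
from scipy.stats import ttest_rel

mdct_mm = np.array([21.4, 23.0, 19.8, 25.1, 22.6])     # MDCT annulus diameters
implant_mm = np.array([21.0, 21.0, 19.0, 25.0, 21.0])  # implanted valve sizes

diff = implant_mm - mdct_mm
accurate = np.abs(diff) <= 1.0   # within 1 mm counts as an accurate prediction
smaller = diff < -1.0            # implant smaller than the MDCT estimate
larger = diff > 1.0
print(f"accurate: {accurate.sum()}, smaller: {smaller.sum()}, larger: {larger.sum()}")

# Paired t-test on per-patient MDCT vs. implant sizes
t_stat, p_value = ttest_rel(mdct_mm, implant_mm)
print(f"paired t-test: t={t_stat:.2f}, p={p_value:.4f}")
```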

Results:

Forty-one (40.2%) of the 102 patients studied had MDCT aortic valve measurements that were within 1 mm of implant size. Implanted valves were smaller than the MDCT calculation in 40 patients (39.2%) and larger in 21 patients (20.6%). MDCT measurements remained inconsistent with intraoperative sizing regardless of aortic annulus diameter. The variance between MDCT annulus measurements and intraoperative sizing was statistically significant (p < 0.0005), as determined by the paired t-test.

Conclusion:

Preoperative aortic annulus measurements by MDCT differed substantially from intraoperative sizing, and there was no trend towards over- or under-sizing. These results may impact preoperative planning for patients undergoing SAVR when MDCT is utilized. The implication of this finding for preoperative TAVR planning is indeterminate and may warrant further investigation.
 

68.08 The Modifying Effect of Insurance on Racial Disparities in CABG Utilization for AMI Patients

A. G. Sassine1, J. Nosewicz1, F. Aboona1, T. Siblini1, J. M. Clements1  1Central Michigan University College Of Medicine,Mount Pleasant, MI, USA

Introduction:
Racial disparities in the utilization of coronary artery bypass graft surgery (CABG) have been documented in certain parts of the country, but the influence of insurance status on these racial disparities has been inconsistently reported. Apart from regional studies documenting these disparities, to our knowledge, no studies have examined them at the national level. Our objective was to assess racial disparities in CABG utilization using national discharge data from the 2012 National Inpatient Sample (NIS), Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality. We hypothesize that minority populations are less likely than Whites to receive CABG and that these racial disparities persist when controlling for insurance.

Methods:
We identified 456,895 discharges with a diagnosis code for acute myocardial infarction (ICD-9-CM code 410.x1 or 410.x3); receipt of CABG was defined by a CABG procedure code (ICD-9-CM code 36.10-36.16 or 36.19) in any of the 15 procedure code fields. We ran logistic regression models to determine the influence of age, number of chronic conditions, gender, rural/urban patient location, race, and insurance status on undergoing CABG surgery.
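A minimal sketch of a logistic model with the race-by-insurance interaction described above, using the statsmodels formula API; the synthetic DataFrame, column names, and the random placeholder outcome are all hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(40, 90, n),
    "n_chronic": rng.integers(0, 10, n),
    "female": rng.integers(0, 2, n),
    "rural": rng.integers(0, 2, n),
    "race": rng.choice(["White", "Black", "Hispanic", "Asian/PI"], n),
    "insurance": rng.choice(["Private/HMO", "Medicare", "Medicaid"], n),
})
df["cabg"] = rng.integers(0, 2, n)  # placeholder outcome

# Main effects plus a race x insurance interaction term
model = smf.logit(
    "cabg ~ age + n_chronic + female + rural + C(race) * C(insurance)",
    data=df,
).fit(disp=0)
print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```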

Results:
Blacks (OR=0.794, 95% CI 0.742-0.849) were less likely than Whites to receive CABG when controlling for all demographic variables, including insurance status. This disparity persisted when including an interaction term between race and insurance, with Blacks on Medicaid being less likely to receive CABG than Whites with Private/HMO insurance (OR=0.777, 95% CI 0.690-0.874). Hispanics (OR=1.13, 95% CI 1.06-1.22) and Asian/Pacific Islanders (OR=1.55, 95% CI 1.40-1.70) were more likely to receive CABG than Whites. However, compared to Whites with Private/HMO insurance, Hispanics (OR=0.852, 95% CI 0.781-0.930) and Asians (OR=0.764, 95% CI 0.669-0.874) on Medicare were less likely to receive CABG, indicating that insurance status completely moderates the effect of race on CABG for these race/ethnic groups. Native Americans were as likely as Whites to receive CABG across all logit models.

Conclusion:
Disparities in CABG utilization for Black AMI patients were not explained by the interaction effect between race and insurance; however, insurance status appears to moderate the effects of race for Hispanics and Asian/Pacific Islanders. These findings suggest that policies be implemented to improve access to invasive revascularization procedures for African Americans. Future studies should evaluate why the positive effects of race for Hispanics and Asian/Pacific Islanders are negated by insurance status, specifically public insurance programs such as Medicare.
 

68.07 Impact of Hospital Volume on Readmissions Following Interventions for Hypertrophic Cardiomyopathy

H. Khoury1, Y. Sanaiha1, S. Rudasill1, H. Xing1, A. Mardock1, J. Antonios2, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles,Cardiothoracic Surgery,Los Angeles, CA, USA 2David Geffen School Of Medicine, University Of California At Los Angeles,Los Angeles, CA, USA

Introduction:  Unplanned readmissions are considered a marker for quality of care and are associated with increased resource utilization. The literature on readmissions following alcohol septal ablation (ASA) and septal myectomy (SM) for the treatment of hypertrophic cardiomyopathy remains limited, and the impact of center volume on such parameters is poorly characterized.

Methods:  The Nationwide Readmissions Database (2010-2015), containing approximately 17 million annual discharges in the U.S., was used to identify all adult (>18 years) patients with a diagnosis of hypertrophic cardiomyopathy. Diagnosis and procedural codes were used to identify patients undergoing ASA and SM. Hospitals were stratified by annual ASA/SM case volume into low (LVH), medium (MVH), and high (HVH) volume tertiles. Student's t-tests and chi-squared tests were used to analyze baseline continuous and categorical variables, respectively. The independent impact of hospital volume on mortality and readmission was determined using multivariable logistic regression.
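The volume-tertile stratification and adjusted readmission model described above might be sketched as follows; the data, covariates, and placeholder outcome are hypothetical, not NRD records.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "hospital_volume": rng.integers(1, 60, n),  # annual ASA/SM cases
    "age": rng.integers(18, 90, n),
    "female": rng.integers(0, 2, n),
    "readmit_30d": rng.integers(0, 2, n),       # placeholder outcome
})
# Tertiles of annual case volume: low / medium / high volume hospitals
df["volume_tertile"] = pd.qcut(df["hospital_volume"], 3,
                               labels=["LVH", "MVH", "HVH"])

# Adjusted odds of 30-day readmission relative to high-volume hospitals
model = smf.logit(
    "readmit_30d ~ C(volume_tertile, Treatment(reference='HVH')) + age + female",
    data=df,
).fit(disp=0)
print(np.exp(model.params))  # adjusted odds ratios vs. the HVH reference
```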

Results: Of 7,957 patients who underwent septal reduction procedures, 4,870 (61.2%) underwent alcohol septal ablation and 2,727 (38.8%) underwent septal myectomy. Patients who underwent ASA at a LVH experienced higher rates of readmission (12.6 vs. 10.0 vs. 7.0%, P<0.001), emergent index admission (50.2 vs. 30.8 vs. 43.4%, P<0.001), overall in-hospital complications (28.1 vs. 24.5 vs. 20.6%, P=0.012), comorbid atrial fibrillation (51.2 vs. 37.9 vs. 43.2%, P=0.005), and renal failure (12.9 vs. 11.6 vs. 8.2%, P=0.030) than those at MVH and HVH. Additionally, length of stay (5.2 vs. 4.4 vs. 4.2 days, P=0.033) and index ASA costs ($28,022 vs. $25,089 vs. $23,792, P=0.039) were greater at LVH. In contrast, patients who underwent SM at a LVH experienced lower rates of in-hospital complications (37.8 vs. 34.1 vs. 45.6%, P=0.049) and comorbid coagulopathy (8.5 vs. 19.0 vs. 23.5%, P<0.001) than patients who underwent SM at a HVH. For both procedures, rates of in-hospital mortality were not significantly different between hospital tertiles. While hospital SM volume was not identified as an independent predictor of thirty-day readmission, ASA performed at a LVH (adjusted OR 1.96, 95% CI 1.20-3.21) or MVH (adjusted OR 1.55, 95% CI 1.02-2.35) was an independent predictor of emergent thirty-day readmission compared to HVH (Figure).

Conclusion: Low and medium hospital volume were associated with increased thirty-day readmission following ASA, but not SM. Patients with hypertrophic cardiomyopathy should be referred to experienced centers for ASA to reduce readmission rates and decrease national healthcare expenditures.

 

68.06 Pre-Op IABP Placement Rates in Coronary Artery Bypass Grafting Patients by Day of Admission

G. A. Del Carmen1, A. Axtell1, D. Chang1, S. Melnitchouk2, T. M. Sundt2, A. G. Fiedler2  1Massachusetts General Hospital,Department Of Surgery,Boston, MA, USA 2Massachusetts General Hospital,Division Of Cardiac Surgery,Boston, MA, USA

Introduction:  Intra-aortic balloon pumps (IABPs) can be utilized to provide hemodynamic support in high-risk patients awaiting coronary artery bypass grafting (CABG). There are many indications for IABP, and institutional practice patterns regarding their placement are variable. As a result, preoperative placement of an IABP in a patient awaiting CABG is not standardized and may vary according to non-clinical factors. We hypothesized that the rate of IABP placement varies by day of the week.

Methods:  A retrospective cohort analysis of the Office of Statewide Health Planning and Development database from 2006-2010 was performed. All patients admitted for CABG were included. Patients who died within 24 hours of admission and those who had absolute contraindications to IABP placement were excluded. The primary outcome was preoperative IABP placement versus non-placement. A multivariable logistic regression analysis to identify predictors of IABP placement was performed, adjusting for patient demographics, clinical factors, and system variables.

Results: A total of 46,347 patients underwent CABG, of whom 7,695 (16.60%) had an IABP placed preoperatively. On unadjusted analysis, IABP rates were significantly higher on weekends than weekdays (20.83% vs. 15.70%, p < 0.001). On adjusted analysis, patients awaiting CABG were 1.31 times more likely to have an IABP placed on weekends than on weekdays (OR 1.31, 95% CI 1.23-1.40, p < 0.001).

Conclusion: The odds of preoperative IABP placement prior to CABG are significantly increased on weekends compared to weekdays, even when controlling for clinical factors. Further exploration of this phenomenon and its associations is warranted.

 

68.05 Correlation Between Air Quality and Lung Cancer Incidence: A County By County Analysis

B. D. Hughes1, S. Maharsi1, H. Mehta1, S. Klimberg1, D. S. Tyler1, I. C. Okereke2  1University Of Texas Medical Branch,Department Of Surgery,Galveston, TX, USA 2University Of Texas Medical Branch,Division Of Cardiothoracic Surgery,Galveston, TX, USA

Introduction:
Lung cancer is the leading cause of cancer-related death, with geographic variability in its incidence. Poor air quality has previously been associated with lung cancer development, but the risk associated with regional differences in air quality is poorly understood. We hypothesized that differences in lung cancer incidence across Texas counties would be associated with air quality indicators in those counties.

Methods:
For each county in Texas (n = 254), lung cancer incidence, air quality indicators (average fine particulate matter smaller than 2.5 micrometers [PM2.5] and radon levels), and known risk factors were obtained using data from the Texas Commission on Environmental Quality and the Texas Cancer Registry. Linear regression models were constructed to determine the association of air quality indicators with lung cancer incidence and advanced stage at diagnosis (stage III or IV), while controlling for county-level sociodemographic characteristics and smoking rates.
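A minimal sketch of the county-level linear model described above; the DataFrame below is synthetic and stands in for the merged TCEQ / Texas Cancer Registry data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 254  # Texas counties
counties = pd.DataFrame({
    "pm25": rng.uniform(5, 15, n),           # mean PM2.5 (hypothetical units)
    "radon": rng.uniform(0.5, 4.0, n),       # mean radon level
    "smoking_rate": rng.uniform(0.10, 0.30, n),
    "median_income": rng.uniform(30_000, 90_000, n),
})
# Synthetic outcome: cases per 100,000 with a built-in PM2.5 effect
counties["lung_ca_incidence"] = (
    30 + 4 * counties["pm25"] + rng.normal(0, 10, n)
)

model = smf.ols(
    "lung_ca_incidence ~ pm25 + radon + smoking_rate + median_income",
    data=counties,
).fit()
print(model.params)  # the pm25 coefficient is the adjusted beta per unit PM2.5
```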

Results:
Lung cancer incidence ranged from 27.6 to 103.4 cases per 100,000 people (Figure 1). After controlling for risk factors, PM2.5 was associated with increased lung cancer incidence (β = 4.38, p < 0.0001), whereas radon levels were not significantly associated with lung cancer incidence (β = -2.70, p = 0.41). Air quality indicators (PM2.5 and radon level) were not significantly associated with advanced stage at diagnosis.

Conclusion:
There are wide differences in incidence of lung cancer across Texas.  These differences appear to be related to air quality.  Identifying high-risk areas may help to guide strategies such as implementation of targeted lung cancer screening programs.
 

68.04 Impact of Hospital Safety-Net Status on Failure to Rescue after Major Cardiac Surgery

Y. Sanaiha1, A. Mantha1, H. Khoury1, S. Rudasill1, H. Xing1, A. L. Mardock1, B. Ziaeian2, R. Shemin1, P. Benharash1  1University Of California – Los Angeles,Division Of Cardiac Surgery,Los Angeles, CA, USA 2University Of California – Los Angeles,Division Of Cardiology,Los Angeles, CA, USA

Introduction: An increasingly utilized metric for assessing hospital quality is the ability to rescue patients after complications occur. While hospital safety-net status has been associated with inferior surgical outcomes and higher costs, the mechanism of this discrepancy is not well understood. We hypothesized that discrepant rates of failure to rescue following complications of routine cardiac surgery would explain the inferior outcomes observed at safety-net hospitals.

Methods: The 2005-2014 National Inpatient Sample was used to identify adult patients undergoing elective coronary artery bypass grafting and isolated or concomitant valvular operations. Hospitals were stratified into low (LBH), medium (MBH), or high (HBH) burden categories based on the proportion of uninsured or Medicaid patients, to emulate safety-net status as defined by the Institute of Medicine. Cardiovascular, respiratory, renal, hemorrhagic, and infectious complication rates were calculated categorically and as a composite variable, minor/major composite complication (MMC). Failure to rescue (FTR) was defined as mortality after occurrence of a complication. Multivariable logistic regression was utilized to build risk-adjusted predictive models of complications and FTR. The incremental adjusted cost of MMC was calculated using a linear regression model.
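The FTR definition above (death restricted to patients who had a complication) can be made concrete with a short sketch; the patient-level DataFrame is hypothetical.

```python
import pandas as pd

patients = pd.DataFrame({
    "complication": [1, 1, 0, 1, 0, 1, 1, 0],
    "died":         [1, 0, 0, 0, 0, 1, 0, 0],
    "burden_group": ["HBH", "HBH", "HBH", "LBH",
                     "LBH", "LBH", "HBH", "HBH"],
})

# FTR rate = death rate among patients who experienced a complication
with_complication = patients[patients["complication"] == 1]
ftr_by_group = with_complication.groupby("burden_group")["died"].mean()
print(ftr_by_group)  # e.g., HBH vs. LBH failure-to-rescue rates
```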

Results: Of an estimated 1,521,129 patients undergoing elective major cardiac operations, 2% experienced mortality while 36.1% suffered MMC. Compared to LBH patients, the HBH cohort was younger (64.1 vs. 66.7 years, P<0.0001), more commonly female (31.4 vs. 30.0%, P<0.0001), and had a higher incidence of diabetes (35.0 vs. 29.7%, P<0.0001) and morbid obesity (5.9 vs. 3.9%, P<0.0001). As shown in Figure 1, safety-net hospitals had higher odds of several complications, including tamponade and new dialysis, with a concurrent higher risk of complication-related FTR. In contrast, HBH had higher odds of FTR for respiratory complications despite a lower adjusted risk of this complication category. Occurrence of MMC at HBH was associated with a $2,494 higher cost than at LBH, which would translate into a cost savings of $42.5 million for MMC if HBH had costs and complication rates comparable to LBH.

Conclusion: Safety-net hospitals were associated with higher FTR after occurrence of cardiovascular and renal complications. Despite elevated odds of septicemia at HBH, rescue from this complication was superior to LBH. Implementation of care bundles targeting cardiovascular, respiratory, and renal complications may reduce the discrepancy in incidence and rescue of complications at safety-net institutions.

 

68.03 Patient Perceptions of Nurse Communication in HCAHPS Survey Predict 30-Day CABG Mortality Rates

S. J. Masoud1, O. K. Jawitz2, H. R. Phillips3, P. J. Mosca2  1Duke University Medical Center,School Of Medicine,Durham, NC, USA 2Duke University Medical Center,Department Of Surgery,Durham, NC, USA 3Duke University Medical Center,Department Of Medicine, Division Of Cardiology,Durham, NC, USA

Introduction: There is mounting evidence that safety culture and quality of communication within hospitals are linked to patient outcomes. Of the outcomes publicly reported by Medicare through its Hospital Compare website, only 30-day mortality following coronary artery bypass grafting (CABG) is attached to a specific surgical procedure. Communication quality is in part measured by the 25-item Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey, which Medicare uses in adjusting reimbursement to over 3,000 hospitals nationwide. We aimed to assess the relationship between HCAHPS patient ratings of doctor or nurse communication and CABG mortality.

Methods: HCAHPS and complications data were extracted from Medicare Hospital Compare (July 2018 update). Hospitals without CABG mortality data were excluded from the analysis. HCAHPS ratings were defined as the percent of surveys reporting that providers "always" (rather than usually, sometimes, or never) communicated well. Pearson correlation, and partial correlation with multiple regression modeling controlling for case volume, measured the association of these ratings with log-transformed 30-day CABG mortality rates. Archived data (2014-2018) were used to explore the reciprocal effect of CABG mortality on HCAHPS ratings over time in repeated measures ANOVA with post-hoc main effects tests.
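A partial correlation controlling for case volume can be computed by correlating regression residuals; a minimal sketch follows, with synthetic arrays standing in for the Hospital Compare extract.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n = 1017
volume = rng.uniform(50, 800, n)                 # hypothetical CABG case volumes
nurse_rating = rng.uniform(60, 95, n)            # % reporting "always" communicated well
log_mortality = 1.2 - 0.005 * nurse_rating + rng.normal(0, 0.2, n)

def residuals(y, x):
    """Residuals of y after removing a linear effect of x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partial correlation of nurse rating and log mortality, given volume
r, p = pearsonr(residuals(nurse_rating, volume),
                residuals(log_mortality, volume))
print(f"partial r={r:.3f}, p={p:.4f}")
```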

Results: Among 4,973 hospitals, 1,017 had available CABG data and were included in the study. Ratings of nurse and doctor communication each correlated inversely with CABG mortality (Pearson's r = -0.132, p < 0.001 and r = -0.066, p = 0.035). When controlling for CABG case volume, only ratings for nurses correlated significantly (r = -0.092, p < 0.01), with multiple regression predicting a 0.7% decrease in CABG mortality for each 1% increase in HCAHPS ratings (R2 = 0.074, p < 0.001). Repeated measures ANOVA (figure) showed that improvement of HCAHPS ratings depended on whether a hospital ranked in the top or bottom half of included hospitals by CABG mortality (p = 0.004, ηp2 = 0.005). While both groups had comparable nurse communication ratings in 2014 (p = 0.849), low-mortality hospitals had significantly higher ratings relative to high-mortality hospitals by 2018 (p = 0.025).

Conclusions: HCAHPS patient ratings of nurse communication, but not doctor communication, had a small albeit significant inverse relationship with 30-day CABG mortality, even when controlling for CABG case volume. Moreover, low mortality hospitals demonstrated greater improvement in nurse communication ratings over time. Though not establishing causal relationships, our study suggests that a better understanding of how frontline staff communicate with patients may also inform efforts to improve surgical outcomes.

68.02 Characterization of Unplanned Early Readmission Following Extracorporeal Membrane Oxygenation

B. Kavianpour1,2, Y. Sanaiha1, H. Khoury1, S. E. Rudasill1, R. Jaman1, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles,Cardiothoracic Surgery,Los Angeles, CA, USA 2Stony Brook University School of Medicine,Department of Medicine,Stony Brook, NY, USA

Introduction:

With increasing dissemination and improved survival following extracorporeal membrane oxygenation (ECMO), reducing readmissions after ECMO hospitalization is an imminent priority. Early readmissions following hospitalizations requiring ECMO have not been characterized at the national level. The present study aimed to identify predictors of early readmission in the largest all-payer national discharge database.

Methods:

This was a retrospective cohort study using the Nationwide Readmissions Database to identify all adult patients (≥18 years) who underwent ECMO from 2010-2015 and survived the index hospitalization. Patients were stratified as requiring ECMO for cardiac or respiratory etiologies: cardiac ECMO included post-cardiotomy and cardiogenic shock patients, while respiratory patients had no other concurrent cardiac diagnosis. All heart and lung transplant patients were excluded. The primary outcome of the study was early (30-day) rehospitalization after the index ECMO encounter. Univariate analyses were performed for age, Elixhauser comorbidity index, and cost of readmission. A multivariable logistic regression model was developed to predict the odds of urgent 30-day readmission.

Results:

Of an estimated 9,391 discharged patients who underwent ECMO, 4,352 (46.3%) required ECMO for primary cardiac indications while 3,669 (39.1%) required ECMO for primary respiratory failure. Unplanned readmission within 30 days of discharge was similar across the cardiovascular and respiratory ECMO groups (18.3 vs. 16.2%, P=0.20). Readmission status was not associated with age in either patient population (cardiac: 60.2 vs. 58.8 years, P=0.19; respiratory: 46.0 vs. 43.5 years, P=0.06). Readmitted patients had a higher comorbidity score for cardiac indications but not for respiratory indications compared to the non-readmitted cohort (cardiac: 5.8 vs. 5.2, P<0.01; respiratory: 5.0 vs. 4.9, P=0.77). Coronary artery disease (CAD) was a significant predictor of readmission within 30 days for both cardiac and respiratory indications (Table 1). Renal failure and bleeding were significant predictors of readmission for cardiac indications, while a prolonged length of stay (>10 days) and infection were significant predictors for respiratory indications. The mean cost of urgent 30-day readmission was $174,713.40 (SE $10,280.97) for cardiac ECMO and $242,422.30 (SE $12,632.89) for respiratory ECMO.

Conclusion:

CAD, renal failure, and complications during ECMO, such as bleeding and infection, place patients at the highest risk for readmission within 30 days. Given the high costs of readmission following ECMO, attention to discharge processes and outpatient care for this vulnerable population is warranted.

68.01 Lower Episode Payments for Transcatheter versus Surgical Aortic Valve Replacement

P. K. Modi1, M. Oerline1, D. Sukul3, C. Ellimoottil1, V. B. Shahinian2, B. K. Hollenbeck1  1University Of Michigan,Urology,Ann Arbor, MI, USA 2University Of Michigan,Nephrology,Ann Arbor, MI, USA 3University Of Michigan,Cardiology,Ann Arbor, MI, USA

Introduction:  While transcatheter aortic valve replacement (TAVR) was initially developed as a treatment option for patients ineligible for surgical aortic valve replacement (SAVR), its indications have expanded to include patients who would be candidates for surgery. As the use of TAVR continues to expand, it is essential to understand the economic impact of this substitution of TAVR for SAVR in real-world clinical practice. Therefore, we examined Medicare payments for TAVR and SAVR in episodes spanning from 90 days before surgery through 90 days after surgery.

Methods: We used a 20% national sample of fee-for-service Medicare beneficiaries who underwent TAVR or SAVR from 2012 through 2015. We used negative binomial regression models adjusted for age, race, sex, baseline health status (using Hierarchical Condition Categories risk score), socioeconomic class, and place of residence to estimate spending differences between TAVR and SAVR. We also examined the components of episodes to identify specific differences between the spending associated with these procedures. Finally, we assessed the effect of patient health status on the association between procedure type and payments.
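A minimal sketch of a negative binomial payment model of the kind described above, via the statsmodels GLM interface; the data, covariate names, and synthetic payments are hypothetical, not Medicare claims.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 3000
df = pd.DataFrame({
    "tavr": rng.integers(0, 2, n),       # 1 = TAVR, 0 = SAVR
    "age": rng.integers(65, 95, n),
    "female": rng.integers(0, 2, n),
    "hcc_score": rng.uniform(0.5, 3.0, n),
})
# Synthetic episode payments in dollars (roughly lognormal)
df["payment"] = rng.lognormal(mean=10.9, sigma=0.3, size=n).round()

model = smf.glm(
    "payment ~ tavr + age + female + hcc_score",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
# Exponentiated coefficient = incidence rate ratio (IRR) for TAVR vs. SAVR
print(np.exp(model.params["tavr"]))
```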

Results: We identified 6,455 patients who underwent TAVR (34.3%) and 12,349 patients who underwent SAVR (65.7%) during the study period. The use of TAVR increased from 20.5% of all aortic valve replacements in 2012 to 46.7% in 2015. As TAVR replaced SAVR for the highest risk patients, the average baseline health status improved for both groups. Total adjusted TAVR episode payments were approximately 7% lower than payments for SAVR ($55,545 [95% confidence interval {95%CI} $54,643-56,446] vs $59,467 [95%CI $58,723-60,211], p<0.001). TAVR patients had higher pre-operative payments (Incidence rate ratio [IRR] 1.22 [95%CI 1.17-1.26], p<0.001), but lower payments during (IRR 0.96 [95%CI 0.94-0.98], p<0.001) and after the initial hospitalization for surgery (IRR 0.73 [95%CI 0.68-0.77], p<0.001). Episode payments increased with increasing comorbidity score, but this effect was greater for SAVR than TAVR.

Conclusion: After adjusting for patient factors, TAVR is associated with lower episode spending than SAVR due to savings during and after the initial hospitalization. As baseline health status of treated patients improves, the savings associated with TAVR relative to SAVR diminish.

 

67.10 Rapid Detection of Clostridium difficile Toxins in Stool by Raman Spectroscopy

S. Koya1, M. Brusatori1, S. Yurgelevic1, C. Huang1, L. N. Diebel2, G. W. Auner2  1Wayne State University,Smart Sensors And Integrated Microsystems, Michael And Marian Ilitch Department Of Surgery,Detroit, MI, USA 2Wayne State University,Michael And Marian Ilitch Department Of Surgery,Detroit, MI, USA

Introduction:
Clinical practice guidelines define Clostridium difficile infection (CDI) as diarrhea (≥3 unformed stools in 24 hrs.) with either a positive C. difficile stool test or detection of pseudomembranous colitis. The pathogenicity of CDI is due to its toxins, toxin A (TcdA) and toxin B (TcdB). While the presence of toxin is necessary for disease, detection of toxins using immunoassays is complex and lacks sensitivity. For this reason, toxigenic culture (TC) and nucleic acid amplification testing (NAAT) of stool are used. These tests are confounded by asymptomatic colonization with toxigenic C. difficile and lead to overdiagnosis of CDI. Raman spectroscopy (RS) is a novel technology used to detect bacteria and their toxins, and it does not require antibodies for toxin detection. We hypothesize that RS may be a sensitive method to detect C. difficile toxins in stool and thereby reduce overdiagnosis of CDI.

Methods:
CDI-negative stool samples were spiked with varying concentrations of TcdA and TcdB. RS was performed on a smear of stool on a mirror-polished stainless-steel slide. RS of feces is difficult due to confounding background material and autofluorescence, so the samples were photo-bleached to reduce autofluorescence. Raman spectra were obtained, background-corrected, vector-normalized, and analyzed by machine learning methods including Support Vector Machine (SVM), Random Forest (RF), and Partial Least Squares Linear Discriminant Analysis (PLS-LDA). The best model was chosen, and its accuracy was measured by train-test, cross-validation, and bootstrap methods.
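The classification step described above (vector-normalized spectra fed to a linear-kernel SVM with cross-validation) could be sketched as follows; the spectra here are synthetic stand-ins for Raman data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import normalize
from sklearn.svm import SVC

rng = np.random.default_rng(5)
n_spectra, n_wavenumbers = 60, 500
spectra = rng.normal(size=(n_spectra, n_wavenumbers))
labels = rng.integers(0, 2, n_spectra)  # 1 = toxin-spiked, 0 = control
spectra[labels == 1, 100:120] += 0.8    # synthetic "toxin peak" region

X = normalize(spectra)                  # vector (L2) normalization per spectrum
clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```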

Results:
At a concentration of 1 ng/ml, TcdA- and TcdB-spiked stool were distinguished from control stool by all models with varying accuracies. SVM with a linear kernel performed best for TcdA with an accuracy of 85%, and PLS-LDA performed best for TcdB with 75% accuracy.

Conclusion:
RS of feces is difficult due to autofluorescence. Despite this difficulty, RS successfully detected TcdA and TcdB in stool samples. Autofluorescence could be further decreased by using diluted stool samples, and the accuracy of the separation could be increased by deep learning algorithms. Thus, RS has the potential to rapidly detect C. difficile toxins in stool at clinically relevant concentrations, serve as a point-of-care diagnostic tool, and reduce overdiagnosis of CDI.
 

67.09 UCH-L1 Is a Serum Biomarker for Peripheral Nerve Injury in a Model of Extremity Ischemia

A. P. Bercz1, M. C. Morris1, F. Kassam1, R. Veile1, L. Friend1, R. Schuster1, T. A. Pritts1, A. T. Makley1, M. D. Goodman1  1University Of Cincinnati,Department Of Surgery, College Of Medicine,Cincinnati, OH, USA

Introduction:  Ubiquitin carboxy-terminal hydrolase L1 (UCH-L1) is specifically expressed by neurons, and its release reflects the neuronal response to injury. In humans and murine models, UCH-L1 is elevated in serum and cerebrospinal fluid following traumatic brain injury (TBI), hypoxic encephalopathy, and intracerebral hemorrhage, but it is unknown whether UCH-L1 is specific to central nervous system injury. Our lab has recently demonstrated that UCH-L1 is significantly elevated in a murine polytrauma model, regardless of TBI. In this study, we hypothesized that UCH-L1 would function as a marker of peripheral nerve injury induced by extremity ischemia.

Methods:  Mice were anesthetized with pentobarbital and the left femoral vessels were isolated. Mice were then subjected to either femoral artery isolation alone (sham), femoral artery ligation, femoral vein ligation, femoral artery ligation after 60 minutes of controlled hemorrhagic shock followed by resuscitation with shed blood, or four hours of ischemia with a femoral artery clamp followed by reperfusion (I/R). Mice were sacrificed at 4, 24, or 72 hours after treatment. Serum was analyzed using enzyme-linked immunosorbent assays for UCH-L1, neuron-specific enolase (NSE), and creatine kinase (CK), the latter two being known biomarkers of neuronal and muscle injury, respectively. The bilateral quadriceps muscles were harvested and analyzed using hematoxylin and eosin staining.

Results: UCH-L1 was significantly increased in the hemorrhagic shock/resuscitation group at 4 hours compared to the sham, artery ligation, vein ligation, and I/R groups (Figure 1). No other group had significant UCH-L1 elevation compared to sham at 4 hours. In the setting of hemorrhagic shock, UCH-L1 elevation persisted through 72 hours. NSE was not as sensitive to these ischemic changes. CK followed a temporal response profile similar to that of UCH-L1 and was significantly increased in the hemorrhagic shock/resuscitation group relative to the other groups at 4 hours. Hematoxylin and eosin staining of the quadriceps revealed muscle necrosis in the hemorrhagic shock/resuscitation group only, demonstrated by increased inflammatory infiltrate surrounding enucleated fascicles.

Conclusion: The acute elevation of UCH-L1 in response to isolated, controlled hemorrhagic shock and resuscitation suggests that UCH-L1 is not specific for TBI or central nervous system injury. The lack of NSE elevation in response to femoral vessel ligation suggests that NSE may be a more specific biomarker for TBI than peripheral nerve injury. UCH-L1 may be utilized as an early marker of neuronal injury in the context of hemorrhagic shock, polytrauma, and isolated extremity ischemia.

 

67.08 The Role of Chemoprophylactic Agents in Modulating Hypercoagulability after Traumatic Brain Injury

F. Kassam1, M. C. Morris1, A. Bercz1, R. Veile1, L. Friend1, N. Beckmann1, C. C. Caldwell1, M. D. Goodman1  1University Of Cincinnati,Department Of Surgery, College Of Medicine,Cincinnati, OH, USA

Introduction:  The pathophysiology behind the subacute but persistent hypercoagulable state following traumatic brain injury is poorly understood but contributes to morbidity from venous thromboembolism (VTE). Because platelets and their microvesicles have been hypothesized to play a role in posttraumatic hypercoagulability, administration of commonly utilized agents for inflammation, VTE chemoprophylaxis, and sphingolipid modulation may ameliorate this hypercoagulability. We hypothesized that aspirin, ketorolac, amitriptyline, unfractionated heparin, and enoxaparin would modulate the platelet and whole blood coagulation response following traumatic brain injury.

Methods:  A standard weight-drop system was utilized to induce concussive traumatic brain injury (TBI) in mice. Following injury, mice were randomized into drug treatment groups to receive aspirin (100 mg/kg), ketorolac (5 mg/kg), amitriptyline (10 mg/kg), heparin (75 IU/kg), enoxaparin (3 mg/kg), or saline control (100 µL) at 2 and 8 hours post-TBI. Mice were then sacrificed at 6 or 24 hours after injury, and blood was drawn via cardiac puncture to determine coagulability by thromboelastometry (EXTEM and FIBTEM), platelet function by impedance aggregometry, and microvesicle counts by nanoparticle tracking analysis.

Results: Thromboelastometry demonstrated that the platelet contribution to maximum clot firmness (%MCF-Platelet) at 6 hours (Figure 1) was significantly higher in mice that received aspirin (69%, p<0.002) or amitriptyline (68%, p<0.007) than in mice that received saline (57%). At 24 hours, %MCF-Platelet remained significantly higher in mice that received amitriptyline (66%, p=0.04) than in those that received saline (63%). Overall ADP- and arachidonic acid-induced platelet aggregation was significantly lower in mice receiving ketorolac, aspirin, and amitriptyline than in mice receiving saline at 6 hours post-injury. By 24 hours after injury, mice that received aspirin (40 a.u., p<0.005), ketorolac (38 a.u., p=0.02), or enoxaparin (35 a.u., p=0.04) had significantly lower ADP-induced platelet aggregation than saline control mice (54 a.u.). However, there was no difference in the total or percentage of platelet-derived (CD41+) microvesicles between any treatment groups at either 6 or 24 hours.

Conclusion: Following traumatic brain injury, amitriptyline decreased platelet aggregability and increased contribution to clot in a manner similar to aspirin. The effect of amitriptyline on platelet function reflects a possible role of acid sphingomyelinase in the hypercoagulability observed following injury and suggests sphingolipid metabolism as a novel target for multimodal VTE chemoprophylaxis. Additionally, inhibition of platelet reactivity may be an underappreciated benefit of low molecular weight heparins, such as enoxaparin, compared to unfractionated heparin use.

 

67.07 Developing Collateral Arteries Unravel Their Internal Elastic Laminae for Diameter Growth

R. M. McEnaney1,2, D. D. McCreary1, E. Tzeng1,2  1VA Pittsburgh Healthcare System,Vascular Surgery,Pittsburgh, PA, USA 2University Of Pittsburgh School of Medicine,Vascular Surgery,Pittsburgh, PA, USA

Introduction:   Collateral artery growth is a natural and sometimes life-preserving response to arterial occlusive disease, because collateral arteries maintain end-organ perfusion once a conductance artery becomes occluded. "Outward remodeling" describes a process of cellular activity and matrix turnover that expands the vessel diameter. For outward remodeling to occur, matrix constraints on arterial lumen diameter must be released. These constraints exist because of the matrix structure of an artery: elastic fibers coalesce to form lamellae and are the major structural elements within the intima and media. These matrix elements create the elasticity and the luminal topography characteristic of arteries. However, it is likely that the elastic lamellar structure also constrains diameter and must be degraded to achieve outward remodeling.

Methods:   A modified procedure of hind limb ischemia was performed in rats, as per our previously published report.  Animals were euthanized, and collateral artery tissues were harvested at up to 12 weeks and preserved in paraformaldehyde.  Microscopy was performed with an Olympus FluoView MPE Multiphoton microscope with second-harmonic generation to view elastin and collagen structure.

Results:  Striking structural alterations occur along with arterial diameter expansion and persist at 12 weeks (Figure). In collateral vessels whose diameter increased up to threefold, we observed a change in the internal elastic lamina from a nearly continuous, wrinkled, and fenestrated sheet to a web-like appearance. Collagen orientation in collateral arteries also appeared perturbed, with straightening of fiber groups rather than the typical wavy, ribbon-like appearance.

Conclusion:  Outward remodeling is an important vascular adaptation capable of producing functional collateral arteries, which are relevant to patients with cardiovascular diseases. This study shows that during outward remodeling, the normal elastic lamellar structure of an artery becomes irrevocably altered, which also changes the topography of the luminal surface of the collateral artery. Elastic fibers, structural ECM components unique to vertebrates, are integral to the function of the cardiovascular system and require a complex assembly known to occur only in development. Perhaps more clinically relevant is understanding the mechanisms underlying this elastic remodeling, in order to enhance collateral artery growth for patients with arterial occlusive disease.

 

67.06 Lipidomic Analysis of Exosomes in Mesenteric Lymph after Intestinal Ischemia-Reperfusion Injury

A. Senda1, K. Morishita1, M. Kojima1, S. Doki2, M. Yagi1, T. Kobayashi2, J. Aiboshi1, Y. Otomo1  1Tokyo Medical And Dental University,Departments Of Acute Critical Care And Disaster Medicine,Bunkyo-ku, Tokyo, Japan 2Ochanomizu University,Faculty Of Science,Bunkyo-Ku, TOKYO, Japan

Introduction: Intestinal ischemia-reperfusion (IR) leads to gut barrier failure that initiates a systemic inflammatory response resulting in acute lung injury and multiple organ dysfunction syndrome. Previous studies have shown that mesenteric lymph (ML) has a crucial role in driving gut-mediated inflammation after IR. Our group has reported that lipid mediators contained in ML, such as lysophosphatidylcholine (LPC) and arachidonic acid, mediate organ injury following trauma/hemorrhagic shock (T/HS) via activation of immune cells. Concurrently, research has implicated exosomes as major components of the inflammatory response. Therefore, we investigated the bioactivity of exosomes released into ML during intestinal IR injury. We also analyzed the lipid profiles of ML exosomes to identify the specific mediators they carry.

Methods: Male Sprague-Dawley rats underwent laparotomy followed by ML duct cannulation. Animals were then subjected to 60 minutes of superior mesenteric artery clamping followed by de-clamping to induce gut IR injury. ML was obtained before (Pre) and after IR (Post-IR). Exosomes were isolated from ML by ultracentrifugation, and their biological activity in each period was measured by monocyte NF-κB activation. Exosomal lipids in each group were extracted using the Bligh and Dyer method and analyzed by liquid chromatography electrospray ionization mass spectrometry.

Results: The pro-inflammatory activity of exosomes harvested after IR was confirmed by an increase in NF-κB activation (5.6-fold increase compared to control). Lipid analysis of the exosome fraction of ML revealed a significant increase in the concentration of polyunsaturated fatty acid (PUFA; C18:2, C20:4)-containing LPCs after IR compared with control ML exosomes (see Figure).

Conclusion: In response to IR injury, the exosome fraction of ML shows a significant increase in lipid mediators and triggers inflammation.