64.08 Preoperative Intraaortic Balloon Support: Is it Necessary for Significant Left Main Disease?

A. Fiedler1, C. Walsh1, G. Tolis1, G. Vlahakes1, T. MacGillivray1, T. Sundt1, S. Melnitchouk1  1Massachusetts General Hospital, Cardiac Surgery, Boston, MA, USA

Introduction: Some practitioners routinely place an intra-aortic balloon pump (IABP) preoperatively in high-risk patients undergoing coronary artery bypass grafting (CABG), such as those with significant left main (LM) disease identified in the setting of acute coronary syndrome. The value of this practice in the current era, with its increasing burden of comorbid conditions including peripheral vascular disease, has not been examined.

Methods: We identified 495 patients presenting with significant LM disease (≥50%) and acute coronary syndrome undergoing CABG between January 2002 and April 2014 using our institutional prospective cardiac surgery database. Of these, 198 patients had an IABP placed preoperatively (IABP group) while the other 297 did not (No-IABP group). Operative mortality (30-day or in-hospital) and major complications were compared unadjusted and after adjustment using a propensity score based on 25 pre-specified variables.
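For readers unfamiliar with the matching approach described above, the sketch below illustrates propensity-score estimation and 1:1 nearest-neighbor matching. It is a minimal illustration on synthetic data, not the study's code; the three covariates and all variable names are hypothetical stand-ins for the 25 pre-specified variables.

```python
# Minimal sketch of propensity-score matching (synthetic data; the
# covariates are hypothetical stand-ins for the 25 pre-specified
# variables). IPTW would reuse the same scores as weights 1/ps and
# 1/(1-ps) for treated and untreated patients, respectively.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 495
df = pd.DataFrame({
    "age": rng.normal(68, 10, n),
    "lvef": rng.normal(55, 14, n),
    "stemi": rng.integers(0, 2, n),
})
# Treatment assignment depends on baseline severity (illustrative).
df["iabp"] = (rng.random(n) < 0.25 + 0.25 * df["stemi"]).astype(int)
covariates = ["age", "lvef", "stemi"]

# 1. Estimate the propensity score P(IABP = 1 | covariates).
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["iabp"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Greedy 1:1 nearest-neighbor matching without replacement.
controls = df[df["iabp"] == 0].copy()
pairs = []
for i, row in df[df["iabp"] == 1].iterrows():
    if controls.empty:
        break
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    pairs.append((i, j))
    controls = controls.drop(j)

print(f"{len(pairs)} matched pairs")
```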

Results: The IABP group had significantly worse baseline clinical profiles, with higher rates of ST-elevation myocardial infarction (12.0% vs. 3.2%) and critical status (15.3% vs. 3.5%) and lower left ventricular ejection fraction (50.2±15.3% vs. 57.9±13.1%), but had superior renal function (all P values <0.01) compared with the No-IABP group. As might be expected, unadjusted operative mortality was significantly higher in the IABP group (8.8% vs. 3.2%; P=0.006), as were major complications (cardiac arrest, need for mechanical support, neurologic injury, requirement for dialysis, limb ischemia, multi-organ failure, pneumonia, and pulmonary thromboembolism; 26.9% (58/216) vs. 14.8% (47/317); P=0.001), compared with the No-IABP group. With propensity score matching, however, 109 pairs of patients well balanced with regard to all baseline variables (P values, 0.27-0.99) were identified. In this matched cohort, the IABP group still had higher operative mortality (odds ratio [OR], 3.52; 95% CI, 0.95-13.14; P=0.060) and a higher rate of major complications (OR, 1.84; 95% CI, 1.00-3.39; P=0.050) compared with the No-IABP group. The increased risks of mortality and morbidity were confirmed by propensity-adjustment models and inverse-probability-of-treatment weighting. The higher rate of major complications with IABP use was driven mainly by increased risks of renal failure, limb ischemia, and multi-organ failure. Although unmeasured covariates unaccounted for in the matching scheme may have had an impact, the nature of the increased morbidity is consistent with known complications of IABP.

Conclusion:  Preoperative use of IABP in patients with significant LM disease in the setting of acute coronary syndrome did not reduce 30-day mortality following CABG, but was associated with greater morbidity. Circumspection in the prophylactic use of IABP in this clinical setting may be advisable.

 

64.07 Grade of Differentiation Affects Survival after Lobectomy for Stage-I Neuroendocrine Cancer

D. T. Nguyen4, J. P. Fontaine1,2,3, L. A. Robinson1,2,3, R. J. Keenan1,2,3, E. M. Toloza1,2,3  1Moffitt Cancer Center, Thoracic Oncology, Tampa, FL, USA 2University Of South Florida Morsani College Of Medicine, Surgery, Tampa, FL, USA 3University Of South Florida Morsani College Of Medicine, Oncologic Sciences, Tampa, FL, USA 4University Of South Florida, Morsani College Of Medicine, Tampa, FL, USA

Introduction: Non-small cell lung cancer (NSCLC) is staged using Tumor-Node-Metastasis (TNM) status. Tumor histology is graded as well differentiated (Grade1), moderately differentiated (Grade2), or poorly differentiated (Grade3). We investigated the effects of histologic grade on survival of patients (pts) with stage-I neuroendocrine tumors after surgical resection.

Methods: Using the Surveillance, Epidemiology, and End Results (SEER) database, we identified pts who underwent lobectomy for stage-I (T1N0M0 or T2N0M0) neuroendocrine carcinoma during 1988-2013, excluding those who had multiple primary NSCLC tumors. We grouped pts by histologic grade and performed Kaplan-Meier survival analyses, with the log-rank test to compare 5-yr cancer-specific survival between grades, between T statuses, and between 1988-2003 and 2004-2013.
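A minimal sketch of the Kaplan-Meier and log-rank comparison described above, using the Python lifelines package on synthetic data; the column names are hypothetical and are not SEER field names.

```python
# Kaplan-Meier curves by grade plus a log-rank test (synthetic data;
# column names are hypothetical, not SEER field names).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "months": rng.exponential(50, 300),        # follow-up time
    "cancer_death": rng.integers(0, 2, 300),   # 1 = cancer-specific death
    "grade": rng.choice([1, 2, 3], 300),
})

kmf = KaplanMeierFitter()
for g, sub in df.groupby("grade"):
    kmf.fit(sub["months"], sub["cancer_death"], label=f"Grade{g}")
    print(f"Grade{g} 5-yr survival: {float(kmf.predict(60)):.2f}")

g1, g3 = df[df["grade"] == 1], df[df["grade"] == 3]
res = logrank_test(g1["months"], g3["months"],
                   event_observed_A=g1["cancer_death"],
                   event_observed_B=g3["cancer_death"])
print(f"Grade1 vs Grade3 log-rank p = {res.p_value:.4f}")
```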

Results: Of 515 study pts, 348 were T1N0, of whom 200 were Grade1, 52 were Grade2, and 96 were Grade3; 167 were T2N0, of whom 53 were Grade1, 31 were Grade2, and 83 were Grade3. During 1988-2013, T1N0 Grade3 pts had worse 5-yr survival than either T1N0 Grade2 or Grade1 pts (52.6% vs. 82.8% or 97.6%; p<0.001) and worse mean survival time (MST) than either T1N0 Grade2 or Grade1 pts (42.8 mon vs. 56.2 mon or 59.4 mon; p<0.05). In contrast, both T2N0 Grade3 and Grade2 pts had worse 5-yr survival than T2N0 Grade1 pts (54.0% or 68.4% vs. 93.6%; p=0.001) and worse MST (41.2 mon or 47.9 mon vs. 56.8 mon; p<0.05). For all Grade2 pts during 1988-2013, T2N0 pts had worse 5-yr survival than T1N0 pts (68.4% vs. 82.8%; p=0.07 by log-rank Mantel-Cox, but p=0.038 by Breslow-Wilcoxon and p=0.047 by Tarone-Ware) and worse MST than T1N0 pts (47.9 mon vs. 56.2 mon; p<0.05). In contrast, for all Grade3 pts, T1N0 and T2N0 pts had similar 5-yr survival (52.6% vs. 54.0%; p>0.53) and MST (41.2 mon vs. 42.8 mon; p>0.05) during 1988-2013. Neither Grade2 (T1N0 or T2N0) nor Grade3 (T1N0 or T2N0) pts had 5-yr survival or MST that changed significantly between 1988-2003 and 2004-2013 (p>0.05).

Conclusion: Using SEER data, we found that histologic grade significantly affected survival of stage-I neuroendocrine carcinoma pts after pulmonary lobectomy, with Grade3 pts having significantly worse survival than Grade2 or Grade1 pts for both T1N0 and T2N0 tumors, but with Grade2 pts having worse survival than Grade1 only for T2N0 tumors. These results suggest that histologic grade should be considered when determining adjuvant therapy and prognosis for surgically resected stage-I neuroendocrine carcinoma pts.

64.06 Outcomes and Costs of Surgical versus Transcatheter Aortic Valve Replacement

Y. Juo1, A. Mantha1, P. Benharash1  1University Of California – Los Angeles, Cardiac Surgery, Los Angeles, CA, USA

Introduction:
Surgical aortic valve replacement (SAVR) is considered the standard of care for adults with severe symptomatic aortic stenosis. More recently, transcatheter aortic valve replacement (TAVR) has been utilized in patients at high surgical risk, with encouraging results. With rapidly evolving technology, application of TAVR is expected to reach moderate- and low-risk cohorts. We aimed to examine national outcomes and costs of SAVR versus TAVR in the first year following US Food and Drug Administration (FDA) market approval of TAVR in the United States.

Methods:
Patients who underwent elective SAVR or TAVR in 2013 were identified from the Nationwide Inpatient Sample, a longitudinal inpatient health care database providing weighted estimates of more than 35 million annual hospitalizations. Baseline demographics, primary diagnoses, income/payer type, and hospital characteristics were tabulated. The primary outcome was in-hospital mortality; secondary outcomes included hospital length of stay (LOS) and overall hospitalization cost.
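Because the Nationwide Inpatient Sample is a weighted sample, national estimates are computed with discharge-level weights rather than raw counts. Below is a minimal sketch of that weighting step on synthetic records; all column names, including the weight column, are hypothetical.

```python
# Weighted national estimates from NIS-style discharge records
# (synthetic data; "discwt" stands in for the discharge weight).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "procedure": rng.choice(["SAVR", "TAVR"], n, p=[0.8, 0.2]),
    "died": rng.integers(0, 2, n),
    "los": rng.poisson(9, n),
    "cost": rng.normal(200_000, 40_000, n),
    "discwt": np.full(n, 5.0),  # each record represents ~5 discharges
})

for proc, sub in df.groupby("procedure"):
    w = sub["discwt"]
    print(proc,
          f"weighted n = {w.sum():.0f},",
          f"mortality = {np.average(sub['died'], weights=w):.3f},",
          f"mean LOS = {np.average(sub['los'], weights=w):.1f} d,",
          f"mean cost = ${np.average(sub['cost'], weights=w):,.0f}")
```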

Results:
During the study period, a total of 68,845 discharges after aortic valve replacement were identified from the Nationwide Inpatient Sample, 12,125 (17.6%) of which were TAVR. Compared to SAVR patients, TAVR patients were more likely to be elderly (>85 y/o: 42.4% vs 6.1%, p<0.05), less likely to be uninsured or Medicaid recipients (1.2% vs 6.7%, p<0.05), and less likely to be of low income status (21.2% vs 22.7%, p<0.05). Primary disease severity information was not available in the database. Unadjusted in-hospital mortality was significantly higher among TAVR than SAVR patients (4.62% vs 3.01%, p<0.05). SAVR was associated with a significantly longer mean LOS (9.7 vs 8.6 days, p<0.05) but lower mean overall hospitalization cost ($184,715 vs $218,246, p<0.05).

Conclusion:
Our results demonstrate rapid adoption of TAVR technology in the US in the year following FDA market approval. As expected based on selection criteria, SAVR was associated with lower in-hospital mortality and lower overall hospitalization cost than TAVR. The post-market mortality in TAVR patients was significantly lower than previously reported in the PARTNER trial. Adoption of TAVR technology and its impact on SAVR warrant further investigation in order to develop optimal decision algorithms for SAVR versus TAVR.
 

64.05 Venous Thromboembolism Following Lung Transplantation: A Study of the National Inpatient Sample

J. K. Aboagye1, J. W. Hayanga4, B. D. Lau1, D. Shaffer1, P. Kraus2, D. Hobson1, M. Streiff3, J. D’Cuhna4, E. R. Haut1,5  1Johns Hopkins University School Of Medicine, Surgery, Baltimore, MD, USA 2Johns Hopkins School Of Medicine, Pharmacy, Baltimore, MD, USA 3Johns Hopkins University School Of Medicine, Hematology, Baltimore, MD, USA 4University Of Pittsburgh Medical Center, Cardiothoracic Surgery, Pittsburgh, PA, USA 5Johns Hopkins Bloomberg School Of Public Health, Health Policy, Baltimore, MD, USA

Introduction:

Previous studies of the incidence of and risk factors for venous thromboembolism (VTE) following lung transplantation (LT) have largely been limited to single centers, limiting their generalizability. The purpose of this study was to estimate the incidence of VTE and identify risk factors associated with VTE after LT using a nationally representative sample of patients. We also aimed to determine the impact of VTE following LT on in-hospital mortality, length of hospitalization, and cost.

Methods:
We retrospectively examined the National Inpatient Sample database to identify patients who underwent LT from 2000 to 2011. We calculated the incidence of VTE and identified predictors of VTE following LT. In multivariate analyses, we estimated the association between VTE and in-hospital mortality, length of hospitalization, and total hospital cost.
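A minimal sketch of the kind of multivariable logistic model described above, reporting odds ratios with 95% confidence intervals using statsmodels; the data are synthetic and the predictor names hypothetical.

```python
# Multivariable logistic regression for VTE with ORs and 95% CIs
# (synthetic data; predictor names are hypothetical).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "age_gt_60": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "mech_vent": rng.integers(0, 2, n),
    "ecmo": rng.integers(0, 2, n),
})
# Simulate VTE with effects roughly in the direction reported above.
logit = -2.8 + 0.3 * df["age_gt_60"] - 0.5 * df["female"] + 0.85 * df["mech_vent"]
df["vte"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age_gt_60", "female", "mech_vent", "ecmo"]])
fit = sm.Logit(df["vte"], X).fit(disp=False)

ci = fit.conf_int()  # columns 0 and 1 hold the lower and upper bounds
table = pd.DataFrame({"OR": np.exp(fit.params),
                      "CI_low": np.exp(ci[0]),
                      "CI_high": np.exp(ci[1])})
print(table.round(2))
```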

Results:
A total of 16,318 adult lung transplant recipients underwent LT during the study period. Of these, 1,029 (6.3%) developed VTE post-operatively, comprising 854 (5.4%) who developed deep vein thrombosis only and 175 (1.1%) who developed pulmonary embolism. Predictors of VTE in this cohort included age greater than 60 years (OR 1.37, 95% CI 1.02-1.85), female gender (OR 0.61, 95% CI 0.45-0.84), mechanical ventilation support (OR 2.35, 95% CI 1.77-3.13), and extracorporeal membrane oxygenation support (OR 1.75, 95% CI 1.05-2.91). The adjusted odds of in-hospital mortality in LT recipients who had VTE were nearly twice those of their counterparts who did not (OR 1.89, 95% CI 1.17-3.06). On average, length of hospitalization and total hospitalization cost were 46% (95% CI 33-58) and 29% (95% CI 21-36) higher, respectively, for LT patients who developed VTE compared with those who did not, after adjusting for confounders.

Conclusion:
VTE is a frequent complication after lung transplantation and is associated with increased odds of mortality as well as greater total hospital length of stay and cost. Prophylaxis practices in these patients need to be reexamined to address this complication.
 

64.04 Sequelae of Donor-Derived Ureaplasma Transmission in Lung Recipients

R. Fernandez1, M. M. DeCamp1, D. D. Odell1, A. Bharat1  1Feinberg School Of Medicine – Northwestern University, Division Of Thoracic Surgery, Chicago, IL, USA

Introduction: Hyperammonemia is a fatal syndrome of unclear etiology following lung transplantation. It has been postulated to result from an inborn error of urea-cycle metabolism. We recently demonstrated that a Mollicute, Ureaplasma, causes this syndrome. Here, we further investigated the source of Ureaplasma infection and the incidence of hyperammonemia.

Methods:  Consecutive lung transplant recipients were prospectively evaluated between July 2014 and May 2016. All recipients had pre-transplant urine and bronchoalveolar lavage fluid (BALF) tested for all Mollicutes (Ureaplasma and Mycoplasma species) by PCR and culture, and antimicrobial susceptibilities determined. Additionally, BALF from all donors was tested. Patients found positive for Mollicutes pre-transplant were successfully treated using antimicrobials based on the antimicrobial susceptibility. Ammonia was analyzed in all patients post-transplant. Ureaplasma isolates were grown in specialized culture media and titrated amounts were injected into immunocompetent C57BL6 mice to determine development of hyperammonemia, thereby testing Koch’s third postulate. 

Results: Human studies: Five of 29 (17%) recipients tested positive for Mollicutes (Ureaplasma=4, Mycoplasma=1) in urine pre-transplant and were successfully treated. Native lung BALF from all patients was negative for Mollicutes. BALF from four (14%) donors was positive for Ureaplasma, but not Mycoplasma. These donors were younger (23.3 vs 38.3 years; p<0.001), tended to be male and sexually active, and all had aspiration. All recipients of Ureaplasma-positive, but not of Ureaplasma-negative, donor lungs developed hyperammonemia and demonstrated increased morbidity and mortality. One isolate revealed macrolide resistance associated with a novel ribosomal mutation. All other isolates were pan-susceptible to macrolides, fluoroquinolones, and tetracycline. All recipients demonstrated a decrease in ammonia levels within 24 hours of antimicrobial therapy and normalization within 7 days. Murine studies: Human isolates of Ureaplasma led to dose-dependent hyperammonemia in wild-type mice (ammonia >3 times baseline; p<0.001). Treatment with antibiotics to which the isolate was susceptible prevented hyperammonemia.

Conclusion: Ureaplasma infection in lung recipients is transmitted via donor lungs and results in significant morbidity and mortality. The incidence of donor Ureaplasma infection was 14%, higher than previously reported, which supports a role for routine donor screening. Early recognition and treatment of Ureaplasma infection can improve lung transplant outcomes.

 

64.02 Learning Curve of Minimally Invasive Ivor-Lewis Esophagectomy

F. Van Workum1, G. H. Berkelman3, A. E. Slaman4, M. Stenstra1, M. I. Van Berge Henegouwen4, S. S. Gisbertz4, F. J. Van Den Wildenberg5, F. Polat5, M. Nilsson2, T. Irino2, G. A. Nieuwenhuijzen3, M. D. Luyer3, C. Rosman1  1Radboudumc, Surgery, Nijmegen, GELDERLAND, Netherlands 2Karolinska Institutet, Surgery, Stockholm, -, Sweden 3Catharina Hospital, Surgery, Eindhoven, BRABANT, Netherlands 4AMC, Surgery, Amsterdam, NOORD HOLLAND, Netherlands 5Canisius-Wilhelmina Ziekenhuis, Surgery, Nijmegen, GELDERLAND, Netherlands

Introduction: Totally minimally invasive Ivor-Lewis esophagectomy (TMIE-IL) has a learning curve, but the length of that curve and the extent of learning-curve-associated morbidity for surgeons experienced in TMIE-McKeown are unknown.

Methods: This study was performed in 4 high-volume European esophageal cancer centers from December 2010 until April 2016. Surgeons experienced in TMIE-McKeown changed operative technique to TMIE-IL. All consecutive patients with esophageal carcinoma undergoing TMIE-IL with curative intent were included. Baseline, surgical, and outcome parameters were analyzed in quintiles and plotted in order to explore the learning curve. Textbook outcome (the percentage of patients in whom the process from surgery until discharge took <21 days and was uneventful in terms of complications, interventions, mortality, and oncological aspects) was also analyzed. CUSUM analysis was performed to determine after how many cases proficiency was reached. An area-under-the-curve analysis was performed to calculate learning-associated anastomotic leakage and costs.
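A minimal sketch of the CUSUM idea used here: the curve accumulates (observed minus expected) failures, so it climbs while the leak rate exceeds a benchmark and flattens or falls once proficiency is reached. The outcomes below are simulated, and the benchmark rate p0 is an assumed value, not taken from the study.

```python
# CUSUM of anastomotic leakage over consecutive cases (synthetic
# outcomes; p0 is an assumed acceptable leak rate).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
n_cases = 200
# Simulate a learning effect: leak probability falls with experience.
p_leak = np.linspace(0.26, 0.08, n_cases)
leaks = rng.random(n_cases) < p_leak

p0 = 0.10                      # benchmark (acceptable) leak rate
cusum = np.cumsum(leaks - p0)  # rises while observed rate exceeds p0

print("curve peaks at case", int(np.argmax(cusum)) + 1)
plt.plot(range(1, n_cases + 1), cusum)
plt.xlabel("Consecutive case number")
plt.ylabel("CUSUM (observed - expected leaks)")
plt.show()
```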

Results: Four hundred sixty-eight patients were included. In one hospital, ASA classification was significantly higher in quintiles 2 and 3 (p=0.01), and in one hospital, more distal esophageal tumors were operated on in quintiles 4 and 5 (p=0.01). In the pooled curve analysis, anastomotic leakage decreased from 26% at introduction of TMIE-IL to 8% at the plateau phase, which occurred after 121 cases. Textbook outcome increased from 39% to 60%, and its plateau phase occurred after 128 cases. Learning-curve-associated anastomotic leakage occurred in 42 patients, and this excess morbidity was associated with more than €2 million in healthcare costs.

Conclusion: TMIE-IL has a significant learning curve. Learning curve associated morbidity and costs are substantial, even for surgeons experienced in TMIE-McKeown. The length of the learning curve was more than 100 operations.

 

64.01 Drivers of Variation in 90-day Episode Payments for Coronary Artery Bypass Grafts (CABG)

V. Guduguntla1,2,5, J. Syrjamaki1,5, C. Ellimoottil1,3,4,5, D. Miller1,3,4,5, P. Theurer1,6, D. Likosky1,7, J. Dupree1,3,4,5  1University Of Michigan, Ann Arbor, MI, USA 2University Of Michigan, Medical School, Ann Arbor, MI, USA 3Institute For Healthcare Policy And Innovation, Ann Arbor, MICHIGAN, USA 4Dow Division Of Health Services Research, Department Of Urology, Ann Arbor, MICHIGAN, USA 5Michigan Value Collaborative, Ann Arbor, MICHIGAN, USA 6The Michigan Society Of Thoracic And Cardiovascular Surgeons Quality Collaborative, Ann Arbor, MICHIGAN, USA 7Section Of Health Services Research And Quality, Department Of Cardiac Surgery, Ann Arbor, MICHIGAN, USA

Introduction: Coronary artery bypass grafting (CABG) is a common and expensive surgery. Starting in July 2017, CABG will become part of a mandatory Medicare bundled payment program, an alternative payment model in which hospitals are paid a fixed amount for the entire care process, including care for 90 days post-discharge. Details on the specific drivers of CABG payment variation are largely unknown, and an improved understanding will be important for policy makers, hospital leaders, and clinicians.

Methods:  We identified patients undergoing CABG at 33 non-federal hospitals in Michigan from 2012 through June 2015 using data from the Michigan Value Collaborative, which includes adjusted claims from Medicare and Michigan’s largest private payer. We calculated 90-day price-standardized, risk-adjusted, total episode payments for each of these patients, and divided hospitals into quartiles based on mean total episode payments. We then disaggregated payments into four components: readmissions, professional, post-acute care, and index hospitalization. Lastly, we compared payment components across hospital quartiles and determined drivers of variation for each component.
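A minimal sketch of the quartile construction and payment disaggregation described above, on synthetic payments; all column names are hypothetical.

```python
# Hospital quartiles by mean 90-day episode payment, with payments
# disaggregated into four components (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 5910
df = pd.DataFrame({
    "hospital": rng.integers(0, 33, n),
    "index_hosp": rng.normal(30_000, 5_000, n),
    "professional": rng.normal(8_000, 2_000, n),
    "post_acute": rng.normal(6_000, 3_000, n),
    "readmission": rng.exponential(3_000, n),
})
parts = ["index_hosp", "professional", "post_acute", "readmission"]
df["total"] = df[parts].sum(axis=1)

by_hosp = df.groupby("hospital").mean()
by_hosp["quartile"] = pd.qcut(by_hosp["total"], 4,
                              labels=["Q1", "Q2", "Q3", "Q4"])

components = by_hosp.groupby("quartile", observed=True)[parts].mean()
print(components.round(0))
# Ratio of highest- to lowest-payment quartile for each component:
print((components.loc["Q4"] / components.loc["Q1"]).round(2))
```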

Results: We identified a total of 5,910 patients across 33 Michigan hospitals. The mean age was 68 and 74% were male. The mean 90-day episode payment for CABG was $48,571 (SD: $20,739; range: $11,723-$356,850). The lowest cost quartile had a mean total episode payment of $45,487 compared to $54,399 for the highest cost quartile, a difference of $8,912 or 16.4%. The highest cost quartile hospitals, when compared to the lowest cost quartile hospitals, had greater readmission (1.35x), professional (1.34x), post-acute care (1.30x), and index hospitalization payments (1.15x) (all p<0.05). The main drivers of this variation were patients with multiple readmissions, increased use of evaluation and management (E&M) services, higher utilization of inpatient rehabilitation, and diagnosis-related group distribution, respectively.

Conclusions: Significant variation exists in 90-day CABG episode payments for Medicare and private payer patients in Michigan. These results have implications for policymakers implementing CABG bundled payment programs, hospital administrators leading quality improvement efforts, and clinicians caring for these patients. Specifically, hospitals and clinicians entering bundled payment programs for CABG should work to understand local sources of variation, including multiple readmissions as well as inpatient E&M and post-discharge rehabilitation services. 

61.08 IL-10 Regulates the Perivascular Hyaluronan Metabolism to Improve Pulmonary Hypertension in CDH

S. Balaji1, M. Shah2, X. Wang1, M. Phillips2, M. Fahrenholtz1, C. M. Moles1, M. Rae1, S. G. Keswani1, S. E. Mclean2  1Texas Children’s Hospital And Baylor College Of Medicine, Division Of Pediatric Surgery, Houston, TX, USA 2University Of North Carolina At Chapel Hill School Of Medicine, Division Of Pediatric Surgery, Chapel Hill, NC, USA

Introduction: The management of pulmonary arterial hypertension (PAH) associated with congenital diaphragmatic hernia (CDH) is challenging. In CDH, the pulmonary arteries have thick walls due to smooth muscle cell hyperplasia, increased collagen deposition, and marked inflammation, leading to fibrotic remodeling of the perivasculature. Slit3-/- mice demonstrate CDH at birth and develop PAH over time, with associated alterations in hyaluronan (HA) metabolism in the perivascular matrix of the lungs. Inhaled IL-10 decreases or reverses the development of PAH in Slit3-/- mice with CDH. The role of IL-10 in regulating HA in pulmonary artery smooth muscle cells (PASMC) and pulmonary fibroblasts (PFB) has not been examined.

Methods: Primary PASMC and PFB isolated from C57BL/6J wildtype mice were cultured under static and mechanical tension and treated with murine IL-10 (200 ng/ml). To determine the effect of IL-10 on regulation of pulmonary perivascular HA in CDH, 2- to 3-month-old Slit3-/- mice were treated twice with IL-10 mixed with HA hydrogel, administered intranasally 7 days apart. Lungs (n=3-4/group) were harvested and embedded. RNA was isolated from cell cultures and frozen lung samples, and hyaluronan synthases (HAS1-3) and hyaluronidases (Hyal1-2) were measured by qPCR. HA expression and localization (HA-binding protein) and leucocyte (CD45) and macrophage (CD206) infiltration were determined histologically in paraffin sections. Data are presented as mean±SD; p<0.05 denotes significance (t-test).
 

Results: PASMC expressed increased IL-6 (2.8-fold; p<0.05), HAS1 (31.4-fold; p<0.005), and MMP9 (15.5-fold; p<0.05), but expressed lower collagen 1 and 3 (1.75-fold; p<0.05) and MMP2 (3.3-fold; p=0.005) under mechanical stress, which may partly explain the dysregulation of HA in the perivascular matrix in CDH lungs. IL-10 overexpression did not alter HAS1, 2, or 3 expression in PASMC under normal culture conditions. In PFB, however, IL-10 treatment significantly increased HAS1 (2.5-fold; p<0.05) and decreased Hyal1 (1.5-fold; p<0.05), suggesting differential regulation of HA metabolism by IL-10 in the key cell types responsible for perivascular matrix turnover. Slit3-/- mice that developed PAH demonstrated significantly more HA deposition in the perivasculature, but exhibited a less dense, dysregulated matrix structure. Inhaled IL-10 hydrogel treatment for 2 weeks resulted in a more cohesive and dense distribution of HA. IL-10 treatment also significantly reduced the inflammatory response in Slit3-/- murine lungs: CD45+ cell infiltration (3.2-fold decrease; p<0.05) and macrophage infiltration (1.5-fold decrease; p<0.05).

Conclusion: These data provide evidence for a possible role for IL-10 in the regulation of altered HA metabolism, along with its anti-inflammatory role in the attenuation of PAH in CDH. Targeting the ECM of the pulmonary vasculature would represent a paradigm shift in treatment of PAH in CDH patients and the possible development of novel therapeutics.

 

57.18 Innovative High Quality Model for Thoracic Surgery Education Using Soft Embalmed Cadaver

J. H. Mehaffey1, E. J. Charles1, B. Lapierre1, D. Sikon1, M. E. Roeser1, I. L. Kron1, L. T. Yarboro1  1University Of Virginia, Surgery, Charlottesville, VA, USA

Introduction:  General surgery training programs face the growing challenge of educating trainees in limited time, with increasingly complex cases, while maintaining the highest quality of care and the lowest complication rate. These constraints have resulted in a case volume shift to senior residents and fellows with increased oversight and decreased trainee autonomy. The purpose of this study was to determine the value of clinically relevant surgical simulation for resident training in thoracic surgery. We hypothesized that the use of our innovative thoracic model would provide improved preparation in a controlled environment with reduced cost.

Methods: Using a proprietary soft-embalming method of cadaver preservation pioneered at our institution, we created a clinically relevant model of thoracic surgery. As a pilot study, 10 residents completed a structured curriculum including lecture, directed surgical simulation, and oral-board-style questioning on major topics in thoracic surgery relevant to general surgeons, including trauma thoracotomy and esophagectomy.

Results: Cost analysis demonstrated that the soft-embalming method of preservation was superior to single-use fresh-frozen cadavers (both $2,500), as soft-embalmed cadavers allow multiple teaching sessions, with durability of up to a year. Resident feedback supported the hypothesis of improved preparation, with survey results listed in Table 1. Example feedback included “helps drive home techniques I don't often see in clinical practice.”

Conclusion: This study demonstrates that clinically relevant surgical simulation can be successfully added to surgical training using a novel soft-embalmed cadaver model of thoracic surgery. Given current constraints in general surgery training, this model provides a cost- and time-effective adjunct during residency training.

 

57.01 Lung Cancer Screening in Community Primary Care Settings: A Pilot Study

M. Masika1,2, M. Mahoney1,2, O. Lucas1,2, B. Bigham2,3, K. Attwood1,2, M. Reid1,2, L. Ansari1,2, W. Underwood1,2, D. Irwin1,2, C. Nwogu1,2  1Roswell Park Cancer Institute, Buffalo, NY, USA 2State University Of New York At Buffalo, Buffalo, NY, USA 3Howard University College Of Medicine, Washington, DC, USA

Introduction:

Lung cancer is the greatest cause of cancer mortality in the United States, in large part due to the late presentation of most cases. Previous efforts at lung cancer screening with chest radiography or sputum cytology were unsuccessful. The National Lung Screening Trial (NLST) demonstrated that screening with low-dose computed tomography (CT) reduced lung cancer-specific mortality in academic centers. However, implementing lung cancer screening in a community setting may be challenging; frequently, there is a delay between important research findings and their adoption in an average clinical practice. This study surveyed primary care clinicians’ knowledge of lung cancer screening and promoted the use of screening guidelines.

Methods:

The target population consisted of primary care clinicians from four community-based healthcare centers. A brief survey assessing the clinicians’ baseline knowledge about lung cancer screening was conducted at the centers. Promotion of national lung cancer screening guidelines was accomplished via interactive group educational sessions.

Results:

The study group included 18 physicians, 8 nurse practitioners, and 2 physician assistants. Target accrual was 60 clinicians, but only 28 (46%) participated. Of these, 21% were not familiar with any effective screening modality, 39% reported being aware of screening methods but had not incorporated them into their practice, and 50% were uncertain about how the CT scans were to be paid for.

Conclusion:

There appears to be a knowledge gap about lung cancer screening in the average community primary care setting. Engaging primary care clinicians in survey studies or educational programs is challenging because of demanding clinical schedules. Creative strategies, which may include online surveys, webinars, and compensated focus groups, are required to overcome these challenges. A larger follow-up study will be performed to better understand these issues, with the ultimate goal of increasing lung cancer screening and changing the stage distribution of lung cancer.

45.01 Is It Cancer? Quantifying the Clinician Guessing Game

A. W. Maiga1,2, S. A. Deppen1,2, R. Pinkerman2, C. Callaway-Lane2, R. S. Dittus1,2, E. Lambright1,2, J. Nesbitt1,2, E. L. Grogan1,2  1Vanderbilt University Medical Center, Nashville, TN, USA 2VA TN Valley Healthcare System, Nashville, TN, USA

Introduction:
Clinical guidelines recommend that clinicians estimate the probability of malignancy for patients undergoing evaluation of indeterminate pulmonary nodules (IPNs) >8mm. Adherence to these guidelines is unknown. Our objective was to determine whether clinicians document the probability of malignancy in high-risk IPNs and to compare these quantitative or qualitative predictions with the validated Mayo prediction model.

Methods:
We queried our retrospective single-institution surgical database of 298 Veteran patients who underwent lung resections for known or suspected lung cancer from 2003 to 2015. We reviewed preoperative documentation from pulmonary and thoracic surgery providers, as well as multidisciplinary tumor board presentations. Any documented quantitative or qualitative predictions of malignancy were extracted and summarized using descriptive statistics. We compared clinicians’ quantitative and qualitative predictions of malignancy to risk estimates from the Mayo prediction model. 
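For context, the Mayo model referenced above is a logistic model of six clinical and radiographic factors. The sketch below implements its commonly published form; the coefficients are those widely reported for the original Mayo Clinic model, but treat them as illustrative and verify against the source publication before any use.

```python
# Mayo model probability of malignancy for an indeterminate pulmonary
# nodule, in its commonly published form; coefficients should be
# verified against the original publication before use.
import math

def mayo_probability(age_yr, current_or_former_smoker, prior_cancer,
                     diameter_mm, spiculated, upper_lobe):
    x = (-6.8272
         + 0.0391 * age_yr
         + 0.7917 * int(current_or_former_smoker)
         + 1.3388 * int(prior_cancer)
         + 0.1274 * diameter_mm
         + 1.0407 * int(spiculated)
         + 0.7838 * int(upper_lobe))
    return math.exp(x) / (1 + math.exp(x))

# Example: 70-year-old former smoker, 15 mm spiculated upper-lobe nodule.
print(f"{mayo_probability(70, True, False, 15, True, True):.0%}")
```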

Results:
Cancer prevalence was 88% (261/298). Only 13 patients (4%) had a documented quantitative prediction of malignancy prior to tissue diagnosis; 217 (76%) of the remaining 285 patients had a qualitative risk statement. By service, 62% (185/298), 47% (76/163), and 28% (27/96) of pulmonary, thoracic surgery, and tumor board notes, respectively, documented a qualitative estimate of malignancy risk prior to tissue diagnosis. After the American College of Chest Physicians updated their guidelines in 2007 to recommend documenting the pre-test probability of malignancy, the proportion of thoracic surgery notes including a qualitative risk statement increased from 36% (31/86) to 58% (45/77), whereas the proportion of pulmonary and tumor board notes documenting this did not change. Qualitative risk statements fell into 32 broad categories. The most frequently used statements (Table 1) aligned well with Mayo model predictions.

Conclusion:
Clinicians rarely document a quantitative probability of cancer for high-risk IPNs. Qualitative statements of risk in current practice are highly variable but correlate well with Mayo model predictions. A standard quantitative scale that correlates with predicted risk for IPNs should be used to communicate with patients and other providers.
 

43.20 Functional Recovery in Transfemoral Versus Transapical Transcatheter Aortic Valve Replacement

N. K. Asthana1, A. Mantha4, G. Vorobiof3, P. Benharash2  1University Of California – Los Angeles, Los Angeles, CA, USA 2University Of California – Los Angeles, Cardiothoracic Surgery, Los Angeles, CA, USA 3University Of California – Los Angeles, Cardiology, Los Angeles, CA, USA 4University Of California – Irvine, Orange, CA, USA

Introduction: Transcatheter aortic valve replacement (TAVR) has greatly improved treatment options for patients with severe aortic stenosis (AS) at high surgical risk. Typically, a transfemoral (TF) approach is preferred because it is less invasive than a transapical (TA) approach. However, in patients whose peripheral access is limited by vessel tortuosity, size, or calcification, a TA approach is used. This study assessed whether myocardial functional recovery post-TAVR differed significantly between patients treated via a TF approach and those treated via a TA approach.

Methods: Echocardiograms of all severe AS patients who underwent TAVR at Ronald Reagan UCLA Medical Center from 2012-2016 were evaluated. Parameters assessed included left ventricular ejection fraction (LVEF), left ventricular internal diameter (LVID), interventricular septal thickness at end-diastole (IVSd), and posterior wall thickness at end-diastole (PWd). In addition, left ventricular segmental longitudinal strains and global longitudinal strain (GLS) were measured using two-dimensional speckle-tracking echocardiography (2D-STE). Echocardiograms were evaluated pre-TAVR (mean: 20.1 d), post-operatively (mean: 2.5 d), and at a 1-month follow-up (mean: 32.7 d). Statistical analysis was conducted using repeated measures analysis of variance (rANOVA), with p < .05 considered significant.
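A minimal sketch of a repeated-measures ANOVA across the three echocardiographic time points, using statsmodels' AnovaRM on synthetic long-format data; all names are hypothetical.

```python
# Repeated-measures ANOVA of GLS across three time points
# (synthetic long-format data; statsmodels AnovaRM).
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(6)
rows = []
for pt in range(42):
    base = rng.normal(-15.5, 2.0)  # baseline GLS (%)
    for visit, label in enumerate(["pre", "post", "1mo"]):
        rows.append({"pt": pt, "time": label,
                     "gls": base - 1.3 * visit + rng.normal(0, 0.8)})
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="gls", subject="pt", within=["time"]).fit()
print(res.anova_table)
```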

Results: Of the 216 patients assessed, 42 had complete data available. Patients who underwent TAVR via a TF approach (N = 31, 67% male, mean age 81.6 y) were compared to those who underwent a TA approach (N = 11, 55% male, mean age 87.3 y). For the entire cohort, between the pre-TAVR baseline and the 1-month follow-up: (i) there were no significant changes in LVEF, LVID, IVSd, or PWd (p > .05); (ii) segmental longitudinal strains significantly increased in the apex (from -18.9 to -21.5%, p < .0001), anterior segments (from -15.5 to -18.3%, p < .0001), lateral segments (from -14.0 to -17.1%, p < .0001), inferior segments (from -14.9 to -18.1%, p < .0003), and septal segments (from -14.2 to -16.9%, p < .0002); and (iii) GLS significantly improved (from -15.6 to -18.2%, p < .001). When comparing the TF and TA groups, there were no significant differences in LVEF, LVID, IVSd, PWd, GLS, or anterior, lateral, inferior, and septal segmental longitudinal strains (p > .05). However, there was a significant difference in longitudinal strain at the apex between the TF and TA groups (TF vs. TA at 1-month follow-up: -22.3 ± 7.63% vs. -15.9 ± 7.47%, p < .05).

Conclusion: Patients who underwent a TF approach showed significantly greater post-TAVR improvement in apical longitudinal strain, although myocardial functional recovery did not otherwise differ significantly between the TF and TA groups. Additionally, myocardial strains measured by 2D-STE appear more sensitive in detecting subclinical functional changes than more customary measures of cardiac remodeling.

 

43.19 Patterns of Mediastinal Metastasis after Robotic-Assisted Lobectomy for Non-Small Cell Lung Cancer

R. Gerard4, F. O. Velez-Cubian2, E. P. Ng4, C. C. Moodie1, J. R. Garrett1, J. P. Fontaine1,2,3, E. M. Toloza1,2,3  1Moffitt Cancer Center, Thoracic Oncology, Tampa, FL, USA 2University Of South Florida Morsani College Of Medicine, Surgery, Tampa, FL, USA 3University Of South Florida Morsani College Of Medicine, Oncologic Sciences, Tampa, FL, USA 4University Of South Florida, Morsani College Of Medicine, Tampa, FL, USA

Introduction: Many thoracic surgeons perform mediastinal lymph node (LN) sampling (MLNS) in order to minimize morbidity believed to be associated with complete mediastinal LN dissection (MLND). To focus MLNS on the LN levels most likely to be involved for a given lung cancer, we sought to determine the patterns of mediastinal LN metastasis found after robotic-assisted video-thoracoscopic pulmonary lobectomy for non-small cell lung cancer (NSCLC).

Methods: We retrospectively analyzed prospectively collected data for all patients who underwent robotic-assisted pulmonary lobectomy for NSCLC by one surgeon over 69 months. Clinical stage was determined by history and physical examination, computed tomography, positron-emission tomography, brain imaging studies, and/or endobronchial ultrasonography. Pathologic stage was based on intraoperative findings and final pathology. The pulmonary lobe resected and any mediastinal LNs involved by metastasis were noted.

Results: Among 303 NSCLC patients (pts), mean age was 69±0.5 yr (range 39-98 yr), and the most common histologies were adenocarcinoma (66%), squamous cell carcinoma (21%), and neuroendocrine carcinoma (10%). Tumors were located in the right lung in 198 (65.3%) pts and in the left lung in 105 (34.7%) pts. The three most common anatomic locations were the right upper lobe (RUL; 39.6%), left upper lobe (LUL; 21.8%), and right lower lobe (RLL; 18.5%). Frequencies of stage-3 disease were similar for left versus right NSCLC (p=0.59), but the frequency of stage-2 disease was higher for left NSCLC (28.6%) than for right NSCLC (17.2%; p=0.02). Of stage-3A right NSCLC, 56.8% were in the RUL, while 69.6% of stage-3A left NSCLC were in the LUL. Among N1 LNs, level 11 involvement was more common than level 10 involvement for all right and left NSCLC combined (72/103, 69.9% vs. 20/103, 19.6%; p<0.0001). Mediastinal LN involvement was highest in level 4R (23/198; 11.6%), level 5 (11/105; 10.5%), level 7 (25/303; 8.3%), and level 2R (10/198; 5.1%). Stage-3A RLL NSCLC most commonly metastasized to level 7 (12/26; 46.2%), while stage-3A left lower lobe NSCLC metastasized most commonly to level 9L (3/6; 50.0%).

Conclusion: After robotic-assisted pulmonary lobectomy, mediastinal LN metastatic disease was similarly frequent for right versus left NSCLC, while stage-2 disease was more frequent with left NSCLC. Among N1 LNs, interlobar LNs were more commonly involved than hilar LNs. For stage-3A NSCLC, there was upper lobe predominance on both sides. Level 4R LNs were most frequently positive with right NSCLC, mostly due to RUL tumors, while level 5 LNs were most frequently positive with left NSCLC, mostly due to LUL tumors. These patterns of N1 and mediastinal LN involvement should assist in guiding thoracic surgeons to perform a more focused MLNS or a more complete MLND for more accurate NSCLC staging.

43.18 Predictors of Cardiogenic Shock in Cardiac Surgery Patients Receiving Intra-Aortic Balloon Pumps

A. Iyengar1, O. Kwon2, R. Shemin2, P. Benharash2  1University Of California – Los Angeles, David Geffen School Of Medicine, Los Angeles, CA, USA 2University Of California – Los Angeles, Division Of Cardiac Surgery, Los Angeles, CA, USA

Introduction:  Cardiogenic shock following cardiac surgery is a rare complication that leads to increased morbidity and mortality. Intra-aortic balloon pumps (IABPs) may be used during the perioperative period to increase coronary perfusion and support cardiac output. The purpose of this study was to characterize predictors of postoperative cardiogenic shock in cardiac surgery patients, and examine differences between those with and without IABP support.

Methods: A retrospective analysis of UCLA's Society of Thoracic Surgeons (STS) database was performed for January 2008 through July 2015. Preoperative demographic data for all patients were queried, and patients receiving IABP support during the perioperative period were identified. The Kruskal-Wallis and chi-squared tests were used for comparisons between the IABP and control cohorts. Multivariable logistic regression with step-wise elimination was used to model postoperative cardiogenic shock in both the IABP and control cohorts.
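A minimal sketch of backward step-wise elimination for a logistic model of postoperative cardiogenic shock, using statsmodels on synthetic data; the predictor names are hypothetical, and the p-to-remove threshold of 0.10 is an assumption.

```python
# Backward step-wise elimination for a logistic model of postoperative
# cardiogenic shock (synthetic data; p-to-remove of 0.10 is assumed).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 4741
df = pd.DataFrame({
    "prior_surgery": rng.integers(0, 2, n),
    "arrhythmia": rng.integers(0, 2, n),
    "dialysis": rng.integers(0, 2, n),
    "chf": rng.integers(0, 2, n),
})
logit = -4.0 + 1.8 * df["prior_surgery"] + 1.1 * df["arrhythmia"]
df["shock"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

predictors = ["prior_surgery", "arrhythmia", "dialysis", "chf"]
while True:
    X = sm.add_constant(df[predictors])
    fit = sm.Logit(df["shock"], X).fit(disp=False)
    pvals = fit.pvalues.drop("const")
    if pvals.max() <= 0.10 or len(predictors) == 1:
        break
    predictors.remove(pvals.idxmax())  # drop the least significant term

print("retained:", predictors)
print(np.exp(fit.params).round(2))     # adjusted odds ratios
```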

Results: During the study period, 4,741 cardiac surgery patients were identified, of whom 268 (6%) received an IABP. IABP patients had higher rates of previous cardiac surgery (54% vs. 38%, p<0.001), congestive heart failure (69% vs. 43%, p<0.001), and preoperative cardiogenic shock (22% vs. 2%, p<0.001). Furthermore, IABP patients were more likely to have emergent operations (84% vs. 42%, p<0.001) and to receive coronary artery bypass grafts (CABG, 63% vs. 32%). IABP patients had significantly longer ventilation times, ICU and total hospital stays, and higher 30-day mortality (all p<0.001), and more postoperative cardiogenic shock (10% vs. 3%, p<0.001).
Among the IABP cohort, preoperative dialysis, arrhythmias, and previous cardiac surgery were all associated with higher odds of postoperative cardiogenic shock (all p<0.10), while CABG operations were protective compared to other cardiac operations (OR 0.33 vs. 2.28; p=0.008 and 0.053, respectively). On multivariate analysis, previous cardiac surgery and preoperative arrhythmia remained significant predictors of postoperative cardiogenic shock (AOR 5.95, p=0.005, and 2.94, p=0.015, respectively). In the control cohort, several factors, including hypertension, chronic lung disease, preoperative congestive heart failure, cardiogenic shock, inotropic medications, urgent/emergent status, non-CABG/valve cardiac surgery, and prolonged bypass times, were associated with postoperative cardiogenic shock.

Conclusion: Factors associated with cardiogenic shock among post-cardiac surgery patients differ between those who receive IABP support and those who do not. Among IABP patients, previous cardiac surgery and arrhythmias were associated with increased rates of cardiogenic shock, while shock was multifactorial among control patients. The etiology of cardiogenic shock may differ between these two cohorts, and early identification of patients at risk may lead to improved outcomes.

43.17 Tumor Size and Perioperative Outcomes after Robotic-Assisted Pulmonary Lobectomy

R. Gerard4, F. O. Velez-Cubian2, E. P. Ng4, C. C. Moodie1, J. R. Garrett1, J. P. Fontaine1,2,3, E. M. Toloza1,2,3  1Moffitt Cancer Center, Thoracic Oncology, Tampa, FL, USA 2University Of South Florida Morsani College Of Medicine, Surgery, Tampa, FL, USA 3University Of South Florida Morsani College Of Medicine, Oncologic Sciences, Tampa, FL, USA 4University Of South Florida, Morsani College Of Medicine, Tampa, FL, USA

Introduction: Tumor size is one factor that determines whether lobectomy is performed via an open or a minimally invasive approach. We investigated whether tumor size affects perioperative outcomes after robotic-assisted video-thoracoscopic (RAVT) pulmonary lobectomy.

Methods: We retrospectively studied all patients (pts) who underwent RAVT pulmonary lobectomy by one surgeon at our institution between September 2010 and May 2016. Patients were grouped by greatest tumor diameter on pathologic measurement of lobectomy specimens. Perioperative outcomes, including estimated blood loss (EBL), skin-to-skin operative time, conversion to open lobectomy, intraoperative and postoperative complications, chest tube duration, hospital length of stay (LOS), and in-hospital mortality, were compared. The chi-square test, Student's t-test, and Kruskal-Wallis test were used, with p≤0.05 considered significant.
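A minimal sketch of the Kruskal-Wallis comparison of a skewed outcome (here, estimated blood loss) across the five tumor-size groups, using scipy on synthetic data; the group sizes are illustrative.

```python
# Kruskal-Wallis test of estimated blood loss across tumor-size groups
# (synthetic data; group sizes are illustrative).
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(8)
ebl_by_group = {
    "<=10mm":  rng.exponential(150, 40),
    "11-20mm": rng.exponential(160, 120),
    "21-30mm": rng.exponential(180, 100),
    "31-50mm": rng.exponential(200, 70),
    ">=51mm":  rng.exponential(300, 29),
}
stat, p = kruskal(*ebl_by_group.values())
print(f"H = {stat:.2f}, p = {p:.4f}")
```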

Results: We identified 359 pts, grouped by greatest tumor diameter: ≤10mm, 11-20mm, 21-30mm, 31-50mm, or ≥51mm. Tumor histology comprised NSCLC (89.4%), SCLC (1.9%), and pulmonary metastases (8.6%), with the most common NSCLC histologies being adenocarcinoma (63.8%), squamous cell (21.5%), and neuroendocrine (9.7%). No differences were noted in mean age, female:male ratio, or mean body surface area among the groups, but mean body mass index was lowest in pts with tumors ≥51mm. Lobar distribution of lung tumors did not differ among the groups (p>0.14), but extent of resection differed, with pts with tumors ≥51mm having a lower rate of simple lobectomy (p<0.001) and a higher rate of en bloc chest wall resection (p<0.001). Neither overall intraoperative complications nor overall or emergent conversion to open lobectomy differed among the groups (p>0.21), but pulmonary artery (PA) injury occurred in up to 7.2% of pts in groups with tumors ≥21mm (p=0.014). While median EBL was higher in pts with tumors ≥51mm (p≤0.003) and median operative time was longer in pts with tumors ≥31mm (p≤0.019), median chest tube duration and median hospital LOS did not differ among the groups (p>0.37). Neither total postoperative complications nor overall pulmonary or cardiovascular complications differed among the groups (p>0.23), but pneumothorax after chest tube removal requiring intervention was more frequent in pts with tumors ≤10mm (p=0.03). In-hospital mortality did not differ among the groups (p=0.60).

Conclusions: Patients who undergo RAVT lobectomy for tumors ≥51mm have lower BMI, are less likely to undergo simple lobectomy, and are more likely to require en bloc chest wall resection. Patients with larger tumors are also at increased risk of PA injury, higher EBL, and longer operative times, but at lower risk of pneumothorax requiring intervention after chest tube removal. However, tumor size does not affect chest tube duration, hospital LOS, or in-hospital mortality.

43.16 Weekend Discharge and Readmission Rates After Cardiac Surgery

G. Ramos1, R. Kashani1, Y. Juo1, A. Lin2, N. Satou1, R. J. Shemin1, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles, Division Of Cardiac Surgery, Los Angeles, CA, USA 2David Geffen School Of Medicine, University Of California At Los Angeles, Division of General Surgery, Los Angeles, CA, USA

Introduction: Unintended rehospitalization within 30 days serves as a quality metric for institutions and may lead to financial penalties. Few studies have examined the implications of weekend discharge on readmission rates. Limited care coordination and cross-coverage of surgeons are known challenges of weekend hospital function and may lead to a less comprehensive post-discharge care plan. Based on this knowledge, we hypothesized that patients discharged on weekends would be more likely to be readmitted.

Methods: Using the institutional Society of Thoracic Surgeons (STS) database, all adult patients (>18 years) undergoing cardiac surgery between 2008 and 2015 were identified. Forty-four demographic and perioperative characteristics were collected and accounted for in a multivariate model. Emergency, transplant, and mechanical-assist patients were excluded. Weekday discharge was defined as discharge on Monday through Friday, while weekend discharge was limited to Saturdays and Sundays. The primary outcome variable was any readmission within 30 days of discharge.
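As a minimal illustration of the unadjusted weekday-versus-weekend comparison, the sketch below runs a 2x2 chi-square test; the cell counts are reconstructed from the rounded rates in the Results and are illustrative only.

```python
# Unadjusted weekend-vs-weekday readmission comparison as a 2x2
# chi-square test (cell counts reconstructed from rounded rates;
# illustrative only).
from scipy.stats import chi2_contingency

#          [readmitted, not readmitted]
weekday = [363, 3269]   # ~10% of 3,632 weekday discharges
weekend = [86, 697]     # ~11% of 783 weekend discharges

chi2, p, dof, expected = chi2_contingency([weekday, weekend])
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```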

Results: Of the 4,416 patients included in the study, 3,632 (82%) were discharged on a weekday and 783 (18%) on a weekend; 495 (11%) patients were readmitted within 30 days. Readmission rates for the weekday and weekend cohorts were similar (10% vs. 11%, p=0.4). After adjustment for other risk factors for readmission, there was no association between discharge day of the week and readmission (adjusted odds ratio [AOR]=1.08, 95% CI=0.83-1.42, p=0.6). Significant risk factors for readmission included receiving dialysis (AOR=1.6, 95% CI=1.09-2.30, p=0.016), undergoing an urgent operation (AOR=1.24, 95% CI=1.01-1.53, p=0.04), and taking preoperative Coumadin (AOR=1.43, 95% CI=1.04-1.97, p=0.03) or beta-blockers (AOR=1.32, 95% CI=1.07-1.63, p=0.01).

Conclusion: In this study, weekend discharge status was not associated with an increased risk of readmission after cardiac surgery. However, patients having an urgent surgery, receiving dialysis, or taking preoperative Coumadin or beta-blockers were more likely to be readmitted. Our findings suggest that readmission reduction programs should focus on patient factors rather than providing additional weekend coverage beyond existing care coordination resources. 

 

43.15 Impact of Prior Myocardial Infarction on Myocardial Recovery after Transcatheter Aortic Valve Replacement

A. Mantha2, N. Asthana3, G. Vorobiof3, P. Benharash1  1University Of California – Los Angeles, Cardiac Surgery, Los Angeles, CA, USA 2University Of California – Irvine, Orange, CA, USA 3University Of California – Los Angeles, Cardiology, Los Angeles, CA, USA

Introduction: Transcatheter aortic valve replacement (TAVR) is a definitive, minimally invasive treatment for patients with severe aortic stenosis (AS) and has been shown to improve myocardial remodeling. However, it is unclear whether these changes occur in patients with a history of myocardial infarction (MI). This study sought to evaluate the impact of TAVR in patients who had previously suffered a myocardial infarction and undergone PCI or CABG.

Methods: A review of our prospectively maintained institutional Society of Thoracic Surgeons database and Transcatheter Valve Registry was performed to identify all patients undergoing TAVR from January 2013 to March 2016. Chi-square tests and regression were used to evaluate differences in patient demographics, readmission rate, and length of stay. Repeated measures analysis of variance was used to compare myocardial strain and function among the forty-two patients with speckle-tracking data through one-month follow-up.

Results: Of the 172 patients included in the analysis, 62 (36%) had previously experienced a myocardial infarction. Of these, 22 (33%) underwent PCI alone, 18 (27%) underwent CABG alone, and 9 (13%) underwent both PCI and CABG. Patients who underwent PCI alone had a significantly longer length of stay after TAVR (7.2 vs 4.4 days, p<0.001) and a higher proportion readmitted within 30 days (37%, p<0.01). Patients with a history of MI had consistently lower magnitudes of strain in the septal (p<0.01), anterior (p<0.02), lateral (p<0.01), and inferior (p<0.01) ventricular walls despite similar ventricular diameter (p=0.77) and septal thickness (p=0.82). Both cohorts demonstrated significant improvement in global longitudinal strain (-19.0 vs -16.0 in controls; -17.1 vs -14.1 in MI patients; p<0.01), and the interaction between history of MI and GLS was not significant.

Conclusion: Patients with a history of myocardial infarction benefit similarly from TAVR as patients with no history of MI, despite poorer pre-operative ventricular contractility. However, the management strategy of the infarction episode may have a differential impact on tissue ischemia, leading to increased length of stay and risk of readmission after TAVR.
 

43.14 Thoracoscopic Lobectomy Reliability for NSCLC is an Important Indicator of Program Development

M. Hennon1,2, J. Xiao2, M. Huang1, E. Dexter1,2, A. Picone1,2, S. Yendamuri1,2, C. Nwogu1,2, W. Tan3, T. Demmy5  1Roswell Park Cancer Institute, Thoracic Surgery, Buffalo, NY, USA 2State University Of New York At Buffalo, Surgery, Buffalo, NY, USA 3Roswell Park Cancer Institute, Biostatistics, Buffalo, NY, USA 5Rutgers Cancer Institute Of New Jersey, Cardiothoracic Surgery, New Brunswick, NJ, USA

Introduction: Outcomes for thoracoscopic (VATS) lobectomy at the institutional level can be affected by numerous variables, including selection bias. The total percentage of cases completed by VATS for locally advanced non-small cell lung carcinoma (NSCLC) may be an important component of individual program quality.

Methods: Over 11 years (January 2002 to March 2013), 1,289 consecutive lobectomies were performed, of which 300 were for patients with locally advanced NSCLC (tumors greater than 4cm, T3, T4, or patients who underwent induction chemotherapy). Patients requiring chest wall resection, sleeve lobectomy, or pneumonectomy were excluded. Cases were divided into three sequential groups of 100 patients for comparison. Reliability was defined as the total number of cases completed thoracoscopically (VATS) divided by all cases (VATS + conversion + open). Conversion rates and the percentage of cases completed by VATS, along with preoperative, perioperative, and outcome variables, were compared and analyzed by Mann-Whitney-Wilcoxon and Fisher's exact tests. Overall survival and disease-free survival distributions were estimated using the Kaplan-Meier method.

Results: Of 300 cases during the study period, 219 were completed by VATS. VATS reliability increased from 62% (early) to 77% (middle) and 80% (late). Reliability increased due to a steady decrease in planned thoracotomy, from 17% to 9% to 2.1%, respectively. A higher percentage of patients in the late group had preoperative comorbidities (CAD/MI 27% vs. 19% vs. 42.6%, p=0.0016). Median operative time increased over the study period, from 225 min [96-574] to 328 min [115-687] to 340 min [140-810], presumably due to approaching more complex tumor pathology. Median operative blood loss was the same for all groups at 200 mL (10-2200). Median postoperative ICU stay was 1 day (0-92) for all groups. Higher neoadjuvant therapy rates (16% vs. 54% vs. 50%, p<0.0001) were achieved in the middle and late groups. Fewer postoperative complications occurred in the middle and late groups (any major complication: 38% vs. 13% vs. 16%, p<0.0001; bleeding: 23% vs. 4% vs. 6%, p<0.0001; air leak: 16% vs. 13% vs. 3%, p=0.0037). The number of lymph nodes harvested during surgery (10.2 vs. 12.5 vs. 22.8, p<0.0001) improved significantly.

Conclusion: In our experience, VATS reliability increased over time with favorable perioperative and postoperative outcomes due to fewer cases being approached by planned thoracotomy. Since there were associations with factors like lymph node harvest, VATS reliability deserves additional study as an indicator of individual program achievement and as a tool to explain differences between VATS and open surgeries reported in large, cooperative databases.

 

43.13 The Use of Peri-operative Ketorolac in Surgical Treatment of Pediatric Spontaneous Pneumothorax

R. M. Dorman1,2, G. Ventro1,2, S. Cairo1, K. Vali1,2, D. H. Rothstein1,2  1Women And Children’s Hospital Of Buffalo, Pediatric Surgery, Buffalo, NY, USA 2State University Of New York At Buffalo, Pediatric Surgery, Buffalo, NY, USA

Introduction:
The effect of post-operative anti-inflammatory medications on pleurodesis success after treatment of spontaneous pneumothorax is uncertain. We sought to determine whether the use of post-operative ketorolac is associated with an increased risk of recurrence after surgical treatment of primary spontaneous pneumothorax in children.

Methods:
The Pediatric Health Information System database was queried for all patients aged 10-16 years discharged between 2004 and 2014 with a primary diagnosis of pneumothorax or pleural bleb and a thoracotomy, thoracoscopy, or lung resection procedure. Deaths, encounters representing readmission after previous operative treatment of pneumothorax in the prior year, patients requiring extracorporeal life support, and patients with diagnoses or concurrent procedures that may lead to secondary or iatrogenic pneumothorax were excluded. Variables included basic demographics, discharge in the first or second half of the study period, chronic renal or hematologic disease, intensive care unit admission or post-operative mechanical ventilation, and whether a lung resection or plication was coded. The primary predictor of interest was ketorolac administration any time from post-operative day 0 to 5. The primary outcomes of interest were thoracentesis, thoracostomy, thoracotomy, thoracoscopy, lung resection or plication, or pleurodesis within 1 year of the index admission. Bivariate analyses were carried out for all outcomes, and multivariate logistic regression analyses were then performed for reintervention and readmission.

Results:
A total of 1,678 records met inclusion criteria. Three hundred ninety-five (23%) were subsequently excluded (227 readmissions and the remainder for one of the above-listed criteria), leaving 1,283 patients for analysis. The cohort was predominantly male (79%), white (74%), and older (mean age 15.5 ± 1.2 years). Most patients had some lung resection recorded (78%), a majority were administered ketorolac (57%), and few required reintervention (20%) or readmission (18%). Mean postoperative length of stay was 5.2 ± 3.8 days, and mean cost was $17,649 ± $10,599. Older patients and those treated in the earlier years of the study were more likely to receive ketorolac. There was significant variation in the frequency of ketorolac administration by geographic region, ranging from 32% to 68%. On multivariate analysis, no variable was predictive of reintervention, and only lung resection correlated with readmission (adjusted odds ratio 0.63 [95% CI 0.45-0.90]).

Conclusion:
Post-operative ketorolac administration was not associated with an increased likelihood of reintervention or readmission within 1 year of operative treatment of primary spontaneous pneumothorax, suggesting that it may be used safely as part of a post-operative pain control regimen. Effects on postoperative length of stay and cost, however, were not demonstrated.
 

43.12 Acute Retrograde Type A Aortic Dissection: Morphological Analysis and Clinical Implications

B. L. Rademacher1, P. D. DiMusto2, J. L. Philip1, C. B. Goodavish3, N. C. De Oliveira3, P. C. Tang3  1University Of Wisconsin, Department Of Surgery, Division Of General Surgery, Madison, WISCONSIN, USA 2University Of Wisconsin, Department Of Surgery, Division Of Vascular Surgery, Madison, WISCONSIN, USA 3University Of Wisconsin, Department Of Surgery, Division Of Cardiothoracic Surgery, Madison, WISCONSIN, USA

Introduction: Numerous studies have described thoracic stent graft-induced retrograde type A dissections (rTAD); however, much less is known about acute spontaneous rTAD with tears originating distal to the left subclavian artery without prior aortic instrumentation. This study compares the morphology of acute rTAD with that of acute antegrade type A dissection (aTAD) with primary tears in the ascending aorta and of acute type B dissection.

Methods: From 2000 to 2016, there were 12 acute rTAD, 96 aTAD, and 92 acute type B dissections with available imaging that underwent operative intervention at our institution. Dissection morphology along the length of the aorta was characterized using 3-dimensional reconstruction of computed tomography angiography (CTA) images. We examined primary and secondary tear characteristics, true lumen area as a fraction of total lumen area, and false lumen contrast intensity as a fraction of true lumen contrast intensity. Features of presentation and operative parameters were compared between rTAD and aTAD.
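A minimal sketch of the two image-derived metrics described above (true-lumen area fraction and false/true lumen contrast-intensity ratio), computed from hypothetical segmentation masks on a single CTA slice; the masks, intensities, and geometry are all illustrative.

```python
# True-lumen area fraction and false/true lumen contrast-intensity
# ratio on one CTA slice (hypothetical masks and intensities).
import numpy as np

rng = np.random.default_rng(9)
slice_hu = rng.normal(300, 40, (256, 256))      # contrast-enhanced HU
true_mask = np.zeros((256, 256), dtype=bool)
false_mask = np.zeros((256, 256), dtype=bool)
true_mask[100:140, 100:140] = True              # segmented true lumen
false_mask[100:140, 150:200] = True             # segmented false lumen
slice_hu[false_mask] *= 0.5                     # sluggish false-lumen flow

tl_fraction = true_mask.sum() / (true_mask.sum() + false_mask.sum())
fl_tl_ratio = slice_hu[false_mask].mean() / slice_hu[true_mask].mean()
print(f"true-lumen fraction = {tl_fraction:.2f}, "
      f"FL/TL intensity ratio = {fl_tl_ratio:.2f}")
```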

Results: Compared with acute type B dissections, primary rTAD tears were more common in the distal arch (75% vs 43%, P=0.04), and the false-to-true lumen contrast intensity ratio at the mid-descending thoracic aortic level was lower (0.46 vs 0.71, P=0.02), indicating more sluggish blood flow or thrombosis in the false lumen. rTAD cases had less decompression of the false lumen than acute type B dissections, such that fewer aortic branch vessels distal to the subclavian were perfused either exclusively through the false lumen or through both the false and true lumens (0.40 vs 2.19, P<0.001). Compared with aTAD, rTAD showed a tendency toward less root involvement, with a higher true lumen fraction of total lumen area at the root level (0.88 vs 0.76, P=0.081). rTAD also had a lower false-to-true lumen contrast intensity ratio than aTAD at the root (0.25 vs 0.57, P<0.05), ascending aorta (0.25 vs 0.72, P<0.001), and proximal arch (0.39 vs 0.67, P<0.05), indicating more sluggish flow or a greater tendency to thrombose. rTAD patients were more likely to undergo aortic valve resuspension (100% vs 74%, P=0.044) than aortic valve replacement and tended to have lower aortic cross-clamp times (83 vs 108 min, P=0.066) (Table 1).

Conclusion: This study suggests that retrograde propagation of the false lumen to the arch and ascending aorta tends to occur when primary tears distal to the left subclavian artery are in close proximity to the aortic arch and when false lumen decompression through the distal aortic branches is less effective. Compared to aTAD, rTAD tends to have less root involvement, and successful aortic valve resuspension is more likely.