68.07 Ratio Based Resuscitation in Isolated Traumatic Brain Injury: Is the Effect the Same?

T. Orouji Jokar1, B. Zangbar1, P. Rhee1, N. Kulvatunyou1, M. Khalil1, T. O’Keeffe1, A. Tang1, R. Latifi1, D. J. Green1, R. S. Friese1, B. Joseph1  1University Of Arizona,Trauma/Surgery/Medicine,Tucson, AZ, USA

Introduction:  

The use of 1:1:1 (packed red blood cells (PRBC): fresh frozen plasma (FFP): platelets) transfusion ratios has been shown to improve survival in severely injured trauma patients. The aim of this study was to assess outcomes in patients with traumatic brain injury (TBI) receiving 1:1:1 ratio based blood product transfusion (RBT). We hypothesized that RBT improves survival in patients with TBI.

Methods:  

We performed a 3-year retrospective analysis of all patients with an isolated TBI and an intracranial hemorrhage (ICH) presenting to our Level 1 trauma center. Patients receiving blood transfusion were included and were stratified into two groups: those who received RBT and those who did not (No-RBT). The outcome measure was survival. Multivariate logistic regression analysis was performed.

Results:

A total of 189 patients were included, of which 29% (n=55) received RBT. The mean age was 46.7±24.1 years, the median Glasgow Coma Scale (GCS) score was 10 [3-15], and the median head Abbreviated Injury Scale (h-AIS) score was 3 [3-5]. The overall survival rate was 71% (n=135). Patients in the RBT group had a higher survival rate compared to patients in the No-RBT group (78.2% vs. 68.6%, p=0.03).

Conclusion:

The survival benefit of ratio based transfusion exists even in patients with isolated TBI. Guidelines for the initial management of TBI patients should focus on the use of RBT. The beneficial effects of platelets in RBT among TBI patients require further evaluation. 

 

68.08 Adding “Insult” upon “Insult” to Injury: Double Contrast Exposure and Acute Kidney Injury in Trauma

M. Polcz2, C. Orbay2, V. Polcz1, L. Podolsky2, M. Bukur1, I. Puente1, C. Prays2, F. Habib1  1Broward Health Medical Center,Trauma,Ft Lauderdale, FL, USA 2Florida International University,Surgery,Miami, FL, USA

Introduction:

Existing studies do not demonstrate an increased incidence of acute kidney injury (AKI) in trauma patients after computed tomography (CT) with IV contrast. In a subset of patients, however, the contrast load is compounded by the need for angiography. The incidence and consequences of AKI in this cohort are unknown and form the objective of this study.

Methods:
Records of patients treated at our urban Level I trauma center over a 7-year period (2007-2013) who received both CT with IV contrast and angiography as part of their trauma workup were reviewed. Deaths within the first 24 hours of arrival were excluded. AKI was defined as an increase in serum creatinine of ≥0.3 mg/dL or ≥50% over baseline. Creatinine values were recorded between the two studies as well as 24 and 48 hours after the second study. Univariate analysis was performed using the Student's t test, chi-square test, and Fisher's exact test as appropriate. Logistic regression was performed. A p value of <0.05 was deemed significant.
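The study's AKI threshold can be expressed as a small predicate. This is an illustrative sketch of the stated criterion (a rise of ≥0.3 mg/dL or ≥50% over baseline), not the authors' code:

```python
def meets_aki_criteria(baseline_cr, followup_cr):
    """AKI per the study definition: serum creatinine rise of >=0.3 mg/dL
    or >=50% over baseline (creatinine values in mg/dL)."""
    absolute_rise = followup_cr - baseline_cr
    relative_rise = absolute_rise / baseline_cr
    return absolute_rise >= 0.3 or relative_rise >= 0.5

# A baseline of 1.0 mg/dL rising to 1.3 mg/dL meets the absolute criterion;
# a baseline of 0.5 mg/dL rising to 0.8 mg/dL meets the relative one.
```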

Results:
Of 119 patients included, criteria for AKI were met in 29 (25.4%). These patients were more likely to be male (p=0.01), older (p=0.02), and more severely injured (p=0.049), and to have higher baseline creatinine values (p=0.01). Race, mechanism of injury, hypotension on arrival, Charlson Comorbidity Index, and order of study performance were not significant (all p>0.05). Patients who developed AKI were more likely to die (p=0.02), to require dialysis (p=0.001), and to have longer ICU (p=0.002) and hospital (p=0.001) stays. On logistic regression, age and receiving an angiogram first were predictive of AKI.

Conclusion: Exposure to multiple contrast loads results in a high incidence of AKI among trauma patients and is associated with higher mortality, the need for dialysis, and longer ICU and hospital stays. Further prospective studies are clearly warranted.

68.09 Epidemiology of Post-Operative Respiratory Failure (PORF) in the United States During 2000-2011

G. Mitchon1, A. Seifi1  1University Of Texas Health Science Center At San Antonio,Neuro Critical Care- Neurosurgery,San Antonio, TX, USA

Introduction:

Post-Operative Respiratory Failure (PORF) is a common complication leading to increased morbidity, mortality, hospital cost, and length of stay. The purpose of this study was to determine the epidemiology and socioeconomic correlates of PORF as a Patient Safety Indicator (PSI) in the United States, based on a robust national database.

Methods:

This is a retrospective cohort study of hospital inpatient discharges that developed PORF during 2000-2011, identified through the Nationwide Inpatient Sample (NIS). Z-tests were performed using a modified version 4.4 of the PSI software and the NIS.

Results:
The incidence of PORF remained steady at 7-9 per 1000 elective surgeries each year. Rates of PORF were significantly higher in patients >65 years old (age 45-64: 6.26/1000 vs age 65+: 10.332/1000, P<0.001) and higher still in patients >85 years old (P<0.001). Males had a nearly 50% higher rate of PORF compared to females (P<0.001). In contrast to age, higher income and insurance coverage were negatively correlated with PORF: people in the lowest income quartile had significantly higher rates of PORF than all other income quartiles (P<0.001), and people with private insurance had significantly lower rates than the uninsured and Medicare/Medicaid recipients (P<0.001). Hospitals in large metropolitan areas had significantly higher rates of PORF than those in smaller municipalities (P<0.001).
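The rate comparisons above rest on two-proportion z-tests. A minimal sketch of the pooled z statistic follows; the event counts in the example are hypothetical (the NIS denominators are not given in the abstract), chosen only to match the reported rates of roughly 6.26 vs 10.33 per 1000:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for comparing event rates
    (e.g., PORF per elective surgery in two age bands)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: 626 events per 100,000 vs 1,033 per 100,000 discharges.
z = two_proportion_z(626, 100_000, 1_033, 100_000)
```

With counts this large, |z| far exceeds the usual 1.96 cutoff, consistent with the reported P<0.001.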

Conclusion:
PORF is a common problem, affecting 7-9 per 1000 patients undergoing elective surgery. The risk is even greater for elderly, male, and low-income patients. Further studies are needed to develop and validate objective grading criteria to prospectively identify patients at high risk.
 

68.10 Trauma-Induced Coagulopathy: Stepwise Association Between Platelet Count and Mortality

C. E. Nembhard1, J. Hwabejire1, E. Cornwell1, W. Greene1  1Howard University College Of Medicine,General Surgery,Washington, DC, USA

Introduction:
Trauma-induced coagulopathy causes uncontrollable hemorrhage and is associated with high mortality. Blood component replacement is the main treatment. However, the relationship between platelet level and outcomes is not clearly defined. This study investigates the risk factors and predictors of mortality in trauma-induced coagulopathy.

Methods:
The Glue Grant database was examined, including adults ≥18 yrs who sustained blunt traumatic hemorrhagic shock. Patients were divided into two groups on presentation: trauma-induced coagulopathy [Coag; defined as an international normalized ratio (INR) >1.5 in the emergency room (ER) in the absence of pre-existing coagulopathy or current anticoagulation medication] and no coagulopathy (NoCoag). The groups were compared using univariate analysis, and multivariate analysis identified the predictors of trauma-induced coagulopathy and mortality.
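The grouping rule can be sketched directly from the stated definition. How patients on anticoagulation or with pre-existing coagulopathy were ultimately handled (assigned to NoCoag vs. excluded) is not spelled out in the abstract, so treating them as NoCoag here is our assumption:

```python
def coagulopathy_group(er_inr, preexisting_coag=False, on_anticoagulation=False):
    """Study grouping: 'Coag' requires ER INR > 1.5 with neither pre-existing
    coagulopathy nor current anticoagulation. Assigning all other patients to
    'NoCoag' is an assumption made for illustration."""
    if er_inr > 1.5 and not (preexisting_coag or on_anticoagulation):
        return "Coag"
    return "NoCoag"
```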

Results:
1804 patients met the inclusion criteria; 66.2% were male and 21.9% (n=395) had trauma-induced coagulopathy. Significant differences between the Coag and NoCoag groups included: age, yrs (40±20 vs. 44±18, p<0.001); Injury Severity Score, ISS (37±14 vs. 31±13, p<0.001); pre-hospital blood transfusion, mL (344±1076 vs. 185±551, p<0.001); pre-hospital crystalloid, mL (3217±2616 vs. 1844±1959); ER systolic blood pressure (109±34 vs. 112±30, p=0.038); ER lactate (5.7±3.5 vs. 4.2±2.5, p<0.001); ER initial hemoglobin (9.6±2.8 vs. 11.9±2.3, p<0.001); resuscitation fresh frozen plasma (1454±1373 vs. 525±924, p<0.001); platelets (230±341 vs. 104±320); total blood transfused in 12 hours (4097±4033 vs. 2281±2759, p<0.001); ICU days (14±12 vs. 13±12); acute respiratory distress syndrome, ARDS (31.9% vs. 22.6%, p<0.001); and pulmonary embolism (5.8% vs. 3.2%, p=0.015). Predictors of Coag included ISS (OR 1.021, CI 1.009-1.032, p<0.001) and ER lactate (OR 1.12, CI 1.060-1.184, p<0.001). Overall mortality was 15.5%: 27.3% in the Coag group and 12% in the NoCoag group. Predictors of mortality in Coag patients were: maximum lactate in 12-24 hours (OR 1.32, CI 1.05-1.66, p=0.017), ARDS (OR 5.63, CI 2.09-15.20, p<0.001), and cardiac arrest (OR 84.7, CI 13.46-532.95, p<0.001). Protective against mortality were: platelet count (OR 0.97, CI 0.95-0.99, p=0.002) and ICU days (OR 0.88, CI 0.83-0.94, p<0.001). Table 1 shows the relationship between platelet count and mortality.
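The odds ratios and confidence intervals reported here come from logistic regression. As a generic illustration (not the authors' computation), a regression coefficient and its standard error convert to an OR with 95% CI as follows; the inputs in the test are arbitrary:

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient and its
    standard error: OR = exp(beta), CI bounds = exp(beta -/+ z * se)."""
    return (round(math.exp(beta), 3),
            round(math.exp(beta - z * se), 3),
            round(math.exp(beta + z * se), 3))
```

An OR below 1 with a CI excluding 1 (as for platelet count, OR 0.97, CI 0.95-0.99) indicates a protective association.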

Conclusion:

In patients with coagulopathy due to blunt trauma, mortality is high. Platelet count is inversely related to mortality. Mortality can be decreased to less than that of patients without coagulopathy if the platelet count is kept above 150.
 

68.11 Concomitant Injuries In Patients With Devastating Head Injuries Do Not Preclude Organ Donation

J. A. Marks1, J. Hatchimonji1, P. Kim1, D. N. Holena1, J. Pascual1, P. M. Reilly1, N. D. Martin1  1University Of Pennsylvania,Division Of Traumatology, Surgical Critical Care And Emergency Surgery,Philadelphia, PA, USA

Introduction:
Organ availability remains a major limitation to organ transplantation. Patients with devastating head injuries are a frequent source of transplantable organs, but when concomitant injuries exist, reluctance to pursue aggressive care may develop for fear that organ donation will not be realized. We aimed to evaluate this cohort and determine the efficacy of major, non-neurologic operations prior to organ donation.

Methods:
All severely head-injured adult trauma patients (AIS Head ≥5) from January 2004 to December 2013 were reviewed retrospectively. Demographics, mortality, and organ donation were recorded. Major non-neurologic operations performed prior to organ donation were also abstracted. Quantitative and qualitative assessments of this cohort were performed.

Results:
During the study period, 1194 patients with AIS Head ≥5 were identified. Of those, 465 patients died, 79 of whom successfully donated organs. Forty of the 79 donors had significant injuries, defined as AIS >2 in an alternate body region. Of these, eight patients underwent a total of 11 major operative interventions, including non-resuscitative thoracotomy, laparotomy with gastrorrhaphy, enterectomy, colectomy, splenectomy, liver packing, hepatorrhaphy, and cystorrhaphy. Of note, although nine resuscitative thoracotomies were performed among the 465 deaths, none resulted in organ donation despite seven patients surviving to ICU admission.

Conclusion:
In patients with devastating head injuries, major concomitant injury requiring operation does not preclude organ donation.  Aggressive non-neurologic operative care can preserve the option for donation in select cases.  
 

68.12 The Sixties: The Intersection Of Modifiable Risks And Physical Vulnerability In Unintentional Falls

S. R. Allen1, R. Cheney1, J. Ellis1, D. Holena1, J. Pascual1, N. Martin1, P. Kim1, P. Reilly1  1Hospital Of The University Of Pennsylvania,Trauma, Surgical Critical Care And Emergency Surgery,Philadelphia, PA, USA

Introduction: Falls are an important source of injury and mortality in the U.S. The impact of falls, especially in the elderly, underscores the need for targeted fall prevention. Injury research and prevention strategies addressing alcohol use and motor vehicle crashes are well established; however, a paucity of data exists on alcohol as a risk factor for falls in the elderly. We hypothesized that there is an intersection of physical vulnerability (pre-existing co-morbidities (PECs)) and external risks (alcohol). We set out to identify the intersection of these factors across adult age groups and evaluate the clinical implications (morbidity and mortality) to identify opportunities for focused interventions.

 

Methods: The Trauma Registry at an urban Level 1 trauma center was queried for fall-related injuries in patients 18 and older from July 2008 to June 2013. Patients were grouped into young (18-49 years) and then by decade thereafter. Data included demographics, use of alcohol and blood alcohol concentration (BAC), number of PECs, ISS, lengths of stay, complications, and mortality. P<0.05 was considered significant.

Results: 4130 patients were seen for falls during this period; 71.5% were tested for ETOH. Older patients had higher ISS than younger patients. More patients aged 60-69 (n=385) had >1 PEC than the young group (n=1031). A similar proportion of those aged 60-69 used alcohol at the time of the fall as the young, with similar BAC. Twice as many patients in the 60-69 group suffered complications as the young. Mortality was higher in the 60-69 group, and the increase in mortality was amplified among those in their 60s who suffered a complication compared to the young (Table 1).

Conclusion: Sexagenarians who fall appear to use alcohol to a similar extent as young adults but have significantly more PECs. The intersection of significant alcohol consumption and physiologic vulnerability is marked by more serious injuries, more complications, and higher mortality. Alcohol may be a modifiable risk factor in this overlooked age group. Interventions focused on alcohol use may help to reduce falls among these older adults.

 

68.13 Risk Factors for Pneumonia after Major Abdominal Surgery

C. K. Yang1, A. Teng1, D. Y. Lee1, K. Rose1  1Mount Sinai St. Luke’s Roosevelt Hospital Center,New York, NY, USA

Introduction:  Pneumonia after major abdominal surgery (MAS) is common and carries a high potential for increased morbidity and mortality. Additionally, it represents a major burden to the health care system. This study was conducted to define the risk factors associated with pneumonia after MAS.

Methods:  The American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) database from 2005 to 2012 was queried for patients who underwent MAS using Current Procedural Terminology (CPT) codes. These operations included elective operations performed on the esophagus, stomach, small intestine, large intestine, pancreas, and liver. Patients who developed pneumonia after MAS were compared to those who did not. Appropriate statistical tests were used to compare the preoperative characteristics of the two groups. A logistic regression analysis was performed to determine predictors of post-MAS pneumonia.

Results: 165,196 patients who underwent MAS were identified. The overall rate of pneumonia (PNA) in this cohort was 3.2%. The rate of pneumonia in the esophageal group was the highest at 16.2%, more than double that of the second-highest group, stomach, at 6.4%. Rates of pneumonia for the pancreatic, small intestine, liver, and large intestine procedure groups were 4.8%, 4.2%, 3.3%, and 2.4%, respectively. The median time to the diagnosis of pneumonia was 5 days for all operation types except pancreas, which was 6 days (Table 1). Patients who developed PNA were significantly older, had higher ASA class, and had more co-morbidities compared to those who did not. On multivariate analysis, esophageal surgery was associated with the highest risk of developing post-MAS pneumonia (OR 5.2, 95% CI 4.6-5.9, p<0.0001), followed by ASA class VI (OR 4.7, 95% CI 3.2-6.8, p<0.0001). Other factors independently associated with the occurrence of PNA included advanced age, male sex, transfer from an inpatient or chronic care facility, history of dyspnea, COPD, smoking status, wound classification, and prolonged operative time. Of note, BMI >21 appeared to be associated with less post-MAS PNA (OR 0.7, 95% CI 0.7-0.8, p<0.0001).

Conclusion: Pneumonia following abdominal procedures is associated with a number of variables. Esophageal operations and ASA class were the strongest predictors. These results provide a framework for identifying patients at risk for developing pneumonia post-MAS. 

68.14 Nutritional Support and the Obesity Paradox in Necrotizing Soft Tissue Infection Patients

N. Goel1, E. Lin1, V. Patel1, R. Askari1  1Brigham And Women’s Hospital,Trauma, Burn, And Surgical Critical Care,Boston, MA, USA

Introduction: There are few studies describing outcomes in patients with necrotizing soft tissue infection (NSTI) based on the type of nutritional support they received. It is also unknown whether the obesity paradox, a phenomenon in which obesity exhibits a protective effect on mortality in patients with severe sepsis, exists in the NSTI population. We sought to examine the association of body mass index (BMI) with outcomes in patients with NSTI, while also trying to identify the best route of nutrition for these patients.

Methods: A retrospective cohort study of all NSTI patients from 1995 to 2011 was performed. Using the hospital nutrition database, patients were categorized by type of nutritional intake: artificial (tube feeds and total parenteral nutrition (TPN)) or non-artificial. Each population was further subdivided by baseline BMI (weight in kilograms divided by the square of height in meters) as underweight (BMI <18.5), normal (BMI 18.5-24.9), overweight (BMI 25.0-29.9), or obese (BMI ≥30.0), according to WHO guidelines. This database also provided the percentage of goal caloric and protein needs achieved for each NSTI patient on artificial nutrition. Pre- and post-nutrition CRP and prealbumin were also measured.
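The WHO strata used for subdivision can be written as a small function; this is a minimal sketch of the stated cutoffs:

```python
def bmi_category(weight_kg, height_m):
    """WHO classes used in the study: underweight <18.5, normal 18.5-24.9,
    overweight 25.0-29.9, obese >=30.0, with BMI = kg / m^2."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"
```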

Results: In total, 148 patients with NSTI from 1995-2011 were identified. Forty-six patients were tube fed, 13 patients received TPN, and 89 patients received non-artificial nutrition. Mortality in patients receiving TPN was 15.4%, mortality in tube fed patients was 15.2%, and mortality in patients receiving non-artificial nutrition was 3.4%. Using the WHO guidelines for BMI, 29 patients were categorized as obese, 17 overweight, 9 normal, and 3 underweight in the artificially fed population. Within the non-artificially fed population, there were 37 obese patients, 26 overweight, 25 normal, and 1 underweight patient. Mortality in the obese population was the lowest at 0% in the non-artificially fed population. Mortality in the obese population was also the lowest at 10.3% in the artificially fed population. The artificially fed obese population, on average, achieved suboptimal percentages of their goal caloric and protein intake at 63% and 10%, respectively. This compares to 58% and 124% in the overweight group, 76% and 84% in the normal group, and 94% and 85% in the underweight group, of their goal caloric and protein intake, respectively. In all groups, CRP decreased with nutrition. Prealbumin also improved in all groups on discharge, except in the obese population.

Conclusion: Though patients receiving artificial nutrition through tube feeds and TPN experience higher rates of mortality compared to patients receiving non-artificial nutrition, obese patients may fare better than normal weight patients regardless of their nutritional support. In the artificially fed population, obese patients experienced lower mortality rates despite achieving lower percentages of their goal caloric and protein needs.

68.15 Angioembolization is necessary with any volume of contrast extravasation in blunt trauma

A. Bhakta1, D. Magee1, M. Peterson1, M. S. O’Mara1  1Grant Medical Center / Ohio University,Trauma And Acute Care Surgery,Columbus, OHIO, USA

Introduction: Reduction of non-essential angiograms and embolizations for patients sustaining blunt abdominal and pelvic trauma would allow improved resource utilization and decreased morbidity related to non-therapeutic embolization. We hypothesized that the nature of contrast extravasation (CE) on CT would be directly related to the finding of extravasation on angiography and the need for embolization.

Methods: A retrospective evaluation of trauma patients with CE on CT was performed. Demographics, hemodynamics, and CE location and maximal dimension/volume were examined for their relationship to CE on angiography and the need for embolization. Primary complications were defined as nephropathy and pseudoaneurysm.

Results: 128 patients were identified with CE on CT. 64 (50.4%) also had CE identified on angiography, requiring some form of embolization. Size of CE on CT was not related to CE on angiography (p=0.69). Location of CE was related to the need for embolization, with splenic embolization (85.4%) being much more frequent than hepatic (51.5%, p=0.006). The complication rate was 8.7% in all patients and did not differ between patients who underwent embolization and those who did not (p=0.40).

Conclusion:  CE volume for patients was not predictive of continued bleeding and need for embolization. However, splenic injuries with CE required more embolization. In contrast, liver injuries were found to have infrequent ongoing CE on angiography. Complications associated with angiogram with or without embolization are infrequent, and CT findings may not be predictive of ongoing bleeding.  We do not recommend selective exclusion of patients from angiographic evaluation when a blush is present.
 

68.16 Total Hospital Blood Use has Decreased in the Era of Hemostatic Resuscitation

E. M. Campion1, L. Z. Kornblith1, E. W. Fiebig2, B. J. Redick1, R. A. Callcut1, M. J. Cohen1  2University Of California – San Francisco,Laboratory Medicine,San Francisco, CA, USA 1University Of California – San Francisco,Surgery,San Francisco, CA, USA

Introduction:

Blood product resuscitation has undergone a recent evolution driven by the concept of damage control hemostasis. This approach encourages a balanced ratio of packed red blood cells to fresh frozen plasma approaching 1:1. While the mortality benefits have been elucidated, this strategy raised concerns about overuse of blood products and the potential for straining blood bank resources. We hypothesized that adoption of damage control hemostasis would decrease overall hospital blood product usage and wastage.

Methods:

Blood use records from a major Level I Trauma Center were queried for all units of packed red blood cells (pRBCs), fresh frozen plasma (FFP), and platelets (PLT) transfused to the inpatient (medical and surgical) population from 2003-2011. Diagnosis, injury severity, demographics, and hospital case-mix data were analyzed. Linear regression was performed to determine whether a trend in blood product usage existed.

Results:

The study period (2003-2011) encompassed 132,443 patient discharges and 27,498 trauma activations; 124,093 blood units were transfused during this time. The patient population was consistent by diagnosis, ISS, and demographics (p=NS). However, pRBC use decreased significantly, by 405.4 units per year (p<0.001). FFP showed a non-significant increase of 6.48 units per year (p=0.845) and PLT a non-significant decrease of 2.13 units per year (p=0.868). Over the study period, the hospital-wide ratio of pRBC to FFP decreased from 2.9 to 2.0 (p=0.001), and total hospital blood use decreased by 401.0 units per year (p=0.002).

Conclusion:

During the period of transition to hemostatic resuscitation, total hospital blood use and pRBC use decreased significantly among medical and surgical patients. Despite worry to the contrary, adoption of hemostatic resuscitation is likely associated with decreased total hospital blood product usage.

 

68.17 Orthopaedic Injury Location, Thrombelastography, and Their Relationship to Pulmonary Embolism

J. B. Wilkerson2,5, B. Andrew1,2, W. Charles2,3, H. B. John2,3, G. Matthew1, B. Sarah2,4, F. Erin2,3, J. Tomasek2, P. Matthew2,3, J. L. Gary1  1University Of Texas Health Science Center At Houston,Department Of Orthopedics,Houston, TX, USA 2Center For Translational Injury Research,Houston, TX, USA 3University Of Texas Health Science Center At Houston,Department Of Surgery,Houston, TX, USA 4University Of Texas Health Science Center At Houston,Department Of Biostatistics,Houston, TX, USA 5University Of Texas Health Science Center At Houston,Medical School,Houston, TX, USA

Introduction:  Pulmonary embolism (PE) can be a deadly outcome for trauma patients despite aggressive surveillance and prophylaxis. Thromboelastography (TEG) has recently come into focus as a method for screening injured patients for hypercoagulability and risk for pulmonary embolism. The maximal amplitude (mA) value, representative of clot strength, is currently used to assess patient risk of PE in our trauma center. The purpose of this study is to determine whether the location of injury (upper extremity, lower extremity/pelvic, or spinal column) affects the risk of PE in orthopaedic trauma patients. It is hypothesized that patients with an isolated lower extremity/pelvic fracture will have higher mA values, predicting a higher risk for PE.

Methods:  The study included all patients admitted with an admission TEG between January 1, 2011 and July 15, 2014 at a large Level I trauma center. Subjects included patients aged 16 and older suffering a fracture from the proximal humerus to the distal radius (upper extremity), from the pelvis to the talus (lower extremity), or along the spinal column. Patients were excluded if their PE was diagnosed before arrival.

Results: 1877 musculoskeletal trauma patients were included in the study; 176 (9.4%) were diagnosed with PE. A PE developed in 1.4% of patients with an isolated upper extremity injury, 4.7% of patients with an isolated lower extremity injury, and 42.5% of patients with an isolated spinal column injury. In addition, 11.5% of patients with injuries to more than one of these locations ("poly-trauma") developed a PE. Two-tailed t tests were used to compare the average mA values between these injury groups, accounting for race, gender, and age. The mean mA and p values for the compared groups are shown in the table below.

Conclusion: There was no significant difference in mean mA values across the stratified injury groups to indicate a difference in risk for PE. The elevated mA values seen in both PE and non-PE groups could be explained by the fact that all patients in the cohort underwent musculoskeletal injury, which could contribute to patient hypercoagulability. Further study should be performed to determine if specific fractures within these injury subsets have significantly different mA values and risk for PE. The relationship of mA values to timing of PE should be investigated.

 

68.18 ASA-PS is Associated With Mortality Rate Among Adult Trauma Patients

D. Stewart1, C. Janowak1, A. Liepert1, A. O’Rourke1, H. Jung1, S. Agarwal1  1University Of Wisconsin,Surgery,Madison, WI, USA

Introduction:  American Society of Anesthesiologists-Physical Status (ASA-PS) classification assesses pre-anesthesia surgical risk. Numerous studies correlate higher ASA-PS classification with increased perioperative mortality.  As the number of comorbidities in a traumatically injured patient is correlated to mortality rate, we evaluated if ASA-PS was an indicator of mortality risk for adult trauma patients.

Methods:  Our prospectively collected and internally validated database at an academic Level I trauma center was retrospectively reviewed for adult patients for 2009-2013.  ASA-PS scores were assigned based on patient comorbidities.  Three different methods were used to reflect a lack of concordance on the consideration of patient age in establishing ASA-PS.  In all three methods, NTDB-defined comorbidities were assigned an ASA-PS value and summed for each risk level.  Patients with no comorbidities were considered PS1, while PS2 consisted of those with a single PS2 condition.  Multiple PS2 conditions were considered multi-system disease, elevating a patient’s risk to PS3.  Presence of 3+ PS3 conditions led to a PS4 classification.  We then evaluated mortality rates as a primary outcome for each ASA-PS class using receiver operating characteristic (ROC) and Pearson Chi-Square analysis.  Discharge disposition and major complications were assessed as secondary outcomes.
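The comorbidity-summing scheme described above can be sketched as a small function. The abstract does not spell out how one or two PS3 conditions were handled, so mapping them to PS3 here is our assumption:

```python
def assign_asa_ps(n_ps2, n_ps3):
    """ASA-PS from NTDB comorbidity counts per the abstract's scheme:
    no comorbidities -> PS1; a single PS2 condition -> PS2; multiple PS2
    conditions (multi-system disease) -> PS3; 3+ PS3 conditions -> PS4.
    Treating 1-2 PS3 conditions as PS3 is an assumption."""
    if n_ps3 >= 3:
        return 4
    if n_ps3 >= 1 or n_ps2 >= 2:
        return 3
    if n_ps2 == 1:
        return 2
    return 1
```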

Results: Model 1 (ASA), which considered patient age >70 as a PS2 comorbidity, yielded an ROC of 0.619 for predicting mortality.  Model 2, which did not include age as a factor in ASA-PS (ASA–w/o Age), produced an ROC of 0.615.  Model 3, Age-Modified ASA (AM-ASA), produced an ROC of 0.648 (p<0.001).  Cross-tabulation revealed mortality rates of 2.4%, 2.4%, 4%, and 13.2% for PS1, PS2, PS3, and PS4, respectively.  ASA–w/o Age (2.4%, 2.7%, 3.9%, and 13.2%) showed a similar trend, as did AM-ASA (2.4%, 1.9%, 2.9%, 10.2%), albeit with a dip in mortality rate for PS2.  All three ASA models had two-sided p<0.001 under Pearson Chi-Square analysis of mortality rates.  For discharge disposition (ASA ROC=0.668; ASA–w/o Age ROC=0.650; AM-ASA ROC=0.693) and major complications (ASA ROC=0.648; ASA–w/o Age ROC=0.653; AM-ASA ROC=0.641), all three models showed moderate predictive power.

Conclusion: ASA-PS classification models show an association between higher risk status and increasing mortality rate.  ASA-PS is moderately predictive of mortality, discharge disposition, and major complications per ROC analysis.  AM-ASA performed significantly better for mortality and discharge disposition, indicating that age can serve as an adjustment to the codified system to improve accuracy in the trauma population.

 

68.19 Hepatitis C Status does not Correlate with Worse Outcome in the Surgical ICU

M. L. Kueht1, R. A. Helmick3, S. Bebko2, S. Awad1,2  1Baylor College Of Medicine,Department Of Surgery,Houston, TX, USA 2Michael E. Debakey VA Medical Center,Department Of Surgery/Critical Care,Houston, TX, USA 3Mayo Clinic, Rochester,Department Of Surgery, Division Of Transplantation Surgery,Rochester, MN, USA

Introduction: Hepatitis C infection is thought to cause immune dysfunction through chronic inflammation and immune dysregulation. The clinical impact in the ICU of Hepatitis C Virus (HCV) status outside of active infection or cirrhosis is not well described. We aim to characterize the clinical ramifications of HCV status on patients admitted to the surgical ICU.

Methods: All patients admitted to our ICU between 2008 and 2012 were included. Demographic variables collected included age, BMI, MELD and APACHE II scores, and co-morbidities such as coronary artery disease (CAD), hypertension (HTN), hyperlipidemia (HLD), chronic kidney disease (CKD), cancer history, alcohol use, and HIV status. Outcomes evaluated were mortality in the ICU and in the hospital, ICU and hospital length of stay, and infectious complications in the ICU such as ventilator-associated pneumonia and urinary tract, surgical site, and catheter-related bloodstream infections. Comparisons were made based on HCV status, with sub-group analyses based on the presence or absence of cirrhosis. Statistical comparisons were performed with Fisher's exact tests and Student's t-tests where appropriate. Multivariate logistic regression was performed to identify potential predictors of infectious ICU complications.

Results: A total of 1672 patients were identified during the study period, of which 152 (10%) were HCV positive. The mean age of the cohort was 64.1 years and the mean APACHE II score was 13. The HCV-negative patients were significantly older (64.5 vs 60.1 yrs, p<0.01), had higher BMI (28.8 vs 26.3 kg/m2, p<0.01), and had more CAD (40.9 vs 2.1%, p<0.01), HLD (58.3 vs 23.7%, p<0.01), HTN (71.7 vs 57.9%, p<0.01), and CKD (5.9 vs 2.0%, p=0.04). In the HCV-positive patients, there was a higher incidence of cirrhosis (25.7 vs 1.4%, p<0.01), hematologic (2.6 vs 0.5%, p=0.02) and solid organ cancers (27.0 vs 17.9%, p<0.01), HIV (3.3 vs 0.6%, p<0.01), and alcohol abuse (12.5 vs 3.1%, p<0.01). APACHE II (p=0.04) and MELD (p<0.01) scores on admission were higher in HCV-positive patients. ICU length of stay was longer for HCV-positive patients (6.8 vs 5.5 days, p=0.03). Between the HCV groups overall, there were no significant differences in mortality or any of the infectious complications. However, among non-cirrhotic patients, those with HCV had significantly increased in-hospital mortality (p<0.01). Multivariate logistic regression identified admission APACHE II score as an independent predictor of SICU infectious complications.

Conclusion: Our HCV-positive cohort was younger, had lower BMI, and had less cardiac co-morbidity, yet still spent more time in the ICU. Moreover, in the absence of cirrhosis, HCV-positive patients had increased in-hospital mortality. Contrary to our hypothesis, this study demonstrates that despite theorized immunosuppression from HCV infection, HCV status was not associated with ICU infectious complications. This warrants further study.

68.20 Initial Advanced Cardiac Life Support in Trauma Patients with Cardiac Arrest

K. Konesky1, W. Guo1  1State University Of New York At Buffalo,Surgery,Buffalo, NY, USA

Introduction: Adult trauma patients who experience immediate out-of-hospital or in-hospital cardiac arrest (CA) and undergo Advanced Cardiac Life Support (ACLS) represent a unique patient population and pose difficult challenges to trauma surgeons.  Thus far little is known about the predictors and outcomes of CA in trauma patients. The objective of this study was to determine the incidence, predictors and outcomes following CA and ACLS within this population.

Methods: We retrospectively reviewed all 124 adult blunt and penetrating trauma patients who underwent ACLS after trauma over a period of 5 years (Jul 2008-Jun 2012). The ACLS occurred either in the field, en route, or in the ED of our Level I Trauma Center. Patients' demographics, clinical data, ACLS-related variables, and outcomes were extracted from the electronic and paper medical records.

Results: The median age of the group was 37 (IQR 38). The median ISS was 37 (IQR 50). While 32% of patients achieved return of spontaneous circulation (ROSC), only 8% survived with a complete neurologic recovery (CNR). The failure rate of ACLS was 92%. In blunt injury patients, the ROSC and CNR rate after ACLS was higher in motor vehicle collisions, motorcycle and pedestrian injuries combined than in falls from heights (27.3% vs 6.9%, OR 5.06, 95% CI 0.95-27.0, p<0.05). In penetrating injury, the ROSC/CNR rate after ACLS was higher in patients with injuries to the head, neck, face, and extremities than in those with injuries to the abdomen and chest (0% vs 25%, OR 0.051, 95% CI 0.0024-1.087, p<0.001). Two variables predicted failure of ACLS: a prolonged interval between injury and ED arrival (OR 0.42, 95% CI 0.22-0.80, p<0.01) and a high ISS (OR 0.97, 95% CI 0.94-1.00, p<0.05). However, initial cardiac rhythm upon CPR (see table), ACLS duration/location (out-of-hospital or in-hospital), head injury, and day/night shift in the ED were not associated with the outcome of ACLS. The percentage of penetrating trauma was lower in patients ≥65 y/o than in those <65 y/o (11.1% vs 53.6%, p<0.001), and the ROSC/CNR rate was significantly higher in patients ≥65 y/o than in those <65 y/o (18.5% vs 5.2%, p<0.05).
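The unadjusted odds ratio quoted for the blunt-mechanism comparison can be recovered directly from the two proportions. A minimal sketch in Python (the function name is ours; the figures are taken from this abstract):

```python
def odds_ratio(p1, p2):
    """Unadjusted odds ratio between two groups given their event proportions."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Blunt-injury subgroups: ROSC/CNR of 27.3% vs 6.9% for falls from heights
print(round(odds_ratio(0.273, 0.069), 2))  # → 5.07, matching the reported OR of 5.06 up to rounding
```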

Conclusion: Although survival after ACLS among trauma patients continues to be poor, advanced cardiac life support should be initiated regardless of the initial EKG rhythm. A rapid response time and immediate transport to the ED are key to survival. The better survival rate in patients ≥65 y/o is probably due to their lower rate of penetrating injury.

68.01 Impact of Employing Damage Control Laparotomy on Pulmonary Complications and Timing of Femur Repair

J. N. Steward1, B. A. Cotton1, J. B. Holcomb1, J. A. Harvin1  1University Of Texas Health Science Center At Houston,Houston, TX, USA

Introduction:  Damage control laparotomy (DCL) is a technique initially described in the 1980s to address multi-trauma patients with penetrating torso injury and worsening coagulopathy. However, its application increased dramatically over the next three decades, and only recently has its use been called into question, with data demonstrating potential morbidity and overutilization. DCL has also been shown to delay definitive repair of orthopedic injuries (especially femur fixation), with an increase in pulmonary complications. The purpose of this study was to compare pulmonary complications and delays in definitive femur fixation among patients undergoing emergent laparotomy managed by either definitive laparotomy (DEF) or DCL.

Methods:  Following IRB approval, our trauma registry was queried for (1) all adult patients >17 years of age, (2) undergoing emergent laparotomy (directly to the operating room within 1 hour of arrival), and (3) sustaining a femur fracture. Patients were then divided into a DEF group (primary fascial closure at the end of the emergent laparotomy) and a DCL group (fascia left open at the end of the initial case). The co-primary outcomes of interest were pulmonary complications and time to femur fixation (both initial and/or definitive, internal). Univariate analysis was followed by purposeful linear and logistic regression.

Results: 106 patients met study criteria; 65 in the DCL cohort and 41 in the DEF group. There were no differences in demographics or pre-hospital vitals between the groups, with the exception of lower systolic blood pressure in the DCL patients (median 96 vs. 119; p=0.001). The DCL group had higher Injury Severity Scores (median 34 vs. 26; p<0.001), more hypotension (median 87 vs. 120; p<0.001) and shock (median -8 vs. -4; p=0.002) on arrival. While initial labs and vitals demonstrated worse physiology and coagulopathy, temperature (96.8 vs. 96.1), systolic blood pressure (124 vs. 133), base value (-4 vs. -6), and prothrombin time (16.3 vs. 16.2) were similar. Time to definitive internal fixation was longer in the DCL group (71 vs. 32 hours; p=0.010). Multivariate modeling confirmed increased risk of pulmonary (odds ratio 1.90; p=0.039) and septic complications (odds ratio 1.86, p=0.048).

Conclusion: Employment of DCL in the setting of emergent laparotomy delays definitive femur fixation and is associated with increased likelihood of pulmonary and septic complications.

 

68.02 Transfer Time and Distance Do Not Impact TBI Outcomes in a Mature Rural Regional Trauma System

S. C. Gale1,2, J. Peters1, P. Detwiler1, V. Y. Dombrovskiy2  1East Texas Medical Center,Trauma Surgery,Tyler, TX, USA 2Robert Wood Johnson Medical School,Rutgers University,New Brunswick, NJ, USA

Introduction:
After injury in rural settings, evaluation at local hospitals followed by transportation to regional trauma centers may cause delays to definitive care. We sought to determine if such delays impact outcomes in patients with traumatic brain injury (TBI) within a mature rural regional trauma system.

Methods:
The East Texas Medical Center Regional Level 1 Trauma registry was queried for all patients from 2008 to 2013. Blunt TBI patients, aged ≥18 and admitted ≤24 hours from injury, were stratified as “transfer” versus “direct” admission. Demographics, transfer distance, time from injury to ETMC and outcomes (mortality, complications, length of stay (LOS)) were compared for all study patients, for patients with Injury Severity Score (ISS) ≥15, and for patients requiring neurosurgical intervention using Chi-square; logistic regression was used to identify contributors to mortality.

Results:

For the 6-year study period, 7823 patients were admitted; 1845 met inclusion criteria: 947 direct admissions and 898 transfer patients from 50 different hospitals. For transfer patients, mean travel distance was 59.3±31.7 miles; mean time to Level 1 care after injury was 4.6±2.4 hours. Transfer patients were significantly older (55 vs 49 yrs, p<0.01) and had more comorbidities, but also had lower mean ISS (15.9 vs 18.5, p<0.01) and lower mortality (7.0 vs 10.0%, p<0.03), fewer complications, and shorter LOS. Neurosurgical intervention was equivalent between groups (p=0.88). For the most severely injured patients, those with ISS ≥15, mortality was similar (12.4 vs 14.8%, p=0.28) between groups. After logistic regression analysis of all study patients, and of those with ISS ≥15, only age and ISS, not time or distance to definitive care, significantly predicted mortality.

Conclusion:
Neither transfer distance, nor transfer time, independently contributed to mortality after TBI in a rural setting utilizing staged care. An established and mature regional trauma system, with initial stabilization using ATLS principles at small, rural hospitals, is effective in reducing negative outcomes for injured patients in rural settings.

 

68.03 Lower Extremity DVT Screening is Not Associated with Improved Outcomes in Trauma Patients

Z. C. Dietch1, B. Edwards1, M. Thames2, P. Shah1, M. Williams1, R. Sawyer1  1University Of Virginia,Department Of Surgery,Charlottesville, VA, USA 2University Of Virginia,School Of Medicine,Charlottesville, VA, USA

Introduction: Institutions may perform lower extremity ultrasound (LUS) screening for deep venous thrombosis (DVT) in trauma patients because therapeutic intervention is thought to reduce the incidence of pulmonary embolism (PE) in patients with DVT. However, disparate screening practices reflect a lack of consensus regarding clinical indications for screening and whether screening improves clinical outcomes. We hypothesized that LUS screening for DVT is not associated with reduced incidence of PE.

Methods: The 2012 American College of Surgeons National Trauma Data Bank Research Data Set was queried to identify 442,108 patients who were treated at institutions that reported performing at least one LUS and at least one DVT. Institutions that performed LUS on more than 2% of the admitted population were designated as screening (SC) facilities and remaining institutions were designated as non-screening (NSC) facilities. Patient characteristics and risk factors were used to develop a logistic regression model to assess the independent associations between LUS and DVT, and between LUS and PE.
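The facility designation described above reduces to a single threshold rule. A minimal sketch, with hypothetical counts chosen only for illustration (the function name and numbers are not from the dataset):

```python
def designate_facility(lus_exams, admissions, threshold=0.02):
    """Label a facility SC if LUS was performed on more than `threshold`
    (2% by default) of the admitted population, per the study definition."""
    return "SC" if lus_exams / admissions > threshold else "NSC"

print(designate_facility(150, 5000))  # 3.0% screened -> SC
print(designate_facility(50, 5000))   # 1.0% screened -> NSC
```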

Results: Overall, DVT and PE were reported in 0.94% and 0.37% of the study population, respectively. DVT and PE were more commonly reported in SC than NSC (DVT: 1.12% vs. 0.72%, p<0.0001; PE: 0.40% vs. 0.33%, p=0.0004). Patients treated at SC facilities were more severely injured (ISS>9) (39.1% vs. 34.9%, p<0.001) and significantly more likely to have at least one of 11 injuries or treatment variables commonly associated with DVT. Multivariable logistic regression demonstrated that LUS was independently associated with DVT (OR=1.44, CI 1.34-1.53) but not PE (OR=1.01, CI 0.92-1.12) (c-statistic 0.86 and 0.85, respectively). Sensitivity analyses performed at various rates for designating SC facilities, or limiting analyses to patients with length of hospital stay ≥3 days, did not alter the significance of these relationships (Table).

Conclusion: The performance of LUS in trauma patients is more likely to identify DVT but is not associated with a change in the incidence of pulmonary embolism. These findings suggest that LUS DVT screening protocols will detect many clinically insignificant DVTs for which subsequent therapeutic intervention may be unnecessary, and the use of these protocols should be questioned. Furthermore, in the absence of evidence that LUS decreases the rate of PE in trauma patients, consideration of DVT as a quality and performance measure should be abandoned until prospective, randomized trials further define the role for high-risk screening protocols.

 

68.04 Validation of Comorbidity-Polypharmacy Score as a Predictor of Outcomes in Older Trauma Patients

R. N. Mubang1,3, J. C. Stoltzfus6, B. A. Hoey3, C. D. Stehly2,3, D. C. Evans4, C. Jones4, T. J. Papadimos5, M. S. Cohen1, J. Grell1, W. S. Hoff1,3, P. Thomas1,3, J. Cipolla1,3, S. P. Stawicki2  1St. Luke’s University Health Network,Department Of Surgery,Bethlehem, PA, USA 2St. Luke’s University Health Network,Department Of Research & Innovation,Bethlehem, PA, USA 3St. Luke’s University Health Network,Regional Level I Trauma Center,Bethlehem, PA, USA 4The Ohio State University College Of Medicine,Department Of Surgery,Columbus, OH, USA 5The Ohio State University College Of Medicine,Department Of Anesthesiology,Columbus, OH, USA 6St. Luke’s University Health Network,The Research Institute,Bethlehem, PA, USA

Introduction: Traditional injury severity assessment is insufficient in estimating the morbidity and mortality risk for older (≥45 years) trauma patients. Commonly used tools involve complex calculations or tables, do not consider comorbidities, and often rely upon data that is not available early in the trauma patient’s hospitalization. The Comorbidity-Polypharmacy Score (CPS), a simple sum of all pre-injury medications and comorbidities, has been found to independently predict morbidity and mortality in previously published reports. These studies, however, have been limited by relatively small sample sizes. We hypothesized that CPS is indeed associated with outcomes in older trauma patients and sought to validate this association using a large administrative dataset.
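Because CPS is defined as a simple sum of all pre-injury medications and comorbidities, its calculation can be sketched in a few lines; the patient data below are hypothetical:

```python
def comorbidity_polypharmacy_score(comorbidities, medications):
    """CPS: the simple sum of all pre-injury comorbidities and medications."""
    return len(comorbidities) + len(medications)

# Hypothetical older trauma patient
cps = comorbidity_polypharmacy_score(
    comorbidities=["hypertension", "CAD", "type 2 diabetes"],
    medications=["lisinopril", "aspirin", "metformin", "atorvastatin"],
)
print(cps)  # → 7
```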

Methods: After IRB approval, a retrospective study of patients aged 45 years and older was performed using the administrative database from a Level I Trauma Center. The study period was Jan 1, 2008 to Dec 31, 2013. Abstracted data included patient demographics, injury characteristics and severity, Glasgow Coma Scale (GCS), hospital (HLOS) and intensive care unit lengths of stay (ILOS), morbidity, post-discharge destination, and in-hospital mortality. Univariate analyses were conducted with mortality, all-cause morbidity, and discharge destination as primary end-points. Variables trending toward statistical significance (p<0.20) were subsequently included in multivariate logistic regression. Data are presented as adjusted odds ratios (AOR), with p<0.05 denoting statistical significance.

Results: A total of 5,839 complete patient records were analyzed. Average patient age was 68.5±15.3 years (52% male, 89% blunt mechanism), with mean GCS 14.3. Mean HLOS increased from 4.27 days for patients with CPS≤15 to 5.55 days for those with CPS≥16 (p<0.01). Likewise, ILOS increased from a mean of 0.67 days for patients with CPS≤7 to 1.33 days for those with CPS≥22 (p<0.01). Independent predictors of mortality included age (AOR 1.05, p<0.01), CPS (per-unit AOR 1.08, p<0.02), GCS ≤8 (AOR 48.16, p<0.01), and ISS (per-unit AOR 1.08, p<0.01). Independent predictors of all-cause morbidity included age (AOR 1.02, p<0.01), GCS ≤8 (AOR 5.4, p<0.01), ISS (per-unit AOR 1.09, p<0.01), and CPS (per-unit AOR 1.04, p<0.01). CPS did not independently predict need for discharge to a facility.
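The per-unit AORs above compound multiplicatively: under the usual log-linear logistic model, a k-unit increase in a predictor multiplies the adjusted odds by AOR^k. A short illustration using the CPS mortality estimate (the function is ours, not from the study):

```python
def scaled_aor(per_unit_aor, delta):
    """Odds multiplier implied by a per-unit adjusted odds ratio over a delta-unit change."""
    return per_unit_aor ** delta

# CPS per-unit AOR for mortality was 1.08; a 10-point higher CPS implies
print(round(scaled_aor(1.08, 10), 2))  # → 2.16, i.e. roughly doubled adjusted odds of death
```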

Conclusion: This study confirms that CPS is an independent predictor of morbidity and mortality in older trauma patients. However, CPS was not found to be independently associated with need for discharge to a facility in the current dataset. Prospective multicenter studies are warranted to evaluate the use of CPS as a predictive and interventional tool, with special focus on correlations between specific pre-existing conditions, potential pharmacologic interactions, and morbidity/mortality patterns.
 

68.05 CTA Grading Predicts Safe Nonoperative Management in Above-Knee Blunt Lower Extremity Vascular Injury

M. R. Noorbakhsh1, M. J. Bradley1, B. Zahoor1, S. Kyere1, K. Shanmuganathan1, D. Stein1, T. M. Scalea1  1University Of Maryland,R A Cowley Shock Trauma Center,Baltimore, MD, USA

Introduction:  The use of CT angiography for detection of extremity vascular injury has become widespread.  We sought to further characterize the vascular injuries noted on CT angiogram of the lower extremity for blunt injury with the use of a novel grading system, and to use CTA grade of vascular injury to predict safe nonoperative management.

Methods: A prospectively collected institutional database was queried for all CT angiograms for blunt lower extremity injury between March 2006 and December 2013.  Admission notes, CT angiograms, traditional angiograms, operative reports, discharge summaries, and post-discharge trauma and orthopedic clinic notes were reviewed.  Data extracted included pulse exam, presence of a threatened limb, extremity vascular injuries noted on CTA and in the operating room, and any delayed claudication.  CT examinations that revealed an injury to the common femoral, superficial femoral, profunda femoris, and/or popliteal arteries were submitted to an attending radiologist for further grading.

Results: 440 CTA examinations of the lower extremity were performed on 398 patients during the study period.  140 of these examinations revealed a vascular injury.  The sensitivity of abnormal pulse exam in predicting a vascular injury noted on CTA was 69.6%, with specificity of 53.2%.  79 above-knee vascular injuries were identified and graded on 57 CTA examinations.  52 of these injuries were immediately addressed operatively.   Results of operative and nonoperative management, stratified by injury grade, are summarized in table 1.  40/41 (98%) high grade (grades 3 and 4) injuries prompted immediate operative intervention.  32 isolated low-grade (grades 1 and 2) above-knee vascular injuries were identified, of which 25 (78%) were managed nonoperatively, with no adverse sequelae (delayed amputations or claudication).
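The sensitivity and specificity quoted for the pulse exam follow the standard 2×2 definitions. The abstract does not report the underlying counts, so the figures below are hypothetical and only illustrate the arithmetic:

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion-matrix counts (not the study's raw data)
sens, spec = sensitivity_specificity(tp=97, fp=140, fn=43, tn=160)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```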

 

Conclusion: CT angiography is an excellent test to assess the vasculature in blunt lower extremity injury.  Grading of injuries, when combined with physical exam, can predict which above-knee vascular injuries can be managed nonoperatively.  Nonoperative management of low-grade injuries appears to be safe, even in the setting of a documented pulse deficit without an acutely threatened limb, with no evidence of adverse sequelae in the 25 injuries managed nonoperatively.  Grade 3b and grade 4 injuries were generally managed operatively, with a higher percentage of acutely threatened limbs requiring immediate vascular intervention.
 

62.07 Innate lymphoid cells in critical illness: is interleukin-33 (IL-33) a potential marker of sepsis?

T. T. Chun1, D. S. Heffernan1, N. Hutchins1, W. G. Cioffi1, C. Chung1, A. Ayala1  1Brown University School Of Medicine,Surgery,Providence, RI, USA

Introduction:  Sepsis remains a major clinical challenge with few effective therapeutic options. The recently described innate lymphoid cells (ILCs) play an important role in sepsis. Specifically, IL-33-mediated group 2 ILCs (ILC2s) produce Th2-associated cytokines and have a protective effect against sepsis. IL-33 is also known to interact with invariant natural killer T (iNKT) cells to release protective cytokines. Previous studies have demonstrated that ILC2s are implicated in lung inflammation, and sepsis leads to an increased IL-33 level in lung tissue homogenates. We hypothesized that IL-33 is similarly released during sepsis into the blood, acting on the gastrointestinal tract and the liver to stimulate ILC2s as well as iNKT cells. The purpose of this study was to evaluate IL-33 as a marker of sepsis in an experimental model and in human subjects.

Methods:  Cecal ligation and puncture (CLP) was performed in both wildtype (WT) and invariant natural killer T cell knockout (iNKT-/-) mice. IL-33 levels were measured in liver tissue homogenates, the serum and the peritoneal fluid using enzyme-linked immunosorbent assays (ELISA). Liver non-parenchymal cells were isolated and stained with antibodies to determine expression of IL-33 receptor (IL-33R). We also obtained blood from septic patients and compared their serum IL-33 levels with those collected from otherwise healthy volunteers.

Results: CLP induced elevation of liver IL-33 in both WT and iNKT-/- mice (p<0.05). Furthermore, following CLP, there was a significantly increased percentage of IL-33R-positive cells in WT mice (2.8 vs 20.9%; p=0.038). Sepsis did not induce an increase in IL-33R expression in iNKT-/- mice. There was a trend toward increased serum IL-33 levels in WT CLP versus sham (43.3 vs 104.6 pg/ml; p=0.15), a finding not observed in iNKT-/- mice. Specifically, iNKT-/- mice compared to WT demonstrated significantly lower serum IL-33 levels following CLP (3.6 vs 104 pg/ml; p=0.006). Within the peritoneal cavity, CLP induced significantly higher IL-33 levels in WT mice (0.90 vs 10.8 pg/ml; p=0.039). In iNKT-/- mice, this alteration of peritoneal IL-33 levels was magnified (3.6 vs 23.4 pg/ml; p=0.49). In humans, patients with sepsis had significantly higher serum IL-33 levels than the normal control group (0.17 vs 1.12 pg/ml; p=0.0021).

Conclusion: Together, the observations that mouse liver, serum, and peritoneal IL-33 levels increase with CLP, as does non-parenchymal cell IL-33R expression, imply that ILC2s are involved in and activated by sepsis. Moreover, this may be mediated by the interaction with iNKT cells. Finally, the fact that such changes are not restricted to mice but are also evident in humans points to novel therapeutic/pathological targets.