66.01 National incidence, mortality, and cost of CABG in renal transplant recipients in the US

J. E. Tooley1, D. D. Bohl1, S. Kulkarni1, M. I. Rodriguez-Davalos1, S. Emre1, D. C. Mulligan1, P. S. Yoo1  1Yale University School Of Medicine, Transplant Surgery, New Haven, CT, USA

Introduction: We investigated the national trend in the prevalence of kidney transplant recipients among patients undergoing coronary artery bypass grafting (CABG) and determined whether these patients have worse outcomes than the national average.

Methods: This was a retrospective cohort study using the 2004–2011 Nationwide Inpatient Sample (NIS), the largest inpatient database in the United States, which documents approximately 8 million hospitalizations annually. All patients who underwent CABG during their hospitalization were identified by ICD-9 procedure code. We then identified patients with a history of kidney transplant by ICD-9 diagnosis codes. Baseline differences in age, sex, and comorbidities between CABG patients with a history of kidney transplant and all other patients receiving CABG were identified. The primary outcome measure was in-hospital all-cause mortality. Secondary outcomes included length of stay and total hospitalization charge. Comparisons were made using bivariate analysis and multivariate analysis correcting for identified baseline differences.
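The cohort-definition step described above can be sketched in a few lines. The record layout and field names below are invented for illustration (not the actual NIS file format), and the ICD-9 code lists shown are the commonly used codes for these entities (procedure codes 36.1x for aortocoronary bypass, diagnosis code V42.0 for kidney transplant status) rather than a claim about the authors' exact definitions:

```python
# Illustrative NIS-style discharge records; the field names are invented,
# not the actual NIS file layout.
discharges = [
    {"key": 1, "procs": ["36.12"], "dx": ["V42.0", "250.00"]},  # CABG + kidney tx history
    {"key": 2, "procs": ["36.15"], "dx": ["414.01"]},           # CABG only
    {"key": 3, "procs": ["47.09"], "dx": ["540.9"]},            # neither
]

def had_cabg(rec):
    # ICD-9 procedure codes 36.10-36.19 cover aortocoronary bypass
    return any(code.startswith("36.1") for code in rec["procs"])

cabg = [r for r in discharges if had_cabg(r)]
# V42.0 is the ICD-9 diagnosis code for kidney transplant status
cabg_with_ktx = [r for r in cabg if "V42.0" in r["dx"]]
```

The same two-step filter (procedure code first, then transplant-status diagnosis code) yields the exposed and comparison groups compared in the Results.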

Results: The percentage of CABG patients with kidney transplants increased from 0.10% in 2004 to 0.20% in 2011 (p=0.002). Kidney transplant patients receiving CABG were younger (59.1±9.9 vs. 66.0±10.9 years, p=0.002) and more commonly female (33.8% vs. 28.4%, p=0.002) than other CABG patients. Charlson comorbidity index scores did not differ between groups. In-hospital all-cause mortality was higher among kidney transplant patients receiving CABG than in the remainder of the CABG cohort (4.83% vs. 2.79%, p=0.001). On average, kidney transplant patients stayed in the hospital an extra 1.74 days (95% confidence interval (CI) 1.12–2.36, p<0.001) and incurred $22,829 more in total hospitalization charges (95% CI $14,946–$30,711).

Conclusion: The percentage of CABG patients with kidney transplants doubled between 2004 and 2011. These patients have higher mortality, longer lengths of stay, and higher hospitalization charges than other patients receiving CABG. These findings warrant further investigation into the causes of disparities in CABG outcomes in kidney transplant patients.


66.04 Successful Post-operative Nutritional Management in Lung Transplant Patients with Systemic Sclerosis

A. D’Angelo1, D. Odell1, N. Shigemura1, C. Bermudez1, T. Richards1, M. Crespo1, J. Pilewski1, J. Luketich1, J. D’Cunha1  1University Of Pittsburgh, Pittsburgh, PA, USA

Introduction: End-stage lung disease is the leading cause of death in people with systemic sclerosis.  Lung transplantation (LTx) is often the only viable therapy available when drug treatment fails to suppress progression of pulmonary disease.  Patients are often declined for LTx because of the morbidity associated with esophageal dysmotility.  Given the lack of viable options for their end-stage lung disease, we have considered these patients for LTx and followed a unique post-operative nutritional management algorithm tailored to their esophageal disease.  The aim of this study was to review our approach and the associated outcomes.

Methods: Between 2008 and June 2013, our center transplanted 48 patients with systemic sclerosis. An IRB-approved retrospective analysis was performed to examine the nutritional management and subsequent complications in these patients.  Patients were categorized based on route of enteral access (percutaneous endoscopic gastrojejunostomy (PEG-J) or nasojejunal (NJ) tube).  The incidence, timing, and types of complications were analyzed. Student's t-test was used to compare groups (p<0.05 was significant).

Results: Of our 48 patients with systemic sclerosis receiving LTx, 47 received bilateral grafts.  Thirty-three received PEG-J tube nutritional support, while the remaining 15 received NJ tube feeding alone.  In patients receiving PEG-J feeding, an average of 16.75 days (median 12 days) elapsed between transplant and PEG-J tube placement, during which time nutrition and medications were provided by NJ tube only. There was one complication (localized infection) in the PEG-J group.  There were no major complications in the NJ group. An average of 68 days elapsed from transplantation to first oral intake in patients receiving a PEG-J tube, compared to 35.5 days for patients with an NJ tube (p = 0.001).  Average hospital length of stay was 40 days for patients with a PEG-J feeding tube and 42.6 days for patients with NJ tube feeding (p = 0.75).  Overall survival was 85% at one year, with no difference between the groups (NJ 86%, PEG-J 84%).

Conclusion: End-stage lung disease secondary to systemic sclerosis is a highly morbid condition for which LTx is a life-saving intervention.  Although many centers will not consider these patients for LTx, we have developed a low morbidity nutritional management algorithm with acceptable short-term outcomes.


66.06 Increased Risk of Vascular Thrombosis in Pediatric Liver Transplant Recipients with Thrombophilia

D. J. Cha1, E. J. Alfrey3,4, D. M. Desai1,2, C. S. Hwang1,2  1University Of Texas Southwestern Medical Center, Division Of Surgical Transplantation/Department Of Surgery, Dallas, TX, USA 2Children’s Medical Center, Division Of Pediatric Transplantation, Dallas, TX, USA 3Marin General Hospital, Department Of General Surgery, Larkspur, CA, USA 4Prima Medical Group, Larkspur, CA, USA

Introduction:

Pediatric patients who undergo liver transplantation are at higher risk of developing vascular complications than adult liver transplant recipients. Hepatic artery thrombosis (HAT) and portal vein thrombosis (PVT) can cause significant morbidity and mortality. We examined pediatric liver transplant recipients who developed vascular thrombosis, either HAT or PVT, for the presence of thrombophilia.

Methods:

We examined outcomes in all pediatric patients who underwent liver transplantation between January 2010 and July 2014. Recipient and donor demographic and outcome data were examined, including age, race, sex, weight, blood type, cold storage time (CST, time from donor aortic cross-clamp to out of ice) in minutes, anastomotic time in minutes, estimated blood loss (EBL) at transplant, intraoperative transfusion requirement, presence and number of rejections, graft survival, mode of arterial anastomosis (artery to artery vs. conduit), HAT, PVT, and presence of thrombophilia. Continuous variables were compared using the unpaired Student's t-test and nominal variables using either the chi-square or Fisher's exact test. A p-value of <0.05 was considered significant.

Results:

Forty-six pediatric patients underwent liver transplantation. Mean recipient age at transplantation was 62.7±59.5 months, mean weight was 20.2±15.5 kg, mean CST was 432±98 minutes, mean anastomotic time was 57±17 minutes, mean EBL was 399±501 mL, mean intraoperative transfusion requirement was 360±371 mL, mean donor age was 89.5±115 months, and mean donor weight was 26.3±20.7 kg. Twenty-one recipients were found to have thrombophilia, including 5 with HAT and 2 with PVT.

Recipients with thrombophilia had a significantly higher incidence of vascular thrombosis than those without (7/21 vs. 0/25, P = 0.0017). Five of 42 recipients with artery-to-artery reconstruction developed HAT, versus 0 of 4 with a conduit. Recipients who developed any thrombosis weighed significantly less than those who did not (9.0±1.6 kg vs. 22.2±16.0 kg, P = 0.0366) (Table).
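The reported p-value for the 7/21 vs. 0/25 comparison is consistent with an uncorrected chi-square test on the 2×2 table, one of the two tests named in the Methods. A minimal stdlib sketch:

```python
from math import erfc, sqrt

def chi_square_2x2(a, b, c, d):
    """Uncorrected chi-square statistic and p-value for the 2x2 table
    [[a, b], [c, d]]; the p-value is the 1-df upper tail,
    P(X > chi2) = erfc(sqrt(chi2 / 2))."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, erfc(sqrt(chi2 / 2))

# thrombosis in 7/21 recipients with thrombophilia vs. 0/25 without
chi2, p = chi_square_2x2(7, 14, 0, 25)  # chi2 ~ 9.83, p ~ 0.0017
```

With a zero cell and small expected counts, Fisher's exact test is often preferred; for this table it gives a slightly larger two-sided p of about 0.002, leaving the conclusion unchanged.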

Conclusion:

All pediatric liver transplant recipients who developed any vascular thrombosis were also found to have thrombophilia. Recipients who were smaller in size were at significantly higher risk of developing vascular thrombosis. Lower weight recipients with thrombophilia may benefit from arterial reconstruction with a conduit to decrease the risk of vascular thrombosis.

66.07 A Multidisciplinary Team Approach to End Stage Dialysis Access Patients

C. Kensinger1, M. Nichols2, P. Bream2, D. Moore1  1Vanderbilt University Medical Center, Department Of General Surgery, Nashville, TN, USA 2Vanderbilt University Medical Center, Department Of Radiology And Radiological Sciences, Division Of Interventional And Vascular Radiology, Nashville, TN, USA

Introduction:  The Hemodialysis Reliable Outflow (HeRO) dialysis access device is a permanent dialysis graft coupled to a central venous outflow component, used in patients with end-stage dialysis access (ESDA) issues. Placement involves creating an arterial anastomosis, with continuous venous outflow delivered directly to the right atrium, bypassing a stenosed or occluded central venous system.  Compared to tunneled venous dialysis catheters, the HeRO graft has been shown to have superior patency rates, fewer infectious complications, lower associated costs, and superior dialysis flow rates.  Given the potential for morbidity in ESDA patients secondary to medical comorbidities and multiple previous dialysis accesses, a multidisciplinary approach has been employed to maximize operative success in this complex patient population.

Methods:  The multidisciplinary team consists of a nephrologist, an interventional radiologist, and a surgeon. Patients with suspected venous outflow stenosis or obstruction are referred to the ESDA clinic by the nephrologist. A computed tomography angiogram/venogram is obtained and reviewed for anatomic considerations by the interventional radiologist. Based on the imaging and physical exam, laterality and a plan for access to the right atrium are jointly determined pre-operatively. In addition, an echocardiogram is obtained to ensure adequate right heart function given the increased venous return following HeRO placement.  On the day of the procedure, access to the right atrium is first achieved in the interventional radiology (IR) suite; using the resources available there maximizes the likelihood of successful graft placement. In the OR, the placeholder catheter initially placed in the IR suite is exchanged for a 19F venous outflow component. Once the venous outflow component is successfully placed, the brachial artery is exposed and the arterial inflow anastomosis to the HeRO graft is performed. The patient is discharged the day of surgery and returns to clinic for evaluation prior to HeRO cannulation.

Results: Over the past 4 years, a multidisciplinary approach to HeRO placement has been used in 31 ESDA patients.  Fifty-eight percent (18/31) of these patients required advanced maneuvers in the IR suite to obtain central venous access: angioplasty (5), recanalization (5), or chest wall collateral access (8) was required to reach the right atrium. Access to the right atrium was achieved in 100% (31/31) of cases. Two cases were aborted in the OR because the placeholder catheter could not be exchanged for the 19F venous component. No intra-operative complications were encountered.

Conclusion: In this difficult patient population, a multidisciplinary team can maximize operative placement of HeRO grafts in patients with complex central venous outflow obstruction, leading in turn to higher success rates and decreased cost.


66.08 Surgical Outcomes in Pediatric Liver Transplantation

M. I. Rodriguez-Davalos1, A. Munoz-Abraham1, S. Torres-Landa1, J. E. Tooley1, S. Alburquerque1, P. S. Yoo1, U. D. Ekong1, S. H. Emre1  1Yale University School Of Medicine, Surgery/Transplantation, New Haven, CT, USA

Introduction:  Liver transplantation (LT) is currently the standard of care for end-stage liver disease, some metabolic/genetic liver diseases, and acute liver failure in children. Current outcomes in pediatric liver transplantation (PLTx) are a prime example of multidisciplinary teams and a patient-centered culture working safely and efficiently to provide the best care for these children. Morbidity and graft survival in PLTx have progressively improved, mainly because of advances in surgical techniques, immunosuppressive therapies, and infectious monitoring and treatment.

Methods:  After obtaining IRB approval, we reviewed all PLTx performed between Sep 2007 and Aug 2014; all patients 0-18 years of age who received liver transplants were included in the study. Data on demographics, etiology, type of graft, vascular and biliary complications, return to the operating room (OR), and patient and graft survival were analyzed. Descriptive statistics were used for most of the data, and patient and graft survival were calculated.

Results: 67 pediatric patients received 69 LT during the study period (39 male). Nineteen patients (28.4%) were transplanted with whole liver grafts and 48 (71.6%) with technical-variant segmental grafts; 28 patients (41.8%) underwent living donor liver transplantation (LDLT). The mean age was 6.10 years, and 29 patients (43.3%) were under 2 years of age. The most common indication was biliary atresia (34.3%), followed by metabolic/genetic liver diseases (23.9%).

The following vascular complications were observed: 2 cases of hepatic artery stenosis (2.8%), both proximal to the common hepatic artery, treated with balloon dilatation, and 2 cases of hepatic artery thrombosis (HAT) that resulted in graft loss and the need for re-transplantation. One child developed portal vein stenosis resulting in portal hypertension, treated with a spleno-renal shunt; there were no cases of portal vein thrombosis.

Thirteen children (18.8%) developed biliary complications. Other surgical complications requiring return to the OR within 30 days included one case of thermal injury to the bowel resulting in late perforation and peritonitis, postoperative hemorrhage in 6 patients (8.6%), and one abdominal compartment syndrome in an LDLT. The only complication with a significant impact on graft survival was HAT. Overall one-year patient and graft survival were 100% and 97.1%, respectively.

Conclusions: Over the years, outcomes in pediatric liver transplantation have significantly improved. In the current era of PLTx, we believe that outcomes reflect not only a high degree of experience and surgical expertise but also the work of a multidisciplinary team collaborating for the patient. As expected, the most common complication was biliary stricture; however, this did not affect overall outcomes. Vascular complications, especially hepatic artery thrombosis, continue to be the most important factor in early graft survival.


66.09 Superior Outcomes of Chinese Americans After Kidney Transplantation

F. Karipineni1, A. Parsikia1, M. Shaikh1, J. Ortiz1, A. Joshi1  1Albert Einstein Medical Center, Department Of Surgery, Philadelphia, PA, USA

Introduction:  Asians represent the fastest growing ethnic group in the United States. Despite significant diversity within the group, many transplant studies treat Asians as a homogeneous entity. We compared patient and graft survival among major Asian ethnicities to determine whether any subgroup has superior outcomes.

Methods: We conducted a retrospective analysis of kidney transplants on Asian patients between 2001 and 2012. Covariates included gender, age, comorbidities, and donor category. Primary outcomes included one-year patient and graft survival. Secondary outcomes included delayed graft function and cause of rejection and death.

Results: 91 Asian patients were identified. Because of the large proportion of Chinese patients (n=37), we grouped the other Asians into one entity (n=54) for statistical comparison among Chinese, other Asians, and Whites (n=346). Chinese subjects had a significantly lower body mass index (BMI) (p=0.001) and the lowest proportion of living donors (p<0.001). Patient survival was highest in the Chinese cohort (p<0.001), while graft survival did not differ.

Conclusion: Our study confirms outcome differences among Asian subgroups in kidney transplantation. Chinese patients demonstrated better one-year patient survival than Whites and non-Chinese Asians despite fewer living donors; lower BMI may partly explain this. Larger, long-term studies are needed to elucidate outcome disparities among Asian subgroups.


66.10 Tobacco Abuse Does Not Increase Risk of Wound Infection or Hernia after Liver Transplantation

V. A. Fleetwood2, J. Zimmermann2, J. Poirier2, M. Hertl1, E. Y. Chan1  1Rush University Medical Center, Department Of General Surgery, Division Of Transplantation Surgery, Chicago, IL, USA 2Rush University Medical Center, Department Of General Surgery, Chicago, IL, USA

Introduction:
Tobacco abuse is prevalent in the United States, with an estimated 18% of adults reporting smoking in 2012. The smoking epidemic has not spared those with end-stage liver disease: among patients presenting for liver transplantation (OLT) evaluation, smoking rates have been reported as high as 75%, especially in those with alcoholic liver disease. Only an estimated 20% of transplant centers have a tobacco use policy. Multiple studies have demonstrated higher morbidity and mortality in transplant patients who smoke, with an established higher risk of vascular and biliary graft complications, malignancy, and recidivism. Despite this literature, there remains a paucity of evidence on the incidence of wound complications after OLT in smokers. The aim of our study was to elucidate the association between tobacco abuse and both wound infections and hernias after OLT.

Methods:
We performed a retrospective single-center review of 141 patients who underwent OLT between January 1, 2004 and December 31, 2013. We restricted our study to patients who received MELD exception points in order to capture a less systemically ill pre-transplantation cohort. Endpoints were wound infection and incisional hernia. Patients were classified as smokers (56/141, 40%) or nonsmokers (85/141, 60%) based on initial assessment. Analysis was performed using Fisher’s exact test, and odds ratios (OR) and confidence intervals (CI) were calculated with R statistical software.

Results:
Neither the incidence of incisional hernia nor that of wound infection was significantly higher in smokers. The odds ratio for herniation in smokers was 0.71 (CI 0.23-2.20). Wound infections appeared similarly unrelated to smoking status (OR 1.49, CI 0.20-10.90). Although our review involved a small sample size, it remained adequate to detect a difference between smokers and nonsmokers of 2 hernias per 100 patients (1.95%) or more (power = 0.80, alpha = 0.05).
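The odds ratios above can be computed from a 2×2 table with a Woolf (log-normal) confidence interval, one common approach; the cell counts below are illustrative only, since the abstract does not report the raw cells:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] with a Woolf
    (log-normal) 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# hypothetical counts, not the study's: hernia in 6/56 smokers vs. 12/85 nonsmokers
or_, lo, hi = odds_ratio_ci(6, 50, 12, 73)
```

A confidence interval that spans 1 (lo < 1 < hi) corresponds to a non-significant association, as in both comparisons reported above.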

Conclusion:
Our study revealed no significant difference in rates of wound infection or hernia development in patients smoking at the time of transplantation. According to the data presented, it is questionable whether there is sufficient evidence to deny liver transplantation on the basis of tobacco use. However, although we found no correlation between smoking and the short-term complications of wound infection and hernia, the long-term complications, including malignancy, lung disease, heart disease, and graft complications, are well described in the literature. We recommend that each transplant center evaluate its policy on tobacco use.

65.06 CHA2DS2-VASc Score is a Highly-Sensitive Predictor of Postoperative Atrial Fibrillation

R. Kashani1, S. Sareh1, K. Yefsky1, C. Hershey1, C. Rezentes1, N. Satou1, B. Genovese1, R. Shemin1, P. Benharash1  1David Geffen School Of Medicine, University Of California At Los Angeles, Division Of Cardiothoracic Surgery, Los Angeles, CA, USA

Introduction: Atrial fibrillation after cardiac surgery (POAF) is a common complication and is associated with increased morbidity and costs of care. Identification of patients at high risk for developing POAF is critical to targeted prophylaxis strategies. While various clinical, operative, and anatomic risk factors have been used to estimate the risk for POAF, a practical scoring system to identify such patients is lacking. Since the CHA2DS2-VASc scoring system is widely utilized to predict stroke risk in patients with atrial fibrillation (AF), we hypothesized that it may also serve as a strong predictor of POAF.  The present study evaluated the utility of the CHA2DS2-VASc scoring system in predicting de-novo POAF after cardiac operations.


Methods: A total of 3836 patients who underwent cardiac surgery from 2008 to 2014 at our institution were identified for analysis. Exclusion criteria included a previous history of atrial fibrillation/flutter, operations or medications for arrhythmia, transplants, or the use of extracorporeal membrane oxygenation or ventricular assist devices. For the 2385 patients who met the inclusion criteria, a CHA2DS2-VASc score (0-9) was calculated using the institutional Adult Cardiac Surgery Database. The primary outcome measure was POAF lasting longer than 30 seconds within 30 days of the original operation. Based on CHA2DS2-VASc scores, patients were grouped into low (0), moderate (1), and high (≥ 2) risk groups. A multivariate regression model was developed using Stata 12.1 (StataCorp, College Station, TX) to adjust for other risk factors for POAF, including gender, valvular operations, smoking, dyslipidemia, renal failure, use of beta blockers and statins, anemia, and both mitral and aortic insufficiency.
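The CHA2DS2-VASc score itself is a simple additive rule: 1 point each for congestive heart failure, hypertension, diabetes, vascular disease, female sex, and age 65-74, and 2 points each for age ≥75 and prior stroke/TIA/thromboembolism. A sketch of the score and the study's risk grouping:

```python
def cha2ds2_vasc(age, female, chf, htn, diabetes, stroke_tia, vascular_disease):
    """CHA2DS2-VASc: 1 point each for CHF, hypertension, diabetes,
    vascular disease, female sex, and age 65-74; 2 points each for
    age >= 75 and prior stroke/TIA/thromboembolism (maximum 9)."""
    score = 2 if age >= 75 else 1 if age >= 65 else 0
    score += 2 if stroke_tia else 0
    score += sum(map(bool, (chf, htn, diabetes, vascular_disease, female)))
    return score

def poaf_risk_group(score):
    # grouping used in the study: low (0), moderate (1), high (>= 2)
    return "low" if score == 0 else "moderate" if score == 1 else "high"
```

For example, a 70-year-old hypertensive woman scores 3 (age 65-74, hypertension, female sex) and falls in the high-risk group.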


Results: Of the 2385 (66% male) patients included for analysis, 380 (16%) patients developed POAF.  CHA2DS2-VASc scores for patients with and without AF were 3.4 ± 1.8 and 2.6 ± 1.8 (p<0.001), respectively.  The majority of patients (74%) were categorized as high risk while another 12% comprised the moderate risk group. When compared to the low risk group, the odds of developing POAF increased two-fold and five-fold in the moderate and high-risk groups, respectively (p<0.001). As the CHA2DS2-VASc score increased from 0-9, the predicted risk of POAF increased from 9% to 42% (p<0.001).  Using a cutoff score of 1, sensitivity and specificity for detecting POAF were 88% and 30%, respectively.


Conclusion: The commonly used CHA2DS2-VASc score is a highly-sensitive predictor of POAF.  This scoring system should be utilized to identify at-risk patients who may benefit from prophylaxis strategies against POAF.

65.08 Atrial Fibrillation in Patients Undergoing Esophagectomy: Surrogate for Anastomotic Dehiscence?

M. Thau2, S. Hoffe2, R. Shridhar2, K. Almhanna3, A. Salem1, A. Abbott3, M. Doepker3, K. Meredith1  1University Of Wisconsin, Surgical Oncology, Madison, WI, USA 2Moffitt Cancer Center And Research Institute, Radiation Oncology, Tampa, FL, USA 3Moffitt Cancer Center And Research Institute, Gastrointestinal Oncology, Tampa, FL, USA

Introduction:
Neoadjuvant chemoradiation (NT) has become the standard for patients with locally advanced esophageal cancer.  Thirty percent of patients who undergo esophagectomy will develop atrial fibrillation (AF).  NT may contribute to the development of AF, and AF may be a surrogate for anastomotic dehiscence (AD).

Methods:
We queried a prospective esophageal database to identify patients who underwent esophagectomy with or without NT.  Demographics and post-operative complications were compared with Fisher's exact test and considered significant at p<0.05.

Results:
We identified 811 patients who underwent esophagectomy, with a mean age of 68±12 years.  Five hundred fifteen (63.5%) were treated with NT and 296 (36.5%) were not.  Eighty-nine patients (11%) developed AF: 59 (11.5%) in the NT group and 30 (10.1%) in the non-NT group.  There was no significant difference in the incidence of AF between those treated with NT and those who were not (p=0.64). A total of 54 patients (6.7%) were identified as having AD: 27 (5.2%) in the NT cohort and 27 (9.1%) in the non-NT cohort, a lower incidence in the NT group (p=0.04). Of the 54 patients who experienced AD, 6 (11%) had concomitant AF and 48 (89%) did not; the remaining 83 patients who developed AF did not develop AD (p=1).

Conclusion:
NT prior to esophagectomy does not increase patients' risk of developing postoperative AF.  Moreover, the presence of AF in the post-esophagectomy patient did not serve as a surrogate for identifying AD.

65.09 Periodontal Disease Does Not Correlate with Worse Outcomes after Esophagectomy

W. B. Weir1, K. M. Thompson1, C. Garaicoa-Pazmino2, C. Tsai2, J. Lin1, P. Carrott1, W. Lynch1, M. Orringer1, A. Chang1, J. Fenno2, Y. Kapilla2, R. M. Reddy1  1University Of Michigan Health System, Department Of Surgery, Section Of Thoracic Surgery, Ann Arbor, MI, USA 2University Of Michigan School Of Dentistry, Division Of Periodontics, Department Of Periodontics And Oral Medicine, Ann Arbor, MI, USA

Introduction:  Postoperative anastomotic leaks after esophagectomy for esophageal cancer remain a significant source of morbidity and mortality. Periodontal disease has been associated with post-operative infections in esophagectomy and brain surgery patients. We hypothesized that preoperative periodontal disease may result in a higher incidence of postoperative anastomotic leak and morbidity after esophagectomy.

Methods:  A prospective study of esophagectomy patients was performed from May 2013 to August 2014, beginning with a periodontal health survey administered to the patients prior to surgery. Risk factors for periodontal disease included prior tooth extraction, gum treatment, tooth replacement, bleeding gums, teeth cleaning greater than one year prior to survey enrollment, teeth brushing less than twice per day, flossing less than daily, and diabetes. Patients were risk-stratified into low risk (1-2 risk factors) and high risk (≥ 3 risk factors) for periodontal disease. Primary outcomes analyzed were elevated pre- and postoperative white blood cell count (WBC), postoperative anastomotic leak, length of stay, and readmission after discharge. The data were analyzed using the chi-square test and Student's t-test.
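The risk-stratification rule above (low = 1-2 factors, high = ≥3) can be sketched directly; the survey field names below are invented for illustration, not taken from the study's instrument:

```python
# Eight surveyed risk factors for periodontal disease; field names are
# illustrative only.
RISK_FACTORS = (
    "prior_extraction", "gum_treatment", "tooth_replacement", "bleeding_gums",
    "last_cleaning_over_1y", "brushing_under_2x_daily", "flossing_under_daily",
    "diabetes",
)

def periodontal_risk_group(survey):
    """Count positive risk factors and stratify: 1-2 low risk, >= 3 high risk."""
    n = sum(bool(survey.get(f)) for f in RISK_FACTORS)
    return "high" if n >= 3 else "low"
```

Since every enrolled patient had at least one risk factor, the low-risk branch here corresponds to the study's 1-2-factor group.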

Results: 66 patients were enrolled; 55 completed the survey and underwent successful esophagectomy. Among those, 80% (44) had a diagnosis of adenocarcinoma, 13% (7) had squamous cell carcinoma, and 7% (4) had benign disease. The 30-day mortality was 2% (1).

All patients had at least 1 risk factor for periodontal disease. Approximately 27% (15) of patients were considered low risk for periodontal disease and 73% (40) were high risk. Periodontal risk did not correlate with the mean difference between pre- and postoperative WBC (low-risk group 7.77, high-risk group 6.84, p = 0.56), nor was it a risk factor for the development of anastomotic leak (p = 0.63). The mean length of stay was longer for the low-risk group than for the high-risk group (13.5 days vs. 9.75 days, p < 0.001). There was no difference in readmission rates between the groups (p = 0.2).

Conclusion: Risk factors for periodontal disease were present in all of our esophagectomy patients, but higher numbers of risk factors did not correlate with worse outcomes. Because all patients had some risk of periodontal disease, there may be an association between periodontal disease and the development of esophageal cancer.

65.10 The Effect of Postoperative Atrial Fibrillation on Hospital Course in RATS Lobectomy

E. P. Ng1, F. O. Velez-Cubian1,2, M. Echavarria1, C. Moodie2, J. Garrett2, J. Fontaine2, L. Robinson2, E. Toloza1,2  1University Of South Florida College Of Medicine, Tampa, FL, USA 2Moffitt Cancer Center And Research Institute, Tampa, FL, USA

Introduction:

In this study, we sought to investigate the effect of post-operative atrial fibrillation after robotic-assisted video-thoracoscopic (RATS) pulmonary lobectomy on other postoperative complications, chest tube duration, and hospital length of stay.

Methods:

We retrospectively analyzed prospectively collected data from 208 consecutive patients who underwent robotic-assisted pulmonary lobectomy by one surgeon for known or suspected lung cancer. Postoperatively, 39 (18.75%) of these patients experienced atrial fibrillation (AF) during their hospital stay. Postoperative complications other than AF, chest tube duration, and hospital length of stay were compared between patients with and without postoperative AF. Statistical significance (p<0.05) was determined by the unpaired Student's t-test.
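The unpaired Student's t-test used for these comparisons reduces to a pooled-variance t statistic; a minimal stdlib sketch with toy data, not the study's (the p-value would then come from the t distribution with the returned degrees of freedom):

```python
from math import sqrt
from statistics import mean, variance

def unpaired_t(x, y):
    """Pooled-variance (Student's) two-sample t statistic and degrees of
    freedom for two independent samples."""
    nx, ny = len(x), len(y)
    # pooled sample variance
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / sqrt(sp2 * (1 / nx + 1 / ny)), nx + ny - 2

# toy data: hypothetical lengths of stay in two small groups
t, df = unpaired_t([1, 2, 3], [4, 5, 6])
```

Note that strongly skewed variables such as length of stay are sometimes better compared with a nonparametric test, though the t-test is what the Methods report.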

Results:

Of patients with postoperative AF, 46% also had other concurrent postoperative complications, versus only 31% of patients without AF. The average number of postoperative complications was significantly higher in patients with AF than in those without (0.92 vs. 0.44, p = 0.001). The mean (9.8 days) and median (8 days) hospital lengths of stay in AF patients were significantly longer than in those without AF (mean 6.1 days, median 4 days). Similarly, mean and median chest tube durations in AF patients (9.6 and 6 days, respectively) were significantly longer than in patients without AF (5.6 and 4 days, respectively). Results are summarized in Table 1 with standard deviation (SD) and standard error of the mean (SEM).

Conclusion:

This study demonstrated an association between postoperative atrial fibrillation and a more complicated hospital course. Further studies are needed to evaluate whether confounders contribute to these associations.

65.11 Intra-operative Factors Affect Incidence of Postoperative Atrial Fibrillation in RATS Lobectomy

E. P. Ng1, F. O. Velez-Cubian1, M. Echavarria1, C. Moodie2, J. Garrett2, J. Fontaine2, L. Robinson2, E. Toloza1,2  1University Of South Florida College Of Medicine, Tampa, FL, USA 2Moffitt Cancer Center And Research Institute, Tampa, FL, USA

Introduction:

In this study, we sought to investigate the association between post-operative atrial fibrillation incidence and intra-operative factors, such as estimated blood loss, operative time, and intra-operative complications, in patients who underwent robotic-assisted video-thoracoscopic pulmonary lobectomy.

Methods:

Prospectively collected data from 208 consecutive patients who underwent robotic-assisted pulmonary lobectomy by one surgeon for known or suspected lung cancer were analyzed. Postoperatively, 39 (18.75%) of these patients experienced atrial fibrillation (AF) during their hospital stay. Intra-operative estimated blood loss (EBL), operative time, and intra-operative complications were collected for all patients and analyzed. Statistical significance (p<0.05) was determined by the unpaired Student's t-test.

Results:

Eighteen of 208 patients (8.7%) experienced intra-operative complications. Of these, 5 (27.8%) experienced postoperative AF, a higher incidence than in patients without intra-operative complications (34 of 190, 17.9%). Among patients who experienced postoperative AF, the mean and median EBL were 368 mL and 200 mL, respectively, both higher, though not significantly so (p = 0.08), than in patients without AF (266 mL and 150 mL, respectively). There was no significant difference in operative times between patients with and without postoperative AF (median 225 min vs. 206 min, p = 0.33).

Conclusion:

This study supports a potential link between postoperative atrial fibrillation and intra-operative complications. Though the differences were not statistically significant, patients who experienced postoperative AF had higher EBL and longer operative times.

65.12 Post-Operative Outcomes with Cholecystectomy in Lung Transplant Recipients.

S. Taghavi1, S. Jayarajan1, V. Ambur1, J. Gaughan1, Y. Toyoda1, E. Dauer1, L. Sjoholm1, A. Pathak1, T. Santora1, A. Goldberg1, J. Rappold1  1Temple University School Of Medicine,Department Of Surgery,Philadelphia, PA, USA

Introduction:   Lung transplantation remains the treatment of choice in select patients with end-stage pulmonary disease.  There is a paucity of data on outcomes for lung transplant recipients requiring general surgery procedures.  The goal of this study was to examine outcomes after cholecystectomy in lung transplant patients using a large, national database.

Methods:   The National Inpatient Sample (NIS) Database (2005-2010) was queried for all lung transplant patients requiring laparoscopic cholecystectomy, open cholecystectomy, and tube cholecystostomy.   Weighted frequencies were used to examine peri-operative outcomes.

Results:  There were a total of 387 cholecystectomies or cholecystostomies performed in lung transplant patients during the study period.  The majority were done for acute cholecystitis (n=218, 56.9%) and were done urgently/emergently (n=258, 68.2%).  There were a total of 304 (78.6%) laparoscopic cholecystectomies, 73 (19.1%) open cholecystectomies, and 10 (2.6%) tube cholecystostomies.  Elective admission occurred more often in the laparoscopic cholecystectomy group (n=114, 37.5%) than in the open cholecystectomy (n=15, 20.5%) and cholecystostomy (n=0, 0.0%) groups; p=0.002.  There was no significant difference in age among the laparoscopic cholecystectomy (53.6 years), open cholecystectomy (55.5 years), and cholecystostomy (62.5 years) groups; p=0.39.  In addition, Charlson Comorbidity Index was similar in the laparoscopic cholecystectomy (2.69), open cholecystectomy (3.49), and cholecystostomy (3.47) groups; p=0.52.  Patients undergoing open cholecystectomy were more likely to have perioperative myocardial infarction, pulmonary embolus, or any complication compared to laparoscopic cholecystectomy or tube cholecystostomy (table).  Total hospital charges ($106,329.80 vs. $59,137.00; p=0.03) and median length of stay (8.0 vs. 4.0 days; p=0.02) were significantly greater with open cholecystectomy than with the laparoscopic procedure.  Patients having urgent/emergent surgery were more likely to suffer pulmonary embolus (3.5% vs. 0.0%, p=0.03) or any complication (9.3% vs. 3.9%, p=0.05) than patients having elective surgery.

Conclusion:  Cholecystectomy can be performed in the lung transplant population with minimal morbidity and mortality.  Patients requiring open surgery and emergency procedures appear to have worse outcomes.  Strong consideration should be given to elective, laparoscopic cholecystectomy in lung transplant patients with symptomatic gallstone disease.

 

65.13 Impact of Cardiac Interventions on Graft and Overall Survival In Abdominal Transplant Recipients

E. W. Beal1, S. Bennett1, N. Jaik1, G. Phillips2, S. Black1,4, T. Pesavento3,4, R. Higgins1,4, B. Whitson1,4  1Ohio State University,The Department Of General Surgery,Columbus, OH, USA 2Ohio State University,Center For Biostatistics,Columbus, OH, USA 3Ohio State University,Department Of Internal Medicine,Columbus, OH, USA 4Ohio State University,Comprehensive Transplant Center,Columbus, OH, USA

Introduction: Solid organ transplant recipients have a propensity both for pre-existing cardiovascular disease and for developing it.  End-stage organ dysfunction and immunosuppression may hasten its development.  Cardiac interventions are high risk in this population and can affect graft function.  We sought to evaluate the impact of cardiovascular interventions (CI) on long-term outcomes in abdominal transplant recipients.

Methods: We retrospectively queried a prospectively maintained solid organ transplant database to identify adult recipients undergoing initial transplant (kidney, kidney-pancreas, or liver) over an 11-year period whose continuing care was performed at our quaternary medical center. We stratified cohorts into CI (percutaneous intervention, coronary artery bypass, valve surgery, and complex procedures) and no-CI.  We evaluated graft and overall survival.  Standard Kaplan-Meier survival analysis and Cox proportional hazards modeling were performed.
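The Kaplan-Meier analysis named above can be sketched with a minimal product-limit estimator. This is a generic illustration on invented follow-up data, not the authors' analysis code:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival curve.

    times:  follow-up time for each subject
    events: 1 if the event (e.g., death) was observed, 0 if censored
    Returns a list of (time, survival probability) at each event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    curve, s = [], 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        tied = sum(1 for tt, _ in data if tt == t)
        if deaths:
            s *= 1.0 - deaths / at_risk  # product-limit step at an event time
            curve.append((t, s))
        at_risk -= tied  # events and censored subjects both leave the risk set
        i += tied
    return curve

# Invented follow-up times (years); 0 marks a censored observation
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

Group curves (CI vs. no-CI) would be compared with a log-rank test, and the adjusted hazard ratios reported in the results come from a Cox model fit, which this sketch does not implement.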

Results: During the study period, 714 abdominal organ transplants met study criteria: 140 patients underwent CI and 574 did not.  There were no demographic differences.  Mean time from transplant to CI was 1,360 days. Patients undergoing renal transplant and CI had longer graft survival than those undergoing renal transplant without CI (p=0.013). In late long-term survival (7-14 years), the adjusted Cox proportional hazards model showed a 167% increased risk of death in the CI cohort compared with no-CI patients (p=0.003).

Conclusion: While those patients undergoing CI have a longer graft survival and better short-term survival, their long-term survival is significantly decreased compared to those not undergoing CI.

 

65.14 Comparing Open Gastrostomy to Percutaneous Endoscopic Gastrostomy Tube in Heart Transplant Patients

V. Ambur1, S. Taghavi1, S. Jayarajan1, J. Gaughan1, Y. Toyoda1, E. Dauer1, L. Sjoholm1, A. Pathak1, T. Santora1, J. Rappold1, A. Goldberg1  1Temple University,Department Of Surgery,Philadelpha, PA, USA

Introduction:
Impaired wound healing due to immunosuppression has led some surgeons to preferentially use open gastrostomy tube (OGT) over percutaneous endoscopic gastrostomy (PEG) tube in heart transplant patients.  OGT allows pexy of the stomach to the anterior abdominal wall, which may decrease the risk of intra-abdominal leak compared to PEG, potentially resulting in better outcomes.  We hypothesized that heart transplant patients requiring gastric access would have better outcomes with OGT compared to PEG.

Methods:
The National Inpatient Sample (NIS) database (2005-2010) was queried for all heart transplant patients requiring open gastrostomy or PEG tube.   Weighted frequencies and weighted multivariate logistic regression analysis using clinically relevant variables were used to examine clinical characteristics and mortality.

Results:
There were 498 patients requiring gastrostomy tube, with 424 (85.2%) requiring PEG and 74 (14.8%) requiring OGT.  The two groups did not differ with respect to male gender (76% vs. 78%, p=0.68). The PEG cohort was older (53.5 vs. 28.5 years, p<0.001), more likely Caucasian (73.5% vs. 53.3%, p<0.001), less likely to be Hispanic (3.1% vs. 18.3%, p<0.001), and had a higher Charlson Comorbidity Index (4.1 vs. 2.0, p=0.002). The PEG cohort had a higher incidence of post-operative acute renal failure (31.5% vs. 12.7%, p=0.001) and of any postoperative complication (42.3% vs. 19.1%, p<0.001).  Rates of post-operative pneumonia (3.4% vs. 0%, p=0.1), surgical site infection (4.8% vs. 6.4%, p=0.56), deep vein thrombosis (3.5% vs. 0%, p=0.1), and pulmonary embolism (2.6% vs. 0%, p=0.16) were not significantly different between the two groups.  Median length of stay (20.0 vs. 14.0 days, p<0.005) was longer in the PEG cohort; however, total hospital charges were not different ($224,795 vs. $183,474, p=0.41).  Post-operative mortality was not different between the two groups (13.8% vs. 6.1%, p=0.06).  On multivariate analysis, while both PEG (HR: 7.87, 95% CI: 5.88-10.52, p<0.001) and OGT (HR: 5.87, 95% CI: 2.19-15.75, p<0.001) were independently associated with mortality, PEG had the higher mortality risk.  Other variables associated with increased mortality included increasing age and increasing Charlson Comorbidity Index.  Caucasian race and admission to a teaching hospital were associated with lower mortality (Table 1).
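The 95% confidence intervals reported above follow the standard construction exp(beta ± 1.96·SE) around a log-scale regression coefficient. As a sketch only (the standard error below is back-solved from the published interval, not taken from the study data), the PEG estimate can be recovered:

```python
import math

def ratio_ci(beta, se, z=1.96):
    """Point estimate and 95% CI for exp(beta), the ratio (odds or hazard)
    corresponding to a log-scale regression coefficient beta with
    standard error se."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Back-solve the SE from the published PEG interval (HR 7.87, 95% CI 5.88-10.52)
beta = math.log(7.87)
se = (math.log(10.52) - math.log(5.88)) / (2 * 1.96)
hr, lo, hi = ratio_ci(beta, se)
```

The same exponentiation applies to any coefficient from a Cox or logistic model; only the interpretation (hazard vs. odds ratio) differs.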

Conclusion:
In heart transplant patients, PEG appears to result in worse morbidity and mortality compared to OGT.  OGT should be the preferred method of gastric access in heart transplant patients.
 

64.05 Use of Gastric Ultrasound to Evaluate for Distension in the Post Operative Patient; A Pilot Study

W. Boyan1, M. Jaronczyk1, M. Goldfarb1  1Monmouth Medical,Long Branch, NJ, USA

Introduction: Nausea and vomiting are common complaints of the post operative patient. Many factors can lead to nausea, ranging from distension of the stomach to medications. At our institution, over a five year period, 12 of 13 patients who aspirated subsequently died. Nasogastric tubes (NGTs) are frequently used after a patient has an episode of emesis, and sometimes as a prophylactic measure to decompress the stomach when ileus is suspected. Unfortunately, no diagnostic means exists to determine whether nausea is due to gastric distension and ileus or is medication related.  The aim of this pilot study was to create a novel diagnostic measure, using bedside gastric ultrasound, to differentiate two distinct causes of post operative nausea: gastric distension, for which an NGT could be placed, and medication- or pain-related nausea without gastric distension, for which NGT decompression has no role.

Methods: Our work was a prospective study of patients who underwent elective colorectal resections by two board-certified colorectal surgeons over a three month period. Each patient’s stomach was examined with ultrasound on the day of operation, after completion of bowel prep and being NPO after midnight.  The measurement was taken in cm at the largest anterior-posterior diameter. The ultrasound was then repeated every morning, and patients were asked about nausea and vomiting. Any reports of emesis were documented. NGTs were placed based on clinical judgment, and the ultrasound measurements were recorded only for purposes of the study.

Results: Twenty patients underwent elective colorectal surgery after bowel prep. The average size of the stomach at its largest anterior-posterior diameter measured 4.31 cm. Nausea was reported in five patients; three of these patients also reported vomiting, and one of these had an NGT placed. In only one patient did the nausea or vomiting correlate with increased gastric measurements, of 5.1 and 8.0 cm; this was the patient who needed an NGT. This patient’s average gastric measurement was 4.48 cm over their six day hospital stay. On the day the NGT was placed, the measurement was 8 cm, greater than two standard deviations above the average; 1,500 cc of bilious material was drained initially. The other four patients with complaints had measurements below the average for asymptomatic patients.
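The "more than two standard deviations above the average" criterion amounts to a simple threshold check against a patient's prior readings. The daily diameters below are invented for illustration; only the 8 cm outlier reading is taken from the abstract:

```python
import math

def exceeds_k_sd(history, new_value, k=2.0):
    """True if new_value lies more than k sample standard deviations
    above the mean of a patient's prior measurements."""
    n = len(history)
    mean = sum(history) / n
    # Sample standard deviation (n - 1 denominator)
    sd = math.sqrt(sum((x - mean) ** 2 for x in history) / (n - 1))
    return new_value > mean + k * sd

# Invented daily anterior-posterior diameter readings (cm) for one patient
baseline = [4.0, 4.5, 4.2, 4.6, 4.1]
flagged = exceeds_k_sd(baseline, 8.0)  # the abstract's outlier reading
```

Whether the threshold is applied per patient (as here) or against the cohort average would need to be fixed in a larger validation study.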

Conclusion: The goal of this pilot study was to develop a unique means of accurately measuring gastric distension to identify patients at risk of vomiting and aspiration. Although user dependent, ultrasound for gastric measurement can provide a novel means to differentiate nausea related to pain/medication from that caused by gastric distension. This differentiation will prove valuable when it leads to use of an NGT to decompress the stomach, prevent vomiting, and avert the disastrous potential complication of aspiration.

64.06 Laparoscopic Gastropexy for Large Paraesophageal Hiatal Hernia

A. D. Newton1, G. Savulionyte1, K. R. Dumon1, D. T. Dempsey1  1Hospital Of The University Of Pennsylvania,Surgery,Philadelphia, PA, USA

Introduction: We hypothesize that laparoscopic gastropexy is a good alternative to formal paraesophageal hernia repair in frail elderly patients with giant hernias and mechanical symptoms. There is a paucity of published data evaluating this.

Methods: We compared all 18 laparoscopic gastropexies done between August 1, 2011 and December 31, 2013 with 18 age- and sex-matched formal laparoscopic repairs (sac removal, closure of diaphragmatic defect, fundoplication +/- gastropexy) done over the same period by a single surgeon. Postoperative clinical outcomes, radiographic persistence or recurrence of hiatal hernia (UGI), quality of life (GIQOL index), and patient satisfaction were evaluated and compared.

Results: There was no significant difference in age between groups (gastropexy mean age: 79+/-10 years; formal repair mean age: 73+/-7 years). There were 14 females and 4 males in each group. All operations were completed laparoscopically. There were no hospital mortalities and one serious hospital complication (pneumonia requiring mechanical ventilation). (See table for other clinical outcomes and quality of life results.)

Results of radiographic evaluation of postoperative percentage of herniated stomach were as follows: (gastropexy: no hernia – 21%, 1-9% herniated – 0%, 10-19% herniated – 7%, 20-29% herniated – 50%, >29% herniated – 21%; formal repair: no hernia – 56%, 1-9% herniated – 31%, 10-19% herniated – 13%, 20-29% herniated – 0%, > 29% herniated – 0%, p < 0.01).

Conclusion: Laparoscopic gastropexy is a safe and useful operation in elderly patients with large paraesophageal hiatal hernias. It is well tolerated and patient satisfaction is high. Residual or recurrent hiatal hernia is very common after gastropexy, and also common after formal repair, but does not correlate with postoperative patient satisfaction and symptoms in either group.

 

64.07 What Does the Excised Stomach from Sleeve Gastrectomy Tell Us?

M. Lauti1, J. M. Thomas1, J. J. Morrow1, H. Rahman1, A. D. MacCormick1  1Middlemore Hospital, University Of Auckland,Auckland, Auckland, New Zealand

Introduction:

Upper endoscopy prior to bariatric surgery is the recommended standard. This may not be applicable to asymptomatic patients undergoing laparoscopic sleeve gastrectomy as the stomach is excised and the duodenum remains accessible. We hypothesise that routine pre-operative upper endoscopy is unnecessary in the asymptomatic bariatric patient undergoing sleeve gastrectomy. We also describe the histologic specimens in our series of sleeve gastrectomy patients and explore whether histologic diagnosis is associated with post-operative leak and/or bleed.

Methods:

Consecutive patients undergoing laparoscopic sleeve gastrectomy from March 2007 to May 2014 were included in the study. All final histologic reports were coded and examined for association with post-operative leak and/or bleed. Associations were tested using the chi-squared test.
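For a 2x2 table (e.g., abnormal vs. normal histology against leak/bleed vs. no complication), the Pearson chi-squared statistic named above reduces to a closed form. The counts below are invented; the abstract does not publish the cross-tabulated cell counts:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Closed form: n * (ad - bc)^2 / (row and column marginal products)
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented counts: rows = abnormal / normal histology,
# columns = leak or bleed / no complication
stat = chi2_2x2(6, 494, 5, 495)
```

With sparse complication counts like these, Fisher's exact test would usually be preferred over the chi-squared approximation.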

Results:

Over this period, almost 1,000 laparoscopic sleeve gastrectomies were performed. Half of all specimens showed an abnormality. The distribution of histologic diagnoses can be seen in the chart below. There were no incidental findings of malignancy but 12% of specimens exhibited features of premalignant change. There were no associations between histologic diagnosis and post-operative leak and/or bleed.

Conclusion:

Although a histologic diagnosis is common in the resected stomach from a sleeve gastrectomy, it is not related to post-operative bleed and/or leak. Histologic examination of the resected stomach may aid in identifying patients at increased risk of adenocarcinoma in the remnant. It remains arguable whether routine gastroscopy is required in all pre-operative sleeve gastrectomy patients.

64.08 Intraoperative Assessment to Select Segmental Resection vs. Local Excision for Colonic Endometriosis

H. R. Zahiri1, S. G. Devlin1, B. E. Ebert1, M. A. Benenati1, R. Marvel1, A. Park1, I. Belyansky1  1Anne Arundel Medical Center,Surgery,Annapolis, MD, USA

Introduction:

Deep endometriosis affects tissues at least 5 mm below the peritoneum, such as bowel, bladder, or ureters.  Dyspareunia, dysmenorrhea, and dyschezia are common manifestations.  Surgical management of colonic endometriosis is either segmental resection or the less invasive local excision of colonic wall lesions with primary repair.  Bowel resection is more invasive and morbid for patients.  Nevertheless, at present, no general consensus exists on the indication for bowel resection in this setting.  Surgeons select resection either preoperatively, based on exam and imaging, or intraoperatively, after direct examination of the colon and rectum.  Our common practice is intraoperative assessment for selection of the appropriate intervention.  The study aim was to determine outcomes after intraoperative assessment and selection of surgical intervention for colonic endometriosis.

Methods:
We conducted a retrospective review of all patients operated on for endometriosis at Anne Arundel Medical Center. From January 2012 to August 2014, 7 patients with deep endometriosis involving the colon or rectum underwent either local excision or segmental colorectal resection.  The surgical approach was selected after a detailed intraoperative examination of the colon and rectum.

Results:
Seven patients in our study underwent surgical therapy for colonic endometriosis.  Mean age, BMI, and ASA class were 37.7, 29, and 2.1, respectively.  All were affected by stage 4 endometriosis involving the rectum or rectosigmoid junction.  Intraoperative assessment found 3 patients with severe disease affecting the rectosigmoid region, including obstruction.  All 3 underwent laparoscopic low anterior resection (LAR) with primary anastomosis.  The remaining patients did not exhibit evidence of obstruction and were managed with local excision of colonic wall lesions with muscularis or serosal repair.  Mean surgery time and blood loss were 257.7 minutes and 200 ml.  The complication rate was 29% (2 UTIs, 1 seroma, and 1 ureteral injury unrelated to colon resection), with all complications occurring in 2 LAR patients.  Mean hospital length of stay was 2.7 days.  Mean follow-up was 92 days.  All patients had significant resolution of their endometriosis symptoms post surgery.

Conclusion:
Intraoperative assessment of deep endometriosis affecting the colon may serve to accurately determine disease burden and spare patients invasive colonic resection and its complications while providing effective symptomatic relief.

64.09 Is surgical resection justified for advanced intrahepatic cholangiocarcinoma?

T. Yoh1, E. Hatano1, K. Yamanaka1, S. Satoru1, T. Nitta1, S. Uemoto1  1Kyoto University,Division Of Hepatobiliary Pancreatic Surgery And Transplantation,Department Of Surgery, Graduate School Of Medicine, Kyoto University,Kyoto, KYOTO, Japan

Introduction:

European Association for the Study of the Liver (EASL) guidelines for the diagnosis and management of intrahepatic cholangiocarcinoma (iCCA) recommend that patients demonstrating intrahepatic metastases (IM), vascular invasion (VI), or lymph node metastases (LM) should not undergo resection. The aim of this study was to evaluate whether surgical resection is justified for iCCA with IM, VI, or LM.

Methods:
One hundred fifty-five patients who underwent hepatectomy for iCCA from 1993 to 2013 at Kyoto University Hospital were enrolled. Overall survival stratified by IM, VI, and LM, as well as other prognostic factors for survival, was analyzed.

Results:
The median survival time (MST) of all patients was 27.8 months. MST was 18.7 versus 41.7 months for patients with versus without IM (p<0.001), 16.9 versus 35.9 months for VI (p=0.017), and 12.5 versus 46.1 months for LM (p<0.001). Multivariate analysis demonstrated that CA19-9, LM, VI, and perioperative gemcitabine-based chemotherapy were independent prognostic factors for survival. Subgroup analysis showed that perioperative chemotherapy improved survival in patients with VI (37.4 months) or LM (35.9 months), but not IM (18.7 months).

Conclusion:
Although the prognosis of iCCA patients with IM, VI, or LM was poor, perioperative chemotherapy might improve survival. Surgical resection can be justified for iCCA with IM, VI, or LM if curative resection followed by chemotherapy is possible.