95.14 Global Collaborative Healthcare: Assessing resource requirements at a leading academic medical center

N. Rosson1, H. Hassoun1  1Johns Hopkins Medicine International,Department Of Surgery,Baltimore, MD, USA

Introduction:
Historically, global collaborations involving US academic medical centers (AMCs) focused on specific diseases and public health issues in less developed countries. Recently, rapid privatization of healthcare systems, economic development, and a shift in the disease burden have led providers in emerging countries to seek partnerships with AMCs in hopes of improving care for their citizens. This new paradigm is termed Global Collaborative Healthcare, and since 1999 Johns Hopkins Medicine International (JHI) has been at its forefront, facilitating global expansion of the Johns Hopkins Medicine (JHM) mission. We investigated the institutional faculty and staff resource requirements needed to support the JHI operating model.

Methods:
The size and scope of JHI's engagements have grown from consulting to projects of greater complexity, such as affiliations, hospital management, and joint ventures, with past engagements in over 50 countries and currently 18 active projects in 16 countries. JHI engages subject matter experts (SMEs) from the entities that comprise JHM. To facilitate and monitor the use of resources, JHI develops work orders that define the terms and services provided, which are retained in a JHI database. Data were extracted from this database for all work completed over a 3-year period (January 2013-December 2015), then sorted and analyzed to determine total utilization (hours and full-time equivalents (FTEs)) by professional category of staff and by clinical and non-clinical departments, schools, and institutes. For purposes of this analysis, 1 FTE = 2,080 hours.
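
As a rough illustration of the FTE arithmetic described above (1 FTE = 2,080 hours), the Python sketch below tallies work-order hours into FTEs overall and by department. The record layout is an assumption for illustration only, not the actual JHI work-order database; the hours shown are seeded from a few departmental averages reported in the Results.

```python
# Minimal sketch of the FTE arithmetic (illustrative record layout, not the JHI database).
HOURS_PER_FTE = 2_080  # 1 FTE = 2,080 hours, per the definition above

# Hypothetical work-order records: (department, professional category, annual hours)
work_orders = [
    ("Medicine", "faculty", 5239),
    ("Nursing", "nursing", 2537),
    ("Surgery", "faculty", 1603),
    ("Armstrong Institute", "non-clinical", 1914),
]

total_hours = sum(hours for _, _, hours in work_orders)
print(f"Total: {total_hours} hours = {total_hours / HOURS_PER_FTE:.2f} FTEs")

# Utilization by department, in hours and FTEs
by_department = {}
for department, _, hours in work_orders:
    by_department[department] = by_department.get(department, 0) + hours
for department, hours in sorted(by_department.items(), key=lambda kv: -kv[1]):
    print(f"{department}: {hours} hours ({hours / HOURS_PER_FTE:.2f} FTE)")
```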

Results:
JHI utilized on average 21,940 hours annually, or 10.55 FTEs, of SMEs. The majority of the work was performed by faculty from the School of Medicine, representing 77%, or on average 16,894 hours annually. The remaining 23% (5,046 hours) was a combination of nursing, allied health, and non-clinical staff. Clinical and allied health departments had an average annual utilization of 17,642 hours (7.8 FTEs), while non-clinical departments, schools, and institutes averaged 4,298 hours (1.9 FTEs), representing 80.4% and 19.6%, respectively. Within the clinical and allied health departments, Medicine and Nursing had the highest utilization, with annual averages of 5,239 and 2,537 hours, respectively, followed by Surgery and Research with 1,603 and 1,309 hours, respectively. Among non-clinical departments, schools, and institutes, the Armstrong Institute for Patient Safety and Quality had the highest utilization, with 1,914 hours annually.

Conclusion:

The global healthcare market is massive and expanding, providing a platform for leading AMCs to enter into collaborative partnerships with healthcare organizations around the world. In evaluating the JHI model, we found that significant human resources are required across a broad range of SMEs, and that with adequate forecasting AMCs can successfully engage in these collaborations while continuing to fulfill their core mission at home.

 

95.11 Development And Implementation of a Minimally-Invasive Surgery Curriculum in Ghana: Lessons Learned

G. E. Hsiung1,2, G. Ortega4, F. Abdullah1,2, D. Rhee5, K. A. Barsness1,2  1Northwestern University,Department Of Surgery,Chicago, IL, USA 2Ann And Robert H. Lurie Children’s Hospital Of Chicago,Division Of Pediatric Surgery,Chicago, IL, USA 3Women And Children’s Hospital Of Buffalo,Department Of Pediatric Surgery,Buffalo, NY, USA 4Howard University College Of Medicine,Outcomes Research Center, Department Of Surgery,Washington, DC, USA 5Memorial Sloan-Kettering Cancer Center,Division Of Pediatric Surgery,New York, NY, USA

Introduction:  

There is a paucity of evidence regarding the existing utilization of minimally-invasive surgery in resource-limited settings. With an estimated one-third of the global burden of disease still attributed to treatable surgical conditions and a growing prioritization of surgical care among the global community, we sought to determine the local usage of minimally-invasive surgical techniques and to develop and implement a minimally-invasive surgical curriculum in a resource-limited setting.

Methods:
After IRB exemption was determined, a 25-item needs assessment questionnaire was designed with expert consultation to determine the utilization and availability of minimally-invasive surgical equipment and techniques for five operations (appendectomy, hernia repair, small bowel resection, Nissen fundoplication, and cholecystectomy). During a minimally-invasive surgery course held in May 2016 in Accra, Ghana, twenty participants received technical and non-technical instruction using a pre-designed curriculum that included didactics and operative mentorship. Participants took a 10-item pre-test that assessed their comfort level with minimally-invasive surgery, as well as a post-test upon successfully completing the curriculum.

Results:

Although the curriculum was developed for physicians, only 20% of participants were physicians while 80% were nurses, representing 5 hospitals in two countries, Ghana and Nigeria. More than three-fourths of participants were not comfortable performing the five operations we inquired about, and one-third did not feel comfortable with patient selection for minimally-invasive surgical procedures. Only 10% of participants had performed more than 20 laparoscopic procedures. The three leading barriers to performing minimally-invasive surgery were expense, unfamiliarity with surgical technique, and lack of equipment. All participants were interested in participating in ongoing telementoring.

Conclusion:

There is a desire for more training and education in minimally-invasive surgery even in resource-limited settings. Lessons learned included the importance of performing an a priori needs assessment when designing a curriculum, so that it meets local needs based on resource availability and training.

 

95.09 Application of Student Research Objectives in an International Elective: Circumcision in Swaziland

A. R. Oddo1, A. Bales1, R. Siska1, D. J. Dennis1, E. VanderWal2, H. VanderWal2, R. Markert1, M. McCarthy1  1Wright State University Boonshoft School Of Medicine,Department Of Surgery,Dayton, OHIO, USA 2The Luke Commission,Sidvokodvo, MANZINI, Swaziland

Introduction: Educational objectives for medical student international electives are an important part of any travel program. Objectives such as learning research methodology or engaging in research projects focus students during their travels and are a valuable way to reinforce curriculum goals. Our project focuses on the use of an international database by medical students to produce clinically significant findings impacting international health policy. Our study examines the adverse event rate in voluntary medical male circumcision, a procedure demonstrated to reduce HIV transmission by over 60%. Not only is voluntary medical male circumcision a method of HIV prevention, it is also nearly 40 times more cost-effective than treating HIV with antiretroviral medications. By engaging in an academic research study during the international elective, students increased the educational value of the trip.

Methods: The Luke Commission is an NGO that provides mobile health outreach to rural Swaziland, including HIV testing and prevention. It performs more than 100 voluntary medical male circumcisions each week and maintains a database demonstrating program productivity and effectiveness. Information collected from 1500 Swazi males during the first six months of 2014 was de-identified and analyzed after approval by the Wright State University School of Medicine IRB.

Results: During this time period, 34 adverse events occurred in 31/1500 patients; these included bleeding, infection, and wound dehiscence. The overall adverse event rate for the procedure was 2.3%. Boys ≤12 years old had adverse events in 22/1022 circumcisions (2.2%) and patients ≥13 years old in 11/478 (2.3%; p=0.66). Patients ≤29 kg body weight had 19/662 (2.9%) and patients ≥30 kg had 13/838 (1.6%; p=0.40). There were no adverse events reported among the 75 HIV-positive patients included. There were more wound dehiscences during the summer months, 10/333 (3.0%), versus 10/630 (1.6%) in fall and 0/517 (0%; p=0.001) in winter.

Conclusion: Aid organization databases provide a source of information that can be used by medical students for research during international medical electives. The relationship between aid organizations, medical students, and patient populations can be a collectively beneficial one. Global health research has many complexities, but through careful planning and cultural awareness, medical students can contribute by publishing research that brings attention to global health issues and improves policies while having a significant positive effect on their own educational experience.

 

95.08 The Importance of Design Validation in Global Health Surgical Innovation

T. Schwab1,3, B. Fassl2, R. Patel2, J. Langell1,3  1University Of Utah,Department Of Surgery,Salt Lake City, UT, USA 2University Of Utah,Pediatrics,Salt Lake City, UT, USA 3University Of Utah,Center For Medical Innovation,Salt Lake City, UT, USA

Introduction: Creating high-quality new clinical technology solutions requires an in-depth understanding of user needs and environmental constraints. Accurately capturing market requirements, user needs, and design specifications for medical device innovation is multifactorial and challenging. Through on-site observation and design validation, innovators may develop better solutions to unmet medical needs and provide products with tremendous potential to impact healthcare delivery. Our study objective was to determine the effectiveness and value of design validation for medical devices designed to address global healthcare needs.

Methods: An observational comparative effectiveness study and survey were used to collect data involving multiple stakeholder viewpoints (provider, patient, regulatory, industry, academic, and societal). The study setting was a community hospital in rural India. A random population-based cohort sample was used to conduct a semi-longitudinal assessment of exposure-outcome relations from device prototype use and design validation. Medical device prototypes were exposed to potential users ranging in age from 1 to 92 years. Sixty-two subjects were observed over a 4-week period. User needs, market requirements, and design inputs were created using standard operating procedures for a novel non-invasive hemoglobin detection device according to the code of federal regulations governing the US FDA (21 CFR 820). Design requirements included user needs, product description, regulatory standards, functional requirements, performance and physical requirements, use environment, human and system interfacing, conceptual designs, and market analysis. After validation, each component of the traceability matrix was marked as "no change," "significant change," or "new addition."
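
As a sketch of how the post-validation traceability matrix could be tallied into change percentages, the Python snippet below counts items by status. The three labels come from the abstract; the item counts are hypothetical, chosen only to roughly reproduce the percentages reported in the Results, and are not the actual traceability matrix.

```python
from collections import Counter

# Hypothetical post-validation statuses for traceability-matrix items
# (labels from the abstract; counts are illustrative only).
requirement_status = ["no change"] * 40 + ["significant change"] * 6 + ["new addition"] * 5
specification_status = ["no change"] * 65 + ["significant change"] * 23 + ["new addition"] * 17

def summarize(statuses, label):
    counts = Counter(statuses)
    total = len(statuses)
    for status in ("no change", "significant change", "new addition"):
        share = 100 * counts[status] / total
        print(f"{label} - {status}: {counts[status]} ({share:.0f}%)")

summarize(requirement_status, "design requirements")
summarize(specification_status, "design specifications")
```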

Results: The study evaluated 156 original design requirements and specifications. Ten percent of the final design requirements were considered "new additions" and 12% were considered "significant changes." Sixteen percent of the final design specifications were considered "new additions" and 22% were considered "significant changes." Overall, 22% of the original market requirements and 38% of the design specifications changed significantly (Figure 1).

Conclusion: All of the changes and additions to the design requirements and specifications increased medical device design quality and safety while reducing the potential risk of lost time and investment. We strongly recommend environmental immersion for early validation of user needs and design specifications, beginning at design conception and continuing through the prototyping process.

 

94.20 Preoperative Bowel Preparation Does Not Influence the Management of Colorectal Anastomotic Leaks

K. Zorbas1, A. Choudhry2, H. Ross1, D. Yu2, M. Philp1  1Temple University,Department Of Surgery/Lewis Katz School Of Medicine,Philadelpha, PA, USA 2Temple University,Lewis Katz School Of Medicine,Philadelpha, PA, USA

Introduction: Controversy exists regarding the impact of preoperative bowel preparation on patients undergoing colorectal surgery, driven by earlier studies that failed to demonstrate protective effects of mechanical bowel preparation (MBP) against postoperative complications. However, in recent studies, combination therapy with oral antibiotics (AB) and MBP appears to benefit patients undergoing elective colorectal operations. We aimed to determine the association between preoperative bowel preparation and postoperative anastomotic leak management. We hypothesized that patients experiencing anastomotic leaks following preoperative AB+MBP would require reoperation for leak management less frequently.

Methods: Patients with anastomotic leak after colorectal surgery were identified from the 2013 and 2014 Colectomy Targeted American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database and included in the analysis. Each patient was assigned to one of four groups based on the type of preoperative preparation they had received: mechanical bowel preparation with antibiotics (MBP/AB), mechanical bowel preparation alone (MBP), oral antibiotics alone (OAB), or no preparation. First, descriptive statistics were used to describe preoperative patient characteristics (Table 1). The association between preoperative bowel preparation and postoperative anastomotic leak management was assessed using the chi-square test.
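
For readers unfamiliar with the test named above, a minimal Python sketch of a chi-square test on a preparation-by-management contingency table is shown below. The counts are invented placeholders, not NSQIP data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 4x2 table: preparation group (rows) by leak management
# (reoperation vs. non-operative management, columns). Counts are placeholders.
table = np.array([
    [60, 110],   # MBP/AB
    [45,  90],   # MBP alone
    [20,  35],   # OAB alone
    [55, 280],   # no preparation
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```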

Results: Of 1,678 patients who had an anastomotic leak after colorectal resection, 695 had adequate information. Baseline characteristics showed no statistically significant differences between the four groups in terms of age, gender, or ASA score, although a higher percentage of patients were of Caucasian ancestry. A chi-square test of homogeneity showed no statistically significant difference in the proportion of re-operated patients across the four bowel preparation categories (p=0.303).

Conclusion: Preoperative mechanical bowel preparation and antibiotic use in patients undergoing colorectal resection do not influence the management of subsequent anastomotic leaks.

 

94.12 Maturation of Effect Size During Enrollment of Prospective Randomized Trials

A. S. Poola1, T. Oyetunji2, G. W. Holcomb1, S. D. St. Peter1  1Children’s Mercy Hospital- University Of Missouri Kansas City,Department Of Surgery,Kansas City, MO, USA 2Lurie Children’s Hospital-Northwestern University,Department Of Surgery,Chicago, IL, USA

Introduction:
Randomized trials with a definitive study design are powered by calculating the minimum sample size required to achieve statistical significance given an estimated effect size (ES). The ES is the raw difference between the two treatment arms. It quantifies the magnitude of measurable differences between cohorts and usually reflects the true meaning of the trial regardless of statistical significance. Under a fixed protocol, we hypothesized that the ES may mature near its final magnitude well before the completion of enrollment. To investigate patterns of ES during enrollment, we analyzed completed randomized trials with definitive study designs.

Methods:
The primary outcomes of 11 prospective trials conducted at a single institution were reviewed. ES was calculated at intervals throughout each trial to determine the point at which a steady clinical difference was achieved between treatment cohorts.
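
A minimal sketch of the interim ES calculation is shown below, assuming a continuous primary outcome and using the abstract's definition of ES as the raw difference between arms; the simulated data are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated continuous primary outcome for the two arms of a completed trial.
arm_a = rng.normal(10.0, 3.0, size=100)
arm_b = rng.normal(8.5, 3.0, size=100)

# ES = raw difference between treatment arms, recomputed at interim
# enrollment fractions to see when it stabilizes.
for fraction in (0.25, 0.50, 0.75, 1.00):
    n = int(fraction * len(arm_a))
    effect_size = arm_a[:n].mean() - arm_b[:n].mean()
    print(f"{fraction:.0%} enrolled: ES = {effect_size:.2f}")
```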

Results:
Table 1 summarizes our overall findings, providing both the completed trial sample size and the number of enrolled patients at which the ES stabilized. ES stabilized at a median of 64% enrollment. All patients were needed to meet the precise ES in our smallest study, indicating the need for full enrollment in smaller studies. Otherwise, 50% of our trials required between 48% and 76% of patient enrollment to meet the ES. In comparing clinical outcomes, 9 of 12 showed a final difference that was nearly identical to the difference that could have been determined much earlier. Categorical outcomes met the stabilized ES at 51% enrollment and continuous outcomes at 68%.

Conclusion:
ES and final clinical outcomes were achieved prior to the completion of enrollment for most of our studies. This suggests that clinical differences detected by randomization may not necessarily require the robust sample size often needed to establish statistical significance. This is particularly relevant in fixed-protocol interventional trials of homogeneous populations where protocol compliance is high.
 

93.20 Growth in Robotic Hernia Repair due to Reduction of Laparoscopic Approach, not from Open Surgeries

P. R. Armijo1, D. Lomelin2, D. Oleynikov1  1University Of Nebraska College Of Medicine,Surgery,Omaha, NE, USA 2University Of Nebraska College Of Medicine,College Of Medicine,Omaha, NE, USA

Introduction: Advancements in technology have led to an increasing number of robotic surgeries over time, in a wide variety of procedures. The aim of this study is to evaluate current national trends in open (OVHR), laparoscopic (LVHR), and robotic (RVHR) ventral hernia repair (VHR) and to account for the growth of the robotic technique.

Methods: This is a multi-center, retrospective study of patients who underwent VHR from January 2013 to September 2015. The UHC clinical database resource manager (CDB/RM) was queried using ICD-9 procedure codes for OVHR, LVHR, and RVHR. Trends were evaluated between and within quarters (Q1 2013 to Q3 2015), and comparisons were made between OVHR and MIS approaches (which included both LVHR and RVHR) and within the MIS group. The last quarter of 2015 was excluded due to a change in the coding system. Data were analyzed in IBM SPSS v23.0 using the linear-by-linear association test.
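
For reference, the linear-by-linear association statistic used in SPSS can be computed as M^2 = (N-1)r^2, where r is the Pearson correlation between the ordinal scores (here, quarter index and procedure type), compared against a chi-square distribution with 1 degree of freedom. The sketch below uses simulated case-level data, not the UHC database.

```python
import numpy as np
from scipy.stats import pearsonr, chi2

# Linear-by-linear association (trend) test on simulated MIS cases:
# quarter index (0..10 for Q1 2013-Q3 2015) vs. robotic indicator.
rng = np.random.default_rng(1)
quarter = np.repeat(np.arange(11), 200)            # 200 MIS cases per quarter (illustrative)
p_robotic = 0.04 + 0.008 * np.arange(11)           # gradually rising robotic share
robotic = rng.binomial(1, np.repeat(p_robotic, 200))

r, _ = pearsonr(quarter, robotic)
n = len(quarter)
m_squared = (n - 1) * r ** 2                       # linear-by-linear statistic, 1 df
p_value = chi2.sf(m_squared, df=1)
print(f"M^2 = {m_squared:.2f}, p = {p_value:.4g}")
```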

Results: A total of 63,308 patients underwent VHR from 2013 to 2015 (OVHR: N=50,234; LVHR: N=12,293; RVHR: N=781). During this period, a significant increase of 2.54% was seen in OVHR relative to MIS approaches (graph 1). In the first quarter of 2013, OVHR accounted for 77.87% (N=4,584) of the 5,887 procedures versus 22.13% for MIS, whereas in 2015 OVHR increased to 80.41% (N=4,183) of 5,202 procedures, with a corresponding significant decrease of MIS (LVHR and RVHR) to 19.59% over the same time frame (p=0.007). Likewise, an interesting trend was seen within the MIS group: RVHR nearly tripled from 4.30% (N=56) in 2013 to 11.97% (N=122) in 2015, whereas LVHR decreased from 95.70% (N=1,247) to 88.03% (N=897) over the same period (p<0.001).

Conclusion: In the fields of urology and OB/GYN, growth in robotic surgery has converted open operations to MIS. From the data in this study, it appears that growth in RVHR has come from laparoscopic techniques, not from open surgery as previously thought. Effects on cost and long-term outcomes will need to be studied to better understand the impact of this trend.

 

93.19 Emergency General Surgery Transfers have Increased Mortality Risk

C. E. Reinke1, M. Thomason1, N. Rozario1, B. D. Matthews1  1Carolinas Medical Center,Department Of Surgery,Charlotte, NC, USA 2Carolinas Medical Center,Dickson Advanced Analytics,Charlotte, NC, USA

Introduction: Emergency general surgery (EGS) admissions account for more than 3 million hospitalizations in the US annually. Although EGS transfers who undergo surgery have been shown to have worse outcomes, EGS transfers who are managed non-operatively have not previously been studied.  We aim to better understand the characteristics and risk of mortality for EGS interhospital transfer (IHT) patients compared to EGS admissions from the Emergency Department (ED).

 

Methods: Using the 2002-2011 Nationwide Inpatient Sample, we identified patients aged ≥18 years with an EGS non-cardiovascular principal diagnosis (AAST EGS DRG ICD-9 codes) and urgent or emergent admission status. These patients were classified as IHT or ED patients based on admission source. Patient demographics, hospitalization characteristics, and rates of operation and mortality were identified and compared between the two groups. The risk of mortality for IHT patients compared to ED patients was calculated both before and after adjusting for patient characteristics in a multivariable analysis.
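
A toy sketch of the adjusted comparison is shown below: a logistic regression of in-hospital death on transfer status plus covariates, summarized as an odds ratio. The simulated data, covariate set, and coefficients are assumptions, and the sketch ignores the NIS survey design and weighting used in the actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated admissions (illustrative only; not NIS data, no survey weights).
rng = np.random.default_rng(2)
n = 5_000
df = pd.DataFrame({
    "transfer": rng.binomial(1, 0.02, n),               # 1 = interhospital transfer
    "age": rng.normal(59, 18, n).clip(18, 100),
    "female": rng.binomial(1, 0.54, n),
})
logit = -4.5 + 0.9 * df["transfer"] + 0.03 * (df["age"] - 59)
df["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("died ~ transfer + age + female", data=df).fit(disp=0)
odds_ratio = np.exp(model.params["transfer"])            # adjusted OR for transfer status
ci_lower, ci_upper = np.exp(model.conf_int().loc["transfer"])
print(f"Adjusted OR for transfer: {odds_ratio:.2f} ({ci_lower:.2f}-{ci_upper:.2f})")
```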

 

Results: From 2002-2011 there were an estimated 25,629,352 EGS admissions, 2% of which were IHTs. The mean age was 59 years, 54% were female, and 46% were Medicare patients. Transfer patients were more likely to be white, to be female, and to have Medicare. IHTs had higher rates of most comorbidities, with the exception of AIDS, blood loss anemia, coagulation deficiency, and drug abuse. Upper gastrointestinal tract and hepatobiliary diagnosis categories were the most common EGS diagnoses in both groups, but a higher percentage of ED admissions had colorectal, general abdominal, or soft tissue diagnoses compared to IHTs. IHTs were more likely to undergo a surgery or procedure and had a higher mortality rate. The odds of mortality were increased for IHTs and remained elevated even after controlling for patient characteristics and EGS diagnosis (Table 1).

 

Conclusions: EGS patients who are transferred from another acute care hospital are at higher risk of mortality even after controlling for a wide range of patient characteristics. They also undergo procedures and surgeries at a higher rate than ED patients. Future studies of other factors contributing to this increased risk may identify opportunities to decrease the mortality rate in EGS transfers.

93.16 Is Umbilical Hernia or Diastasis Recti Associated with Increased Risk of Ventral Incisional Hernia?

M. L. Moses1, C. Hannon1, D. V. Cherla1, K. Mueck1, J. L. Holihan1, S. Millas1, C. J. Wray1, L. S. Kao1, T. C. Ko1, M. K. Liang1  1University Of Texas Health Science Center At Houston,General Surgery,Houston, TX, USA

Introduction:
Despite the prevalence of umbilical hernia and diastasis recti in the general population, it is unknown whether the presence of either increases the risk of developing a ventral incisional hernia (VIH). We hypothesized that among patients undergoing abdominal surgery, individuals with an umbilical hernia or diastasis recti have an increased risk of developing a postoperative VIH.

Methods:
This was a retrospective study of all patients undergoing surgery for gastrointestinal cancer at a single institution from January 2011 to December 2015. These patients were chosen because of their high likelihood of having both preoperative and postoperative CT imaging. Inclusion criteria were surgery with a periumbilical incision and both preoperative and postoperative CT scans. To ensure that the baseline umbilical hernias were not VIHs from previous operations, all patients with previous abdominal surgeries were excluded. The primary outcome was whether a VIH was visualized on the postoperative CT scan, compared between groups by chi-square analysis.

Results:
A total of 159 patients met inclusion criteria and were followed for a median (range) of 41.7 (21.7-79.3) months. Prior to surgery, 93 (58% of the included cohort) had a radiographic umbilical hernia and 67 (42%) had diastasis recti. Following surgery, patients with a prior umbilical hernia were more likely to have a VIH on postoperative CT scan (67/93, 72% versus 26/66, 40%; p<0.001), while patients with preoperative diastasis recti were not more likely to develop a postoperative VIH (39/67, 58% versus 56/92, 61%; p=0.746).

Conclusion:
Umbilical hernias but not diastasis recti are associated with an increased risk of developing a postoperative VIH.  In addition, the prevalence of ventral hernias seen on CT scans before and after abdominal surgery is substantial. Further studies are needed to determine if radiographically diagnosed hernias are clinically significant and to define the appropriate role of imaging in diagnosing and assessing abdominal wall defects.
 

93.13 Assessing Anastomotic Perfusion in Colorectal Surgery with Indocyanine Green Fluorescence Angiography

A. Dinallo1, W. Boyan1, B. Protyniak1, A. James1, R. Dressner1, M. Arvanitis1  1Monmouth Medical Center,Surgery,Long Branch, NEW JERSEY, USA

Introduction: Major factors in preventing anastomotic leaks include adequate perfusion, a tension-free anastomosis, and minimal spillage. Conventional techniques for assessing bowel perfusion, such as palpating pulses and evaluating the color and bleeding of cut edges, are critical during colorectal surgery; however, they are subjective. As in all of medical practice, concrete objective data would be ideal when performing an anastomosis during colorectal surgery. Indocyanine green (ICG) fluorescence angiography seeks to provide such objective data when assessing tissue perfusion.

Methods: Between June 2013 and November 2015, 176 colorectal resections were retrospectively reviewed. Perfusion to the colon and ileum was clinically assessed and then measured using the SPY Elite imaging system. The absolute value provided an objective number on a 0-256 gray scale, representing differences in ICG fluorescence intensity and therefore perfusion. The lowest absolute value was used in the data analysis, as it represented the least perfused portion of the anastomosis.

Results: There were 93 resections performed for malignant disease and 83 for benign disease. A total of eleven operations required additional proximal resection due to low ICG readings. Complications included two anastomotic leaks (1.1%) and three stenoses (1.7%); one anastomotic leak resulted in a death from sepsis. The mean ICG absolute value across all colon resections was greater than 51.

Conclusion: This study represents a 29-month experience at a single institution using SPY technology in colorectal surgery. To date, this is the largest collection of data using SPY to objectively assess bowel perfusion when creating an anastomosis. The statistical significance of these values in relation to perfusion and anastomotic leaks has yet to be established in the literature; randomized controlled trials are required to determine these values.

 

92.19 The Optical Trocar Access in Laparoscopic Gastrointestinal Surgery

C. Tanaka1, M. Fujiwara1, M. Kanda1, M. Hayashi1, D. Kobayashi1, S. Yamada1, H. Sugimoto1, T. Fujii1, Y. Kodera1  1Nagoya University Graduate School Of Medicine,Dept. Of Gastroenterological Surgery,Nagoya, AICHI, Japan

Introduction: Optical trocar access is one technique for placing the first trocar in laparoscopic surgery. With optical trocar access, each tissue layer can be visualized prior to penetration, helping to prevent organ injury, and air leaks at the trocar site can be minimized even in obese patients. The aim of this study is to compare the time required for initial trocar insertion between the optical trocar access and open methods in patients who underwent laparoscopic gastrointestinal surgery.

Methods: We reviewed our prospectively collected database and identified 384 patients who underwent laparoscopic gastrointestinal surgery and in whom the initial trocar was inserted near the umbilicus either by optical trocar access or by the open method. Prior to comparison between the two methods, propensity score matching was used to adjust for essential variables between the optical trocar access and open groups. After matching, we assessed the influence of age, sex, BMI, comorbidity, history of abdominal surgery, type of disease, and the surgeon's experience with optical trocar access on the time required for initial trocar insertion. BMI was categorized as non-obese (<25 kg/m2) or obese (≥25 kg/m2).
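
A simplified sketch of one-to-one propensity-score matching is shown below: a logistic model estimates the probability of optical trocar access from baseline covariates, and each optical-access case is then paired with the nearest-score open case. The data and covariates are invented, and this nearest-neighbor step matches with replacement and without a caliper, unlike most formal matched analyses.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Simulated cohort (illustrative covariates, not the study database).
rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "optical": rng.binomial(1, 0.4, n),        # 1 = optical trocar access
    "age": rng.normal(65, 12, n),
    "obese": rng.binomial(1, 0.3, n),          # BMI >= 25 kg/m2
    "prior_surgery": rng.binomial(1, 0.35, n),
})

covariates = df[["age", "obese", "prior_surgery"]]
model = LogisticRegression(max_iter=1000).fit(covariates, df["optical"])
df["ps"] = model.predict_proba(covariates)[:, 1]   # estimated propensity score

treated = df[df["optical"] == 1]
control = df[df["optical"] == 0]

# Nearest-neighbor match on the propensity score (with replacement, no caliper).
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = control.iloc[idx.ravel()]
print(f"{len(treated)} optical-access cases matched to {len(matched_controls)} open controls")
```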

Results: Patients in the optical trocar access and open groups were matched one-to-one using propensity scores, generating 137 pairs. The time required for trocar insertion was significantly shorter in the optical trocar access group than in the open group (36.6 vs 209.8 seconds, respectively, P<0.01). In the optical trocar access group, prolonged time for initial trocar insertion was significantly associated with younger patient age and surgeon experience of 30 or fewer cases on univariable analysis. Multivariable analysis identified limited surgeon experience as the only independent risk factor for prolonged insertion time (OR 3.45, 95% CI 1.49-8.33, P<0.01; Table 2). Notably, BMI and history of abdominal surgery did not significantly affect the time required for trocar insertion in the optical trocar access group. In the open group, by contrast, prolonged insertion time was significantly associated with body mass index (OR 3.22, 95% CI 1.22-8.90, P=0.02) and history of abdominal surgery (OR 2.96, 95% CI 1.27-7.12, P=0.01).

Conclusion: This study suggests that optical trocar access may be recommended for insertion of the initial trocar in laparoscopic gastrointestinal surgery.

 

92.12 Trends in Parastomal Hernia Repair in the United States

T. Gavigan1, B. Matthews1, N. Rozario1, C. E. Reinke1  1Carolinas Medical Center,Department Of Surgery,Charlotte, NC, USA 2Carolinas Medical Center,Dickson Advanced Analytics,Charlotte, NC, USA

Introduction: Parastomal hernia is the most common complication after stoma creation. An estimated 120,000 new stomas are created each year, and recent studies report a parastomal hernia incidence approaching 80%, with more than half requiring surgical repair. Parastomal hernias create significant morbidity, including patient discomfort, small bowel obstruction, and the need for emergent surgery. Little is known about rates of parastomal hernia repair in the United States over the last 10 years. In this study we examined national trends in parastomal hernia repair (PHR), including annual procedure frequency, patient characteristics, and same-admission complications.

 

Methods: The 1998-2011 Nationwide Inpatient Sample was used to identify patients who underwent a PHR (ICD-9 PR 4642).   PHRs were classified as PHR with concurrent resiting (ICD-9 PR 4643), PHR with concurrent ostomy reversal (ICD-9 4652 or 4651), or primary PHR. Patient age, race, sex, comorbidities and type of insurance were identified. Complications, length of stay (LOS), and mortality were identified. The frequencies of patient characteristics and outcomes were calculated by year and by type of PHR and analyzed to identify trends.   

 

Results: The estimated number of annual parastomal hernia repairs increased from 4,161 to 7,646 (p<0.01, R2=0.85), for a total of 73,659 repairs. Thirty percent underwent a concurrent stoma reversal and 10% underwent resiting. The proportion of females undergoing PHR remained steady (58%). There was an upward trend in the proportion of privately insured patients (26%-31%, p<0.01) and in the number of patients with 3 or more Elixhauser comorbidities (17%-44%, p<0.01). The frequency of reversal increased while the frequency of resiting decreased. LOS remained steady (median 6.3 days) and in-hospital mortality ranged from 1.8% to 3.9% annually. Mortality and emergency admission status were highest for patients who underwent primary PHR, while the distribution of the number of comorbidities did not differ significantly between the three groups.

 

Conclusions: The incidence of parastomal hernia repair nationwide is increasing and more than half of patients undergo primary repair.  Although the surgical focus has moved towards prevention, parastomal hernia is a persistent complication of stoma creation. Further exploration is warranted to determine if the observed increase in parastomal hernia repair is related to perceived improved techniques and outcomes, an increasing incidence of parastomal hernia, patient characteristics or other factors.  

92.11 Genitourinary Paraganglioma: An Analysis of the SEER Database (2000-2012)

S. Purnell1, A. Sidana1, M. Maruf1, C. Grant2, S. Brancato1, P. Agarwal1  1National Cancer Institute,Urologic Oncology Branch,Bethesda, MD, USA 2George Washington University Hospital,Urology,Washington, DC, USA

Introduction: Extra-adrenal paragangliomas (PGL) are infrequent, benign, neuroendocrine tumors arising from chromaffin cells of the autonomic nervous system. While most develop above the umbilicus, they have been reported in the genitourinary (GU) tract. Given the paucity of literature on GU paraganglioma, our study aims to describe the demographic, pathologic, and clinical characteristics of GU PGL and to compare them with PGL at non-GU sites.

Methods: Data were collected from the SEER 18 database to compare GU and non-GU PGL diagnosed between 2000 and 2012. Chi-square and unpaired t-tests were used. Kaplan-Meier analysis and the log-rank test were used to determine overall survival, with statistical significance defined as p<0.05.
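
A minimal sketch of this kind of survival comparison, using the lifelines package on simulated follow-up times (not SEER data), is shown below.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Simulated overall-survival data for the two groups (months; illustrative only).
rng = np.random.default_rng(4)
t_gu, e_gu = rng.exponential(150, 20), rng.binomial(1, 0.2, 20)       # GU PGL
t_non, e_non = rng.exponential(80, 279), rng.binomial(1, 0.4, 279)    # non-GU PGL

km = KaplanMeierFitter()
km.fit(t_gu, event_observed=e_gu, label="GU PGL")
print(km.survival_function_.tail())       # Kaplan-Meier estimate for the GU group

result = logrank_test(t_gu, t_non, event_observed_A=e_gu, event_observed_B=e_non)
print(f"log-rank p = {result.p_value:.3f}")
```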

Results: 299 cases of PGL were retrieved, of which only 20 (6.7%) arose from the GU tract. GU PGLs developed most often in the bladder (83.3%), followed by the kidneys/renal pelvis (16.7%) and spermatic cord (2%). Non-GU PGL developed most frequently within the endocrine system (43%). Overall, PGL was more common in men than women. The mean age at diagnosis was higher in non-GU than GU PGL (50.4±17.2 vs 40.8±15.6 years, p=0.026). GU PGL was less common in whites compared to PGL at other sites (p=0.033). Half (50%) of GU PGLs were organ confined, while 5.7% of non-GU PGLs were localized at diagnosis. All cases of PGL were treated with surgery; 30% of patients with GU PGL underwent LN dissection and none received radiation. There were 2 (10%) cause-specific deaths in the GU PGL group between 2000 and 2012. Five-year overall survival was 93.3% for GU PGL versus 65.5% for non-GU PGL (p=0.062).

Conclusion: Genitourinary PGL remains rare, with low incidence (6.7% of all PGL cases) in the US population between 2000 and 2012, and it had high 5-year overall survival compared with PGL arising outside the GU tract. The bladder is the most common site of involvement, and surgery is the mainstay of treatment for GU PGL. Clearer prognostic factors are needed to guide PGL management; pooled studies from multiple institutions with detailed clinical information will be required to delineate them.

 

90.19 Physiologic Drivers Of Intraoperative Transfusion During Major Gastrointestinal Surgery?

M. Cerullo2, F. Gani2, S. Y. Chen2, J. K. Canner2, W. W. Yang3, S. M. Frank3, T. M. Pawlik1  1Ohio State University,Wexner Medical Center,Columbus, OH, USA 2Johns Hopkins University School Of Medicine,Department Of Surgery,Baltimore, MD, USA 3Johns Hopkins University School Of Medicine,Department Of Anesthesiology And Critical Care Medicine,Baltimore, MD, USA

Introduction: Current guidelines for transfusion focus largely on nadir hemoglobin (Hb) levels. Hb triggers may not be helpful, however, in defining appropriate intraoperative use of packed red blood cells (PRBCs). We sought to define the use of intraoperative PRBCs relative to quantitative physiologic factors at the time of surgery.

Methods:  Prospective perioperative data on patients undergoing major gastrointestinal surgery between 2010 and 2014 were analyzed. Risk of intraoperative transfusion was assessed with multivariable extended Cox models using clinical covariates (e.g. type of surgery, perioperative Hb, coagulation parameters, American Society of Anesthesiologists (ASA) classification, and Charlson co-morbidity), as well as time-varying intraoperative covariates (e.g. continuously-monitored mean arterial pressure [MAP], heart rate, and estimated blood loss [EBL]).
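
A sketch of an extended Cox model with time-varying intraoperative covariates, in the spirit of the analysis described above, is shown below using the lifelines CoxTimeVaryingFitter on simulated long-format data. The variables, interval structure, and coefficients are assumptions for illustration, not the study dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Simulated long-format data: one row per 30-minute intraoperative interval,
# with time-varying MAP and heart rate plus a baseline hemoglobin (illustrative).
rng = np.random.default_rng(5)
rows = []
for case_id in range(200):
    preop_hb = rng.normal(12, 2)
    for k in range(6):                                   # up to six 30-minute intervals
        mean_map = rng.normal(75 - 2 * k, 8)
        heart_rate = rng.normal(70 + 4 * k, 10)
        risk = 1 / (1 + np.exp(-(-4 + 0.05 * (100 - mean_map)
                                 + 0.02 * heart_rate - 0.2 * preop_hb)))
        transfused = rng.random() < risk
        rows.append({"id": case_id, "start": 30 * k, "stop": 30 * (k + 1),
                     "map": mean_map, "hr": heart_rate, "preop_hb": preop_hb,
                     "transfused": int(transfused)})
        if transfused:                                   # stop follow-up at first transfusion
            break

long_df = pd.DataFrame(rows)
ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="transfused", start_col="start", stop_col="stop")
ctv.print_summary()                                      # hazard ratios for map, hr, preop_hb
```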

Results: 2,428 patients were identified; 384 (15.8%) received an intraoperative transfusion. A higher risk of intraoperative transfusion was associated with preoperative factors including lower Hb (hazard ratio [HR]=1.22, 95% confidence interval [CI]: 1.14-1.30, p<0.001) and higher ASA class (HR=1.55, 95%CI: 1.24-1.93, p<0.001). Intraoperative risk factors for transfusion included higher EBL (HR=1.43, 95%CI: 1.27-1.62, p<0.001, per 1000mL), as well as lower instantaneous MAP (HR=1.15, 95%CI: 1.08-1.22, p<0.001) and higher heart rate (HR=1.30, 95%CI: 1.21-1.39, p<0.001). While the majority of patients were transfused for a physiologic indication, 27.1% of the 384 patients transfused intraoperatively never had a physiologic indication (heart rate >100, MAP <65, or nadir Hb <8) (Figure).

Conclusion: Physiologic indicators account for considerable variability in intraoperative transfusion practices among patients undergoing major surgery. Up to 27% of patients who received an intraoperative transfusion had no identifiable physiologic reason for transfusion, suggesting possible overutilization of PRBCs in a subset of patients.

 

88.16 Brachial Vein Arteriovenous Graft: Approach From Small Outflow Vein To A Large Diameter Vein

P. Sanchez1, J. C. Duque1, G. Klimovich2, H. Labove4, L. Martinez2, R. Vazquez-Padron2, L. Salman3, M. Tabbara2  1University Of Miami,Medicine,Miami, FL, USA 2University Of Miami,Surgery,Miami, FL, USA 3University Of Miami,Interventional Nephrology,Miami, FL, USA 4University Of Miami,Miller School Of Medicine,Miami, FL, USA

Introduction:

Arteriovenous grafts are created in the arm when there are no adequate veins for a fistula. The outflow vein is usually the axillary vein, in order to match the outflow to a 6-8 mm graft. Our technique instead uses a 3.5-4 mm brachial vein to create a preliminary mid-arm brachial artery to brachial vein arteriovenous fistula. This is followed by a graft extension, in which the fistula is ligated and the dilated, mature vein is used as the outflow in an end-to-end anastomosis.

Methods:
The study included 92 patients who underwent brachial-brachial arteriovenous graft creation at the University of Miami or Jackson Memorial Hospital from 2008 to 2015. Predictors of primary graft survival were assessed using multivariate logistic regression and Cox proportional hazards models adjusted for clinical and demographic covariates (age, gender, ethnicity, hypertension, diabetes, antiplatelet agents, statins, prior catheter use, history of previous AVF, and graft size).

Results:

Neither primary nor secondary graft survival was significantly correlated with the clinical and demographic covariates. Primary failure at one year (365 days) was 55.4% (51 patients), with a mean survival of 283 (±128) days. The most common intervention to maintain primary graft survival was balloon angioplasty (32, 64.0%), followed by thrombectomy (11, 22.0%) and surgical revision (7, 14.0%).

Conclusion:

Our results suggest that the technique of a brachial vein fistula followed by graft extension can result in a durable access and preserves the axillary vein for future grafts.

 

88.13 Human Immunodeficiency Virus Effect on Hemodialysis Arteriovenous Fistula Remodeling Outcomes

A. Dejman1, J. C. Duque2, L. Martinez3, L. Salman4, R. Vazquez-Padron3, M. Tabbara3  1University Of Miami,Nephrology,Miami, FL, USA 2University Of Miami,Medicine,Miami, FL, USA 3University Of Miami,Surgery,Miami, FL, USA 4University Of Miami,Interventional Nephrology,Miami, FL, USA

Introduction:
Arteriovenous fistulas (AVFs) are the preferred vascular access for hemodialysis in ESRD patients, yet they remain an area of intense research given their high rates of failure, complications, and cost burden to the health system. Multiple advances in the understanding of vascular disease in HIV patients, independent of hemodialysis access, have been reported. Notably, several authors have described a direct effect of the virus on the vessel wall, with these patients showing a higher incidence of cardiovascular illness, elevated morbidity and mortality, and poor vascular outcomes. Unfortunately, the impact of human immunodeficiency virus (HIV) infection on AVF remodeling and outcomes is not well known.

Methods:
This retrospective study assessed the impact of HIV infection on one-stage and two-stage hemodialysis AVF outcomes. The study included 494 patients, 42 of whom were HIV positive; all underwent AVF creation at the University of Miami/Jackson Memorial Hospital from 2008 to 2014. The effects of HIV on primary failure were determined using multivariate logistic regression and Cox proportional hazards models adjusted for 10 clinical and demographic covariates.

Results:

Primary failure was not correlated with clinical (including medication) or demographic covariates, although the HIV-positive population was relatively younger than controls. A diagnosis of HIV was positively correlated with AVF primary failure (p=0.004) regardless of anastomosis type. Patients with HIV and a history of a previous AVF showed an association with primary failure (p=0.002), and other access types, such as tunneled dialysis catheters, were also correlated with primary failure (p=0.012). T-cell subsets (CD3, CD4, or CD8) showed no association with primary failure.

Conclusion:

Our results suggest that HIV-related immunosuppression may play a role in AVF outcomes, especially primary failure. HIV infection is associated with an increased rate of AVF primary failure, but this is not explained by T-cell subset counts, suggesting a different immunologic relationship between AVF failure and vascular remodeling.

 

88.07 Mortality following Endovascular versus Open Repair of Abdominal Aortic Aneurysm in the Elderly.

S. Locham1, R. Lee1, B. Nejim1, H. Aridi1, M. Faateh1, H. Alshaikh1, M. Rizwan1, J. Dhaliwal1, M. Malas1  1Johns Hopkins University School Of Medicine,Surgery,Baltimore, MD, USA

Introduction:
Prior RCTs have reported better perioperative outcomes following endovascular aneurysm repair (EVAR) as compared to open aneurysm repair (OAR); the EVAR-1 and DREAM trials reported significantly higher mortality for OAR than for EVAR. However, most of these studies excluded the elderly. Age is a well-known risk factor for postoperative death, and the efficacy of these approaches remains controversial in the elderly population. The aim of this study is to provide recent real-world outcomes using the NSQIP database (2010-2014), looking exclusively at predictors of mortality in a large cohort of elderly patients in the United States.

Methods:

Using the NSQIP targeted vascular database (2010-2014), we identified all patients over 70 years of age who underwent OAR or EVAR for non-ruptured AAA. Exploratory analyses using Pearson's chi-square and Student's t-tests were performed. Univariate and multivariable logistic regression analyses were used to examine postoperative morbidity and mortality, adjusting for patient demographics and characteristics.

Results:
A total of 5,332 non-ruptured AAA repairs were performed [OAR: 809 (15%) vs. EVAR: 4,523 (85%)]. The majority of patients were male (77%) and white (81%), with a mean age of 78 ± 6 years. Diabetes mellitus and obesity were more prevalent in the EVAR group (15% vs. 12%, p=0.01; and 30% vs. 25%, p=0.002, respectively), whereas a history of chronic obstructive pulmonary disease (COPD) (22% vs. 19%, p=0.02) and smoking (35% vs. 23%, p<0.001) were more common in patients undergoing OAR. Mean operative time (250 vs. 151 minutes) and length of stay (11 vs. 3 days) were also longer for patients undergoing OAR versus EVAR (p<0.001). Mortality was higher following OAR than EVAR (8% vs. 3%, p<0.001). Compared to EVAR, OAR was associated with higher rates of cardiac (7% vs. 2%), renal (7% vs. 1%), pulmonary (20% vs. 3%), and any wound complications (4% vs. 2%) (all p<0.05). After adjusting for patient characteristics and comorbidities, OAR was associated with 3 times higher mortality than EVAR [OR (95%CI): 3.04 (2.01-4.57), p<0.001]. The predictors of mortality in our elderly cohort were age, female gender, smoking status, functional dependency, history of COPD, steroid use, bleeding disorders, progressive renal failure, transfusion, aneurysm diameter, and type IV TAAA.

Conclusion:

Our study reflects contemporary real-world outcomes following repair of non-ruptured AAA in the elderly. The endovascular approach was associated with a significant reduction in the risk of postoperative cardiac, pulmonary, and renal complications in the elderly. Open repair was associated with a 3-fold increase in mortality compared to EVAR and should be avoided in the elderly. Further prospective studies involving the geriatric population are required to better understand the predictors of mortality following AAA repair.

88.02 Bundling Of Reimbursement For Inferior Vena Cava Filter Placement Decreased Procedural Utilization

M. J. TerBush1, E. L. Hill1, J. Guido1, A. Doyle1, J. Ellis1, G. R. Morrow1, M. Stoner1, K. Raman1, R. J. Glocker1  1University Of Rochester,Surgery,Rochester, NY, USA

Introduction: On January 1, 2012, reimbursement for inferior vena cava filters (IVCF) became bundled by the Centers for Medicare and Medicaid Services (CMS). This resulted in a 70% decrease in the RVUs associated with IVCF placement, from 15.6 RVUs to 4.71 RVUs. Our hypothesis was that procedural utilization would decrease following this change. We previously performed an analysis that revealed no significant changes in utilization. As new data have become available, we have revised our analysis in an effort to identify practice pattern changes.
 

Methods: We analyzed data from 2010-2014 using the 5% inpatient, outpatient, and carrier files of the Medicare limited data sets, analyzing IVCF utilization while controlling for total diagnoses of deep vein thrombosis (DVT) and pulmonary embolism (PE) (ICD-9 codes 453.xx and 415.xx, respectively).
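
The adjusted utilization measure reported in the Results is a simple rate: IVCF placements per 10,000 DVT/PE diagnoses per year. A small Python sketch of that calculation is shown below; the yearly counts are placeholders, not values from the Medicare files.

```python
# IVCF placements per 10,000 DVT/PE diagnoses, by year (placeholder counts).
filters_placed = {2010: 1240, 2011: 1410, 2012: 1300, 2013: 1150, 2014: 820}
dvt_pe_diagnoses = {2010: 13500, 2011: 13400, 2012: 13170, 2013: 13110, 2014: 13550}

for year in sorted(filters_placed):
    rate = 10_000 * filters_placed[year] / dvt_pe_diagnoses[year]
    print(f"{year}: {rate:.0f} IVCF placements per 10,000 DVT/PE diagnoses")
```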

 

Results: In 2010 and 2011, the rates per 10,000 DVT/PE diagnoses were 918 and 1052, respectively (average 985). In 2012, 2013, and 2014, rates were 987, 877, and 605, respectively (average 823). The included figure demonstrates these trends graphically across specialties. Comparing each year individually, there is a significant difference (p<0.0001), with 2012, 2013, and 2014 having lower rates of IVCF utilization. Comparing averages between the 2010-2011 and 2012-2014 groups, there is also a significant decrease in utilization after bundling (p<0.0001).

Conclusion: These data demonstrate that adjusted IVCF deployment rates dropped after the introduction of a bundled code with reduced RVU and professional fee reimbursement values. This correlation may be evidence of a supply-sensitive medical service and a successful realignment based on procedural valuation. More data from 2015 to the present will be needed to show whether this decrease in utilization persists.

 

87.20 Case- Based Learning in Critical Care: An Appraisal of Current Literature

S. F. McLean1, A. H. Tyroch1, C. Ricci1  1Texas Tech University Health Sciences Center At El Paso,Surgery,El Paso, TX, USA

Introduction: Case-based learning (CBL) is an inquiry-based learning paradigm that requires some type of inquiry by the student, is based on actual or simulated cases with discrete learning objectives, and provides mentorship to achieve those objectives. A literature search was completed in order to assess its use in critical care education.

Methods: A literature search was completed using OVID and PubMed. Keyword searches used "Critical Care" (and, separately, Emergency Medicine, Medicine, and Surgical Critical Care) combined with "Case-Based" and "learning". A total of 595 abstracts were retrieved. Abstracts were discarded if they did not address critical care topics, contained no data, did not describe CBL, or had no English or Spanish translation. The method of teaching was assessed and categorized by the main teaching method. Chi-square testing was used for categorical variables.

Results: Of the 595 abstracts retrieved, 39 articles were kept. Key disciplines were surgical specialties (8, 20%), medicine (26, 67%), and anesthesia (2, 6%). Five continents were represented, with North America contributing 30 papers (77%). Methods of delivery were live (27, 69%), live plus online (6, 15%), live plus written (4, 10%), and online only (2, 5%). Teaching methods fell into 9 categories; the most common were simulation plus didactics (10, 26%), simulation cases only (9, 23%), case-based non-simulation only (5, 12.8%), and written materials followed by CBL (4, 10%). Nineteen studies (49%) listed CBL as part of the course. Learners ranged from students to postgraduates. Student numbers ranged from 3 to 413; 14 papers (36%) reported on practitioners and 25 (64%) on trainees or students.
Primary assessments were categorized into 5 types: written tests (13, 33%), observed skills clinical exam (OSCE) (10, 26%), survey of learners (6, 15%), review of practice behaviors (4, 10%), and review of patient outcomes (3, 7.7%). A post-course learner survey was used in 24 (62%) and was the sole course evaluation in 6 (15%).
Learners still in training for their specialty had more written tests as primary course evaluations (13 studies vs. 0 in practitioner-level courses) and fewer reviews of patient outcomes (0 in trainees vs. 3 in practitioners) (p=0.008). In-training studies used the OSCE more often (6) than practitioner studies (4).

 

Conclusion: CBL is used throughout the world to teach critical care topics to learners ranging from students in training to practitioners. Simulation is the most common way to deliver CBL. Evaluations differ significantly depending on learner education level.

 

87.16 Effect of Bariatric Surgery on Cardiovascular Risk Factors: Single Institution Retrospective Study

M. A. Al Suhaibani1, A. Al Harbi1, A. Alshehri2, E. Alghamdi2, R. Aljohani2, S. Elmorsy3, J. Alshehri3, A. Almontashery3  1Qassim University,Medicine,Qassim,Saudi Arabia 2Umm-alqura University,Makkah,Saudi Arabia 3King Abdullah Medical City,Makkah,Saudi Arabia

Introduction:
Bariatric surgery has become increasingly common for obese individuals with or without comorbidities and can result in remission and control of cardiovascular risk factors. We aim to describe the effect of bariatric surgery on cardiovascular risk factors, including glycemic control, blood pressure, lipid profile, and body weight, at King Abdullah Medical City (KAMC) in Makkah, Saudi Arabia.

Methods:
This was a retrospective cohort study including all obese patients who underwent bariatric surgery at KAMC between 2013 and February 2016.

Results:
A total of 566 patients were included; 58.1% (n=329) were female, the mean age was 34.8±10.2 years, and 26.9% (n=152) were smokers. Almost all underwent laparoscopic sleeve gastrectomy (95.9%, n=543). Diabetes and hypertension were the most common comorbidities (24.8%, n=118 and 18.9%, n=107, respectively). After 24 months of follow-up there was a significant reduction from the baseline means of body mass index (BMI) (47.6 kg/m2) and glycosylated hemoglobin (HbA1c) (6.8%) to 31.8 kg/m2 and 5.7%, respectively (p<0.001). Baseline high-density lipoprotein (HDL) (44.4 mg/dl), non-HDL cholesterol (147.1 mg/dl), and triglycerides (TAG) (123.6 mg/dl) also improved significantly, reaching 52.3 mg/dl, 137.7 mg/dl, and 78.5 mg/dl at 12 months of postoperative follow-up (p<0.001). Of 79 hypertensive patients, 57 had remission of hypertension at last follow-up (p<0.001). Among 103 preoperatively diabetic patients, 71 (68.9%) had complete remission and 17 (16.6%) had partial remission at last follow-up (p<0.001). The predicted 10-year risk of CVD decreased significantly from 7.4% to 5.5% (p<0.001).

Conclusion:
Bariatric surgery was significantly effective in controlling obesity-related cardiovascular risk factors and decreased the predicted 10-year CVD risk within a maximum of 2 years of follow-up.