Celebration of Research


Thank you for joining us!

The University of Florida Department of Anesthesiology’s Celebration of Research encourages collaboration with other scientists and clinicians, and provides a forum for the exchange of ideas and information.

Agenda

Thursday, March 5, 2026; HVN 8th Floor Conference Area

3:30 p.m. Doors Open
4 p.m. – 5 p.m. Poster Viewing
5 p.m. – 6 p.m. Abstract Winner Presentations
6 p.m. Keynote Presentation: Early Exposure to General Anesthesia – How Sensitive Is Our Young Brain?


Keynote Speaker

Vesna Jevtovic-Todorovic, MD, PhD, MBA, FASA

Vesna Jevtović‑Todorović, MD, PhD, MBA, is a professor of anesthesiology at the University of Colorado College of Medicine. She earned her MD in Belgrade (1985), PhD at the University of Illinois at Chicago (1990), and an MBA at the University of Virginia (2010). She completed residency training in anesthesiology at the Washington University School of Medicine in St. Louis. Jevtović‑Todorović is a leading investigator of anesthesia‑induced developmental neurotoxicity and chronic pain mechanisms, publishing more than 170 peer‑reviewed articles—including seminal work on calcium‑channel blockers that reduce nociception while sparing sedation. Her research, conducted in collaboration with the Center for Neuroscience, continues to shape safer anesthetic practices for newborns and adolescents.


Abstracts


Intravenous NanO₂ Therapy Mitigates Secondary Injury and Promotes Recovery Following Cervical Spinal Cord Trauma in Rats

Jiamei Hou, MD, PhD1,2; Caleb Brown1,2; Daniel Plant, MS1; Alexis Pullos1; Kelena Klippel, MS1,2; Fiona Cheung, MS1; Shigeharu Tsuda, PhD1,2; Araana Mondal1; Cynthia S Garvan, PhD2; Bruce Spiess, MD2; Floyd J Thompson, PhD1,3; Prodip Bose, MD, PhD1,2,4

1North Florida/South Georgia Veterans Health System, Gainesville, FL 32608; 2Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 3Department of Neuroscience, University of Florida College of Medicine, Gainesville; 4Department of Neurology, University of Florida College of Medicine, Gainesville.

Cervical spinal cord injury often leads to persistent locomotor impairments that severely diminish independence and quality of life. Although the primary mechanical trauma initiates immediate damage, subsequent secondary injury cascades—driven by ischemia, oxidative stress, and inflammation—exacerbate tissue loss and worsen neurological outcomes. Therefore, therapies that target these secondary processes hold promise for improving recovery. In this study, we investigated the neuroprotective potential of NanO₂, a patented intravenous emulsion of perfluorocarbon nanoparticles (NuVox, Tucson, AZ) designed to enhance oxygen delivery and limit ischemia-induced damage. Adult female Sprague-Dawley rats underwent moderate C6/7 spinal cord contusions (200 kdyn, Infinite Horizon Impactor) under anesthesia. Animals were randomized to receive either NanO₂ (0.6 ml/kg; initial dose at 20 minutes postinjury, followed by 2 additional doses at 90-minute intervals) or normal saline, administered via lateral tail vein using a WPI syringe pump at 50 µl/min. Functional outcomes were assessed using 3D kinematic gait analysis (SIMI Motion Systems) and CatWalk XT (Noldus) automated footprint tracking at preinjury and postinjury timepoints. Hindlimb spasticity was evaluated at postoperative weeks 4 and 8 using velocity-dependent ankle torque measurements, time-locked triceps surae EMG responses, and H-reflex rate-depression testing. In vivo 7T (MR Solutions) fast spin-echo (FSE) MRI was conducted at weeks 1, 4, and 8 to quantify lesion volume and edema. Immunohistochemical analyses were performed on spinal cord tissue to assess expression of inflammatory and oxidative stress markers, including IL-1β and nitrotyrosine (3-NT). Results to date demonstrate that NanO₂-treated animals exhibited significantly improved locomotor function and reduced hindlimb spasticity compared to controls. MRI revealed smaller lesion volumes, and histological data showed decreased expression of proinflammatory and oxidative markers in NanO₂-treated tissue. These findings suggest that NanO₂ effectively mitigates ischemia-driven secondary injury processes and improves functional outcomes following cervical spinal cord injury. This therapy represents a promising candidate for future translation in the management of acute spinal cord injury.

Support: Department of the Army – USAMRAA HT9425-23-1-0562 (SC220248), North Florida Foundation for Research and Education SUB00003847. We thank Dr. Evan Unger, executive chairman and cofounder of NuVox Pharma, for providing NanO₂.


Targeting Iron-Mediated Neurotoxicity in Spinal Cord Injury: A Novel Iron Chelator Therapy Reduces Motor Deficits and Promotes Recovery in a Rodent Model

Prodip Bose, MD, PhD1,2,3, Jiamei Hou, MD, PhD1,2, Shigeharu Tsuda, PhD1,2, Daniel Plant, MS2, Caleb W Brown, BS1, Fiona Cheung, MS1,2, Anam Kidwai, MS1,2, Alexis Pullos, BS1,2, Kelena S. Klippel, MS1,2, Nicole M. Weston, PhD2, Raymond J Bergeron, PhD4, Floyd J Thompson, PhD1,5
 

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville, FL; 2 North Florida/South Georgia Veterans Health System, Gainesville, FL; 3 Department of Neurology and the McKnight Brain Institute, University of Florida College of Medicine, Gainesville; 4 Department of Medicinal Chemistry, University of Florida, Gainesville; 5 Department of Neuroscience and the McKnight Brain Institute, University of Florida College of Medicine, Gainesville, FL

Introduction

Cervical spinal cord injury (C-SCI) is a major cause of long-term disability, leading to chronic motor, sensory, and pain-related deficits that significantly reduce quality of life. Current treatments offer limited benefits and do not address the progressive secondary injury processes that worsen outcomes over time. A key driver of this progression is hemorrhagic “free iron” released after contusion injury, which disrupts the blood–spinal cord barrier, promotes oxidative stress, fuels neuroinflammation, and contributes to ongoing neurological decline.

This study evaluates SP420, a novel targeted iron chelator, for its ability to neutralize hemorrhagic iron and mitigate secondary damage in a clinically relevant rodent model of C-SCI. We hypothesize that postinjury SP420 treatment will reduce oxidative injury and improve functional recovery. Successful outcomes would provide a strong foundation for advancing SP420 into human clinical trials and support the development of a needed, mechanism-based therapy for individuals with SCI, including civilian and veteran populations.

Methods

Moderate C-SCI was induced at C6/7 using a 200-kdyn force contusion protocol under general anesthesia and aseptic conditions, following an approved IACUC protocol. SP420 treatment was initiated at 2 postinjury time points (n = 24 per time point): 30 minutes and 4 weeks after SCI. Each treatment cohort consisted of 3 groups (n = 8 each): SP420-treated, injured controls, and naïve controls. SP420 was administered subcutaneously at 66 mg/kg on alternating days for 2 weeks, while controls received equivalent saline volumes. To reduce bias, procedures included standardized operations, formal experimental design with blocking and stratification, randomization, and blinding.

An acute SCI cohort also received SP420 beginning 30 minutes postinjury at a human phase II-equivalent dose (66 mg/kg, SQ). Primary outcome measures included quantitative assessments of spasticity, gait, and axonal conduction in descending locomotor pathways. Secondary outcomes incorporated advanced imaging modalities—T1/T2-weighted MRI, susceptibility weighted imaging, quantitative susceptibility mapping, and diffusion tensor imaging.

Safety evaluations encompassed activity and general health monitoring, hemoglobin levels, and liver and kidney function tests. Histological and immunohistochemical analyses further examined iron deposition, oxidative stress, inflammation, blood–spinal cord barrier integrity, and markers of neural and vascular protection. Collectively, these assessments provide a comprehensive evaluation of SP420’s safety and therapeutic potential in mitigating SCI-induced pathology.

Results

C6/7 contusion injury caused a marked increase in velocity-dependent ankle torque and stretch-evoked EMG amplitudes of ankle extensor muscles, indicating spasticity. SP420 treatment significantly attenuated spasticity in both acute and chronic conditions, with acute administration showing the greatest effect. Beyond spasticity, SP420 improved multiple gait parameters, including enhanced interlimb coordination during locomotion, compared to saline-treated injured animals.

Mechanistically, free iron exacerbated oxidative stress and neuroinflammation via ROS, driving progressive neurological and motor deficits. SP420-treated animals exhibited significant reductions in proinflammatory molecules (IL-1β) and ROS markers (3-NT) compared to untreated SCI controls. The therapy mitigated iron-mediated neurological damage, prevented inflammation-induced neuronal cell death, and promoted neuroplasticity at the injury site. In vivo and ex vivo diffusion tensor imaging analyses demonstrated improved preservation and regenerative capacity of dorsal corticospinal and rubrospinal motor tracts.

These results indicate that SP420 effectively targets oxidative stress and neuroinflammation, reduces chronic SCI-related motor disabilities, and fosters neuroplastic recovery, supporting its potential as a safe and mechanism-based therapeutic approach.

Conclusion

This preclinical study provides the first comprehensive evidence linking SCI-induced motor deficits to iron-mediated oxidative stress and neuroinflammation, and it demonstrates that SP420 can effectively mitigate these pathological processes. These findings underscore the potential of innovative, noninvasive, and patient-centered therapies to transform SCI treatment and rehabilitation. Supported by robust efficacy and safety data, SP420 represents a strong candidate for an FDA investigational new drug (IND) application, laying the groundwork for future clinical trials. Successful translation could revolutionize SCI care by delivering targeted therapeutics alongside prognostic imaging tools, ultimately improving functional recovery, quality of life, and long-term outcomes for individuals living with SCI.

Funding: Supported by Spinal Cord Injury Research Program (SCIRP) Investigator-Initiated Research Award #SC210266 from the United States Department of Defense (DoD) and Merit Review Award #B3986-R/1 I01 RX003986-01A1 from the United States Department of Veterans Affairs Rehabilitation Research and Development Service (RR&D).


Noradrenergic Modulation of Neuroinflammation and Cognitive Deficits Following Mild Traumatic Brain Injury

Kelena Klippel, MS1, 2; Daniel Plant, MS2; Fiona Cheung, MS2; Jiamei Hou, MD, PhD1, 2; Floyd Thompson, PhD2, 3; Prodip Bose, MD, PhD1,2,4

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Brain Rehabilitation Research Center, Malcom Randall VA Medical Center, Gainesville; 3 Department of Neuroscience, University of Florida College of Medicine, Gainesville; 4 Department of Neurology, University of Florida College of Medicine, Gainesville

Introduction

Damage to the noradrenergic (NA) system contributes to persistent neuroimmune and metabolic dysfunction following mild traumatic brain injury (mTBI), the most common injury severity level experienced by the global population (75–90% of 70 million individuals). Despite its classification as “mild,” mTBI is associated with long-lasting neurocognitive, affective, and somatic symptoms. However, the absence of targeted pharmacological interventions remains a critical barrier to improving long-term outcomes. Our laboratory has shown that experimental TBI in rats results in persistent reductions in NA cell and fiber density, particularly in brain regions highly dependent on NA tone for homeostatic function, including the hippocampus, prefrontal cortex, and amygdala. This compromised NA signaling lowers norepinephrine levels and disrupts blood–brain barrier integrity, immune cell maturation, and microglial regulation. An underexplored yet promising therapeutic target is the central NA system. NA signaling plays a pivotal role in modulating immune function through mechanisms that influence microglial activation, cytokine expression, and oxidative phosphorylation. Norepinephrine promotes mitochondrial ATP production, limits reactive oxygen species generation, and suppresses proinflammatory cytokine production while shifting microglia toward a more anti-inflammatory phenotype. When NA tone is diminished—such as through damage to the locus coeruleus or its projections—these regulatory mechanisms fail, allowing unchecked neuroinflammatory cascades to persist. Disruption of NA signaling after TBI likely plays a central role in exacerbating immune dysfunction during the subacute and chronic phases of injury. Given this mechanistic vulnerability, restoring NA tone offers a targeted strategy to mitigate immune dysfunction. Methylphenidate (MP), a norepinephrine–dopamine reuptake inhibitor widely used in attention deficit disorders, represents a promising and potentially repurposable therapeutic agent.

Figure 1. Region-based qualitative analysis of 18F-FDG PET imaging revealed that overall glucose metabolism in the treated TBI group was comparable to naïve controls, indicating a restoration of metabolic activity toward baseline levels. TBI-control animals showed increased FDG uptake.

Methods

Adult female Sprague Dawley rats (250–300 g, 10–12 weeks old) were used. Animals were block randomized to the following experimental groups (n = 10 per group): mTBI + MP, mTBI + vehicle/control, naïve. Mild diffuse TBI was induced using a modified Marmarou weight-drop model, which replicates concussive injury mechanisms with high translational relevance. Under 2% isoflurane anesthesia, a 450 g brass weight was dropped from 1.25 m through a Delrin tube onto a 10 mm steel disk affixed to the skull. The animal was placed on a foam platform (12×12×43 cm) to yield a standardized 2500 N/m impact over ~0.2 ms. MP was then administered orally in a chronic postinjury window at a dosage of 2.5 mg/kg, starting after the emergence of secondary neuroinflammatory cascades and behavioral deficits. Vehicle-treated animals received matched dosages of saline. MP was freshly prepared and administered twice daily at the same circadian time.

Outcomes

A multimodal neuroimaging approach—including resting-state functional MRI (rsfMRI) to examine large-scale functional connectivity across neural networks, [¹⁸F]-FDG PET (FDG-PET) for regional glucose metabolism as a measure of neuronal activity and bioenergetic demand, and diffusion tensor imaging (DTI) to assess white matter microstructure—was used to monitor treatment response and characterize underlying metabolic and structural changes. All rats underwent the neuroimaging scans at baseline and at 1 and 3 months postinjury. All animals underwent behavioral testing at 3 months postinjury before terminal tissue collection. Behavioral outcomes included assessments of cognition (Morris water maze) and sensorimotor gating/NA tone (startle reflex/prepulse inhibition). Tissue from region-specific areas for cognition—the prefrontal cortex and hippocampus—will be assessed using the Simoa Cytokine 4-Plex to quantify TNF-α, IL-1β, IL-6, and IL-10 via the Quanterix HD-X platform. Sholl analysis will be performed to examine microglial activation using Iba1 and CD68 immunostaining.

Figure 2. Startle reflex testing revealed that TBI-induced deficits in inhibitory gating were rescued in the treated group, suggesting restoration of NA tone.

Results

Overall, MP rescues inhibitory control and NA tone, as startle reflex testing revealed that TBI-induced deficits in inhibitory gating were rescued in the treated group. Statistical analysis using a 2-way ANOVA (factors: group × intensity) revealed a significant treatment effect (P < .01). These findings indicate that MP effectively normalizes NA-dependent behavioral responses following injury. MP also enhances search strategy and short-term memory following TBI without compromising safety. Across all measures, treated animals performed comparably to or better than TBI-controls, indicating no safety concerns or adverse behavioral effects related to MP administration. Representative swim paths from each group show that MP-treated TBI animals adopted more strategic, goal-directed search patterns, whereas TBI-controls exhibited predominantly random searching. Statistical analysis using an unpaired t-test revealed a significant injury effect (P < .01). During the retention probe, treatment effects were less pronounced, suggesting benefits primarily for short-term rather than long-term memory. Together, these data support a selective enhancement of early cognitive recovery without evidence of behavioral toxicity. Additionally, MP restores glucose metabolism toward baseline levels after TBI. Global and region-based qualitative analysis of 18F-FDG PET imaging revealed that overall glucose metabolism in the treated TBI group was comparable to naïve controls. In contrast, TBI-control animals showed increased FDG uptake. These findings suggest that the treatment supports neuronal recovery and energy utilization without inducing hypermetabolic or adverse effects. Furthermore, rsfMRI indicated changes in functional connectivity between pretreatment and posttreatment in the TBI-control group, with heightened functionality within anxiety-related regions, while the TBI-treated group showed decreased functionality between cognitive regions, suggesting neuroplastic changes within the cognitive network with treatment. These connectivity shifts are consistent with a normalization of circuit-level activity following MP treatment.

Conclusions

Our findings suggest that treatment restores inhibitory control and NA tone. For instance, TBI-induced deficits in inhibitory gating were effectively rescued, indicating recovery of sensorimotor and pre-attentive processing. Statistical analysis using a 2-way ANOVA (factors: group × intensity) revealed a significant treatment effect (P < .01). These outcomes support the conclusion that treatment normalizes NA-dependent functional responses after injury. Our findings also suggest that cognitive function is improved, with treated animals showing enhanced short-term memory and more strategic, goal-directed behaviors, highlighting benefits for learning and cognitive flexibility. In particular, improvements in search strategy and early memory performance underscore treatment-related gains in executive processes. Treated animals also showed normalization of metabolic activity, with glucose metabolism restored toward baseline levels, supporting neuronal energy utilization and recovery without inducing hypermetabolic stress. This metabolic stabilization aligns with reduced pathological energy demand commonly observed after TBI. Lastly, functional connectivity analyses point to neuroplastic remodeling between pretreatment and posttreatment, revealing targeted changes in cognitive networks, with reduced hyperactivity in anxiety-related regions and enhanced organization of cognitive circuits. Collectively, these findings demonstrate that the treatment promotes multi-level recovery—behavioral, metabolic, and network-level—after TBI, supporting its translational relevance. Overall, the data provide proof-of-concept that restoring NA tone may represent a viable therapeutic avenue for improving signature functional outcomes following mTBI.

Funding

Supported by SPiRE Award # K9J8XA7VF6T7 from the United States Department of Veterans Affairs Rehabilitation Research and Development Service (RR&D) and I. Heermann Anesthesia Foundation Award #AWD19027.


Efficacy of Transcranial Magnetic Stimulation Treatment for Impaired Amygdalar Neurotransmitter Systems Following Traumatic Brain Injury in Rats

Shigeharu Tsuda, PhD1,4; Joseph V. Watts, PhD4; Jiamei Hou, MD, PhD1,4; Floyd J. Thompson, PhD2,4; Prodip Bose, MD, PhD1,3,4

1 Department of Anesthesiology, 2 Department of Neuroscience, 3 Department of Neurology, University of Florida College of Medicine, Gainesville; 4 North Florida/South Georgia Veterans Health System, Gainesville

Introduction

It is reported that 50–60 million new traumatic brain injury (TBI) cases occur per year. Most of these injuries are categorized as mild TBI and are leading causes of disability, such as prolonged anxiety disorders, across all ages and nations. While the basolateral amygdala (BLA) is known to play an important role in regulating anxiety, TBI-induced alterations of neuroregulatory systems in the BLA and their associations with anxiety disorders are not well understood. Since the therapeutic potential of transcranial magnetic stimulation (TMS) to treat TBI-induced anxiety has been shown in the literature, knowledge of these pathological changes and the efficacy of this treatment is critical for developing therapeutic methods for TBI-induced anxiety disorders (including posttraumatic stress disorder). Accordingly, the purpose of this study was to determine the impact of chronic TBI on neurotransmitter systems in the BLA and the restorative effects of TMS on them in a clinically relevant rodent model.

Methods

Following an acclimatization period (at least 1 week), rats were randomly separated into naïve (normal intact), TBI, and TBI + TMS groups (n = 3 per group). For TBI and TBI + TMS groups, mild-to-moderate closed-head TBI was induced using a weight-drop model, and TMS was given every other day for 4 weeks (starting 1 week after injury). Twenty-one weeks after TBI, all animals were deeply anesthetized and transcardially perfused via the left ventricle with phosphate-buffered saline followed by 4% paraformaldehyde in phosphate buffer. Following fixation in the 4% paraformaldehyde solution (overnight) and incubation with 30% sucrose in the phosphate buffer, brains were coronally sectioned to be used for immunofluorescence staining of crucial molecules for the noradrenergic and GABAergic systems as well as neuroprotective, scaffolding, and cross-linking molecules.

Figure 1. TMS treatment attenuated TBI-induced reduction of DBH-ir noradrenergic (A) and GAD-ir GABAergic (B) fiber innervation of the BLA.
Figure 2. TMS treatment attenuated TBI-induced reduction of PSD95/NeuN-ir (A) and MAP2/NeuN-ir (B) BLA neurons.

Results

We revealed that chronic TBI significantly decreased dopamine beta-hydroxylase-immunoreactive (-ir) noradrenergic and glutamate decarboxylase-ir GABAergic fiber innervation of the BLA (P < .00001 and P < .01, respectively). Furthermore, expressions of GABA B receptor 1, brain-derived neurotrophic factor, postsynaptic density protein 95, and microtubule-associated protein 2 in the NeuN-ir BLA neurons were significantly decreased following injury (P < .01, P < .01, P < .0001, and P < .001, respectively). However, all of these TBI-induced unfavorable neurobiological phenomena were significantly attenuated by the TMS treatment (P < .001, P < .01, P < .01, P < .01, P < .01, and P < .01, respectively), although not fully rescued (P < .001, P < .01, P < .0001, P < .05, P < .00001, and P < .01, respectively).

Conclusion

We concluded that the TMS treatment partially restored TBI-induced reduction of noradrenergic and GABAergic supplies to the BLA as well as expressions of neuroprotective, scaffolding, and cross-linking molecules in the BLA. Although TMS is not a panacea for TBI-induced multiple morbidities, it has great potential as an effective therapeutic option.

Funding: This work was supported by VA Rehabilitation Research and Development Service Merit Review Awards (# B3123-I/I01 RX003123 and B78071/1I01 RX000502-01A) and VA SPiRE grant # K9J8XA7VF6T7.


The Brain-Gut-Microbiome Axis in a Female Rat Model of Postoperative Neurocognitive Disorder

Itamar Gal, Zeeshan Khan, Ling Sha Ju

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Surgery and general anesthesia may accelerate cognitive decline, termed perioperative neurocognitive disorder, potentially contributing to Alzheimer’s disease. However, the mechanisms, vulnerable populations, and prevention strategies remain unclear. Building on findings from the Martynyuk Laboratory (UF Anesthesiology), we investigated in female rats whether general anesthesia with sevoflurane (Sevo) alters gut microbiota, a component of the brain-gut-microbiome (BGM) axis, in a sex-dependent manner (findings in male rats were published in Khan et al). We further hypothesized that inhibition of the NKCC1 Cl⁻ importer with bumetanide would mitigate Sevo-induced microbiota effects.

Methods

Sprague Dawley female rats were exposed to 2.1% sevoflurane for 3 hours on postnatal days (P) 56, 58, and 60. Prior to each exposure, they received either saline as vehicle (Control and Sevo groups) or bumetanide (BS group, 1.84 mg/kg, intraperitoneally). Gut microbiome samples were collected on P60 and analyzed using 16S rRNA gene sequencing. My role in this project has focused on performing bioinformatics analyses of sequencing data obtained from study samples collected by other members of the laboratory.
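
The alpha and beta diversity metrics reported below (Shannon; weighted UniFrac and Bray-Curtis) reduce to simple computations on the ASV count table that a 16S pipeline produces. As a minimal sketch, assuming a toy count table with hypothetical values (not study data, and not the laboratory's full pipeline):

```python
import numpy as np
from scipy.spatial.distance import braycurtis

def shannon_diversity(counts):
    """Shannon alpha diversity (natural log) from a vector of ASV counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# Toy ASV counts for two samples (columns = ASVs); values are hypothetical.
control = np.array([120, 80, 40, 10, 5])
sevo = np.array([115, 85, 38, 12, 4])

print(shannon_diversity(control))  # alpha diversity within each sample
print(shannon_diversity(sevo))
print(braycurtis(control, sevo))   # beta diversity between the two samples
```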

Figure 1. (A) Shannon alpha diversity index and (B, C) beta diversity indices (weighted UniFrac and Bray-Curtis, respectively). (D-F) Volcano plots of ASVs, showing enriched ASVs (log2 fold change) and depleted ASVs (-log2 fold change) on the x-axis and the -log10 adjusted P value on the y-axis. Horizontal and vertical dashed lines separate nonsignificant ASVs. (G) Stack plot of relative abundance percentage of the groups at the genus level. (H) PCoA plot showing the distribution of taxonomic composition at the family level across all groups. (I) Bar plot of the Firmicutes to Bacteroidetes (F/B) ratio.

Results

Compared to Control rats, sevoflurane exposure produced nonsignificant changes in the gut microbiome. Alpha and beta diversity metrics remained consistent across groups. Differential abundance analysis revealed minor shifts in gut microbial amplicon sequence variants (ASVs) in the Sevo and BS groups, with a tendency toward depletion. The Lactobacillus and Bacteroides genera showed increased abundance following treatment, while overall community structure remained stable across groups. Consequently, the effects of bumetanide pretreatment were unremarkable.

Conclusions/Significance

These findings suggest that, in young adult female rats, the BGM axis is tolerant to repeated sevoflurane exposure at the time of treatment and therefore may not contribute to perioperative neurocognitive disorder pathophysiology.

Figure 2. (J) BugBase-predicted relative abundance of stress-tolerant bacteria within the gut microbiome of each group (box plot). (K) Stack plot of the corresponding OTU contribution of groups at the family level. Red: Alcaligenaceae; orange: Enterobacteriaceae; grey: others.

Acknowledgements

This study was supported by the 2025–2026 UF COM University Scholars Award to Itamar Gal (mentor: Dr. Anatoly E. Martynyuk). This project is part of the broader research program in the Martynyuk Laboratory, supported by the National Institutes of Health (R01HD107722 and R56HD102898 to A.E.M.), the J.H. Modell, M.D., Endowed Professorship (N.G.), and the UF Department of Anesthesiology. The author also gratefully acknowledges the help and guidance of Drs. Zeeshan Khan and Ling-Sha Ju in the Martynyuk Laboratory.


Computer Vision Interpretation of Chest Drainage Canister Data

Mario Rampangu, MS1; Simon Mesber, BS1; Subhash Nerella, PhD2; Christopher Samouce, PhD1; Keerthi Humsika Kattamudi, MS1; Gregory Janelle, MD1; Mindaugas Rackauskas, MD3; Nikolaus Gravenstein, MD1; Samsun Lampotang, PhD1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Biomedical Engineering, University of Florida College of Engineering, Gainesville; 3 Department of Surgery, University of Florida College of Medicine, Gainesville

Introduction

We explore whether contactless, remote, automated, and continuous monitoring via computer vision (CV), a field of AI, can convert data from a standalone, mechanical (non-digital) chest drainage canister into useful clinical information.

Methods

Ten features of interest were addressed by 4 specialized models: (1) drainage volume in 10 mL increments up to 2,000 mL, (2) drainage rate over the last 30 minutes, (3) setting of the Dry Suction Control Dial (-10, -15, -20, -30, -40 cm H2O), (4) Suction Control Indicator state (dialed suction level achieved [YES: orange float appears] or not achieved), (5) number of columns in the Patient Air Leak Meter with air bubbles (air leak rate from 1 [low] to 7 [high]), (6) air leak mode (continuous or intermittent), (7) Negative Pressure Indicator state (is there negative pressure in the collection chamber of the canister?), (8) use mode: passive (water seal/gravity, no vacuum) or active (vacuum applied), (9) whether the chest drainage canister is ≥90% full and needs to be changed, and (10) RGB color of fluid in the canister. A parent model (YOLOv8n) handled canister feature detection; YOLOv8-cls classified the settings in (3), (4), (7), and (8); MoviNetA2 handled (5) and (6); and YOLOv8s estimated (1) and (2).
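
To illustrate how a parent detector can route crops to the specialized models, here is a minimal sketch using the ultralytics API; the weight files and class names are hypothetical stand-ins, not the trained models from this study, and the video-based air leak features (5) and (6) are omitted because they require temporal input.

```python
from ultralytics import YOLO
import cv2

# Hypothetical fine-tuned weights (placeholders, not the study's models).
parent = YOLO("canister_parent_yolov8n.pt")    # detects canister features
dial_cls = YOLO("settings_yolov8n-cls.pt")     # classifies settings (3),(4),(7),(8)
volume_det = YOLO("volume_yolov8s.pt")         # estimates readings (1),(2)

frame = cv2.imread("canister.jpg")
for box in parent(frame)[0].boxes:
    x1, y1, x2, y2 = map(int, box.xyxy[0])
    crop = frame[y1:y2, x1:x2]                 # crop the detected feature
    label = parent.names[int(box.cls[0])]
    if label == "suction_dial":                # hypothetical class name
        print(label, dial_cls(crop)[0].probs.top1)
    elif label == "collection_chamber":        # hypothetical class name
        print(label, volume_det(crop)[0].boxes)
```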

We captured ~2000 images of a chest drainage tubing and canister (Pleur-evac S-1100-08LF, Teleflex) at floor level with an HD camera (StreamCam Plus 1080, Logitech) positioned at 2, 3, and 5 ft from the center of the canister front and azimuth angles of -45°, -30°, 0°, 30°, and 45°. Each image was manually annotated with its known readings and settings, such as the suction dial position, as the ground truth. Using imgaug, a Python library, we augmented the number of images to ~150 000 for the different models via these transformations (rotation, scaling down, flipping, Gaussian blur, brightness, grayscale, translation, Gaussian noise, shearing, color jitter, colorspace). Roboflow was used for annotation. The images were split 70/20/10 for training/validation/testing. Training of the model was performed on a HiPerGator cluster of 8 GPUs (Nvidia A100, 80 GB). The trained model ran in real time on a laptop equipped with an NVIDIA 1650Ti 4GB GPU (IdeaPad Gaming 3, Lenovo). For verification, the readings/settings predicted by the trained model were compared against the annotated readings/settings.
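
An augmentation pipeline of this kind is straightforward to assemble with imgaug; the sketch below uses a subset of the listed transformations with illustrative parameter ranges (the study's exact ranges are not given here), where roughly 75 augmented variants per captured image yields the ~150 000 total.

```python
import imgaug.augmenters as iaa
import cv2

seq = iaa.Sequential([
    iaa.Affine(rotate=(-15, 15), scale=(0.6, 1.0),
               translate_percent=(-0.1, 0.1), shear=(-8, 8)),
    iaa.Fliplr(0.5),                                  # horizontal flip
    iaa.GaussianBlur(sigma=(0.0, 1.5)),
    iaa.Multiply((0.8, 1.2)),                         # brightness
    iaa.AdditiveGaussianNoise(scale=(0, 0.05 * 255)),
    iaa.Grayscale(alpha=(0.0, 1.0)),
    iaa.AddToHueAndSaturation((-20, 20)),             # color jitter
], random_order=True)

image = cv2.imread("canister.jpg")
augmented = [seq(image=image) for _ in range(75)]     # ~75 variants per image
```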

Figure 1. Predictions of the CV model from a camera’s zoomed-in view.

Results

The parent model (YOLOv8n) achieved 98.9% mAP@50–95 in detecting canister features. The YOLOv8-cls for (3), (4), (7), and (8) reached 99% precision. In addition, MoviNetA2 for (5) and (6) delivered 95% precision. Also, YOLOv8s for (1) and (2) reached 94.3% mAP@50–95. The models were evaluated for all settings/readings (1) through (10). An example of a model output is provided in Figure 1. The model can handle multiple canisters by zooming in on each individually.

Discussion

The preliminary results indicate that CV may be an option for reading a chest drainage canister, a clinical application that could enhance quality and safety and could include alarm parameter bounds.

Further work is required to refine, validate, and test the model at greater camera distances, sharper angles, and lower resolutions, and to format and integrate its output with an electronic medical record. This abstract focused on whether CV can read a chest drainage canister accurately, a task with inherent clinical value. In later work, we will also explore how to deploy CV monitoring clinically without violating privacy.


Computer Vision Interpretation of Fluid-Filled Dependent Loops in Urine and Chest Drainage Systems

Mario Rampangu, MS1; Simon Mesber, BS1; Subhash Nerella, PhD2; Eric Ragan, PhD3; Gregory Janelle, MD1; Christopher Samouce, PhD1; Mindaugas Rackauskas, MD4; Nikolaus Gravenstein, MD1; Samsun Lampotang, PhD1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Biomedical Engineering, University of Florida College of Engineering, Gainesville; 3 Computer & Information Science & Engineering Department, University of Florida, Gainesville; 4 Department of Surgery, University of Florida College of Medicine, Gainesville

Introduction

Fluid-filled dependent loops cause unintended backpressure that interferes with complete bladder emptying in urine drainage systems or attenuates the set vacuum in chest drainage systems. The effect is proportional to the difference in meniscus heights (∆h cm) in the generally U-shaped dependent loop. We explored whether computer vision (CV), a branch of AI, can estimate ∆h, thereby facilitating automated, continuous, and remote CV monitoring (including alarms) of traditional (non-digital) drainage systems.

Methods

We collected ~400 images of fluid-filled dependent loops in both chest and urine drainage tubes with an HD camera. Each image was annotated using bounding boxes to identify meniscus points (on the patient and canister/bag sides), canister corners, and the urine bag’s scale markings, the latter two having known physical lengths. These references enable pixels-to-centimeters conversion during measurement. To enhance model robustness, we performed data augmentation (rotation, scaling, flipping, Gaussian blur, brightness adjustment, grayscale conversion, translation, Gaussian noise, and color space variations), expanding the dataset to roughly 40 000 images.

These were split into training (70%), validation (20%), and testing (10%). We employed 3 separate Faster R-CNN models. One detects the chest drainage tube meniscus points and the canister. The output is then processed by a second model specializing in corner detection. The third model handles combined detections for meniscus points on the urine drainage tube and reference markings on the urine bag. Drawing on the camera orientation as a reference plane, we created a vertical line through the patient-side meniscus and a horizontal line through the canister/urine bag-side meniscus; their intersection point provides a geometric reference.

The pixel distance between the patient-side meniscus point and this intersection is then converted to centimeters using the calculated pixels-to-centimeters ratio, yielding the estimated ∆h (Figure 1). Training was conducted on a HiPerGator cluster of 8 Nvidia A100 GPUs (80 GB each). The final model achieved real-time inference on a standard laptop (Lenovo IdeaPad Gaming 3 with Nvidia 1650 Ti 4 GB). We assessed feature detection by comparing the model’s outputs with manual ground truth annotations and verified the CV-estimated ∆h by comparing against physical measurement using a ruler.
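
The ∆h computation itself is simple plane geometry once the meniscus points and a reference of known length are detected; a minimal sketch under those assumptions (the pixel coordinates and lengths below are illustrative, not study data):

```python
def estimate_delta_h(patient_meniscus, bag_meniscus, ref_len_px, ref_len_cm):
    """Return the estimated Δh (cm) between the two menisci of a dependent loop.

    patient_meniscus, bag_meniscus: (x, y) pixel coordinates from the detectors.
    ref_len_px / ref_len_cm: a detected reference (e.g., urine bag scale
    markings) of known physical length, giving the pixels-to-cm ratio.
    """
    px_per_cm = ref_len_px / ref_len_cm
    # The vertical line through the patient-side meniscus meets the horizontal
    # line through the bag/canister-side meniscus at this intersection:
    intersection = (patient_meniscus[0], bag_meniscus[1])
    delta_h_px = abs(patient_meniscus[1] - intersection[1])
    return delta_h_px / px_per_cm

# Example: a 10 cm scale marking spans 118 px in the same image.
print(estimate_delta_h((412, 530), (690, 655), 118, 10.0))  # ≈ 10.6 cm
```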

Figure 1. CV-estimated ∆h of 8.77 cm (actual 10 cm), demonstrated on a manikin in a bed.

Results

In testing, the first Faster R-CNN model (chest drainage tube meniscus and canister detection) achieved an mAP50 of 0.80, the second model (canister corner detection) reached 0.75, and the third model (urine drainage tube meniscus and reference detection) achieved 0.78. Automated ∆h measurements exhibited an average error of approximately ±2 cm compared to manual scale readings.

Discussion

Computer vision can estimate the meniscus height difference in fluid-filled dependent loops in chest and urine drainage systems to within approximately ±2 cm, a clinical application that could enhance quality and safety and could include alarm parameter bounds. Further work is required to refine, validate, and test the model at greater camera distances, sharper angles, and lower resolutions, and to integrate its output with an electronic medical record.


Computer Vision Monitoring of Urine Output Via a Digital Scale

Keerthi H. Kattamudi, MS; Mario V. Rampangu, MS; Simon Mesber, BS; Nikolaus Gravenstein, MD; Christopher Samouce, PhD; Samsun Lampotang, PhD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville; Center for Safety, Simulation & Advanced Learning Technologies, University of Florida Health, Gainesville

Introduction

In patients with indwelling urinary catheters, urine output monitoring is an integral part of patient assessment; for example, low hourly urine output suggests inadequate intravascular volume and/or hemodynamic or renal function decline. Monitoring and recording in the preponderance of traditional, nondigital, stand-alone (not connected) urine drainage systems is done manually and noncontinuously, and requires mental math. To free up nurses’ time for critical patient care and provide earlier clinical warning of changes in urine production, we propose an AI-powered prototype that automates urine output measurement by using computer vision (CV) to accurately read the displayed weight of a urine collection bag hanging from a digital scale. This prototype is a first step toward continuous, accurate, remote monitoring of urine output in perioperative patients with indwelling urinary catheters, optimizing fluid management and supporting earlier alarms and timely clinical decision making.

Methods

Using a tripod-mounted webcam (Tiny 2 Lite, OBSBOT) placed 83 cm to 200 cm above the floor, the prototype utilizes deep learning-based CV techniques to automate detection of the liquid-crystal display (LCD) screen of the scale (AHS-6, Intelligent Weighing Technology) and to read weight data from the LCD screen’s 7-segment digits. The methodology follows a 2-step approach. First, Faster R-CNN (Region-Based Convolutional Neural Network) is used to detect and localize the LCD screen of the scale from a captured frame. The identified display area is then cropped, removing the background and enhancing clarity. In the second step, PARSeq, a state-of-the-art scene text recognition model, is applied to extract numerical weight readings from the LCD screen.

We assumed that the density of urine is 1 g/mL such that the displayed weight in grams is the urine volume in milliliters.
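
The 2-step pipeline, combined with the 1 g/mL assumption, can be sketched as follows; the detector here is the stock torchvision Faster R-CNN (in practice it would be fine-tuned on annotated LCD images), and the PARSeq recognition stage is indicated only as a comment.

```python
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# Step 1: localize the scale's LCD screen (generic pretrained weights here;
# the prototype uses a model fine-tuned on annotated LCD-screen images).
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

frame = convert_image_dtype(read_image("scale.jpg"), torch.float)
with torch.no_grad():
    pred = detector([frame])[0]

best = pred["scores"].argmax()                 # highest-confidence detection
x1, y1, x2, y2 = pred["boxes"][best].int().tolist()
lcd_crop = frame[:, y1:y2, x1:x2]              # cropped, background removed

# Step 2: a scene-text model such as PARSeq reads the 7-segment digits from
# lcd_crop; with urine density assumed to be 1 g/mL, the reading in grams is
# used directly as the volume in milliliters.
print(lcd_crop.shape)
```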

To improve robustness and adaptability across different conditions, such as poor lighting, variations in scale placement, and differing display conditions, various augmentation techniques were incorporated: scaling (60% to 80%), rotation (−45° to 45°), linear contrast adjustment (140% to 150%), Gaussian blur (sigma at 110% to 120%), additive Gaussian noise (scale up to 20% of 255), brightness adjustment (80% to 120%), and grayscale conversion (60% to 100%). These augmentations created approximately 25 000 images.

Figure 1. CV test rig. Inset photo shows a cropped typical CV model output.

Results

In ideal conditions, the prototype exhibited a mean average precision (mAP@0.5–0.95) of 0.78 in LCD screen detection and 96% accuracy in extracting the weight in grams from digit recognition tasks on the detected LCD screen.

Discussion

We completed the first step in eventually deploying a clinical tool by accurately reading via CV the weight displayed by a digital scale with a urine collection bag hanging from it. The remaining steps prior to clinical deployment are to (1) determine how accurately urine sequestered in a dependent loop in urine drainage tubing is measured with the current prototype configuration, (2) train the model to read accurately from a ceiling-mounted or wall-mounted camera (a more robust location for clinical use), (3) ensure the prototype continues to read well in diverse lighting and obstruction conditions, and (4) verify accuracy before deployment.


Simulation-Based Validation of Motion Extraction for Quantitative Train of Four Assessment

Adam Wolach, MD, Nikolaus Gravenstein, MD

University of Florida, College of Medicine, Department of Anesthesiology

Introduction

The accessibility of quantitative Train of Four (qTOF) for monitoring recovery from neuromuscular blockade, a strong recommendation of current society guidelines, is limited by expensive deployment and cumbersome hardware. In our previous work, we demonstrated proof-of-concept for qTOF via an AI machine vision hand-tracking system utilizing OpenCV and MediaPipe. In this work, we present an alternative video processing technique, motion extraction (ME), which we theorize can provide motion analysis agnostic to procedural factors that might limit our previous technique, such as lighting, camera position, and rolling shutter effects. The technique utilizes time-shifting and color inversion to transform motion into brightness data, which is then analyzed to calculate qTOF.

Figure 1. Motion-extraction brightness output from the twitch video corpus with hand motion isolated.

Methods

To serve as ground truth, our twitch video corpus was modified to isolate hand motion. Video stabilization was applied using OpenCV optical flow, followed by the motion extraction technique. Brightness data representing motion (Figure 1) were analyzed using peak detection fit to a template of 4 twitches at 2 Hz, and integrated peak values were compared to report the TOF ratio. Detection accuracy was calculated with a coefficient of determination (R2) and was compared to our prior technique. Mean difference and SD were calculated. We used Bland-Altman analysis to identify agreement and systematic error.
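
In simplified form, the motion-to-brightness transform is equivalent to measuring how far each blended frame deviates from mid-gray, i.e., the absolute difference between a frame and its time-shifted copy. A minimal sketch of that signal and the subsequent TOF-ratio computation (thresholds and the 4-twitch template fit are simplified away; peak heights stand in for the integrated peak values used in the study):

```python
import cv2
import numpy as np
from scipy.signal import find_peaks

cap = cv2.VideoCapture("twitch_video.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
prev, signal = None, []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
    if prev is not None:
        # Blending a frame with its inverted, time-shifted copy cancels to
        # mid-gray where nothing moves; the residual equals half the absolute
        # frame difference, so the mean difference tracks motion magnitude.
        signal.append(np.abs(gray - prev).mean())
    prev = gray
cap.release()

signal = np.asarray(signal)
# Twitches arrive at 2 Hz, so enforce ~0.4 s minimum spacing between peaks.
peaks, _ = find_peaks(signal, distance=int(0.4 * fps), height=0.2 * signal.max())
if len(peaks) >= 4:
    print("qTOF ratio:", signal[peaks[3]] / signal[peaks[0]])  # T4/T1
```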

Figure 2. (A) Hand tracking. (B) Motion extraction. (C) Difference versus average (Bland-Altman).

Results and Discussion

Twenty-four trials were conducted. The mean and SD of differences of observed and ground truth ratios were -0.01554 and 0.03179, respectively. The R2 was 0.9925. Compared to hand tracking (Figure 2A), motion extraction produced more consistent results with a higher R2 (Figure 2B). Bland-Altman analysis showed very low error in the clinically sensitive domain near a qTOF value of 0.9, yet displayed systematic error with overestimation of low qTOF values and underestimation with high values (Figure 2C). Systematic error could be due to thresholds used to account for noise or our reliance on simulated hand motion in testing.

Given the strong agreement with ground truth, particularly near a qTOF value of 0.9, this work demonstrates that motion extraction presents a promising avenue for tracking patient movement and providing access to qTOF. Future work will involve threshold optimization, combined techniques with hand tracking, and in vivo testing of motion extraction compared to gold standard qTOF methods.


Real-Time Vein Compression Model for Mixed-Reality Ultrasound Probe Simulation

Vinh Nguyen; Chase Prasad; Christopher Samouce, PhD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Accurate vein compression modeling is important for ultrasound-guided vascular access training because probe pressure simultaneously enables vessel identification (veins compress with modest pressure) yet can also collapse superficial veins, shrinking the target and changing cannulation mechanics. These effects are compounded during real scanning, where probe angulation, sweeping, and rolling change both contact location and the direction of compression. Finite-element models can capture these interactions but are too computationally expensive for interactive use. We developed a real-time, direction-agnostic geometry-based compression method with a back-wall constraint that preserves physically plausible collapse limits.

Methods

Vessel geometry was obtained and tested using 2 inputs: (i) segmented medical imaging and (ii) synthetic tubular meshes (e.g., cylinders/tubes) created in Blender or Unity. In both cases, the Vascular Modeling Toolkit was used to extract a vessel centerline with per-node radius values, yielding a common centerline/radius plus surface mesh representation used for downstream compression logic. The centerline/radius representation was Laplacian smoothed and used to generate a tubular surface mesh.

To support arbitrary probe orientation, the method computes a per-frame compression direction from tracked probe markers and updates a dynamic backplane (backwall limit) that defines the maximum allowable collapse boundary. The backplane is parameterized along the (smoothed) centerline such that the collapse limit remains well-defined along curved vessels and under oblique probe incidence. For each vessel vertex, a ray cast along the negative surface normal estimates allowable inward motion before contacting the backplane; a minimum-separation (residual gap) constraint preserves plausible wall thickness and prevents nonphysical inversion or self-intersection. Vertex displacements are applied using a compact-support cubic (smoothstep) falloff within an influence radius and computed in a Burst-compiled parallel job for interactive performance.
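
The per-vertex update can be summarized in a few lines; the sketch below is a NumPy rendering of the geometry (falloff-weighted displacement plus back-wall clamping), whereas the actual implementation runs as a Burst-compiled parallel job in Unity. Function and parameter names are illustrative.

```python
import numpy as np

def smoothstep(t):
    """Compact-support cubic falloff on [0, 1]."""
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def compress(verts, contact, direction, depth, radius, backplane_d, gap):
    """Displace vertices along `direction` (unit vector), clamped by a backplane.

    verts: (N, 3) vessel vertices; contact: probe contact point;
    backplane_d: signed plane offset along `direction`; gap: residual gap.
    """
    dist = np.linalg.norm(verts - contact, axis=1)
    w = smoothstep(1.0 - dist / radius)              # falloff weight in [0, 1]
    moved = verts + direction * (depth * w)[:, None]
    # Keep every vertex at least `gap` short of the backplane to preserve
    # wall thickness and prevent nonphysical inversion or self-intersection.
    overshoot = moved @ direction - (backplane_d - gap)
    moved -= direction * np.maximum(overshoot, 0.0)[:, None]
    return moved
```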

Probe force magnitude was modeled using a polynomial fit to digitized force-depth data and mapped to probe displacement via an effective stiffness parameter k, parameterized from a literature-reported cephalic vein full-compression condition (~8 N at ~6 mm indentation), yielding k ~ 1.33 N/mm for the initial calibration case.
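
The calibration arithmetic is direct: the cited full-compression condition fixes the effective stiffness, which then maps any modeled probe force to an indentation depth. A one-line check:

```python
# Effective stiffness from the cited full-compression condition.
k = 8.0 / 6.0                     # ≈ 1.33 N/mm (~8 N at ~6 mm indentation)

def indentation_mm(force_n, k_n_per_mm=k):
    return force_n / k_n_per_mm

print(indentation_mm(4.0))        # ≈ 3.0 mm at half the full-compression force
```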

Figure 1. Probe pressure image.

Results

The algorithm produced smooth, stable vessel deformation under probe contact while preventing nonphysical overcompression via the back-wall limit and residual-gap enforcement. Because constraints and deformation are updated using the instantaneous pushing direction, the method supports compression from any incident direction, enabling realistic behavior during oblique probe contact, sweep, and roll. The same implementation supported both image-derived vessels and synthetic tube geometries, facilitating rapid scenario authoring and controlled test cases. Corresponding ultrasound views demonstrated progressive lumen narrowing from baseline to partial and full compression under increasing indentation.

Figure 2. Ultrasound image of an uncompressed vein.

Discussion

This approach provides an interactive alternative to finite-element simulation for mixed-reality ultrasound training scenarios where venous compressibility is both a realism driver and a procedural skill target, specifically probe-pressure modulation to maintain a patent cannulation target while preserving image quality (Figs. 1–2). Future work will include validation against reference compression datasets to refine stiffness mapping beyond the initial cephalic vein calibration point and improve generalization across anatomies.


Case Study, Case Series, or Chart Review



Intraoperative Tension Pneumothorax and Cardiac Arrest During Bronchoscopy in an Ex-Premature Infant with Chronic Lung Disease

Shehani Perera, BS1; Abigail Schirmer, MD2; Sonia Mehta, MD2

1Florida State University College of Medicine; 2University of Florida Department of Anesthesiology

Introduction

Tension pneumothorax is an uncommon but life-threatening complication of pediatric bronchoscopy. Infants with bronchopulmonary dysplasia, tracheomalacia, and poor lung compliance are vulnerable to rapid air trapping when the bronchoscope obstructs the endotracheal tube (ETT). Early recognition of ventilation failure and immediate decompression are essential to prevent cardiovascular collapse.

Case Presentation

A 3-month-old male, born at 30 weeks 6 days, with bronchopulmonary dysplasia, tracheomalacia, and recent respiratory syncytial virus/adenovirus bronchiolitis with superimposed pneumonia was scheduled for diagnostic flexible bronchoscopy for suspected bronchiolitis obliterans.

Intraoperative Course

After inhalational induction, a nasal bronchoscopy attempt was poorly tolerated. The trachea was then intubated with a 3.5-mm microcuff ETT under video laryngoscopy. Soon after the flexible bronchoscope was introduced, tidal volumes abruptly decreased, followed by absent chest rise, loss of capnography, and rising airway pressures. The infant became bradycardic.

To improve visibility, oxygen insufflation through the bronchoscope was increased from 2 L/min to 6 L/min, worsening distal air trapping. Right-sided subcutaneous crepitus was palpated. Atropine (100 µg IV) and several epinephrine boluses (~60 µg total) were administered, and chest compressions were initiated. Needle decompression at the second intercostal space produced no air return. Point-of-care ultrasound demonstrated a right-sided pneumothorax consistent with tension physiology.

Cardiopulmonary resuscitation continued until return of spontaneous circulation occurred a few minutes later. Placement of a right chest tube resulted in air release with immediate improvement in ventilation and hemodynamic stability.

Immediate Postoperative Course

The patient remained intubated and was transferred to the pediatric intensive care unit on a low-dose epinephrine infusion. He required extended ventilation and management of airway edema. Neurologic status remained consistent with his prematurity-related baseline.

Discussion

A bronchoscope can occlude more than two-thirds of the ETT lumen in infants, eliminating effective ventilation and predisposing to hyperinflation, especially when insufflation flow is increased. Sudden ventilation failure, absent capnography, increasing airway pressure, and unilateral crepitus should trigger immediate evaluation for tension pneumothorax. Needle thoracostomy may fail in small infants due to catheter obstruction or inadequate depth; in such cases, rapid chest-tube placement is lifesaving.
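
The occlusion claim follows from the squared ratio of diameters, since cross-sectional area scales with the diameter squared. A worked example, using a hypothetical 2.8 mm outer-diameter bronchoscope in the 3.5 mm ETT from this case:

```python
def occluded_fraction(scope_od_mm, ett_id_mm):
    """Fraction of the ETT lumen cross-section blocked by the bronchoscope."""
    return (scope_od_mm / ett_id_mm) ** 2

# Hypothetical 2.8 mm scope in a 3.5 mm inner-diameter ETT:
print(occluded_fraction(2.8, 3.5))  # 0.64, roughly two-thirds of the lumen
```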

Conclusion

Tension pneumothorax during bronchoscopy in an ex-premature infant can lead to rapid cardiopulmonary collapse. Immediate recognition of failed ventilation, cautious control of insufflation flow, and readiness to perform chest-tube decompression are critical to preventing fatal barotrauma.


ECMO in Burn Patients: A Single Center Experience

Iaroslav Tsymbaliuk1, Yuriy Stukov2, Mindaugas Rackauskas3, Amalia Cochran2, Andrea Munden2, Mohammad Aladaileh3, Shawn Larson4, Lara Nicolas4, Torben Becker5, Alexandra Murillo Solera2, Jeffrey P. Jacobs6, Giles Peek6, Marc O. Maybauer7

1Department of Anesthesiology, 2 Division of Acute Care Surgery, Department of Surgery, 3 Division of Thoracic Surgery, Department of Surgery, 4 Division of Pediatric Surgery, Department of Surgery, 5 Division of Critical Care, Department of Emergency Medicine, 6 Congenital Heart Center, Departments of Surgery and Pediatrics, 7 Division of Critical Care, Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Objective

Patients with significant thermal injury often develop life-threatening complications, including respiratory failure and multiorgan dysfunction. The incidence of acute respiratory distress syndrome (ARDS) in burn patients is high and carries a 30% mortality. Extracorporeal membrane oxygenation (ECMO) has emerged as a potential lifesaving intervention. While ECMO offers substantial benefits in the management of ARDS, its use in burn patients is constrained by limited availability and the requirement for expert, multidisciplinary management. In this study, we present a single-center experience of ECMO in burn patients.

Methods

We conducted a retrospective analysis of burn patients who received ECMO for ARDS. The Revised Baux Score (rBaux) was used to estimate predicted mortality, calculated as age + total body surface area (TBSA) burned + 17 (if inhalation injury is present). The rBaux-predicted mortality was compared with actual mortality in patients receiving ECMO. Continuous variables are reported as median (range) and categorical variables as number (%). The primary outcome was mortality.
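
Since the rBaux score is a simple sum, it is easy to state precisely; a minimal sketch of the formula as defined above, with hypothetical inputs:

```python
def rbaux(age_years, tbsa_percent, inhalation_injury):
    """Revised Baux score: age + %TBSA burned + 17 if inhalation injury."""
    return age_years + tbsa_percent + (17 if inhalation_injury else 0)

print(rbaux(45, 40, True))  # 102 for a hypothetical patient
```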

Results

Seventeen burn patients received ECMO support. Patient characteristics are summarized in Table 1. Venovenous ECMO was used in 15 patients (88%) with a median support duration of 14 days. Overall mortality was 47%. Median TBSA burned was 40% (0–64%), and 8 patients (47%) had inhalation injury. Fourteen patients (82.3%) underwent tracheostomy. Median hospital length of stay was 29 days (4–117). Notably, 4 patients (23%) with high rBaux-predicted mortality survived to hospital discharge.

Table 1. Patient characteristics.

Conclusion

Our experience demonstrates that ECMO is a feasible and effective rescue strategy for ARDS in burn patients, including those with high predicted mortality. Survival exceeding rBaux score expectations in a subset of patients suggests that ECMO may provide significant benefit in patient survival. Larger multicenter studies are warranted to better define optimal patient selection and outcomes.


Unloading the Failing Heart: Coordinating ECMO and Impella Support in a High-Risk Cardiac Trauma Case

Gabriel Flambert, BA¹; Arthur de Souza, BS¹; Yong G. Peng, MD, PhD¹²

¹ University of Florida College of Medicine, Gainesville, FL; ² Department of Anesthesiology, University of Florida College of Medicine, Gainesville, FL

Background

Managing concurrent cardiogenic and distributive shock presents significant hemodynamic challenges, especially when multiple mechanical circulatory support modalities are required. Venoarterial extracorporeal membrane oxygenation (VA-ECMO) can provide emergent circulatory support in refractory cardiogenic shock. However, its retrograde aortic flow increases left ventricular (LV) afterload and may precipitate complications such as LV distension, pulmonary edema, arrhythmias, and impaired coronary perfusion. There is increasing evidence supporting the use of Impella devices during ECMO to actively unload the LV, reduce LV end-diastolic pressure, and improve myocardial oxygen balance. We present a complex case of refractory cardiogenic shock and mesenteric ischemia requiring sequential VA-ECMO, veno-venoarterial (V-VA) ECMO, and veno-venous (VV) ECMO support with Impella 5.5 for LV unloading.

Case Presentation

A 75-year-old male patient with coronary artery disease (Coronary Artery Bypass Graft ×4, drug-eluting stents (DES) ×7), ischemic cardiomyopathy (Ejection Fraction 40%), diabetes mellitus, COPD, and obstructive sleep apnea presented after a 4-foot fall with GCS 15, severe hypertension, and right clavicle and rib fractures. Early labs showed mild troponin elevation, elevated B-type natriuretic peptide, and elevated C-reactive protein. By hospital day 3, he developed abdominal pain, hypotension, rising troponins, and occult GI bleeding. CT angiography revealed an active gastric bleed and chronic superior mesenteric artery occlusion. After intubation and initiation of vasopressors, esophagogastroduodenoscopy, exploratory laparotomy, and sigmoidoscopy on day 4 revealed no source of bleeding. During superior mesenteric artery revascularization on day 5, he suffered cardiac arrest requiring 60 minutes of ACLS. VA-ECMO was initiated intraoperatively and subsequently escalated to V-VA ECMO for differential hypoxemia. On day 8, an Impella 5.5 was placed for LV unloading, enabling conversion to VV-ECMO. He was decannulated from VV-ECMO on day 10. On day 14, during a left heart catheterization, he experienced ventricular fibrillation arrest; ROSC was not achieved.

Discussion

This case highlights the physiologic interplay between ECMO support and LV loading conditions. Although VA-ECMO restored systemic perfusion, its retrograde aortic flow increased LV afterload, which in turn limited aortic valve opening and promoted LV distension, well-recognized complications that can impair myocardial recovery. Subsequent placement of an Impella 5.5 provided active forward-flow LV unloading, reducing LV end-diastolic pressure, decreasing myocardial oxygen consumption, and improving coronary perfusion. This unloading facilitated the transition from V-VA to VV-ECMO, demonstrating the complementary nature of Impella + ECMO support. Early initiation of LV unloading during ECMO has been associated with improved hemodynamics and potentially improved survival.

Conclusion

This case demonstrates the clinical significance of LV unloading during VA-ECMO in the management of severe cardiogenic shock. The Impella 5.5 effectively mitigated ECMO-induced LV afterload and supported myocardial decompression. Timely recognition of LV distension and early incorporation of unloading strategies may enhance myocardial recovery and minimize ECMO-related complications. The coordinated use of ECMO and Impella offers a synergistic, physiology-driven approach to managing complex shock states.


Balancing Act: Prothrombin Complex Concentrate in Liver Transplantation and the Thrombotic Tightrope

Luis Alejandro Carvajal, BS1; Carmelina Gorski, BS1; Michael Lafferty, BS1; Terrie Vasilopoulos, PhD2; Saba Ali, BS1; Sehrish Saleem, MD3; Asad H. Bashir, MD2

1 University of Florida College of Medicine, Gainesville, FL; 2 Department of Anesthesiology, University of Florida College of Medicine, Gainesville, FL; 3 Department of Radiology, University of Florida College of Medicine, Gainesville, FL

Introduction

Coagulopathy is common in liver transplant patients and raises the risk of both bleeding and thrombosis. Prothrombin Complex Concentrate (PCC) was approved by the Food and Drug Administration in April 2013 and has since been used intraoperatively to manage bleeding and treat coagulopathy. Our study aimed to define a safe range of PCC dosing for use during orthotopic liver transplantation (OLT). Although standardized PCC dosing parameters for OLT are lacking, current practice, based on our chart review, typically involves doses of 20–60 IU/kg, most often when the recipient's international normalized ratio (INR) is 2.6 or greater. Excessive PCC dosing, however, can lead to thrombotic complications. Given these considerations, we investigated the relationship between PCC dose and perioperative thrombotic and bleeding complications.
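
As a worked example of the weight-based dosing described above (a sketch with illustrative numbers; the function name is ours):

```python
def pcc_total_dose_iu(weight_kg: float, dose_iu_per_kg: float) -> float:
    """Total PCC dose for a weight-based regimen (typical range above: 20-60 IU/kg)."""
    return weight_kg * dose_iu_per_kg

# An 80-kg recipient dosed at 30 IU/kg receives 2400 IU in total.
print(pcc_total_dose_iu(80, 30))
```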

Methods

This retrospective cohort study included 56 adult liver transplant recipients with complete perioperative data on PCC administration at University of Florida (UF) Health Shands Hospital out of 931 total liver transplants performed from April 2013 to December 2024. Patients who were not given PCC during liver transplant, pediatric patients, and those with incomplete data were excluded. Following IRB approval, deidentified patient data were obtained through chart review of the UF Health Shands electronic medical record. Perioperative variables, including demographics, Model for End Stage Liver Disease (MELD) score, PCC dose, intraoperative blood products and thromboelastography, and postoperative bleeding and thrombotic events, were systematically collected.

Figure 1: Total PCC Dose Administered Among OLT Recipients.

Results

Analysis included 56 adult liver transplant recipients who received intraoperative PCC. The mean PCC dose was 30.39±9.5 IU/kg (Figure 1), and the mean preoperative INR was 2.6±1.16 (Figure 2). PCC dosing did not significantly differ between patients with and without postoperative thrombosis (29.7±6.3 vs. 30.5±10.1 IU/kg, P = .747) or between those with and without postoperative bleeding complications (30.5±9.3 vs. 30.3±9.9 IU/kg, P = .938). Estimated blood loss (P = .490), intraoperative pRBC transfusion (P = .282), MELD score (P = .109), ICU stay (P = .611), and hospital stay (P = .660) were likewise statistically unrelated to PCC dose. Higher PCC doses correlated with increased postoperative hemoglobin (r = 0.28, P = .039) and marginally with hematocrit (r = 0.26, P = .055); patients requiring intraoperative heparin also received higher PCC doses (38.4±13.4 vs. 28.7±7.7 IU/kg, P = .044). Additionally, PCC dosing showed a weak positive correlation with preoperative prothrombin time/INR (r = 0.25, P = .067), suggesting that patients with more pronounced coagulopathy tended to receive slightly higher doses.
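
The comparisons reported above are standard two-group tests and correlations; a minimal sketch in Python (simulated placeholder data, not the study's values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated placeholder data for 56 recipients; not the study's values.
dose_iu_kg = rng.normal(30.4, 9.5, 56)            # PCC dose (IU/kg)
thrombosis = rng.integers(0, 2, 56).astype(bool)  # postoperative thrombosis flag
postop_hgb = rng.normal(9.5, 1.2, 56)             # postoperative hemoglobin (g/dL)

# Two-group comparison of PCC dose by thrombosis status
t_stat, p_group = stats.ttest_ind(dose_iu_kg[thrombosis], dose_iu_kg[~thrombosis])

# Correlation of dose with postoperative hemoglobin (reported above as r = 0.28)
r, p_corr = stats.pearsonr(dose_iu_kg, postop_hgb)
print(p_group, r, p_corr)
```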

Figure 2: Total PCC Dose vs Preoperative INR.

Conclusions

These findings suggest that, within commonly used dosing ranges, PCC can be administered without increasing thrombotic risk in liver transplantation. Our single-center retrospective design and relatively small sample size may limit the generalizability of these findings; larger multicenter collaborations are needed to validate them.


Beware of Sofferman Syndrome: Why Our NORA Patient Needed Emergency Front-of-Neck Access!

Victor Silva, MD; Felipe Urdaneta, MD, FASA; Anthony Barrios, MD; Jeffrey D. White, MD, FASA

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Nasogastric tubes (NGTs) are commonly used in medical and surgical settings, with an estimated 1.2 million placed in the US annually. Nasogastric tube syndrome (NGTS), also known as Sofferman syndrome and first described by Robert Sofferman in 1990, is a rare and likely underreported complication of NGT placement. Presenting signs and symptoms include hoarseness, stridor, and dyspnea. Here we present a case of unrecognized NGTS in a patient scheduled for colonoscopy, in which a planned routine rapid-sequence induction of general anesthesia quickly devolved into a “can’t ventilate, can’t intubate” (CVCI) airway emergency.

Methods

A retrospective chart review was conducted through Epic to gather information on the case, including patient demographics, interventions performed during the patient’s admission, outcomes, and comorbidities. A literature review was also conducted for background on Sofferman syndrome. IRB approval was not required, as no identifiable information was used in the case report.

Results (Case Description)

A 47-year-old male patient presented for ambulatory right arthroscopic rotator cuff repair. Uneventful general anesthesia with an interscalene peripheral nerve catheter was provided. At laryngoscopy, a grade I view was obtained with a MAC #3 blade under video laryngoscopy, and a 7.5 mm endotracheal tube was secured on the first attempt. At the end of the case, the patient was extubated without complications. On postoperative day 2, the patient developed nausea, vomiting, and abdominal pain, prompting him to go to the ED. A KUB film revealed small bowel obstruction. The patient was admitted to the surgical ward, where a 16 Fr NGT was successfully placed. Following NGT placement, the patient reported worsening throat discomfort and excessive salivation. By the fifth day of hospitalization, his abdominal symptoms had failed to improve, prompting the decision to schedule him for colonoscopy. An anesthetic evaluation was performed, including an airway exam and review of the patient’s recent anesthetic record. He reported severe pharyngodynia and hoarseness to the anesthetist. The decision was made to proceed under general anesthesia. Based on the airway exam and recent uncomplicated intubation, rapid-sequence induction and intubation with cricoid pressure was planned and expected to be similarly straightforward. Following induction with fentanyl 50 mcg IV, lidocaine 60 mg IV, propofol 200 mg IV, and rocuronium 80 mg IV, the airway management plan from his shoulder surgery was replicated. However, the airway was now grossly distorted. Video laryngoscopy with a MAC #3 blade revealed significant glottic edema and no identifiable glottic opening. Second and third attempts were made by a more experienced provider. Facemask ventilation was attempted with and without an oropharyngeal airway and via one-person and two-person techniques; all failed. Laryngeal mask airways of various sizes were attempted to no avail, and an airway emergency was declared. With the patient’s oxygen saturation below 60%, trauma surgeons quickly established emergency front-of-neck access (eFONA). A 6.5 mm endotracheal tube was inserted, and the patient could be ventilated and oxygenated. Postoperatively, there were no neurological sequelae from this CVCI episode. Flexible endoscopic evaluation of swallowing 3 days later revealed significant laryngopharyngeal edema, erythema in the post-cricoid region, a cobblestone oropharynx, and asymmetric, sluggish vocal cords. At day 26, the patient’s tracheostomy was successfully decannulated, and soon after he was discharged home.

Figure 1: Flexible endoscopic evaluation of swallowing on postoperative day 3 after eFONA.

Conclusions

Not all cases of NGTS exhibit the severe airway obstruction illustrated in our case. Sofferman described NGTS as a triad of NGT placement, pharyngodynia, and vocal cord abduction paresis (either unilateral or bilateral). Our patient met the criteria for NGTS, yet the diagnosis was missed by the surgery, anesthesia, and endoscopy teams. A recent uncomplicated general anesthetic, together with unfamiliarity with NGTS, lulled the anesthesia team into false reassurance that this laryngoscopy would also be straightforward. NGTS should be suspected in any patient who reports significant upper airway symptoms following NGT placement. The pathophysiology of NGTS is not yet fully understood, but removal of the NGT usually results in resolution within days.

We present this case to raise awareness of NGTS. A comprehensive database search identified 70 reported cases of NGTS; given how commonly NGTs are placed, we suspect many more go unreported. This is the seventy-first reported case of confirmed NGTS and the first to report eFONA rescue. In such patients, we recommend including a preoperative fiberoptic airway exam in the preanesthetic evaluation before any planned rapid-sequence induction.


When Drains Go Wrong: A Case of Post-Drain Spinal Hematoma

Meena Kanhai, MD; Kevin Priddy, MD; Alessandra Costello-Serrano, MD; Erika Taco Vasquez, MD  

Department of Anesthesiology, University of Florida, Gainesville

Introduction

Spinal drains are lumbar cerebrospinal fluid drains used during repair of aortic aneurysms to help prevent and manage spinal cord ischemia. We present a 73-year-old female patient with a history of coronary artery disease status post coronary stent placement, on aspirin, and atrial fibrillation on warfarin, who developed a spinal hematoma after removal of a spinal drain placed for complex abdominal aortic aneurysm repair using physician-modified endovascular grafts (PMEGs), despite adherence to ASRA coagulation guidelines. We discuss reconsidering when to place spinal drains in high-risk patients.

Materials and Methods

As the case report is devoid of patient identifiable information, it is exempt from IRB review requirements as per University of Florida policy.  

Results/Case Report

A 73-year-old female patient with a BMI of 20.4 kg/m2 presented for complex abdominal aortic aneurysm repair. Her past medical history included hypertension, hyperlipidemia, atrial fibrillation on warfarin, chronic obstructive pulmonary disorder, smoking, coronary artery disease with prior coronary stent placement treated with aspirin, and type 2 diabetes mellitus.

A spinal drain was requested by the surgical team. Warfarin was held for 7 days and aspirin 81 mg for 5 days. On the day of surgery, the INR was 1.1 and the platelet count 189 × 10⁹/L. She underwent 4-vessel PMEG with SMA, bilateral renal, and iliac stents. A total of 4000 U of heparin was given intraoperatively (final ACT 164 seconds; baseline 148 seconds) and reversed with 50 mg of protamine; a total of 38 mL of CSF was drained.

On ICU arrival, she was classified as high risk for spinal cord ischemia, requiring MAP goals >80 mmHg on a norepinephrine infusion with the spinal drain open and draining at 10 cmH2O; she was neurologically intact. On POD 1, the drain stopped draining because of suspected kinking and was clamped. On POD 2, the drain was removed. Eight hours after removal, the patient developed new-onset intense back pain with no neurological deficits. Magnetic resonance imaging showed a lumbar subarachnoid hematoma, managed with ketamine infusion and multimodal analgesia. On POD 3, she developed new-onset left lower extremity weakness. Repeat magnetic resonance imaging showed an expanding intrathecal hematoma (4.7 to 6.0 cm), which was managed nonoperatively per neurosurgery recommendations. Later in her hospital course, she experienced a STEMI with development of takotsubo cardiomyopathy, followed by a readmission due to a fall secondary to her lower extremity weakness. After methylprednisolone and mannitol, her neurological deficits improved.

Discussion

Spinal drains are crucial for preventing spinal cord ischemia by enhancing spinal cord perfusion pressure through CSF drainage and increased arterial pressure. Prophylactic drains in thoracic endovascular aortic repair reduce the incidence of spinal cord injury to 1.3%. However, the risk of spinal hematoma after drain placement ranges from 3.9% to 27%, depending on atraumatic placement and coagulation status. Despite adherence to risk-mitigation guidelines, a postoperative hematoma occurred in this patient, raising the question of whether the initial risk-to-benefit ratio remains favorable even with drainage duration under 48 hours.


Hemodynamic Effects of Bolus vs. Infusion Anesthesia During Transesophageal Echocardiography in Patients with Severe Left Ventricular Systolic Dysfunction

Anson Wang, MD1; Emuejevoke Chuba, MD, MS1; Terrie Vasilopoulos, PhD1; Renee Cress; Thomas Lewandowski, MD, FAC, FASE2; Yong Peng, MD, PhD, FASE, FASA1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Medicine, University of Florida College of Medicine, Gainesville

Background/Introduction

Transesophageal echocardiography (TEE) is a semi-invasive diagnostic tool frequently performed under moderate sedation to evaluate cardiac structure and function in high-risk patients. Although generally safe, the use of anesthetics during TEE poses increased risks in patients with severely reduced left ventricular ejection fraction (LVEF <30%), including arrhythmias, respiratory compromise, and hemodynamic instability. This study assesses hemodynamic trends associated with different anesthetic management strategies during TEE in patients with LVEF <30%.

Methods

We conducted a single-institution retrospective observational study of adult patients (≥18 years) with LVEF <30% who underwent anesthetic management for TEE between January 1, 2022, and December 31, 2024. Of 111 eligible encounters, 32 were excluded due to procedures performed without an anesthetist (e.g., bedside studies), concurrent ablation procedures, or missing records, leaving 79 encounters for analysis. Variables included anesthetic technique (medications used, mode of administration, total anesthetic dose), intraoperative vitals, and postoperative complications. Patients were categorized into 3 groups: boluses only, infusions only, and boluses + infusions, with the latter 2 groups combined for analysis. Hemodynamic instability was defined as systolic blood pressure <90 mmHg or mean arterial pressure <50 mmHg. Descriptive statistics and comparative analyses (Fisher’s exact test, Z-test, and chi-square test) were performed.
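
To make the instability definition and the two-group comparison concrete, a minimal sketch (Fisher's exact test from scipy; event counts reconstructed from the percentages reported in the Results below):

```python
from scipy.stats import fisher_exact

def hypotensive(sbp_mmhg: float, map_mmhg: float) -> bool:
    """Hemodynamic instability as defined above: SBP < 90 or MAP < 50 mmHg."""
    return sbp_mmhg < 90 or map_mmhg < 50

# 2x2 table of [hypotension, no hypotension] for bolus-only vs any-infusion,
# with counts reconstructed from the Results percentages (3/15 vs 30/64).
table = [[3, 12],
         [30, 34]]
odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)
```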

Figure 1. Prevalence of any hypotensive event in bolus group compared to any infusion group (i.e., infusion group + bolus and infusion group).

Results/Discussion

Among the 79 encounters, 15 received boluses only, 14 received infusions only, and 50 received both. Hypotension occurred most often in the bolus + infusion group (48%), followed by the infusion-only group (43%) and the bolus-only group (20%; P = .142). In the two-group analysis, patients managed with boluses alone had fewer hypotensive episodes (20% vs 47%; P = .058) and fewer multiple hypotensive episodes (13% vs 33%; P = .128) compared with those receiving any infusion. Although not statistically significant, likely owing to the limited sample size, these trends may be clinically relevant. Of note, patients receiving infusions had a higher median total propofol dose than those receiving boluses only (247 mg vs 80 mg; P = .0132), which likely contributed to the increased incidence of hypotension observed in the infusion groups.

Figure 2. Prevalence of multiple hypotensive events in bolus group compared to any infusion group (i.e., infusion group + bolus and infusion group).

Conclusion

Among patients with LVEF <30% undergoing TEE, bolus-only anesthetic management was associated with fewer hypotensive events and lower total propofol requirements compared with infusion-based techniques, although these differences were not statistically significant. Expanding the dataset to earlier years may improve statistical power and further clarify these associations.


Thrombotic Complications During Liver Transplantation: A Case Report

Carmelina Gorski, BS; Michael Lafferty, BS; Luis Carvajal, BS; Cheng Zheng, MD; Asad Bashir, MD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville, FL

Introduction

Patients with end-stage liver disease exhibit a paradoxical hemostatic state with significant hypercoagulable potential despite traditional views of cirrhosis as hypocoagulable. Liver transplantation further disrupts hemostatic balance, creating conditions favoring thrombosis. Perioperative thrombotic complications occur in 5% to 7% of liver transplant recipients, significantly impacting graft and patient survival. Vascular access devices represent an additional thrombotic risk in this vulnerable population. We present a case of extensive catheter thrombosis in a trialysis line during orthotopic liver transplantation with concurrent continuous veno-venous hemofiltration (CVVH).

Methods

Informed consent for publication was obtained from the patient. Patient-identifying information was not included in the case report; therefore, no IRB review was required.

Figure 1: Intraoperative TEE: A) Midesophageal 4-chamber view, B) Bicaval view, and C) Pulsed-wave Doppler of the left atrial appendage.

Case Report

A 61-year-old female patient with a past medical history significant for hypothyroidism, hypertension, and end-stage renal disease secondary to alcoholic cirrhosis underwent orthotopic liver transplantation. The transplant surgical team requested placement of a Power-Trialysis™ Short-Term Dialysis catheter post-induction for intraoperative CVVH. The initial line was placed by the resident physician, with some resistance during initial wire placement. Both ports could be flushed, but sluggish flow was observed from the 17-gauge third lumen.

As the case proceeded, the patient was placed on CVVH. The blood purification machine gave high-pressure alarms, so the line was retracted about 2–3 cm and re-sutured out of concern that the tip was against a vessel wall. The alarms improved intermittently but then recurred. An attempt was made to flush the line, and large clots were evacuated from it.

As a second clot was evacuated from the line, tPA and IV heparin were drawn up and made available in the room. Regular transesophageal echocardiogram (TEE) checks were performed to ensure the absence of clot extension into the heart chambers or great vessels (Figure 1). The decision was made to remove the line, and another clot was found in the line upon removal (Figure 2). The trialysis line was replaced at the end of the case by the attending physician with smooth flush and unhindered CVVH use.

Figure 2: Trialysis line following removal; removed clot on the right side.

Discussion

The hypercoagulable potential of patients with end-stage liver disease presents an increased risk when placing central venous catheters. In this case of repeated thrombus formation in the trialysis line, the decision to remove the line was made to lessen the risk of stroke and further embolic spread of clot burden. There is no standardized thromboprophylaxis protocol for liver transplantation because of bleeding concerns. This case demonstrates the utility of intraoperative TEE to exclude thrombus in the heart chambers and major vasculature during orthotopic liver transplantation.

The synthetic material of the catheter is an important factor in thrombus formation, especially in liver transplant patients with a tendency toward hypercoagulability. The trialysis catheter is made of polyurethane, which is more thrombogenic than the silicone used in tunneled permanent catheters. This should be taken into account during preoperative evaluation, and prophylactic anticoagulation remains a patient-specific decision. Meticulous placement technique is critical, as endothelial injury combined with catheter-induced venous stasis substantially increases thrombosis risk. Regular assessment of catheter patency through flush testing and monitoring of CVVH circuit pressures enables early detection of developing thrombosis.

Conclusion

This case highlights the underrecognized hypercoagulable state in cirrhotic patients undergoing liver transplantation and the critical role of intraoperative TEE in detecting and managing thrombotic complications. Prompt recognition of catheter-related thrombosis, meticulous line placement technique, and consideration of catheter material thrombogenicity are essential for preventing potentially catastrophic thromboembolic events during liver transplantation. This case underscores the importance of vigilance and preparedness when managing vascular access in this high-risk population.


Exploring Neuroanatomical and Neuropsychological Associations of DTI-ALPS

Faith Kimmet, B.S.¹; Teng Peng, M.D.²; Rachael Seidler, Ph.D.³; Sumire Sato, Ph.D.³; Seda Tasci, M.S.³; Yonah Joffe, M.S.¹; Jared Tanner, Ph.D.¹; Catherine Price, Ph.D.¹

1 Department of Clinical and Health Psychology, University of Florida College of Public Health & Health Professions, Gainesville; 2 Department of Neurology, University of Florida College of Medicine, Gainesville; 3 Department of Applied Physiology and Kinesiology, University of Florida College of Health & Human Performance, Gainesville

Background

Diffusion tensor imaging analysis along the perivascular space (DTI-ALPS) quantifies diffusion in the direction of the perivascular spaces along the lateral ventricles. DTI-ALPS is considered a measure of glymphatic functioning on the basis of associations with general cognitive metrics and with intrathecal measurements reported in 39 human subjects. The current investigation aims to further characterize DTI-ALPS by assessing its patterns of association with specific neuroanatomical structures and cognitive domains. Aim 1 explores associations between DTI-ALPS and neuroanatomical regions theoretically disrupted by altered diffusion and increased perivascular space volume. We hypothesized that DTI-ALPS would be negatively associated with white matter abnormalities and lateral ventricular volume, with weaker associations with entorhinal thickness and hippocampal volume. Aim 2 examines associations between DTI-ALPS and neuropsychological domains, with the hypothesis that stronger associations would be observed with measures reliant on frontal-subcortical pathways (e.g., Part B of the Trail Making Test and total letter fluency) than with those dependent on educational, semantic knowledge (e.g., Boston Naming Test [BNT] and total animal fluency).

Methods

Participants included older adults (N = 142; mean [SD] age = 69.06 [6.50] years; education = 15.65 [2.745] years) who completed a series of neuropsychological measures within 24 hours of a 3T brain MRI. DTI-ALPS was quantified using the published protocol.

Results

Aim 1: Individual multiple regressions were run for our neuroanatomical regions of interest, controlling for total intracranial volume, age, and sex. To account for multiple comparisons, P values were evaluated against a Bonferroni-adjusted alpha of 0.013 for aim 1 and 0.017 for aim 2. Significant negative associations were found with lateral ventricular volume (β = -0.264, P < .001), as well as with whole-brain, paraventricular, and deep white matter hyperintensity volumes (β = -0.325, P < .001; β = -0.349, P < .001; β = -0.272, P = .002). No significant associations were found with entorhinal thickness (β = -0.040, P = .637) or hippocampal volume (β = 0.038, P = .632). Aim 2: Individual multiple regressions were run for our neuropsychological measures of interest, controlling for age, sex, and education. A significant association was found with letter fluency total output (β = 0.212, P = .009) and a trend toward significance with Trails B total time (β = -0.178, P = .023), but not with BNT total score (β = 0.085, P = .314) or semantic fluency total output (β = 0.093, P = .271).
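
A minimal sketch of this per-region analysis (simulated placeholder data and our own column names; predictors standardized so coefficients approximate the reported standardized betas):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Simulated placeholder data; column names are ours, not the study's.
df = pd.DataFrame({
    "alps": rng.normal(1.4, 0.15, 142),   # DTI-ALPS index
    "lv_vol": rng.normal(30, 10, 142),    # lateral ventricular volume
    "icv": rng.normal(1450, 120, 142),    # total intracranial volume
    "age": rng.normal(69.1, 6.5, 142),
    "sex": rng.integers(0, 2, 142),
})
# Standardize continuous variables so the coefficient is a standardized beta.
for col in ["alps", "lv_vol", "icv", "age"]:
    df[col] = (df[col] - df[col].mean()) / df[col].std()

# One regression per region of interest, controlling for ICV, age, and sex.
fit = smf.ols("alps ~ lv_vol + icv + age + sex", data=df).fit()
beta, p = fit.params["lv_vol"], fit.pvalues["lv_vol"]

# Bonferroni correction: alpha = 0.05 / k planned comparisons (k assumed here).
alpha = 0.05 / 4  # 0.0125, consistent with the abstract's aim-1 threshold of 0.013
print(beta, p, p < alpha)
```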

Conclusions

The correlational disparities found across both neuroanatomical and neuropsychological measures suggest that DTI-ALPS captures frontal-subcortical rather than cortical functioning and integrity. While DTI-ALPS’ proximity to perivascular spaces and associations with general cognitive metrics make it a tempting noninvasive measure of glymphatic functioning, our findings suggest a level of specificity that diverges from the idea of a systemic metric. Although DTI-ALPS appears to better capture subcortical diffusion, the full extent of its neuroanatomical and neuropsychological associations remains unclear. Further research is therefore needed to discern the specificity of DTI-ALPS’ application, moving beyond prior assumptions of its role as a glymphatic measure.


Vasopressor Requirements Across Anesthetic Modalities During Transesophageal Echocardiography in Severe Left Ventricular Dysfunction

Renee Cress; Emuejevoke Chuba, MD, MS1; Terrie Vasilopoulos, PhD1; Anson Wang, MD1; Thomas Lewandowski, MD, FAC, FASE2; Yong Peng, MD, PhD, FASE, FASA1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Medicine, University of Florida College of Medicine, Gainesville

Introduction

Patients with severely reduced left ventricular ejection fraction (LVEF <30%) are at increased risk of hemodynamic instability during transesophageal echocardiography (TEE). Although TEE is commonly performed under moderate sedation, differences in anesthetic administration may produce different pharmacokinetic effects: bolus dosing delivers anesthetic rapidly and may precipitate acute hypotension, whereas infusion provides more gradual delivery. The influence of these modalities on intraoperative vasopressor requirements in patients with LVEF <30% undergoing TEE is not known. This study evaluated the association between anesthetic delivery method and vasopressor use during TEE.

Methods

This is a single-institution retrospective observational study of patients ≥18 years of age with LVEF <30% who underwent TEE between 1/1/2022 and 12/31/2024. Of 111 identified patient cases, 32 were excluded due to missing records, TEE performed without an anesthetist, or cases in the same encounter as ablation procedures. Seventy-nine cases met criteria for analysis and were categorized by 3 anesthetic modalities: bolus-only, infusion-only, or combined bolus and infusion, with the latter 2 groups combined for secondary analysis. Data collected included anesthetic modality, dosages, intraoperative vital signs, vasopressor medication use (primary outcome), the number of vasopressors administered, and the specific medications given. Descriptive statistics and comparative tests (Fisher’s exact, chi-square, and two-sample z-tests) were used for analysis.
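
A minimal sketch of the two-group proportion comparison (statsmodels; the bolus-only count follows from the Results below, while the infusion-group count is illustrative):

```python
from statsmodels.stats.proportion import proportions_ztest

# Vasopressor use: 5/15 bolus-only patients (33.3%, as reported below) vs an
# illustrative event count for the 64 infusion-exposed patients.
count = [5, 35]
nobs = [15, 64]
z_stat, p_value = proportions_ztest(count, nobs)
print(z_stat, p_value)
```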

Figure 1. Two-group analysis of vasopressor use, by percent.

Results

Among the 79 cases included, 15 received boluses only, 14 received infusions only, and 50 received combined boluses and infusions. Phenylephrine (40.5%), followed by ephedrine (17.7%), was the most commonly used vasoactive agent. Rates of intraoperative vasopressor use showed potentially clinically relevant differences in both the 3-group (bolus 33.3%, infusion 64.3%, combined 56.0%; P = .196) and 2-group (bolus-only 33.3% vs infusion-only/combined; P = .083) analyses, although these did not reach statistical significance. Likewise, the rate of multiple-vasopressor use in the infusion groups was more than double that of the bolus-only group (17.2% vs 6.7%, P = .311). Notably, patients receiving infusions had a higher median total propofol dose than bolus-only patients (247 mg vs 80 mg; P = .0132), which likely contributed to the increased vasopressor requirement in the infusion group.

Conclusion

In this cohort of patients with LVEF <30% undergoing TEE, infusion-based anesthetic administration was associated with greater vasopressor requirements compared to bolus-only techniques. Further study with a larger cohort is needed to determine whether anesthetic delivery, dosing, or both contribute to these trends, which may help optimize anesthetic management for this high-risk population.


Progressive Management of Venopulmonary ECMO in Bilateral Lung Transplantation: A Case Report

Omowonuola Ogundele, MD; Mindaugas Rackauskas, MD, PhD; Mohammad Aladaileh, MBBS; Moustafa Younis, MD; Sara Rosen, PA-C; Yuriy Stukov, MD; Marc O. Maybauer, MD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville; Department of Surgery, University of Florida College of Medicine, Gainesville; Department of Medicine, University of Florida College of Medicine, Gainesville

Background

The use of extracorporeal membrane oxygenation (ECMO) was previously a relative contraindication to lung transplantation. However, with advances in technology, improvements in critical care, and updated patient selection criteria, its use as a bridge to lung transplantation has become more common.

Case Description

We present the case of a 37-year-old female patient with history of SLE, interstitial lung disease, ANCA vasculitis, GERD, and iron deficiency anemia who presented with acute respiratory distress syndrome and concomitant severe right ventricular (RV) dysfunction with subsequent decompensation to acute RV failure.

Case Report

The patient was cannulated for venopulmonary (VP) ECMO, supported with a ProtekDuo dual-lumen cannula. This was undertaken because of the patient’s long-standing history of interstitial lung disease and systemic lupus erythematosus and was intended as a bridge to bilateral orthotopic lung transplantation (BOLT) pending evaluation and listing. Despite maximal medical management with inotropic agents (milrinone, epinephrine), the patient demonstrated worsening biventricular function.

She ultimately required reconfiguration to venopulmonary-aortic ((dL)VP-AO) ECMO via clamshell sternotomy (Figure 1). The patient underwent BOLT on postoperative day (POD) 2 after central cannulation. Intraoperatively, the right ventricle remained significantly distended, prompting initiation of cardiopulmonary bypass during BOLT and subsequent conversion back to the (dL)VP-AO ECMO configuration.

On POD 4 (POD 3 from initial central cannulation), a weaning trial was conducted under transesophageal echocardiography guidance. This was tolerated well without additional vasopressor support, and the patient was converted back to (dL)VP ECMO with removal of the central aortic cannula. However, on POD 5, the patient demonstrated radiographic signs of volume overload and pulmonary edema and was diuresed.

On POD 6, another weaning trial showed some degree of RV recovery. The decision was made to place an additional femoral vein multistage drainage cannula (25 Fr) used solely for drainage, and the circuit was reconfigured with a Y-piece to return blood flow through both lumens of the ProtekDuo. The patient still required high flows of 4–5 L/min to deliver oxygenated blood; therefore, VP flow was split to about 35% into the pulmonary artery and 65% into the right atrium (RA), with improvement of the pulmonary edema the following day and no adverse effect on RV function.

On POD 8, a 23 Fr single-stage cannula was placed in the left internal jugular vein (LIJV); the tubing toward the ProtekDuo was clamped, cut, and wet-connected to the LIJV cannula, initiating venovenous (VV) ECMO. RV function remained mildly reduced, but the patient could be oxygenated and was further weaned from ECMO support. On POD 14, the patient was decannulated from VV ECMO and underwent further rehabilitation.

Figure 1. Chest X-ray of venopulmonary-aortic ((dL)VP-AO) ECMO via clamshell sternotomy. Blue arrow: ProtekDuo cannula; red arrow: central aortic cannula.

Discussion

This case illustrates the role of ECMO in the management and stabilization of critically ill patients in the perioperative period of bilateral lung transplantation, with specific focus on the ProtekDuo as an RVAD and as the drainage cannula in a venopulmonary-aortic configuration. It also shows that the ability to convert back, in a staged fashion, to (dL)VP and VV ECMO for postoperative weaning and RV protection depends on how quickly the patient can tolerate such changes.


Successful Treatment of Chronic Vertebrogenic Pain with a Novel Basivertebral Nerve Ablation Technique

Brandon Yang; Juan Mora, M.D.

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Chronic low back pain (CLBP) is a leading cause of disability and is frequently multifactorial. Vertebrogenic pain, arising from nociceptive signaling within degenerated vertebral endplates via the basivertebral nerve (BVN), represents a distinct and treatable source of axial low back pain associated with Modic type 1 or 2 endplate changes on MRI. Traditional therapies such as epidural steroid injections, medications, and surgery offer variable benefit in this population. In contrast, BVN ablation has emerged as a condition-specific, minimally invasive treatment demonstrating durable pain and functional improvement.

Case Presentation

An 81-year-old male patient with a history of prostate cancer status post radiation and localized osteoporosis initially presented in 2021 with acute low back pain with radicular features. MRI demonstrated an L5 superior endplate compression fracture, and DEXA confirmed osteoporosis (L1–L4 T-score −2.5); zoledronic acid was initiated. Bilateral L4/5 and L5/S1 transforaminal epidural steroid injections (TFESIs) provided approximately 90% temporary pain relief.

By 2025, pain evolved into chronic axial low back pain with no radicular symptoms. Conservative therapies were ineffective. Lumbar MRI revealed a healed L5 fracture with Modic type 2 endplate changes at L4 and L5, consistent with vertebrogenic pain. After multidisciplinary evaluation, basivertebral nerve (BVN) ablation was selected as the most targeted, minimally invasive approach.

On June 24, 2025, L4–L5 BVN ablation was performed under fluoroscopic guidance. The patient’s VAS improved from 9/10 preprocedure to 1/10 immediately postprocedure (≈ 89% reduction) and remained 1/10 at 8-month follow-up, with ODI improving from 46 to 22 (−52%).
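
The percent changes quoted here follow directly from the raw scores; for instance:

```python
def pct_reduction(before: float, after: float) -> float:
    """Percent reduction relative to a baseline score."""
    return 100 * (before - after) / before

print(round(pct_reduction(9, 1)))    # 89 -> VAS 9/10 to 1/10
print(round(pct_reduction(46, 22)))  # 52 -> ODI 46 to 22
```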

Figure: Type II Modic changes (signs of fatty degeneration) at L4/5.

Discussion

Vertebrogenic pain originates from nociceptive signaling within diseased vertebral endplates transmitted via the BVN. This entity is most commonly associated with Modic type 1 or 2 endplate changes on MRI, reflecting inflammatory and fatty degeneration, respectively. Accordingly, BVN ablation directly targets this pathway and has emerged as a disease-specific, minimally invasive treatment option for chronic axial low back pain refractory to conservative therapy.

Two major randomized controlled trials established the evidence base for BVN ablation. The SMART trial (double-blind, sham-controlled) demonstrated significantly greater functional improvement at 3 months, with durable benefits through 24 months and marked reductions in opioid use and spinal injections. The INTRACEPT trial, comparing BVN ablation to standard care, found superior improvements in both function and pain, sustained through 12 months. Long-term pooled analyses up to 5 years confirm sustained efficacy and safety, with ~65% of baseline opioid users discontinuing and no device-related complications reported.

Our patient exhibited classic vertebrogenic features: chronic midline low back pain, Modic type 2 changes at L4 and L5, and failure of pharmacologic therapy. Following BVN ablation, he experienced an approximately 89% reduction in pain and a 52% functional improvement, closely mirroring outcomes from major RCTs. Although his chronic compression fracture and osteoporosis (T-score −2.5) would have excluded him from strict trial criteria, his radiographic stability and clinical presentation supported inclusion in real-world practice. His excellent response underscores the expanding applicability of BVN ablation beyond initial study populations when appropriate patient selection and multidisciplinary evaluation are employed.

Figure: Fluoroscopic guidance of transpedicular access during Intracept basivertebral nerve ablation, targeting vertebrogenic low back pain at its source.

Conclusion

Overall, BVN ablation produced rapid and durable improvement in an octogenarian with vertebrogenic low back pain: pain decreased from 9/10 to 1/10 (−89%) and ODI from 46 to 22 (−52%) at 8 months. PROMIS-29 changes were concordant, with improved physical function, reduced pain interference, and normalization of social participation. These clinically meaningful gains support BVN ablation as a targeted, minimally invasive option when Modic type 1/2 changes are present.


VV ECMO Intervention for Intraoperative Respiratory Collapse During Redo Aortic Arch Surgery: A Case Report

Jonathan Holt; Yong G. Peng, MD, PhD, FASE

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Mycotic aortic arch pseudoaneurysms are rare and can become rapidly fatal without prompt surgical intervention. This case report describes a patient undergoing redo aortic arch surgery in whom VV ECMO was initiated for refractory pulmonary failure.

Case Report

A 64-year-old woman with a history of aortic repair and bioprosthetic AVR arrived from another hospital already sedated, intubated, and on norepinephrine, with severe ventilator-dependent pulmonary insufficiency. A preoperative TEE showed preserved LV function (LVEF 50%–60%) and a large pseudoaneurysm surrounding the previously operated graft site. However, the right radial arterial waveform flattened whenever the TEE probe was advanced further. These findings raised concern for an aberrant innominate artery and pseudoaneurysm mass effect, which were confirmed on initial surgical dissection. The patient underwent total arch replacement with multibranch grafting under deep hypothermic circulatory arrest. After coming off bypass, the patient developed acute hypoxic respiratory failure with peak airway pressures exceeding 37 cmH2O and tidal volumes dropping to approximately 300 mL, while a large amount of bloody secretions filled the airway and endotracheal tube.

Postbypass TEE revealed newly depressed biventricular function: estimated LVEF was about 30% to 35%, accompanied by a notable decline in right ventricular function, with TAPSE falling to 0.9 cm. After excluding a new valvular lesion, aortic complication, and tamponade, the team confirmed that the primary problem was refractory pulmonary compromise, not cardiogenic shock.

The anesthesia and surgical teams decided to initiate VV ECMO after refractory hypoxemia persisted despite maximal ventilatory support and after confirming that the deterioration was respiratory rather than cardiac in origin. Once VV ECMO was established, oxygenation improved immediately and RV function partially recovered.

Discussion

This case demonstrates the importance of TEE-guided physiologic evaluation immediately after separation from bypass. Clear communication among the operating team ensured that unnecessary cardiac interventions were avoided and that the clinical picture was recognized as refractory respiratory failure.

Conclusion

Early recognition that hypoxia stems from respiratory failure rather than cardiogenic shock is critical in complex aortic surgery, especially in redo procedures involving infection, mass effect, and severe preoperative pulmonary compromise. Prompt initiation of VV ECMO, guided by TEE and coordinated multidisciplinary communication, can prevent rapid deterioration and support cardiopulmonary stabilization in critically ill patients.


Balancing Analgesia and Infection Risk: A Case of a Femoral Nerve Catheter Infection in an Immunocompromised Patient

Erika Taco-Vasquez, MD; Alisha Shah, MD; Paola Nathaly Silva Enriquez, MD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Continuous femoral nerve catheters provide effective postoperative analgesia but carry a risk of bacterial colonization and infection, particularly in high-risk patients. This case focuses on a femoral catheter–associated soft tissue infection in a patient with multiple infection-predisposing factors, including cancer, MRSA colonization, and acute deep femoral venous thrombosis. We discuss contraindications to catheter placement when infection risk outweighs the potential benefits.

Materials and Methods

Patient informed consent was obtained by phone; the report is exempt from IRB review requirements per University of Florida guidelines.

Case Report

A 75-year-old male patient (BMI 20 kg/m²) with metastatic urothelial bladder carcinoma with bone metastases, prior MRSA infection and sepsis, VP shunt infection requiring removal, chronic tobacco use, and an acute right femoral DVT presented with progressive leg swelling. Evaluation revealed a subacute pathologic fracture of the right proximal femur, and he subsequently underwent operative fixation. For perioperative analgesia, a femoral nerve catheter (StimuCath) was placed 1 day prior to surgery under sterile technique using ultrasound guidance and nerve stimulation. Placement was achieved in a single attempt without immediate complications, and sterile gown, gloves, and mask were utilized throughout the procedure. Postoperatively, the patient received cefazolin prophylaxis for 24 hours. The catheter remained in place for 5 days, during which daily examinations described a clean, dry, and nontender insertion site without erythema or drainage. Throughout the dwell period, no neurologic changes or infectious concerns were observed, and analgesia was satisfactorily maintained.

Figure 1. Progressive localized erythema and soft tissue changes at the prior right femoral nerve catheter insertion site.

Results

On postoperative day 6, the patient developed pruritus and erythema in the right groin. Vital signs remained stable, and the white blood cell count was within normal limits. Blood cultures were obtained and were negative. C-reactive protein was elevated at 160, prompting initiation of ceftriaxone for suspected cellulitis. By postoperative day 8, a right groin abscess was noted during a bedside dressing change, with purulent material expressed, and bedside incision and drainage were performed. The patient was discharged on oral doxycycline and cefpodoxime, with clinical improvement confirmed at a telemedicine follow-up visit 2 weeks later. Throughout the course, there was no evidence of systemic sepsis, deep soft-tissue extension, or neurologic involvement.

Discussion

Patients at increased risk for nerve catheter infection include those with MRSA colonization, immunocompromised status, femoral catheter placement, catheter dwell time exceeding 48 hours, tobacco use, deep venous thrombosis (DVT), and limited antibiotic prophylaxis. Concurrent DVT has been associated with sepsis, reflecting the bidirectional relationship between thrombosis and infection driven by inflammation, endothelial injury, and hypercoagulability. The risk of infection rises markedly by the fourth day after catheter placement. These findings support reinforcing antibiotic prophylaxis practices and considering tunneled catheter placement or earlier catheter removal in high-risk patients.

Learning Points:

  1. A prior MRSA carrier with a history of sepsis should trigger risk-stratified prevention measures for peripheral nerve catheters and reinforce the preprocedure infection prophylaxis protocol (chlorhexidine wipes and mupirocin swabs).
  2. Femoral location and prolonged dwell time amplify colonization risk in high-risk patients; we suggest standardizing catheter tunneling.
  3. Institutional protocols should incorporate antibiotic prophylaxis for patients who require perineural analgesia prior to surgical procedures.
  4. Consider not proceeding with nerve catheter placement in patients with known deep venous thrombosis in the placement area.

Lumbar Drain: Spinal Cord Protection at a Price

Hadia Maqsood, MBBS; Soleil Schutte, M.D.

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Thoracoabdominal aortic surgery carries a risk of spinal cord ischemia (SCI) and subsequent paraplegia. A lumbar drain (LD) can reduce the risk of SCI by improving spinal cord perfusion pressure; however, it carries a high risk of neuraxial hematoma in this patient population, and balancing the benefits and risks is often challenging. We present a case in which a lumbar drain was placed in the setting of SCI and alteplase administration.

Methods

The information was obtained from a review of the patient’s chart. The patient died of an unrelated cause 6 months after hospital discharge, and the next of kin could not be reached to obtain informed consent. Patient-identifying information was not included in the case report, and IRB review was not required.

Case Report

A 57-year-old male patient with a past medical history of hypertension, end-stage renal disease, and tobacco use underwent emergent combined open and endovascular repair of a contained ruptured thoracoabdominal aortic aneurysm with cardiopulmonary bypass. In the immediate postoperative setting, he was noted to have left lower extremity (LE) weakness, consistent with SCI. An emergent LD was placed by the regional anesthesia team despite a deranged coagulation profile. With careful titration of permissive hypertension and cerebrospinal fluid pressure, the patient gradually regained motor function. After 4 days, the LD was clamped for 24 hours and subsequently removed, with the patient remaining neurologically intact. Four days after LD removal, the patient developed acute-onset aphasia and right-sided weakness along with left LE weakness. Head computed tomography was negative for hemorrhage, and intravenous alteplase was administered, with resolution of the acute stroke symptoms. However, bilateral LE weakness remained and was attributed to SCI. Considering the alteplase administration 12 hours earlier and ongoing hypotension, LD placement was delayed for 24 hours until the patient was medically optimized. Despite the delay, with placement 36 hours after symptom onset, cerebrospinal fluid drainage led to gradual improvement and significant recovery of LE motor function.

Discussion

The rate of permanent SCI following thoracoabdominal aortic surgery is about 4.5%. A lumbar drain is beneficial for spinal cord protection but carries risks of neuraxial hematoma and subarachnoid hemorrhage during placement, removal, and management. These risks can be reduced by avoiding over-drainage of cerebrospinal fluid and adhering to coagulation guidelines for neuraxial procedures. Complex clinical situations such as this one may require accepting higher risk, highlighting the need for further research.


Between a Varix and a Hard Place: TEE Safety in Liver Transplant

Michael Lafferty, BS1; Carmelina Gorski, BS1; Luis Alejandro Carvajal, BS1; Terrie Vasilopoulos, PhD2; Saba Ali, BS1; Sehrish Saleem, MD3; Asad Bashir, MD2

1 College of Medicine, University of Florida, Gainesville; 2 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 3 Department of Radiology, University of Florida College of Medicine, Gainesville

Introduction

Esophageal varices (EVs) are a common, high-risk complication in patients undergoing orthotopic liver transplantation (OLT), primarily due to portal hypertension. Intraoperative transesophageal echocardiography (TEE) is increasingly utilized for real-time assessment of cardiac function and volume status during OLT, and over 50% of liver transplant programs incorporate TEE into their standard practice. The American Society of Anesthesiologists (ASA) and other professional guidelines recommend TEE for major abdominal surgeries where significant hemodynamic instability is anticipated. However, ambiguity persists regarding its safety in patients with EVs: the ASA guidelines cite equivocal opinions on whether EVs are an absolute or relative contraindication to TEE, owing to insufficient evidence, whereas current American Society of Echocardiography guidelines consider EVs a relative contraindication. This study aims to evaluate the bleeding risk associated with TEE probe placement in patients with graded EVs.

Figure: Westaby-graded EVs vs total blood loss (bar chart).

Methods

This retrospective cohort study included 931 adult patients who underwent OLT at a large academic medical center between April 2013 and December 2024. Deidentified patient data were extracted from electronic medical records and stored in a HIPAA-compliant REDCap database. Patients without EVs, with incomplete records, or lacking EV grading were excluded. Preoperative, intraoperative, and postoperative variables, including EV grade and grading criteria (American Association for the Study of Liver Diseases [AASLD], Westaby, Sarin, and Japanese), TEE use, estimated blood loss, bleeding events, need for intraoperative gastroenterology consult, and patient outcomes, were systematically collected. Group comparisons were evaluated with the Mann-Whitney U test or Kruskal-Wallis test (continuous outcomes) and the chi-square test (categorical outcomes). Spearman’s correlations were used to assess associations between continuous variables.
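
A minimal sketch of these nonparametric analyses (scipy; simulated placeholder data, not the study's values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated placeholder data: estimated blood loss (mL) grouped by Westaby grade.
ebl_by_grade = [rng.lognormal(7, 1, n) for n in (120, 150, 59)]

# Kruskal-Wallis across 3+ EV grades; Mann-Whitney U for a 2-group contrast.
h_stat, p_kw = stats.kruskal(*ebl_by_grade)
u_stat, p_mw = stats.mannwhitneyu(ebl_by_grade[0], ebl_by_grade[-1])

# Spearman correlation between MELD score and total blood loss (reported rho = 0.21).
meld = rng.integers(6, 40, 329)
ebl = rng.lognormal(7, 1, 329)
rho, p_s = stats.spearmanr(meld, ebl)
print(p_kw, p_mw, rho, p_s)
```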

Figure: Total estimated blood loss by Westaby grade in banded EVs (bar chart).

Results

Of the initial 931 patients undergoing OLT, 329 with EVs and complete grading records were included in the primary analysis. An additional 94 patients whose EVs had been banded or partially/completely eradicated during endoscopy were analyzed separately. Intraoperative estimated blood loss was not associated with higher Westaby-graded EVs in banded (P = .656) or unbanded (P = .237) patients (Figures 1 and 2, respectively). In unbanded patients, AASLD “Large” EVs were associated with longer overall hospital (P = .013) and ICU (P = .018) lengths of stay (Figure 3). Greater total estimated blood loss was associated with higher MELD scores (rho = 0.21, P < .001) in unbanded patients (Figure 4); no such association (rho = 0.11, P = .338) was found in banded patients (Figure 5). Importantly, no instances of intraoperative bleeding related to TEE probe placement or mechanical trauma were documented in either group.

Conclusion

In this large retrospective cohort, no intraoperative bleeding events attributable to TEE probe placement were found in patients with esophageal varices across a range of severities, supporting the safety of TEE in this context. Additionally, in line with other studies, we found no intraoperative bleeding from EVs due to mechanical trauma. Clinical discretion remains advised, and further prospective studies are warranted to confirm these findings and assess generalizability beyond a single-center setting.


When Propofol Isn’t Enough: Sedation Resistance in a CYP2C19 Ultra-Rapid Metabolizer

Michael Lafferty, BS1; Jaime Tellez, BA1; Amanda Frantz, MD2; Peggy White, MD2

1 College of Medicine, University of Florida, Gainesville; 2 Department of Anesthesiology, University of Florida, Gainesville

Introduction

Propofol is one of the most widely used intravenous anesthetics and is metabolized primarily through hepatic glucuronidation and hydroxylation, the latter via cytochrome P450 enzymes. A rare CYP2C19 variant, CYP2C19 *17/*17, carries 2 copies of the increased-expression *17 allele and classifies individuals as ultra-rapid metabolizers (URMs). CYP2C19 URMs have been found to have altered metabolism of proton-pump inhibitors, clopidogrel, and benzodiazepines; however, propofol metabolism in this subgroup has not been well studied.

Methods

A 40-year-old female patient with a 2-year history of upper and lower gastrointestinal symptoms underwent preprocedure CYP2C19 testing because of a poor response to esomeprazole. She was identified as a CYP2C19 *17/*17 ultra-rapid metabolizer and scheduled for a combined colonoscopy and esophagogastroduodenoscopy. Standard bowel preparation was followed, and medical evaluation revealed no comorbidities, medication interactions, or CYP450-inducing agents. Under monitored anesthesia care, propofol was selected as the sole sedative agent, initiated at 250 mcg/kg/min with standard ASA monitoring.

Results

Seven minutes after induction, the patient remained responsive to stimuli, requiring increased infusion rates and a 20-mg propofol bolus. Despite high-dose maintenance infusions and a second bolus, the patient remained intermittently responsive throughout the 37-minute procedure. Hemodynamics and ventilation remained stable with no adverse respiratory events.

Conclusion

Although CYP2C19 contributes to propofol metabolism, current evidence indicates its overall contribution is modest compared with other cytochrome enzymes and hepatic glucuronidation. As a result, CYP2C19 *17/*17 status is not typically associated with increased propofol requirements, raising the possibility of additional unrecognized pharmacogenomic variants such as UGT1A9, CYP2B6, or GABA-receptor-related genes. This case highlights the importance of individualized anesthetic plans; pharmacogenomic data should complement, not replace, clinical judgment. Our institution currently houses a database of CYP2C19 genotypes, providing an opportunity to further study the propofol and anesthetic requirements of this metabolically unique group of patients.


High-Flow, Low Interference: Utilizing HFNC and ABG Monitoring for Awake Craniotomy

Michael Lafferty1; Peggy White, MD1,2; Steven Robicsek, MD, PhD1,2

1 University of Florida College of Medicine, Gainesville; 2 Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Awake craniotomies, once rare, are now commonly performed to allow intraoperative neural mapping, enabling greater tumor resection while reducing postoperative neurological deficits. While intraoperative awareness is typically undesirable, awake craniotomies require full patient consciousness, creating unique anesthetic risks. Techniques used for this type of surgery, such as monitored anesthesia care (MAC) or asleep-awake-asleep (AAA), pose risks of hypercapnia, desaturation, respiratory depression, and ischemic injury. Under MAC, a variety of oxygen-delivery methods may be used, such as a simple face mask, nasal trumpet (nasopharyngeal airway), or high-flow nasal cannula (HFNC).

Methods

We present a case of a 55-year-old woman undergoing left temporal awake craniotomy in which HFNC was used throughout the procedure under MAC. Per institutional practice, an arterial line was placed, and ABGs were used to guide ventilation due to unreliable ETCO₂ measurement during the initial phase of surgery. Ventilation parameters, ABG trends, sedation depth, and intraoperative neurologic performance were evaluated.

Results

High-flow nasal cannula at flows up to 64 L/min maintained excellent oxygenation (SpO₂ 100%) and allowed stable hemodynamics throughout the asleep and awake phases. ETCO₂ could not be reliably captured during the initial phase; however, serial ABGs demonstrated acceptable pH and PaCO₂ trends, guiding real-time ventilatory adjustments. The patient awoke promptly for mapping, remained conversational and cooperative, and tolerated the procedure without airway complications. Final ABG values and intraoperative monitoring confirmed adequate ventilation, and the patient underwent successful tumor resection without neurologic deficits or perioperative complications.

Conclusion

In this case, HFNC was successfully used under MAC despite limited ETCO₂ monitoring, providing humidified oxygen, mild PEEP, reduced dead space, and improved CO₂ clearance while allowing the patient to awaken and fully participate in neural mapping without interference. Additionally, for this patient and others, HFNC has been shown to minimize coughing, bucking, excessive sedative requirements, and voice hoarseness that could compromise mapping. Although ETCO₂ is the gold standard for real-time ventilation assessment, accurate readings were not obtainable due to HFNC flow; however, because an arterial line is standard at our institution, ABG-derived PaCO₂ served as a reliable surrogate. Important limitations remain: ABGs are invasive, intermittent, and require an arterial line that may not always be clinically indicated. HFNC can impede ETCO₂ monitoring, and current evidence supporting HFNC in awake craniotomy is limited to small case reports and series, underscoring the need for larger studies.


Quality Improvement


Beyond Fasting: Perioperative Management of GLP-1RA Therapy

Michael Lafferty, BS1; Lindsey Morrow, BA/BS1; Ahmed Rashid, MD2; Erica Matich, MD2; Peggy A White, MD2; Amanda M Frantz, MD2

University of Florida College of Medicine1; University of Florida Department of Anesthesiology2

Introduction

Glucagon-like peptide-1 receptor agonists (GLP-1RAs) improve glycemic control by increasing insulin secretion, reducing glucagon, and decreasing appetite, and they delay gastric emptying, which may increase the risk of residual gastric content (RGC) despite standard fasting in patients with or without comorbidities. RGC is associated with risk of pulmonary aspiration, yet the true incidence and significance in GLP-1RA users are unclear. The American Society of Anesthesiologists (ASA) and the American Diabetes Association (ADA) recommend holding daily GLP-1RAs on the day of surgery and weekly GLP-1RAs 7 days prior, though both societies note that these recommendations rest on limited evidence. This study assessed the prevalence of RGC in GLP-1RA users and evaluated perioperative management practices at a large quaternary medical center.

Methods

This prospective quality improvement study (Aug 2024–May 2025) enrolled adult surgical patients at UF Health Shands taking a GLP-1RA for ≥8 weeks. Eligible patients were identified through an Epic EMR query and then approached in person for verbal consent, and preoperative gastric POCUS was performed by a trained anesthesiologist. Collected data included demographics, diabetes status, medications affecting motility, fasting times, and GLP-1RA hold duration. The primary outcome was the presence of RGC on POCUS.

Ultrasound image
Figure 1.

Results

A substantial proportion of GLP-1RA patients demonstrated RGC (“full stomach”; Figure 1) despite reporting adherence to ASA fasting guidelines. Many patients also received inconsistent or unclear instructions regarding GLP-1RA hold times, with significant variation between what was documented, communicated, and practiced clinically. Fasting duration did not reliably correlate with gastric emptiness. Postoperative adverse events were rare, but several patients required modified anesthetic plans (e.g., RSI, airway protection) due to positive POCUS findings.

Conclusion

A noteworthy percentage of GLP-1RA users had RGC before surgery, indicating that standard fasting instructions may be insufficient for this population. Discrepancies in preoperative communication further complicate management. Gastric POCUS proved feasible within routine preoperative workflow and may support individualized anesthetic planning when GLP-1RA timing, symptoms, or fasting status are unclear. These findings support updating local perioperative protocols and highlight the need for broader guideline refinement as GLP-1RA use continues to expand.


Resident Participation and Case Involvement in Pediatric Anesthesia: A Survey of Attending and Resident Perspectives

Trevor Virno, MS31; Kevin Sullivan, MD2

1 Lake Erie College of Osteopathic Medicine, Bradenton, FL; 2 Department of Anesthesiology, University of Florida College of Medicine, Gainesville, FL

Introduction

Resident involvement in pediatric anesthesia varies widely due to case acuity, procedural complexity, resident experience level, and operating room time constraints. Resident satisfaction with the pediatric anesthesia rotation is impacted by procedural exposure. This project examines resident and attending perspectives on procedural opportunities, their barriers, and overall satisfaction with case involvement during pediatric anesthesia participation at UF Health Shands Hospital.

Methods

Electronic surveys were distributed to 1 pediatric anesthesia or pediatric cardiac anesthesia attending and their corresponding anesthesiology resident assigned to a pediatric anesthesia room each week. Efforts were made to limit the number of queries per resident so as not to bias the results by overrepresenting a small number of residents. Thirty-six weekly email surveys were sent to attendings and their residents over a 9-month period. Residents and attendings received an email requesting survey participation on the day of the anesthetic, and again 24 hours and 72 hours after their case. Responses were collected and analyzed descriptively. Survey questions included procedural participation (airway, vascular access), case characteristics (first starts, simultaneous starts), patient characteristics, barriers to involvement, and satisfaction.

Results

1.) Participation: 26/36 attending physicians (72%) responded; 24/26 required 1 or more repeat emails; 10 were nonresponders, and 16 unique attending physicians accounted for the 26 responses. 26/36 residents (72%) responded; 18 unique residents accounted for the 26 responses.

2.) Case Characteristics: 16/26 (61.5%) of cases were first starts of the morning. 8/26 (30.8%) of the cases were simultaneous starting cases in the attending’s 2 assigned rooms. 6/26 (23%) of cases were first start cases in the morning that had simultaneous start times. In 3/26 (11.5%) of cases there was a concurrent emergence in another room.

3.) Resident Procedural Participation: In 26/26 cases (100%) the resident managed the airway. In 14/24 cases (58.3%) the resident attempted PIV insertion. Residents attempted arterial line insertion in 8/13 cases (61.5%) in which an arterial line was inserted, and residents attempted CVL insertion in 7/11 cases (63.6%) in which a CVL was inserted. Residents reported being satisfied with their case participation in 22/26 cases (85%).

4.) Barriers to Resident Involvement: In 2/26 cases (8%) the patient did not need another PIV inserted. In 7/26 cases (27%) the attending physician thought the patient’s potential vascular access sites were insufficient to allow the resident to attempt PIV, CVL, or arterial line insertion. In 9/26 patients (34.6%) the attending physician thought the patient was too unstable to permit resident attempts at arterial line, CVL, or PIV insertion. In 8/26 patients (30.8%) the attending physician perceived pressure from the surgeons to move the case along, and in 6/26 patients (23.1%) the attending physician felt the resident was too inexperienced for the vascular access task at hand.

Conclusions

Resident involvement was universal for airway management but more limited for invasive procedures such as PIV, arterial line, and central venous access attempts. PIV access attempts were permitted in approximately 60% of cases, a rate comparable to the arterial and central venous line attempts in the cardiac operating rooms.

Case flow constraints, particularly first starts and simultaneous starts, were frequent barriers to procedural involvement. Patient instability and surgeon pressure further limited opportunities, especially in high-acuity pediatric cardiac cases. Despite these challenges, resident satisfaction remained high, reflecting strong faculty engagement and perceived meaningful learning experiences.

It is notable that only two-thirds of residents and attending physicians chose to participate in the survey despite 2 reminders asking them to do so. The survey assured participants that their responses would be confidential, but concerns over anonymity may have dampened willingness to participate and may also have affected the candor of responses.

Optimizing workflow, minimizing concurrent attending physician obligations, and defining expectations for procedural participation based on patient characteristics may further strengthen resident opportunities while maintaining patient safety.


Early Training of Anesthesia Residents Improves Long-Term Retention in Performing Informed Consent

Kyle Chan, MD1; Benjamin Cipion, MSc1; Terrie Vasilopoulos, PhD1,2; Amanda Frantz, MD1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Orthopedic Surgery & Sports Medicine, University of Florida College of Medicine, Gainesville

Background

To obtain informed consent for anesthesia, anesthesiology residents must have a comprehensive understanding of its indications, process, risks, benefits, and alternatives. Obtaining informed consent for anesthesia is a requirement assessed in the American Board of Anesthesiology (ABA) APPLIED Exam for graduates seeking diplomate status. However, there is currently no standardized method for teaching this skill during residency. This study aims to evaluate whether formal training for anesthesiology residents at the beginning of their careers can lead to a standardization of the information provided in an informed consent process for anesthesia. Additionally, we will assess the long-term retention of this knowledge as residents progress through their training.

Introduction

Informed consent (IC) is a fundamental legal and ethical pillar in medicine, serving as the foundation of patient autonomy and a critical element of therapeutic alliance. In modern health care, the informed consent process is a reciprocal conversation where patients must be briefed on the risks, benefits, and alternatives of specific procedures. Both the Association of American Medical Colleges (AAMC) and the Accreditation Council for Graduate Medical Education (ACGME) have made obtaining an IC an Entrustable Professional Activity (EPA) requirement for entering residents and a resident requirement in training, respectively. Informed consent is more than a formality or a signature. It is a process that includes shared decisions, clear communication, understanding, and voluntary agreement. This process is particularly critical and at times challenging in the field of anesthesiology. The preoperative meeting is often the only time a patient can fully understand and agree to a treatment plan that includes the risks, benefits, and options, as well as the specifics of the surgery itself.

Anesthesiology trainees often encounter obstacles when obtaining informed consent as they navigate a complex landscape of practical challenges and patient and familial complexities. These interactions are often hindered by time constraints, language barriers, and the need to establish rapport quickly with patients. As a result, establishing consent can become rushed or generic, leading to inadequate patient understanding. Recent research has shown deficiencies in patient understanding and retention of essential anesthetic risks, benefits, and alternatives.

Gaps in training contribute to this shortfall. Formal education in informed consent is not standardized, satisfaction with existing instruction is low, and concerns remain about entrusting trainees with this responsibility without adequate preparation. Within anesthesiology, surveys indicate that residents have historically received little structured instruction; instead, they learn through informal observation, resulting in variable practices and inconsistent preparedness. Recognizing the need for robust assessment, the ABA introduced the Objective Structured Clinical Examination (OSCE) to the APPLIED Exam for initial certification in 2018.

In this study, we aim to demonstrate the influence of early exposure to the fundamental components of informed consent through discussions and mock OSCEs among anesthesiology residents. By examining whether training of this nature enhances performance on standardized OSCE measures and retention of this repeated skill, the study seeks to verify the utility of standardized training in anesthesiology education.

Methods

First, PGY-1 anesthesiology residents (n = 79) without prior training were tasked with obtaining informed anesthesia consent from a standardized evaluator. Their performance was assessed across 7 domains aligned with the ABA APPLIED OSCE criteria: 1) demonstration of understanding; 2) explanation of indications; 3) explanation of the procedure; 4) discussion of risks and benefits; 5) discussion of risk-minimizing strategies; 6) elicitation and response to patient questions; and 7) confirmation of final decisions. Each domain was scored using a 4-point frequency scale: rarely, occasionally, often, consistently. Following this assessment, participants underwent a structured informed-consent training program. This program included a 60-minute introductory lecture that reviewed the principles of informed anesthesia consent, covering topics such as indications, consent processes, risks, benefits, alternatives, and communication expectations in alignment with the ABA APPLIED OSCE. In addition to the lecture, there were 3 targeted training sessions, each lasting 15 minutes, which addressed complex consent scenarios. These scenarios included consent by proxy, multi-procedure consent, and emergency consent. After training completion, residents (n = 59) were reassessed immediately through a mock ABA APPLIED OSCE grading session. Each resident participated in an 8-minute informed-consent encounter with a senior resident acting as the patient. They were reassessed again 6 months later during intern education month (n = 18).

Statistical Analysis 

Each question was coded on a numeric scale from 1 (rarely) to 4 (consistently), then averaged to create a primary outcome of total consent score, summarized by means ± standard deviations. Each individual question was summarized as counts and percentages. Linear mixed models were used to examine change in consent score over time. These models account for repeated measures within the same individual and can handle missing data. Tukey’s test was used for post hoc pairwise comparisons between timepoints. Pairwise differences were quantified as mean differences with 95% confidence intervals (95%CI). Secondary analyses used chi-square tests to examine differences in individual questions between timepoints. All analyses were performed in JMP Pro 18 (SAS Institute Inc, Cary, NC), and P < .05 was considered statistically significant.
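To make the modeling approach concrete, a minimal R sketch is given below. It is illustrative only (the study’s analyses were run in JMP Pro 18), and the data frame, variable names, and simulated scores are all hypothetical.

# Minimal sketch, not the study code: linear mixed model of total consent
# score across timepoints with a random intercept per resident, followed by
# Tukey-adjusted pairwise comparisons. All names and values are hypothetical.
library(lme4)
library(emmeans)

consent <- data.frame(
  id        = factor(rep(1:59, each = 3)),
  timepoint = factor(rep(c("baseline", "post", "6mo"), times = 59),
                     levels = c("baseline", "post", "6mo")),
  score     = runif(59 * 3, min = 1, max = 4)  # placeholder 1-4 scale scores
)

# random intercept per resident accounts for repeated measures
fit <- lmer(score ~ timepoint + (1 | id), data = consent)

# Tukey-adjusted pairwise contrasts with 95% confidence intervals
summary(pairs(emmeans(fit, ~ timepoint), adjust = "tukey"), infer = TRUE)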

Bar chart showing performance of each area of consent.
Figure 1: Baseline performance on each area of consent.

Results

A total of n = 79 participants completed a baseline assessment. For each individual area, the lowest baseline performance was observed for “Discusses strategies for minimizing risks of the treatment,” with 43.4% rated as rarely and only 8% rated as consistently. The highest performance was observed for “Elicits questions and responds appropriately in lay terms” and “Confirms final decision with the patient regarding the treatment options and obtains affirmative consent without coercion,” rated as consistently in 43.0% and 41.8%, respectively (Figure 1).

There were significant changes in total consent score over time (F(2,16) = 27.1, P < .001). Total consent scores at post assessment (0.9, 95%CI: 0.6 to 1.2, P < .001) and at 6 months (0.9, 95%CI: 0.5 to 1.4, P = .001) were significantly higher than baseline scores. There were no statistically significant differences in total consent score between post assessment and 6 months (0.0, 95%CI: -0.4 to 0.4, P = 1.0) (Figure 2).

Bar chart of consent scores.
Figure 2: Total consent scores across timepoints with chi-square p-values.

Discussion

Early, structured training in informed anesthesia consent led to significant and lasting improvements among PGY-1 residents. Despite having limited opportunities to engage in consent discussions during their intern year, residents demonstrated stable retention of skills at the 6-month mark. This strongly suggests that the curriculum, rather than daily practice, was the main driver of their retention.

OSCEs and simulation-based experiences offer effective solutions to training challenges. In recent years, anesthesiology programs have increasingly used simulation and encounters with standardized patients (SPs) to teach and assess informed consent skills. When well-designed, OSCEs provide a valuable supplement to both written and oral exams, helping to identify competencies that traditional testing methods often overlook. Simulation-based education creates a safe environment for residents to practice conducting consent discussions, including addressing patient questions and managing emotional responses without risking harm to real patients.

Exposure to clinically challenging scenarios, such as proxy decision-making and emergency consent, ensured that residents were prepared not only for routine encounters but also for high-stakes OSCE and clinical situations. The lack of significant decline in performance highlights the importance of scheduled reinforcement every 6 months.

Conclusion

A structured, early informed-consent curriculum significantly improves communication performance among anesthesiology interns and demonstrates durable retention of skills essential for high-quality patient care and success on the ABA APPLIED OSCE. These findings support integrating early, formalized informed-consent training into residency education, incorporating progressively more complex scenarios and scheduled reinforcement to promote sustained competency. Future directions include incorporating standardized patients to enhance realism and enable external performance evaluation, formalizing the curriculum to ensure reproducibility across residency classes, assessing the impact of early training on ABA APPLIED OSCE pass rates, and extending longitudinal follow-up to evaluate skill retention into the CA-2 and CA-3 years.


Drivers of Congenital Heart Center Readmission

Kasey Permenter, BS; Wei Wang, MD; and Kevin Sullivan, MD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Background

The Congenital Heart Center (CHC) at UF Health Shands Hospital has experienced readmission rates between 16% and 24%, among the highest in the UF Health system. This quality improvement initiative was undertaken to explore the factors contributing to early patient readmission (< 30 days) in the CHC.

Methods

A comprehensive chart review was designed to identify factors associated with early CHC readmission. First, a retrospective chart review was conducted on patient readmissions that occurred in FY25 Q3 and FY25 Q4. Data extracted included the hospital unit and provider responsible for the index discharge, the hospital unit and clinical reason for readmission, and the procedures and medical therapies altered during the readmission. Finally, the discharge location and discharging provider following the readmission episode were recorded. To more fully capture all premature readmissions in all locations (including patients for whom the CHC service was not the discharging service), prospective data collection began in FY26 Q1 and is ongoing. This combined approach allowed for a more detailed assessment of discharge patterns, care transitions, and clinical drivers of CHC readmission.

Results

Prior to readmission, patients were discharged from the PCICU (58%), the pediatric floors (31%), the NTOR or PACU (6%), or an adult ICU or floor bed (5%) after an average initial hospitalization of 35.5 ± 77.43 days. The average patient age was 6.14 ± 7.1 years. The reason for hospitalization before premature readmission was cardiac surgery in 78% of the patients.

Prematurely readmitted patients returned to the hospital most commonly through the NT Emergency Department (63%) after coming directly from home, a referring ED, or the CHC clinic. Patients were directly readmitted from home or a referring hospital in 11% of cases and from the CHC clinic in an additional 11%. In 15% of the readmissions, the patients were admitted from the NTOR, the NT cath lab, or the NT PACU. Patients were discharged after readmission from the pediatric wards (64%), the PCICU (22%), the PICU (11%), or the NT PACU (3%).

On average, patients were readmitted 10.46 ± 7.1 days after discharge. Once readmitted, the typical length of stay was 8.25 ± 8.7 days. Infectious concerns (viral infections, SSI, bacteremia) represented the most common reason for readmission (25%), followed by scheduled readmissions (17%) for cath, infusions, or surgical procedures, and drainage of pericardial/pleural effusions (17%). Feeding intolerance (14%), diuretic adjustments (11%) due to edema or dehydration, and arrhythmias and progression of heart failure symptoms (11%) accounted for the remaining medical causes for readmission. In 6% of the readmissions, there was no hospital-specific therapy rendered other than observation.

Conclusions

In this review of pediatric cardiac ICU readmissions, we identified patterns that highlight potential targets for quality improvement. Infectious concerns, intrathoracic effusions, feeding difficulties, and need for diuretic adjustments represented the most modifiable risk factors for readmission. A significant number of readmissions (16%) were scheduled readmissions for cath, surgery, or infusion therapies.

Patients discharged from the PCICU by ICU physicians were more likely to be readmitted when compared to patients discharged from non-PCICU locations by pediatric cardiologists. Similarly, patients were more often discharged after readmission by pediatric cardiologists from non-PCICU locations. These discharge and readmission data suggest that the patients discharged from the ICU may be sicker or may not have adequately completed the evolution of their clinical trajectory at the time of hospital discharge from the ICU. Due to their retrospective nature, these data are at best hypothesis-generating but raise the question of whether patients would be better served by a period of observation on the wards prior to discharge to allow for enhanced parent education and training, refinement of diuretic therapy, surveillance for the development of fluid collections, and observation for infectious complications. Our prospective data collection will provide a more comprehensive accounting of our discharges and readmissions from all the patient units and for all the patients we consider to be CHC patients (including those not formally on a CHC service, e.g., HVN 66, 67, 76, 77, NICU, PICU, etc.). Such data will provide a more conclusive estimate of our discharges and readmissions.


Experimental or Observational Clinical Trial


Peripheral Nerve Blocks as a Primary Anesthetic for Above-the-Knee Amputations: A Prospective Feasibility Study

Jose Soberon, M.D.

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Above-the-knee amputation is a commonly performed surgical procedure that is associated with significant morbidity and mortality. It is unknown whether these surgeries can be reliably performed using peripheral nerve blocks and deep sedation instead of general anesthesia, or whether anesthetic type influences perioperative mortality.

Methods

In this prospective, proof-of-concept controlled study, 29 patients scheduled for above-the-knee amputation at a single Veterans Health Administration hospital underwent preoperative placement of femoral and sciatic nerve catheters, as well as single-injection obturator and lateral femoral cutaneous nerve blocks, followed by surgery under deep sedation.

Illustration of the surgical technique for above-the-knee amputation.

Results

Block success, intraoperative and postoperative analgesic administration, and conversions to general anesthesia were recorded. Postanesthesia care unit pain scores, patient satisfaction, and mortality at 30 days as well as at the time of study conclusion were assessed. Twenty-nine patients underwent 30 above-the-knee amputations during the study period. The mean age was 69.9 years and 100% were male. One patient required conversion to general anesthesia. Ninety percent (n = 27) of patients reported 0 pain upon postanesthesia care unit arrival, and 96.7% (n = 29) were “extremely satisfied” or “satisfied” with their anesthetic care. One patient died within 30 days postoperatively and only 13 patients were alive at study conclusion (median follow-up: 48 months).

Conclusions

Our findings suggest that peripheral nerve blocks with deep sedation may offer a reliable and feasible anesthetic option, particularly for high-risk patients. Further research is needed to determine the impact of anesthetic type on outcomes.


Microplastics in CPB Circuits

Zeyu Song, Nikolaus Gravenstein1, Thi Hong Na Phung, Christopher Samouce1, Gregory Janelle1, Regina Knudsen, Sungyoon Jung2, Everett Jones1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Environmental Engineering Sciences, University of Florida Herbert Wertheim College of Engineering, Gainesville

Background

Microplastics are plastic particles smaller than 5 millimeters that have been found in human blood, stool, and various tissues, including lung, brain, and placenta. While the environmental impact of microplastics has been studied, the potential for additional iatrogenic exposure during invasive medical procedures remains largely unexamined. Cardiopulmonary bypass (CPB) circuits offer a unique route for direct intravascular microplastic exposure. This raises concern about the generation of microplastics within CPB systems during surgery.

Objective

This bench study quantified and characterized the presence and type of microplastics within CPB circuits and evaluated whether volatile anesthetic exposure influenced their release.

Methods

A benchtop in vitro analysis for microplastics was performed using Plasmalyte A as the perfusate medium under 5 conditions: (1) Plasmalyte A sampled directly from the original bag (control), (2) Plasmalyte A collected after passing through the prebypass filter, (3) CPB circuit after a 0.2 µm prebypass filter, (4) CPB circuit exposed to 5% isoflurane with FiO₂ 100% after a 0.2 µm prebypass filter, and (5) CPB circuit exposed to 8% sevoflurane with FiO₂ 100% after a 0.2 µm prebypass filter. All bypass runs were standardized to identical circuit lengths, volume, temperature, duration, and flows. Samples were collected post-oxygenator and analyzed for microplastic content using 2-stage vacuum filtration, stereomicroscopy, and pyrolysis–gas chromatography/mass spectrometry (Py-GC/MS). Emphasis was placed on detecting plastic types commonly found in CPB circuits: polymethyl methacrylate (PMMA), polypropylene (PP), polystyrene (PS), polyvinyl chloride (PVC), polyethylene (PE), and nylon 6 (PA6).

Illustration of CPB circuit used for microplastic analysis.
Figure 1. Schematic of the CPB circuit used for microplastic analysis, including prebypass filtration and optional volatile anesthetic exposure.

Results

Microplastics were detected in the Plasmalyte A control and the bypass circuits under all experimental conditions. Mean microplastic concentrations were 5.73 ± 3.97 µg/L (CPB circuit after prebypass filtration), 6.84 ± 2.18 µg/L (isoflurane), and 7.24 ± 5.33 µg/L (sevoflurane), with no significant difference among CPB groups (ANOVA p = 0.748).

The 0.2 µm prebypass filter reduced microplastic mass by ~23.6% (0.887 → 0.678 µg/L), but particles remained detectable across all groups; prebypass filtration reduced but did not eliminate microplastics in the CPB runs.
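For illustration, the group comparison reported above could be run as a one-way ANOVA, as in the R sketch below; the group size is hypothetical and the values are simulated around the reported means and SDs, not the study data.

# Minimal sketch, not the study code: one-way ANOVA of microplastic
# concentration across the 3 CPB conditions. n per group is hypothetical.
conc <- data.frame(
  group = factor(rep(c("CPB_only", "isoflurane", "sevoflurane"), each = 5)),
  ug_L  = c(rnorm(5, 5.73, 3.97), rnorm(5, 6.84, 2.18), rnorm(5, 7.24, 5.33))
)
summary(aov(ug_L ~ group, data = conc))  # a nonsignificant F test corresponds
                                         # to the reported p = 0.748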

Pie charts and bar chart showing concentrations and chemical compositions of microplastics.
Figure 2. Concentrations and chemical compositions of microplastics under 4 conditions: (1) Plasmalyte A, (2) CPB only, (3) Isoflurane, and (4) Sevoflurane. The fraction of PS is much smaller than the other polymer types and is therefore not shown in the pie chart.

Conclusion

The detection of microplastics within CPB circuits identifies a previously unrecognized source of iatrogenic microplastic exposure during cardiac surgery. Volatile anesthetics did not significantly increase particulate release in this study. These findings highlight the opportunity for improved filtration strategies and novel circuit materials to reduce microplastic exposure in extracorporeal circulation systems.


Visualizing and Quantifying the Dimensions of Frailty Associated with Preoperative Anemia

Robert Ryan, Cynthia Garvan, Keith Howell

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Background

Anemia is known to be correlated with both cognitive frailty and physical frailty. Additionally, physical frailty and cognitive frailty were found to predict worse postoperative outcomes in prior research by Cheuk et al and others. Since anemia is a modifiable risk factor, research is needed to better understand anemia’s association with frailty. In this retrospective observational cohort study using data from the seminal work by Monk et al, we seek to better understand the relationships among anemia, cognitive frailty, physical frailty, and postoperative outcomes.

Methods and Data

In the original Monk et al study, 1064 patients undergoing major, noncardiac surgery at Shands Hospital at the University of Florida gave their written informed consent and were enrolled in the study between February 1, 1999, and January 31, 2002. Inclusion criteria included adult patients undergoing elective surgery who were scheduled to be admitted to the hospital as an inpatient for a minimum of 2 days after surgery. Additional inclusion criteria were surgery scheduled under general anesthesia that was expected to last 2 h or longer, fluency in English, ability to read, and the absence of serious hearing or vision impairment that would preclude neuropsychological testing. Patients were excluded if they were scheduled to undergo cardiac, carotid, or intracranial procedures or were not expected to be alive or available to complete testing at 3 months after surgery. Additional exclusion criteria included a score of 23 or less on the Mini Mental State Examination before surgery, a history of dementia or any disease of the central nervous system including previous cerebral vascular accident with residual deficit, use of tranquilizers or antidepressants, a current or past history of psychiatric illness, and alcoholism or drug dependence. The Monk et al dataset was filtered to include only patients with a preoperative hemoglobin, leaving 897 patients. A total of 555 patients had no anemia, 199 had mild anemia, and 143 had moderate or severe anemia. Most (n = 559) patients were female and 338 were male. Many (n = 318) patients had postoperative cognitive decline (POCD) at 1 week and 66 had POCD at 3 months.

Statistical Analysis

Analysis of preoperative physical and cognitive frailty was completed with a structural equation model (SEM) fitted with preoperative physical frailty and preoperative cognitive function as latent factors. A figure of the model is included. Data from 881 patients were included in the SEM analysis due to missing data issues with 16 patients. The indicator variables for the cognitive frailty latent factor were the results of 5 preoperative cognitive tests. The indicator variables for the physical frailty latent factor were preoperative resting systolic blood pressure, preoperative hemoglobin, Charlson comorbidity, and age block. Further analysis of postoperative outcomes was conducted using logistic and linear regressions. Regressions were used to determine significant predictors of POCD at 1 week, POCD at 3 months, non-home discharge, 1-year mortality, and hospital length of stay. Each regression included gender, ASA status (dichotomized as low and high), Charlson comorbidity, executive function score, memory score, hemoglobin, systolic blood pressure, and BMI as predictors. P values for all regressions are included in the table. Analyses were performed using SAS version 9.4 software (Cary, NC) and R version 4.5.2 (Vienna, Austria).
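For readers less familiar with SEM, a minimal sketch of such a 2-factor model in R’s lavaan syntax is shown below. This is not the study code; the indicator and data frame names are hypothetical stand-ins for the measures described above.

# Minimal sketch, not the study code: 2 correlated latent factors measured
# by the indicators described above. All names are hypothetical placeholders.
library(lavaan)

model <- '
  # cognitive frailty measured by 5 preoperative cognitive tests
  cog_frailty  =~ cogtest1 + cogtest2 + cogtest3 + cogtest4 + cogtest5
  # physical frailty measured by 4 preoperative indicators
  phys_frailty =~ sbp + hemoglobin + charlson + age_block
'

# std.lv = TRUE fixes latent variances to 1, so the factor covariance
# (estimated by default) can be read as a correlation
fit <- sem(model, data = preop_data, std.lv = TRUE)  # preop_data: hypothetical
summary(fit, fit.measures = TRUE, standardized = TRUE)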

An illustration showing a SEM Model
Figure 1: SEM Model

Results

The resultant SEM depicting relationships of indicator variables, constructs of physical and cognitive frailty domains, and magnitude of interdependence is shown in Figure 1. All factor loadings from the SEM were found to be significant and in the anticipated directions. This model met accepted methods of determining model fit as discussed by Kline and summarized by Stone, and all model fit indices were in acceptable ranges. The covariance loading between the latent factors of cognitive and physical frailty was found to be 0.68.

The significant predictors for the POCD at 1 week model were executive function and BMI. The only significant predictor for the POCD at 3-months model was executive function. The significant predictors for the non-home discharge model were executive function, memory, hemoglobin, systolic blood pressure, and BMI. The significant predictors for the 1-year mortality model were ASA, Charlson comorbidity and hemoglobin. The significant predictors for the length of stay model were gender, ASA, Charlson comorbidity, and hemoglobin. The P values for all predictors are included in Table 2.

Conclusion

The SEM visually demonstrates that physical and cognitive frailty are related. The correlation of 0.68 implies that these 2 frailty domains are strongly interconnected, which is a logical result. Preoperative hemoglobin is a significant indicator of physical frailty, in addition to age, blood pressure, and number of comorbidities. Hemoglobin was also a significant predictor of 1-year mortality and length of stay. This follows the established result from Howell et al that preoperative hemoglobin affects postoperative outcomes and that anemia is predictive of postoperative outcomes and should be treated. Additionally, the regressions demonstrated that executive function is an important predictor of POCD. Anemia intervention could improve POCD outcomes given the covariance of physical and cognitive frailty implied by the SEM. Due to the observational nature of this study, further research should be completed to show that anemia interventions improve outcomes and to investigate the relationship between physical and cognitive frailty.


Verbal Fluency as Predictor of Delirium in Post-Anesthesia Care Unit

Kathryn Pendleton, B.S.; Yonah Joffe, M.S.; Cynthia Garvan, Ph.D.; Faith Kimmet, B.S.; Catherine Price, Ph.D.*; Ben Chapin, M.D.*

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Objective

Postoperative delirium (POD) is a syndrome of acute change in mental status after surgery characterized by confusion and inattention. The postanesthesia care unit (PACU) represents the earliest opportunity to detect POD. Delirium occurring during recovery from surgery in the PACU is associated with subsequent delirium during the hospital stay as well as postoperative complications, dementia, and mortality. Preoperative cognitive screening is recommended in older adults undergoing surgery to identify cognitive risk factors that can inform perioperative management. Letter fluency and semantic animal fluency are one-minute measures that provide a window into executive function and language processes; these processes have been associated with delirium risk but have not been investigated as predictors of PACU delirium.

Participants and Methods

Older adults aged 65+ undergoing orthopedic surgery were recruited as part of an IRB-approved investigation within the preoperative surgical center at the University of Florida (UF) and UF Health. Letter fluency (F) and semantic animal fluency were administered as part of a larger neuropsychological protocol between 1 day and 1.5 months before surgery. Participants were rated for delirium using the Confusion Assessment Method Severity (CAM-S) scale 1 to 2 hours after the end of surgery in the PACU. A multiple linear regression model was used to simultaneously examine letter fluency and semantic fluency as predictors of PACU delirium.
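A minimal R sketch of this regression follows; it is illustrative only (not the study code), and the variable and data frame names are hypothetical.

# Minimal sketch, not the study code: CAM-S delirium severity regressed on
# both fluency measures, adjusting for age and education. Names hypothetical.
fit <- lm(cam_s ~ letter_fluency + semantic_fluency + age + education,
          data = pacu_data)
summary(fit)   # overall F test, R-squared, and per-predictor coefficients
confint(fit)   # 95% confidence intervals for the fluency coefficients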

Results

Fifty-three participants completed measures (mean age = 73.83 ± 5.44 years, education = 15.55 ± 2.61 years, female = 62%). Sixteen (30%) participants were identified to have delirium in the PACU. After correcting for age and education, the overall regression model with PACU delirium as the response and letter fluency and semantic fluency as predictors was statistically significant (R² = 0.2479, F(2,50) = 8.24, P < .001). Both letter fluency (β = -0.114, P = .0102) and semantic fluency (β = -0.082, P = .0292) were individually significant and were negatively associated with PACU delirium severity.

Conclusion

Preoperative letter and semantic fluency are associated with PACU delirium severity scores, such that individuals producing fewer words showed more symptoms of delirium within 1 to 2 hours after surgery. Although letter fluency is best known for requiring frontal-subcortical and inhibitory functions, both measures involve processing speed, which can differentially change with aging and neurodegenerative disorders (e.g., Alzheimer’s disease vs vascular dementia). Limitations include the restricted age range (65+), a well-educated sample, and limited consideration of comorbidities. This study is ongoing.

Funding source: AACSF-22-928731(BC); K07AG066813(CP)


Traumatic Brain Injury and the Brain-Gut-Microbiome Axis: A Systematic Review and Meta-Analysis of Clinical Studies

Riya Modi1*; Zara Chowdhury1*; Zeeshan Khan1; Ling-Sha Ju1; Anatoly Martynyuk1,2

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 McKnight Brain Institute, University of Florida College of Medicine, Gainesville; *These undergraduate students contributed equally to this work.

Background

Traumatic brain injury (TBI) is characterized by heightened inflammatory and stress responses and is associated with neurocognitive decline and an increased risk of developing progressive neurocognitive disorders, such as Alzheimer’s disease. Stress and inflammation influence brain activity and neurocognitive function not only through direct effects on the central nervous system but also by disrupting systemic processes, including the brain-gut-microbiota (BGM) axis. Current understanding of the role of the BGM axis in neurocognitive decline following TBI, the contributions of inflammation and stress to these processes, and the potential for gut microbiota-targeted supplementation to improve neurocognitive function during the period after TBI remains in its early stages. To address these clinically and scientifically important questions, we are conducting a systematic review and meta-analysis of published clinical studies. In this review, we aim to answer the following specific questions: (1) Do patients with TBI develop gut microbiota dysbiosis? (2) Is there a correlation between neurocognitive decline, gut microbiota dysbiosis, and dysregulated inflammatory and stress signaling in TBI patients?

Methods

The participant pool consisted of patients with a history of TBI and healthy controls (HC) without a history of TBI. We excluded TBI patients with preexisting gut microbiota dysbiosis, metabolic disorders, elevated inflammation or stress, or neurocognitive and neurodegenerative disorders. Nonclinical or animal studies, reviews, and meta-analyses were also excluded. Relevant articles were retrieved from Embase, MEDLINE, Global Health, Scopus, and Web of Science. The search strategy employed the advanced search function with AND/OR operators and included keywords such as “TBI,” “gut dysbiosis,” “16S rRNA,” and “inflammation.” Our meta-analysis strategy entails a systematic and rigorous synthesis of quantitative data from the selected studies. Microbiota data were normalized using rarefaction and relative abundance scaling. Batch effects across studies were corrected using the ComBat algorithm (sva package, R), with PCA and clustering used to confirm correction. Microbiota analysis included alpha diversity indices (Shannon, Simpson, Chao1) and relative abundance assessments. Forest plots visualized intervention effects on key biomarkers and neurocognitive outcomes, with 95% confidence intervals. Heterogeneity was assessed using I² and τ² statistics. Sample size, mean, and standard deviation (SD) or standard error of the mean (SEM) were extracted manually and analyzed using R software (version 4.5.2) with the meta and metafor packages. Standardized mean difference (SMD) with a random-effects model was used to estimate effect sizes. This work was registered on PROSPERO (ID: CRD420251135998).
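A condensed R sketch of the SMD synthesis step is shown below. The per-study summary values are hypothetical placeholders, and the upstream steps (rarefaction, ComBat correction, diversity indices) are omitted.

# Minimal sketch, not the full pipeline: random-effects meta-analysis of
# standardized mean differences with metafor. All numbers are hypothetical.
library(metafor)

dat <- data.frame(
  study = paste("Study", 1:5),
  m1i  = c(0.20, 0.15, 0.25, 0.18, 0.22),  # TBI group means
  sd1i = c(0.08, 0.05, 0.10, 0.07, 0.09),
  n1i  = c(20, 15, 30, 25, 18),
  m2i  = c(0.30, 0.28, 0.27, 0.33, 0.29),  # control group means
  sd2i = c(0.09, 0.06, 0.08, 0.10, 0.07),
  n2i  = c(22, 16, 28, 24, 20)
)

# compute SMDs (Hedges' g) and sampling variances from the summary data
dat <- escalc(measure = "SMD", m1i = m1i, sd1i = sd1i, n1i = n1i,
              m2i = m2i, sd2i = sd2i, n2i = n2i, data = dat)

res <- rma(yi, vi, data = dat, method = "REML")  # reports I^2 and tau^2
forest(res, slab = dat$study)  # forest plot with 95% confidence intervals
leave1out(res)                 # leave-one-out sensitivity analysis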

Forest plot comparing phylum level and genus level.
Forest plot comparing phylum-level and genus-level relative abundance between TBI and control groups. Actinobacteria were significantly depleted in the TBI group with moderate heterogeneity (I² = 55.7%, SMD = 0.00; 95% CI: −0.01 to 0.00; P = .033). Prevotella was also significantly depleted in the TBI group but showed high heterogeneity (I² = 84.6%, SMD = −0.38; 95% CI: −0.73 to −0.04; P = .03). Leave-one-out analysis did not significantly alter effect sizes or heterogeneity.

Results

Of 1212 records identified, 895 remained after removing duplicates. Thirty-seven articles underwent full-text review, and 5 studies met inclusion criteria. All included studies were conducted in the United States, published within the past 5 years, and involved adults (mean ages 19.3–52.7 years) with moderate to severe TBI. Microbiome profiling methods included 16S rRNA sequencing (n = 3), shotgun metagenomic sequencing (n = 1), and qPCR (n = 1). Alpha diversity analysis showed no overall difference in species diversity, richness, or evenness between TBI and controls. However, leave-one-out analysis revealed significantly reduced richness/evenness in TBI patients, with heterogeneity decreased to 29.3%. Relative abundance analysis indicated significant depletion of Actinobacteria (phylum) and Prevotella (genus) in TBI groups. Actinobacteria depletion exhibited moderate heterogeneity (I² = 55.7%, P = .033), whereas Prevotella depletion demonstrated high heterogeneity (I² = 84.6%, P = .03). Actinobacteria support gut homeostasis, while Prevotella aids in metabolizing complex nutrients. Sensitivity analyses did not substantially alter effect sizes or heterogeneity.

Conclusion

Our current findings suggest that TBI patients develop gut microbiota dysbiosis. Given the roles of Actinobacteria and Prevotella in maintaining gut homeostasis and metabolic function, their depletion may contribute to persistent inflammatory and neurocognitive abnormalities following TBI. These taxa may therefore represent potential diagnostic biomarkers and therapeutic targets for microbiota-directed interventions aimed at improving outcomes in TBI patients.

Acknowledgements

Supported by the NIH (R01HD107722 and R56HD102898 to A.E.M.), IHAF (Z.A.K. and L.-S.J.), and UF Anesthesiology. We appreciate the assistance of UF undergraduate students.


Rethinking Clock Drawing Norms in the Smartphone Era: A Proposal

Jazzmynblu Hankin; Catherine Price, Ph.D., ABPP-cn

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Background

The Clock Drawing Test (CDT) is a brief neuropsychological measure that is widely used as a cognitive screener and is increasingly implemented in preoperative clinics for older adults. However, its clinical interpretability may shift as future generations have less lifelong analog clock exposure due to advancements in technology. Recent literature questions whether CDT errors in younger adults reflect cognitive impairment or reduced analog familiarity and greater reliance on digital displays. In a sample of self-reported cognitively healthy adults aged 18 to 30, about 25% scored below expected CDT ranges; most did not wear an analog watch, and some struggled to read analog clock faces. These findings raise concerns about cohort-related differences in analog familiarity. Consistent with this, an editorial in Research in Gerontological Nursing cautions that such differences may bias CDT interpretation and increase false positives in perioperative screening. Despite these emerging concerns, a bibliometric analysis documents continued growth in CDT publications. Although CDT performance has been examined in younger adults, most studies emphasize overall final scores rather than process-based elements of clock production. As a result, few studies have directly compared digital pen-derived latency and graphomotor signatures of clock drawing between younger and older cohorts.

Objectives

This project will (1) establish contemporary normative reference data for CDT performance in a University of Florida (UF) undergraduate cohort and (2) compare latency and graphomotor metrics between the UF cohort and cognitively unimpaired older adults using published normative references from UF and the Framingham Heart Study. Findings will be used to evaluate whether the CDT remains a viable metric.

Methods

In this cross-sectional comparative study, a cohort of 100 UF students aged 18 to 23 will complete a 30-minute protocol that includes the CDT administered under standardized instructions and recorded using digital pen technology. Graphomotor and time-based CDT elements will be assessed. Key metrics will include time to completion, digit placement accuracy, hand setting accuracy, and hand length ratio. Additional procedures will include 1) completion of a demographic questionnaire assessing educational history, handedness, and history of learning disorders that may influence clock production; 2) assessment of time-telling abilities; 3) a survey on time-telling habits; and 4) completion of the Mini-Mental State Examination (MMSE) to evaluate cognitive functioning.

Expected Contributions to Literature

Upon completion, it is expected this investigation will provide insight into how age cohort alters clock drawing production. Findings will inform whether CDT interpretation in younger adults requires updated norms and/or task redesign. They may also help reduce misclassification in clinical screening contexts where brief measures are used, including perioperative settings. Future work may extend this framework to a small neuroimaging sub-study. This would test the hypothesis that analog clock drawing in adults raised within predominantly digital environments elicits greater frontal/executive activation, suggesting the task is less automatic and requires more active problem-solving. Furthermore, if younger adults demonstrate consistent, experience-related CDT error patterns, these findings may support refinements in CDT interpretation or the development of new brief screening tools that more accurately identify cognitive risk or difficulties.


Level of Training Does Not Appear to Affect Outcomes When Using Biplane Imaging

Holly Ryan, PhD; Amanda Frantz, MD; Cynthia Garvan, PhD; Nikolaus Gravenstein, MD; Joshua Sappenfield, MD

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

Biplane imaging is an ultrasound technique that provides 2 perpendicular planes of view through a target to improve accuracy when localizing deep structures, such as when placing a peripheral IV (Figure 1). Simultaneous views of the long and short axes improve users’ ability to track the position of a needle as it is advanced. The purpose of this study was to determine if there was a difference in performance based on the level of training when using biplane imaging compared to single-plane imaging in terms of the time and number of attempts needed to place a deep peripheral IV.

Methods

Twenty-six consenting residents and 25 attendings were recruited from the UF Department of Anesthesiology to participate in a simulation comparing deep peripheral IV placement using biplane versus single-plane ultrasound imaging. IV placements were assessed in terms of time to access, number of attempts, number of redirections, and number of backwall perforations. New attempts involved pulling the needle all the way back to the phantom surface, whereas any reversal of needle insertion was considered a redirection (Figure 2). Backwall perforations were defined as instances of puncturing the posterior vessel wall of the phantom. Data were analyzed using Wilcoxon rank-sum testing with α = .05.
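As a concrete illustration, the comparison for a single outcome could be run as in the R sketch below; the simulated counts are hypothetical placeholders, not the study data.

# Minimal sketch, not the study code: Wilcoxon rank-sum test comparing the
# number of redirections between imaging modalities. Counts are simulated.
redirections <- data.frame(
  modality = factor(rep(c("biplane", "single_plane"), each = 51)),
  count    = c(rpois(51, lambda = 1.3), rpois(51, lambda = 1.7))
)
wilcox.test(count ~ modality, data = redirections)  # evaluated at alpha = .05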

The biplane ultrasound simulator being used and a closeup of the simulator screen
Figure 1: Biplane imaging provides 2 perpendicular fields of view.

Results

The biplane imaging modality, when compared to single-plane imaging, resulted in significantly fewer mean (SD) redirections [1.33(1.10) vs 1.73(1.55), P = .0001] and backwall perforations [0.08(0.16) vs 0.21(0.33), P = .0015]. There was no difference between residents and attendings in the change in time to access, number of attempts, number of redirections, and number of backwall perforations for biplane vs single-plane imaging.

Illustration of a new attempt and a backwall perforation.
Figure 2: Illustration of a new attempt and a backwall perforation.

Discussion

This IV-placement simulation demonstrates that, compared to single-plane imaging, biplane imaging decreases the number of redirections and backwall perforations inflicted during target ‘vessel’ access. We did not find a difference in time to access, number of attempts, number of redirections, or number of backwall perforations based on users’ level of training. The similar outcomes between residents and attendings may reflect comparable levels of familiarity with this relatively novel imaging modality. This finding further suggests that the benefits of biplane imaging are not restricted solely to highly experienced users. Additional research is needed to determine whether outcomes using biplane imaging in clinical practice correspond to those obtained using the phantom.

Conclusion

Biplane imaging improved needle placement for simulated deep vascular targets, while user training level had no effect.

Data presented here represent an update to a previously published work: Qu G, Frantz AM, Garvan CS, Gravenstein N, Sappenfield JW. Biplane utilization improves accuracy for peripheral IV placement. J Clin Ultrasound. 2025;53(4):819-824. doi:10.1002/jcu.23943

IRB20210291; Funding: Internal; Drawings created in BioRender


Introduction of Trained Observer Improves Doffing and Donning Compliance in Anesthesiology Trainees

Terrie Vasilopoulos, PhD1,2; Cameron R. Smith, MD, PhD1; Amanda M. Frantz, MD1; Thomas LeMaster, MSN, MEd3; Ramon Andres Martinez, BS1; Amy M. Gunnett, BSN, CCRC4; Brenda G. Fahy, MD, MCCM1

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville, Florida; 2 Department of Orthopaedic Surgery and Sports Medicine, University of Florida College of Medicine, Gainesville, Florida; 3 Center for Experiential Learning and Simulation, University of Florida College of Medicine, Gainesville, Florida; 4 UF Health Cancer Center, University of Florida, Gainesville, Florida

Introduction

Due to the multiple pandemics over the past 2 decades, the teaching of proper use of personal protective equipment (PPE) is of paramount importance, both in terms of health care workers’ personal safety and prevention of infection spread to others. Our group has previously demonstrated the efficacy of video training for improving donning and doffing of PPE. Specifically, we found that while improved donning compliance was maintained at follow-up, doffing compliance reverted to pre-training levels. The current study examined the influence of a trained observer on donning and doffing PPE compliance.

Methods

Multiple cohorts of anesthesiology trainees were recruited and assessed from 2020–2025. Initial assessments involved trainees donning and doffing PPE without instruction, with video ratings of proper compliance. Trainees then received educational training (via video) on proper PPE donning and doffing and repeated the donning and doffing assessment. One group received only the video, and the other group had a trained observer present during the second donning and doffing assessment. Participants then completed a long-term assessment at least 7 months after the initial assessments and educational training/trained observer interaction (note: the long-term retention assessment did not include a trained observer, and the educational video was not repeated). The primary outcome was percent overall compliance across different PPE donning and doffing areas. Linear mixed models were used to analyze differences in change in compliance over time between groups, with Tukey’s test for pairwise comparisons. P < .05 was considered statistically significant.
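A minimal R sketch of this model structure follows; it assumes a hypothetical long-format data set (one row per trainee per timepoint) and is not the study code.

# Minimal sketch, not the study code: does the change in compliance over time
# differ between groups? Tested via the group-by-time interaction.
library(lme4)
library(emmeans)

# ppe_data (hypothetical): trainee, group (Video, Video+TO),
# time (initial, post, long_term), compliance (percent)
fit <- lmer(compliance ~ group * time + (1 | trainee), data = ppe_data)

# Tukey-adjusted comparisons of timepoints within each group
emmeans(fit, pairwise ~ time | group, adjust = "tukey")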

Results

A total of 108 trainees completed long-term retention assessments, including 78 who received only educational video training (Video), while 30 had the addition of a trained observer (Video+TO). For donning, both groups showed nonsignificant (P > .05) changes in compliance between immediate post-training and long-term retention assessments, with a 4.8% (95%CI: -0.1 to 9.8) decrease in the Video group and a 7.8% (95%CI: -0.5 to 12.8) decrease in the Video+TO group. However, for doffing, the Video group showed a significant decrease (P = .001) in compliance between immediate post-training and long-term retention assessments, with an 11.2% decrease (95%CI: 4.4 to 18). The Video+TO group showed little change in doffing compliance (4.2% increase, 95%CI: -15.1 to 6.7, P = .45).

Conclusions

Anesthesiology trainees receiving an educational intervention for proper PPE donning and doffing, both with and without the presence of a trained observer, demonstrated high long-term retention of donning compliance. However, for doffing, trainees receiving only the educational video training showed significant decreases in compliance long term, whereas those who had the trained observer maintained compliance. Our results show that for proper PPE doffing, a short video education alone is not sufficient to maintain long-term compliance; additional training interventions, such as the inclusion of a trained observer, are needed to bolster and maintain doffing compliance.


ChatGPT in Medical Education: A National Survey of Medical Student Experiences

Alan Xu; Meghan Brennan, M.D.

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Background

Artificial intelligence (AI) is increasingly influencing medical education, with AI-driven chatbots like ChatGPT emerging as powerful study tools among medical students. While these technologies offer numerous benefits, they also pose challenges that warrant the adaptation of medical school curricula. We hypothesized that ChatGPT is widely used by medical students for academic study support but that concerns remain regarding its reliability and academic integrity.

Methods

We conducted a cross-sectional study from 08/25/2024 to 12/10/2024 in the United States. Students in all years of medical training who were enrolled in accredited allopathic or osteopathic medical schools during this period were eligible to participate. Data were collected using an anonymous online questionnaire distributed through institutional mailing lists. A total of 177 participants, representing 14 allopathic and osteopathic medical schools across the U.S., completed the survey. Survey items consisted primarily of Likert-scale and multiple-choice questions.

Results

Nearly all participants (98.9% [175/177]) had heard of ChatGPT, with 88.7% (157/177) reporting having used it. Medical students most often used ChatGPT to understand complex medical concepts, prepare for exams, and generate study materials. Almost half of users (46.5% [73/157]) reported using it to help complete medical school assignments. Medical students also reported using it clinically, with the most common use being to generate differential diagnoses. Of note, 21.0% (33/157) reported having used ChatGPT to help write clinical notes. Most users (73.9% [116/157]) reported that their experience with ChatGPT improved their overall perception of AI’s potential to assist in medical practice, and 86.6% (135/157) believed having it as a resource would make them more effective doctors. Statistical analyses were performed using Pearson’s chi-square test with an α of 0.05. Students who reported moderate or advanced baseline understanding of AI were more likely to practice conscientious use habits, such as crosschecking (OR = 2.31, 95% CI 1.08-4.97) and editing (OR = 2.45, 95% CI 1.05-5.71) ChatGPT output before using it, compared to those who reported basic or limited understanding.
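For readers who want to reproduce this kind of estimate, the sketch below computes an odds ratio with a Wald 95% CI and a Pearson chi-square test from a 2×2 table; the counts are hypothetical placeholders, not the study data:

```python
# Illustrative odds-ratio calculation of the kind reported above; the 2x2
# counts below are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: baseline AI understanding (moderate/advanced vs. basic/limited)
# Cols: crosschecks ChatGPT output before use (yes vs. no)
table = np.array([[50, 25],
                  [30, 45]])
a, b = table[0]
c, d = table[1]

odds_ratio = (a * d) / (b * c)                      # cross-product ratio
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)

chi2, p, dof, expected = chi2_contingency(table)    # Pearson chi-square
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}, p = {p:.3f}")
```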

Bar chart showing usage of ChatGPT among medical students.
Frequency of use of ChatGPT among medical students in the United States (08/25/2024 to 12/10/2024) for clinically related purposes in medical school.

Conclusions

Our study is among the few to examine medical student perceptions of ChatGPT on a national level. Moreover, we uniquely examined responsible-use habits to identify areas where reliance on this technology may lead users astray. We found that ChatGPT is being used to complete academic assignments as well as to write clinical notes, raising concerns about information verification, AI literacy, patient confidentiality, and the ethical use of AI for academic activities. Altogether, these findings highlight the need for structured AI education within medical curricula. Integrating AI-focused training can help students leverage these technologies effectively while mitigating the risks associated with misinformation and overreliance on AI-generated content.


When Paralysis Outlasts Hypnosis: Pharmacodynamic Mismatch Between Methohexital and Succinylcholine During ECT

Carmelina Gorski, BS1; Asad Bashir, MD1; Brent Carr, MD2 

1 Department of Anesthesiology, University of Florida College of Medicine, Gainesville; 2 Department of Psychiatry, University of Florida College of Medicine, Gainesville

Introduction

Methohexital (MHT) and succinylcholine (SCh) are commonly co-administered during electroconvulsive therapy (ECT) to facilitate rapid induction and brief neuromuscular blockade (NMB). A recognized clinical concern is the temporal mismatch between the hypnotic effects of MHT and the duration of SCh-induced paralysis. Inadequate SCh dosing increases the risk of musculoskeletal injury and postictal agitation, whereas prolonged NMB in the setting of insufficient hypnosis raises concern for awareness during paralysis and respiratory compromise. This mismatch may be amplified in patients with elevated body mass index, where pharmacokinetic variability is increased.

Methods

We reviewed the pharmacokinetics and pharmacodynamics of anesthetic agents commonly used during ECT, with a focus on MHT and SCh. Literature addressing the duration of SCh-induced NMB, dosing strategies in obese and morbidly obese populations, and clinical implications of induction–paralysis mismatch was synthesized.

Results

MHT is a short-acting barbiturate with rapid onset and redistribution-dependent offset, resulting in a typical duration of action of approximately 4–7 minutes. In contrast, SCh produces rapid-onset NMB with a functional duration influenced by dose, plasma pseudocholinesterase activity, and body composition.

Current dosing guidelines recommend MHT at 0.5–1.5 mg/kg and SCh at 0.5–1.0 mg/kg, though practice patterns vary. In obese patients, SCh dosing is based on total body weight (TBW) rather than ideal body weight (IBW), reflecting increased extracellular fluid volume and pseudocholinesterase activity. When MHT dosing or hypnotic depth is insufficient relative to TBW-based SCh dosing, paralysis may outlast hypnosis, increasing the risk of awareness during NMB. To prevent complications from unnecessarily prolonged paralysis, however, a recent case report suggests that it is safer to dose SCh at 1 mg/kg of IBW.
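As a back-of-the-envelope illustration of how far these two strategies can diverge, the sketch below compares TBW- and IBW-based SCh doses for a hypothetical 120-kg, 170-cm male patient, estimating IBW with the widely used Devine formula (our choice of example; the 1 mg/kg IBW figure is the case-report recommendation cited above):

```python
# Back-of-the-envelope comparison of TBW- vs. IBW-based SCh dosing for a
# hypothetical patient; a numerical sketch, not dosing guidance.
def ibw_devine_kg(height_cm: float, male: bool) -> float:
    """Devine estimate: 50 kg (men) or 45.5 kg (women) + 2.3 kg/inch over 5 ft."""
    inches_over_5ft = max(height_cm / 2.54 - 60.0, 0.0)
    return (50.0 if male else 45.5) + 2.3 * inches_over_5ft

tbw_kg, height_cm = 120.0, 170.0              # hypothetical obese male patient
ibw_kg = ibw_devine_kg(height_cm, male=True)  # ~65.9 kg

print(f"SCh 1 mg/kg TBW: {tbw_kg:.0f} mg")    # 120 mg
print(f"SCh 1 mg/kg IBW: {ibw_kg:.0f} mg")    # ~66 mg

# The ~54 mg difference is the margin by which TBW-based paralysis can
# outlast methohexital's redistribution-limited (~4-7 min) hypnotic effect.
```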

Alternative strategies, such as rocuronium with sugammadex reversal, offer greater control over the duration of NMB. However, this approach introduces additional cost and pharmacologic complexity, and available cohort data do not demonstrate superior clinical outcomes compared with SCh during ECT.

Conclusion

Awareness with residual NMB may result from pharmacodynamic mismatch between MHT and SCh, particularly when TBW-based SCh dosing is not accompanied by adequate hypnotic depth. There is room for discussion regarding the optimal SCh dosing strategy in obese patients. Ensuring unconsciousness prior to NMB, along with individualized titration and appropriate monitoring, remains essential. Future work should explore optimized MHT dosing strategies in obese patients and the role of neuromuscular and depth-of-anesthesia monitoring to reduce risk during ECT.


Double Quarter Turn of the Head and Plumb Bob from the Corner?: A Novel Landmark-Based Approach for Pterygopalatine Fossa Block

Hadia Maqsood, MBBS; Barys Ihnatsenka, M.D.

Department of Anesthesiology, University of Florida College of Medicine, Gainesville

Introduction

The pterygopalatine fossa block (PPFB)—also referred to as the pterygopalatine ganglion block (PPGB), sphenopalatine ganglion block (SPGB), or truncal maxillary nerve/V2 block—is used across a wide range of procedures involving V2-innervated structures. These include palatal surgery, midface and maxillary osteotomies, functional endoscopic sinus surgery, nasal surgery, and tonsillectomy or adenoidectomy. Less frequent applications include ophthalmic procedures [15–16] and its incorporation as part of scalp blocks for craniotomy. Additionally, PPFB has been employed in the management of primary vascular headaches and postdural puncture headaches.

Multiple landmark-based and ultrasound-guided (USG) approaches to PPFB have been described. However, no comparative clinical trials exist, and the optimal technique remains undetermined. Although US guidance provides important advantages, consistent access to equipment and training is not universal. Consequently, a reliable and safe landmark-based technique remains clinically valuable. Existing anatomical approaches vary in needle entry point and trajectory (ref), yet many are challenging to teach or reproduce due to anatomical variability of the facial surface.

A Novel Technique

During a series of US-guided PPFBs, we identified recurring technical patterns that suggested opportunities for a simplified, reproducible landmark-based method. Three observations formed the foundation of this approach.

First, when the needle was inserted precisely at the frontozygomatic (FZ) angle, the optimal trajectory consistently required approximately 25° of posterior and caudad angulation relative to the sagittal and axial planes of the head (Figs. 1–2).

Second, we found that defining needle angulation with respect to anatomical planes—rather than the external facial contour—resulted in far greater consistency across patients.

Third, the most reliable way to reproduce these angles was not by manipulating the needle, but by adjusting the patient’s head orientation and then advancing the needle perpendicular to the floor, analogous to a plumb bob.

These two axial head CT images show the trajectory from the FZ angle to the PPF.
Fig 1. On the left: trajectory from the FZ angle to the PPF (yellow thick dashed line) on axial head CT images, showing approximately 75 degrees relative to the sagittal plane (thin yellow dashed line). On the right: the image is rotated. The same trajectory is recreated with a plumb-bob approach when the head is “rotated laterally” so the sagittal plane is angled about 25 degrees relative to the floor. Without head rotation (left image), it is difficult to set the correct angle when relying only on the facial surface—here the trajectory appears perpendicular to the skin surface (green line) but is actually 75 degrees relative to the sagittal plane due to head-shape variation. The PPF is outlined by a red circle.

Description of the Technique (When Ultrasound Is Unavailable)

The block is performed with the patient supine after induction of general anesthesia.

  1. The head is rotated laterally, away from the injection side, so that the sagittal plane is angled approximately 25° relative to the OR table.
  2. The cephalad portion of the table is then elevated by an additional 25° relative to the floor.
  3. After palpating the FZ angle, the needle is introduced approximately 5 mm posterior and 5 mm cephalad to the bony contours.
  4. The needle is advanced vertically toward the floor—the “plumb-bob” trajectory.

With the patient’s head positioned as described, this straight-down trajectory produces approximately 25° of posterior and 25° of caudad angulation relative to the cranial planes, replicating the US-derived ideal trajectory. Advancement of 4–5 cm (depending on age, sex, and BMI) positions the needle tip near the pterygopalatine fossa.
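The geometry can also be checked numerically. The sketch below (our own frame and rotation conventions, not the authors’ derivation) composes the two positioning steps as rotation matrices, reading the head rotation as 65° from supine neutral so that the sagittal plane sits roughly 25° off the table, and confirms that a straight-down needle then carries approximately 25° of posterior and 25° of caudad angulation in the cranial frame:

```python
# Numerical check of the "double 25 degrees" positioning (our own frame and
# rotation conventions; a sketch, not the authors' derivation).
# World frame: x = toward the feet, y = patient's left, z = up.
# Patient supine, right-sided block, needle advanced straight down.
import numpy as np

def rot_x(deg):  # rotation about the table's long (craniocaudal) axis
    t = np.radians(deg); c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(deg):  # rotation about the table's left-right axis
    t = np.radians(deg); c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# Head anatomical axes in world coordinates, supine and neutral:
medial    = np.array([0.0, 1.0, 0.0])   # right to left, toward midline
posterior = np.array([0.0, 0.0, -1.0])
caudad    = np.array([1.0, 0.0, 0.0])

# Step 1: rotate the head 65 deg from neutral (face toward the left) so the
# sagittal plane sits ~25 deg off the table; step 2: elevate the head end 25 deg.
R = rot_y(25) @ rot_x(-65)
m, p, c = R @ medial, R @ posterior, R @ caudad

needle = np.array([0.0, 0.0, -1.0])     # plumb-bob: straight toward the floor

post_deg = np.degrees(np.arctan2(needle @ p, needle @ m))  # posterior tilt off pure medial (axial view)
caud_deg = np.degrees(np.arcsin(needle @ c))               # caudad tilt off the axial plane
print(f"posterior ~ {post_deg:.1f} deg, caudad ~ {caud_deg:.1f} deg")  # ~25.0 and ~25.0
```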

This coronal head CT scan shows 25 degrees of caudad angulation.
Fig 2. Coronal head CT scan showing approximately 25 degrees of caudad angulation from the FZ angle toward the pterygopalatine fossa (PPF) that is shaded red.

Discussion

This method offers several advantages. The “double 25°” head positioning is intuitive and easy to remember. The plumb-bob trajectory simplifies needle control and minimizes reliance on variable external facial landmarks. Importantly, this trajectory also reduces the risk of unintended orbital penetration—a complication reported with several suprazygomatic anatomical techniques that require anterior (philtrum) needle advancement.1

We validated this technique by using ultrasound to confirm injectate spread after landmark-based needle placement and consistently observed appropriate deposition of local anesthetic within the pterygopalatine fossa. Postoperative assessments further corroborated successful block performance.

This approach conceptually aligns with established landmark techniques that suggest aiming the needle towards the contralateral tragus but provides a more standardized and user-friendly execution.

To further evaluate this technique’s precision, safety, and potential limitations, we plan to accumulate additional clinical experience and perform an anatomical study evaluating our proposed trajectory on 100 head CT scans (across varying ages, sexes, and BMIs).


Acknowledgements

The Department of Anesthesiology wishes to thank the staff from the Communications & Publishing, Clinical Research, and Faculty Support Services teams for working together to make this event successful.

We also thank the many faculty, housestaff, and students who have taken the time to submit their abstracts and make their posters available for this event.

Finally, we would like to thank the abstract judges for carefully reviewing this year’s submissions.