Evelyn Madewell, a graduate student in the Department of Aerospace Engineering at Texas A&M University, has received a prestigious 2025 fellowship from the National Science Foundation Graduate Research Fellowship Program (NSF GRFP).
Madewell, a Ph.D. student, graduated from the University of Washington with a major in Aeronautical and Astronautical Engineering and a minor in Applied Mathematics. At UW, her capstone project received the Capstone Technical Excellence Award, and her team went on to become the first U.S. team to compete in the European Air Cargo Challenge with their designed aircraft. Madewell was also recognized as one of the “Trailblazing Women of Aerospace” in the UW Aero & Astro Highflight newspaper.
Madewell has interned with Freefly Systems as a flight test and software engineer, where she flew many of the company’s industrial drones, executing test plans, retrofitting them with various updates, and creating production scripts. She then interned with Hood Technology, where she conducted a literature review on vision-based UAV navigation while still in her last quarter of undergrad. During the summer, she developed a tracking display for in-flight UAV docking from radar test data and drafted and iterated on skyhooking models through prototype testing. This upcoming summer she will contribute to the Stratolaunch Talon-A system as a guidance, navigation, and control intern.
Her NSF proposal addressed a gap in aerial navigation: despite the growing applications and technologies available, current navigation systems for Unmanned Aerial Vehicles (UAVs) typically rely on a pairing of GNSS location information, which can be unreliable near obstacles or simply unavailable in remote environments, and IMU measurement data, which accumulates position error over time. Madewell will work with Professor Valasek to develop a novel, vision-based navigation system that precisely estimates aerial vehicle pose and location in GPS-denied scenarios, enabling UAVs to reach targets and fly in previously inaccessible areas and providing a reliable alternative during GPS outages.
Madewell says, “I’m thrilled to accept this prestigious honor. I am especially grateful to Dr. Valasek for his encouragement in pursuing this as well as to Dr. Vagners for his advice along the way — without these two mentors I would not be the engineer I am today. As a recipient of the NSF Graduate Research Fellowship, I am excited to utilize its resources while continuing my research alongside Dr. Valasek and the Vehicle Systems & Control Lab team as a graduate student at Texas A&M University.” Congratulations, Evelyn! VSCL is thrilled to have you on our team!





Zach Curtis graduated from 






Approaches for teaching learning agents via human demonstrations have been widely studied and successfully applied to multiple domains. However, the majority of imitation learning work utilizes only behavioral information from the demonstrator, i.e., which actions were taken, and ignores other useful information. In particular, eye gaze information can give valuable insight into where the demonstrator is allocating visual attention, and holds the potential to improve agent performance and generalization. In this work, we propose Gaze Regularized Imitation Learning (GRIL), a novel context-aware imitation learning architecture that learns concurrently from both human demonstrations and eye gaze to solve tasks where visual attention provides important context. We apply GRIL to a visual navigation task, in which an unmanned quadrotor is trained to search for and navigate to a target vehicle in a photo-realistic simulated environment. We show that GRIL outperforms several state-of-the-art gaze-based imitation learning algorithms, simultaneously learns to predict human visual attention, and generalizes to scenarios not present in the training data.
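The core idea of gaze regularization, as the abstract describes it, is to train on demonstrated actions and recorded eye gaze concurrently. A minimal sketch of such a combined objective is below; the function names, the mean-squared-error losses, and the weighting parameter `lam` are illustrative assumptions, not the paper's actual architecture or loss.

```python
import numpy as np

def bc_loss(pred_actions, demo_actions):
    """Behavior-cloning term: error between policy actions and demonstrated actions."""
    return float(np.mean((pred_actions - demo_actions) ** 2))

def gaze_loss(pred_gaze, human_gaze):
    """Gaze term: error between the model's predicted attention and recorded human gaze."""
    return float(np.mean((pred_gaze - human_gaze) ** 2))

def gril_style_loss(pred_actions, demo_actions, pred_gaze, human_gaze, lam=0.5):
    """Combined objective: imitation loss plus a gaze-prediction regularizer.

    `lam` (hypothetical hyperparameter) trades off action imitation against
    matching the demonstrator's visual attention.
    """
    return bc_loss(pred_actions, demo_actions) + lam * gaze_loss(pred_gaze, human_gaze)
```

In a full training loop, both predictions would come from a shared network, so minimizing the gaze term shapes the visual features the policy relies on.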