
Intelligent Vision Sensing For Motion Based Guidance

State of Texas Advanced Research Program, Austin, TX
1 January 2002 – 31 December 2003
Co-P.I.: John L. Junkins
Total award: $240,000

NASA Langley Research Center

VisNav Glove Flight System
Air vehicles have always required many hours of pilot training to reach a sufficient level of competence. Most people can understand the pitching, rolling, and yawing motions of an airplane simply by watching one fly; however, translating those motions into control stick, throttle, and rudder pedal movements is much less intuitive. This research focuses on the development of a new glove-based input device built on the Vision-Based Navigation system, VisNav, developed at Texas A&M. The data-glove interface is designed to enable the average person to command and fly an aircraft using only hand motions, a natural and intuitive way to pilot an airplane that requires very little specialized training. It is a particularly useful capability for rapid prototyping and evaluation of flight control concepts in real-time flight simulator facilities, and the concept can also be extended beyond the simulator to allow remote control of semi-autonomous unmanned aerial vehicles.
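As a rough illustration of the mapping such a glove provides, the sketch below converts a measured hand attitude into stick, throttle, and rudder pedal commands. The gains, dead-band, axis conventions, and the glove_to_command() helper itself are illustrative assumptions, not the VSCL implementation.

```python
# A minimal sketch, assuming glove roll/pitch/yaw drive stick and rudder and
# glove height drives throttle; all gains and conventions are illustrative.
from dataclasses import dataclass

@dataclass
class ControlCommand:
    aileron: float   # -1 (left stick)  .. +1 (right stick)
    elevator: float  # -1 (stick push)  .. +1 (stick pull)
    rudder: float    # -1 (left pedal)  .. +1 (right pedal)
    throttle: float  #  0 (idle)        ..  1 (full)

def clamp(x: float, lo: float = -1.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))

def deadband(x: float, width: float = 0.05) -> float:
    """Ignore small hand tremors near the neutral pose."""
    return 0.0 if abs(x) < width else x

def glove_to_command(roll_rad: float, pitch_rad: float, yaw_rad: float,
                     height_m: float) -> ControlCommand:
    """Map a glove attitude (radians) and height (meters) to cockpit inputs."""
    k = 2.0  # attitude-to-stick gain (assumed)
    return ControlCommand(
        aileron=clamp(k * deadband(roll_rad)),
        elevator=clamp(k * deadband(pitch_rad)),
        rudder=clamp(k * deadband(yaw_rad)),
        throttle=clamp(height_m, 0.0, 1.0),  # raising the hand adds throttle (assumed)
    )
```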

Over the last five years, the Aerospace Engineering Department at Texas A&M University has been researching and developing an intelligent Vision-Based Navigation system called VisNav. The VisNav system combines a new kind of optical sensor with structured active light sources (beacons) to achieve selective, or “intelligent,” vision. The light is structured in the frequency domain, analogous to radar, so that discrimination and target identification are nearly trivial even in a noisy ambient environment. We have applied this technology to the autonomous docking and rendezvous of spacecraft (NASA Johnson Space Center), the autonomous landing of UAVs on ships (Office of Naval Research), and the autonomous aerial refueling of UAVs (Army Research Office). In short, VisNav suits any application where extremely accurate relative position and rate information is needed from miniaturized equipment.
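The published VisNav work estimates the full six-degree-of-freedom relative pose iteratively from the beacon measurements. The sketch below shows the flavor of that estimate in a simplified, position-only form: assuming the sensor attitude is already known, each measured line of sight constrains the sensor to a line through a beacon, and the least-squares position falls out of a small linear solve.

```python
# A simplified, position-only illustration of VisNav-style least squares;
# the real sensor also solves for attitude, which makes the problem nonlinear.
import numpy as np

def estimate_position(beacons: np.ndarray, los: np.ndarray) -> np.ndarray:
    """Least-squares sensor position from unit line-of-sight vectors.

    Each LOS u_i constrains the sensor to the line through beacon b_i with
    direction u_i; the minimizer of the summed squared distance to those
    lines satisfies  sum_i (I - u_i u_i^T) (p - b_i) = 0.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for beacon, u in zip(beacons, los):
        P = np.eye(3) - np.outer(u, u)  # projector onto the plane normal to u
        A += P
        b += P @ beacon
    return np.linalg.solve(A, b)

# Four beacons and an exact set of line-of-sight vectors recover the
# sensor position to machine precision.
beacons = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 1.]])
p_true = np.array([0.5, 0.5, 5.0])
los = p_true - beacons
los /= np.linalg.norm(los, axis=1, keepdims=True)
print(estimate_position(beacons, los))  # ~ [0.5, 0.5, 5.0]
```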

Specific tasks and research objectives:

  • Identify the technology factors and requirements for extending the basic VisNav technology.
  • Use these technology factors and requirements to enable development of a VisNav wireless data glove for the remote control of vehicles using hand motions and gestures.
  • Demonstrate real-time operation of the data glove by using it to control a high-fidelity, real-time flight simulator (a sketch of this command loop follows the list).
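A hedged sketch of that demonstration loop appears below: poll the glove, map its pose to control commands (reusing glove_to_command from the first sketch), and stream the result to the simulator over UDP. The packet layout, address, port, and the read_glove_pose() helper are hypothetical placeholders for whatever input interface a given simulator facility actually exposes.

```python
# A minimal command loop, assuming a UDP-listening simulator; every interface
# detail here (port, packet format, glove driver) is a placeholder.
import socket
import struct
import time

SIM_ADDR = ("127.0.0.1", 5500)  # assumed simulator input endpoint

def read_glove_pose():
    """Hypothetical stand-in for the VisNav glove driver."""
    return 0.0, 0.0, 0.0, 0.5  # roll, pitch, yaw [rad], height [m]

def stream_commands(rate_hz: float = 50.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    while True:
        roll, pitch, yaw, height = read_glove_pose()
        # glove_to_command() is defined in the mapping sketch above.
        cmd = glove_to_command(roll, pitch, yaw, height)
        packet = struct.pack("!4f", cmd.aileron, cmd.elevator,
                             cmd.rudder, cmd.throttle)
        sock.sendto(packet, SIM_ADDR)
        time.sleep(period)
```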

Working with me on this program are the following Graduate Research Assistants:

  • Brian Wood
  • Roshawn Bowers
  • Yuanyuan Ding
