The GeoPoints project (1998–2004) at the GeoVirtual Laboratory at West Virginia University was research into mobile interactive augmented reality, in which a virtual overlay of "GeoPoints" existed within the real-world environment. These GeoPoints were interactive: each had spatial dimensions and trigger 'fences', and could respond to a user based on how the user encountered it.
 
A user could easily create GeoPoints to within 20 mm accuracy in six degrees of freedom, placing them over tables, doorways, points of interest, almost anywhere. Each GeoPoint's behavior could then be defined: whether it would trigger on an event, invoke a multimedia object, perform a scheduled event, or run a series of events.
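The original GeoPoints software is not public, so the following is only a minimal sketch of how such a point might be modeled: a hypothetical `GeoPoint` class with a six-degree-of-freedom pose, a trigger-fence radius, and a behavior callback fired when a user crosses the fence. All names, fields, and values here are illustrative assumptions, not the project's actual API.

```python
import math
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Pose6DOF:
    """Position (meters, local frame) plus orientation (degrees)."""
    x: float
    y: float
    z: float
    pitch: float = 0.0
    roll: float = 0.0
    yaw: float = 0.0

@dataclass
class GeoPoint:
    """Hypothetical model of an interactive GeoPoint with a trigger fence."""
    name: str
    pose: Pose6DOF
    fence_radius_m: float                    # radius of the trigger 'fence'
    on_enter: Callable[["GeoPoint"], None]   # behavior: event, media, schedule...
    inside: bool = field(default=False)      # tracks whether the user is in the fence

    def update(self, user: Pose6DOF) -> None:
        """Fire the behavior once when the user crosses into the fence."""
        d = math.dist((user.x, user.y, user.z),
                      (self.pose.x, self.pose.y, self.pose.z))
        was_inside, self.inside = self.inside, d <= self.fence_radius_m
        if self.inside and not was_inside:
            self.on_enter(self)

# Example: a GeoPoint over a doorway that invokes a multimedia clip on entry.
doorway = GeoPoint(
    name="lab-doorway",
    pose=Pose6DOF(x=4.20, y=1.10, z=2.00),
    fence_radius_m=1.5,
    on_enter=lambda gp: print(f"Entered fence of {gp.name}: play welcome clip"),
)
doorway.update(Pose6DOF(x=4.0, y=1.0, z=1.8))   # inside the fence -> behavior fires
```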
The early prototype consisted of a very large laptop with a camera, accelerometer, GPS, and microphone. The concept was to move away from keyboard devices and begin using hand gestures and voice commands to control an environment. The second obstacle was getting GPS indoors. From initial trials, we were able to "sweep" coordinates indoors from repeated traffic and from logs of various known beacons such as Wi-Fi access points, known physical routes, and markers. By building a 'mesh'-style navigation grid, we got down to about 2.5" accuracy indoors, enough to precisely locate a GeoPoint almost anywhere the user chose to place it. The "Sens8" accelerometer added another three degrees of freedom (pitch/roll/yaw), giving us far greater control over an object's positioning.
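The write-up does not detail the mesh algorithm, but one common way to approximate this kind of beacon-based indoor positioning is to interpolate the user's position from pre-surveyed access-point locations, weighted by signal strength. The sketch below assumes surveyed Wi-Fi coordinates and a simple log-distance RSSI model; the beacon names, positions, and constants are invented for illustration and are not the project's actual method.

```python
import math
from typing import Dict, Tuple

# Pre-surveyed beacon positions in a local metric frame (illustrative values).
BEACONS: Dict[str, Tuple[float, float]] = {
    "ap-hall":  (0.0, 0.0),
    "ap-lab":   (12.0, 0.0),
    "ap-lobby": (6.0, 9.0),
}

def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -40.0,
                     path_loss_exp: float = 2.5) -> float:
    """Log-distance path-loss model: rough range estimate from signal strength."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def estimate_position(readings: Dict[str, float]) -> Tuple[float, float]:
    """Weighted centroid of beacon positions; nearer beacons weigh more."""
    wx = wy = wsum = 0.0
    for ap, rssi in readings.items():
        bx, by = BEACONS[ap]
        w = 1.0 / max(rssi_to_distance(rssi), 0.1) ** 2
        wx, wy, wsum = wx + w * bx, wy + w * by, wsum + w
    return wx / wsum, wy / wsum

# A user standing nearest "ap-lab" resolves close to (12, 0).
print(estimate_position({"ap-hall": -70.0, "ap-lab": -45.0, "ap-lobby": -65.0}))
```

In practice, repeated traffic over known routes (as described above) would refine such estimates over time; this snippet shows only the single-reading case.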
Image of what the user sees in mixed reality, usually in a heads-up display or AR glasses. The system is pointed at a street, and spatially overlaid onto the street is a red "GeoPoint" with a "GeoFence" around it.
Second-generation augmented reality headset with high-definition stereo audio with 3D spatial positioning, voice control, an HD camera feeding a hybrid mix of machine vision and GPS-defined spatial awareness, and a mixed-reality display that overlays virtual objects projected onto the glasses.
Second-generation prototype of the controller device for the mobile wearable system.
A future user of Augmented Reality trying out the first versions of the research as early as 1998. 
A specific research outcome was the ability to use large-scale terrain overlaid with interactive virtual objects that had physical directives and controls within the real world. In this case, the user is looking at a chemical plant building and performing measurements of effective safe zones. The user has also pulled out a layer to view various assets (e.g., police, emergency vehicles) that are en route or have been deployed and are active within the situation. The wearer's computing system consists of an augmented reality headset, a 3D sound system, and a high-definition camera for machine vision, gesture recognition, and interaction.
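The safe-zone measurement described in this scenario could be approximated in software as a radius check against the plant's GeoFence. The sketch below uses haversine great-circle distance on GPS coordinates to classify assets as inside or outside the zone; the coordinates, radius, and asset list are all invented for illustration.

```python
import math
from typing import List, Tuple

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle distance in meters between (lat, lon) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(h))

# Illustrative plant location, safe-zone radius, and deployed assets.
plant = (39.6295, -79.9559)
safe_zone_m = 800.0
assets: List[Tuple[str, Tuple[float, float]]] = [
    ("police-1",    (39.6312, -79.9540)),
    ("ambulance-2", (39.6400, -79.9700)),
]

for name, pos in assets:
    d = haversine_m(plant, pos)
    status = "INSIDE safe zone" if d <= safe_zone_m else "outside safe zone"
    print(f"{name}: {d:.0f} m from plant ({status})")
```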