The Jet Propulsion Laboratory (JPL) collects hydrology data from the Western United States, using sensors in the field and algorithms JPL has developed, and stores it in its Watertrek database. One example is the use of snow pillows: sensors placed in the mountains that weigh the snow above them to approximate snow depth. Building on the Augmented Reality (AR) framework for Android delivered by last year's Senior Design team, version 2 of the Augmented Reality for Hydrology application will represent this hydrology data by implementing that framework.
Since the framework provided by last year's team was built from the ground up, there are limitations in implementing it in the Android application. The framework relies solely on the device's gyroscope, without computer vision as an aid, to place and track generated objects in the real world. When rendering a mesh for the surrounding terrain and billboards that represent the various hydrological data, it is difficult to track the objects and keep them in the correct positions. Because device hardware varies widely, the placement, orientation, and possibly scaling of the objects may need to be corrected, which can be done using network calls to an algorithm provided by JPL. This algorithm, called Line of Sight (LOS), returns the elevation of a target location given the location and orientation of the device, and its result can be used to check the placement and scaling of the terrain mesh. Another challenge is retrieving and representing the historical data for the different data types in a way that is understandable to the user. A line graph over time is a simple and clear way to represent the historical data so the user can see change over time. In addition, a list view of the data will be provided so the user can examine the historical data in greater detail. The software to be developed focuses on the following:
- Refine the interactive user interface that visualizes datasets in the field of view of the camera
- Present data in a graph format for clear interpretation, with a filter based on a time range
- Superimpose terrain mesh onto actual terrain through the camera view
- Provide line of sight information
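The LOS-based verification described above can be sketched as follows. This is a minimal illustration, not JPL's actual API: the method names, the tolerance value, and the idea of sampling the rendered mesh's elevation at the point the device is looking at are all assumptions.

```java
// Hypothetical sketch of verifying terrain-mesh placement against the
// LOS algorithm's reported elevation. Names and tolerance are assumptions.
public class MeshElevationCheck {

    // Difference between the elevation LOS reports for the sighted point
    // and the elevation our rendered mesh places at that same point.
    // This is the vertical correction to apply to the mesh, in meters.
    static double verticalCorrection(double losElevationMeters,
                                     double meshElevationMeters) {
        return losElevationMeters - meshElevationMeters;
    }

    // Accept the current mesh placement only when the discrepancy is
    // within a chosen tolerance; otherwise the mesh should be adjusted.
    static boolean meshPlacementOk(double losElevationMeters,
                                   double meshElevationMeters,
                                   double toleranceMeters) {
        return Math.abs(verticalCorrection(losElevationMeters, meshElevationMeters))
                <= toleranceMeters;
    }

    public static void main(String[] args) {
        double los = 2450.0;   // meters, as returned by the (hypothetical) LOS call
        double mesh = 2438.5;  // meters, sampled from our rendered terrain mesh
        System.out.println(verticalCorrection(los, mesh)); // prints 11.5
        System.out.println(meshPlacementOk(los, mesh, 5.0)); // prints false
    }
}
```

In the real application the LOS value would come from a network call to JPL's service, and the correction would feed back into the mesh's transform rather than being printed.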
Based on our testing results, we were able to correctly place the hydrological data points relative to the user's perspective and provide the historical data for each point of interest. The terrain mesh remains a work in progress, but it roughly represents the terrain around the user.
Developing Augmented Reality for Hydrology (Version 2) allows users to interactively learn about hydrology on the go. The AR technology extends what the user can see in the real world, and the representation of historical data lets the user interpret what is occurring over time.
Project Lead: Mher Oganesyan - firstname.lastname@example.org
Graphics Engineer/Lead Programmer: Refugio Arroyo-Martinez - email@example.com
Network/Database: Leonardo Obay - firstname.lastname@example.org
Data Analyst: Anthony Soto - email@example.com
UI/UX: Gilberto Placidon - firstname.lastname@example.org
Shan Malhotra - email@example.com
Michael Rueckert - firstname.lastname@example.org
Natalie Gallegos - Natalie.Gallegos@jpl.nasa.gov
George Chang - email@example.com
Emily Law - firstname.lastname@example.org
|Software Requirements Specification|
|Software Design Document|
|Project Demo Video|