Project Abstract:
Jet Propulsion Laboratory (JPL) is world famous for its contributions to space exploration, but it also does significant work in Earth science. Much of the Earth science data that JPL has access to is made available on the web and through REST API calls. One example is JPL's Watertrek system, which tracks water data such as well locations and levels, river flow, snowpack, reservoir levels, and aquifer boundaries. Sponsored by JPL, this project aims to build a general framework for Augmented Reality (AR) mobile applications that visualize JPL's Watertrek data. The app and framework are prototyped for Android for this project; JPL expects to add support for iOS and the DJI drone platform in the future.
The developed framework provides the ability to display data using Augmented Reality (AR) techniques: a user points their smartphone at their surroundings and sees, on the screen, those surroundings with superimposed data and graphics that relate to them. To accomplish this, the framework is able to (1) collect the appropriate scientific data to be displayed for the surrounding area (from JPL's Watertrek or other sources), (2) generate/load graphics items to be displayed on top of the surrounding area (such as text, 2D images, and 3D meshes), (3) detect the physical location of key features in the surrounding area with respect to the phone, (4) keep the graphics items aligned with those key features even when the device is moved or rotated, and (5) allow the user to interact with the graphics items or the surrounding terrain (by tapping, scrolling, etc.). All components of this framework were built from scratch.
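To make the division of responsibilities more concrete, below is a minimal sketch in Java of how these five capabilities could be expressed as framework interfaces. All class, interface, and method names here are illustrative assumptions, not the framework's actual API.

import java.util.List;

public class ArFrameworkSketch {

    // Simple value types standing in for the framework's real data structures.
    static class DataPoint { double lat, lon, alt; String label; }
    static class Renderable { DataPoint source; }              // text, 2D image, or 3D mesh
    static class ScreenPosition { float x, y; boolean visible; }

    interface DataProvider {        // (1) collect scientific data for the surrounding area
        List<DataPoint> fetchNearby(double lat, double lon, double radiusMeters);
    }
    interface GraphicsFactory {     // (2) generate/load a graphics item for one data point
        Renderable create(DataPoint point);
    }
    interface PoseTracker {         // (3) + (4) map a world coordinate to screen space
        ScreenPosition project(double lat, double lon, double alt);
    }
    interface InteractionHandler {  // (5) respond to taps/scrolls on a graphics item
        void onTap(Renderable item);
    }

    // One frame of the pipeline: fetch data, build graphics, project each onto the screen.
    static void renderFrame(DataProvider data, GraphicsFactory gfx, PoseTracker pose,
                            double lat, double lon) {
        for (DataPoint p : data.fetchNearby(lat, lon, 5000)) {
            Renderable r = gfx.create(p);
            ScreenPosition s = pose.project(p.lat, p.lon, p.alt);
            if (s.visible) {
                // draw r at (s.x, s.y) on top of the camera preview
            }
        }
    }
}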
To showcase the features of the developed framework and test its capabilities in the real world, the team created an application that exercises every module that currently has functionality. The resulting application receives real-time data from JPL's REST API, displays it on a 2D map (Google Maps), and at the same time superimposes 3D objects on the camera view, fulfilling the Augmented Reality component of the project.
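As a rough illustration of the data path in the showcase app, the sketch below fetches JSON from a Watertrek-style REST endpoint and drops one marker per well on a GoogleMap. The endpoint URL and the JSON field names ("lat", "lon", "name", "level") are placeholders, not the actual Watertrek schema.

import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.MarkerOptions;
import org.json.JSONArray;
import org.json.JSONObject;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class WellMarkerLoader {

    // Hypothetical endpoint; substitute the actual Watertrek REST URL.
    private static final String WELLS_URL = "https://example.jpl.nasa.gov/watertrek/wells";

    // Fetch the JSON payload (must run off the UI thread, e.g., on a background executor).
    static String fetchJson(String urlString) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            return body.toString();
        } finally {
            conn.disconnect();
        }
    }

    // Parse the response and add one marker per well (field names are assumptions).
    static void addWellMarkers(GoogleMap map, String json) throws Exception {
        JSONArray wells = new JSONArray(json);
        for (int i = 0; i < wells.length(); i++) {
            JSONObject well = wells.getJSONObject(i);
            LatLng position = new LatLng(well.getDouble("lat"), well.getDouble("lon"));
            map.addMarker(new MarkerOptions()
                    .position(position)
                    .title(well.optString("name", "Well"))
                    .snippet("Water level: " + well.optDouble("level", 0) + " ft"));
        }
    }
}

Note that the network call belongs off the UI thread, while addMarker must be invoked back on the UI thread; the threading glue is omitted here for brevity.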
This AR system can make working with scientific data more engaging, understandable, and interesting, and has the potential to increase interest in science and accelerate discovery.
Team Members:
Project Lead: Wilbert Veit - wilbertveit@rocketmail.com
Database: Kaicheng Zhou - kaichengzhou1@gmail.com
Network: Ernesto Padilla - miro1380@gmail.com
UI/UX: Christopher Nguyen - christopher_ng@aim.com
Framework: Cuong Pham - billvpham@gmail.com
Liaisons:
Shan Malhotra - shantanu.malhotra@jpl.nasa.gov
Emily Law - emily.s.law@jpl.nasa.gov
Michael Rueckert - michael.a.rueckert@jpl.nasa.gov
George Chang - george.w.chang@jpl.nasa.gov
Meeting Days / Time:
Members only: Tuesdays 4:30pm - 7:00pm
Thursdays 10:30am - 1:30pm, 4:30pm - 5:30pm
With advisor: Tuesdays 3:30pm - 4:30pm
With liaisons: Thursdays 3:30pm - 4:30pm
Useful links:
Original Project Description: https://csns.calstatela.edu/wiki/content/JPL-1/Hydrology/Proposal
Wiki Page: https://csns.calstatela.edu/wiki/content/JPL-1/Hydrology
Project Presentation: https://drive.google.com/open?id=1HrNhtybcGaqYVTzKRXVOHrPKsoKJXl2U
Requirements Document: https://drive.google.com/open?id=1sEQLQ_9PwnC0cDPbEBnxJKCzhRw56V7j
Demo Videos: https://drive.google.com/drive/folders/1WFTIQha6G7VoNhvzTlswLHnXOjMOZlgg?usp=sharing