Learn how researchers are pairing autonomous vehicles with Chameleon, bridging edge-to-cloud computation to conduct marine surveys. Featuring work presented at the 2021 Supercomputing conference, with a notebook available on Trovi that you can reproduce yourself, and a YouTube video to accompany it!
What is the central challenge your experiment investigates?
The oceans cover more than seventy percent of the Earth's surface, and marine ecosystems are central to many global challenges. However, monitoring the environment and ensuring the sustainable use of marine resources presents unique challenges that span spatial and temporal scales, including socio-economic trade-offs, technology development, and dynamic ecological processes. Advances in computer vision, artificial intelligence, and cloud computing allow marine scientists to collect and process data in greater volumes than ever before, and they offer promising solutions for improving the efficiency of real-time data acquisition and analysis to address complex biological and ecological questions.
We aim to better understand the Biscayne Bay ecosystem by leveraging the capabilities of autonomous vehicles (AVs) to improve survey efficiency. To do this, we develop metrics of fish abundance and map fish distributions across important habitats in Biscayne Bay in real time. Accomplishing this goal requires answering two questions.
1. What is the best strategy for collecting and analyzing data from autonomous vehicles and can we leverage cloud computing resources to improve access to data products in real-time?
2. How does the resolution of video data and quality of network connection influence which strategy is best?
How is your research addressing this challenge?
The challenge in this work was to compare configurations in which the vehicle is equipped with CHI@Edge against configurations that perform similar computations in the cloud, and to determine which has the lowest latency and the best cost-benefit ratio.
We measured response time in each configuration, defined as the time from video frame capture until the result is returned from the AI model. We found that in the cloud configuration, response time was dominated not by runtime (the time it takes the model to process a frame) but by transfer time (the time it takes for video to be uploaded from the IoT device to Chameleon Cloud and the result returned). For example, the RTX 6000 could evaluate a frame in 25 ms, but it took 235 ms to transfer the data, for a total response time of 260 ms.
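The response-time breakdown above can be sketched as a simple calculation. The 25 ms and 235 ms figures come from our RTX 6000 measurement; the function and variable names are illustrative, not part of the experiment code:

```python
def total_response_time(inference_ms, transfer_ms):
    """Total response time for one frame: model runtime plus
    round-trip transfer time between the IoT device and the cloud."""
    return inference_ms + transfer_ms

# Cloud configuration with an RTX 6000: inference is fast,
# but data transfer dominates the end-to-end latency.
rtx6000_total = total_response_time(inference_ms=25, transfer_ms=235)
print(rtx6000_total)  # 260 ms; roughly 90% of that is transfer time
```

This is why an on-board edge device can compete with a much faster cloud GPU: it pays no transfer cost, so its total response time is just its (slower) inference time.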
How do you structure your experiment on Chameleon?
For this experiment, we used Chameleon's CHI@Edge and provisioned cloud resources on Chameleon. We connected the edge devices to GPU nodes on the Chameleon Cloud, which allowed us to answer the questions posed in the introduction. The experiments were implemented via a Jupyter notebook that provisioned the resources and ran the experiments.
The price of each GPU in dollars, used to calculate the price/performance ratio, was collected in August 2021. FPS is the reciprocal of response time: how many frames can be processed in one second. By this measure, the Jetson Nano is almost 19 times cheaper per FPS than the Edge/Cloud (V100) configuration, and the cheaper RTX 6000 outperforms the more expensive V100.
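The price/FPS metric can be sketched as follows. The dollar figures below are placeholder values for illustration only, not the actual August 2021 prices used in the study:

```python
def price_per_fps(price_usd, response_time_ms):
    """Cost-efficiency metric: dollars per frame-per-second of throughput.
    FPS is the reciprocal of the response time (converted to seconds)."""
    fps = 1000.0 / response_time_ms
    return price_usd / fps

# Placeholder prices and response times for illustration (assumed, not measured):
jetson_cost = price_per_fps(price_usd=99, response_time_ms=450)
v100_cost = price_per_fps(price_usd=9000, response_time_ms=300)
print(jetson_cost < v100_cost)  # True: the cheap edge device wins on price/FPS
```

Even though the cloud GPU processes frames faster, its much higher purchase price can make the inexpensive edge device far more cost-efficient per frame of throughput, which is the pattern the study observed.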
Can you point us to artifacts connected with your experiment that would be of interest to our readership?
SC21 Poster presentation: https://youtu.be/jpFun2AbxoY
To reproduce the experiment: https://www.chameleoncloud.org/experiment/share/58
YouTube video from September 2021 CHI@Edge Workshop: https://www.youtube.com/watch?v=YJ46wOgGC40&t=4s
About the Authors
My name is Jonathan Tsen, and I'm currently working as a data scientist at a startup in Brazil. I am interested in specializing in Artificial Intelligence, and to accomplish this, I have just started a Master's degree at the Aeronautical Technological Institute (ITA) in São José dos Campos, Brazil. I hope in the future to work as a data scientist at a company in the US. My hobbies include traveling and playing soccer.
Dr. Leonardo Bobadilla (FIU) is currently an Assistant Professor at the Knight Foundation School of Computing and Information Sciences at Florida International University. He received his Ph.D. degree in Computer Science from the University of Illinois at Urbana-Champaign. He has received several awards and has published 37 peer-reviewed journal articles and conference papers. His research articles have appeared in prestigious journals such as IEEE Journal of Automation Science and Engineering, IEEE Robotics and Automation Letters, and ACM Transactions on Sensor Networks. His research has been sponsored by the Army Research Office, Department of Homeland Security, NSF, and the Ware Foundation.
Kevin Boswell is a marine ecologist. His research program broadly focuses on the interacting factors that mediate the distributional patterns, behavior, habitat use, energetics and natural ecology of coastal and oceanic animals, including the implications of ecosystem variability, particularly for rapidly changing environments. To address many of these interests, his lab integrates advanced sampling techniques, such as active and passive underwater acoustics, with observations from autonomous aerial and aquatic platforms to collect high-resolution data to simultaneously describe spatial and temporal patterns of interest, ranging from individual-level interactions to broad ecosystem dynamics.
Are there any researchers you admire? Can you describe why?
I admire Jason Anderson because he is an excellent mentor, and he took me in during my internship at the University of Chicago.
What's your most powerful piece of advice for students beginning research or finding a new research project?
Start with Wikipedia, but don't end there.
Have a research question in mind.
Deal with one piece at a time.
Use a system.
Know your resources.
Ask for help.
Carry an idea book.
Bring it up to date.