Category – User Experiments

A Statistics-Based Performance Testing Methodology for Cloud Applications

University of Texas at San Antonio Professor Wei Wang and PhD student Sen He investigate performance testing for cloud computing to help make your research more efficient and cost-effective. Learn about their research, which won an ACM SIGSOFT Distinguished Paper Award in 2019, their experience on Chameleon and AWS, and their life philosophies.

Chameleon for Education: IIT’s Intro to Parallel Programming

Interested in using Chameleon for education? Illinois Institute of Technology TA and PhD candidate Melanie Cornelius and Dr. Zhiling Lan use Chameleon with undergraduate and graduate students in their Intro to Parallel Programming and Parallel and Distributed Processing classes. Learn how the courses are structured, how Chameleon is incorporated into assignments, and tips for using Chameleon in the classroom.

Network Traffic Fingerprinting of IoT Devices

This blog features Stevens Institute of Technology PhD candidate Batyr Charyyev's research on using network traffic fingerprinting of IoT devices for device identification, anomaly detection, and user interaction identification. Learn more about Charyyev and his research, including its application to inferring voice commands issued to smart home speakers.

Profile-Guided Frequency Scaling for Latency-Critical Search Workloads

Interested in learning how to save power in computer systems, especially for latency-critical applications? Learn how profile-guided frequency scaling can help solve this problem, with research supervised by Assistant Professor Vinicius Petrucci at the University of Pittsburgh and presented last month at IEEE/ACM CCGrid!

High-Performance Federated Learning Systems

This work is part of George Mason University PhD student Zheng Chai and Prof. Yue Cheng's research on solving federated learning (FL) bottlenecks for edge devices. Learn more about the authors, their research, and their novel FL training system, FedAT, which already shows impressive results, improving prediction performance by up to 21.09% and reducing communication cost by up to 8.5 times compared to state-of-the-art FL systems.

Fluid: Resource-Aware Hyperparameter Tuning Engine

This blog feature explores fourth-year University of Michigan PhD student Peifeng Yu's research on hyperparameter tuning, presented earlier this month at MLSys21. Learn more about Yu, the hyperparameter tuning engine, and how it can improve your deep learning model training process.

Automated Calibration of CyberInfrastructure Simulations Based on Real-World Chameleon Executions

Learn about using Chameleon to develop automated calibration for cyberinfrastructure research as part of WRENCH research team member William Koch's Master's thesis. In this blog post, Koch, an M.S. student at the University of Hawai`i at Manoa (UHM), discusses cyberinfrastructure research, this project's approach, and his research background.

Using AI to Direct Traffic: Building Self-Learning Networks on Chameleon

Dr. Mariam Kiran is a research scientist in the Scientific Networking Division at ESnet, LBNL, where she is a member of the Prototypes and Testbed group and leads research efforts applying AI to operational network research and engineering problems. In this blog, she discusses her research project DAPHNE (Deep and Autonomous High-speed Networks), her use of Chameleon, and her research background.

Biometric Research in The Cloud

January’s User Experiments blog features Keivan Bahmani, a PhD candidate at Clarkson University. Learn more about Bahmani and his use of Chameleon for biometric research.

Performance Analysis of Deep Learning Workloads Using Roofline Trajectories on Chameleon

Dr. Xiaoyi Lu is a research assistant professor at The Ohio State University, focusing on high-performance interconnects and protocols, big data computing, deep learning, parallel computing, virtualization, and cloud computing. In this blog post, we explore his research and his usage of Chameleon Cloud.