Project Title: Resilient Edge-Cloud Autonomous Learning with Timely Inferences
Project Advisors and Mentors: Professor Anand Sarwate, Professor Waheed Bajwa, Nitya Sathyavageeswaran, Vishakha Ramani, Yu Wu, Aliasghar Mohammadsalehi, Muhammad Zulqarnain
Names: Shreya Venugopaal, Haider Abdelrahman, Tanushree Mehta, Lakshya Gour, Yunhyuk Chang


Objective: The purpose of this project is to use the ORBIT testbed to pursue the main objectives listed below.

Main objectives:
Learn how to design and run experiments on ORBIT
Prototype a handoff scheme for ML prediction
Develop a latency profiling framework for mobile edge computing (MEC)-assisted machine learning

– Split the network at an intermediate layer, add early exits, and simulate multiple CPU speeds and network delays (a sketch of this idea follows below)
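As a concrete illustration of the split/early-exit idea, here is a minimal PyTorch sketch. The layer sizes, the single exit point, and the confidence threshold are assumptions for illustration, not the project's actual architecture.

{{{#!python
# Sketch of split computing with one early exit (illustrative, not the final design).
# The "head" would run on the edge device; if the early exit is not confident
# enough, the intermediate features are passed on to the "tail" on the cloud node.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitModel(nn.Module):
    def __init__(self, num_classes=10, exit_threshold=0.9):
        super().__init__()
        self.head = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.early_exit = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes))
        self.tail = nn.Sequential(
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes),
        )
        self.exit_threshold = exit_threshold

    def forward(self, x):
        features = self.head(x)                      # runs on the edge
        early_logits = self.early_exit(features)
        confidence = F.softmax(early_logits, dim=1).max(dim=1).values
        if bool((confidence >= self.exit_threshold).all()):
            return early_logits, "edge"              # confident enough: exit early
        return self.tail(features), "cloud"          # otherwise offload the rest
}}}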

Progress
Week 1:
Goals:
Team introductions
Kickoff meetings
Refining and understanding research questions
Go over goals for the summer
For next week:
Go through WINLAB orientation (TBD)
Reading for next time:
PPA (Patterns, Predictions, and Actions), Chapters 1 and 2
Coding:
Install Python (via Anaconda or another distribution)
Go through the interactive intro to Python online or by downloading the notebook
Review the basics of Python linked from the UT Austin site to learn basic syntax, flow control, etc.
Week 2:
Goals:
Summary:
Basics of pattern recognition and machine learning (from PPA)
Set up a PyTorch instance on an ORBIT node
Created a node image with PyTorch installed
Learned the basics of PyTorch
Created small machine learning models
Loaded the Modified National Institute of Standards and Technology (MNIST) database onto the node
Computed the second-moment matrix of the data
Ran PCA and trained an SVM on MNIST (see the sketch after this list)
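A minimal sketch of the PCA + SVM step, assuming scikit-learn is available on the node; the number of components and the linear SVM are illustrative choices, not necessarily what was run.

{{{#!python
# Minimal PCA + linear SVM pipeline on MNIST (sketch; assumes scikit-learn).
from sklearn.datasets import fetch_openml
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Project onto the top principal components (eigenvectors of the data's
# second-moment/covariance matrix), then fit a linear SVM on the reduced features.
clf = make_pipeline(StandardScaler(), PCA(n_components=50), LinearSVC(dual=False))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
}}}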
Next Steps:
Create and train a “small” and a “large” neural network
Attempt to compare the difference in their performance at inference time (see the sketch after this list)
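A rough sketch of how the small/large comparison at inference time could be set up in PyTorch; both architectures, the input size, and the batch size are placeholder assumptions.

{{{#!python
# Sketch: compare inference latency of a "small" vs "large" model (assumed
# architectures for illustration; the actual networks may differ).
import time
import torch
import torch.nn as nn

small = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 64), nn.ReLU(), nn.Linear(64, 10))
large = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 10),
)

x = torch.randn(64, 3, 32, 32)  # a batch of CIFAR10-sized inputs
for name, model in [("small", small), ("large", large)]:
    model.eval()
    with torch.no_grad():
        start = time.perf_counter()
        model(x)
        print(f"{name}: {1000 * (time.perf_counter() - start):.1f} ms per batch")
}}}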
Links:
Week 3:
Goals:
Summary:
Created and trained a ‘small’ and a ‘large’ neural network
Compared their performance on the CIFAR10 dataset
Established a connection between two ORBIT nodes
Communicated test data between the nodes to compare the accuracy and delay of our NN models
Still need to serialize the data as bytes instead of transferring it as strings (see the sketch after this list)
Discussed various research papers related to our project
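A minimal sketch of sending a tensor between nodes as raw bytes over a plain TCP socket rather than as a string; the length-prefix framing and the use of torch.save/torch.load here are one possible approach, not necessarily the one the team will adopt.

{{{#!python
# Sketch: send a tensor between nodes as raw bytes over TCP instead of a string.
import io
import struct
import torch

def send_tensor(sock, tensor):
    buf = io.BytesIO()
    torch.save(tensor, buf)                          # serialize to bytes, not str()
    payload = buf.getvalue()
    sock.sendall(struct.pack("!I", len(payload)))    # 4-byte length prefix
    sock.sendall(payload)

def recv_exact(sock, n):
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed")
        data += chunk
    return data

def recv_tensor(sock):
    (length,) = struct.unpack("!I", recv_exact(sock, 4))
    return torch.load(io.BytesIO(recv_exact(sock, length)))
}}}

The length prefix lets the receiver know exactly how many bytes to read before deserializing, which also makes it straightforward to timestamp and time the transfer of each message.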
Next Steps:
Send the data as bytes instead of strings
Calculate the transfer and processing times
Read more papers related to early exit and edge-cloud computing
Divide data into “chunks” for faster and more efficient transmission
Design experiments to test different architectures and implementations of early exiting and split computing
Track and add Age of Information (AoI) metrics (see the sketch after this list)
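One possible way to start tracking Age of Information (AoI) is sketched below; the class name and the sampling approach (averaging instantaneous ages rather than integrating age over time) are assumptions for illustration.

{{{#!python
# Sketch: track Age of Information (AoI) for inference results at the receiver.
# AoI at time t is t minus the generation timestamp of the freshest sample
# whose result has been delivered so far.
import time

class AoITracker:
    def __init__(self):
        self.last_gen_time = None   # generation time of the freshest delivered update
        self.samples = []           # (observation time, instantaneous age) pairs

    def on_delivery(self, generation_time):
        # Call when a prediction for a sample generated at generation_time arrives.
        if self.last_gen_time is None or generation_time > self.last_gen_time:
            self.last_gen_time = generation_time

    def observe(self):
        # Record the instantaneous age; call periodically or on each delivery.
        now = time.monotonic()
        age = now - self.last_gen_time if self.last_gen_time is not None else float("inf")
        self.samples.append((now, age))
        return age

    def average_age(self):
        # Average of the sampled instantaneous ages (ignores pre-first-delivery samples).
        ages = [a for _, a in self.samples if a != float("inf")]
        return sum(ages) / len(ages) if ages else float("nan")
}}}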
Presentation links:
Link - Lakshya
Link - Shreya
Link - Haider
Link - James
Link - Tanushree
Week 4:
Goals:
Summary:
Next Steps: