Changes between Version 42 and Version 43 of Other/Summer/2023/Features


Timestamp: Jul 19, 2023, 3:38:49 PM
Author: KatieLew

=== Project Description

Neural networks have a long history of being used for classification and, more recently, content generation. Example classifiers include image classifiers that distinguish dogs from cats and text sentiment classifiers. Example generative networks include those for human faces, images, and text. Rather than classification or generation, this work explores using networks for feature analysis. Intuitively, features are the high-level patterns that distinguish data, such as text and images, into different classes. Our goal is to explore bee motion datasets to qualitatively measure the ease or difficulty of reverse-engineering the features found by the neural networks.

=== Week 1

=== Week 3

* **Randomness function:** We programmed a function that allows the user to adjust the degree of randomness of synthetic bee motion along a spectrum, where 0.0 represents a "bee" moving in a completely random motion and 1.0 represents a "bee" moving via a distinct non-random pattern (e.g. a clockwise circle).
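As a minimal sketch of how such a blend might work (the function name, geometry, and parameters below are our own illustrative assumptions, not the project's actual code), each step can interpolate between a random walk and a clockwise circular pattern:

```python
import numpy as np

def trajectory(n_steps, randomness, seed=0, radius=50.0, speed=2.0):
    """Synthetic 'bee' path: randomness=0.0 yields a pure random walk,
    randomness=1.0 yields a deterministic clockwise circle.
    All names and parameters here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    pos = np.array([radius, 0.0])
    path = [pos]
    for t in range(1, n_steps):
        # deterministic component: the next point on a clockwise circle
        angle = -speed * t / radius
        pattern = np.array([radius * np.cos(angle), radius * np.sin(angle)])
        # random component: a uniform random step from the current position
        noise = pos + rng.uniform(-speed, speed, size=2)
        # linear blend between the two behaviors
        pos = randomness * pattern + (1.0 - randomness) * noise
        path.append(pos)
    return np.array(path)
```

At randomness=1.0 every point lies exactly on the circle; at 0.0 the pattern term vanishes and the path is a pure random walk.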
* **Train model:** We used the randomness function to train a machine learning model (AlexNet-adjacent) to detect the difference between the random and non-random behavioral patterns. The model output a confusion matrix and achieved an accuracy of 0.798 in identifying randomness.
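For reference, overall accuracy is just the confusion matrix's diagonal over its total. The matrix below is hypothetical, chosen only to show how a 0.798 accuracy could arise; it is not the model's real output:

```python
import numpy as np

def accuracy_from_confusion(cm):
    """Overall accuracy: correct predictions (diagonal) / all predictions."""
    cm = np.asarray(cm)
    return np.trace(cm) / cm.sum()

# Hypothetical random-vs-non-random counts, chosen only to illustrate
# how a 0.798 accuracy could arise; not the model's actual matrix.
cm = [[400, 100],   # true random:     400 classified right, 100 wrong
      [102, 398]]   # true non-random: 102 classified wrong, 398 right
print(accuracy_from_confusion(cm))  # 0.798
```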
* **Shannon entropy:** We researched Shannon's entropy function as a measure of the model's accuracy and created a program that automates the calculation of the joint entropy of two discrete random variables within the random system (e.g. angle and distance).
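The joint-entropy calculation can be sketched as follows. The histogram-based estimator and names are our assumptions about the approach, not the project's exact program:

```python
import numpy as np

def joint_entropy(x, y, bins=10):
    """Estimate the joint Shannon entropy H(X, Y), in bits, from paired
    samples by binning them into a 2-D histogram."""
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    p = counts / counts.sum()   # empirical joint probabilities
    p = p[p > 0]                # drop empty cells (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

# Example with the two variables named above: angle and distance
rng = np.random.default_rng(0)
angle = rng.uniform(0.0, 2.0 * np.pi, 1000)
distance = rng.uniform(0.0, 1.0, 1000)
print(joint_entropy(angle, distance))
```

A fully deterministic system concentrates all mass in one histogram cell and scores 0 bits, while an independent uniform pair approaches the maximum of log2(bins²) bits.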
[[Image(0.0.mov, width=200, height=200)]] [[Image(0.1.mov, width=200, height=200)]] [[Image(0.3.mov, width=200, height=200)]] [[Image(0.9.mov, width=200, height=200)]]

Simulations generated from the randomness function (from left to right: 0.0, 0.1, 0.3, 0.9) ↑

[[Image(Screen Shot 2023-07-19 at 10.44.07 AM.png, width=400, height=125)]] [[Image(Screen Shot 2023-07-19 at 10.46.33 AM.png, width=400, height=125)]]

Model accuracy from initial training run ↑
=== Week 4

* **Validate results:** We discovered a mistake in our training data, so last week's training results were invalid. A bias in the input data caused the model to learn irrelevant features.

* **Retrain model:** We retrained the machine learning model using simpler test cases, like the black-white frame test. With simple black and white classes, our model obtained 100% accuracy; with more complicated classes, it obtained 98% accuracy.
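A sanity check like the black-white frame test can be sketched roughly as below; the frame size and helper name are illustrative assumptions, not the project's code:

```python
import numpy as np

def make_black_white_frames(n, size=64, seed=0):
    """Trivially separable classes: all-black (0) vs. all-white (255)
    frames. A model that cannot reach ~100% here has a pipeline bug.
    (Illustrative reconstruction of a 'black-white frame test'.)"""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=n)   # 0 = black frame, 1 = white frame
    frames = np.stack([np.full((size, size), 255 * int(y), dtype=np.uint8)
                       for y in labels])
    return frames, labels
```

Because the classes are perfectly separable by mean pixel value, even a trivial classifier should score 100%, which makes this a useful check for data-pipeline bias before training on the real bee motion data.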
* **Reformat tar files**

=== Week 5