Latent Attention & Training ML Algorithms – Jamar Sullivan


Jamar Sullivan is an incoming freshman at the University of Chicago studying computer science and astrophysics, and a recent graduate of Gwendolyn Brooks College Prep. This summer, he continued working with Prof. Blase Ur in the SUPERGroup Lab to explore how machine learning models' performance differs when they use human-collected versus machine-learned attention. The project built a user interface that asks users to select the words they believe indicate the sentiment of a movie review, and a model that learns indicative words from a movie review dataset on its own. Attention is known to improve the performance of machine learning models, and collecting attention from humans makes it possible to extract more information from a dataset of the same size and to reach high accuracy with a smaller model.
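The idea of supervising a model's attention with human-selected words can be sketched roughly as follows. This is a minimal illustration, not the project's actual implementation: the embeddings, the scoring vector, and the token indices the "user" marked are all made up, and a real trainer would add the attention-supervision term to the usual classification loss and optimize both.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy review of 5 tokens; each row is a hypothetical 4-dim word embedding.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(5, 4))

# Learned attention scoring vector (random here, for illustration only).
w_att = rng.normal(size=4)

# Model attention: one weight per token, summing to 1.
att = softmax(embeddings @ w_att)

# Human attention: suppose the user marked tokens 1 and 3 as
# sentiment-bearing; normalize the mask into a distribution over tokens.
human_mask = np.array([0.0, 1.0, 0.0, 1.0, 0.0])
human_att = human_mask / human_mask.sum()

# Sentence representation: attention-weighted sum of the word embeddings,
# which would feed a sentiment classifier.
sentence_vec = att @ embeddings

# Auxiliary supervision loss: KL divergence from the human attention
# distribution to the model's, added to the classification loss so the
# model learns to attend where people do.
eps = 1e-9
att_loss = np.sum(human_att * np.log((human_att + eps) / (att + eps)))
```

Because the human annotation supplies an extra training signal per example, the same number of labeled reviews carries more information, which is the intuition behind reaching high accuracy with a smaller model.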

This presentation was part of the 2020 CDAC Data & Computing Summer Lab, an immersive 10-week paid summer research program at the University of Chicago. For more on the CDAC Summer Lab, visit https://cdac.uchicago.edu/engage/summerlab/

