A humorous cartoon to initiate a conversation about interpreting a time series plot (e.g. discussing trend versus random components). The cartoon was drawn by American cartoonist Jon Carter in 2014.
A humorous cartoon to initiate a conversation about reasons for low response rates. The cartoon was drawn by American cartoonist Jon Carter in 2013.
A humorous cartoon to initiate a conversation about the importance of using graphics for a purpose in order to show important features of data and not just to add sizzle. The cartoon was drawn by American cartoonist Jon Carter in 2015.
A humorous cartoon to initiate a conversation about how graphs are an efficient "language" for describing data. The cartoon was drawn by American cartoonist Jon Carter in 2015.
A humorous cartoon to initiate a conversation about censored data situations such as those seen with survival data. The cartoon was drawn by American cartoonist Jon Carter in 2015.
A song designed to assist in teaching the basics of Multi-Armed Bandits (MABs), a class of machine learning algorithms that forms the foundation for many recommender systems. These algorithms spend part of their time exploiting choices (arms) that they know are good while exploring new choices. The song (music and lyrics) was written in 2021 by Cynthia Rudin from Duke University and was part of a set of three data-science-oriented songs that won the grand prize in the 2023 A-mu-sing competition. The lyrics are full of double entendres, so the whole song has another meaning in which the bandit could be someone who just takes advantage of other people! The composer mentions these examples of lines with important meanings:
"explore/exploit" - the fundamental topic in MAB!
"No regrets" - the job of the bandit is to minimize the regret throughout the game for choosing a suboptimal arm
"I keep score" - I keep track of the regrets for all the turns in the game
"without thinking too hard," - MAB algorithms typically don't require much computation
"no context, there to use," - This particular bandit isn't a contextual bandit, it doesn't have feature vectors
"uncertainty drove this ride." - rewards are probabilistic
"I always win my game" - asymptotically the bandit always finds the best arm
"help you, decide without the AB testing you might do" - Bandits are an alternative to massive AB testing of all pairs of arms
"Never, keeping anyone, always looking around and around" - There's always some probability of exploration throughout the play of the bandit algorithm
A music video designed to assist in teaching the basics of Multi-Armed Bandits, a class of machine learning algorithms that forms the foundation for many recommender systems. These algorithms spend part of their time exploiting choices (arms) that they know are good while exploring new choices (think of an ad company choosing an advertisement they know is good, versus exploring how good a new advertisement is). The music and lyrics were written by Cynthia Rudin of Duke University, and the song was one of three data science songs that won the grand prize and first place in the song category in the 2023 A-mu-sing competition.
The lyrics are full of double entendres so that the whole song has another meaning where the bandit could be someone who just takes advantage of other people! The author provides these examples of some lines with important meanings:
"explore/exploit" - the fundamental topic in MAB!
"No regrets" - the job of the bandit is to minimize the regret throughout the game for choosing a suboptimal arm
"I keep score" - I keep track of the regrets for all the turns in the game
"without thinking too hard," - MAB algorithms typically don't require much computation
"no context, there to use," - This particular bandit isn't a contextual bandit, it doesn't have feature vectors
"uncertainty drove this ride." - rewards are probabilistic
"I always win my game" - asymptotically the bandit always finds the best arm
"help you, decide without the AB testing you might do" - Bandits are an alternative to massive AB testing of all pairs of arms
"Never, keeping anyone, always looking around and around" - There's always some probability of exploration throughout the play of the bandit algorithm
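The explore/exploit behavior and regret bookkeeping described in the lyric annotations above can be sketched in code. The following is a minimal illustration of one standard MAB strategy (epsilon-greedy); it is not taken from the song or video, and the three arm reward probabilities are made up for the demo.

```python
import random

def epsilon_greedy(true_means, n_rounds=5000, epsilon=0.1, seed=0):
    """With probability epsilon explore a random arm; otherwise exploit
    the arm with the best empirical mean reward seen so far."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k      # number of pulls of each arm
    values = [0.0] * k    # empirical mean reward of each arm
    for _ in range(n_rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                       # explore
        else:
            arm = max(range(k), key=values.__getitem__)  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0  # Bernoulli reward
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]      # running mean
    return counts, values

# Three arms with hypothetical success probabilities; the last arm is best.
counts, values = epsilon_greedy([0.2, 0.5, 0.8])
print(counts)  # the best arm should receive the large majority of pulls
```

Because some probability of exploration remains on every round ("always looking around and around"), the suboptimal arms keep getting an occasional pull even after the best arm is identified.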
This song is about overfitting, a central concept in machine learning. It is in the style of mountain music and, when listening, one should think about someone staying up all night trying to get their algorithm to work, but it just won't stop overfitting! The music and lyrics are by Cynthia Rudin from Duke University, and the song was one of three data science songs by Dr. Rudin that won the grand prize and first place in the song category in the 2023 A-mu-sing competition.
This song is about the k-nearest neighbors algorithm in machine learning. This popular algorithm uses case-based reasoning to make a prediction for a current observation based on nearby observations. The music and lyrics were written by Cynthia Rudin from Duke University, who was accompanied by Dargan Frierson of the University of Washington on the audio recording. The song is one of three data science songs written by Cynthia Rudin that won the grand prize and first place in the song category in the 2023 A-mu-sing competition.
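The case-based reasoning the entry describes, predicting for a new observation from its nearby observations, can be sketched with stdlib Python only. This is an illustrative sketch of the general k-NN idea, not material from the song; the two-cluster training data are made up.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; distance is Euclidean.
    """
    neighbors = sorted(train, key=lambda pl: math.dist(pl[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Two made-up clusters: "A" near (1, 1) and "B" near (6, 6).
train = [((1, 1), "A"), ((1, 2), "A"), ((2, 1), "A"),
         ((6, 6), "B"), ((6, 7), "B"), ((7, 6), "B")]
print(knn_predict(train, (1.5, 1.5)))  # -> A
```

A query point near the first cluster is outvoted by its "A" neighbors; one near the second cluster gets "B".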
A cartoon to spark a discussion about the normal equations in the matrix approach to linear models. The cartoon was created by Kylie Lynch, a student at the University of Virginia. The cartoon won first place in the non-song categories of the 2023 A-mu-sing competition.
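For discussion alongside the cartoon, the normal equations in the matrix approach, (XᵀX)β = Xᵀy, can be solved directly in a few lines. This is an illustrative sketch with made-up data (an exact straight line, so the solution recovers the true intercept and slope), not anything drawn from the cartoon itself.

```python
import numpy as np

# Data lying exactly on the line y = 2 + 3x (values made up for the demo).
x = np.arange(10, dtype=float)
y = 2.0 + 3.0 * x

X = np.column_stack([np.ones_like(x), x])  # design matrix: intercept column, then x
beta = np.linalg.solve(X.T @ X, X.T @ y)   # solve the normal equations (X^T X) b = X^T y
print(beta)  # -> approximately [2. 3.]
```

In practice one would use a QR- or SVD-based solver such as `np.linalg.lstsq` for numerical stability, but forming XᵀX makes the connection to the normal equations explicit for teaching.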