ML Limericks
@MLimericks
Limericks about machine learning. Originally by @fhuszar
Those were the days, my good friend
We wrote gradients down by hand
They weren't too complex
Our losses were convex
You're too young, you won't understand

Yann LeCun was ready to go
With Galactica, a deep net show
But then came the recall
Of the feature so small
It was an AI flop, as we all know

The conference of NeurIPS has come
Our tests and models just begun
The researchers and nerds
Have come from all worlds
To show off their work, one by one.

About ML, sadly, he hasn't a clue
Oh, how I hate this man, reviewer two
The review had no merit
I don't think he read it
I wish he would get sick or fall into poo

There's no way around it, I have to hereby
Declare that my ResNet is pre-AGI
It costs a Ferrari
To train on Atari
It can't put my socks on or pour me a chai

I talked to my good friends at Microsoft
We needed some money to refurb the loft
Satya said "sure,
But please use Azure"
A billion dollars will keep us aloft

Thank you for joining us at ICML
This year we're promising a whole new level
Of deep nets and GANs
Muslim travel bans
And more west-coast tech bros than ever

Is pytorch a knock-off of chainer?
Laments the keras maintainer.
Maybe, but if so
Keras is also
Argues the chief convnet-trainer
PyTorch autograd was inspired by Chainer. Keras was modeled on Torch7 (transcribed from Lua to Python). Torch7 was very much inspired by Lush (transcribed from Lisp to Lua). But every time, some new twists are added.
How do I order a tarte à citron?
"Lemon tart" responds the Translatotron
End-to-end speech-to-speech
Is not out of reach
If you have data and wealthy patrons

Why are you crying my dear,
Shouldn't you be at ICLR?
They rejected my visa
'Cause I am from Giza.
It's coming to Africa next year!

Elegant dactylic trimeter
Using a single parameter?!
You might say "no way"
But look at it this way:
Overfitting's just got prettier
A common misconception is that the risk of overfitting increases with the number of parameters in the model. In reality, a single parameter suffices to fit most datasets: arxiv.org/abs/1904.12320 Implementation available at: github.com/Ranlot/single-…
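The paper quoted above makes the point that raw parameter count says little about overfitting: one real number, given enough precision, can simply memorize a dataset. Below is a minimal sketch of that intuition; it is a bit-packing toy of my own (the names BITS, encode, and decode are made up here), not the sin²-based construction the paper actually uses.

```python
# Toy illustration: "fit" a whole dataset with a single scalar by packing
# quantized targets into its binary expansion, then reading them back out.
# Not the construction of arxiv.org/abs/1904.12320 -- just the same spirit.

BITS = 12  # bits of precision per value; 4 values * 12 bits still fits exactly in a float64

def encode(ys):
    """Pack values in [0, 1) into one scalar alpha in [0, 1)."""
    packed = 0
    for y in ys:
        packed = (packed << BITS) | int(y * (1 << BITS))
    return packed / (1 << (BITS * len(ys)))  # the single "parameter"

def decode(alpha, n):
    """Recover the n packed values from alpha, up to quantization error."""
    packed = round(alpha * (1 << (BITS * n)))
    mask = (1 << BITS) - 1
    return [((packed >> (BITS * (n - 1 - i))) & mask) / (1 << BITS) for i in range(n)]

data = [0.125, 0.9, 0.333, 0.77]
alpha = encode(data)                    # one float "fits" the dataset
print(alpha, decode(alpha, len(data)))  # targets come back to ~3 decimal places
```

The single parameter has capacity only because of its precision, which is exactly the limerick's joke: counting parameters is not the same as measuring complexity.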
Look at this stuff: it's a StyleGAN
the output looks just like a floor plan
A flat without bedroom
With seven feet headroom
And living room shaped like a saucepan
A series of blog posts on applying machine learning to architecture.
Experiments: bit.ly/2XDhjtJ
Background: bit.ly/2DIeWh4
He erects no paywall, he charges no fee
Andrej is sharing his wisdom for free
The number you all want,
The Karpathy constant:
Ten to the minus fourth power by three
New blog post: "A Recipe for Training Neural Networks" karpathy.github.io/2019/04/25/rec… a collection of attempted advice for training neural nets with a focus on how to structure that process over time
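For readers outside the in-joke: the "Karpathy constant" in the limerick above is the learning rate 3e-4 ("ten to the minus fourth power by three"), which Karpathy once quipped was the best default for Adam. A minimal PyTorch sketch, with a throwaway model and data that are purely illustrative:

```python
import torch
import torch.nn as nn

# Toy model, only there so we have parameters to optimize.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

# The "Karpathy constant": 3e-4 as the default Adam learning rate.
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)

x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Whether 3e-4 is actually right for your problem is, of course, part of the joke; the linked recipe post is about all the process around such choices.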
It took a researcher with some gumption
to revisit deep energy functions.
Try to normalize
and you'll bleed out your eyes;
It's intractable (by construction).
Progress towards stable and scalable training of energy-based models: 💻Blog: openai.com/blog/energy-ba… 📝Paper: s3-us-west-2.amazonaws.com/openai-assets/… 🔤Code: sites.google.com/view/igebm
His thesis was done in 2013
His online trail is not so clean
Though his account has no name
His e-mail's not the same
And the e-mail is easy to glean

Remember vacuous bounds do not spark joy
Nobody would agree more than Dan Roy
He doesn't get sarcasm
Dislikes the huge chasm
Between the Bayes risk and bounds you employ
Will people stop citing generalization bounds that are vacuous!!! There is no such thing as a tighter vacuous generalization bound. If it is vacuous then the value is 1!!
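To unpack the quoted complaint: with 0-1 loss the true risk never exceeds 1, so a generalization bound only says something once its right-hand side drops below 1; "tightening" a bound from, say, 7.3 to 4.1 certifies nothing new. In symbols (notation mine, the standard PAC-style form):

```latex
% With probability at least 1 - \delta over the training sample,
R(h) \;\le\; \widehat{R}(h) + \varepsilon(n, \delta).
% For 0-1 loss, R(h) \le 1 holds trivially, so the statement is informative
% only when \widehat{R}(h) + \varepsilon(n, \delta) < 1; if the right-hand
% side is \ge 1, the bound is vacuous no matter how much it shrinks above 1.
```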
Pearl told us ML was curve fitting
This led to lots of folks admitting
They may have to buy
Hashtag-Bookofwhy
Best-selling book for the unwitting
Most economists understand that curve fitting is not causal inference, so they excuse themselves with "we assume ignorability," which makes them feel less guilty. In contrast, most machine-learning researchers do not see a reason to apologize. #Bookofwhy #econbookclub
.@MLimericks Since your tweets I have read with delight,
To respond I thought that I might.
But I've not got the time
To sit down and rhyme.
Have you not got a thesis to write?