Machine Learning: The Great Stagnation
This piece by Mark Saroufim on the state of ML starts pretty salty:
> Graduate Student Descent is one of the most reliable ways of getting state-of-the-art performance in Machine Learning today, and it's also fully parallelizable over as many graduate students or employees as your lab has. Armed with Graduate Student Descent you are more likely to get published or promoted than if you took on uncertain projects.
>
> BERT engineer is now a full-time job. Qualifications include:
>
> - Some bash scripting
> - Deep knowledge of pip (starting a new environment is the suckier version of practicing scales)
> - Waiting for new HuggingFace models to be released
> - Watching Yannic Kilcher's new Transformer paper the day it comes out
> - Repeating what Yannic said at your team reading group
>
> It's kind of like DevOps but you get paid more.
But if you make it through (or maybe even enjoy) the lamentations and ranting, you'll find some hope and optimism around specific projects that the author believes are pushing the industry past its Great Stagnation.
I learned a few things. Maybe you will too.