BERT: one NLP model to rule them all
Fully Connected – a series where Chris and Daniel keep you up to date with everything that’s happening in the AI community.
This week we discuss BERT, a new method from Google for pre-training language representations for natural language processing (NLP) tasks. Then we tackle Facebook’s Horizon, the first open source reinforcement learning platform for large-scale products and services. We also touch on synthetic data and suggest a few learning resources.