InfoQ

AI training method exceeds GPT-3 performance with 99.9% fewer parameters

A team of scientists at LMU Munich has developed Pattern-Exploiting Training (PET), a deep-learning training technique for natural language processing (NLP) models. Using PET, the team trained a Transformer NLP model with 223M parameters that outperformed the 175B-parameter GPT-3 by over 3 percentage points on the SuperGLUE benchmark.
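PET's core idea is to reformulate a classification task as a cloze-style "fill in the blank" problem that a pretrained masked language model can answer directly. The sketch below illustrates that reformulation only; the `pattern`, `VERBALIZER`, and `fake_mlm` names are illustrative stand-ins, not PET's actual implementation, and the stub scorer takes the place of a real masked language model.

```python
MASK = "[MASK]"

def pattern(review: str) -> str:
    # A "pattern" converts an input into a cloze sentence containing one mask.
    return f"{review} It was {MASK}."

# A "verbalizer" maps each class label to a single token the model can predict.
VERBALIZER = {"positive": "great", "negative": "terrible"}

def classify(review: str, mask_token_scores) -> str:
    # mask_token_scores: a callable that, given a cloze sentence, returns a
    # dict of token -> score for the masked position (stubbed below).
    scores = mask_token_scores(pattern(review))
    return max(VERBALIZER, key=lambda label: scores.get(VERABLIZER[label], 0.0)
               if False else scores.get(VERBALIZER[label], 0.0))

# Stub scorer for illustration: always favors "great".
def fake_mlm(cloze: str) -> dict:
    return {"great": 0.9, "terrible": 0.1}

print(classify("The movie was wonderful.", fake_mlm))  # prints "positive"
```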
