Improving Language Understanding with Unsupervised Learning: We've obtained state-of-the-art results on a suite of diverse language tasks with a scalable, task-agnostic system, which we're also releasing. Our approach is a combination of two existing ideas: transformers and unsupervised pre-training. These results provide a convincing example that pairing supervised learning methods with unsupervised pre-training works very well.
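The two-stage recipe described above can be sketched in miniature. The toy below is purely illustrative and is not OpenAI's actual system (which uses a transformer language model): stage one learns word statistics from unlabeled text (standing in for unsupervised pre-training), and stage two reuses those statistics as features for a small supervised classifier (standing in for supervised fine-tuning). All data, function names, and the threshold rule are hypothetical.

```python
from collections import Counter

def pretrain(corpus):
    """Unsupervised stage: estimate word probabilities from unlabeled text
    (a stand-in for pre-training a language model)."""
    counts = Counter()
    for doc in corpus:
        counts.update(doc.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def featurize(doc, lm):
    """Represent a document by its average word probability under the
    pre-trained statistics (unseen words score 0)."""
    words = doc.lower().split()
    return sum(lm.get(w, 0.0) for w in words) / max(len(words), 1)

# Unlabeled corpus for pre-training (hypothetical data).
unlabeled = ["good great good fine", "great good nice", "bad awful"]
lm = pretrain(unlabeled)

# Supervised stage: a trivial threshold classifier fit on labeled examples.
labeled = [("good great", 1), ("awful bad", 0)]
threshold = sum(featurize(d, lm) for d, _ in labeled) / len(labeled)

def predict(doc):
    return 1 if featurize(doc, lm) > threshold else 0
```

The point of the sketch is the division of labor, not the model: the expensive, general stage consumes unlabeled text, and the cheap, task-specific stage adapts the result to a labeled task, which is why the system stays task-agnostic.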
Source: blog.openai.com