
Comment by PaulDavisThe1st

4 years ago

> I like the idea of one "base model" that represents language and many different context models on top of it (finetuning the core model)

This is an entirely different conception of computer language from the one behind current GPT-style models. These systems don't "represent language", and cannot. The whole reason GPT is so exciting right now is that it fundamentally threw away the entire concept of "representing language". That has some upsides ... and some downsides.