Have you guys heard about GPT-3?
If you haven't, here it is:
GPT-3 is an unsupervised "transformer language model" and the successor to GPT-2.
OpenAI (the artificial intelligence research lab behind GPT-3; yeah, one of its co-founders was, obviously, Elon Musk) stated that the full version of GPT-3 contains 175 billion parameters.
Just to give you an idea of the scale of the beast, check this comparison:
OpenAI's "GPT-2" - 1.5 billion parameters
Google T5 - 11.0 billion parameters
Turing-NLG - 17.0 billion parameters
GPT-3 - 175.0 billion parameters
Check out some of the GPT-3 applications here: https://gpt3examples.com/#examples
There are tons of YouTube videos explaining GPT-3, its uses, implications, etc.
I was wondering: what if the AI was trained on Lua? Then we could just use plain English to make recipes.
I know, I'm kind of dreaming here... but hey, who knows?
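Just to paint the picture (this is purely made up, every name and field here is hypothetical), the dream would be typing something like "a recipe that turns two iron plates into a gear" and getting back a Lua table along these lines:

```lua
-- Hypothetical output: a Lua recipe table the AI might generate
-- from plain English. All names/fields are invented for illustration.
local recipe = {
  name = "iron-gear",
  ingredients = {
    { "iron-plate", 2 },  -- item name, amount
  },
  result = "iron-gear",
  energy_required = 0.5,  -- crafting time in seconds
}
return recipe
```

English in, working recipe out. Again, totally speculative on my part.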