
nextgenllm

@nextgenllm

Consulting for Big Data and AI

🧠 Why GPT-3 was actually UNDERTRAINED (and how Chinchilla fixed AI forever)

GPT-3: 175B params, 300B tokens
Chinchilla optimal: 3.3T tokens needed (11x more!)

Full breakdown + Python demo 👇
medium.com/nextgenllm/und…

#AI #LLM #MachineLearning #ChatGPT
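
The linked Medium post isn't reproduced here, but the arithmetic behind the tweet can be sketched with the Chinchilla rule of thumb of roughly 20 training tokens per parameter (Hoffmann et al., 2022). This is a minimal sketch assuming that heuristic; the function name chinchilla_optimal_tokens is illustrative, not taken from the post.

    # Sketch of the Chinchilla "compute-optimal" heuristic:
    # ~20 training tokens per model parameter (Hoffmann et al., 2022).
    # Names below are illustrative, not from the linked article.

    def chinchilla_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
        """Approximate compute-optimal training tokens for a model with n_params parameters."""
        return n_params * tokens_per_param

    gpt3_params = 175e9   # GPT-3: 175B parameters
    gpt3_tokens = 300e9   # GPT-3 was trained on ~300B tokens

    optimal = chinchilla_optimal_tokens(gpt3_params)
    print(f"Chinchilla-optimal tokens: {optimal / 1e12:.1f}T")
    print(f"Undertraining factor: {optimal / gpt3_tokens:.1f}x")

With these inputs the heuristic gives about 3.5T tokens, i.e. roughly the 11x shortfall quoted in the tweet.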

