LLM Parameters
· ☕ 8 min read · 🤖 Naresh Mehta

When we start learning about Large Language Models (LLMs), it is natural to become curious about how factors such as parameter count, training data size, context size, and tokenization affect a model's performance. It is equally interesting to see how existing models in the wild, both open and closed source, choose these parameters, and what their respective strengths and weaknesses are. Comparing the training data sizes of such models is also useful, because it lets one estimate how many resources a comparable model would need in order to be trained from scratch.
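As a rough illustration of how parameter count translates into resource needs, here is a back-of-the-envelope sketch. The bytes-per-parameter figures below are common rules of thumb, not exact measurements, and they ignore activations, batch size, and framework overhead:

```python
# Rough memory estimate for an LLM from its parameter count.
# Assumed rules of thumb (not exact figures):
#   - inference in fp16: ~2 bytes per parameter
#   - training with Adam in mixed precision: ~16 bytes per parameter
#     (fp16 weights + fp16 gradients + fp32 master weights + two fp32
#     Adam moment buffers), ignoring activations and overhead

def estimate_memory_gb(num_params: int, mode: str = "inference") -> float:
    bytes_per_param = {"inference": 2, "training": 16}[mode]
    return num_params * bytes_per_param / 1e9  # decimal gigabytes

# A hypothetical 7-billion-parameter model:
print(estimate_memory_gb(7_000_000_000, "inference"))  # 14.0 GB
print(estimate_memory_gb(7_000_000_000, "training"))   # 112.0 GB
```

Numbers like these explain why a model that fits comfortably on a single GPU for inference can still require a multi-GPU cluster to train from scratch.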