Commentary

  • I haven’t finished watching it yet, but I’m really excited to learn more about LLMs.
  • I like the analogy between the human brain and an LLM. When we sleep, we kind of reset the context window but update our parameters: we internalise the lessons, think and process in the background, and connect things up.
  • I was also surprised by the suggestion that matching today’s state-of-the-art models with only 1B parameters could take a decade or so. That sounds plausible, but given the current pace of model releases, it feels like it could happen almost next year.