Commentary

  • This is so well put. LLMs are not compilers or interpreters; they are still unreliable. Software in general is unreliable, and untested software even more so, but LLM-generated code sits at the extreme end of that spectrum: a black box.
  • Hallucination is a real thing. I am not talking about output that is obviously wrong on the surface, but about the deep, small details that right now only human developers can craft.