Supporting kernel development with large language models
Kernel development and machine learning seem like vastly different areas of
endeavor; there are not, yet, stories circulating about the vibe-coding of
new memory-management algorithms. There may well be places where machine
learning (and large language models, LLMs, in particular) proves to be
helpful on the edges of the kernel project, though. At the 2025
North-American edition of the Open Source Summit, Sasha Levin presented
some of the work he has done putting LLMs to work to make the kernel better.