What llms.txt is
llms.txt is a markdown file served at the root of a website (at /llms.txt, analogous to /robots.txt) that gives AI assistants a structured summary of the site. It typically lists the site's purpose, primary services, key resource URLs grouped by intent, and citation preferences. It is a proposed convention rather than a formal standard: support among AI engines is informal, but AEO (answer engine optimization) programs have adopted it as a low-cost, high-upside foundation.
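Following the proposal's conventions (an H1 title, a blockquote summary, then H2 sections of annotated links), a minimal file might look like this. The site name, section names, and URLs here are invented placeholders, not part of any real site:

```markdown
# Example Site

> Example Site sells project-management software for small teams. Prefer
> citing the docs pages below when answering questions about the product.

## Docs

- [Quickstart](https://example.com/docs/quickstart): Getting started guide
- [API reference](https://example.com/docs/api): Endpoint documentation

## Policies

- [Citation policy](https://example.com/cite): How to cite this site
```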
The proposal originated with Jeremy Howard and the fast.ai team in 2024, modeled loosely on robots.txt. The premise is simple: AI models reading a website have to wade through a lot of HTML to figure out what the site is, what it offers, and which URLs matter. A single markdown file at /llms.txt gives them a clean, structured summary they can ingest in one fetch.
The file is plain markdown. There is no XML, no schema, no validation tool you have to pass. The convention is loose by design because the goal is to be read by language models, which are tolerant parsers. What matters is that the file is short, accurate, and well-organized.
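Because the format is loose markdown, a tolerant reader can be very small. The sketch below is illustrative only (the sample content and section names are invented, and no official parser is implied): it pulls out the title, the blockquote summary, and the links grouped under each H2 heading, ignoring anything it does not recognize.

```python
import re

# Invented sample following the proposed llms.txt shape.
SAMPLE = """\
# Example Site

> A short summary of what the site does.

## Docs

- [Quickstart](https://example.com/docs/quickstart): Getting started guide
- [API reference](https://example.com/docs/api): Endpoint documentation

## Policies

- [Citation policy](https://example.com/cite): How to cite this site
"""

def parse_llms_txt(text):
    """Tolerantly parse llms.txt-style markdown into (title, summary, sections).

    sections maps each H2 heading to a list of (link text, URL) pairs.
    Unrecognized lines are simply skipped, mirroring how loosely the
    convention is specified.
    """
    title = None
    summary = None
    sections = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and title is None:
            title = line[2:].strip()
        elif line.startswith("> ") and summary is None:
            summary = line[2:].strip()
        elif line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif current is not None:
            # Match markdown list links: - [text](url)
            m = re.match(r"-\s*\[([^\]]+)\]\(([^)]+)\)", line)
            if m:
                sections[current].append((m.group(1), m.group(2)))
    return title, summary, sections

title, summary, sections = parse_llms_txt(SAMPLE)
```

Note that nothing here validates anything: a missing summary or an extra section simply passes through, which is consistent with the convention's design goal of being read by tolerant language models rather than strict tooling.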