0.25

Name                      Modified    Size
0.25 source code.tar.gz   2025-05-05  178.0 kB
0.25 source code.zip      2025-05-05  213.3 kB
README.md                 2025-05-05  2.2 kB
Totals: 3 items                       393.5 kB
  • New plugin feature: fragment loaders registered via register_fragment_loaders(register) can now return a mixture of fragments and attachments. The llm-video-frames plugin is the first to take advantage of this mechanism (see the plugin sketch after this list). #972
  • New OpenAI models: gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, o3, o4-mini. #945, #965, #976
  • New environment variables: LLM_MODEL and LLM_EMBEDDING_MODEL for setting the model to use without needing to specify -m model_id every time (usage example after this list). #932
  • New command: llm fragments loaders, to list all currently available fragment loader prefixes provided by plugins. #941
  • llm fragments command now shows fragments ordered by the date they were first used. #973
  • llm chat now includes a !edit command for editing a prompt using your default terminal text editor. Thanks, Benedikt Willi. #969
  • Allow -t and --system to be used at the same time (example after this list). #916
  • Fixed a bug where accessing a model via its alias would fail to respect any default options set for that model. #968
  • Improved documentation for extra-openai-models.yaml (a sketch of that file follows this list). Thanks, Rahim Nathwani and Dan Guido. #950, #957
  • llm -c/--continue now works correctly with the -d/--database option, and llm chat now accepts -d/--database as well (example after this list). Thanks, Sukhbinder Singh. #933
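
The expanded register_fragment_loaders() hook can be sketched roughly like this. This is not the llm-video-frames plugin: the demo prefix, loader name, and image path are invented for illustration, and the llm.Fragment / llm.Attachment constructors are assumed from the plugin documentation.

```python
import llm


@llm.hookimpl
def register_fragment_loaders(register):
    # Register a loader for arguments of the form:  llm -f demo:some-name
    register("demo", demo_loader)


def demo_loader(argument):
    # As of 0.25 a loader may return a mix of fragments (prompt text)
    # and attachments (for example an image for a vision model).
    return [
        llm.Fragment("Example notes for " + argument, source="demo:" + argument),
        llm.Attachment(path=argument + ".png"),  # hypothetical image alongside the notes
    ]
```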
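
A minimal shell illustration of the new environment variables; the model names and prompts are only examples:

```sh
# gpt-4.1-mini and 3-small are example model IDs/aliases
export LLM_MODEL=gpt-4.1-mini
llm 'Summarize this file' < notes.txt   # no -m needed

export LLM_EMBEDDING_MODEL=3-small
llm embed -c 'some example text'        # embedding model picked up from the environment
```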
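
Combining -t with --system might look like this, where summarize is a hypothetical stored template:

```sh
# summarize is a hypothetical template saved earlier with --save
llm -t summarize --system 'Answer in bullet points' 'Text to summarize goes here'
```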
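
A rough sketch of what an extra-openai-models.yaml entry can look like; the endpoint, key name, and aliases are placeholders, and the key names should be verified against the linked documentation:

```yaml
# Placeholder entry for an OpenAI-compatible endpoint; values are invented
# and key names should be checked against the documentation.
- model_id: my-local-model
  model_name: llama-3-8b-instruct
  api_base: "http://localhost:8000/v1"
  api_key_name: local-llm
  aliases: ["local"]
```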
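
Using a custom log database with continued conversations might look like this; the database path and prompts are placeholders:

```sh
# ./project-logs.db is a placeholder path for a separate log database
llm -d ./project-logs.db 'Explain this traceback'
llm -c -d ./project-logs.db 'How do I fix it?'   # -c continues the conversation logged there
llm chat -d ./project-logs.db                    # llm chat now accepts -d as well
```
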
Source: README.md, updated 2025-05-05