" />
MediaPipe LLM
For this demo, download the gemma-2b-it-gpu-int4 model from Kaggle.
Load model from disk
Download model from Web
Cancel download
Prompt:
Given the three storage technologies IndexedDB, the Origin Private File System, and the Service Worker Cache, plus the option to store a `FileSystemFileHandle` to IndexedDB pointing at a local file on disk, where should I store large language model files?
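The prompt above weighs several browser storage options for large model files. As one illustration (a minimal sketch — the function names, database name, and keys are hypothetical, not part of the demo), the Origin Private File System can stream a multi-gigabyte download straight to origin-private storage, while a `FileSystemFileHandle` pointing at a user-picked local copy can be persisted in IndexedDB so the model never needs re-downloading:

```javascript
// Option 1: stream a downloaded model into the Origin Private File System.
// (Browser-only APIs; names like saveModelToOPFS are illustrative.)
async function saveModelToOPFS(response, fileName) {
  const root = await navigator.storage.getDirectory();
  const handle = await root.getFileHandle(fileName, { create: true });
  const writable = await handle.createWritable();
  // Pipe the network stream directly to disk — no full in-memory copy.
  await response.body.pipeTo(writable);
  return handle;
}

// Option 2: persist a FileSystemFileHandle (e.g. from showOpenFilePicker())
// in IndexedDB, so a later visit can reopen the same local file.
function saveHandleToIndexedDB(handle) {
  return new Promise((resolve, reject) => {
    const open = indexedDB.open('model-store', 1);
    open.onupgradeneeded = () => open.result.createObjectStore('handles');
    open.onsuccess = () => {
      const tx = open.result.transaction('handles', 'readwrite');
      tx.objectStore('handles').put(handle, 'model');
      tx.oncomplete = () => resolve();
      tx.onerror = () => reject(tx.error);
    };
    open.onerror = () => reject(open.error);
  });
}
```

A trade-off worth noting: the OPFS copy lives entirely inside the origin's storage quota, whereas a stored handle avoids duplicating the file on disk but requires the user to re-grant read permission on a later visit before the file can be read again.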