Yes, it's another LLM-powered chat over documents implementation... but this one runs entirely locally in your browser!
🌐 The vector store (Voy) and embeddings (Transformers.js) are served via a Vercel Edge Function and run fully in the browser with no setup required (see the sketch below).
♊ It uses the experimental preview of Chrome's built-in Gemini Nano model. You will need access to the early preview program to use this mode.
🚧 Note that the built-in Gemini Nano model is quite small, experimental, and not chat-tuned, so results may vary!
🗺️ The default embeddings are Xenova/all-MiniLM-L6-v2. For higher-quality but slower embeddings, switch to nomic-ai/nomic-embed-text-v1 in app/worker.ts (see the snippet below).
🦜 LangChain.js handles orchestration and ties everything together!
🐙 This template is open source - you can see the source code and deploy your own version from the GitHub repo!
👇 Try embedding a PDF below, then asking questions! You can even turn off your WiFi after the initial model download.
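
For the curious, here is a minimal sketch of how the in-browser retrieval piece can fit together with LangChain.js. It assumes the @langchain/community integrations for Voy and Transformers.js plus the voy-search package; the chunk sizes and function names are illustrative, and the real code in this repo's app/worker.ts differs in its details.

```ts
import { Voy as VoyClient } from "voy-search";
import { VoyVectorStore } from "@langchain/community/vectorstores/voy";
import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";

// Embeddings run fully in the browser via Transformers.js (ONNX models in WASM).
const embeddings = new HuggingFaceTransformersEmbeddings({
  model: "Xenova/all-MiniLM-L6-v2", // the default model mentioned above
});

// Voy is a WASM vector store, so the index also lives entirely in browser memory.
const store = new VoyVectorStore(new VoyClient(), embeddings);

// Split extracted PDF text into overlapping chunks, embed them, and index them.
export async function embedPdfText(text: string) {
  const splitter = new RecursiveCharacterTextSplitter({
    chunkSize: 500,
    chunkOverlap: 50,
  });
  const docs = await splitter.createDocuments([text]);
  await store.addDocuments(docs);
}

// Retrieve the chunks most similar to a question for the model to answer over.
export async function search(query: string) {
  return store.similaritySearch(query, 4);
}
```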
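
The embeddings swap mentioned above is then a one-line change to wherever the embeddings are constructed (the surrounding code here is illustrative, not copied from app/worker.ts):

```ts
const embeddings = new HuggingFaceTransformersEmbeddings({
  // model: "Xenova/all-MiniLM-L6-v2", // default: small and fast
  model: "nomic-ai/nomic-embed-text-v1", // higher quality, larger download, slower
});
```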