`.cache.json` file will be created in your current working directory. The location can be configured by setting the environment variable `UNIFY_CACHE_DIR`.
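For example, to keep the cache in a dedicated directory instead of the working directory, you could set the variable before running your script (the path below is purely illustrative):

```shell
# Store the cache in a custom location (example path)
export UNIFY_CACHE_DIR="$HOME/.unify_cache"
echo "$UNIFY_CACHE_DIR"
```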
Deleting the `.cache.json` file will, of course, delete the entire cache, and queries will once again be sent to the LLMs. You can also open the `.cache.json` file in any text editor and modify the cache on a query-by-query basis if needed.
It’s also worth noting that this cache implementation is very simple. For production-scale caching of thousands or millions of prompts, you should use server-side caching, which uses SQL under the hood (coming soon).
Thoughts + feedback always welcome on discord! 👾