New LLMs are released all the time, and each provider has its own API key, billing, and API standard. Managing all of these nuances quickly becomes cumbersome.

Our Universal API provides:

  • A single, common interface for all models and providers 🟢
  • One account, with one balance and one API key 🔑
  • Integrations for your own fine-tuned and/or local LLMs 🖥️

To get started, just sign up, run pip install unifyai, and make your first query:

import unify

# Endpoints follow the "model@provider" format.
# Replace "UNIFY_KEY" with your own API key.
client = unify.Unify("gpt-4o@openai", api_key="UNIFY_KEY")
client.generate("hello world!")

You can list all supported models, providers, and endpoints like so:

# all
unify.list_models()
unify.list_providers()
unify.list_endpoints()

# filtered
unify.list_models("openai")
unify.list_providers("llama-3-8b-chat")
unify.list_endpoints(provider="openai")
unify.list_endpoints(model="llama-3-8b-chat")
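Assuming the list_* helpers return plain lists of strings (an assumption here, not guaranteed by this page), you can compose results back into the "model@provider" endpoint form used when constructing a client. A minimal sketch:

```python
def make_endpoint(model: str, provider: str) -> str:
    """Join a model and provider into the "model@provider" endpoint form."""
    return f"{model}@{provider}"

# e.g. pair a model with one of its listed providers:
make_endpoint("llama-3-8b-chat", "together-ai")  # "llama-3-8b-chat@together-ai"
```

The resulting string is exactly what unify.Unify(...) takes as its first argument.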

The rest of the docs generally assume we’re using Python, but the Python client is built directly on top of our REST API, which you can also call yourself.

For example, listing endpoints:

curl -X 'GET' \
  'https://api.unify.ai/v0/endpoints' \
  -H "Authorization: Bearer $UNIFY_KEY" \
  -H 'accept: application/json'
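The same request can be made from Python without the client library, using only the standard library. A sketch with urllib (the helper name is ours, for illustration; it builds the request but does not send it, so you can inspect it first):

```python
import os
import urllib.request

def build_endpoints_request(api_key: str) -> urllib.request.Request:
    """Build (but don't send) the GET /v0/endpoints request."""
    return urllib.request.Request(
        "https://api.unify.ai/v0/endpoints",
        headers={
            "Authorization": f"Bearer {api_key}",
            "accept": "application/json",
        },
    )

req = build_endpoints_request(os.environ.get("UNIFY_KEY", ""))
# urllib.request.urlopen(req).read() would then return the JSON list of endpoints.
```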

We won’t show many more curl examples in the docs, but you can check out the API Reference to see how to make queries for all supported features of the Universal API.

That’s it, you now have all the power of all LLMs at your fingertips ⚡

With great power comes great responsibility 🦸