Create Custom Endpoint
Creates a custom endpoint. The endpoint must either be a fine-tuned model from one of the supported providers (/v0/providers), in which case the provider argument must be set accordingly, or it must support the OpenAI /chat/completions format. To query your custom endpoint, use the endpoint string <endpoint_name>@custom. You can list all custom endpoints by querying /v0/endpoints and passing custom as the provider argument.
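As a sketch of the listing call described above, the following builds a GET request against /v0/endpoints with provider=custom. The base URL https://api.unify.ai is an assumption for illustration; substitute your actual API host and token.

```python
import urllib.parse
import urllib.request

UNIFY_BASE = "https://api.unify.ai/v0"  # assumed base URL, for illustration only

def list_custom_endpoints_request(token: str) -> urllib.request.Request:
    """Build the GET /v0/endpoints request that lists custom endpoints."""
    # Passing custom as the provider argument filters to custom endpoints.
    query = urllib.parse.urlencode({"provider": "custom"})
    return urllib.request.Request(
        f"{UNIFY_BASE}/endpoints?{query}",
        headers={"Authorization": f"Bearer {token}"},
        method="GET",
    )

req = list_custom_endpoints_request("YOUR_API_KEY")
print(req.full_url)  # https://api.unify.ai/v0/endpoints?provider=custom
```

Sending the request (e.g. with urllib.request.urlopen) returns the list of your custom endpoints.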
Authorizations
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
Query Parameters
The endpoint name for your custom endpoint, in model@provider format. If it's a custom endpoint following the OpenAI format, the provider must be @custom. Otherwise, if it's a fine-tuned model from one of the existing providers, the provider is specified with a prepended custom-, e.g. @custom-anthropic.
Base URL of the endpoint being called. Must support the OpenAI format.
Name of the API key that will be passed as part of the query.
The value passed to the model argument of the underlying API being wrapped by Unify. For example, you might name your endpoint llama-3-baseten@custom to distinguish the custom endpoint within Unify, while under the hood passing llama-3.2-90b-chat to the Baseten endpoint.
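Putting the four parameters together, the following sketches how the create request might be assembled. The request path /v0/custom_endpoint and the parameter names name, url, key_name, and model_arg are assumptions made for this example, since the source lists only the parameter descriptions; check the live API reference for the exact names.

```python
import urllib.parse
import urllib.request

UNIFY_BASE = "https://api.unify.ai/v0"  # assumed base URL, for illustration only

def create_custom_endpoint_request(token, name, url, key_name, model_arg):
    """Build a POST request to register a custom endpoint (parameter
    names and path are assumed, not confirmed by the reference)."""
    query = urllib.parse.urlencode({
        "name": name,            # endpoint name in model@provider format
        "url": url,              # base URL; must support the OpenAI format
        "key_name": key_name,    # name of the API key passed with the query
        "model_arg": model_arg,  # value forwarded to the wrapped API's model argument
    })
    return urllib.request.Request(
        f"{UNIFY_BASE}/custom_endpoint?{query}",  # assumed path
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

# Hypothetical values matching the Baseten example above.
req = create_custom_endpoint_request(
    "YOUR_API_KEY",
    name="llama-3-baseten@custom",
    url="https://example-baseten-host/v1",
    key_name="baseten-key",
    model_arg="llama-3.2-90b-chat",
)
print(req.full_url)
```

Once created, the endpoint is queried as llama-3-baseten@custom, while Unify forwards llama-3.2-90b-chat to the underlying Baseten endpoint.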