on-prem, which contains six files, as explained below.

Firstly, the following three files store environment variables for the three different applications:

- `.env-console` contains the necessary environment variables to set up the console app.
- `.env-db` contains the necessary environment variables to set up the database app.
- `.env-orchestra` contains the necessary environment variables to set up the backend app.

Next, there are two `.json` files, as explained below:
- `sessionInfo.json` contains the session information for the console app; it's mainly used for displaying data on your profile page. The file should contain the following fields when we share the folder with you:
  - `name`: first name
  - `email`: email id
  - `image`: a link to the profile image
- `userInfo.json` contains more information about the user currently using the app. This is needed to bypass the login process. The file should contain the following fields when we share the folder with you:
  - `id`: user id
  - `name`: first name
  - `lastName`: last name
  - `image`: a link to the profile image
  - `email`: the email id
  - `createdAt`: the date when the account was created
  - `apiKey`: the details about your api key
  - `UserDetails`: your organization and jobTitle (optional)
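For reference, a `userInfo.json` with the fields above might look like this (all values are illustrative placeholders):

```json
{
  "id": "user_0001",
  "name": "Jane",
  "lastName": "Doe",
  "image": "https://example.com/avatar.png",
  "email": "jane.doe@example.com",
  "createdAt": "2024-01-01T00:00:00Z",
  "apiKey": "your-api-key-details",
  "UserDetails": {
    "organization": "ExampleOrg",
    "jobTitle": "Engineer"
  }
}
```

`sessionInfo.json` follows the same shape with just the `name`, `email` and `image` fields.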
- `docker-compose.yaml` contains the configuration for starting the Docker Compose services.

Populate the `.env_orchestra` file with the credentials of the providers you would like to use:
- The `ORCHESTRA_OPENAI_KEY` variable, and so on for the other providers.
- `AWS_SECRET_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_REGION` (the location can also instead be passed as the `region` field in `/chat/completions`)
- `GOOGLE_APPLICATION_CREDENTIALS`, `ORCHESTRA_VERTEXAI_PROJECT`, `ORCHESTRA_VERTEXAI_LOCATION` (again, the location can also instead be passed as the `region` field in `/chat/completions`)
- `AZURE_PROJECT`, `AZURE_OPENAI_API_KEY`, `AZURE_REGION` (you can instead pass this as the `region` field in `/chat/completions`)
- `AZURE_AI_LLAMA_3_1_8B_CHAT_API_KEY`, `AZURE_AI_LLAMA_3_1_70B_CHAT_API_KEY`, `AZURE_AI_LLAMA_3_1_405B_CHAT_API_KEY`
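For example, a `.env_orchestra` configured for OpenAI and AWS might contain entries like these (all values are placeholders; include only the providers you use):

```shell
# Placeholder credentials - replace with your own values
ORCHESTRA_OPENAI_KEY=sk-your-openai-key
AWS_SECRET_KEY_ID=your-aws-key-id
AWS_SECRET_ACCESS_KEY=your-aws-secret
AWS_REGION=us-east-1
```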
For `GOOGLE_APPLICATION_CREDENTIALS`, you'll also need a volume in the `docker-compose.yaml` that mounts the service account `.json` onto the container. For example, if `application_default_credentials.json` is the name of the file, then `GOOGLE_APPLICATION_CREDENTIALS` would be set to `/app/src/application_default_credentials.json`.
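As a sketch, assuming the key file sits next to the `docker-compose.yaml` and the service is named `orchestra` (adjust names and paths to your setup), the mount could look like:

```yaml
services:
  orchestra:
    # Mount the service account key from the host into the container
    volumes:
      - ./application_default_credentials.json:/app/src/application_default_credentials.json
    environment:
      - GOOGLE_APPLICATION_CREDENTIALS=/app/src/application_default_credentials.json
```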
Once you've set the environment variables, you can spin up the containers for the images by running `docker compose up` in the same folder as the `docker-compose.yaml` file. There's a total of 5 containers that should be running:

- `console`
- `orchestra`
- `dataset_evaluation`
- `db`
- `redis`
You may need to wait until the logs show that `orchestra` is serving on `http://0.0.0.0:8000` (as `orchestra` takes a few seconds to be up and running). After that is displayed, you're set to start using the console!
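If you'd like to script this wait, here's a minimal polling sketch (the URL and timeout are assumed defaults, and the `probe` argument is a hypothetical hook, mainly useful for testing):

```python
import time
import urllib.request


def wait_until_ready(url="http://localhost:8000", timeout=60.0, interval=1.0, probe=None):
    """Poll `url` until it answers, returning True, or raise TimeoutError."""
    if probe is None:
        def probe(target):
            try:
                urllib.request.urlopen(target, timeout=2)
                return True
            except OSError:
                # Connection refused / DNS failure etc. - service not up yet
                return False

    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe(url):
            return True
        time.sleep(interval)
    raise TimeoutError(f"{url} was not reachable within {timeout} seconds")
```

Calling `wait_until_ready()` after `docker compose up` then blocks until the orchestra service starts answering.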
At `http://localhost:3000`, you should be able to access features like uploading datasets, running evaluations, adding custom api keys, adding custom endpoints etc. You can also use all the endpoints in the API Reference by making calls to `http://localhost:8000` instead of `https://api.unify.ai`, by setting the `UNIFY_BASE_URL` environment variable as the address of the orchestra service. For example, if you're using the python client on the same host as the orchestra service, then you should set `UNIFY_BASE_URL` to `http://localhost:8000`.
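As a sketch of what this means in practice (the variable handling below is illustrative; a client honouring `UNIFY_BASE_URL` would build endpoint URLs along these lines):

```python
import os

# Point requests at the local orchestra service instead of https://api.unify.ai
os.environ["UNIFY_BASE_URL"] = "http://localhost:8000"

# Fall back to the hosted API when the variable is unset
base_url = os.environ.get("UNIFY_BASE_URL", "https://api.unify.ai")
endpoint = f"{base_url}/chat/completions"
print(endpoint)  # http://localhost:8000/chat/completions
```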
If you'd like to rebuild a single service, you can run:

`docker compose up -d --no-deps --build <service_name>`

where `<service_name>` can be one of `console`, `orchestra` or `dataset_evaluation`. The goal here is to make sure that your data remains intact, so we would advise against deleting the `db` service.
To set up Pub/Sub messaging between the services:

- Set the `PUBSUB_MESSAGING_TOPIC` environment variable in the `.env_orchestra` as the name of the topic you'd like to set.
- Set the `PUBSUB_PROJECT_NAME` environment variable in the same file as the name of the project you're using.

`<topic_name>` will be set as the publisher topic in `orchestra`, and `<topic_name>-sub` will be set as the subscriber to that topic in the `dataset-evaluation` service. Make sure the relevant credentials are also mounted under the `volumes:` of the `docker-compose.yaml` for both the services.
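Putting the two settings together, the additions to `.env_orchestra` might look like this (`<topic_name>` and `<project_name>` are placeholders for your own values):

```shell
# Placeholders - substitute your own topic and project names
PUBSUB_MESSAGING_TOPIC=<topic_name>
PUBSUB_PROJECT_NAME=<project_name>
```

`orchestra` will then publish to `<topic_name>`, and the `dataset-evaluation` service will consume from the `<topic_name>-sub` subscription.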