Vijil Evaluate is integrated with a number of leading LLM providers. To evaluate serverless LLM endpoints hosted on any of these providers, follow the directions in the quickstart tutorial, substituting the `model_hub` and `model_name` values from the table below (see the sketch after the table).
| Provider | `model_hub` | `model_name` |
|----------|-------------|--------------|
| OpenAI | `openai` | List of Models |
| Together AI | `together` | List of Models |
| Mistral AI | `mistral` | List of Models |
| Fireworks AI | `fireworks` | List of Models |
| NVIDIA NIM | `nvidia` | List of Models |
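As a point of reference, the following is a minimal sketch of what the quickstart call might look like when pointed at a different hub. It assumes the Vijil Python client and its `evaluations.create` method as used in the quickstart; the harness name, model name, and parameters shown here are illustrative placeholders, so substitute the values from your own setup.

```python
# Minimal sketch, assuming the quickstart's Python client flow.
# The harness name, model name, and model_params below are illustrative
# assumptions, not prescribed values.
from vijil import Vijil

client = Vijil()  # assumes the API key is available in your environment

# Same flow as the quickstart; only model_hub and model_name change.
evaluation = client.evaluations.create(
    model_hub="together",  # a value from the model_hub column above
    model_name="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",  # a model served by that hub
    model_params={"temperature": 0},
    harnesses=["trust_score"],  # harness name is a placeholder
)
print(evaluation)
```

The only change relative to an OpenAI-hosted evaluation is the pair of `model_hub` and `model_name` arguments; the rest of the quickstart flow stays the same.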
Vijil also supports a number of other cloud services, giving you the flexibility to evaluate agents accessible through custom endpoints:

- Google Vertex AI: learn more about integrating Google Vertex AI
- DigitalOcean: learn more about integrating DigitalOcean
- AWS Bedrock: learn more about integrating AWS Bedrock