Core Setup

Configure your LLM

Add your model provider + API key so Disseqt can run simulations, tests, and validations on top of it.

Last Updated on June 19, 2023

Disseqt runs evaluation, testing, anomaly detection, compliance checks, and jailbreak testing on your model, so this step connects your preferred provider (OpenAI, Anthropic, Google, etc.) to Disseqt.

How to configure your LLM

STEP 1: Go to LLM Configurations from the left menu → click Add Custom LLM

STEP 2: Select whether you want to save this model as Permanent or Temporary

STEP 3: Choose a Provider (OpenAI, Anthropic, Google AI, Azure OpenAI, Cohere or Custom)

STEP 4: Pick your Model Name from the dropdown list

STEP 5: Paste your API Key

STEP 6: Click Test Connection to verify that the key works before saving

STEP 7: After the test succeeds, click Create Permanent

Once this is added, the LLM becomes available for selection inside any project you create.

© Disseqt AI Product Starter Guide
