docs(lighthouse): update lighthouse multi llm docs (#9362)

Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Author: Chandrapal Badshah
Date: 2025-12-03 13:23:34 +05:30 (committed by GitHub)
Parent: 803ada7b16
Commit: 069f0d106c


@@ -16,6 +16,16 @@ Lighthouse AI supports the following LLM providers:
- **Amazon Bedrock**: Offers AWS-hosted access to Claude, Llama, Titan, and other models
- **OpenAI Compatible**: Supports custom endpoints like OpenRouter, Ollama, or any OpenAI-compatible service
## Model Requirements
For Lighthouse AI to work properly, models **must** support all of the following capabilities:
- **Text input**: Ability to receive text prompts
- **Text output**: Ability to generate text responses
- **Tool calling**: Ability to invoke tools and functions
If any of these capabilities are missing, the model will not be compatible with Lighthouse AI.
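The three requirements can be smoke-tested against any OpenAI-compatible endpoint before configuring it in Lighthouse AI. The sketch below (the helper name and the probe tool are illustrative, not part of Lighthouse AI) builds a chat-completions request body that exercises text input and advertises a `tools` array; a model that supports tool calling will answer such a request with a `tool_calls` entry, while an incompatible one typically rejects or ignores the `tools` field.

```python
import json

def build_tool_probe(model: str) -> str:
    """Build an OpenAI-style chat-completions body that exercises all three
    required capabilities: text input, text output, and tool calling."""
    payload = {
        "model": model,
        # Text input: a plain user prompt the model must be able to read.
        "messages": [{"role": "user", "content": "What time is it in UTC?"}],
        # Tool calling: a model without tool support will reject or ignore this.
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_utc_time",
                "description": "Return the current UTC time",
                "parameters": {"type": "object", "properties": {}},
            },
        }],
        "tool_choice": "auto",
    }
    return json.dumps(payload)

body = build_tool_probe("gpt-5")
```

POSTing this body to the provider's `/chat/completions` endpoint and checking the response for a `tool_calls` entry confirms compatibility before the model is used in production.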
## How Default Providers Work
All three providers can be configured for a tenant, but only one can be set as the default provider. The first configured provider automatically becomes the default.
@@ -39,63 +49,73 @@ To connect a provider:
3. Select a default model for that provider
4. Click **Connect** to save
<Tabs>
<Tab title="OpenAI">
#### Required Information
- **API Key**: OpenAI API key (starts with `sk-` or `sk-proj-`). You can create an API key from the [OpenAI platform](https://platform.openai.com/api-keys).
#### Before Connecting
- Ensure the OpenAI account has sufficient credits
- Verify that the `gpt-5` model (recommended for Lighthouse AI) is not blocked in the OpenAI organization settings
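As a quick pre-flight check, the key format can be validated before saving. This hypothetical helper only catches obvious paste errors; it does not prove the key is active (that requires an authenticated API call):

```python
def looks_like_openai_key(key: str) -> bool:
    # Both classic ("sk-...") and project-scoped ("sk-proj-...") keys
    # share the "sk-" prefix, so one prefix check covers both formats.
    # A length floor rejects obviously truncated pastes.
    return key.startswith("sk-") and len(key) > 20
```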
</Tab>
<Tab title="Amazon Bedrock">
#### Required Information
To connect Amazon Bedrock, you need:
- **AWS Access Key ID**: AWS access key ID
- **AWS Secret Access Key**: AWS secret access key
- **AWS Region**: Region where Bedrock is available (e.g., `us-east-1`, `us-west-2`)
#### Required Permissions
The AWS IAM user must have the `AmazonBedrockLimitedAccess` managed policy attached:
```text
arn:aws:iam::aws:policy/AmazonBedrockLimitedAccess
```
<Note>
Currently, only AWS access key and secret key authentication is supported. Amazon Bedrock API key support will be available soon.
</Note>
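Assuming the AWS CLI is configured with credentials for the same account, the policy can be attached and Bedrock access sanity-checked from a terminal before entering the keys in Lighthouse AI. The user name `lighthouse-ai` below is a placeholder:

```shell
# Attach the managed policy to the IAM user whose keys you will configure
aws iam attach-user-policy \
  --user-name lighthouse-ai \
  --policy-arn arn:aws:iam::aws:policy/AmazonBedrockLimitedAccess

# Confirm Bedrock is reachable in the chosen region and list its model IDs
aws bedrock list-foundation-models --region us-east-1 \
  --query 'modelSummaries[].modelId'
```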
<Note>
Available models depend on AWS region and account entitlements; see [Amazon Bedrock models by region](https://docs.aws.amazon.com/bedrock/latest/userguide/models-regions.html) for details. Lighthouse AI displays only accessible models.
</Note>
<Note>
Access to all Amazon Bedrock foundation models is enabled by default. When you select a model or invoke it for the first time (using Prowler or otherwise), you agree to Amazon's EULA and are subscribed to the model in the Amazon Marketplace; you are charged only on a usage basis. More info: [Amazon Bedrock Model Access](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html)
</Note>
<Note>
Lighthouse AI automatically filters Amazon Bedrock models to display only those that support text input, text output, and tool calling capabilities. This ensures all available models are compatible with Lighthouse AI features.
</Note>
</Tab>
<Tab title="OpenAI Compatible">
Use this option to connect to any LLM provider exposing an OpenAI compatible API endpoint (OpenRouter, Ollama, etc.).
#### Required Information
- **API Key**: API key from the compatible service
- **Base URL**: API endpoint URL including the API version (e.g., `https://openrouter.ai/api/v1`)
#### Example: OpenRouter
1. Create an account at [OpenRouter](https://openrouter.ai/)
2. [Generate an API key](https://openrouter.ai/docs/guides/overview/auth/provisioning-api-keys) from the OpenRouter dashboard
3. Configure in Lighthouse AI:
- **API Key**: OpenRouter API key
- **Base URL**: `https://openrouter.ai/api/v1`
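Because the Base URL must already include the API version, clients simply append the standard path to it. A small sketch (the helper name is illustrative) of how the configured value maps to the actual chat-completions endpoint:

```python
def chat_completions_url(base_url: str) -> str:
    """Join the configured Base URL (which already includes the API
    version, e.g. ".../api/v1") with the standard chat-completions path."""
    return base_url.rstrip("/") + "/chat/completions"

# e.g. chat_completions_url("https://openrouter.ai/api/v1")
#   -> "https://openrouter.ai/api/v1/chat/completions"
```

This is why a Base URL without the version segment (e.g. `https://openrouter.ai/api`) fails: requests would be sent to a path the service does not serve.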
</Tab>
</Tabs>
## Changing the Default Provider
To set a different provider as default:
1. Navigate to **Configuration** → **Lighthouse AI**
2. Click **Configure** under the provider you want as default
3. Click **Set as Default**
<img src="/images/prowler-app/lighthouse-set-default-provider.png" alt="Set default LLM provider" />
@@ -121,7 +141,7 @@ To remove a configured provider:
For best results with Lighthouse AI, the recommended model is `gpt-5` from OpenAI.
Models from other providers such as Amazon Bedrock and OpenAI Compatible endpoints can be connected and used, but performance is not guaranteed. Ensure that any selected model supports text input, text output, and tool calling capabilities.
## Getting Help