📝 improve free trial docs
sestinj committed May 17, 2024
1 parent c407e80 commit b2c7b07
Showing 3 changed files with 62 additions and 18 deletions.
8 changes: 3 additions & 5 deletions docs/docs/reference/Model Providers/anthropicllm.md
# Anthropic

To set up Anthropic, obtain an API key from [here](https://www.anthropic.com/api) and add the following to your `config.json` file:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Anthropic",
      "provider": "anthropic",
      "model": "claude-3-opus-20240229",
      "apiKey": "YOUR_API_KEY"
    }
  ]
}
```

[View the source](https://github.com/continuedev/continue/blob/main/core/llm/llms/Anthropic.ts)
64 changes: 56 additions & 8 deletions docs/docs/reference/Model Providers/freetrial.md
# Free Trial

The `"free-trial"` provider lets new users try out Continue with GPT-4, Llama3, Claude 3, and other models using a proxy server that securely makes API calls to these services. Continue should just work the first time you install the extension.

While the Continue extension is always free to use, we cannot support unlimited free LLM usage for all of our users. You will eventually need to either:

1. Select an open-source model to use for free locally, or
2. Add your own API key for OpenAI, Anthropic, or another LLM provider

## Options

### 🦙 Ollama (free, local)

Ollama is a local service that makes it easy to run language models on your laptop.

1. Download Ollama from https://ollama.ai
2. Open `~/.continue/config.json`. You can do this by clicking the gear icon in the bottom right corner of the Continue sidebar
3. Add the following to your `config.json`:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Llama3 8b",
      "provider": "ollama",
      "model": "llama3:8b"
    }
  ]
}
```
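If you prefer to script step 3, the merge can be sketched in Python. This is a hypothetical `add_model` helper, not part of Continue; Continue only requires that the entry end up in the `models` array of `config.json`:

```python
import json
import os
import tempfile

def add_model(config_path, entry):
    # Hypothetical helper: merge one model entry into a Continue-style
    # config.json, creating the file and the "models" array if needed.
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)
    else:
        config = {}
    models = config.setdefault("models", [])
    # Skip entries whose title is already present, to avoid duplicates.
    if not any(m.get("title") == entry.get("title") for m in models):
        models.append(entry)
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config

# Demo against a temporary file rather than the real ~/.continue/config.json.
config_path = os.path.join(tempfile.mkdtemp(), "config.json")
config = add_model(config_path, {
    "title": "Llama3 8b",
    "provider": "ollama",
    "model": "llama3:8b",
})
print(config["models"][0]["model"])  # llama3:8b
```

Point `config_path` at `~/.continue/config.json` to apply it for real.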

### ⚡️ Groq (extremely fast)

Groq provides lightning-fast inference for open-source LLMs like Llama3, up to twice as fast as through other providers.

1. Obtain an API key from https://console.groq.com
2. Add the following to your `config.json`:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Llama3 70b",
      "provider": "groq",
      "model": "llama3-70b",
      "apiKey": "YOUR_API_KEY"
    }
  ]
}
```

### ✨ OpenAI (highly capable)

1. Copy your API key from https://platform.openai.com/account/api-keys
2. Add the following to your `config.json`:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "GPT-4-Turbo",
      "provider": "openai",
      "model": "gpt-4-turbo",
      "apiKey": "YOUR_API_KEY"
    }
  ]
}
```
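All of the entries above share the same shape, so a quick sanity check can catch typos before you restart the extension. This is an illustrative check only, not Continue's actual validation logic, and the set of providers assumed to need an `apiKey` is a guess for the sake of the example:

```python
def check_model_entry(entry):
    # Illustrative sanity check for one entry of the "models" array;
    # not Continue's real validator.
    problems = []
    for field in ("title", "provider", "model"):
        if field not in entry:
            problems.append(f"missing required field: {field}")
    # Assumption: these hosted providers require an API key.
    if entry.get("provider") in {"openai", "anthropic", "groq"} and not entry.get("apiKey"):
        problems.append("hosted provider configured without an apiKey")
    return problems

entry = {
    "title": "GPT-4-Turbo",
    "provider": "openai",
    "model": "gpt-4-turbo",
    "apiKey": "YOUR_API_KEY",
}
print(check_model_entry(entry))  # []
```

An empty list means the entry has the fields every example on this page uses.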

### ⏩ Other options

The above were only a few examples, but Continue can be used with any LLM or provider. You can find [a full list of providers here](../../setup/select-provider.md).
8 changes: 3 additions & 5 deletions docs/docs/reference/Model Providers/ollama.md
# Ollama

[Ollama](https://ollama.ai/) is an application for Mac, Windows, and Linux that makes it easy to locally run open-source models, including Llama3. Download the app from the website, and it will walk you through setup in a couple of minutes. You can also read more in their [README](https://github.com/ollama/ollama). Continue can then be configured to use the `"ollama"` provider:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Ollama",
      "provider": "ollama",
      "model": "llama3:8b"
    }
  ]
}
```
If you need to send custom headers for authentication, you may use the `requestOptions` property:
```json
{
  "title": "Ollama",
  "provider": "ollama",
  "model": "llama3:8b",
  "requestOptions": {
    "headers": {
      "Authorization": "Bearer xxx"
    }
  }
}
```
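Conceptually, anything under `requestOptions.headers` gets merged into the headers of each HTTP request sent to the Ollama server. A rough sketch of that merge (the default header set and exact precedence here are assumptions, not Continue's verbatim logic):

```python
def build_headers(request_options=None):
    # Sketch: custom headers from requestOptions override the defaults.
    headers = {"Content-Type": "application/json"}
    headers.update((request_options or {}).get("headers", {}))
    return headers

headers = build_headers({"headers": {"Authorization": "Bearer xxx"}})
print(headers["Authorization"])  # Bearer xxx
```

So a proxy in front of Ollama would see the `Authorization: Bearer xxx` header on every request.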
