Update docs w.r.t. PaLM support.
This commit is contained in:
parent 163ec6fba8 · commit 31ab725489
@@ -19,8 +19,8 @@ prototyping** and **prompting**, and turning unstructured responses into

 An LLM component is implemented through the `LLMWrapper` class. It is accessible
 through a generic `llm`
 [component factory](https://spacy.io/usage/processing-pipelines#custom-components-factories)
-as well as through task-specific component factories: `llm_ner`, `llm_spancat`, `llm_rel`,
-`llm_textcat`, `llm_sentiment` and `llm_summarization`.
+as well as through task-specific component factories: `llm_ner`, `llm_spancat`,
+`llm_rel`, `llm_textcat`, `llm_sentiment` and `llm_summarization`.

 ### LLMWrapper.\_\_init\_\_ {id="init",tag="method"}
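For reference, the task-specific factories named in this hunk can be added to a pipeline like any other component. Below is a minimal sketch along the lines of the spacy-llm quickstart, not part of this diff; it assumes `spacy-llm` is installed and that an API key for the default backing model (e.g. `OPENAI_API_KEY`) is set.

```python
import spacy

nlp = spacy.blank("en")

# "llm_ner" is one of the task-specific factories listed above; it wires up
# the NER task with the default backing model for you.
llm_ner = nlp.add_pipe("llm_ner")
llm_ner.add_label("PERSON")
llm_ner.add_label("LOCATION")
nlp.initialize()

doc = nlp("Jack and Jill went up the hill in Kathmandu.")
print([(ent.text, ent.label_) for ent in doc.ents])
```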
@@ -984,6 +984,7 @@ Currently, these models are provided as part of the core library:
 | `spacy.Claude-1-3.v1` | Anthropic | `["claude-1.3", "claude-1.3-100k"]` | `"claude-1.3"` | `{}` |
 | `spacy.Claude-instant-1.v1` | Anthropic | `["claude-instant-1", "claude-instant-1-100k"]` | `"claude-instant-1"` | `{}` |
 | `spacy.Claude-instant-1-1.v1` | Anthropic | `["claude-instant-1.1", "claude-instant-1.1-100k"]` | `"claude-instant-1.1"` | `{}` |
+| `spacy.PaLM.v1` | Google | `["chat-bison-001", "text-bison-001"]` | `"text-bison-001"` | `{temperature=0.0}` |

 To use these models, make sure that you've [set the relevant API](#api-keys)
 keys as environment variables.
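To illustrate the row added here, a sketch of selecting the PaLM model on the generic `llm` component follows. The registry name `spacy.PaLM.v1` and the default `"text-bison-001"` come from the table above; the NER task and its labels are assumed for the example, and `PALM_API_KEY` must be set in the environment.

```python
import spacy

nlp = spacy.blank("en")
nlp.add_pipe(
    "llm",
    config={
        # Task: built-in NER; the labels are illustrative.
        "task": {"@llm_tasks": "spacy.NER.v2", "labels": ["PERSON", "LOCATION"]},
        # Model: the PaLM family added in this commit, with the table's default name.
        "model": {"@llm_models": "spacy.PaLM.v1", "name": "text-bison-001"},
    },
)

doc = nlp("Jack and Jill went up the hill.")
print([(ent.text, ent.label_) for ent in doc.ents])
```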
@@ -1014,6 +1015,12 @@ For Anthropic:
 export ANTHROPIC_API_KEY="..."
 ```

+For PaLM:
+
+```shell
+export PALM_API_KEY="..."
+```
+
 ### Models via HuggingFace {id="models-hf"}

 These models all take the same parameters:
@@ -170,8 +170,8 @@ to be `"databricks/dolly-v2-12b"` for better performance.
 ### Example 3: Create the component directly in Python {id="example-3"}

 The `llm` component behaves as any other component does, and there are
-[task-specific components](/api/large-language-models#config) defined to
-help you hit the ground running with a reasonable built-in task implementation.
+[task-specific components](/api/large-language-models#config) defined to help
+you hit the ground running with a reasonable built-in task implementation.

 ```python
 import spacy
@@ -484,6 +484,7 @@ provider's documentation.
 | [`spacy.Claude-1-0.v1`](/api/large-language-models#models-rest) | Anthropic’s `claude-1.0` model family. |
 | [`spacy.Claude-1-2.v1`](/api/large-language-models#models-rest) | Anthropic’s `claude-1.2` model family. |
 | [`spacy.Claude-1-3.v1`](/api/large-language-models#models-rest) | Anthropic’s `claude-1.3` model family. |
+| [`spacy.PaLM.v1`](/api/large-language-models#models-rest) | Google’s `PaLM` model family. |
 | [`spacy.Dolly.v1`](/api/large-language-models#models-hf) | Dolly models through HuggingFace. |
 | [`spacy.Falcon.v1`](/api/large-language-models#models-hf) | Falcon models through HuggingFace. |
 | [`spacy.Llama2.v1`](/api/large-language-models#models-hf) | Llama2 models through HuggingFace. |
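Since this table only lists the registered model families, a rough sketch of swapping one family for another on the same task may help. The values are illustrative; in particular the Dolly model name is an assumption, so check the corresponding model table for valid names.

```python
import spacy

# Same task, different backing model: only the "model" block changes.
rest_model = {"@llm_models": "spacy.PaLM.v1", "name": "text-bison-001"}
hf_model = {"@llm_models": "spacy.Dolly.v1", "name": "dolly-v2-3b"}  # name is assumed

nlp = spacy.blank("en")
nlp.add_pipe(
    "llm",
    config={
        "task": {"@llm_tasks": "spacy.Summarization.v1"},
        "model": rest_model,  # swap in hf_model to run locally via HuggingFace
    },
)
```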