mirror of https://github.com/explosion/spaCy.git, synced 2024-12-26 18:06:29 +03:00

rename to TransformerListener

commit ec069627fe (parent 15902c5aa2)
@@ -346,13 +346,13 @@ in other components, see
 | `tokenizer_config` | Tokenizer settings passed to [`transformers.AutoTokenizer`](https://huggingface.co/transformers/model_doc/auto.html#transformers.AutoTokenizer). ~~Dict[str, Any]~~ |
 | **CREATES** | The model using the architecture. ~~Model[List[Doc], FullTransformerBatch]~~ |
 
-### spacy-transformers.Tok2VecListener.v1 {#transformers-Tok2VecListener}
+### spacy-transformers.TransformerListener.v1 {#TransformerListener}
 
 > #### Example Config
 >
 > ```ini
 > [model]
-> @architectures = "spacy-transformers.Tok2VecListener.v1"
+> @architectures = "spacy-transformers.TransformerListener.v1"
 > grad_factor = 1.0
 >
 > [model.pooling]
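The `@architectures` string in the config above is a registry key, which is why renaming the architecture changes every config that references it. A minimal stdlib-only sketch of that lookup pattern (illustrative names only; this is not the actual spacy-transformers registry implementation):

```python
# Toy registry: architectures are looked up by the string given in the
# config's @architectures key, so renaming "Tok2VecListener.v1" to
# "TransformerListener.v1" changes the string every config must use.
ARCHITECTURES = {}

def register(name):
    def decorator(func):
        ARCHITECTURES[name] = func
        return func
    return decorator

@register("spacy-transformers.TransformerListener.v1")
def make_listener(grad_factor=1.0):
    # Stand-in for the real layer constructor.
    return {"layer": "listener", "grad_factor": grad_factor}

def resolve(arch_name, **cfg):
    # Mirrors how a config block's @architectures key selects a factory.
    return ARCHITECTURES[arch_name](**cfg)

model = resolve("spacy-transformers.TransformerListener.v1", grad_factor=1.0)
```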
@@ -29,7 +29,7 @@ This pipeline component lets you use transformer models in your pipeline.
 Supports all models that are available via the
 [HuggingFace `transformers`](https://huggingface.co/transformers) library.
 Usually you will connect subsequent components to the shared transformer using
-the [TransformerListener](/api/architectures##transformers-Tok2VecListener) layer. This
+the [TransformerListener](/api/architectures#TransformerListener) layer. This
 works similarly to spaCy's [Tok2Vec](/api/tok2vec) component and
 [Tok2VecListener](/api/architectures/Tok2VecListener) sublayer.
 
@@ -233,7 +233,7 @@ The `Transformer` component therefore does **not** perform a weight update
 during its own `update` method. Instead, it runs its transformer model and
 communicates the output and the backpropagation callback to any **downstream
 components** that have been connected to it via the
-[TransformerListener](/api/architectures##transformers-Tok2VecListener) sublayer. If there
+[TransformerListener](/api/architectures#TransformerListener) sublayer. If there
 are multiple listeners, the last layer will actually backprop to the transformer
 and call the optimizer, while the others simply increment the gradients.
 
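The gradient-sharing behaviour described in this hunk — non-final listeners only accumulate gradients, and the last listener triggers the single optimizer call — can be sketched as follows. This is a deliberately simplified scalar toy, not the real thinc/spacy-transformers machinery:

```python
# Toy sketch of the listener backprop pattern: several downstream components
# share one transformer, but only the final listener calls the optimizer.
class SharedTransformer:
    def __init__(self):
        self.weight = 1.0
        self.accumulated_grad = 0.0
        self.updates = 0

    def increment_grad(self, grad):
        # Non-final listeners only add their gradient to the buffer.
        self.accumulated_grad += grad

    def backprop_and_update(self, grad, lr=0.1):
        # The last listener adds its gradient, then performs one SGD step
        # over everything accumulated so far and clears the buffer.
        self.accumulated_grad += grad
        self.weight -= lr * self.accumulated_grad
        self.updates += 1
        self.accumulated_grad = 0.0

shared = SharedTransformer()
shared.increment_grad(0.2)        # e.g. tagger listener
shared.increment_grad(0.3)        # e.g. parser listener
shared.backprop_and_update(0.5)   # final listener triggers the update
```

The transformer's weights are updated exactly once per batch, regardless of how many listeners contributed gradients.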
@@ -101,7 +101,7 @@ it processes a batch of documents, it will pass forward its predictions to the
 listeners, allowing the listeners to **reuse the predictions** when they are
 eventually called. A similar mechanism is used to pass gradients from the
 listeners back to the model. The [`Transformer`](/api/transformer) component and
-[TransformerListener](/api/architectures#transformers-Tok2VecListener) layer do the same
+[TransformerListener](/api/architectures#TransformerListener) layer do the same
 thing for transformer models, but the `Transformer` component will also save the
 transformer outputs to the
 [`Doc._.trf_data`](/api/transformer#custom_attributes) extension attribute,
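The forward-pass side of this sharing — the transformer runs once, pushes its output to every listener, and also saves it on the document — can be illustrated with a small stdlib-only sketch (a plain dict stands in for `Doc._.trf_data`; class and attribute names here are illustrative, not the real API):

```python
# Toy sketch of prediction sharing between a transformer component and
# its listeners: the expensive forward pass runs once per document.
class TransformerComponent:
    def __init__(self):
        self.listeners = []
        self.calls = 0

    def __call__(self, doc):
        self.calls += 1
        output = f"embeddings-for:{doc['text']}"
        doc["trf_data"] = output          # saved like Doc._.trf_data
        for listener in self.listeners:
            listener.receive(output)      # pass predictions forward
        return doc

class Listener:
    def __init__(self):
        self.cached = None

    def receive(self, output):
        self.cached = output

    def predict(self):
        # Reuses the cached prediction instead of rerunning the transformer.
        return self.cached

trf = TransformerComponent()
ner_listener, parser_listener = Listener(), Listener()
trf.listeners += [ner_listener, parser_listener]
doc = trf({"text": "hello"})
```

Both listeners see the same output, and `trf.calls` stays at 1: the transformer ran once for the batch while every downstream component reused its predictions.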
@@ -64,7 +64,7 @@ menu:
 [`TransformerData`](/api/transformer#transformerdata),
 [`FullTransformerBatch`](/api/transformer#fulltransformerbatch)
 - **Architectures: ** [TransformerModel](/api/architectures#TransformerModel),
-  [Tok2VecListener](/api/architectures#transformers-Tok2VecListener),
+  [TransformerListener](/api/architectures#TransformerListener),
   [Tok2VecTransformer](/api/architectures#Tok2VecTransformer)
 - **Models:** [`en_core_trf_lg_sm`](/models/en)
 - **Implementation:**