Mirror of https://github.com/explosion/spaCy.git, synced 2025-07-10 16:22:29 +03:00

Merge branch 'website/curated-docs' of github.com:vin-ivar/spaCy into pr/vin-ivar/12677

Commit feda7fa63c
@@ -58,8 +58,8 @@ The default config is defined by the pipeline component factory and describes
 how the component should be configured. You can override its settings via the
 `config` argument on [`nlp.add_pipe`](/api/language#add_pipe) or in your
 [`config.cfg` for training](/usage/training#config). See the
-[model architectures](/api/architectures#transformers) documentation for details
-on the transformer architectures and their arguments and hyperparameters.
+[model architectures](/api/architectures#curated-trf) documentation for details
+on the curated transformer architectures and their arguments and hyperparameters.
 
 Note that the default config does not include the mandatory `vocab_size`
 hyperparameter as this value can differ between different models. So, you will
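For context, an override like the one this hunk documents might look as follows in a training `config.cfg`. This is an illustrative sketch, not a default from the docs: the component name `transformer` and the hyperparameter values are assumptions taken from the snippet later in this diff.

```ini
[components.transformer]
factory = "curated_transformer"

[components.transformer.model]
@architectures = "spacy-curated-transformers.XlmrTransformer.v1"
vocab_size = 250002
num_hidden_layers = 12
hidden_width = 768

[components.transformer.model.piece_encoder]
@architectures = "spacy-curated-transformers.XlmrSentencepieceEncoder.v1"
```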
@@ -100,7 +100,7 @@ https://github.com/explosion/spacy-curated-transformers/blob/main/spacy_curated_
 > "@architectures": "spacy-curated-transformers.XlmrTransformer.v1",
 > "vocab_size": 250002,
 > "num_hidden_layers": 12,
-> "hidden_width": 768
+> "hidden_width": 768,
 > "piece_encoder": {
 > "@architectures": "spacy-curated-transformers.XlmrSentencepieceEncoder.v1"
 > }
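The JSON snippet in this hunk is the `model` block of the component config; the one-line change adds the comma that the nested `piece_encoder` entry requires. As a rough sketch, the same settings could be written as a plain Python dict and handed to the `config` argument of `nlp.add_pipe`, as described in the first hunk. Only the dict is constructed here; the `nlp.add_pipe` call in the comment is the documented pattern, not executed code.

```python
# Model settings from the snippet above, as a plain Python dict.
# Passing it along the lines of
#   nlp.add_pipe("curated_transformer", config={"model": model_cfg})
# is the override pattern the docs describe (not executed here).
model_cfg = {
    "@architectures": "spacy-curated-transformers.XlmrTransformer.v1",
    "vocab_size": 250002,
    "num_hidden_layers": 12,
    "hidden_width": 768,
    "piece_encoder": {
        "@architectures": "spacy-curated-transformers.XlmrSentencepieceEncoder.v1"
    },
}

print(model_cfg["vocab_size"])  # 250002
```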
@@ -146,7 +146,7 @@ and all pipeline components are applied to the `Doc` in order. Both
 > doc = nlp("This is a sentence.")
 > trf = nlp.add_pipe("curated_transformer")
 > # This usually happens under the hood
-> processed = transformer(doc)
+> processed = trf(doc)
 > ```
 
 | Name | Description |
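The one-line fix in this hunk makes the snippet call the variable that was actually defined (`trf`, returned by `nlp.add_pipe`). The pattern it relies on is that a spaCy pipeline component is a callable taking a `Doc` and returning that same `Doc`, mutated in place. The classes below are hypothetical stand-ins to illustrate that protocol without the real `spacy` / `curated_transformer` machinery:

```python
# Hypothetical stand-ins for spacy.tokens.Doc and a pipeline
# component -- NOT the real spaCy API, just the call protocol.

class Doc:
    """Minimal stand-in for spacy.tokens.Doc."""
    def __init__(self, text):
        self.text = text
        self.annotations = {}

class CuratedTransformer:
    """Stand-in component: annotates the Doc and returns it."""
    def __call__(self, doc):
        doc.annotations["curated_transformer"] = "embeddings would be set here"
        return doc  # components return the (mutated) Doc

doc = Doc("This is a sentence.")
trf = CuratedTransformer()
processed = trf(doc)     # same call shape as the diff's `trf(doc)`
assert processed is doc  # the component returns the very same Doc
```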