Merge pull request #8466 from explosion/docs/new-in-v3-1 [ci skip]

Ines Montani 2021-07-06 22:20:24 +10:00 committed by GitHub
commit 04a9ade40f
11 changed files with 497 additions and 8 deletions

View File

@@ -16,6 +16,7 @@ menu:
- ['package', 'package']
- ['project', 'project']
- ['ray', 'ray']
- ['huggingface-hub', 'huggingface-hub']
---
spaCy's CLI provides a range of helpful commands for downloading and training
@@ -1276,3 +1277,49 @@ $ python -m spacy ray train [config_path] [--code] [--output] [--n-workers] [--a
| `--verbose`, `-V` | Display more information for debugging purposes. ~~bool (flag)~~ |
| `--help`, `-h` | Show help message and available arguments. ~~bool (flag)~~ |
| overrides | Config parameters to override. Should be options starting with `--` that correspond to the config section and value to override, e.g. `--paths.train ./train.spacy`. ~~Any (option/flag)~~ |
## huggingface-hub {#huggingface-hub new="3.1"}
The `spacy huggingface-hub` CLI includes commands for uploading your trained
spaCy pipelines to the [Hugging Face Hub](https://huggingface.co/).
> #### Installation
>
> ```cli
> $ pip install spacy-huggingface-hub
> $ huggingface-cli login
> ```
<Infobox variant="warning">
To use this command, you need the
[`spacy-huggingface-hub`](https://github.com/explosion/spacy-huggingface-hub)
package installed. Installing the package will automatically add the
`huggingface-hub` command to the spaCy CLI.
</Infobox>
### huggingface-hub push {#huggingface-hub-push tag="command"}
Push a spaCy pipeline to the Hugging Face Hub. Expects a `.whl` file packaged
with [`spacy package`](/api/cli#package) and `--build wheel`. For more details,
see the spaCy project [integration](/usage/projects#huggingface_hub).
```cli
$ python -m spacy huggingface-hub push [whl_path] [--org] [--msg] [--local-repo] [--verbose]
```
> #### Example
>
> ```cli
> $ python -m spacy huggingface-hub push en_ner_fashion-0.0.0-py3-none-any.whl
> ```
| Name | Description |
| -------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------- |
| `whl_path` | The path to the `.whl` file packaged with [`spacy package`](https://spacy.io/api/cli#package). ~~Path (positional)~~ |
| `--org`, `-o` | Optional name of organization to which the pipeline should be uploaded. ~~str (option)~~ |
| `--msg`, `-m` | Commit message to use for update. Defaults to `"Update spaCy pipeline"`. ~~str (option)~~ |
| `--local-repo`, `-l` | Local path to the model repository (will be created if it doesn't exist). Defaults to `hub` in the current working directory. ~~Path (option)~~ |
| `--verbose`, `-V` | Output additional info for debugging, e.g. the full generated hub metadata. ~~bool (flag)~~  |
| **UPLOADS** | The pipeline to the hub. |
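The package can also be called from Python. A minimal sketch, assuming the `push` helper exposed by `spacy-huggingface-hub` (the exact return value is an assumption here):

```python
# A sketch using the push helper from spacy-huggingface-hub. Assumes you've
# already run `huggingface-cli login` and built the wheel with
# `spacy package ... --build wheel`.
from spacy_huggingface_hub import push

result = push("./en_ner_fashion-0.0.0-py3-none-any.whl")
print(result)  # details of the uploaded pipeline, e.g. its URL on the Hub
```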

View File

@@ -82,7 +82,7 @@ shortcut for this and instantiate the component using its string name and
| `moves` | A list of transition names. Inferred from the data if set to `None`, which is the default. ~~Optional[List[str]]~~ |
| _keyword-only_ | |
| `update_with_oracle_cut_size` | During training, cut long sequences into shorter segments by creating intermediate states based on the gold-standard history. The model is not very sensitive to this parameter, so you usually won't need to change it. Defaults to `100`. ~~int~~ |
- | `incorrect_spans_key` | Identifies spans that are known to be incorrect entity annotations. The incorrect entity annotations can be stored in the span group, under this key. Defaults to `None`. ~~Optional[str]~~ |
+ | `incorrect_spans_key` | Identifies spans that are known to be incorrect entity annotations. The incorrect entity annotations can be stored in the span group in [`Doc.spans`](/api/doc#spans), under this key. Defaults to `None`. ~~Optional[str]~~ |
## EntityRecognizer.\_\_call\_\_ {#call tag="method"}

Binary file not shown (new image, 110 KiB)

Binary file not shown (new image, 304 KiB)

View File

@@ -49,6 +49,7 @@ production.
<Integration title="FastAPI" logo="fastapi" url="#fastapi">Serve your models and host APIs</Integration>
<Integration title="Ray" logo="ray" url="#ray">Distributed and parallel training</Integration>
<Integration title="Weights &amp; Biases" logo="wandb" url="#wandb">Track your experiments and results</Integration>
<Integration title="Hugging Face Hub" logo="huggingface_hub" url="#huggingface_hub">Upload your pipelines to the Hugging Face Hub</Integration>
</Grid>
### 1. Clone a project template {#clone}
@@ -1013,3 +1014,68 @@ creating variants of the config for a simple hyperparameter grid search and
logging the results.
</Project>
---
### Hugging Face Hub {#huggingface_hub} <IntegrationLogo name="huggingface_hub" width={175} height="auto" align="right" />
The [Hugging Face Hub](https://huggingface.co/) lets you upload models and share
them with others. It hosts models as Git-based repositories which are storage
spaces that can contain all your files. It supports versioning, branches and
custom metadata out-of-the-box, and provides browser-based visualizers for
exploring your models interactively, as well as an API for production use. The
[`spacy-huggingface-hub`](https://github.com/explosion/spacy-huggingface-hub)
package automatically adds the `huggingface-hub` command to your `spacy` CLI if
it's installed.
> #### Installation
>
> ```cli
> $ pip install spacy-huggingface-hub
> # Check that the CLI is registered
> $ python -m spacy huggingface-hub --help
> ```
You can then upload any pipeline packaged with
[`spacy package`](/api/cli#package). Make sure to set `--build wheel` to output
a binary `.whl` file. The uploader will read all metadata from the pipeline
package, including the auto-generated pretty `README.md` and the model details
available in the `meta.json`. For examples, check out the
[spaCy pipelines](https://huggingface.co/spacy) we've uploaded.
```cli
$ huggingface-cli login
$ python -m spacy package ./en_ner_fashion ./output --build wheel
$ cd ./output/en_ner_fashion-0.0.0/dist
$ python -m spacy huggingface-hub push en_ner_fashion-0.0.0-py3-none-any.whl
```
After uploading, you will see the live URL of your pipeline packages, as well as
the direct URL to the model wheel you can install via `pip install`. You'll also
be able to test your pipeline interactively from your browser:
![Screenshot: interactive NER visualizer](../images/huggingface_hub.jpg)
In your `project.yml`, you can add a command that uploads your trained and
packaged pipeline to the hub. You can either run this as a manual step, or
automatically as part of a workflow. Make sure to set `--build wheel` when
running `spacy package` to build a wheel file for your pipeline package.
<!-- prettier-ignore -->
```yaml
### project.yml
- name: "push_to_hub"
help: "Upload the trained model to the Hugging Face Hub"
script:
- "python -m spacy huggingface-hub push packages/en_${vars.name}-${vars.version}/dist/en_${vars.name}-${vars.version}-py3-none-any.whl"
deps:
- "packages/en_${vars.name}-${vars.version}/dist/en_${vars.name}-${vars.version}-py3-none-any.whl"
```
<Project id="integrations/huggingface_hub">
Get started with uploading your models to the Hugging Face Hub using our project
template. It trains a simple pipeline, packages it and uploads it if the
packaged model has changed. This makes it easy to deploy your models end-to-end.
</Project>

website/docs/usage/v3-1.md Normal file
View File

@@ -0,0 +1,309 @@
---
title: What's New in v3.1
teaser: New features and how to upgrade
menu:
- ['New Features', 'features']
- ['Upgrading Notes', 'upgrading']
---
## New Features {#features hidden="true"}
It's been great to see the adoption of the new spaCy v3, which introduced
[transformer-based](/usage/embeddings-transformers) pipelines, a new
[config and training system](/usage/training) for reproducible experiments,
[projects](/usage/projects) for end-to-end workflows, and many
[other features](/usage/v3). Version 3.1 adds more on top of it, including the
ability to use predicted annotations during training, a new `SpanCategorizer`
component for predicting arbitrary and potentially overlapping spans, support
for partial incorrect annotations in the entity recognizer, new trained
pipelines for Catalan and Danish, as well as many bug fixes and improvements.
### Using predicted annotations during training {#predicted-annotations-training}
By default, components are updated in isolation during training, which means
that they don't see the predictions of any earlier components in the pipeline.
The new
[`[training.annotating_components]`](/usage/training#annotating-components)
config setting lets you specify pipeline components that should set annotations
on the predicted docs during training. This makes it easy to use the predictions
of a previous component in the pipeline as features for a subsequent component,
e.g. the dependency labels in the tagger:
```ini
### config.cfg (excerpt) {highlight="7,12"}
[nlp]
pipeline = ["parser", "tagger"]
[components.tagger.model.tok2vec.embed]
@architectures = "spacy.MultiHashEmbed.v1"
width = ${components.tagger.model.tok2vec.encode.width}
attrs = ["NORM","DEP"]
rows = [5000,2500]
include_static_vectors = false
[training]
annotating_components = ["parser"]
```
<Project id="pipelines/tagger_parser_predicted_annotations">
This project shows how to use the `token.dep` attribute predicted by the parser
as a feature for a subsequent tagger component in the pipeline.
</Project>
### SpanCategorizer for predicting arbitrary and overlapping spans {#spancategorizer tag="experimental"}
A common task in applied NLP is extracting spans of text from documents,
including longer phrases or nested expressions. Named entity recognition isn't
the right tool for this problem, since an entity recognizer typically predicts
single token-based tags that are very sensitive to boundaries. This is effective
for proper nouns and self-contained expressions, but less useful for other types
of phrases or overlapping spans. The new
[`SpanCategorizer`](/api/spancategorizer) component and
[SpanCategorizer](/api/architectures#spancategorizer) architecture let you label
arbitrary and potentially overlapping spans of text. A span categorizer
consists of two parts: a [suggester function](/api/spancategorizer#suggesters)
that proposes candidate spans, which may or may not overlap, and a labeler model
that predicts zero or more labels for each candidate. The predicted spans are
available via the [`Doc.spans`](/api/doc#spans) container.
<!-- TODO: example, getting started (init config?), maybe project template -->
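As a minimal sketch of reading the predictions (the package name
`en_spancat_demo` is hypothetical, and the default spans key `"sc"` is
assumed):

```python
import spacy

# Hypothetical trained package that includes a span categorizer ("spancat")
# using the default spans key "sc"
nlp = spacy.load("en_spancat_demo")
doc = nlp("The slim-fit chinos pair well with a relaxed linen overshirt.")
# All candidate spans with one or more predicted labels end up in Doc.spans,
# and they may overlap or be nested
for span in doc.spans["sc"]:
    print(span.start, span.end, span.text, span.label_)
```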
<Infobox title="Tip: Create data with Prodigy's new span annotation UI">
[![Prodigy: example of the new manual spans UI](../images/prodigy_spans-manual.jpg)](https://support.prodi.gy/t/3861)
The upcoming version of our annotation tool [Prodigy](https://prodi.gy)
(currently available as a [pre-release](https://support.prodi.gy/t/3861) for all
users) features a [new workflow and UI](https://support.prodi.gy/t/3861) for
annotating overlapping and nested spans. You can use it to create training data
for spaCy's `SpanCategorizer` component.
</Infobox>
### Update the entity recognizer with partial incorrect annotations {#negative-samples}
> #### config.cfg (excerpt)
>
> ```ini
> [components.ner]
> factory = "ner"
> incorrect_spans_key = "incorrect_spans"
> moves = null
> update_with_oracle_cut_size = 100
> ```
The [`EntityRecognizer`](/api/entityrecognizer) can now be updated with known
incorrect annotations, which lets you take advantage of partial and sparse data.
For example, you'll be able to use the information that certain spans of text
are definitely **not** `PERSON` entities, without having to provide the complete
gold-standard annotations for the given example. The incorrect span annotations
can be added to [`Doc.spans`](/api/doc#spans) in the training data under
the key defined as [`incorrect_spans_key`](/api/entityrecognizer#init) in the
component config.
```python
import spacy
from spacy.tokens import Span

nlp = spacy.blank("en")
train_doc = nlp.make_doc("Barack Obama was born in Hawaii.")
# The Doc.spans key is defined via incorrect_spans_key in the component config
train_doc.spans["incorrect_spans"] = [
    Span(train_doc, 0, 2, label="ORG"),
    Span(train_doc, 5, 6, label="PRODUCT"),
]
```
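As a sketch of how such a doc could feed into an update step, assuming the
pipeline contains an `ner` component configured with
`incorrect_spans_key = "incorrect_spans"` (as in the config excerpt above) and
initialized for training:

```python
from spacy.training import Example

# The reference doc only carries the known-incorrect spans; no complete
# gold-standard entity annotation is required for the example to be useful
example = Example(nlp.make_doc(train_doc.text), train_doc)
losses = nlp.update([example])
```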
<!-- TODO: more details and/or example project? -->
### New pipeline packages for Catalan and Danish {#pipeline-packages}
spaCy v3.1 adds 5 new pipeline packages, including a new core family for Catalan
and a new transformer-based pipeline for Danish using the
[`danish-bert-botxo`](http://huggingface.co/Maltehb/danish-bert-botxo) weights.
See the [models directory](/models) for an overview of all available trained
pipelines and the [training guide](/usage/training) for details on how to train
your own.
> Thanks to Carlos Rodríguez Penagos and the
> [Barcelona Supercomputing Center](https://temu.bsc.es/) for their
> contributions for Catalan and to Kenneth Enevoldsen for Danish. For additional
> Danish pipelines, check out [DaCy](https://github.com/KennethEnevoldsen/DaCy).
| Package | Language | UPOS | Parser LAS |  NER F |
| ------------------------------------------------- | -------- | ---: | ---------: | -----: |
| [`ca_core_news_sm`](/models/ca#ca_core_news_sm) | Catalan | 98.2 | 87.4 | 79.8 |
| [`ca_core_news_md`](/models/ca#ca_core_news_md) | Catalan | 98.3 | 88.2 | 84.0 |
| [`ca_core_news_lg`](/models/ca#ca_core_news_lg) | Catalan | 98.5 | 88.4 | 84.2 |
| [`ca_core_news_trf`](/models/ca#ca_core_news_trf) | Catalan | 98.9 | 93.0 | 91.2 |
| [`da_core_news_trf`](/models/da#da_core_news_trf) | Danish | 98.0 | 85.0 | 82.9 |
### Resizable text classification architectures {#resizable-textcat}
Previously, the [`TextCategorizer`](/api/textcategorizer) architectures could
not be resized, meaning that you couldn't add new labels to an already trained
model. In spaCy v3.1, the [TextCatCNN](/api/architectures#TextCatCNN) and
[TextCatBOW](/api/architectures#TextCatBOW) architectures are now resizable,
while ensuring that the predictions for the old labels remain the same.
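For example (a minimal sketch; `my_textcat_pipeline` stands in for a pipeline
trained with one of these architectures):

```python
import spacy

nlp = spacy.load("my_textcat_pipeline")  # hypothetical trained pipeline
textcat = nlp.get_pipe("textcat")
# With the resizable architectures, labels can be added after training while
# predictions for the existing labels stay the same
textcat.add_label("FEEDBACK")
```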
### CLI command to assemble pipeline from config {#assemble}
The [`spacy assemble`](/api/cli#assemble) command lets you assemble a pipeline
from a config file without additional training. It can be especially useful for
creating a blank pipeline with a custom tokenizer, rule-based components or word
vectors.
```cli
$ python -m spacy assemble config.cfg ./output
```
### Pretty pipeline package READMEs {#package-readme}
The [`spacy package`](/api/cli#package) command now auto-generates a pretty
`README.md` based on the pipeline information defined in the `meta.json`. This
includes a table with a general overview, as well as the label scheme and
accuracy figures, if available. For an example, see the
[model releases](https://github.com/explosion/spacy-models/releases).
### Support for streaming large or infinite corpora {#streaming-corpora}
> #### config.cfg (excerpt)
>
> ```ini
> [training]
> max_epochs = -1
> ```
The training process now supports streaming large or infinite corpora
out-of-the-box, which can be controlled via the
[`[training.max_epochs]`](/api/data-formats#training) config setting. Setting it
to `-1` means that the train corpus should be streamed rather than loaded into
memory, with no shuffling within the training loop. For details on how to
implement a custom corpus loader, e.g. to stream in data from a remote storage,
see the usage guide on
[custom data reading](/usage/training#custom-code-readers-batchers).
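For illustration, a minimal sketch of such a reader; the registered name
`stream_jsonl_corpus.v1`, the JSONL input format and the `cats` field are
assumptions for this example, not part of spaCy:

```python
from typing import Callable, Iterable, Iterator

import spacy
import srsly
from spacy.language import Language
from spacy.training import Example

@spacy.registry.readers("stream_jsonl_corpus.v1")
def create_jsonl_reader(path: str) -> Callable[[Language], Iterable[Example]]:
    def read_examples(nlp: Language) -> Iterator[Example]:
        # Yield examples one at a time instead of loading the corpus into memory
        for record in srsly.read_jsonl(path):
            doc = nlp.make_doc(record["text"])
            yield Example.from_dict(doc, {"cats": record["cats"]})
    return read_examples
```

The reader can then be referenced from `[corpora.train]` via
`@readers = "stream_jsonl_corpus.v1"`, together with `max_epochs = -1` for
streaming.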
When streaming a corpus, only the first 100 examples will be used for
[initialization](/usage/training#config-lifecycle). This is no problem if you're
training a component like the text classifier with data that specifies all
available labels in every example. If necessary, you can use the
[`init labels`](/api/cli#init-labels) command to pre-generate the labels for
your components using a representative sample so the model can be initialized
correctly before training.
### New lemmatizers for Catalan and Italian {#pos-lemmatizers}
The trained pipelines for [Catalan](/models/ca) and [Italian](/models/it) now
include lemmatizers that use the predicted part-of-speech tags as part of the
lookup lemmatization for higher lemmatization accuracy. If you're training your
own pipelines for these languages and you want to include a lemmatizer, make
sure you have the
[`spacy-lookups-data`](https://github.com/explosion/spacy-lookups-data) package
installed, which provides the relevant tables.
### Upload your pipelines to the Hugging Face Hub {#huggingface-hub}
The [Hugging Face Hub](https://huggingface.co/) lets you upload models and share
them with others, and it now supports spaCy pipelines out-of-the-box. The new
[`spacy-huggingface-hub`](https://github.com/explosion/spacy-huggingface-hub)
package automatically adds the `huggingface-hub` command to your `spacy` CLI. It
lets you upload any pipelines packaged with [`spacy package`](/api/cli#package)
and `--build wheel` and takes care of auto-generating all required meta
information.
After uploading, you'll get a live URL for your model page that includes all
details, files and interactive visualizers, as well as a direct URL to the wheel
file that you can install via `pip install`. For examples, check out the
[spaCy pipelines](https://huggingface.co/spacy) we've uploaded.
```cli
$ pip install spacy-huggingface-hub
$ huggingface-cli login
$ python -m spacy package ./en_ner_fashion ./output --build wheel
$ cd ./output/en_ner_fashion-0.0.0/dist
$ python -m spacy huggingface-hub push en_ner_fashion-0.0.0-py3-none-any.whl
```
You can also integrate the upload command into your
[project template](/usage/projects#huggingface_hub) to automatically upload your
packaged pipelines after training.
<Project id="integrations/huggingface_hub">
Get started with uploading your models to the Hugging Face Hub using our project
template. It trains a simple pipeline, packages it and uploads it if the
packaged model has changed. This makes it easy to deploy your models end-to-end.
</Project>
## Notes about upgrading from v3.0 {#upgrading}
### Pipeline package version compatibility {#version-compat}
> #### Using legacy implementations
>
> In spaCy v3, you'll still be able to load and reference legacy implementations
> via [`spacy-legacy`](https://github.com/explosion/spacy-legacy), even if the
> components or architectures change and newer versions are available in the
> core library.
When you're loading a pipeline package trained with spaCy v3.0, you will see a
warning telling you that the pipeline may be incompatible. This doesn't
necessarily have to be true, but we recommend running your pipelines against
your test suite or evaluation data to make sure there are no unexpected results.
If you're using one of the [trained pipelines](/models) we provide, you should
run [`spacy download`](/api/cli#download) to update to the latest version. To
see an overview of all installed packages and their compatibility, you can run
[`spacy validate`](/api/cli#validate).
If you've trained your own custom pipeline and you've confirmed that it's still
working as expected, you can update the spaCy version requirements in the
[`meta.json`](/api/data-formats#meta):
```diff
- "spacy_version": ">=3.0.0,<3.1.0",
+ "spacy_version": ">=3.0.0,<3.2.0",
```
### Updating v3.0 configs
To update a config from spaCy v3.0 with the new v3.1 settings, run
[`init fill-config`](/api/cli#init-fill-config):
```bash
python -m spacy init fill-config config-v3.0.cfg config-v3.1.cfg
```
In many cases (`spacy train`, `spacy.load()`), the new defaults will be filled
in automatically, but you'll need to fill in the new settings to run
[`debug config`](/api/cli#debug) and [`debug data`](/api/cli#debug-data).
### Sourcing pipeline components with vectors {#source-vectors}
If you're sourcing a pipeline component that requires static vectors (for
example, a tagger or parser from an `md` or `lg` pretrained pipeline), be sure
to include the source model's vectors in the setting `[initialize.vectors]`. In
spaCy v3.0, a bug allowed vectors to be loaded implicitly through `source`; in
v3.1, this setting must be provided explicitly as `[initialize.vectors]`:
```ini
### config.cfg (excerpt)
[components.ner]
source = "en_core_web_md"
[initialize]
vectors = "en_core_web_md"
```
<Infobox title="Important note" variant="warning">
Each pipeline can only store one set of static vectors, so it's not possible to
assemble a pipeline with components that were trained on different static
vectors.
</Infobox>
[`spacy train`](/api/cli#train) and [`spacy assemble`](/api/cli#assemble) will
provide warnings if the source and target pipelines don't contain the same
vectors. If you are sourcing a rule-based component like an entity ruler or
lemmatizer that does not use the vectors as a model feature, then this warning
can be safely ignored.

View File

@@ -9,7 +9,8 @@
{ "text": "Models & Languages", "url": "/usage/models" },
{ "text": "Facts & Figures", "url": "/usage/facts-figures" },
{ "text": "spaCy 101", "url": "/usage/spacy-101" },
{ "text": "New in v3.0", "url": "/usage/v3" }
{ "text": "New in v3.0", "url": "/usage/v3" },
{ "text": "New in v3.1", "url": "/usage/v3-1" }
]
},
{
@@ -136,9 +137,7 @@
},
{
"label": "Legacy",
"items": [
{ "text": "Legacy functions", "url": "/api/legacy" }
]
"items": [{ "text": "Legacy functions", "url": "/api/legacy" }]
}
]
}

View File

@@ -14,7 +14,7 @@ import GitHubCode from './github'
import classes from '../styles/code.module.sass'
const WRAP_THRESHOLD = 30
- const CLI_GROUPS = ['init', 'debug', 'project', 'ray']
+ const CLI_GROUPS = ['init', 'debug', 'project', 'ray', 'huggingface-hub']
export default props => (
<Pre>

View File

@@ -0,0 +1,66 @@
<svg width="2258" height="309" viewBox="0 0 2258 309" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M1592.43 22.3918H2224.99V273.72H1592.43V22.3918Z" fill="#FF6108"/>
<path d="M1592.43 22.3918H2224.99V273.72H1592.43V22.3918Z" fill="url(#paint0_linear)"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M1616.42 65.0073C1616.42 56.5344 1621.28 50.4248 1628.08 49.4956H1663.32C1669.96 50.4019 1674.74 56.2368 1674.97 64.3857V110.101C1674.97 123.369 1685.41 134.125 1698.3 134.125C1711.19 134.125 1721.63 123.369 1721.63 110.101V65.0073H1721.79C1721.79 56.5344 1726.65 50.4248 1733.45 49.4956H1767.69C1773.98 50.3555 1778.61 55.6504 1779.27 63.1489V245.17C1778.57 253.17 1773.35 258.662 1766.4 258.936H1734.74C1727.27 258.641 1721.79 252.309 1721.79 243.312H1721.63V194.385C1721.63 181.117 1711.19 172.282 1698.3 172.282C1685.41 172.282 1674.97 181.117 1674.97 194.385V243.933C1674.72 252.599 1669.33 258.648 1662.03 258.936H1629.31C1621.84 258.641 1616.37 252.309 1616.37 243.312H1616.36V65.0073H1616.42ZM1827.12 57.2922C1827.12 54.5674 1826.7 51.946 1825.9 49.4956H1884.78C1883.87 52.0574 1883.38 54.8223 1883.38 57.7058V172.282C1883.38 185.55 1893.82 196.306 1906.71 196.306C1919.6 196.306 1930.04 185.55 1930.04 172.282V57.7058C1930.04 54.8223 1929.55 52.0574 1928.64 49.4956H1987.18C1986.39 51.9458 1985.96 54.5674 1985.96 57.2922V225.885C1984.96 244.146 1969.84 258.647 1951.34 258.647V258.936H1861.84V258.647C1842.69 258.647 1827.17 243.124 1827.17 223.975L1827.12 224.038V57.2922ZM2033.88 236.861H2033.82V66.6709C2034.12 53.4836 2036.11 50.1069 2049.31 49.4956H2171.32V49.5181L2171.76 49.4956H2174.63C2187.96 50.0525 2209.41 58.3354 2209.41 89.0737C2209.41 112.256 2203.22 121.2 2197.7 129.168C2193.27 135.568 2189.27 141.339 2189.27 153.352C2189.27 165.827 2193.58 172.628 2198.21 179.938C2203.59 188.432 2209.41 197.613 2209.41 217.181C2209.41 238.143 2198.67 257.833 2173.56 258.936H2055.01C2036.6 258.871 2033.88 257.714 2033.88 236.861ZM1625.93 0.215576C1594.21 0.215576 1568.5 26.6917 1568.5 59.3516V249.08C1568.5 281.74 1594.21 308.216 1625.93 308.216H2200.07C2231.79 308.216 2257.5 281.74 2257.5 249.08V59.3516C2257.5 26.6917 2231.79 0.215576 2200.07 0.215576H1625.93ZM2141.55 116.024C2141.55 104.28 2132.3 94.7593 2120.9 94.7593H2109.72C2098.32 94.7593 2089.07 104.28 2089.07 116.024C2089.07 127.768 2098.32 137.288 2109.72 137.288H2120.9C2132.3 137.288 2141.55 127.768 2141.55 116.024ZM2120.9 178.787C2132.3 178.787 2141.55 188.307 2141.55 200.051C2141.55 211.795 2132.3 221.316 2120.9 221.316H2109.72C2098.32 221.316 2089.07 211.795 2089.07 200.051C2089.07 188.307 2098.32 178.787 2109.72 178.787H2120.9Z" fill="black"/>
<path d="M0.669922 234.41V71.0908H43.8868V131.896H92.6314V71.0908H135.848V234.41H92.6314V169.585H43.8868V234.41H0.669922Z" fill="url(#paint1_linear)"/>
<path d="M206.614 237.426C199.746 237.426 193.799 236.253 188.774 233.908C183.916 231.395 179.896 227.961 176.714 223.606C173.531 219.084 171.186 213.723 169.678 207.525C168.171 201.16 167.417 194.041 167.417 186.168V108.78H210.634V180.641C210.634 188.681 211.639 194.041 213.649 196.721C215.659 199.401 218.842 200.741 223.197 200.741C227.217 200.741 230.483 199.904 232.996 198.229C235.676 196.554 238.44 193.706 241.288 189.686V108.78H284.505V234.41H249.328L246.313 217.827H245.308C240.283 223.857 234.671 228.631 228.473 232.149C222.443 235.667 215.156 237.426 206.614 237.426Z" fill="url(#paint2_linear)"/>
<path d="M359.412 286.17C352.209 286.17 345.425 285.584 339.06 284.411C332.694 283.406 327.083 281.647 322.225 279.135C317.535 276.622 313.766 273.356 310.918 269.336C308.071 265.483 306.647 260.709 306.647 255.014C306.647 245.131 312.51 237.091 324.235 230.893V229.888C320.885 227.543 318.121 224.695 315.944 221.345C313.766 217.995 312.677 213.64 312.677 208.279C312.677 203.924 313.933 199.653 316.446 195.465C318.959 191.277 322.393 187.676 326.748 184.661V183.656C322.225 180.641 318.205 176.369 314.687 170.841C311.337 165.314 309.662 158.865 309.662 151.494C309.662 143.789 311.17 137.089 314.185 131.393C317.2 125.531 321.22 120.757 326.245 117.072C331.27 113.219 336.966 110.371 343.331 108.529C349.864 106.686 356.564 105.765 363.432 105.765C370.802 105.765 377.503 106.77 383.533 108.78H430.267V139.936H412.177C412.679 141.444 413.098 143.37 413.433 145.715C413.935 148.06 414.187 150.489 414.187 153.002C414.187 160.372 412.847 166.737 410.166 172.098C407.486 177.29 403.885 181.562 399.362 184.912C394.84 188.262 389.479 190.775 383.281 192.45C377.084 193.957 370.467 194.711 363.432 194.711C359.244 194.711 354.554 194.125 349.361 192.952C348.021 194.125 347.184 195.214 346.849 196.219C346.514 197.056 346.346 198.396 346.346 200.239C346.346 203.087 347.686 205.097 350.366 206.269C353.046 207.274 357.569 207.777 363.934 207.777H382.779C398.525 207.777 410.669 210.373 419.212 215.566C427.922 220.591 432.277 228.883 432.277 240.441C432.277 247.308 430.519 253.506 427.001 259.034C423.651 264.729 418.793 269.587 412.428 273.607C406.23 277.627 398.608 280.726 389.563 282.904C380.685 285.081 370.635 286.17 359.412 286.17ZM363.432 169.083C367.452 169.083 370.718 167.659 373.231 164.811C375.744 161.964 377 157.525 377 151.494C377 145.632 375.744 141.36 373.231 138.68C370.718 135.832 367.452 134.409 363.432 134.409C359.412 134.409 356.145 135.832 353.633 138.68C351.12 141.36 349.864 145.632 349.864 151.494C349.864 157.525 351.12 161.964 353.633 164.811C356.145 167.659 359.412 169.083 363.432 169.083ZM366.447 259.537C373.482 259.537 379.345 258.448 384.035 256.27C388.725 254.26 391.071 251.496 391.071 247.979C391.071 244.628 389.563 242.535 386.548 241.697C383.7 240.859 379.596 240.441 374.236 240.441H364.437C359.412 240.441 355.559 240.273 352.879 239.938C350.366 239.771 348.189 239.436 346.346 238.933C344.839 240.441 343.666 241.865 342.828 243.205C342.158 244.545 341.823 246.136 341.823 247.979C341.823 251.999 344.085 254.93 348.607 256.773C353.13 258.615 359.077 259.537 366.447 259.537Z" fill="url(#paint3_linear)"/>
<path d="M496.575 286.17C489.372 286.17 482.588 285.584 476.223 284.411C469.857 283.406 464.246 281.647 459.388 279.135C454.698 276.622 450.929 273.356 448.081 269.336C445.234 265.483 443.81 260.709 443.81 255.014C443.81 245.131 449.673 237.091 461.398 230.893V229.888C458.048 227.543 455.284 224.695 453.107 221.345C450.929 217.995 449.84 213.64 449.84 208.279C449.84 203.924 451.096 199.653 453.609 195.465C456.122 191.277 459.556 187.676 463.911 184.661V183.656C459.388 180.641 455.368 176.369 451.85 170.841C448.5 165.314 446.825 158.865 446.825 151.494C446.825 143.789 448.333 137.089 451.348 131.393C454.363 125.531 458.383 120.757 463.408 117.072C468.433 113.219 474.129 110.371 480.494 108.529C487.027 106.686 493.727 105.765 500.595 105.765C507.965 105.765 514.665 106.77 520.696 108.78H567.43V139.936H549.339C549.842 141.444 550.261 143.37 550.596 145.715C551.098 148.06 551.35 150.489 551.35 153.002C551.35 160.372 550.01 166.737 547.329 172.098C544.649 177.29 541.048 181.562 536.525 184.912C532.002 188.262 526.642 190.775 520.444 192.45C514.247 193.957 507.63 194.711 500.595 194.711C496.407 194.711 491.717 194.125 486.524 192.952C485.184 194.125 484.347 195.214 484.012 196.219C483.677 197.056 483.509 198.396 483.509 200.239C483.509 203.087 484.849 205.097 487.529 206.269C490.209 207.274 494.732 207.777 501.097 207.777H519.942C535.688 207.777 547.832 210.373 556.375 215.566C565.085 220.591 569.44 228.883 569.44 240.441C569.44 247.308 567.682 253.506 564.164 259.034C560.814 264.729 555.956 269.587 549.591 273.607C543.393 277.627 535.771 280.726 526.726 282.904C517.848 285.081 507.798 286.17 496.575 286.17ZM500.595 169.083C504.615 169.083 507.881 167.659 510.394 164.811C512.907 161.964 514.163 157.525 514.163 151.494C514.163 145.632 512.907 141.36 510.394 138.68C507.881 135.832 504.615 134.409 500.595 134.409C496.575 134.409 493.308 135.832 490.796 138.68C488.283 141.36 487.027 145.632 487.027 151.494C487.027 157.525 488.283 161.964 490.796 164.811C493.308 167.659 496.575 169.083 500.595 169.083ZM503.61 259.537C510.645 259.537 516.508 258.448 521.198 256.27C525.888 254.26 528.234 251.496 528.234 247.979C528.234 244.628 526.726 242.535 523.711 241.697C520.863 240.859 516.759 240.441 511.399 240.441H501.6C496.575 240.441 492.722 240.273 490.042 239.938C487.529 239.771 485.352 239.436 483.509 238.933C482.002 240.441 480.829 241.865 479.991 243.205C479.321 244.545 478.986 246.136 478.986 247.979C478.986 251.999 481.248 254.93 485.77 256.773C490.293 258.615 496.24 259.537 503.61 259.537Z" fill="url(#paint4_linear)"/>
<path d="M588.008 234.41V108.78H631.225V234.41H588.008ZM609.617 93.2018C602.581 93.2018 596.802 91.1917 592.28 87.1715C587.757 83.1513 585.496 77.9586 585.496 71.5933C585.496 65.228 587.757 60.0353 592.28 56.0151C596.802 51.995 602.581 49.9849 609.617 49.9849C616.652 49.9849 622.431 51.995 626.954 56.0151C631.476 60.0353 633.738 65.228 633.738 71.5933C633.738 77.9586 631.476 83.1513 626.954 87.1715C622.431 91.1917 616.652 93.2018 609.617 93.2018Z" fill="url(#paint5_linear)"/>
<path d="M660.393 234.41V108.78H695.569L698.585 123.856H699.59C704.615 118.998 710.31 114.81 716.675 111.293C723.208 107.607 730.746 105.765 739.289 105.765C753.024 105.765 762.991 110.455 769.189 119.835C775.387 129.048 778.486 141.444 778.486 157.022V234.41H735.269V162.55C735.269 154.509 734.264 149.149 732.254 146.469C730.243 143.789 727.061 142.449 722.706 142.449C718.685 142.449 715.335 143.286 712.655 144.962C709.975 146.637 706.96 148.982 703.61 151.997V234.41H660.393Z" fill="url(#paint6_linear)"/>
<path d="M852.364 286.17C845.161 286.17 838.377 285.584 832.012 284.411C825.647 283.406 820.035 281.647 815.177 279.135C810.487 276.622 806.718 273.356 803.871 269.336C801.023 265.483 799.599 260.709 799.599 255.014C799.599 245.131 805.462 237.091 817.188 230.893V229.888C813.837 227.543 811.074 224.695 808.896 221.345C806.718 217.995 805.63 213.64 805.63 208.279C805.63 203.924 806.886 199.653 809.398 195.465C811.911 191.277 815.345 187.676 819.7 184.661V183.656C815.177 180.641 811.157 176.369 807.64 170.841C804.29 165.314 802.614 158.865 802.614 151.494C802.614 143.789 804.122 137.089 807.137 131.393C810.152 125.531 814.172 120.757 819.198 117.072C824.223 113.219 829.918 110.371 836.283 108.529C842.816 106.686 849.516 105.765 856.384 105.765C863.755 105.765 870.455 106.77 876.485 108.78H923.22V139.936H905.129C905.631 141.444 906.05 143.37 906.385 145.715C906.888 148.06 907.139 150.489 907.139 153.002C907.139 160.372 905.799 166.737 903.119 172.098C900.439 177.29 896.837 181.562 892.315 184.912C887.792 188.262 882.432 190.775 876.234 192.45C870.036 193.957 863.42 194.711 856.384 194.711C852.197 194.711 847.506 194.125 842.314 192.952C840.974 194.125 840.136 195.214 839.801 196.219C839.466 197.056 839.299 198.396 839.299 200.239C839.299 203.087 840.639 205.097 843.319 206.269C845.999 207.274 850.522 207.777 856.887 207.777H875.731C891.477 207.777 903.621 210.373 912.164 215.566C920.875 220.591 925.23 228.883 925.23 240.441C925.23 247.308 923.471 253.506 919.953 259.034C916.603 264.729 911.745 269.587 905.38 273.607C899.182 277.627 891.561 280.726 882.515 282.904C873.638 285.081 863.587 286.17 852.364 286.17ZM856.384 169.083C860.404 169.083 863.671 167.659 866.183 164.811C868.696 161.964 869.952 157.525 869.952 151.494C869.952 145.632 868.696 141.36 866.183 138.68C863.671 135.832 860.404 134.409 856.384 134.409C852.364 134.409 849.098 135.832 846.585 138.68C844.072 141.36 842.816 145.632 842.816 151.494C842.816 157.525 844.072 161.964 846.585 164.811C849.098 167.659 852.364 169.083 856.384 169.083ZM859.399 259.537C866.435 259.537 872.297 258.448 876.988 256.27C881.678 254.26 884.023 251.496 884.023 247.979C884.023 244.628 882.515 242.535 879.5 241.697C876.653 240.859 872.549 240.441 867.188 240.441H857.389C852.364 240.441 848.511 240.273 845.831 239.938C843.319 239.771 841.141 239.436 839.299 238.933C837.791 240.441 836.618 241.865 835.781 243.205C835.111 244.545 834.776 246.136 834.776 247.979C834.776 251.999 837.037 254.93 841.56 256.773C846.083 258.615 852.029 259.537 859.399 259.537Z" fill="url(#paint7_linear)"/>
<path d="M997.617 234.41V71.0908H1104.15V107.272H1040.83V138.429H1095.11V174.61H1040.83V234.41H997.617Z" fill="url(#paint8_linear)"/>
<path d="M1151.41 237.426C1145.55 237.426 1140.27 236.421 1135.58 234.41C1130.89 232.233 1126.87 229.385 1123.52 225.868C1120.34 222.35 1117.91 218.246 1116.24 213.556C1114.56 208.866 1113.72 203.924 1113.72 198.731C1113.72 185.666 1119.08 175.448 1129.8 168.078C1140.52 160.54 1157.94 155.514 1182.07 153.002C1181.06 144.291 1175.2 139.936 1164.48 139.936C1159.95 139.936 1155.18 140.858 1150.16 142.7C1145.13 144.375 1139.35 146.972 1132.82 150.489L1117.74 122.348C1126.62 116.988 1135.58 112.884 1144.63 110.036C1153.67 107.189 1163.14 105.765 1173.02 105.765C1189.27 105.765 1202 110.455 1211.21 119.835C1220.59 129.048 1225.28 143.956 1225.28 164.56V234.41H1190.11L1187.09 222.35H1186.09C1181.06 226.873 1175.7 230.558 1170.01 233.405C1164.48 236.085 1158.28 237.426 1151.41 237.426ZM1166.49 204.259C1170.01 204.259 1172.85 203.505 1175.03 201.998C1177.38 200.323 1179.72 198.229 1182.07 195.716V178.63C1171.68 180.138 1164.56 182.399 1160.71 185.415C1156.86 188.43 1154.93 191.696 1154.93 195.214C1154.93 201.244 1158.78 204.259 1166.49 204.259Z" fill="url(#paint9_linear)"/>
<path d="M1310.24 237.426C1301.36 237.426 1293.07 236.002 1285.37 233.154C1277.66 230.139 1270.96 225.868 1265.27 220.34C1259.57 214.645 1255.05 207.693 1251.7 199.485C1248.52 191.277 1246.92 181.981 1246.92 171.595C1246.92 161.21 1248.77 151.913 1252.45 143.705C1256.14 135.497 1261.08 128.63 1267.28 123.102C1273.47 117.407 1280.59 113.135 1288.63 110.287C1296.84 107.272 1305.38 105.765 1314.26 105.765C1321.8 105.765 1328.58 106.937 1334.61 109.282C1340.64 111.628 1346.09 114.81 1350.95 118.83L1330.85 146.469C1328 144.124 1325.49 142.533 1323.31 141.695C1321.3 140.858 1319.12 140.439 1316.77 140.439C1308.73 140.439 1302.45 143.286 1297.93 148.982C1293.41 154.509 1291.15 162.047 1291.15 171.595C1291.15 181.143 1293.41 188.765 1297.93 194.46C1302.62 199.988 1308.4 202.752 1315.27 202.752C1318.78 202.752 1322.22 201.998 1325.57 200.49C1329.09 198.983 1332.35 197.056 1335.37 194.711L1351.95 222.852C1345.59 228.38 1338.63 232.233 1331.1 234.41C1323.73 236.421 1316.77 237.426 1310.24 237.426Z" fill="url(#paint10_linear)"/>
<path d="M1422.66 237.426C1413.45 237.426 1404.9 236.002 1397.03 233.154C1389.16 230.139 1382.29 225.868 1376.43 220.34C1370.73 214.645 1366.21 207.693 1362.86 199.485C1359.51 191.277 1357.83 181.981 1357.83 171.595C1357.83 161.377 1359.51 152.248 1362.86 144.208C1366.38 136 1370.9 129.048 1376.43 123.353C1382.12 117.658 1388.57 113.303 1395.77 110.287C1402.98 107.272 1410.43 105.765 1418.14 105.765C1427.35 105.765 1435.39 107.356 1442.26 110.539C1449.12 113.721 1454.82 118.077 1459.34 123.604C1463.86 129.132 1467.21 135.581 1469.39 142.951C1471.74 150.322 1472.91 158.195 1472.91 166.57C1472.91 170.423 1472.66 174.024 1472.16 177.374C1471.82 180.724 1471.49 183.153 1471.15 184.661H1399.54C1401.55 192.199 1405.15 197.475 1410.35 200.49C1415.54 203.338 1421.65 204.762 1428.69 204.762C1433.04 204.762 1437.23 204.175 1441.25 203.003C1445.44 201.663 1449.79 199.736 1454.32 197.224L1468.39 222.852C1461.52 227.71 1453.9 231.395 1445.52 233.908C1437.31 236.253 1429.69 237.426 1422.66 237.426ZM1399.04 156.52H1436.23C1436.23 151.662 1434.97 147.474 1432.46 143.956C1430.11 140.271 1425.67 138.429 1419.14 138.429C1414.28 138.429 1410.01 139.853 1406.33 142.7C1402.81 145.548 1400.38 150.154 1399.04 156.52Z" fill="url(#paint11_linear)"/>
<defs>
<linearGradient id="paint0_linear" x1="1908.71" y1="22.3918" x2="1908.71" y2="273.72" gradientUnits="userSpaceOnUse">
<stop stop-color="#FF7C04"/>
<stop offset="1" stop-color="#FFD21E"/>
</linearGradient>
<linearGradient id="paint1_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint2_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint3_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint4_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint5_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint6_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint7_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint8_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint9_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint10_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
<linearGradient id="paint11_linear" x1="736.79" y1="8.06926" x2="736.79" y2="315.693" gradientUnits="userSpaceOnUse">
<stop stop-color="#00183D"/>
<stop offset="1" stop-color="#0E0E0E"/>
</linearGradient>
</defs>
</svg>


View File

@@ -119,8 +119,8 @@ const AlertSpace = ({ nightly, legacy }) => {
}
const navAlert = (
<Link to="/usage/v3" hidden>
<strong>💥 Out now:</strong> spaCy v3.0
<Link to="/usage/v3-1" hidden>
<strong>💥 Out now:</strong> spaCy v3.1
</Link>
)

View File

@@ -8,6 +8,7 @@ import StreamlitLogo from '-!svg-react-loader!../images/logos/streamlit.svg'
import FastAPILogo from '-!svg-react-loader!../images/logos/fastapi.svg'
import WandBLogo from '-!svg-react-loader!../images/logos/wandb.svg'
import RayLogo from '-!svg-react-loader!../images/logos/ray.svg'
import HuggingFaceHubLogo from '-!svg-react-loader!../images/logos/huggingface_hub.svg'
const LOGOS = {
dvc: DVCLogo,
@@ -16,6 +17,7 @@ const LOGOS = {
fastapi: FastAPILogo,
wandb: WandBLogo,
ray: RayLogo,
huggingface_hub: HuggingFaceHubLogo,
}
export const IntegrationLogo = ({ name, title, width, height, maxWidth, align, ...props }) => {