Merge branch 'develop' into nightly.spacy.io

This commit is contained in:
Ines Montani 2020-09-21 10:56:01 +02:00
commit 98ed3e8864
2 changed files with 14 additions and 6 deletions


@@ -921,6 +921,14 @@ package is installed in the same environment as spaCy, it will automatically add
[parallel training](/usage/training#parallel-training) for more details on how
it works under the hood.
<Project id="integrations/ray">
Get started with parallel training using our project template. It trains a
simple model on a Universal Dependencies Treebank and lets you parallelize the
training with Ray.
</Project>
You can integrate [`spacy ray train`](/api/cli#ray-train) into your
`project.yml` just like the regular training command and pass it the config, and
optional output directory or remote storage URL and config overrides if needed.
@@ -940,10 +948,6 @@ commands:
- "training/model-best"
```
<!-- TODO: <Project id="integrations/ray">
</Project> -->
---
### Weights & Biases {#wandb} <IntegrationLogo name="wandb" width={175} height="auto" align="right" />


@@ -895,9 +895,13 @@ cluster. If it's not set, Ray will run locally.
python -m spacy ray train config.cfg --n-workers 2
```
<!-- TODO: <Project id="integrations/ray">
</Project> -->
<Project id="integrations/ray">
Get started with parallel training using our project template. It trains a
simple model on a Universal Dependencies Treebank and lets you parallelize the
training with Ray.
</Project>
### How parallel training works {#parallel-training-details}
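Both `<Project id="integrations/ray">` blocks added above point to the same project template. Assuming the template is published under that path in spaCy's projects repo, fetching and running it would follow the standard `spacy project` workflow; the target directory name and the `all` workflow are conventions, not confirmed by this diff:

```bash
# Clone the template referenced by the <Project> blocks (target directory assumed)
python -m spacy project clone integrations/ray
cd ray

# Download the template's data assets, then run its default workflow
python -m spacy project assets
python -m spacy project run all
```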