diff --git a/website/docs/usage/projects.md b/website/docs/usage/projects.md
index f8d5a3761..95e20525a 100644
--- a/website/docs/usage/projects.md
+++ b/website/docs/usage/projects.md
@@ -921,6 +921,14 @@ package is installed in the same environment as spaCy, it will automatically add
[parallel training](/usage/training#parallel-training) for more details on how
it works under the hood.
+<Project id="integrations/ray">
+
+Get started with parallel training using our project template. It trains a
+simple model on a Universal Dependencies Treebank and lets you parallelize the
+training with Ray.
+
+</Project>
+
You can integrate [`spacy ray train`](/api/cli#ray-train) into your
`project.yml` just like the regular training command and pass it the config, an
optional output directory or remote storage URL, and config overrides if needed.
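
For illustration, here is a minimal sketch of what such a `project.yml` entry
might look like. The command name, config path, and the exact `spacy ray train`
flags below are assumptions for the example, not part of this diff:

```yaml
commands:
  - name: "ray-train"
    help: "Train a pipeline with Ray-based parallel training"
    script:
      # Flags mirror the CLI invocation shown in the docs; adjust paths as needed
      - "python -m spacy ray train configs/config.cfg --n-workers 2 -o training/"
    deps:
      - "configs/config.cfg"
    outputs:
      - "training/model-best"
```
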
@@ -940,10 +948,6 @@ commands:
- "training/model-best"
```
-<Project id="integrations/ray">
-
-</Project>
-
---
### Weights & Biases {#wandb}
diff --git a/website/docs/usage/training.md b/website/docs/usage/training.md
index 6e9de62c5..071434162 100644
--- a/website/docs/usage/training.md
+++ b/website/docs/usage/training.md
@@ -895,9 +895,13 @@ cluster. If it's not set, Ray will run locally.
python -m spacy ray train config.cfg --n-workers 2
```
<Project id="integrations/ray">

-</Project>
+Get started with parallel training using our project template. It trains a
+simple model on a Universal Dependencies Treebank and lets you parallelize the
+training with Ray.
+
+</Project>
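
To try such a template end to end, here is a rough sketch of the standard
spaCy projects workflow. It assumes the template is published as
`integrations/ray` in the `explosion/projects` repo and defines an `all`
workflow; both names are assumptions, not confirmed by this diff:

```bash
# Clone the project template (repo path assumed, not confirmed by this diff)
python -m spacy project clone integrations/ray
cd ray
# Download any assets the template declares, e.g. the treebank data
python -m spacy project assets
# Run the template's workflow; "all" is a hypothetical workflow name
python -m spacy project run all
```
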
### How parallel training works {#parallel-training-details}