commit 06ecd0890a
parent 842bbeae29
repository: https://github.com/explosion/spaCy.git

    Wording
@@ -1710,7 +1710,7 @@ typical use case for distillation is to extract a smaller, more performant model
 from a larger high-accuracy model. Since distillation uses the activations of
 the teacher, distillation can be performed on a corpus of raw text without (gold
 standard) annotations. A development set of gold annotations _is_ needed to
-evaluate the distilled model on during distillation.
+evaluate the student pipeline on during distillation.
 
 `distill` will save out the best performing pipeline across all epochs, as well
 as the final pipeline. The `--code` argument can be used to provide a Python
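The changed passage documents the `distill` CLI command. As illustrative context for the `--code` argument mentioned in the diff, below is a minimal sketch of the kind of Python file that argument points to in spaCy's CLI: it is imported before the config is resolved so that custom registered functions are available. The file name `functions.py` and the component name `debug_tokens` are assumptions made for this example, not part of the commit.

```python
# functions.py (hypothetical name): the kind of file passed to a spaCy CLI
# command via --code, so that custom registered functions can be resolved
# from the pipeline config.
from spacy.language import Language

@Language.component("debug_tokens")
def debug_tokens(doc):
    # Minimal custom pipeline component: report the token count and return
    # the Doc unchanged so the rest of the pipeline keeps working.
    print(f"debug_tokens: {len(doc)} tokens")
    return doc
```

Such a file would typically be supplied as `--code functions.py` alongside the other arguments of the `distill` command.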