Mirror of https://github.com/explosion/spaCy.git
Update Tok2Vec.distill docstring
This commit is contained in:
  parent a54efef469
  commit c07941d14a
@@ -199,14 +199,14 @@ class Tok2Vec(TrainablePipe):
         sgd: Optional[Optimizer] = None,
         losses: Optional[Dict[str, float]] = None,
     ) -> Dict[str, float]:
-        """Train a pipe (the student) on the predictions of another pipe
-        (the teacher). The student is typically trained on the probability
-        distribution of the teacher, but details may differ per pipe.
+        """Performs an update of the student pipe's model using the
+        student's distillation examples and sets the annotations
+        of the teacher's distillation examples using the teacher pipe.

-        teacher_pipe (Optional[TrainablePipe]): The teacher pipe to learn
-        from.
-        examples (Iterable[Example]): Distillation examples. The reference
-        and predicted docs must have the same number of tokens and the
+        teacher_pipe (Optional[TrainablePipe]): The teacher pipe to use
+        for prediction.
+        examples (Iterable[Example]): Distillation examples. The reference (teacher)
+        and predicted (student) docs must have the same number of tokens and the
         same orthography.
         drop (float): dropout rate.
         sgd (Optional[Optimizer]): An optimizer. Will be created via
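For readers skimming the diff, a minimal sketch of how the documented `distill` call might look, assuming separately trained teacher and student pipelines that each contain a `tok2vec` pipe; the pipeline names and example text below are placeholders and are not part of this commit:

```python
import spacy
from spacy.training import Example

# Placeholder pipeline names: any trained teacher/student pair that both
# contain a "tok2vec" pipe would do.
teacher = spacy.load("teacher_pipeline")
student = spacy.load("student_pipeline")

teacher_tok2vec = teacher.get_pipe("tok2vec")
student_tok2vec = student.get_pipe("tok2vec")

# Reference (teacher) and predicted (student) docs must have the same
# number of tokens and the same orthography.
texts = ["Distillation transfers knowledge from a teacher to a student."]
examples = [
    Example(student.make_doc(text), teacher.make_doc(text)) for text in texts
]

# The optimizer could also be omitted, in which case it is created via
# create_optimizer, as the docstring notes.
optimizer = student_tok2vec.create_optimizer()
losses = student_tok2vec.distill(
    teacher_tok2vec, examples, drop=0.1, sgd=optimizer, losses={}
)
```

As the docs heading below indicates, `distill` is tagged experimental and is only available from spaCy v4.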
@@ -102,10 +102,14 @@ pipeline components are applied to the `Doc` in order. Both

 ## Tok2Vec.distill {id="distill", tag="method,experimental", version="4"}

-Train a pipe (the student) on the predictions of another pipe (the teacher). The
-student is typically trained on the probability distribution of the teacher, but
-details may differ per pipe. The goal of distillation is to transfer knowledge
-from the teacher to the student.
+Performs an update of the student pipe's model using the student's distillation
+examples and sets the annotations of the teacher's distillation examples using
+the teacher pipe.
+
+Unlike other trainable pipes, the student pipe doesn't directly learn its
+representations from the teacher. However, since downstream pipes that do
+perform distillation expect the tok2vec annotations to be present on the
+correct distillation examples, we need to ensure that they are set beforehand.

 The distillation is performed on ~~Example~~ objects. The `Example.reference`
 and `Example.predicted` ~~Doc~~s must have the same number of tokens and the
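The paragraph added above, that downstream pipes performing distillation expect the tok2vec annotations to be present on the teacher's distillation examples, is easiest to see in a loop that distills the tok2vec pipe before the pipes that depend on it. Continuing the placeholder setup from the earlier sketch, and assuming a hypothetical `tagger` pipe in both pipelines:

```python
from spacy.util import minibatch

# Hypothetical downstream pipes; any pipe whose teacher predictions depend
# on the teacher's tok2vec output would work the same way.
teacher_tagger = teacher.get_pipe("tagger")
student_tagger = student.get_pipe("tagger")

losses = {}
for batch in minibatch(examples, size=8):
    # Distill tok2vec first: this updates the student's tok2vec model and
    # sets the teacher's tok2vec annotations on the reference (teacher) docs.
    student_tok2vec.distill(teacher_tok2vec, batch, sgd=optimizer, losses=losses)
    # Downstream distillation can now rely on those annotations being present
    # on the teacher's distillation examples.
    student_tagger.distill(teacher_tagger, batch, sgd=optimizer, losses=losses)
```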
@@ -125,8 +129,8 @@ This feature is experimental.

 | Name           | Description                                                                                                                                  |
 | -------------- | -------------------------------------------------------------------------------------------------------------------------------------------- |
-| `teacher_pipe` | The teacher pipe to learn from. ~~Optional[TrainablePipe]~~                                                                                  |
-| `examples`     | Distillation examples. The reference and predicted docs must have the same number of tokens and the same orthography. ~~Iterable[Example]~~ |
+| `teacher_pipe` | The teacher pipe to use for prediction. ~~Optional[TrainablePipe]~~                                                                          |
+| `examples`     | Distillation examples. The reference (teacher) and predicted (student) docs must have the same number of tokens and the same orthography. ~~Iterable[Example]~~ |
 | _keyword-only_ |                                                                                                                                              |
 | `drop`         | Dropout rate. ~~float~~                                                                                                                      |
 | `sgd`          | An optimizer. Will be created via [`create_optimizer`](#create_optimizer) if not set. ~~Optional[Optimizer]~~                                |