diff --git a/spacy/pipeline/trainable_pipe.pyx b/spacy/pipeline/trainable_pipe.pyx
index e3944592c..fcc838d05 100644
--- a/spacy/pipeline/trainable_pipe.pyx
+++ b/spacy/pipeline/trainable_pipe.pyx
@@ -70,8 +70,9 @@ cdef class TrainablePipe(Pipe):
         teacher_pipe (Optional[TrainablePipe]): The teacher pipe to learn
             from.
-        examples (Iterable[Example]): Distillation examples. The reference
-            must contain teacher annotations (if any).
+        examples (Iterable[Example]): Distillation examples. The reference
+            and predicted docs must have the same number of tokens and the
+            same orthography.
         drop (float): dropout rate.
         sgd (Optional[Optimizer]): An optimizer. Will be created via
             create_optimizer if not set.
diff --git a/spacy/pipeline/transition_parser.pyx b/spacy/pipeline/transition_parser.pyx
index 7ba2641ae..d0efa83c8 100644
--- a/spacy/pipeline/transition_parser.pyx
+++ b/spacy/pipeline/transition_parser.pyx
@@ -221,8 +221,10 @@ cdef class Parser(TrainablePipe):
         teacher_pipe (Optional[TrainablePipe]): The teacher pipe to learn
             from.
-        examples (Iterable[Example]): Distillation examples. The reference
-            must contain teacher annotations (if any).
+        examples (Iterable[Example]): Distillation examples. The reference
+            and predicted docs must have the same number of tokens and the
+            same orthography.
+
         drop (float): dropout rate.
         sgd (Optional[Optimizer]): An optimizer. Will be created via
             create_optimizer if not set.
         losses (Optional[Dict[str, float]]): Optional record of loss during
diff --git a/website/docs/api/pipe.mdx b/website/docs/api/pipe.mdx
index 9813da197..120c8f690 100644
--- a/website/docs/api/pipe.mdx
+++ b/website/docs/api/pipe.mdx
@@ -239,7 +239,14 @@ predictions and gold-standard annotations, and update the component's model.
 Train a pipe (the student) on the predictions of another pipe (the teacher).
 The student is typically trained on the probability distribution of the
 teacher, but details may differ per pipe.
 The goal of distillation is to transfer knowledge
-from the teacher to the student. This feature is experimental.
+from the teacher to the student.
+
+Distillation is performed on ~~Example~~ objects. The `Example.reference`
+and `Example.predicted` ~~Doc~~s must have the same number of tokens and the
+same orthography. Even though the reference does not need to have gold
+annotations, the teacher can add its own annotations when necessary.
+
+This feature is experimental.
 
 > #### Example
 >
@@ -247,19 +254,18 @@ from the teacher to the student. This feature is experimental.
 > ```python
 > teacher_pipe = teacher.add_pipe("your_custom_pipe")
 > student_pipe = student.add_pipe("your_custom_pipe")
 > optimizer = nlp.resume_training()
-> losses = student.distill(teacher_pipe, teacher_docs, student_docs, sgd=optimizer)
+> losses = student.distill(teacher_pipe, examples, sgd=optimizer)
 > ```
 
-| Name           | Description |
-| -------------- | -------------------------------------------------------------------------------------------------------------------------------------------- |
-| `teacher_pipe` | The teacher pipe to learn from. ~~Optional[TrainablePipe]~~ |
-| `teacher_docs` | Documents passed through teacher pipes. ~~Iterable[Doc]~~ |
-| `student_docs` | Documents passed through student pipes. Must contain the same tokens as `teacher_docs` but may have different annotations. ~~Iterable[Doc]~~ |
-| _keyword-only_ | |
-| `drop`         | Dropout rate. ~~float~~ |
-| `sgd`          | An optimizer. Will be created via [`create_optimizer`](#create_optimizer) if not set. ~~Optional[Optimizer]~~ |
-| `losses`       | Optional record of the loss during distillation. Updated using the component name as the key. ~~Optional[Dict[str, float]]~~ |
-| **RETURNS**    | The updated `losses` dictionary. ~~Dict[str, float]~~ |
+| Name           | Description |
+| -------------- | ------------------------------------------------------------------------------------------------------------------------------------------- |
+| `teacher_pipe` | The teacher pipe to learn from. ~~Optional[TrainablePipe]~~ |
+| `examples`     | Distillation examples. The reference and predicted docs must have the same number of tokens and the same orthography. ~~Iterable[Example]~~ |
+| _keyword-only_ | |
+| `drop`         | Dropout rate. ~~float~~ |
+| `sgd`          | An optimizer. Will be created via [`create_optimizer`](#create_optimizer) if not set. ~~Optional[Optimizer]~~ |
+| `losses`       | Optional record of the loss during distillation. Updated using the component name as the key. ~~Optional[Dict[str, float]]~~ |
+| **RETURNS**    | The updated `losses` dictionary. ~~Dict[str, float]~~ |
 
 ## TrainablePipe.rehearse {id="rehearse",tag="method,experimental",version="3"}
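The invariant this patch documents in all three places — the reference and predicted docs of each `Example` must have the same number of tokens and the same orthography — can be sketched in plain Python. The helper below is purely illustrative and not part of spaCy's API; it just mirrors the documented constraint on two tokenizations:

```python
# Illustrative sketch only: checks the distillation invariant documented
# above for two tokenizations (same token count, same surface forms).
def check_distillation_pair(reference_tokens, predicted_tokens):
    """Return True if the two tokenizations are usable for distillation."""
    if len(reference_tokens) != len(predicted_tokens):
        return False
    # Same orthography: every token's surface form must match exactly.
    return all(r == p for r, p in zip(reference_tokens, predicted_tokens))


print(check_distillation_pair(["Distill", "this", "."], ["Distill", "this", "."]))  # True
print(check_distillation_pair(["Distill", "this."], ["Distill", "this", "."]))      # False
```

Note that annotations are deliberately not compared: as the docs change states, the reference does not need gold annotations, since the teacher can add its own.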