Fix _distill_loop docstring wrt. stopping condition

Daniël de Kok 2023-04-19 20:11:17 +02:00
parent 962c2972e4
commit abc4a7d320


@@ -284,11 +284,12 @@ def _distill_loop(
     before_update: Optional[Callable[["Language", Dict[str, Any]], None]],
     student_to_teacher: Dict[str, str],
 ):
-    """Train until an evaluation stops improving. Works as a generator,
-    with each iteration yielding a tuple `(batch, info, is_best_checkpoint)`,
-    where info is a dict, and is_best_checkpoint is in [True, False, None] --
-    None indicating that the iteration was not evaluated as a checkpoint.
-    The evaluation is conducted by calling the evaluate callback.
+    """Distill until the data is exhausted or the maximum number of steps
+    has been reached. Works as a generator, with each iteration yielding
+    a tuple `(batch, info, is_best_checkpoint)`, where info is a dict, and
+    is_best_checkpoint is in [True, False, None] -- None indicating that
+    the iteration was not evaluated as a checkpoint. The evaluation is
+    conducted by calling the evaluate callback.
 
     Positional arguments:
     teacher (Language): The teacher pipeline to distill from.