Document Language.evaluate

Commit 7634812172
Author: Ines Montani
Date: 2019-05-24 14:06:36 +02:00
Parent: 45e6855550
2 changed files with 32 additions and 0 deletions

View File

@@ -600,6 +600,19 @@ class Language(object):
def evaluate(
self, docs_golds, verbose=False, batch_size=256, scorer=None, component_cfg=None
):
"""Evaluate a model's pipeline components.
docs_golds (iterable): Tuples of `Doc` and `GoldParse` objects.
verbose (bool): Print debugging information.
batch_size (int): Batch size to use.
scorer (Scorer): Optional `Scorer` to use. If not passed in, a new one
will be created.
component_cfg (dict): An optional dictionary with extra keyword
arguments for specific components.
RETURNS (Scorer): The scorer containing the evaluation results.
DOCS: https://spacy.io/api/language#evaluate
"""
if scorer is None:
scorer = Scorer()
if component_cfg is None:
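
For context, here is a minimal sketch (not part of this commit) of how the newly documented `evaluate` method can be called: building `(Doc, GoldParse)` pairs and scoring them. It assumes the spaCy v2.1 API (`spacy.gold.GoldParse`, `nlp.make_doc`); the model name and the evaluation data are illustrative placeholders.

```python
# Hedged sketch of calling Language.evaluate, assuming the spaCy v2.1 API.
# Model name and evaluation data below are placeholders, not from the commit.
import spacy
from spacy.gold import GoldParse

nlp = spacy.load("en_core_web_sm")

eval_data = [
    ("Apple is looking at buying a U.K. startup", {"entities": [(0, 5, "ORG")]}),
    ("San Francisco considers banning delivery robots", {"entities": [(0, 13, "GPE")]}),
]

docs_golds = []
for text, annotations in eval_data:
    doc = nlp.make_doc(text)              # unprocessed Doc; evaluate() runs the pipeline on it
    gold = GoldParse(doc, **annotations)  # gold-standard annotations to score against
    docs_golds.append((doc, gold))

scorer = nlp.evaluate(docs_golds, verbose=False, batch_size=32)
print(scorer.scores)  # dict of metrics, e.g. ents_p, ents_r, ents_f
```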

View File

@@ -122,6 +122,25 @@ Update the models in the pipeline.
| `losses` | dict | Dictionary to update with the loss, keyed by pipeline component. |
| `component_cfg` <Tag variant="new">2.1</Tag> | dict | Config parameters for specific pipeline components, keyed by component name. |

## Language.evaluate {#evaluate tag="method"}

Evaluate a model's pipeline components.

> #### Example
>
> ```python
> scorer = nlp.evaluate(docs_golds, verbose=True)
> print(scorer.scores)
> ```

| Name                                          | Type     | Description                                                                            |
| --------------------------------------------- | -------- | -------------------------------------------------------------------------------------- |
| `docs_golds`                                  | iterable | Tuples of `Doc` and `GoldParse` objects.                                               |
| `verbose`                                     | bool     | Print debugging information.                                                           |
| `batch_size`                                  | int      | The batch size to use.                                                                 |
| `scorer`                                      | `Scorer` | Optional [`Scorer`](/api/scorer) to use. If not passed in, a new one will be created.  |
| `component_cfg` <Tag variant="new">2.1</Tag>  | dict     | Config parameters for specific pipeline components, keyed by component name.           |
| **RETURNS**                                   | `Scorer` | The scorer containing the evaluation results.                                          |
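
Not part of this commit: a short, hedged sketch of how `component_cfg` could be used with `evaluate`, assuming keys are pipeline component names whose values are forwarded as extra keyword arguments to each component; `"ner"` is only an example component name.

```python
# Hedged sketch (not from this commit) of per-component settings via component_cfg,
# assuming the spaCy v2.1 API.
scorer = nlp.evaluate(
    docs_golds,
    batch_size=256,                             # default batch size for all components
    component_cfg={"ner": {"batch_size": 32}},  # extra kwargs for the "ner" component only
)
print(scorer.ents_f)  # NER F-score, if the pipeline includes an entity recognizer
```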
## Language.begin_training {#begin_training tag="method"}

Allocate models, pre-process training data and acquire an optimizer.