Merge remote-tracking branch 'origin/master' into specify-data-path

Mark Amery 2016-11-20 18:29:06 +00:00
commit 2dc305f46b
5 changed files with 8 additions and 8 deletions

View File

@@ -164,8 +164,8 @@ p
+cell #[code other]
+cell -
+cell
-| The object to compare with. By default, accepts #[code Doc],
-| #[code Span], #[code Token] and #[code Lexeme] objects.
+| The object to compare with. By default, accepts #[code Doc],
+| #[code Span], #[code Token] and #[code Lexeme] objects.
+footrow
+cell return
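
A minimal sketch of the comparison described in the table above, assuming this hunk documents the similarity() method and that an English model is installed (both are assumptions, not stated in the diff):

    import spacy

    nlp = spacy.load('en')
    doc = nlp(u'I like apples')
    other = nlp(u'I like oranges')

    # The `other` argument accepts Doc, Span, Token and Lexeme objects.
    print(doc.similarity(other))      # compare Doc with Doc
    print(doc.similarity(other[2]))   # compare Doc with a single Token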

View File

@@ -156,8 +156,8 @@ p
+cell #[code other]
+cell -
+cell
-| The object to compare with. By default, accepts #[code Doc],
-| #[code Span], #[code Token] and #[code Lexeme] objects.
+| The object to compare with. By default, accepts #[code Doc],
+| #[code Span], #[code Token] and #[code Lexeme] objects.
+footrow
+cell return

View File

@@ -70,8 +70,8 @@ p Create the vocabulary.
+cell #[code lex_attr_getters]
+cell dict
+cell
-| A dictionary mapping attribute IDs to functions to compute them.
-| Defaults to #[code None].
+| A dictionary mapping attribute IDs to functions to compute them.
+| Defaults to #[code None].
+row
+cell #[code lemmatizer]
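
A minimal sketch of the lex_attr_getters argument described above, assuming the spacy.vocab.Vocab constructor from this era of the library; the chosen attributes and getter functions are illustrative assumptions:

    from spacy.attrs import LOWER, IS_STOP
    from spacy.vocab import Vocab

    # A dictionary mapping attribute IDs to functions that compute them
    # from the word string; it may also be left as the default, None.
    lex_attr_getters = {
        LOWER: lambda string: string.lower(),
        IS_STOP: lambda string: string.lower() in ('a', 'an', 'the'),
    }

    vocab = Vocab(lex_attr_getters=lex_attr_getters)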

View File

@@ -56,7 +56,7 @@ p
| fetched via #[code spacy.util.get_data_path()]. You can
| configure this default using #[code spacy.util.set_data_path()].
| The data path is expected to be either a string, or an object
-| responding to #[code thepathlib.Path] interface. If the path is
+| responding to the #[code pathlib.Path] interface. If the path is
| a string, it will be immediately transformed into a
| #[code pathlib.Path] object. spaCy promises to never manipulate
| or open file-system paths as strings. All access to the
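
A minimal sketch of the data-path behaviour described above, using the spacy.util.get_data_path() and spacy.util.set_data_path() helpers named in the text; the custom directory is an illustrative assumption:

    from pathlib import Path

    import spacy.util

    # Inspect the current default, then point spaCy at a custom directory.
    # Either a string or a pathlib.Path is accepted; a string is converted
    # to a pathlib.Path immediately, as described above.
    print(spacy.util.get_data_path())
    spacy.util.set_data_path(Path('/tmp/spacy-data'))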

View File

@@ -73,7 +73,7 @@ p
| one-by-one. After a long and bitter struggle, the global interpreter
| lock was freed around spaCy's main parsing loop in v0.100.3. This means
| that the #[code .pipe()] method will be significantly faster in most
-| practical situations, because it allows shared memory parallelism.
+| practical situations, because it allows shared memory parallelism.
+code.
for doc in nlp.pipe(texts, batch_size=10000, n_threads=3):
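
A minimal sketch of the multi-threaded .pipe() loop that the truncated snippet above begins; the model name and texts are illustrative assumptions:

    import spacy

    nlp = spacy.load('en')
    texts = [u'This is one document.', u'This is another document.']

    # .pipe() streams Docs back one-by-one; because the GIL is released
    # around the parsing loop, n_threads gives shared-memory parallelism.
    for doc in nlp.pipe(texts, batch_size=10000, n_threads=3):
        print(len(doc))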