Mirror of https://github.com/explosion/spaCy.git (synced 2024-12-27 10:26:35 +03:00)
9ce059dd06
* Limiting noun_chunks for specific languages
* Limiting noun_chunks for specific languages (Contributor Agreement)
* Addressing review comments
* Removed unused fixtures and imports
* Add fa_tokenizer in test suite
* Use fa_tokenizer in test
* Undo extraneous reformatting

Co-authored-by: adrianeboyd <adrianeboyd@gmail.com>
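Two of the bullets above add and use an `fa_tokenizer` fixture in the test suite. The sketch below shows one plausible way such a session-scoped pytest fixture can be written with spaCy's public `spacy.util.get_lang_class` helper; it is an assumption for illustration, not the repository's actual `conftest.py`.

```python
# Hedged sketch: a session-scoped pytest fixture along the lines of the
# fa_tokenizer mentioned above. The fixture body is an assumption based on
# spaCy's public API (spacy.util.get_lang_class), not the repo's conftest.py.
import pytest
from spacy.util import get_lang_class


@pytest.fixture(scope="session")
def fa_tokenizer():
    # Instantiate a blank Persian pipeline and expose only its tokenizer,
    # so tests can tokenize Persian text without loading a trained model.
    return get_lang_class("fa")().tokenizer


def test_fa_tokenizer_handles_text(fa_tokenizer):
    # Minimal usage example: tokenize a short Persian sentence.
    doc = fa_tokenizer("این یک جمله است")
    assert len(doc) > 0
```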
__init__.py
_tokenizer_exceptions_list.py
examples.py
lemmatizer.py
lex_attrs.py
punctuation.py
stop_words.py
syntax_iterators.py
tag_map.py
tokenizer_exceptions.py
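The syntax_iterators.py file listed above is where a language defines its noun_chunks iterator, which is what the "Limiting noun_chunks for specific languages" commits touch: languages that do not register such an iterator get no noun_chunks support. Below is a minimal, hedged sketch of the shape such a module takes, assuming spaCy v3's `Doc.has_annotation` API; the dependency labels and POS checks are illustrative placeholders, not the actual Persian rules.

```python
# Hedged sketch of the noun_chunks hook that a language's syntax_iterators.py
# exposes. The label set below is an illustrative placeholder; the
# yield/SYNTAX_ITERATORS shape follows spaCy's syntax-iterator convention.
from spacy.symbols import NOUN, PRON, PROPN


def noun_chunks(doclike):
    # Example label set: dependency relations that may head a noun chunk.
    labels = ["nsubj", "obj", "obl", "ROOT"]
    doc = doclike.doc
    if not doc.has_annotation("DEP"):
        # Without a dependency parse there is nothing to iterate over.
        raise ValueError("noun_chunks requires the dependency parse")
    np_deps = [doc.vocab.strings.add(label) for label in labels]
    np_label = doc.vocab.strings.add("NP")
    prev_end = -1
    for word in doclike:
        if word.pos not in (NOUN, PROPN, PRON):
            continue
        if word.left_edge.i <= prev_end:
            # Skip tokens already covered by an earlier chunk.
            continue
        if word.dep in np_deps:
            prev_end = word.right_edge.i
            yield word.left_edge.i, word.right_edge.i + 1, np_label


# Only languages that register this hook provide Doc.noun_chunks, which is
# the "limiting noun_chunks for specific languages" behavior in the commit.
SYNTAX_ITERATORS = {"noun_chunks": noun_chunks}
```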