spaCy/spacy/lang/ja
__init__.py     Making lang/th/test_tokenizer.py pass by creating ThaiTokenizer (#3078)     2019-01-10 15:40:37 +01:00
examples.py     Add example sentences for Japanese and Chinese (see #1107)                  2017-10-24 13:02:24 +02:00
stop_words.py   Add Japanese stop words. (#2549)                                            2018-07-17 10:12:48 +02:00
tag_map.py      Port Japanese mecab tokenizer from v1 (#2036)                               2018-05-03 18:38:26 +02:00