Add Russian to alpha docs and update tokenizer dependencies

ines 2017-10-14 12:52:41 +02:00
parent a69f4e56e5
commit a5da683578
2 changed files with 8 additions and 4 deletions

View File

@@ -80,6 +80,7 @@
 "da": "Danish",
 "hu": "Hungarian",
 "pl": "Polish",
+"ru": "Russian",
 "he": "Hebrew",
 "bn": "Bengali",
 "id": "Indonesian",

View File

@@ -40,10 +40,13 @@ p
 +src(gh("spaCy", "spacy/lang/" + code)) #[code lang/#{code}]
 +infobox("Dependencies")
-    | Some language tokenizers require external dependencies. To use #[strong Chinese],
-    | you need to have #[+a("https://github.com/fxsjy/jieba") Jieba] installed.
-    | The #[strong Japanese] tokenizer requires
-    | #[+a("https://github.com/mocobeta/janome") Janome].
+    .o-block-small Some language tokenizers require external dependencies.
+    +list.o-no-block
+        +item #[strong Chinese]: #[+a("https://github.com/fxsjy/jieba") Jieba]
+        +item #[strong Japanese]: #[+a("https://github.com/mocobeta/janome") Janome]
+        +item #[strong Thai]: #[+a("https://github.com/wannaphongcom/pythainlp") pythainlp]
+        +item #[strong Russian]: #[+a("https://github.com/kmike/pymorphy2") pymorphy2]
 +h(3, "multi-language") Multi-language support
     +tag-new(2)
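
The dependencies listed in the infobox above are optional extras rather than hard requirements of spaCy itself. A sketch of the lazy-import pattern such tokenizers typically rely on; this is illustrative only, not spaCy's actual implementation:

```python
# Sketch: import the external package only when the tokenizer is built,
# and fail with an actionable message if it is missing (illustrative,
# not the actual spaCy code).
def get_morph_analyzer():
    try:
        import pymorphy2
    except ImportError:
        raise ImportError(
            "The Russian tokenizer requires pymorphy2: pip install pymorphy2"
        )
    return pymorphy2.MorphAnalyzer()
```

The same idea applies to Jieba, Janome and pythainlp for Chinese, Japanese and Thai respectively.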