Yu-chun Huang | 7692b8c071 | 2017-09-12 16:23:47 +08:00
Update __init__.py
Set the "cut_all" parameter to False, or jieba will return ALL POSSIBLE word segmentations.

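The commit above concerns how jieba's `cut` is called: `cut_all=False` selects jieba's precise mode, which returns a single best segmentation instead of every possible word span. A minimal sketch of that call (the `segment` wrapper and the no-jieba fallback are illustrative, not the actual spaCy code):

```python
def segment(text):
    # cut_all=False selects jieba's "precise" mode: one best
    # segmentation rather than all possible word segmentations.
    try:
        import jieba
        return list(jieba.cut(text, cut_all=False))
    except ImportError:
        # Whitespace fallback purely so this sketch runs without jieba.
        return text.split()

print(segment("hello world"))
```

With jieba installed, Chinese input such as `segment("我来到北京")` yields a list of word tokens rather than single characters.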
gispk47 | 669bd14213 | 2017-07-01 13:12:00 +08:00
Update __init__.py
Remove empty strings returned by jieba.cut; otherwise the token list cannot be pushed and raises an assertion error.

Matthew Honnibal | 532318e80b | 2016-11-02 23:49:19 +01:00
Import Jieba inside zh.make_doc

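Moving the jieba import inside `zh.make_doc` defers loading the library until Chinese tokenization is actually requested, so users of other languages neither pay the import cost nor need the dependency installed. A minimal sketch of that deferred-import pattern (the standalone function and fallback are illustrative, not spaCy's actual implementation, which returns a Doc object):

```python
def make_doc(text):
    # Deferred import: jieba is loaded only when this function runs,
    # not when the enclosing module is imported.
    try:
        import jieba
    except ImportError:
        # Whitespace fallback purely so this sketch runs without jieba.
        return text.split()
    # cut_all=False: precise mode; drop whitespace-only tokens.
    return [w for w in jieba.cut(text, cut_all=False) if w.strip()]

print(make_doc("hello world"))
```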
Matthew Honnibal | 5363224395 | 2016-11-02 19:57:38 +01:00
Add draft Jieba tokenizer for Chinese

Matthew Honnibal | 9bbd6cf031 | 2016-05-05 11:39:12 +02:00
* Work on Chinese support

Matthew Honnibal | 8569dbc2d0 | 2016-04-24 18:44:24 +02:00
* Add initial stuff for Chinese parsing