mirror of https://github.com/fxsjy/jieba.git, synced 2025-07-10 00:01:33 +08:00
update version to 0.41
parent 381b0691ac
commit eb37e048da
@@ -1,3 +1,7 @@
+2019-1-8: version 0.41
+1. Enabling paddle mode is now more user-friendly
+2. Fixed a bug where cut_all mode did not support mixed Chinese-English words
+
 2019-12-25: version 0.40
 1. Added a paddle-based deep learning segmentation mode (use_paddle=True); by @JesseyXujin, @xyzhou-puck
 2. Fixed the issue where the add_word method of a custom Tokenizer instance pointed to the global tokenizer; by @linhx13
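For context, a minimal sketch (not part of this commit) of the jieba APIs that the changelog entries above refer to; the sample sentences and the custom word are illustrative only, and paddle mode requires the separate paddlepaddle package:

    import jieba

    # 0.40: paddle-based deep learning segmentation (needs paddlepaddle
    # installed); 0.41 makes enabling this mode friendlier.
    jieba.enable_paddle()
    print(list(jieba.cut("我来到北京清华大学", use_paddle=True)))

    # 0.41: cut_all mode now handles words that mix Chinese and English.
    print(jieba.lcut("结巴分词支持Python3", cut_all=True))

    # 0.40: add_word on a custom Tokenizer instance affects only that
    # instance, not the global default tokenizer.
    tk = jieba.Tokenizer()
    tk.add_word("自定义词")                 # local to this Tokenizer
    print(tk.lcut("这是一个自定义词"))      # uses the customized instance
    print(jieba.lcut("这是一个自定义词"))   # global tokenizer unchanged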
0  jieba/__init__.py  Executable file → Normal file
0  jieba/_compat.py   Executable file → Normal file
2  setup.py           Executable file → Normal file
@@ -43,7 +43,7 @@ GitHub: https://github.com/fxsjy/jieba
 """
 
 setup(name='jieba',
-      version='0.40',
+      version='0.41',
       description='Chinese Words Segmentation Utilities',
       long_description=LONGDOC,
       author='Sun, Junyi',
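Once the bumped setup.py above is packaged and installed, the declared version can be checked from the Python standard library; a minimal sketch, assuming Python 3.8+ and that the 0.41 distribution is installed:

    # Reads the version recorded in the installed jieba distribution's metadata.
    from importlib.metadata import version

    print(version("jieba"))  # expected to print "0.41" for this release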