Search - fenCi - List
Implements dictionary-based Chinese word segmentation for Nutch; this part is the accompanying DLL file.
Update : 2008-10-13 Size : 2.29mb Publisher : 冯凡立

DL : 0
A Chinese word segmentation program with Java support.
Update : 2008-10-13 Size : 2.34mb Publisher : 张永军

DL : 0
A Java word segmentation program; dictionaries can be flexibly generated and added.
Update : 2008-10-13 Size : 8.3kb Publisher : 王文荣

Source code for combining the 海量 (HyLanda) word segmenter with Lucene's analysis layer. Note that the 海量 segmentation DLL itself is not included; please take note before downloading. Hopefully this is helpful.
Update : 2008-10-13 Size : 2.39mb Publisher : 长亭

A simple word segmentation program, including source code and a dictionary; compile and link it, then run it from the command line.
Update : 2008-10-13 Size : 205.57kb Publisher : 张正

Implements Chinese word segmentation for Chinese information processing, using the forward maximum matching and backward maximum matching algorithms.
Update : 2008-10-13 Size : 3.05mb Publisher : 舒晓明
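The forward maximum matching algorithm mentioned in the entry above can be sketched in a few lines: at each position, greedily take the longest dictionary word that matches, falling back to a single character. This is a minimal illustration; the toy dictionary and input below are assumptions for the example, not taken from the download.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Forward maximum matching (FMM) segmenter: greedy longest-match
// against a word dictionary, single-character fallback.
public class FmmSegmenter {
    private final Set<String> dict;
    private final int maxLen;

    public FmmSegmenter(Set<String> dict) {
        this.dict = dict;
        int m = 1;
        for (String w : dict) m = Math.max(m, w.length());
        this.maxLen = m;
    }

    public List<String> segment(String text) {
        List<String> result = new ArrayList<>();
        int i = 0;
        while (i < text.length()) {
            int end = Math.min(i + maxLen, text.length());
            // Try the longest candidate first, shrinking until a
            // dictionary match is found or one character remains.
            while (end > i + 1 && !dict.contains(text.substring(i, end))) {
                end--;
            }
            result.add(text.substring(i, end));
            i = end;
        }
        return result;
    }

    public static void main(String[] args) {
        Set<String> dict = new HashSet<>(Arrays.asList("中文", "分词", "程序"));
        System.out.println(new FmmSegmenter(dict).segment("中文分词程序"));
        // prints [中文, 分词, 程序]
    }
}
```

Backward maximum matching is the mirror image: scan from the end of the string and shrink candidates from the left; running both and comparing the results is a common way to flag ambiguous spans.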

A word segmentation program with HMM model training and Viterbi decoding; documentation is included.
Update : 2008-10-13 Size : 9.99kb Publisher : haiba
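The HMM-plus-Viterbi approach in the entry above boils down to finding the most likely hidden state sequence given an observation sequence. The generic log-space decoder below is a sketch of that step only (the actual download's model format and training code are not shown here; the array-based model representation is an assumption for illustration):

```java
// Viterbi decoding over an HMM in log space: returns the most likely
// hidden-state sequence for the given observation sequence.
public class Viterbi {
    // start[s], trans[p][s], emit[s][o] are log probabilities.
    public static int[] decode(double[] start, double[][] trans,
                               double[][] emit, int[] obs) {
        int n = obs.length, k = start.length;
        double[][] dp = new double[n][k];   // best log-prob ending in state s at step t
        int[][] back = new int[n][k];       // backpointers for path recovery
        for (int s = 0; s < k; s++) dp[0][s] = start[s] + emit[s][obs[0]];
        for (int t = 1; t < n; t++) {
            for (int s = 0; s < k; s++) {
                double best = Double.NEGATIVE_INFINITY;
                int arg = 0;
                for (int p = 0; p < k; p++) {
                    double v = dp[t - 1][p] + trans[p][s];
                    if (v > best) { best = v; arg = p; }
                }
                dp[t][s] = best + emit[s][obs[t]];
                back[t][s] = arg;
            }
        }
        // Backtrack from the best final state.
        int bestS = 0;
        for (int s = 1; s < k; s++) if (dp[n - 1][s] > dp[n - 1][bestS]) bestS = s;
        int[] path = new int[n];
        path[n - 1] = bestS;
        for (int t = n - 1; t > 0; t--) path[t - 1] = back[t][path[t]];
        return path;
    }
}
```

For segmentation, the hidden states are typically character position tags such as B/M/E/S (begin/middle/end/single), and the decoded tag sequence is then converted into word boundaries.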

Quite a good tool; note that you need to add the segmentation dictionary yourself.
Update : 2025-02-17 Size : 46kb Publisher : sharemin

A PDF document describing a Chinese word segmentation program oriented toward information retrieval.
Update : 2025-02-17 Size : 115kb Publisher : 吴代文

DL : 0
A segmentation program; it segments an article title and writes the result back, along the lines of: //db.executeUpdate("UPDATE article SET tag="+server.codestring(tempword)+" WHERE id="+id) out.print("Original title: "+title+"<br>"+"Segmentation result: "+tempword+"<br>") //System.out.println("id: "+id+"---Original title: "+title) //System.out.println("Segmentation result: "+tempword)
Update : 2025-02-17 Size : 1kb Publisher : wwwwwww

A segmentation tool built on the ICTCLAS system from the Institute of Computing Technology, Chinese Academy of Sciences. It consists of a single class, is the quickest to get started with, and is powerful.
Update : 2025-02-17 Size : 2.13mb Publisher : xielang

DL : 0
Chinese Academy of Sciences segmenter: automatic word segmentation implemented in Java, with usage instructions included.
Update : 2025-02-17 Size : 2.36mb Publisher : 常天舒

A simple dictionary-based segmentation program. There are plenty of segmenters for Lucene, but sometimes the complex features are unnecessary and all that is needed is simple segmentation against a specified dictionary. The code is simple and can serve as a learning reference.
Update : 2025-02-17 Size : 55kb Publisher : strayly

DL : 0
Import this jar (fenci.jar) into your own project, then call its Chinese word segmentation functions from your own classes.
Update : 2025-02-17 Size : 2.13mb Publisher : wyatt
CodeBus is one of the largest source code repositories on the Internet!