Location:
Search - fenci
Search list
Description: Implements dictionary-based Chinese word segmentation for nutch; this part is the DLL file it uses.
Platform: |
Size: 2403424 |
Author: 冯凡立 |
Hits:
Description: A Chinese word segmentation program with Java support.
Platform: |
Size: 2458040 |
Author: 张永军 |
Hits:
Description: A Java word segmentation program; dictionaries can be flexibly generated and added.
Platform: |
Size: 8495 |
Author: 王文荣 |
Hits:
Description: Source code for the part combining the Hylanda (海量) word segmenter with Lucene; the Hylanda segmentation DLL itself is not included, so please note this before downloading. Hope it helps everyone.
Platform: |
Size: 2504463 |
Author: 长亭 |
Hits:
Description: A simple word segmentation program containing code and a dictionary; after compiling and linking, run it from the command line.
Platform: |
Size: 210502 |
Author: 张正 |
Hits:
Description: Implements Chinese word segmentation for Chinese information processing, using the forward maximum matching and backward maximum matching algorithms.
Platform: |
Size: 3199772 |
Author: 舒晓明 |
Hits:
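The forward and backward maximum matching algorithms mentioned in the entry above can be sketched as follows. This is a minimal illustration with a made-up five-word dictionary, not code from the listed package: at each position, take the longest dictionary word that matches, falling back to a single character.

```java
import java.util.*;

public class MaxMatch {
    // Hypothetical tiny dictionary, for illustration only.
    static final Set<String> DICT = new HashSet<>(Arrays.asList(
            "研究", "研究生", "生命", "起源", "命"));
    static final int MAX_LEN = 3; // length of the longest dictionary entry

    // Forward maximum matching: scan left to right, preferring longer matches.
    static List<String> forward(String s) {
        List<String> out = new ArrayList<>();
        int i = 0;
        while (i < s.length()) {
            int end = Math.min(i + MAX_LEN, s.length());
            String word = s.substring(i, i + 1); // single-char fallback
            for (int j = end; j > i; j--) {
                String cand = s.substring(i, j);
                if (DICT.contains(cand)) { word = cand; break; }
            }
            out.add(word);
            i += word.length();
        }
        return out;
    }

    // Backward maximum matching: same idea, scanning right to left.
    static List<String> backward(String s) {
        LinkedList<String> out = new LinkedList<>();
        int i = s.length();
        while (i > 0) {
            int start = Math.max(i - MAX_LEN, 0);
            String word = s.substring(i - 1, i); // single-char fallback
            for (int j = start; j < i; j++) {
                String cand = s.substring(j, i);
                if (DICT.contains(cand)) { word = cand; break; }
            }
            out.addFirst(word);
            i -= word.length();
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(String.join("/", forward("研究生命起源")));
        System.out.println(String.join("/", backward("研究生命起源")));
    }
}
```

On the classic example 研究生命起源, forward matching yields 研究生/命/起源 while backward matching yields 研究/生命/起源, which is why segmenters often run both directions and apply a heuristic to pick between them.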
Description: A word segmentation program with HMM model training and Viterbi decoding; documentation is included.
Platform: |
Size: 10225 |
Author: haiba |
Hits:
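The Viterbi decoding mentioned in the entry above can be sketched as follows, using the common BMES character-tagging scheme (Begin/Middle/End/Single). The transition and emission probabilities here are toy values made up for illustration; a real segmenter would estimate them from a tagged corpus during HMM training.

```java
import java.util.*;

public class ViterbiSeg {
    static final char[] STATES = {'B', 'M', 'E', 'S'};

    // Toy parameters, for illustration only.
    static final double[] START = {0.6, 0.0, 0.0, 0.4};  // P(state at t=0)
    static final double[][] TRANS = {                     // P(next | prev)
        //  B    M    E    S
        {0.0, 0.3, 0.7, 0.0},  // from B
        {0.0, 0.4, 0.6, 0.0},  // from M
        {0.5, 0.0, 0.0, 0.5},  // from E
        {0.5, 0.0, 0.0, 0.5},  // from S
    };
    static final Map<Character, double[]> EMIT = new HashMap<>(); // P(char | state)
    static {
        EMIT.put('中', new double[]{0.5, 0.1, 0.1, 0.3});
        EMIT.put('国', new double[]{0.2, 0.1, 0.5, 0.2});
    }

    // Viterbi decoding: the most probable BMES tag sequence for the input.
    static String decode(String text) {
        int n = text.length(), k = STATES.length;
        double[][] delta = new double[n][k]; // best path probability so far
        int[][] back = new int[n][k];        // backpointers for recovery
        for (int s = 0; s < k; s++)
            delta[0][s] = START[s] * EMIT.get(text.charAt(0))[s];
        for (int t = 1; t < n; t++) {
            double[] e = EMIT.get(text.charAt(t));
            for (int s = 0; s < k; s++) {
                double best = -1; int arg = 0;
                for (int p = 0; p < k; p++) {
                    double v = delta[t - 1][p] * TRANS[p][s];
                    if (v > best) { best = v; arg = p; }
                }
                delta[t][s] = best * e[s];
                back[t][s] = arg;
            }
        }
        // A word must end in E or S, so restrict the final state.
        int last = delta[n - 1][2] >= delta[n - 1][3] ? 2 : 3;
        char[] tags = new char[n];
        for (int t = n - 1; t >= 0; t--) {
            tags[t] = STATES[last];
            if (t > 0) last = back[t][last];
        }
        return new String(tags);
    }

    public static void main(String[] args) {
        System.out.println(decode("中国")); // tags each character with B/M/E/S
    }
}
```

Tagging 中国 as B,E marks the two characters as one word; a production implementation would work in log probabilities to avoid underflow on long sentences.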
Description: A Chinese word segmentation program with Java support.
Platform: |
Size: 2457600 |
Author: 张永军 |
Hits:
Description: Implements dictionary-based Chinese word segmentation for nutch; this part is the DLL file it uses.
Platform: |
Size: 2403328 |
Author: 冯凡立 |
Hits:
Description: Quite a good tool; you need to add the segmentation dictionary yourself.
Platform: |
Size: 47104 |
Author: sharemin |
Hits:
Description: A PDF document describing Chinese word segmentation programs for information retrieval.
Platform: |
Size: 117760 |
Author: 吴代文 |
Hits:
Description: A word segmentation program. Excerpt from the code:
//db.executeUpdate("UPDATE article SET tag="+server.codestring(tempword)+" WHERE id="+id)
out.print("Original title: "+title+"<br>"+"Segmentation result: "+tempword+"<br>")
//System.out.println("id: "+id+"---Original title: "+title)
//System.out.println("Segmentation result: "+tempword)
Platform: |
Size: 1024 |
Author: wwwwwww |
Hits:
Description: A Java word segmentation program; dictionaries can be flexibly generated and added.
Platform: |
Size: 8192 |
Author: 王文荣 |
Hits:
Description: A segmentation tool built on the ICTCLAS system from the Institute of Computing Technology, Chinese Academy of Sciences; it consists of a single class, is the quickest to get started with, and is powerful.
Platform: |
Size: 2234368 |
Author: xielang |
Hits:
Description: Source code for the part combining the Hylanda (海量) word segmenter with Lucene; the Hylanda segmentation DLL itself is not included, so please note this before downloading. Hope it helps everyone.
Platform: |
Size: 2504704 |
Author: 长亭 |
Hits:
Description: A simple word segmentation program containing code and a dictionary; after compiling and linking, run it from the command line.
Platform: |
Size: 209920 |
Author: 张正 |
Hits:
Description: Implements Chinese word segmentation for Chinese information processing, using the forward maximum matching and backward maximum matching algorithms.
Platform: |
Size: 3200000 |
Author: 舒晓明 |
Hits:
Description: A Chinese Academy of Sciences segmenter: automatic word segmentation implemented in Java, with usage notes included. Thanks for using it.
Platform: |
Size: 2477056 |
Author: 常天舒 |
Hits:
Description: A simple dictionary-based word segmentation program. There are many segmenters for Lucene, but sometimes complex features are unnecessary and all that is needed is simple segmentation against a specified dictionary. The code is simple and can serve as a learning reference.
Platform: |
Size: 56320 |
Author: strayly |
Hits:
Description: Import this jar package into your own project, then call its segmentation functions from your own classes.
Platform: |
Size: 2236416 |
Author: wyatt |
Hits: