Title: siuying_segment
Description: Given the sentence 「我是中国人」 ("I am Chinese"), ChineseTokenizer splits it into five single-character tokens: 我 / 是 / 中 / 国 / 人, while CJKTokenizer produces four overlapping two-character (bigram) tokens: 我是 / 是中 / 中国 / 国人. The problem with the former is that it ignores Chinese word boundaries entirely, so a search for a whole word such as 「中国人」 ("Chinese person") does not match a single token. The problem with the latter is that it generates many meaningless bigrams such as 是中, which needlessly inflate the index and reduce search efficiency.
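The two splitting strategies described above can be illustrated without Lucene. This is a minimal sketch, not the actual ChineseTokenizer/CJKTokenizer implementations: plain unigram and overlapping-bigram splitting over a string of CJK characters.

```python
def unigram_tokens(text):
    """Split into single characters, as a unigram tokenizer does."""
    return list(text)

def bigram_tokens(text):
    """Split into overlapping two-character tokens, as a bigram tokenizer does."""
    return [text[i:i + 2] for i in range(len(text) - 1)]

sentence = "我是中国人"
print(unigram_tokens(sentence))  # ['我', '是', '中', '国', '人'] -- five single characters
print(bigram_tokens(sentence))   # ['我是', '是中', '中国', '国人'] -- four bigrams
```

Note how the bigram output contains 是中 and 国人 alongside the real word 中国: every adjacent pair becomes a token, whether or not it is a meaningful word, which is exactly the index-bloat problem described above.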

CodeBus www.codebus.net