Update link to segmenter on NER page
AngledLuffa committed Nov 15, 2023
1 parent e110411 commit 53a1d63
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion _pages/tools_crf_ner.md
```diff
@@ -322,7 +322,7 @@ We also provide Chinese models built from the Ontonotes Chinese named entity
 data. There are two models, one using distributional similarity clusters and
 one without. These are designed to be run on _word-segmented Chinese_. So, if
 you want to use these on normal Chinese text, you will first need to run
-[Stanford Word Segmenter](http://nlp.stanford.edu/software/segmenter.html) or
+[Stanford Word Segmenter](tools_segmenter.md) or
 some other Chinese word segmenter, and then run NER on the output of that!
 
 ### Online Demo
```
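The passage touched by this diff describes a two-step workflow: segment the Chinese text first, then run NER on the segmented output. A minimal sketch of the second step using the CoreNLP Java API is shown below; the model path and the example sentence are assumptions (the path follows the layout of the stock Chinese models jar and may differ in your setup), not something taken from this commit.

```java
import edu.stanford.nlp.ie.crf.CRFClassifier;
import edu.stanford.nlp.ling.CoreLabel;

public class ChineseNerSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical input: text that has already been run through a Chinese
        // word segmenter, so tokens are separated by spaces.
        String segmented = "斯坦福 大学 位于 加州 。";

        // Assumed model path inside the Chinese models jar; adjust to wherever
        // the classifier actually lives in your installation.
        CRFClassifier<CoreLabel> ner = CRFClassifier.getClassifier(
                "edu/stanford/nlp/models/ner/chinese.misc.distsim.crf.ser.gz");

        // Tag each token with its entity label and print the result.
        System.out.println(ner.classifyToString(segmented));
    }
}
```

This mirrors the command-line flow the page describes (segmenter first, then the CRF NER classifier on its output); tokenizer settings for pre-segmented input come from the serialized classifier and may need adjusting.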
