
Commit

fix cargo doc
PSeitz committed Oct 1, 2024
1 parent 1622c8f commit d07d50c
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions src/tokenize.rs
@@ -1,7 +1,7 @@
 //! Splits input into array of strings separated by opinionated
-//! [`TokenType`](crate::tokenize::TokenType).
+//! [`TokenType`].
 //!
-//! [`tokenize_detailed`](crate::tokenize::tokenize_detailed) returns an
+//! [`tokenize_detailed`] returns an
 //! array containing `{ TokenType, String }` instead of `String`
 //!
 //! # Example
@@ -91,7 +91,7 @@ fn get_type(input: char, compact: bool) -> TokenType {
 }
 
 /// Tokenizes the text. Splits input into array of strings separated by opinionated
-/// [`TokenType`](crate::tokenize::TokenType).
+/// [`TokenType`].
 ///
 /// # Example
 /// ```
@@ -107,7 +107,7 @@ pub fn tokenize(input: &str) -> Vec<String> {
 }
 
 /// Tokenizes the text. Splits input into array of strings separated by opinionated
-/// [`TokenType`](crate::tokenize::TokenType).
+/// [`TokenType`].
 ///
 /// If `compact` is set, many same-language tokens are combined (spaces + text, kanji + kana,
 /// numeral + punctuation).
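
The diff swaps explicit link targets for rustdoc's intra-doc link shorthand: a bare ``[`TokenType`]`` is resolved against items in scope, so the link keeps working under re-exports and module renames, whereas a hard-coded path like `crate::tokenize::TokenType` goes stale and makes `cargo doc` warn or fail. A minimal sketch of the idea (not the crate's actual code; the `TokenType` variants and the stub body here are hypothetical):

```rust
/// Splits input into tokens separated by [`TokenType`].
///
/// The bare-bracket form above is an intra-doc link: rustdoc resolves
/// the name in the current scope, so no explicit path is needed.
pub fn tokenize(input: &str) -> Vec<String> {
    // Stub body for illustration only; the real crate splits on
    // character classes, not whitespace.
    input.split_whitespace().map(str::to_owned).collect()
}

/// Hypothetical token kinds referenced by the doc link above.
pub enum TokenType {
    Space,
    Text,
}

fn main() {
    assert_eq!(tokenize("a b"), vec!["a", "b"]);
}
```

Running `cargo doc` on a sketch like this turns ``[`TokenType`]`` into a hyperlink to the enum's generated page, with no path to keep in sync.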
