Package redempt.redlex.processing
Class Lexer
java.lang.Object
redempt.redlex.processing.Lexer
A lexer which will tokenize an input String. Best created with BNFParser.createLexer(Path).
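A minimal sketch of the intended workflow: build a Lexer from a BNF grammar file, then tokenize input. The grammar path is a placeholder, and the package locations of BNFParser and Token are assumptions not shown on this page.

```java
import java.nio.file.Paths;
import redempt.redlex.bnf.BNFParser;    // assumed package location
import redempt.redlex.data.Token;       // assumed package location
import redempt.redlex.processing.Lexer;

public class LexerExample {
    public static void main(String[] args) {
        // Build a Lexer from a BNF grammar file ("grammar.bnf" is a placeholder)
        Lexer lexer = BNFParser.createLexer(Paths.get("grammar.bnf"));
        // Tokenize an input String; the result is the root Token of the tree
        Token root = lexer.tokenize("some input");
        System.out.println(root);
    }
}
```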
Constructor Summary
Constructors
Lexer(TokenType root) - Create a Lexer from a token
Method Summary
getRoot() - Returns the root TokenType for this Lexer
getStrategy(Token token)
void setRetainEmpty(boolean retainEmpty) - Sets whether this Lexer will retain empty tokens
void setRetainStringLiterals(boolean retainStringLiterals) - Sets whether this Lexer will retain string literal tokens
void setRuleByName(CullStrategy strategy, String... names) - Sets how tokens with given names should be handled
void setUnnamedRule(CullStrategy unnamedRule) - Sets how this Lexer will handle unnamed tokens
tokenize(String str) - Tokenizes an input String
tokenize(String str, boolean errorOnFail) - Tokenizes an input String

Constructor Details
Lexer
public Lexer(TokenType root)
Create a Lexer from a token. Not recommended unless you want to do everything by hand.
Parameters:
root - The root TokenType for this Lexer
Method Details
getRoot
Returns:
The root TokenType for this Lexer
getStrategy
getStrategy(Token token)
setRetainEmpty
public void setRetainEmpty(boolean retainEmpty)
Sets whether this Lexer will retain empty tokens
Parameters:
retainEmpty - Whether this Lexer should retain empty tokens
setRetainStringLiterals
public void setRetainStringLiterals(boolean retainStringLiterals)
Sets whether this Lexer will retain string literal tokens
Parameters:
retainStringLiterals - Whether this Lexer should retain string literal tokens
setUnnamedRule
public void setUnnamedRule(CullStrategy unnamedRule)
Sets how this Lexer will handle unnamed tokens
Parameters:
unnamedRule - What should be done with unnamed tokens
setRuleByName
public void setRuleByName(CullStrategy strategy, String... names)
Sets how tokens with given names should be handled
Parameters:
strategy - The strategy to use on the tokens with the given names
names - The names to apply the rule to
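The culling setters above can be combined to prune the token tree before it is returned. A sketch, assuming a Lexer named lexer built elsewhere; the CullStrategy constant names used here (LIFT_CHILDREN, DELETE_ALL) and the rule names ("whitespace", "comment") are assumptions, since this page does not list the enum's values.

```java
lexer.setRetainEmpty(false);                      // drop zero-length tokens
lexer.setRetainStringLiterals(false);             // drop literal tokens such as punctuation
lexer.setUnnamedRule(CullStrategy.LIFT_CHILDREN); // assumed constant: replace unnamed tokens with their children
lexer.setRuleByName(CullStrategy.DELETE_ALL, "whitespace", "comment"); // assumed constant and names
```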
tokenize
public Token tokenize(String str, boolean errorOnFail)
Tokenizes an input String
Parameters:
str - The string to tokenize
errorOnFail - Whether to throw a LexException on failure
Returns:
The root Token of the tokenized tree
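A sketch of using the errorOnFail flag. The page only states that a LexException is thrown on failure when the flag is true; what the method returns when the flag is false is not stated here, so the comment below is an assumption.

```java
try {
    // true: throw a LexException if the input does not match the grammar
    Token root = lexer.tokenize(input, true);
    // ... walk the token tree ...
} catch (LexException e) {
    // With errorOnFail=false, failure presumably yields a non-throwing
    // result instead (not specified on this page)
    System.err.println("Input did not match the grammar: " + e.getMessage());
}
```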
tokenize
public Token tokenize(String str)
Tokenizes an input String
Parameters:
str - The string to tokenize
Returns:
The root Token of the tokenized tree