class documentation

class Lexer:

Class that implements a lexer for a given environment. It is created automatically by the environment class; you usually do not have to construct one yourself.

Note that the lexer is not automatically bound to an environment. Multiple environments can share the same lexer.
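For illustration, a minimal sketch of reaching the lexer through an environment (assumes default settings; environments whose lexer-relevant settings match can share one cached Lexer instance):

    from jinja2 import Environment

    env = Environment()

    # The environment builds (or reuses) a Lexer configured from its settings;
    # it is normally reached through the `lexer` property rather than
    # constructed directly.
    lexer = env.lexer

    # Environments whose lexer-relevant settings match can share one Lexer.
    other = Environment()
    assert other.lexer is lexer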

Method __init__ Build the lexing rules and configuration from the environment's settings.
Method _normalize_newlines Replace all newlines with the configured sequence in strings and template data.
Method tokeniter This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.
Method tokenize Calls tokeniter + wrap and wraps the result in a TokenStream.
Method wrap This is called with the stream as returned by tokeniter and wraps every token in a Token and converts the value.
Instance Variable keep_trailing_newline Whether a single trailing newline at the end of the template is preserved.
Instance Variable lstrip_unless_re Compiled pattern used to implement lstrip_blocks; None when lstrip_blocks is disabled.
Instance Variable newline_sequence The sequence that all newlines in the source are normalized to.
Instance Variable rules The compiled lexing rules, keyed by lexer state.
def __init__(self, environment):

Build the lexing rules and configuration from the given environment's settings.

Parameters
environment: Environment - the environment whose settings configure this lexer
def _normalize_newlines(self, value):
Replace all newlines with the configured sequence in strings and template data.
Parameters
value: str (Undocumented)
Returns
str (Undocumented)
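A small illustrative sketch (assumes a newline_sequence of "\n"; note that this is a private helper used internally before tokenizing):

    from jinja2 import Environment

    env = Environment(newline_sequence="\n")
    lexer = env.lexer

    # Every \r\n, \r, or \n in the input is rewritten to the configured
    # newline_sequence before tokenizing.
    assert lexer._normalize_newlines("a\r\nb\rc\nd") == "a\nb\nc\nd"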
def tokeniter(self, source, name, filename=None, state=None):

This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.

Changed in version 3.0: Only \n, \r\n and \r are treated as line breaks.
Parameters
source: str (Undocumented)
name: t.Optional[str] (Undocumented)
filename: t.Optional[str] (Undocumented)
state: t.Optional[str] (Undocumented)
Returns
t.Iterator[t.Tuple[int, str, str]] (Undocumented)
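As a sketch of typical use (the template source and name here are arbitrary examples):

    from jinja2 import Environment

    env = Environment()
    source = "Hello {{ name }}!"

    # tokeniter yields raw (lineno, token_type, value) tuples; values are plain
    # strings and low-level tokens such as whitespace may still appear here.
    for lineno, token_type, value in env.lexer.tokeniter(source, name="example"):
        print(lineno, token_type, repr(value))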
def tokenize(self, source, name=None, filename=None, state=None):
Calls tokeniter + wrap and wraps the result in a TokenStream.
Parameters
source: str (Undocumented)
name: t.Optional[str] (Undocumented)
filename: t.Optional[str] (Undocumented)
state: t.Optional[str] (Undocumented)
Returns
TokenStream (Undocumented)
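A minimal usage sketch (the template source is an arbitrary example):

    from jinja2 import Environment

    env = Environment()

    # tokenize normalizes newlines, runs tokeniter, and wraps the result in a
    # TokenStream of Token objects.
    stream = env.lexer.tokenize("{% for item in items %}{{ item }}{% endfor %}")

    for token in stream:
        print(token.lineno, token.type, token.value)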
def wrap(self, stream, name=None, filename=None):
This is called with the stream as returned by tokeniter and wraps every token in a Token and converts the value.
Parameters
stream: t.Iterable[t.Tuple[int, str, str]] (Undocumented)
name: t.Optional[str] (Undocumented)
filename: t.Optional[str] (Undocumented)
Returns
t.Iterator[Token] (Undocumented)
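For illustration, a sketch that feeds tokeniter output through wrap by hand (this is what tokenize does internally):

    from jinja2 import Environment
    from jinja2.lexer import Token

    env = Environment()
    lexer = env.lexer

    # wrap consumes raw (lineno, token_type, value) tuples, skips ignored tokens
    # such as whitespace and comments, and converts values (integer literals
    # become ints, for example) while producing Token objects.
    raw = lexer.tokeniter("{{ 42 }}", name=None)
    tokens = list(lexer.wrap(raw, name=None))
    assert all(isinstance(tok, Token) for tok in tokens)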
keep_trailing_newline =

Whether a single trailing newline at the end of the template is preserved.

lstrip_unless_re =

Compiled pattern used to implement lstrip_blocks; None when lstrip_blocks is disabled.

newline_sequence =

The sequence that all newlines in the source are normalized to.

rules: t.Dict[str, t.List[_Rule]] =

The compiled lexing rules, keyed by lexer state.
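
As a sketch of how these attributes mirror the environment's whitespace settings (the attribute values shown reflect the current implementation and are illustrative):

    from jinja2 import Environment

    env = Environment(
        newline_sequence="\r\n",
        keep_trailing_newline=True,
        lstrip_blocks=True,
    )
    lexer = env.lexer

    # The lexer copies its newline and whitespace handling from the environment
    # and compiles lexing rules per state.
    assert lexer.newline_sequence == "\r\n"
    assert lexer.keep_trailing_newline is True
    assert lexer.lstrip_unless_re is not None  # set only when lstrip_blocks=True
    assert "root" in lexer.rules               # rules are keyed by lexer state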