Module documentation for pygments.lexer

Base lexer classes.

:copyright: Copyright 2006-2021 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
Class Lexer Lexer for a specific language.
Variable inherit Singleton of _inherit; placed in a state's token list to pull in the rules defined for that state by a superclass.
Variable this Singleton of _This; passed to using() to delegate matched text to the current lexer class.
Class _inherit Indicates that a state should inherit from its superclass.
Class _PseudoMatch A pseudo match object constructed from a string.
Class _This Special singleton used for indicating the caller class. Used by using.
Class combined Indicates a state combined from multiple states.
Class default Indicates a state or state action (e.g. #pop) to apply. For example, default('#pop') is equivalent to ('', Token, '#pop'). Note that state tuples may be used as well.
Class DelegatingLexer Combines two lexers: input is first scanned with a language lexer, and everything it leaves as Other tokens is then lexed with a root lexer.
Class ExtendedRegexLexer A RegexLexer that uses a context object to store its state.
Class include Indicates that a state should include rules from another state.
Class LexerContext A helper object that holds lexer position data.
Class LexerMeta This metaclass automagically converts analyse_text methods into static methods which always return float values.
Class ProfilingRegexLexer Drop-in replacement for RegexLexer that does profiling of its regexes.
Class ProfilingRegexLexerMeta Metaclass for ProfilingRegexLexer, collects regex timing info.
Class RegexLexer Base for simple stateful regular expression-based lexers. Simplifies the lexing process so that you need only provide a list of states and regular expressions.
Class RegexLexerMeta Metaclass for RegexLexer, creates the self._tokens attribute from self.tokens on the first instantiation.
Class words Indicates a list of literal words that is transformed into an optimized regex that matches any of the words.
Function bygroups Callback that yields multiple actions for each group in the match.
Function do_insertions Helper for lexers which must combine the results of several sublexers.
Function using Callback that processes the match with a different lexer.
Variable _default_analyse Default analyse_text callback; returns a confidence of 0.0 for any input.
Variable _encoding_map Maps byte-order-mark prefixes to encoding names, used to guess the input encoding.
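The words helper listed above turns a list of literal words into a single optimized regex. The idea can be sketched in plain Python with the standard re module; this is a simplified stand-in, not the Pygments implementation (the name words_regex and the longest-first sort are illustrative assumptions):

```python
import re

def words_regex(word_list, prefix="", suffix=""):
    """Build one regex matching any of the given literal words.

    Simplified sketch of what pygments.lexer.words provides: each word
    is escaped, and alternatives are sorted longest-first so that a
    longer word wins over one of its prefixes.
    """
    sorted_words = sorted(word_list, key=len, reverse=True)
    pattern = "|".join(re.escape(w) for w in sorted_words)
    return re.compile(prefix + "(?:" + pattern + ")" + suffix)

# suffix=r"\b" keeps the match from stopping inside a longer identifier.
keywords = words_regex(["if", "in", "import"], suffix=r"\b")
print(keywords.match("import os").group(0))  # → import
```

With the word-boundary suffix, "index" does not match at all, because "in" is followed by another word character.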
inherit = _inherit()

Singleton placed in a state's token list to indicate that rules from a superclass state should be inherited at that point.

this = _This()

Singleton passed to using() to indicate that matched text should be re-lexed with the current lexer class.

def bygroups(*args):

Callback that yields multiple actions for each group in the match.
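The behavior can be sketched in plain Python: pair each capture group of a match with its corresponding token action. This is a simplified stand-in, not the Pygments implementation (the real version also accepts nested callbacks, takes the lexer as an argument, and handles more edge cases); token tuples here are (position, action, value) triples:

```python
import re

def bygroups(*actions):
    """Simplified sketch of pygments.lexer.bygroups."""
    def callback(match):
        for i, action in enumerate(actions):
            text = match.group(i + 1)
            if text:  # skip groups that matched nothing
                yield match.start(i + 1), action, text
    return callback

# One rule, three capture groups, one action per group.
rule = re.compile(r"(\w+)(\s*)(=)")
handler = bygroups("Name", "Whitespace", "Operator")
print(list(handler(rule.match("answer = 42"))))
# → [(0, 'Name', 'answer'), (6, 'Whitespace', ' '), (7, 'Operator', '=')]
```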
def do_insertions(insertions, tokens):

Helper for lexers which must combine the results of several sublexers.

insertions is a list of (index, itokens) pairs. Each itokens iterable should be inserted at position index into the token stream given by the tokens argument.

The result is a combined token stream.

TODO: clean up the code here.
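The merge described above can be sketched in plain Python. This is a simplified stand-in, not the Pygments implementation: token tuples are (position, tokentype, value) triples as in Pygments, but insertions that fall past the end of the token stream are simply dropped here:

```python
def do_insertions(insertions, tokens):
    """Splice each (index, itokens) pair into the token stream,
    splitting whichever token spans character position index."""
    insertions = iter(insertions)
    try:
        index, itokens = next(insertions)
    except StopIteration:
        yield from tokens
        return
    done = False
    for pos, tok, value in tokens:
        # Split the current token around every insertion point it covers.
        while not done and pos <= index < pos + len(value):
            cut = index - pos
            if cut:
                yield pos, tok, value[:cut]
            for ipos, itok, ivalue in itokens:
                yield ipos, itok, ivalue
            pos, value = index, value[cut:]
            try:
                index, itokens = next(insertions)
            except StopIteration:
                done = True
        if value:
            yield pos, tok, value

stream = [(0, "Text", "hello world")]
inserts = [(5, [(5, "Prompt", "!")])]
print(list(do_insertions(inserts, stream)))
# → [(0, 'Text', 'hello'), (5, 'Prompt', '!'), (5, 'Text', ' world')]
```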

def using(_other, **kwargs):

Callback that processes the match with a different lexer.

The keyword arguments are forwarded to the lexer, except state which is handled separately.

state specifies the state that the new lexer will start in, and can be an enumerable such as ('root', 'inline', 'string') or a simple string which is assumed to be on top of the root state.

Note: For that to work, _other must not be an ExtendedRegexLexer.
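The delegation idea can be sketched in plain Python: re-tokenize the matched text with another tokenizer and offset the positions so they stay relative to the enclosing stream. This is a simplified stand-in, not the Pygments implementation (the real using() instantiates a Lexer subclass and handles the state keyword; word_tokenize is a hypothetical sub-tokenizer):

```python
import re

def using(other_tokenize, **kwargs):
    """Simplified sketch of pygments.lexer.using."""
    def callback(match):
        offset = match.start()
        for pos, tok, value in other_tokenize(match.group(), **kwargs):
            # Shift sub-lexer positions into the enclosing stream.
            yield offset + pos, tok, value
    return callback

def word_tokenize(text):
    """Hypothetical sub-tokenizer: one token per whitespace-separated word."""
    pos = 0
    for word in text.split():
        pos = text.index(word, pos)
        yield pos, "Word", word
        pos += len(word)

m = re.search(r"[a-z ]+", "[a bb]")   # matches "a bb", starting at offset 1
handler = using(word_tokenize)
print(list(handler(m)))
# → [(1, 'Word', 'a'), (3, 'Word', 'bb')]
```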

_default_analyse =

Default analyse_text implementation; returns a confidence of 0.0 for any input.

_encoding_map: list[tuple] =

Maps byte-order-mark prefixes to the corresponding encoding names; consulted when guessing the encoding of input text.