CoqLexer class documentation

class CoqLexer(RegexLexer):

For the Coq theorem prover.

New in version 1.5.
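
A short usage sketch; the Coq snippet and the choice of TerminalFormatter are illustrative, not prescribed by this class:

from pygments import highlight
from pygments.lexers import CoqLexer
from pygments.formatters import TerminalFormatter

# A small, illustrative Coq proof script.
code = """
Theorem plus_O_n : forall n : nat, 0 + n = n.
Proof.
  intros n. simpl. reflexivity.
Qed.
"""

# Render the Coq source with ANSI colours on the terminal.
print(highlight(code, CoqLexer(), TerminalFormatter()))
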
Method analyse_text Return a float between 0 and 1 indicating whether this lexer wants to highlight the text.
Class Variable aliases Undocumented
Class Variable filenames Undocumented
Class Variable infix_syms Undocumented
Class Variable keyopts Undocumented
Class Variable keywords1 Undocumented
Class Variable keywords2 Undocumented
Class Variable keywords3 Undocumented
Class Variable keywords4 Undocumented
Class Variable keywords5 Undocumented
Class Variable keywords6 Undocumented
Class Variable mimetypes Undocumented
Class Variable name Undocumented
Class Variable operators Undocumented
Class Variable prefix_syms Undocumented
Class Variable tokens Undocumented

Inherited from RegexLexer:

Method get_tokens_unprocessed Split text into (tokentype, text) pairs.

Inherited from Lexer (via RegexLexer):

Method get_tokens Return an iterable of (tokentype, value) pairs generated from text. If unfiltered is set to True, the filtering mechanism is bypassed even if filters are defined. (A usage sketch follows this list.)
Class Variable alias_filenames Undocumented
Method __init__ Undocumented
Method __repr__ Undocumented
Method add_filter Add a new stream filter to this lexer.
Class Variable priority Undocumented
Instance Variable encoding Undocumented
Instance Variable ensurenl Undocumented
Instance Variable filters Undocumented
Instance Variable options Undocumented
Instance Variable stripall Undocumented
Instance Variable stripnl Undocumented
Instance Variable tabsize Undocumented
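
A usage sketch for the inherited get_tokens and add_filter methods listed above; the Coq fragment and the keywordcase filter choice are illustrative:

from pygments.lexers import CoqLexer
from pygments.token import Token

lexer = CoqLexer(stripnl=False)                 # Lexer options are accepted as keyword arguments
lexer.add_filter('keywordcase', case='upper')   # built-in Pygments filter, applied to the token stream

# get_tokens yields (tokentype, value) pairs after the filters have run.
for tokentype, value in lexer.get_tokens("Lemma foo : True."):
    if tokentype in Token.Keyword:
        print(tokentype, repr(value))
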
def analyse_text(text):

Must return a float between 0 and 1 that indicates whether the lexer wants to highlight this text. Used by guess_lexer. If this method returns 0, the text will never be highlighted with this lexer; if it returns 1, highlighting with this lexer is guaranteed.

The LexerMeta metaclass automatically wraps this function so that it works like a static method (no self or cls parameter) and the return value is automatically converted to float. If the return value is an object that is boolean False, it is treated the same as a return value of 0.0.
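
guess_lexer calls analyse_text on the registered lexers and picks the best scorer. A minimal sketch; the snippet is illustrative and the guessed lexer depends on every registered lexer's score:

from pygments.lexers import guess_lexer, CoqLexer

snippet = "Theorem one_plus_one : 1 + 1 = 2.\nProof. reflexivity. Qed."

# Thanks to the LexerMeta wrapping, analyse_text can be called on the class itself
# and always returns a float clamped to [0, 1].
print(CoqLexer.analyse_text(snippet))

# guess_lexer compares such scores across all registered lexers.
print(guess_lexer(snippet))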

aliases: list[str] =

Undocumented

filenames: list[str] =

Undocumented
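
The aliases and filenames entries drive lexer lookup. A sketch assuming the usual 'coq' alias and '*.v' filename pattern; note that '*.v' is also claimed by other lexers (e.g. Verilog), so passing the source text lets analyse_text disambiguate:

from pygments.lexers import get_lexer_by_name, get_lexer_for_filename

coq_source = "Theorem t : True.\nProof. exact I. Qed."

# Lookup by alias; expected to return a CoqLexer instance.
print(get_lexer_by_name('coq'))

# Lookup by filename; the optional code argument is used to rank lexers
# that share the same filename pattern.
print(get_lexer_for_filename('Example.v', coq_source))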

infix_syms: str =

Undocumented

keyopts: tuple[str, ...] =

Undocumented

keywords1: tuple[str, ...] =

Undocumented

keywords2: tuple[str, ...] =

Undocumented

keywords3: tuple[str, ...] =

Undocumented

keywords4: tuple[str, ...] =

Undocumented

keywords5: tuple[str, ...] =

Undocumented

keywords6: tuple[str, ...] =

Undocumented

mimetypes: list[str] =

Undocumented

name: str =

Undocumented

operators: str =

Undocumented

prefix_syms: str =

Undocumented

tokens =

Undocumented
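
tokens is the RegexLexer state machine into which the keyword tuples and operator strings above are compiled. The real table is not reproduced here; the following toy lexer only illustrates the shape of a RegexLexer tokens definition (it is not CoqLexer's actual rule set):

from pygments.lexer import RegexLexer, words
from pygments.token import Comment, Keyword, Name, Text

class MiniCoqLikeLexer(RegexLexer):
    """Toy lexer showing the shape of a RegexLexer tokens table."""
    name = 'MiniCoqLike'
    aliases = ['minicoqlike']

    tokens = {
        'root': [
            (r'\s+', Text),
            (r'\(\*', Comment, 'comment'),    # enter the comment state
            (words(('Theorem', 'Lemma', 'Proof', 'Qed'), suffix=r'\b'), Keyword),
            (r"[A-Za-z_][A-Za-z0-9_']*", Name),
            (r'.', Text),
        ],
        'comment': [
            (r'[^(*]+', Comment),
            (r'\(\*', Comment, '#push'),      # Coq comments nest
            (r'\*\)', Comment, '#pop'),
            (r'[(*]', Comment),
        ],
    }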