class ShenLexer(RegexLexer):

Lexer for Shen source code.

New in version 2.1.
Constant BUILTINS Undocumented
Constant BUILTINS_ANYWHERE Undocumented
Constant DECLARATIONS Undocumented
Constant MAPPINGS Undocumented
Constant SPECIAL_FORMS Undocumented
Method _process_declaration Undocumented
Method _process_declarations Undocumented
Method _process_signature Undocumented
Method _process_symbols Undocumented
Method _relevant Undocumented
Method get_tokens_unprocessed Split text into (index, tokentype, value) tuples.
Class Variable aliases Undocumented
Class Variable filenames Undocumented
Class Variable mimetypes Undocumented
Class Variable name Undocumented
Class Variable symbol_name Undocumented
Class Variable tokens Undocumented
Class Variable valid_name Undocumented
Class Variable valid_symbol_chars Undocumented
Class Variable variable Undocumented

Inherited from Lexer (via RegexLexer):

Method analyse_text No summary
Method get_tokens Return an iterable of (tokentype, value) pairs generated from text. If unfiltered is set to True, the filtering mechanism is bypassed even if filters are defined.
Class Variable alias_filenames Undocumented
Method __init__ Undocumented
Method __repr__ Undocumented
Method add_filter Add a new stream filter to this lexer.
Class Variable priority Undocumented
Instance Variable encoding Undocumented
Instance Variable ensurenl Undocumented
Instance Variable filters Undocumented
Instance Variable options Undocumented
Instance Variable stripall Undocumented
Instance Variable stripnl Undocumented
Instance Variable tabsize Undocumented
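As a quick usage sketch (assuming Pygments 2.1 or later is installed, where ShenLexer is importable from pygments.lexers), the inherited get_tokens method drives the whole pipeline:

```python
from pygments.lexers import ShenLexer

# get_tokens yields (tokentype, value) pairs; with the default
# options the values concatenate back to the input text, plus a
# trailing newline (ensurenl defaults to True).
lexer = ShenLexer()
code = "(define double X -> (* 2 X))"
for ttype, value in lexer.get_tokens(code):
    print(ttype, repr(value))
```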
BUILTINS: tuple[str, ...] =

Undocumented

Value
('==',
 '=',
 '*',
 '+',
 '-',
 '/',
 '<',
...
BUILTINS_ANYWHERE: tuple[str, ...] =

Undocumented

Value
('where', 'skip', '>>', '_', '!', '<e>', '<!>')
DECLARATIONS: tuple[str, ...] =

Undocumented

Value
('datatype',
 'define',
 'defmacro',
 'defprolog',
 'defcc',
 'synonyms',
 'declare',
...
MAPPINGS =

Undocumented

Value
{s: Keyword for s in DECLARATIONS}
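The value is a dict comprehension sending every declaration name to the generic Keyword token type. A minimal standalone sketch, with DECLARATIONS abbreviated here for illustration:

```python
from pygments.token import Keyword

# Abbreviated subset of the lexer's DECLARATIONS tuple.
DECLARATIONS = ('datatype', 'define', 'defmacro')

# Map each declaration name to the Keyword token type.
MAPPINGS = {s: Keyword for s in DECLARATIONS}

print(MAPPINGS['define'])
```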
SPECIAL_FORMS: tuple[str, ...] =

Undocumented

Value
('lambda',
 'get',
 'let',
 'if',
 'cases',
 'cond',
 'put',
...
def _process_declaration(self, declaration, tokens):

Undocumented

def _process_declarations(self, tokens):

Undocumented

def _process_signature(self, tokens):

Undocumented

def _process_symbols(self, tokens):

Undocumented

def _relevant(self, token):

Undocumented

def get_tokens_unprocessed(self, text):

Split text into (index, tokentype, value) tuples.

stack is the initial stack (default: ['root'])
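A short sketch of calling this low-level entry point directly (assuming Pygments is installed). Unlike get_tokens, it also reports each token's start offset and does not normalize newlines or apply filters:

```python
from pygments.lexers import ShenLexer

lexer = ShenLexer()
source = "(define id X -> X)"
# Each item carries the character offset where the token starts,
# its token type, and the matched text.
for index, ttype, value in lexer.get_tokens_unprocessed(source):
    print(index, ttype, repr(value))
```

Joining the value fields reconstructs the input exactly, since the Shen-specific post-processing steps only relabel token types.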

aliases: list[str] =

Undocumented

filenames: list[str] =

Undocumented

mimetypes: list[str] =

Undocumented

name: str =

Undocumented

symbol_name =

Undocumented

tokens =

Undocumented

valid_name =

Undocumented

valid_symbol_chars: str =

Undocumented

variable =

Undocumented