class documentation

class TypeDocstring:

Known subclasses: pydoctor.epydoc.markup._types.ParsedTypeDocstring


Convert natural language type strings to reStructuredText.

Syntax is based on the numpydoc type specification, with additional recognition of PEP 484-like type annotations (using parentheses or square bracket characters).

Examples of valid type strings and their output:

Type string: List[str] or list(bytes), optional
Output:      List[str] or list(bytes), optional

Type string: {"html", "json", "xml"}, optional
Output:      {"html", "json", "xml"}, optional

Type string: list of int or float or None, default: None
Output:      list of int or float or None, default: None

Type string: `complicated string` or `strIO <twisted.python.compat.NativeStringIO>`
Output:      complicated string or strIO
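The conversion above can be sketched with a small, self-contained function. This is an illustration only, not pydoctor's actual implementation: the control-word list and the delimiter character class below are assumptions made for the sketch.

```python
import re

# Sketch of the conversion idea: split a natural-language type string on
# delimiters (keeping them), leave control words as plain text, and wrap
# anything else in reStructuredText interpreted text. Assumed lists, not
# pydoctor's real regexes.
CONTROL_WORDS = {"of", "or", "and", "optional", "default", "default:"}
DELIMITERS = re.compile(r"(\s+|[\[\](),])")

def type_spec_to_rst(spec: str) -> str:
    out = []
    for token in DELIMITERS.split(spec):
        if not token or DELIMITERS.fullmatch(token) or token.lower() in CONTROL_WORDS:
            out.append(token)          # keep punctuation and control words as-is
        else:
            out.append(f"`{token}`")   # mark up probable type names
    return "".join(out)

print(type_spec_to_rst("list of int or float"))   # `list` of `int` or `float`
print(type_spec_to_rst("List[str], optional"))    # `List`[`str`], optional
```

The real class does considerably more (literal-set recombination, warnings, reference resolution), but the split-classify-rejoin shape is the core of it.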
Method __init__: Undocumented
Method __str__: Returns the parsed type in reStructuredText format.
Instance Variable warnings: Undocumented
Class Method _tokenize_type_spec: Split the string into tokens for further processing.
Static Method _recombine_set_tokens: Merge the special literal choices tokens together.
Method _build_tokens: Undocumented
Method _convert_type_spec_to_rst: Undocumented
Method _token_type: Find the type of a token. Types are defined in the TokenType enum.
Method _trigger_warnings: Append some warnings.
Class Variable _ast_like_delimiters_regex: Undocumented
Class Variable _ast_like_delimiters_regex_str: Undocumented
Class Variable _default_regex: Undocumented
Class Variable _natural_language_delimiters_regex: Undocumented
Class Variable _natural_language_delimiters_regex_str: Undocumented
Class Variable _token_regex: Undocumented
Instance Variable _annotation: Undocumented
Instance Variable _tokens: Undocumented
Instance Variable _warns_on_unknown_tokens: Undocumented
def __init__(self, annotation, warns_on_unknown_tokens=False):

Undocumented

Parameters
    annotation: str - Undocumented
    warns_on_unknown_tokens: bool - Undocumented
def __str__(self):
Returns
    str - The parsed type in reStructuredText format.
warnings: List[str] =

Undocumented

@classmethod
def _tokenize_type_spec(cls, spec):
Split the string into tokens for further processing.

Parameters
    spec: str - Undocumented

Returns
    List[str] - Undocumented
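The actual _token_regex is not documented here; as a rough sketch, tokenization can be thought of as a regex split that keeps the delimiters. The character class below is an assumption for illustration, not pydoctor's real pattern.

```python
import re

# Sketch of tokenization: split on whitespace and AST-like delimiters,
# keeping the separators themselves as tokens. The real _token_regex in
# pydoctor is more elaborate (literals, references, default values).
TOKEN_RE = re.compile(r"(\s+|[\[\](){}.,=])")

def tokenize_type_spec(spec: str) -> list[str]:
    # re.split with a capturing group interleaves separators into the result;
    # drop the empty strings produced between adjacent delimiters.
    return [tok for tok in TOKEN_RE.split(spec) if tok]

print(tokenize_type_spec("List[str], optional"))
# ['List', '[', 'str', ']', ',', ' ', 'optional']
```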
@staticmethod
def _recombine_set_tokens(tokens):

Merge the special literal choices tokens together.

Example

>>> tokens = ["{", "1", ", ", "2", "}"]
>>> ann._recombine_set_tokens(tokens)
['{1, 2}']
Parameters
    tokens: List[str] - Undocumented

Returns
    List[str] - Undocumented
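Based on the doctest above, the documented behaviour can be reproduced with a self-contained sketch (this mirrors the example's input and output, not pydoctor's actual code):

```python
# Sketch of set-token recombination: buffer everything between "{" and "}"
# and emit it as a single merged token, so a literal-choice set survives
# tokenization as one unit.
def recombine_set_tokens(tokens: list[str]) -> list[str]:
    result = []
    buffer = None  # collects tokens between "{" and "}"
    for tok in tokens:
        if tok == "{":
            buffer = [tok]
        elif buffer is not None:
            buffer.append(tok)
            if tok == "}":
                result.append("".join(buffer))  # emit merged literal set
                buffer = None
        else:
            result.append(tok)
    if buffer is not None:  # unbalanced "{": flush tokens unchanged
        result.extend(buffer)
    return result

print(recombine_set_tokens(["{", "1", ", ", "2", "}"]))  # ['{1, 2}']
```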
def _build_tokens(self, _tokens):

Undocumented

Parameters
    _tokens: List[Union[str, Any]] - Undocumented

Returns
    List[Tuple[str, TokenType]] - Undocumented
def _convert_type_spec_to_rst(self):

Undocumented

Returns
    str - Undocumented
def _token_type(self, token):
Find the type of a token. Types are defined in the TokenType enum.

Parameters
    token: Union[str, Any] - Undocumented

Returns
    TokenType - Undocumented
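A hypothetical classifier illustrating the kind of decision _token_type makes. The TokenType members and the patterns below are assumptions for the sketch, not pydoctor's actual definitions.

```python
import enum
import re

# Hypothetical TokenType; the real enum lives in pydoctor and may differ.
class TokenType(enum.Enum):
    DELIMITER = "delimiter"  # punctuation and whitespace
    CONTROL = "control"      # words like "of", "or", "optional"
    LITERAL = "literal"      # quoted strings, numbers, True/False/None
    OBJ = "obj"              # anything else: assumed to name an object

def token_type(token: str) -> TokenType:
    if re.fullmatch(r"[\[\](){}.,=]|\s+", token):
        return TokenType.DELIMITER
    if token in ("of", "or", "and", "optional", "default"):
        return TokenType.CONTROL
    if re.fullmatch(r"['\"].*['\"]|\d+(\.\d+)?|True|False|None", token):
        return TokenType.LITERAL
    return TokenType.OBJ

print(token_type("List"), token_type("optional"), token_type("42"))
```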
def _trigger_warnings(self):
Append some warnings.
_ast_like_delimiters_regex =

Undocumented

_ast_like_delimiters_regex_str: str =

Undocumented

_default_regex =

Undocumented

_natural_language_delimiters_regex =

Undocumented

_natural_language_delimiters_regex_str: str =

Undocumented

_token_regex =

Undocumented

_annotation =

Undocumented

_tokens: List[Tuple[str, TokenType]] =

Undocumented

_warns_on_unknown_tokens =

Undocumented