class TypeDocstring:
Known subclasses: pydoctor.epydoc.markup._types.ParsedTypeDocstring
Convert natural language type strings to reStructuredText.

Syntax is based on the numpydoc type specification, with additional recognition of PEP 484-like type annotations (with parenthesis or square bracket characters).
Type string | Output
---|---
List[str] or list(bytes), optional | List[str] or list(bytes), optional
{"html", "json", "xml"}, optional | {"html", "json", "xml"}, optional
list of int or float or None, default: None | list of int or float or None, default: None
`complicated string` or `strIO <twisted.python.compat.NativeStringIO>` | complicated string or strIO
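For orientation, here is a minimal usage sketch. The import path is an assumption; only the members documented on this page (the constructor parameters, `__str__`, and the `warnings` attribute) are relied on.

```python
# Minimal usage sketch. The import path below is an assumption; only the
# members listed on this page (__init__, __str__, warnings) are relied upon.
from pydoctor.napoleon.docstring import TypeDocstring  # assumed module path

parsed = TypeDocstring("List[str] or list(bytes), optional",
                       warns_on_unknown_tokens=False)

print(str(parsed))      # __str__: the parsed type in reStructuredText format
print(parsed.warnings)  # warnings collected while parsing the type string
```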
Kind | Name | Description
---|---|---
Method | __init__ | Undocumented
Method | __str__ | Returns the parsed type in reStructuredText format.
Instance Variable | warnings | Undocumented
Class Method | _tokenize_type_spec | Split the string into tokens for further processing.
Static Method | _recombine_set_tokens | Merge the special literal choices tokens together.
Method | _build_tokens | Undocumented
Method | _convert_type_spec_to_rst | Undocumented
Method | _token_type | Find the type of a token. Types are defined in the TokenType enum.
Method | _trigger_warnings | Append some warnings.
Class Variable | _ast_like_delimiters_regex | Undocumented
Class Variable | _ast_like_delimiters_regex_str | Undocumented
Class Variable | _default_regex | Undocumented
Class Variable | _natural_language_delimiters_regex | Undocumented
Class Variable | _natural_language_delimiters_regex_str | Undocumented
Class Variable | _token_regex | Undocumented
Instance Variable | _annotation | Undocumented
Instance Variable | _tokens | Undocumented
Instance Variable | _warns_on_unknown_tokens | Undocumented
Method __init__

Parameter | Description
---|---
annotation: str | Undocumented
warns_on_unknown_tokens: bool | Undocumented
Class Method _tokenize_type_spec

Parameter | Description
---|---
spec: str | Undocumented

Returns | Description
---|---
List[...] | Undocumented
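The class variables listed above (_natural_language_delimiters_regex, _ast_like_delimiters_regex, _token_regex) suggest that tokenization is regex-based. The sketch below only illustrates that splitting idea with a made-up pattern; it is not pydoctor's actual regex.

```python
# Illustrative sketch of regex-based tokenization; the pattern here is an
# assumption, not pydoctor's _token_regex.
import re
from typing import List

_DELIMITERS = re.compile(r'(\s+or\s+|\s+of\s+|,\s*|[\[\]\(\)\{\}])')

def tokenize_type_spec(spec: str) -> List[str]:
    # Split on delimiters but keep them as tokens of their own.
    return [tok for tok in _DELIMITERS.split(spec) if tok]

print(tokenize_type_spec("list of int or float or None, default: None"))
# ['list', ' of ', 'int', ' or ', 'float', ' or ', 'None', ', ', 'default: None']
```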
Static Method _recombine_set_tokens

Merge the special literal choices tokens together.

Example:

>>> tokens = ["{", "1", ", ", "2", "}"]
>>> ann._recombine_set_tokens(tokens)
["{1, 2}"]

Parameter | Description
---|---
tokens: List[...] | Undocumented

Returns | Description
---|---
List[...] | Undocumented
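Based only on the documented example above, a possible implementation of this "merge literal choices" step looks like the hypothetical sketch below (not pydoctor's code).

```python
# Hypothetical sketch reproducing the documented example: tokens between
# "{" and "}" are glued back into one set-literal token.
from typing import List

def recombine_set_tokens(tokens: List[str]) -> List[str]:
    result: List[str] = []
    group: List[str] = []
    in_set = False
    for tok in tokens:
        if tok == "{":                  # start of a literal-choice group
            in_set = True
            group = [tok]
        elif in_set:
            group.append(tok)
            if tok == "}":              # close the group as a single token
                result.append("".join(group))
                in_set = False
        else:
            result.append(tok)
    if in_set:                          # unbalanced "{": keep the pieces as-is
        result.extend(group)
    return result

print(recombine_set_tokens(["{", "1", ", ", "2", "}"]))  # ['{1, 2}']
```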
Method _build_tokens

Undocumented

Parameter | Description
---|---
_tokens: List[...] | Undocumented

Returns | Description
---|---
List[...] | Undocumented
Method _token_type

Parameter | Description
---|---
token: Union[...] | Undocumented

Returns | Description
---|---
TokenType | Undocumented
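The page only states that token kinds are defined in a TokenType enum; the member names and the classification rules in the sketch below are assumptions chosen to illustrate what a classifier like _token_type does.

```python
# Hedged sketch of a token classifier. The enum members and rules below are
# assumptions; only the existence of a TokenType enum is stated on this page.
import enum

class TokenType(enum.Enum):
    LITERAL = enum.auto()    # e.g. None, True, "json", 42
    CONTROL = enum.auto()    # e.g. "optional", "default"
    DELIMITER = enum.auto()  # e.g. " or ", ", ", "[", "]", "{", "}"
    OBJ = enum.auto()        # anything else: treated as an object/type name

def token_type(token: str) -> TokenType:
    stripped = token.strip()
    if stripped in {",", "or", "of", "[", "]", "(", ")", "{", "}"}:
        return TokenType.DELIMITER
    if stripped in {"optional", "default"}:
        return TokenType.CONTROL
    if stripped in {"None", "True", "False"} or stripped.isdigit() \
            or stripped.startswith(('"', "'")):
        return TokenType.LITERAL
    return TokenType.OBJ

print(token_type("optional"))  # TokenType.CONTROL
print(token_type("list"))      # TokenType.OBJ
```

Pairing each token with its classified kind in this way would also match the List[Tuple[str, TokenType]] return type shown for _build_tokens above.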
List[Tuple[str, TokenType]] =
Undocumented