jdxi_editor.midi.sysex.tokenizer

Tokeniser for lexing.

Example:

    input_data = "DIGITAL_SYNTH_1 COMMON"
    mapping = generate_mapping(input_data)
    print(mapping)

Attributes

TOKEN_PATTERNS

Functions

tokenize(→ dict[str, str])

generate_mapping(→ Optional[dict[str, str]])

Module Contents

jdxi_editor.midi.sysex.tokenizer.TOKEN_PATTERNS
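The contents of TOKEN_PATTERNS are not shown in this reference. As an illustration only, a token-pattern table of this kind is typically a mapping from token names to regular expressions; the names and regexes below are assumptions for demonstration, not the module's actual table:

```python
import re

# Illustrative token-pattern table; these names and regexes are
# assumptions for demonstration, not the module's real TOKEN_PATTERNS.
TOKEN_PATTERNS: dict[str, str] = {
    "AREA": r"DIGITAL_SYNTH_\d+",       # e.g. "DIGITAL_SYNTH_1"
    "SECTION": r"[A-Z]+(_[A-Z0-9]+)*",  # e.g. "COMMON"
}

# Each word of the input can then be checked against the table.
word = "DIGITAL_SYNTH_1"
matched = [name for name, pattern in TOKEN_PATTERNS.items()
           if re.fullmatch(pattern, word)]
```

A word may match more than one pattern; a real tokenizer would pick one, for example by table order.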
jdxi_editor.midi.sysex.tokenizer.tokenize(input_string: str) → dict[str, str]

Tokenise the input string into a dict of tokens.

Parameters:

input_string – str – the string to lex, e.g. "DIGITAL_SYNTH_1 COMMON"

Returns:

dict[str, str] – the lexed tokens
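Since the body of tokenize is not shown here, the following is a minimal sketch of a function with this signature, assuming a hypothetical name-to-regex pattern table and first-match-wins lexing (both assumptions, not the module's real behaviour):

```python
import re

# Hypothetical pattern table for the sketch; not the module's real one.
PATTERNS: dict[str, str] = {
    "AREA": r"DIGITAL_SYNTH_\d+",
    "SECTION": r"[A-Z_]+",
}

def tokenize(input_string: str) -> dict[str, str]:
    """Lex a space-separated string into {token_name: matched_word}."""
    tokens: dict[str, str] = {}
    for word in input_string.split():
        for name, pattern in PATTERNS.items():
            # First pattern that fully matches the word claims it;
            # each token name is assigned at most once.
            if name not in tokens and re.fullmatch(pattern, word):
                tokens[name] = word
                break
    return tokens
```

For the docstring's example input, tokenize("DIGITAL_SYNTH_1 COMMON") would yield a dict pairing each pattern name with the word it matched.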

jdxi_editor.midi.sysex.tokenizer.generate_mapping(input_string: str) → dict[str, str] | None

Generate a token mapping from the input string.

Parameters:

input_string – str – the string to lex, e.g. "DIGITAL_SYNTH_1 COMMON"

Returns:

Optional[dict[str, str]] – the generated mapping, or None if no mapping could be produced
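The Optional return type suggests the function returns None when the input cannot be mapped. A sketch under that assumption, using a hypothetical pattern table (the names, regexes, and rejection rule are illustrative, not the module's real logic):

```python
import re
from typing import Optional

# Hypothetical pattern table for the sketch; not the module's real one.
PATTERNS: dict[str, str] = {
    "AREA": r"DIGITAL_SYNTH_\d+",
    "SECTION": r"COMMON|MODIFY",
}

def generate_mapping(input_string: str) -> Optional[dict[str, str]]:
    """Map each word to a token name, or return None on unrecognised input."""
    mapping: dict[str, str] = {}
    for word in input_string.split():
        for name, pattern in PATTERNS.items():
            if re.fullmatch(pattern, word):
                mapping[name] = word
                break
        else:
            # No pattern matched this word: reject the whole input.
            return None
    return mapping or None  # empty input also yields None
```

Rejecting the whole input on the first unrecognised word keeps the None case unambiguous for callers, who only need a single truthiness check.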