Lexer

Lexical tokenization is the conversion of a text into (semantically or syntactically) meaningful lexical tokens belonging to categories defined by a "lexer" program. In the case of a natural language, those categories include nouns, verbs, adjectives, punctuation, etc. In the case of a programming language, the categories include identifiers, operators, grouping symbols, and data types. Lexical tokenization is related to the type of tokenization used in large language models (LLMs), but with two differences. First, lexical tokenization is usually based on a lexical grammar, whereas LLM tokenizers are usually probability-based. Second, LLM tokenizers perform a second step that converts the tokens into numerical values.
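As a minimal sketch of rule-based lexical tokenization, the following Python example scans a toy expression language with regular expressions and emits (category, lexeme) pairs. The token categories and patterns here are illustrative assumptions, not taken from any particular lexer or lexer generator.

```python
import re

# Illustrative token categories for a hypothetical toy expression language.
# Order matters: earlier patterns take precedence over later ones.
TOKEN_SPEC = [
    ("NUMBER",   r"\d+(\.\d+)?"),   # integer or decimal literal
    ("IDENT",    r"[A-Za-z_]\w*"),  # identifier
    ("OPERATOR", r"[+\-*/=]"),      # arithmetic / assignment operator
    ("LPAREN",   r"\("),            # grouping symbols
    ("RPAREN",   r"\)"),
    ("SKIP",     r"\s+"),           # whitespace, discarded
    ("MISMATCH", r"."),             # any other character is an error
]

MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Yield (category, lexeme) pairs for the input text."""
    for match in MASTER_RE.finditer(text):
        kind, value = match.lastgroup, match.group()
        if kind == "SKIP":
            continue
        if kind == "MISMATCH":
            raise SyntaxError(f"Unexpected character: {value!r}")
        yield (kind, value)

print(list(tokenize("x = 3 + (y * 4.5)")))
# [('IDENT', 'x'), ('OPERATOR', '='), ('NUMBER', '3'), ('OPERATOR', '+'),
#  ('LPAREN', '('), ('IDENT', 'y'), ('OPERATOR', '*'), ('NUMBER', '4.5'),
#  ('RPAREN', ')')]
```

Note that the output stops at categorized tokens; unlike an LLM tokenizer, no further step maps them to numerical IDs.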
