Lexical Elements of Commands

Before the debugger can parse a line, it must first turn the line into a sequence of tokens, a process called "tokenizing" or "lexical analysis". Tokenizing is done with a state machine.

When the debugger begins tokenizing a line into a command, it processes the characters in the lexical state LKEYWORD. Using the token rules for that state, it recognizes the longest sequence of characters that forms a lexical token.
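
The sketch below illustrates longest-match recognition under a hypothetical rule table for the LKEYWORD state. The token names, the regular expressions, and the longest_match helper are illustrative assumptions, not the debugger's actual rules.

    import re

    # Hypothetical token rules for the LKEYWORD state: each entry pairs
    # a token kind with a pattern that can match at the current position.
    LKEYWORD_RULES = [
        ("KEYWORD",    re.compile(r"[A-Za-z_][A-Za-z0-9_]*")),
        ("NUMBER",     re.compile(r"[0-9]+")),
        ("WHITESPACE", re.compile(r"[ \t]+")),
    ]

    def longest_match(line, pos, rules):
        """Return (kind, lexeme) for the longest token starting at pos."""
        best = None
        for kind, pattern in rules:
            m = pattern.match(line, pos)
            if m and (best is None or len(m.group()) > len(best[1])):
                best = (kind, m.group())
        if best is None:
            raise SyntaxError("unrecognized character %r at column %d"
                              % (line[pos], pos))
        return best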

After recognizing a lexical token, the debugger appends it to the tokenized form of the line, possibly changes the state of the tokenizer, and starts on the next token.
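
Continuing the sketch above, a minimal driver loop shows how appending a token and changing state might interleave. The LEXPR state and the keyword-triggered transition are invented for illustration; the real debugger defines its own states and transitions.

    def tokenize(line):
        """Tokenize one command line, starting in the LKEYWORD state."""
        # Hypothetical second state whose rules also accept operators.
        LEXPR_RULES = LKEYWORD_RULES + [
            ("OPERATOR", re.compile(r"[-+*/=<>]")),
        ]
        rules_for_state = {"LKEYWORD": LKEYWORD_RULES, "LEXPR": LEXPR_RULES}

        state = "LKEYWORD"          # initial lexical state
        tokens = []
        pos = 0
        while pos < len(line):
            kind, lexeme = longest_match(line, pos, rules_for_state[state])
            pos += len(lexeme)
            if kind == "WHITESPACE":
                continue            # separates tokens but is not appended
            tokens.append((kind, lexeme))
            # Recognizing a token may change the tokenizer's state; here,
            # hypothetically, the first keyword switches to expression rules.
            if state == "LKEYWORD" and kind == "KEYWORD":
                state = "LEXPR"
        return tokens

Under these assumed rules, tokenize("print x + 1") yields [("KEYWORD", "print"), ("KEYWORD", "x"), ("OPERATOR", "+"), ("NUMBER", "1")].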

For more detailed information on lexical elements, see Lexical Elements of Commands in Part III.