

Besides the special NEWLINE, INDENT and DEDENT tokens described in the section on lexical analysis, the following categories of tokens exist: identifiers, keywords, literals, operators, and delimiters.

Whitespace characters (other than line terminators, discussed earlier) are not tokens, but serve to delimit tokens. Where ambiguity exists, a token comprises the longest possible string that forms a legal token, when read from left to right.
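The "longest possible string" rule can be observed with Python's standard tokenize module. This is a minimal sketch (the source string and filtering are illustrative): "**" is read as a single power operator rather than two "*" tokens.

```python
import io
import tokenize

# Tokenize a short expression and collect the operator tokens.
source = "x**2"
ops = [
    tok.string
    for tok in tokenize.generate_tokens(io.StringIO(source).readline)
    if tok.type == tokenize.OP
]

# The longest-match rule produces one "**" token, not two "*" tokens.
print(ops)
```

Whitespace forces a token boundary, so "x * *2" would instead yield two separate "*" operator tokens.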