Add lexer: Tokenize function with full test coverage
Implements lexer.Tokenize(input string) ([]token.Token, error)
- Skips whitespace
- Parses integer and decimal numbers, including a leading dot (e.g. .5)
- Handles all operators: + - * /
- Handles parentheses: ( )
- Appends EOF token
- Returns error on invalid characters with position info
- 12 unit tests covering: empty, whitespace-only, integers, decimals,
leading-dot numbers, operators, parens, full expressions, no-space
expressions, invalid chars, multiple decimals (1.2.3)
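
The behavior listed above can be sketched roughly as below. This is an illustrative single-file sketch, not the committed code: `Token` and `TokenType` are inlined here (the real implementation keeps them in the `token` package), and the token-type names and struct fields are assumptions.

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// TokenType and Token are inlined for a self-contained sketch; the
// actual commit defines them in a separate token package.
type TokenType int

const (
	NUMBER TokenType = iota
	PLUS
	MINUS
	STAR
	SLASH
	LPAREN
	RPAREN
	EOF
)

type Token struct {
	Type  TokenType
	Value string
	Pos   int
}

// Tokenize scans the input left to right: it skips whitespace, groups
// digit/dot runs into NUMBER tokens (rejecting literals with more than
// one dot, e.g. 1.2.3), maps the single-character operators and parens,
// errors with position info on anything else, and appends an EOF token.
func Tokenize(input string) ([]Token, error) {
	var toks []Token
	ops := map[rune]TokenType{
		'+': PLUS, '-': MINUS, '*': STAR, '/': SLASH,
		'(': LPAREN, ')': RPAREN,
	}
	i := 0
	for i < len(input) {
		c := rune(input[i])
		switch {
		case unicode.IsSpace(c):
			i++
		case unicode.IsDigit(c) || c == '.':
			start := i
			for i < len(input) && (unicode.IsDigit(rune(input[i])) || input[i] == '.') {
				i++
			}
			lit := input[start:i]
			if strings.Count(lit, ".") > 1 {
				return nil, fmt.Errorf("invalid number %q at position %d", lit, start)
			}
			toks = append(toks, Token{NUMBER, lit, start})
		default:
			t, ok := ops[c]
			if !ok {
				return nil, fmt.Errorf("invalid character %q at position %d", c, i)
			}
			toks = append(toks, Token{t, string(c), i})
			i++
		}
	}
	toks = append(toks, Token{EOF, "", len(input)})
	return toks, nil
}

func main() {
	toks, err := Tokenize("(1.5 + .5) * 2")
	if err != nil {
		panic(err)
	}
	for _, t := range toks {
		fmt.Printf("%q ", t.Value)
	}
	fmt.Println()
}
```

The sketch keeps the error path cheap by validating the dot count after scanning the whole literal rather than tracking state inside the loop; it assumes ASCII input, since all valid characters fit in a single byte.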
2 files changed