Add lexer: Tokenize function with full test coverage

Implements lexer.Tokenize(input string) ([]token.Token, error)
- Skips whitespace
- Parses integer and decimal numbers, including a leading dot (e.g. .5)
- Handles all operators: + - * /
- Handles parentheses: ( )
- Appends EOF token
- Returns error on invalid characters with position info
- 12 unit tests covering: empty, whitespace-only, integers, decimals,
  leading-dot numbers, operators, parens, full expressions, no-space
  expressions, invalid chars, multiple decimals (1.2.3)
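A minimal sketch of the scanning loop described above. The token kinds, field names, and error text here are hypothetical stand-ins for the real `token` package; the actual implementation lives in `lexer/` and `token/`:

```go
package main

import (
	"fmt"
	"unicode"
)

// Kind enumerates token kinds. These names are illustrative only;
// the real token package may use different identifiers.
type Kind int

const (
	NUMBER Kind = iota
	PLUS
	MINUS
	STAR
	SLASH
	LPAREN
	RPAREN
	EOF
)

type Token struct {
	Kind Kind
	Lit  string
	Pos  int // byte offset of the token's first character
}

// Tokenize mirrors the behavior listed in the commit: skip whitespace,
// scan integer/decimal numbers (including a leading dot like .5),
// recognize + - * / ( ), append an EOF token, and report invalid
// characters with their position.
func Tokenize(input string) ([]Token, error) {
	var toks []Token
	i := 0
	for i < len(input) {
		c := input[i]
		switch {
		case unicode.IsSpace(rune(c)):
			i++
		case c >= '0' && c <= '9' || c == '.':
			start := i
			for i < len(input) && input[i] >= '0' && input[i] <= '9' {
				i++ // integer part
			}
			if i < len(input) && input[i] == '.' {
				i++ // decimal point
				for i < len(input) && input[i] >= '0' && input[i] <= '9' {
					i++ // fractional part
				}
			}
			lit := input[start:i]
			if lit == "." { // a bare dot is not a number
				return nil, fmt.Errorf("invalid character '.' at position %d", start)
			}
			toks = append(toks, Token{NUMBER, lit, start})
		default:
			kinds := map[byte]Kind{
				'+': PLUS, '-': MINUS, '*': STAR,
				'/': SLASH, '(': LPAREN, ')': RPAREN,
			}
			k, ok := kinds[c]
			if !ok {
				return nil, fmt.Errorf("invalid character %q at position %d", c, i)
			}
			toks = append(toks, Token{k, string(c), i})
			i++
		}
	}
	toks = append(toks, Token{EOF, "", len(input)})
	return toks, nil
}

func main() {
	toks, err := Tokenize("(1.5+.5)*2")
	fmt.Println(toks, err)
}
```

Note that under this sketch an input like `1.2.3` scans as two adjacent numbers (`1.2`, `.3`) rather than an error; the actual test in the commit may assert either behavior.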
2 files changed
tree: c1e1a9b48d7f8405e0fcb0a15e3285cf7b38e404
  cmd/
  docs/
  lexer/
  token/
  go.mod