As I suspected, the problem can be solved by extending the behavior of the lexer rather than the parser. The standard lexer only accepts decimal digits, so I wrote a new one:
    class MyLexer extends StdLexical {
      override type Elem = Char

      override def digit = ( super.digit | hexDigit )

      lazy val hexDigits = Set[Char]() ++ "0123456789abcdefABCDEF".toArray
      lazy val hexDigit  = elem("hex digit", hexDigits.contains(_))
    }
My parser, which extends StandardTokenParsers, can then be set up to use it as follows:
    object ParseAST extends StandardTokenParsers {
      override val lexical: MyLexer = new MyLexer()
      lexical.delimiters += ("(", ")", ",", "@")
      ...
    }
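Putting the two pieces together, a minimal sketch might look like the following. The grammar rule `call` and the input format are my own invention for illustration; note also that in StdLexical the identifier production is tried before the number production, so a hex literal that starts with a letter (e.g. "ff") would still be tokenized as an Identifier, not a NumericLit:

```scala
import scala.util.parsing.combinator.lexical.StdLexical
import scala.util.parsing.combinator.syntactical.StandardTokenParsers

class MyLexer extends StdLexical {
  // Accept hex digits in addition to the decimal digits of StdLexical
  override def digit = super.digit | hexDigit

  lazy val hexDigits = Set[Char]() ++ "0123456789abcdefABCDEF".toArray
  lazy val hexDigit  = elem("hex digit", hexDigits.contains(_))
}

// Hypothetical demo parser, not the original ParseAST
object HexParserDemo extends StandardTokenParsers {
  override val lexical: MyLexer = new MyLexer()
  lexical.delimiters += ("(", ")", ",", "@")

  // Hypothetical rule: '@' followed by a parenthesized,
  // comma-separated list of numeric literals
  def call: Parser[List[String]] = "@" ~> "(" ~> repsep(numericLit, ",") <~ ")"

  def parse(s: String): ParseResult[List[String]] =
    phrase(call)(new lexical.Scanner(s))
}

// Literals that begin with a decimal digit now lex as NumericLit,
// e.g. HexParserDemo.parse("@(1A, 2f, 10)")
```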
A NumericLit token is assembled from digits in the StdLexical class like this:
    class StdLexical {
      ...
      def token: Parser[Token] =
        ... |
        digit ~ rep(digit) ^^ { case first ~ rest => NumericLit(first :: rest mkString "") }
    }
Since StdLexical delivers the parsed number only as a String, this suits me fine: I am not interested in the numeric value anyway.
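If the numeric value were ever needed, the NumericLit string could still be converted after the fact. A hedged sketch, assuming the token text is a valid hex string:

```scala
// Interpret the literal text in base 16
val value = Integer.parseInt("1A", 16)  // 26
```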
thequark