Are there any ready-made solutions for lexical analysis in Haskell that support a dynamic lexicon at run time?

I am working on a small Haskell project that needs to lex a very small subset of well-formed English into tokens for semantic analysis. It is a very naive natural-language interface to a system with many different end effectors to which commands can be issued. I am currently using Alex for this, but Alex requires the lexicon to be fixed statically at compile time. The nature of the system is such that the number, and even the kinds, of end effectors in the world can grow and shrink after compilation, so I need to be able to add and remove valid tokens from the lexicon at run time.

I tried looking for dynamic lexing solutions, and the closest thing I could find is the Dynamic Lexer Engine, which does not appear to have been updated since 2000.

I have considered other approaches, such as dropping to a lower-level library (perhaps Attoparsec), or even wiring up a recompilation hook for Alex and keeping the lexer separate from the rest of the application.
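To make the requirement concrete, here is a minimal sketch of the lower-level approach: the lexicon is an ordinary run-time value (a `Data.Map` from the `containers` package that ships with GHC), so tokens can be added or removed between lexing calls without recompiling anything. The `Verb`/`Noun` tags and the sample words are illustrative assumptions, not part of any real library.

```haskell
import Data.Char (toLower)
import qualified Data.Map as Map

-- Illustrative token type for a tiny command language.
data Token = Verb String | Noun String | Unknown String
  deriving (Show, Eq)

-- The lexicon maps a word to the constructor used to tag it.
type Lexicon = Map.Map String (String -> Token)

-- Hypothetical starting vocabulary.
baseLexicon :: Lexicon
baseLexicon = Map.fromList
  [ ("open",  Verb)
  , ("close", Verb)
  , ("door",  Noun)
  ]

-- Classify each whitespace-separated word against the current lexicon.
lexWith :: Lexicon -> String -> [Token]
lexWith lexi = map classify . words . map toLower
  where
    classify w = maybe (Unknown w) ($ w) (Map.lookup w lexi)

-- Adding or removing a token at run time is just a Map update.
addWord :: String -> (String -> Token) -> Lexicon -> Lexicon
addWord = Map.insert

removeWord :: String -> Lexicon -> Lexicon
removeWord = Map.delete
```

For example, `lexWith (addWord "valve" Noun baseLexicon) "open valve"` yields `[Verb "open", Noun "valve"]`, while the same input under `baseLexicon` alone tags `"valve"` as `Unknown`.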

Are there any well-known solutions for this kind of lexical analysis? I intend to work through Natural Language Processing for the Working Programmer eventually, so a less simplistic approach would be fine, but for now a basic lexer is all I need.

1 answer

CTK (the Compiler Toolkit) is to lexing roughly what Parsec is to parsing: lexers are built from combinators, so new combinators can be added dynamically at run time.
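To illustrate why the combinator style makes dynamic extension easy, here is a hedged sketch of the general idea. This is not CTK's actual API; it is a toy definition showing that when a lexer is an ordinary function value, extending it after "compile time" is just another function application.

```haskell
-- Toy combinator lexer (illustrative names, not from CTK).
data Tok = TWord String
  deriving (Show, Eq)

-- A lexer tries to take one token off the front of the input.
type Lexer = String -> Maybe (Tok, String)

-- Try the first lexer; fall back to the second on failure.
orElse :: Lexer -> Lexer -> Lexer
orElse a b s = maybe (b s) Just (a s)

-- Recognise one fixed keyword at the head of the input.
keyword :: String -> Lexer
keyword k s = case splitAt (length k) s of
  (p, rest) | p == k -> Just (TWord k, rest)
  _                  -> Nothing

-- A hypothetical starting lexer.
baseLexer :: Lexer
baseLexer = keyword "open" `orElse` keyword "close"

-- Extending the lexer at run time is just another 'orElse'.
withValve :: Lexer
withValve = baseLexer `orElse` keyword "valve"
```

Here `baseLexer "valve"` fails, but `withValve "valve"` succeeds, and `withValve` was built from `baseLexer` as a plain value at run time; a combinator library like CTK works on the same principle, with a far richer set of combinators.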

