First, it’s always recommended that you include type signatures for all top-level declarations. This makes the code more structured and more readable.
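For example (a made-up helper, just to illustrate the habit of annotating top-level bindings instead of relying on inference):

parseAge :: String -> Int
parseAge = read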
One simple way to achieve this is through the use of applicative functors. During parsing you have an effectful computation, in which the effect consumes part of the input and the result is one parsed field. We can use the State monad to track the remaining input and write a polymorphic function that consumes one input element and reads it:
import Control.Applicative
import Control.Monad.State

data Person = Person { name :: String, surname :: String, age :: Int }
  deriving (Eq, Ord, Show, Read)

readField :: (Read a) => State [String] a
readField = state $ \(x : xs) -> (read x, xs)
And to parse several such fields we use the combinators <$> and <*>, which let us run these operations in sequence:
readPerson :: [String] -> Person
readPerson = evalState $ Person <$> readField <*> readField <*> readField
The expression Person <$> ... has type State [String] Person, and we run it with evalState on the input to start the stateful computation and extract the result. We still have to repeat readField once per field, but without using indices or explicit per-field types.
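For example, in GHCi with some made-up input (note that the String fields have to be written with Haskell-style quotes, because read for String expects that syntax):

>>> readPerson ["\"Alice\"", "\"Smith\"", "30"]
Person {name = "Alice", surname = "Smith", age = 30}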
For a real program you will probably want some error handling, since read fails with an exception, and the pattern (x : xs) fails if the input list is too short. Using a full-fledged parser library such as parsec or attoparsec lets you keep the same notation while getting proper error reporting, configurable parsing of individual fields, and so on.
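As a minimal sketch of the error-handling direction (my own illustration, staying with the [String] representation rather than switching to parsec/attoparsec; the names readFieldSafe and readPersonSafe are made up), the same applicative style works over StateT with Either:

import Text.Read (readMaybe)
import Control.Monad.State

readFieldSafe :: (Read a) => StateT [String] (Either String) a
readFieldSafe = StateT go
  where
    go []       = Left "unexpected end of input"
    go (x : xs) = case readMaybe x of
                    Just v  -> Right (v, xs)
                    Nothing -> Left ("could not parse field: " ++ x)

readPersonSafe :: [String] -> Either String Person
readPersonSafe = evalStateT $
  Person <$> readFieldSafe <*> readFieldSafe <*> readFieldSafe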
An even more general way is to automate the packing and unpacking of fields into lists using generics. Then all you have to do is derive Generic. If you are interested, I can give an example.
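One possible shape of that idea, as a sketch of my own (limited to single-constructor records, and keeping the partial read from above):

{-# LANGUAGE DeriveGeneric, FlexibleContexts, TypeOperators #-}
import GHC.Generics
import Control.Monad.State

-- Read one field per record component by walking the generic representation.
class GReadFields f where
  gReadFields :: State [String] (f p)

instance GReadFields U1 where                              -- constructor with no fields
  gReadFields = pure U1

instance (GReadFields f, GReadFields g) => GReadFields (f :*: g) where
  gReadFields = (:*:) <$> gReadFields <*> gReadFields      -- products: one field after another

instance GReadFields f => GReadFields (M1 i c f) where     -- metadata wrappers
  gReadFields = M1 <$> gReadFields

instance Read c => GReadFields (K1 i c) where              -- an actual field
  gReadFields = K1 <$> state (\(x : xs) -> (read x, xs))

readRecord :: (Generic a, GReadFields (Rep a)) => [String] -> a
readRecord = evalState (to <$> gReadFields)

-- After adding `Generic` to Person's deriving clause, this works for any such record:
-- readRecord ["\"Alice\"", "\"Smith\"", "30"] :: Person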
Or you can use an existing serialization package, whether binary, for example cereal or binary, or textual, such as aeson or yaml; these usually let you either derive the (de)serialization automatically from Generic or provide your own instances.
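For instance, with aeson the whole thing reduces to deriving Generic (a self-contained sketch with made-up sample data; the JSON keys follow the record selector names by default):

{-# LANGUAGE DeriveGeneric #-}
import GHC.Generics (Generic)
import Data.Aeson (FromJSON, ToJSON, decode, encode)

data Person = Person { name :: String, surname :: String, age :: Int }
  deriving (Show, Generic)

instance ToJSON Person    -- implementations come from the Generic instance
instance FromJSON Person

main :: IO ()
main = do
  let bytes = encode (Person "Alice" "Smith" 30)   -- JSON document as a lazy ByteString
  print (decode bytes :: Maybe Person)             -- Just (Person {name = "Alice", ...})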