How can I prevent Scotty from building large text output in memory?

I have a Scotty / WAI application, and one of the endpoints sends a large Text output built from a list of elements. Here is the relevant code:

  import Data.Text.Lazy as L
  import Data.Text.Lazy.Encoding as E

  class (Show csv) => ToCSV csv where
    toCSV :: csv -> L.Text
    toCSV = pack . show

  instance (ToCSV c) => ToCSV [c] where
    toCSV []     = empty
    toCSV (c:cs) = toCSV c <> "\n" <> toCSV cs


  get "/api/transactions" $ accept "text/csv" $ do
    purp <- selectPurpose
    txs <- allEntries <$> inWeb (listTransactions purp)
    setHeader "Content-Type" "text/csv"
    raw $ E.encodeUtf8 $ toCSV txs
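For reference, the ToCSV pieces above can be exercised on their own; here is a minimal sketch, assuming a hypothetical Int instance that just relies on the default show-based toCSV (the instance is not in my real code, it only serves this example):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.Text.Lazy as L

class Show csv => ToCSV csv where
  toCSV :: csv -> L.Text
  toCSV = L.pack . show

-- hypothetical instance, only for this sketch
instance ToCSV Int

instance ToCSV c => ToCSV [c] where
  toCSV []     = L.empty
  toCSV (c:cs) = toCSV c <> "\n" <> toCSV cs

main :: IO ()
main = print (toCSV ([1, 2, 3] :: [Int]))
-- prints "1\n2\n3\n"
```

Note that the list instance emits a newline after every element, including the last one.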

As I understand the Scotty documentation, the output should be built lazily and sent over the wire without all the text / bytes ever being in memory at once. However, this is not the behavior I observe: when I call this endpoint the server starts to eat memory, so I suppose it builds the whole response before sending it in one go.

Did I miss something?

Edit 1:

I wrote a function doStream that should send the pieces of the resulting bytestring in turn:

  import Data.ByteString.Builder (byteString)
  import qualified Data.ByteString.Lazy as BS
  import Data.Text.Lazy (Text)
  import qualified Data.Text.Lazy.Encoding as E
  import qualified Network.Wai as W

  doStream :: Text -> W.StreamingBody
  doStream t build flush = do
    mapM_ (build . byteString) (BS.toChunks (E.encodeUtf8 t))
    flush

but in reality it still builds the whole output in memory ...



"stream" Web.Scotty.Trans. , , .

StreamingBody, (Builder → IO()) → IO() → IO().

, :

doMyStreaming send flush =
...

, doMyStreaming "raw".
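Putting the above together, here is a minimal sketch of such an endpoint, assuming wai 3 (where StreamingBody is (Builder -> IO ()) -> IO () -> IO ()) and a scotty version exposing stream; the port, route, and bigCSV sample data are made up for the example:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Builder as B
import qualified Data.ByteString.Lazy as BSL
import qualified Data.Text.Lazy as L
import qualified Data.Text.Lazy.Encoding as E
import Web.Scotty

main :: IO ()
main = scotty 3000 $
  get "/api/transactions" $ do
    setHeader "Content-Type" "text/csv"
    -- send each chunk of the lazily encoded Text, flushing as we go,
    -- so the whole response is never forced into memory at once
    stream $ \send flush ->
      mapM_ (\chunk -> send (B.byteString chunk) >> flush)
            (BSL.toChunks (E.encodeUtf8 bigCSV))

-- made-up stand-in for the real transaction data
bigCSV :: L.Text
bigCSV = L.unlines (replicate 100000 "tx,100,EUR")
```

Since no Content-Length is known up front, the response goes out chunked; flushing after every chunk keeps memory flat at the cost of more write calls.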

