I am running the following Haskell quicksort function (a divide-and-conquer algorithm):
qsort :: Ord a => [a] -> [a]
qsort [] = []
qsort (x:xs) =
    qsort smaller ++ [x] ++ qsort larger
  where
    smaller = [a | a <- xs, a < x]   -- elements below the pivot
    larger  = [b | b <- xs, b >= x]  -- elements at or above the pivot
To check its runtime on a reasonably large input, I run the code below in GHCi:
qsort [100000,99999..0]
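For a more reliable measurement, I assume a compiled harness along these lines would also work (a minimal sketch, assuming the Data.Time package is available; last forces the whole sorted list so laziness does not hide the work):

import Data.Time.Clock (diffUTCTime, getCurrentTime)

-- qsort exactly as defined above
qsort :: Ord a => [a] -> [a]
qsort [] = []
qsort (x:xs) =
    qsort smaller ++ [x] ++ qsort larger
  where
    smaller = [a | a <- xs, a < x]
    larger  = [b | b <- xs, b >= x]

main :: IO ()
main = do
    start <- getCurrentTime
    -- last forces the entire sorted list to be produced
    print (last (qsort [100000, 99999 .. 0 :: Int]))
    end <- getCurrentTime
    print (diffUTCTime end start)

Compiled with ghc -O2, this should avoid GHCi's interpretation overhead and the cost of printing all 100,001 numbers at the prompt.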
The GHCi call has been running for up to 40 minutes and is still not done! Even a list of 100,000 elements is not that large, so:
- How can I process a data set with a billion elements?
- Would this sorting algorithm take less time in another language such as C# or Python? Is this a problem specific to Haskell?
- How can I predict the execution time of an algorithm using its complexity?
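For the third question, my rough idea (if this is the right way to think about it) is to measure the time for one input size and scale by the ratio of the complexity function. The hypothetical helper below just computes that ratio:

-- hypothetical helper: predicted slowdown going from size n1 to n2,
-- for a given complexity function f
scaleBy :: (Double -> Double) -> Double -> Double -> Double
scaleBy f n1 n2 = f n2 / f n1

-- in GHCi:
-- scaleBy (\n -> n * logBase 2 n) 1e5 1e9   -- ~ 1.8e4  (n log n)
-- scaleBy (\n -> n * n)           1e5 1e9   -- ~ 1.0e8  (n^2)

So if sorting 10^5 elements takes t seconds, an n log n algorithm should take roughly 1.8 * 10^4 * t on 10^9 elements, while a quadratic one would take about 10^8 * t. Is that a sound way to predict it?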