Stream#filter runs out of memory for 1,000,000 items

Suppose I have a Stream of length 1,000,000 with every element equal to 1.

scala> val million = Stream.fill(1000000)(1)
million: scala.collection.immutable.Stream[Int] = Stream(1, ?)

scala> million filter (x => x % 2 == 0)
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

I get an OutOfMemoryError.

Then I tried the same filter call with a List.

scala> val y = List.fill(1000000)(1)
y: List[Int] = List(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ...

scala> y.filter(x => x % 2 == 0)
res2: List[Int] = List()

And yet it succeeds.

Why does Stream#filter run out of memory here, while List#filter finishes just fine?

Finally, for a large Stream, does filter force evaluation of the entire Stream?

1 answer

The overhead of List is a single object (an instance of ::) with 2 fields (2 pointers) per element.

For Stream, each element needs a Cons cell plus a Function0 instance for the by-name tail (tl: => Stream[A]), which Stream#tail memoizes once it has been forced.

So the per-element overhead of a Stream is roughly twice that of a List.
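
A minimal sketch of the two representations (simplified Scala 2, not the real library code; MyList, MyCons, LazyStream and LazyCons are made-up names) showing where the extra per-element objects come from and why a forced, memoized tail stays reachable:

sealed trait MyList[+A]
case object MyNil extends MyList[Nothing]
// List: one :: cell per element, holding the head value and the next pointer.
final case class MyCons[+A](head: A, tail: MyList[A]) extends MyList[A]

sealed trait LazyStream[+A]
case object LazyEmpty extends LazyStream[Nothing]
// Stream: one Cons cell per element plus a Function0 thunk for the by-name tail.
// The thunk's result is memoized, so every cell that has been forced stays
// alive for as long as the cell itself is reachable.
final class LazyCons[+A](val head: A, tl: => LazyStream[A]) extends LazyStream[A] {
  private[this] var forced = false
  private[this] var tlVal: LazyStream[A] = _
  def tail: LazyStream[A] = {
    if (!forced) { tlVal = tl; forced = true } // memoize the tail on first access
    tlVal
  }
}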

It also does not help that you store the Stream in a val. Had million been a def instead, the cells that filter has already passed over could be garbage collected as it goes.

Because the tail of a Stream is memoized once it is evaluated, the head is kept alive by the val, and filter finds no element that satisfies the predicate, filter ends up forcing, and retaining, all million elements of the Stream.
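
A hedged illustration of that difference (millionDef is just an illustrative name, and whether the def version actually avoids the error depends on the Scala version and heap settings; the point is only what stays reachable):

// Retains everything: the val keeps the head of the Stream alive, so all the
// memoized cells that filter forces while searching for an even element
// remain reachable.
val million = Stream.fill(1000000)(1)
million.filter(_ % 2 == 0)

// Nothing external holds the head: each call to the def builds the Stream
// afresh, so the prefix filter has already inspected can be garbage collected
// as the traversal moves forward.
def millionDef = Stream.fill(1000000)(1)
millionDef.filter(_ % 2 == 0)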
