How can you calculate the size of an Apache Spark DataFrame using PySpark?

Is there a way to calculate the size in bytes of an Apache Spark DataFrame using PySpark?

1 answer

Why not just cache the DataFrame, then look at the Storage tab in the Spark web UI and convert the reported size to bytes?

df.cache() 
