How can I change spark.driver.maxResultSize in the pyspark interactive shell? I used the following code:
    from pyspark import SparkConf, SparkContext
    conf = (SparkConf()
            .set("spark.driver.maxResultSize", "10g"))
    sc.stop()
    sc = SparkContext(conf)
but it gives me this error:

    AttributeError: 'SparkConf' object has no attribute '_get_object_id'
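Looking at the pyspark API docs, I suspect the problem is that the first positional parameter of SparkContext is master, not conf, so the SparkConf may need to be passed by keyword. A minimal sketch of what I think should work in the shell (the "10g" value is just the size I want), though I'm not sure this is the correct approach for the interactive shell:

    from pyspark import SparkConf, SparkContext

    # Stop the context the pyspark shell created automatically
    sc.stop()

    # Build the desired configuration
    conf = SparkConf().set("spark.driver.maxResultSize", "10g")

    # Pass the configuration via the `conf` keyword; the first positional
    # parameter of SparkContext is `master`, not a SparkConf
    sc = SparkContext(conf=conf)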