SparkContext.defaultParallelism
Default level of parallelism to use when not given by the user (e.g., for reduce tasks).