SparkContext.defaultMinPartitions
Default minimum number of partitions for Hadoop RDDs when not given by the user.
New in version 1.1.0.
Examples
>>> sc.defaultMinPartitions > 0
True
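The doctest above only checks that the value is positive. The short sketch below (with "data.txt" as a hypothetical input path, not from the original docs) shows where the default actually applies: file-based readers such as textFile() fall back to it when minPartitions is not supplied.

from pyspark import SparkContext

sc = SparkContext.getOrCreate()

# defaultMinPartitions is defined as min(defaultParallelism, 2) in Spark.
print(sc.defaultMinPartitions)

# No minPartitions argument: textFile() uses sc.defaultMinPartitions as the
# minimum number of input splits ("data.txt" is a placeholder path).
rdd = sc.textFile("data.txt")

# Passing minPartitions explicitly overrides the default.
rdd_explicit = sc.textFile("data.txt", minPartitions=8)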