SparkContext.defaultMinPartitions
Default minimum number of partitions for Hadoop RDDs when not given by the user.
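In Spark's implementation this property is derived from `defaultParallelism`, capped at 2. A minimal plain-Python sketch of that relationship (the helper name `default_min_partitions` is illustrative, not part of the PySpark API; in real code you would read `sc.defaultMinPartitions` from a live `SparkContext`):

```python
def default_min_partitions(default_parallelism: int) -> int:
    """Illustrative sketch: Spark defines defaultMinPartitions as
    min(defaultParallelism, 2), so Hadoop RDDs get at least some
    splitting without forcing many tiny partitions on small clusters."""
    return min(default_parallelism, 2)

# With 8 cores of default parallelism, the floor is capped at 2 partitions.
print(default_min_partitions(8))  # -> 2
# On a single-core local run, it stays at 1.
print(default_min_partitions(1))  # -> 1
```

In a PySpark session, the equivalent would be `sc.defaultMinPartitions`, typically passed as the fallback `minPartitions` value for methods such as `SparkContext.textFile`.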