withResources(profile: pyspark.resource.profile.ResourceProfile) → pyspark.rdd.RDD[T]
Specify a pyspark.resource.ResourceProfile to use when calculating this RDD. This is only supported on certain cluster managers and currently requires dynamic allocation to be enabled. It will result in new executors with the specified resources being acquired to calculate the RDD.
This API is experimental.