pyspark.TaskContext.taskAttemptId
An ID that is unique to this task attempt (within the same SparkContext, no two task attempts will share the same attempt ID). This is roughly equivalent to Hadoop’s TaskAttemptID.