pyspark.TaskContext.taskAttemptId

TaskContext.taskAttemptId() → int

An ID that is unique to this task attempt: within the same SparkContext, no two task attempts share the same attempt ID. This is roughly equivalent to Hadoop's TaskAttemptID.