StructType

class pyspark.sql.types.StructType(fields: Optional[List[pyspark.sql.types.StructField]] = None)

Struct type, consisting of a list of StructField.

This is the data type representing a Row.

Iterating a StructType will iterate over its StructFields. A contained StructField can be accessed by its name or position.

Examples
>>> struct1 = StructType([StructField("f1", StringType(), True)])
>>> struct1["f1"]
StructField('f1', StringType(), True)
>>> struct1[0]
StructField('f1', StringType(), True)
>>> struct1 = StructType([StructField("f1", StringType(), True)])
>>> struct2 = StructType([StructField("f1", StringType(), True)])
>>> struct1 == struct2
True
>>> struct1 = StructType([StructField("f1", CharType(10), True)])
>>> struct2 = StructType([StructField("f1", CharType(10), True)])
>>> struct1 == struct2
True
>>> struct1 = StructType([StructField("f1", VarcharType(10), True)])
>>> struct2 = StructType([StructField("f1", VarcharType(10), True)])
>>> struct1 == struct2
True
>>> struct1 = StructType([StructField("f1", StringType(), True)])
>>> struct2 = StructType([StructField("f1", StringType(), True),
...     StructField("f2", IntegerType(), False)])
>>> struct1 == struct2
False
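A minimal sketch (added for illustration, not from the original reference) of the iteration and length behavior described above, assuming StructType, StructField, StringType, and IntegerType are imported from pyspark.sql.types:

>>> struct = StructType([StructField("f1", StringType(), True),
...     StructField("f2", IntegerType(), False)])
>>> [f.name for f in struct]  # iterating yields each StructField
['f1', 'f2']
>>> len(struct)  # number of fields
2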
Methods

add(field[, data_type, nullable, metadata])
    Construct a StructType by adding new elements to it, to define the schema.

fieldNames()
    Returns all field names in a list.

fromInternal(obj)
    Converts an internal SQL object into a native Python object.

fromJson(json)

json()

jsonValue()

needConversion()
    Does this type need conversion between Python object and internal SQL object.

simpleString()

toInternal(obj)
    Converts a Python object into an internal SQL object.

typeName()

Methods Documentation
add(field: Union[str, pyspark.sql.types.StructField], data_type: Union[str, pyspark.sql.types.DataType, None] = None, nullable: bool = True, metadata: Optional[Dict[str, Any]] = None) → pyspark.sql.types.StructType

Construct a StructType by adding new elements to it, to define the schema. The method accepts either:

- A single parameter which is a StructField object.
- Between 2 and 4 parameters as (name, data_type, nullable (optional), metadata (optional)). The data_type parameter may be either a String or a DataType object.
Parameters

field : str or StructField
    Either the name of the field or a StructField object.

data_type : DataType, optional
    If present, the DataType of the StructField to create.

nullable : bool, optional
    Whether the field to add should be nullable (default True).

metadata : dict, optional
    Any additional metadata (default None).

Returns

StructType
Examples
>>> struct1 = StructType().add("f1", StringType(), True).add("f2", StringType(), True, None)
>>> struct2 = StructType([StructField("f1", StringType(), True),
...     StructField("f2", StringType(), True, None)])
>>> struct1 == struct2
True
>>> struct1 = StructType().add(StructField("f1", StringType(), True))
>>> struct2 = StructType([StructField("f1", StringType(), True)])
>>> struct1 == struct2
True
>>> struct1 = StructType().add("f1", "string", True)
>>> struct2 = StructType([StructField("f1", StringType(), True)])
>>> struct1 == struct2
True
fieldNames() → List[str]

Returns all field names in a list.
Examples
>>> struct = StructType([StructField("f1", StringType(), True)])
>>> struct.fieldNames()
['f1']
fromInternal(obj: Tuple) → pyspark.sql.types.Row

Converts an internal SQL object into a native Python object.
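As an illustrative sketch (not part of the original reference): for a simple string field the internal representation is a plain tuple, and fromInternal rebuilds a Row from it.

>>> struct = StructType([StructField("f1", StringType(), True)])
>>> struct.fromInternal(("hello",))  # internal tuple -> Row
Row(f1='hello')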
classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.StructType
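A hedged sketch (added here for illustration): fromJson is the inverse of jsonValue, so a schema round-trips through its dict form.

>>> struct = StructType([StructField("f1", StringType(), True)])
>>> StructType.fromJson(struct.jsonValue()) == struct
True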
json() → str
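A sketch of the JSON string form (illustrative; the exact key ordering may differ across Spark versions):

>>> StructType([StructField("f1", StringType(), True)]).json()
'{"fields":[{"metadata":{},"name":"f1","nullable":true,"type":"string"}],"type":"struct"}'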
jsonValue() → Dict[str, Any]
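A minimal sketch (not from the original page) of the dict form that jsonValue returns, and which fromJson accepts:

>>> StructType([StructField("f1", StringType(), True)]).jsonValue()
{'type': 'struct', 'fields': [{'name': 'f1', 'type': 'string', 'nullable': True, 'metadata': {}}]}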
needConversion() → bool

Does this type need conversion between Python object and internal SQL object.

This is used to avoid unnecessary conversion for ArrayType/MapType/StructType.
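Since rows must be converted to and from internal tuples, StructType reports that it needs conversion; a sketch, assuming current PySpark behavior:

>>> StructType([StructField("f1", StringType(), True)]).needConversion()
True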
simpleString() → str
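An illustrative sketch (not from the original page) of the compact schema string:

>>> StructType([StructField("f1", StringType(), True)]).simpleString()
'struct<f1:string>'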
toInternal(obj: Tuple) → Tuple

Converts a Python object into an internal SQL object.
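As a hedged sketch (added for illustration), toInternal also accepts a dict keyed by field name and produces the internal tuple:

>>> struct = StructType([StructField("f1", StringType(), True)])
>>> struct.toInternal({"f1": "hello"})  # dict -> internal tuple
('hello',)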
classmethod typeName() → str
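A trivial illustrative sketch of the type name:

>>> StructType.typeName()
'struct'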