MapType

class pyspark.sql.types.MapType(keyType: pyspark.sql.types.DataType, valueType: pyspark.sql.types.DataType, valueContainsNull: bool = True)

Map data type.

Parameters
keyType : DataType

DataType of the keys in the map.

valueType : DataType

DataType of the values in the map.

valueContainsNull : bool, optional

Indicates whether values can contain null (None) values (default: True).

Notes

Keys in a map data type are not allowed to be null (None).

Examples

>>> (MapType(StringType(), IntegerType())
...        == MapType(StringType(), IntegerType(), True))
True
>>> (MapType(StringType(), IntegerType(), False)
...        == MapType(StringType(), FloatType()))
False

Methods

fromInternal(obj)

Converts an internal SQL object into a native Python object.

fromJson(json)

json()

jsonValue()

needConversion()

Does this type need conversion between Python object and internal SQL object.

simpleString()

toInternal(obj)

Converts a Python object into an internal SQL object.

typeName()

Methods Documentation

fromInternal(obj: Dict[T, Optional[U]]) → Dict[T, Optional[U]]

Converts an internal SQL object into a native Python object.

classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.MapType
json() → str
jsonValue() → Dict[str, Any]
needConversion() → bool

Does this type need conversion between Python object and internal SQL object.

This is used to avoid the unnecessary conversion for ArrayType/MapType/StructType.

simpleString() → str
toInternal(obj: Dict[T, Optional[U]]) → Dict[T, Optional[U]]

Converts a Python object into an internal SQL object.

classmethod typeName() → str