# StandardScalerModel

class pyspark.mllib.feature.StandardScalerModel(java_model: py4j.java_gateway.JavaObject)

Represents a StandardScaler model that can transform vectors.

Methods

| Method | Description |
|---|---|
| `call(name, *a)` | Call a method of the underlying `java_model`. |
| `setWithMean(withMean)` | Set whether the model subtracts the column means before scaling. |
| `setWithStd(withStd)` | Set whether the model scales the data to unit standard deviation. |
| `transform(vector)` | Apply the standardization transformation to a vector. |

Attributes

| Attribute | Description |
|---|---|
| `mean` | Return the column mean values. |
| `std` | Return the column standard deviation values. |
| `withMean` | Whether the model centers the data before scaling. |
| `withStd` | Whether the model scales the data to unit standard deviation. |

Methods Documentation

call(name: str, *a: Any) → Any

Call a method of the underlying `java_model`.

setWithMean(withMean: bool) → pyspark.mllib.feature.StandardScalerModel

Set whether the model subtracts the column means (centering) before scaling. Returns the model itself, so calls can be chained.

setWithStd(withStd: bool) → pyspark.mllib.feature.StandardScalerModel

Set whether the model scales the data to unit standard deviation. Returns the model itself, so calls can be chained.

transform(vector: Union[VectorLike, pyspark.rdd.RDD[VectorLike]]) → Union[pyspark.mllib.linalg.Vector, pyspark.rdd.RDD[pyspark.mllib.linalg.Vector]]

Applies the standardization transformation to a vector or an RDD of vectors.

Parameters

- **vector** – Input vector(s) to be standardized.

Returns

- `pyspark.mllib.linalg.Vector` or `pyspark.RDD` – Standardized vector(s). If a column has zero variance, 0.0 is returned for that column.

Notes

In Python, transform cannot currently be used within an RDD transformation or action. Call transform directly on the RDD instead.
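To make the transform semantics concrete, here is a plain-Python sketch (no Spark involved) of the standardization described above: optionally subtract each column's mean, then divide by its standard deviation, with zero-variance columns mapped to 0.0. The helper name and data are hypothetical, and it assumes MLlib's sample (n−1) standard-deviation convention.

```python
def standardize(rows, with_mean=True, with_std=True):
    """Column-wise standardization sketch matching the documented behavior."""
    n = len(rows)
    cols = list(zip(*rows))
    means = [sum(c) / n for c in cols]
    # Sample standard deviation (ddof=1), assumed to match MLlib's convention.
    stds = [(sum((x - m) ** 2 for x in c) / (n - 1)) ** 0.5
            for c, m in zip(cols, means)]
    out = []
    for row in rows:
        scaled = []
        for x, m, s in zip(row, means, stds):
            v = x - m if with_mean else x
            if with_std:
                # A zero-variance column yields 0.0, per the Returns note above.
                scaled.append(v / s if s != 0.0 else 0.0)
            else:
                scaled.append(v)
        out.append(scaled)
    return out
```

In Spark itself, the same operation is applied to a `Vector` or, element-wise, to an `RDD` of vectors.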

Attributes Documentation

mean

Return the column mean values.

std

Return the column standard deviation values.

withMean

Returns whether the model centers the data (subtracts the mean) before scaling.

withStd

Returns whether the model scales the data to unit standard deviation.
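As a point of reference for the `mean` and `std` attributes, they expose per-column statistics of the fitted data; the illustration below uses plain Python with made-up data (no Spark), assuming the sample (n−1) standard-deviation convention used by MLlib.

```python
import statistics

# Hypothetical 3-row, 2-column dataset; not tied to any Spark API.
data = [[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]]
cols = list(zip(*data))

# Column means and sample standard deviations (ddof=1), the quantities
# the model's mean and std attributes are assumed to report.
means = [statistics.fmean(c) for c in cols]
stds = [statistics.stdev(c) for c in cols]
```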