pyspark.sql.functions.crc32

pyspark.sql.functions.crc32(col: ColumnOrName) → pyspark.sql.column.Column

Calculates the cyclic redundancy check value (CRC32) of a binary column and returns the value as a bigint.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
  - col : Column or str
    target column to compute on.
- Returns
  - Column
    the column of computed results.
New in version 1.5.0.
Examples
>>> spark.createDataFrame([('ABC',)], ['a']).select(crc32('a').alias('crc32')).collect()
[Row(crc32=2743272264)]
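Spark's crc32 computes the standard CRC-32 checksum, so the result above can be cross-checked on the driver with Python's standard library, without a SparkSession. A minimal sketch (the input bytes mirror the example's 'ABC' string):

```python
import zlib

# Spark's crc32 uses the standard CRC-32 polynomial, so the stdlib
# produces the same value that the DataFrame example returns.
checksum = zlib.crc32(b'ABC')
print(checksum)  # 2743272264, matching Row(crc32=2743272264) above
```

Note that the Spark column holds the checksum as a bigint, while zlib.crc32 returns an unsigned Python int; both represent the same 32-bit value.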