pyspark.sql.functions.try_divide
pyspark.sql.functions.try_divide(left, right)
Returns left/right (the dividend divided by the divisor). Numeric inputs always use floating point division, and the result is NULL whenever the divisor is 0, even under ANSI mode, where plain division would raise an error instead.
New in version 3.5.0.
Examples
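The examples below assume an active SparkSession bound to the name spark, as in the PySpark shell. A minimal sketch to create one locally (the app name is an arbitrary placeholder):

>>> from pyspark.sql import SparkSession
>>> spark = SparkSession.builder.appName("try_divide-examples").getOrCreate()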
Example 1: Integer divided by Integer.
>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [(6000, 15), (1990, 2), (1234, 0)], ["a", "b"]
... ).select(sf.try_divide("a", "b")).show()
+----------------+
|try_divide(a, b)|
+----------------+
|           400.0|
|           995.0|
|            NULL|
+----------------+
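Both arguments may also be Column expressions, and literals work as well. A small sketch of this on a fresh single-row frame; the printed table is what show() should produce under default settings:

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(6000,)], ["a"])
>>> df.select(sf.try_divide(df.a, sf.lit(3))).show()
+----------------+
|try_divide(a, 3)|
+----------------+
|          2000.0|
+----------------+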
Example 2: Interval divided by Integer.
>>> import pyspark.sql.functions as sf
>>> spark.range(4).select(
...     sf.try_divide(sf.make_interval(sf.lit(1)), "id")
... ).show()
+--------------------------------------------------+
|try_divide(make_interval(1, 0, 0, 0, 0, 0, 0), id)|
+--------------------------------------------------+
|                                              NULL|
|                                           1 years|
|                                          6 months|
|                                          4 months|
+--------------------------------------------------+
Example 3: A division that would raise an exception under ANSI mode instead results in NULL.
>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     df = spark.range(1)
...     df.select(sf.try_divide(df.id, sf.lit(0))).show()
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
+-----------------+
|try_divide(id, 0)|
+-----------------+
|             NULL|
+-----------------+
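try_divide is also available as a function in Spark SQL, so the same NULL-instead-of-error behavior is reachable from a SQL query. A brief sketch; the alias quotient is arbitrary:

>>> spark.sql("SELECT try_divide(6, 0) AS quotient").show()
+--------+
|quotient|
+--------+
|    NULL|
+--------+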