pyspark.StorageLevel
- class pyspark.StorageLevel(useDisk, useMemory, useOffHeap, deserialized, replication=1)
Flags for controlling the storage of an RDD. Each StorageLevel records whether to use memory, whether to drop the RDD to disk if it falls out of memory, whether to keep the data in memory in a Java-specific serialized format, and whether to replicate the RDD partitions on multiple nodes. It also contains static constants for some commonly used storage levels, such as MEMORY_ONLY. Since the data is always serialized on the Python side, all the constants use the serialized formats.
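A minimal usage sketch, assuming a local SparkContext; the application name, sample data, and variable names below are illustrative assumptions, not part of this reference:

    from pyspark import SparkContext, StorageLevel

    sc = SparkContext("local[2]", "storage-level-sketch")  # assumed local context for illustration
    rdd = sc.parallelize(range(100))                       # illustrative sample data

    # Persist with a predefined constant; on the Python side this is a serialized format.
    rdd.persist(StorageLevel.MEMORY_AND_DISK)

    # An equivalent level built from the constructor flags:
    # useDisk=True, useMemory=True, useOffHeap=False, deserialized=False, replication=1
    custom_level = StorageLevel(True, True, False, False, 1)
    print(custom_level)  # StorageLevel(True, True, False, False, 1)

    sc.stop()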
Attributes
NONE