pyspark.SparkContext.addPyFile
SparkContext.addPyFile(path)
Add a .py or .zip dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), or an HTTP, HTTPS or FTP URI.
New in version 0.7.0.
Parameters
path : str
path to either a .py file or a .zip dependency.
Notes
A path can be added only once. Subsequent additions of the same path are ignored.
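Examples
A .zip dependency shipped with addPyFile is placed on each executor's sys.path, so modules inside it become importable in tasks. The sketch below builds such an archive and demonstrates the import mechanism locally via sys.path; the module name dep_mod is hypothetical, for illustration only.

```python
import os
import sys
import tempfile
import zipfile

# Build a .zip dependency containing one small module.
d = tempfile.mkdtemp()
mod_path = os.path.join(d, "dep_mod.py")
with open(mod_path, "w") as f:
    f.write("def times_ten(x):\n    return x * 10\n")

zip_path = os.path.join(d, "dep.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
    z.write(mod_path, os.path.basename(mod_path))

# addPyFile effectively puts the archive on each executor's sys.path;
# locally, the same import mechanism looks like this:
sys.path.insert(0, zip_path)
from dep_mod import times_ten
print(times_ten(3))  # -> 30
```

On a live SparkContext the equivalent step is sc.addPyFile(zip_path); after that, functions passed to map and other transformations can import dep_mod inside the task.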