Accessing Hadoop configuration from PySpark

Normally you don't need to access the underlying Hadoop configuration when you're using PySpark, but in case you do, you can reach it through the JVM gateway like this (the key and value passed to `set` below are just placeholders):

```python
from pyspark.sql import SparkSession

# Extract the configuration from the underlying JavaSparkContext
# (spark._jsc is internal API, hence the leading underscore)
spark = SparkSession.builder.getOrCreate()
hadoop_config = spark._jsc.hadoopConfiguration()

# Set a new config value (illustrative key/value)
hadoop_config.set("fs.defaultFS", "hdfs://namenode:8020")
```
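If the setting is known up front, a cleaner alternative is a sketch like the one below: Spark copies any property prefixed with `spark.hadoop.` into the Hadoop configuration at session startup, so there's no need to reach into `_jsc` to write it (the `fs.s3a.connection.timeout` key here is illustrative):

```python
from pyspark.sql import SparkSession

# Hadoop properties can also be supplied at build time: Spark strips the
# "spark.hadoop." prefix and copies the rest into the Hadoop configuration.
spark = (
    SparkSession.builder
    .config("spark.hadoop.fs.s3a.connection.timeout", "5000")  # illustrative key
    .getOrCreate()
)

# Read the value back through the JVM-side Hadoop configuration
print(spark._jsc.hadoopConfiguration().get("fs.s3a.connection.timeout"))
```

Setting values at build time also avoids mutating a configuration that already-running jobs may be reading.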