Problem description:

I am fairly new to development in Spark and Scala.

I am able to set properties at runtime on the Spark session using the config method, like below:

val spark = SparkSession.builder()
  .master("local")
  .config("spark.files.overwrite", "true")
  .getOrCreate()

The above code lets me set properties at the Spark session level, but I want to set properties at the DataFrame level. Regarding this I have a few questions:

  1. Is there any way to achieve this?
  2. If yes, will it affect the parallelism achieved by Spark?
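
For context, the closest per-DataFrame mechanism I am aware of is passing options on the reader or writer, which apply only to that one read or write rather than to the whole session. A minimal sketch (the file name data.csv, the output path out, and the header option are illustrative assumptions, not part of my actual job):

```scala
import org.apache.spark.sql.SparkSession

object PerDataFrameOptions {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local")
      .getOrCreate()

    // Options set on the DataFrameReader are scoped to this one read,
    // not to the session as a whole.
    val df = spark.read
      .option("header", "true") // assumption: data.csv has a header row
      .csv("data.csv")

    // Likewise, DataFrameWriter options and modes apply only to this write.
    df.write
      .mode("overwrite")
      .csv("out")

    spark.stop()
  }
}
```

This is only what I have pieced together from the reader/writer APIs; I am unsure whether it covers settings like spark.files.overwrite, hence the questions above.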
