By default, Spark uses file:/tmp/spark-events as the log directory for the history server, and your log clearly says that spark.history.fs.logDirectory is not configured.
First of all, you could create a spark-events folder in /tmp (which is not ideal, since /tmp is cleared every time the machine reboots) and then add spark.history.fs.logDirectory to spark-defaults.conf, pointing it to that directory. However, I suggest you create a different folder that the Spark user can access, and update spark-defaults.conf accordingly.
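For instance, a minimal sketch of preparing such a directory, assuming your Spark processes run as a user named spark and you pick the /opt/spark-events path used in the example further below (adjust both to your setup):

    # create the event log directory and hand it to the spark user
    sudo mkdir -p /opt/spark-events
    sudo chown spark:spark /opt/spark-events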
You need to define two more properties in the spark-defaults.conf file:
    spark.eventLog.dir            file:<path where you want to store your logs>
    spark.history.fs.logDirectory file:<same path as above>
Suppose, for example, that you want to store the logs in /opt/spark-events, a directory the spark user has access to. The parameters specified in spark-defaults.conf would then be:
    spark.eventLog.enabled        true
    spark.eventLog.dir            file:/opt/spark-events
    spark.history.fs.logDirectory file:/opt/spark-events
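Once the directory exists and the properties are set, you can start the history server; a short sketch, assuming SPARK_HOME points to your Spark installation:

    # start the history server, which reads spark.history.fs.logDirectory from spark-defaults.conf
    $SPARK_HOME/sbin/start-history-server.sh

The history server web UI is then available on port 18080 by default.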
You can find more information in the Monitoring and Instrumentation section of the Spark documentation.