Post-Upgrade Steps for Spark
After you upgrade Spark with the MapR Installer, perform the following steps.
Post-Upgrade Steps for Spark Standalone Mode
Procedure
- Migrate custom configurations (optional). Migrate any custom configuration settings into the new default files in the conf directory (/opt/mapr/spark/spark-<version>/conf).
- If Spark SQL is configured to work with Hive, copy the hive-site.xml file into the conf directory (/opt/mapr/spark/spark-<version>/conf).
- Run the following commands to configure the slaves:
- Restart all of the Spark slaves as the mapr user:
/opt/mapr/spark/spark-<version>/sbin/start-slaves.sh spark://<comma-separated list of spark master hostname:port>
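The master URL passed to start-slaves.sh is a single spark:// URL listing every master as host:port, separated by commas. A minimal helper that builds that URL from a host list (the hostnames and the default standalone port 7077 below are illustrative assumptions, not values from this document):

```shell
# Build the spark:// master URL expected by start-slaves.sh from a
# list of host:port pairs (hostnames here are hypothetical examples).
spark_master_url() {
  local joined
  # Joining "$*" with IFS set to "," produces host1:port,host2:port,...
  joined=$(IFS=,; printf '%s' "$*")
  printf 'spark://%s\n' "$joined"
}

# Example: two masters on the default standalone port 7077.
spark_master_url master1:7077 master2:7077
# -> spark://master1:7077,master2:7077
```

The resulting string is what you substitute for `<comma-separated list of spark master hostname:port>` in the restart command above.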
Post-Upgrade Steps for Spark on YARN
Procedure
- Migrate custom configurations (optional). Migrate any custom configuration settings into the new default files in the conf directory (/opt/mapr/spark/spark-<version>/conf). Also, if you previously configured Spark to use the Spark JAR file from a location on MapR-FS, copy the latest JAR file to MapR-FS and reconfigure the path to the JAR file in the spark-defaults.conf file. See Configure Spark JAR Location.
- If Spark SQL is configured to work with Hive, copy the hive-site.xml file into the conf directory (/opt/mapr/spark/spark-<version>/conf).
- Start the spark-historyserver service (if installed):
maprcli node services -nodes <node-ip> -name spark-historyserver -action start
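Reconfiguring the JAR path amounts to setting (or replacing) a single property line in spark-defaults.conf, which stores one whitespace-separated key/value pair per line. A sketch of an idempotent helper, shown against a scratch file; the property name `spark.yarn.archive` is a standard Spark option, but the MapR-FS path used here is a hypothetical example:

```shell
# Idempotently set a key/value pair in a spark-defaults.conf-style
# file (one whitespace-separated "key value" pair per line).
set_spark_conf() {
  local file=$1 key=$2 value=$3
  if grep -q "^${key}[[:space:]]" "$file" 2>/dev/null; then
    # Key already present: replace the existing line.
    sed -i "s|^${key}[[:space:]].*|${key} ${value}|" "$file"
  else
    # Key absent: append a new line.
    printf '%s %s\n' "$key" "$value" >> "$file"
  fi
}

# Example against a scratch copy (the maprfs:// path is hypothetical).
conf=$(mktemp)
set_spark_conf "$conf" spark.yarn.archive maprfs:///apps/spark/jars-old.zip
set_spark_conf "$conf" spark.yarn.archive maprfs:///apps/spark/jars.zip
cat "$conf"
# -> spark.yarn.archive maprfs:///apps/spark/jars.zip
```

Running the helper twice with the same key leaves exactly one line for that key, so it is safe to re-run after repeated upgrades.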