Post-Upgrade Steps for Spark
Complete the following steps after you upgrade Spark with or without the MapR Installer.
Post-Upgrade Steps for Spark Standalone Mode
Procedure
- (Optional) Migrate Custom Configurations. Migrate any custom configuration settings into the new default files in the conf directory (/opt/mapr/spark/spark-<version>/conf).
- If Spark SQL is configured to work with Hive, copy the hive-site.xml file into the conf directory (/opt/mapr/spark/spark-<version>/conf).
- Run the following commands to configure the slaves:
- For upgrades without the MapR Installer, start the spark-master and spark-historyserver services (if installed):
maprcli node services -nodes <node-ip> -name spark-master -action start
maprcli node services -nodes <node-ip> -name spark-historyserver -action start
- Restart all the Spark slaves as the mapr user:
/opt/mapr/spark/spark-<version>/sbin/start-slaves.sh spark://<comma-separated list of spark master hostname:port>
- Delete the old Spark directory from /opt/mapr/spark. For example, if you upgraded from Spark 1.6.1 to 2.0.1, you need to delete /opt/mapr/spark/spark-1.6.1.
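The restart steps above can be sketched as a small script run as the mapr user. The node IP, Spark version, and master host:port below are placeholder values, not from this document; substitute your own. The commands are echoed rather than executed so the sketch is safe to run outside a cluster:

```shell
#!/bin/sh
# Sketch of the standalone restart steps; all three values are placeholders.
NODE_IP="10.10.100.1"                 # assumption: a node running spark-master
SPARK_VERSION="2.0.1"                 # assumption: the new Spark version
MASTERS="master1.example.com:7077"    # assumption: comma-separated master host:port list

# Start the spark-master and (if installed) spark-historyserver services.
START_MASTER="maprcli node services -nodes ${NODE_IP} -name spark-master -action start"
START_HISTORY="maprcli node services -nodes ${NODE_IP} -name spark-historyserver -action start"

# Restart the slaves against the new master(s).
START_SLAVES="/opt/mapr/spark/spark-${SPARK_VERSION}/sbin/start-slaves.sh spark://${MASTERS}"

# Echoed for safety; drop the echoes to run the commands for real.
echo "$START_MASTER"
echo "$START_HISTORY"
echo "$START_SLAVES"
```

Dropping the echoes executes the same three commands shown in the procedure above, in the same order.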
Post-Upgrade Steps for Spark on YARN
Procedure
- (Optional) Migrate Custom Configurations. Migrate any custom configuration settings into the new default files in the conf directory (/opt/mapr/spark/spark-<version>/conf). Also, if you previously configured Spark to use the Spark JAR file from a location on the MapR-FS, you need to copy the latest JAR file to the MapR-FS and reconfigure the path to the JAR file in the spark-defaults.conf file. See Configure Spark JAR Location.
- If Spark SQL is configured to work with Hive, copy the hive-site.xml file into the conf directory (/opt/mapr/spark/spark-<version>/conf).
- For upgrades without the MapR Installer, start the spark-historyserver service (if installed). If you upgrade using the MapR Installer, the spark-historyserver service is started automatically.
maprcli node services -nodes <node-ip> -name spark-historyserver -action start
- Delete the old Spark directory from /opt/mapr/spark. For example, if you upgraded from Spark 1.6.1 to 2.0.1, you need to delete /opt/mapr/spark/spark-1.6.1.
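The migration and cleanup steps shared by both procedures can be sketched as follows. The script below uses throwaway directories created with mktemp so it is safe to run; in practice OLD and NEW would be the real /opt/mapr/spark/spark-<version> trees (1.6.1 to 2.0.1 is the example from this section). The spark.yarn.archive property and the maprfs:// path are assumptions standing in for whatever Configure Spark JAR Location prescribes:

```shell
#!/bin/sh
# Sketch of the conf migration and cleanup, demonstrated on throwaway
# directories; in practice OLD and NEW are the real installation trees.
BASE="$(mktemp -d)"
OLD="$BASE/spark-1.6.1"    # stands in for /opt/mapr/spark/spark-1.6.1
NEW="$BASE/spark-2.0.1"    # stands in for /opt/mapr/spark/spark-2.0.1
mkdir -p "$OLD/conf" "$NEW/conf"
printf '<configuration/>\n' > "$OLD/conf/hive-site.xml"

# If Spark SQL is configured to work with Hive, carry hive-site.xml
# over to the new conf directory.
cp "$OLD/conf/hive-site.xml" "$NEW/conf/"

# Assumption: the JAR location is re-pointed via a spark-defaults.conf
# property such as spark.yarn.archive; the maprfs:// path is illustrative only.
printf 'spark.yarn.archive maprfs:///apps/spark/spark-jars.zip\n' >> "$NEW/conf/spark-defaults.conf"

# Once the new installation is verified, delete the old Spark directory.
rm -rf "$OLD"
```

Deleting the old directory last gives you a fallback copy of the previous configuration until the new installation has been verified.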