Spark 1.5.2-1512 Release Notes
The notes below relate specifically to the MapR Distribution for Apache Hadoop. You may also be interested in the open source Spark 1.5.2 Release Notes.
Spark Version | 1.5.2 |
Release Date | December 21, 2015 |
MapR Version Interoperability | See Spark Support Matrix. |
Source on GitHub | https://github.com/mapr/spark |
Package Names | The following packages are associated with this release: |
New in This Release
This release of Apache Spark for MapR includes the following features:
- Support for SparkR (R on Spark)
For details on the features available in the open source version of this component, see the Apache Spark documentation.
Hive Support
This version of Spark supports integration with Hive. However, note the following exceptions:
- Hive on Spark is not supported.
- Spark SQL is supported, but it is not fully compatible with Hive; see the Apache Spark documentation and the MapR Spark documentation for details. A minimal usage sketch follows this list.
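As a quick illustration of the supported integration path, the sketch below uses Spark SQL's HiveContext (the Hive-aware entry point in Spark 1.5.x) to query an existing Hive table. This is a minimal example, not a MapR-specific procedure: the application name and the web_logs table are hypothetical, and the cluster's hive-site.xml must be on the application classpath.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveQueryExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("spark-sql-hive-example")
    val sc = new SparkContext(conf)

    // HiveContext picks up hive-site.xml from the classpath and connects to the
    // Hive metastore, so existing Hive tables become visible to Spark SQL.
    val hiveContext = new HiveContext(sc)

    // "web_logs" is a hypothetical table; substitute a table that exists in your metastore.
    val hits = hiveContext.sql(
      "SELECT status, COUNT(*) AS hits FROM web_logs GROUP BY status")
    hits.show()

    sc.stop()
  }
}
```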
Fixes
This MapR release includes the following fixes on top of the base Apache release. For complete details, refer to the commit log for this project on GitHub.
GitHub Commit | Date (YYYY-MM-DD) | Comment |
---|---|---|
a7dad34 | 2015-11-25 | MAPR-21570: The Spark Master no longer fails to start when it is configured to be highly available. |
cdd328a | 2015-11-19 | MAPR-21525: The HBase version is now set to 0.98.12 in the /opt/mapr/spark/spark-1.5.2/mapr-util/compatibility.version file. |
0d4c58e | 2015-11-02 | MAPR-21243: With Spark on YARN, spark.sql.hive.metastore.sharedPrefixes is now set automatically based on the mode that is used to submit the job (see the sketch after this table). |
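For reference, spark.sql.hive.metastore.sharedPrefixes can also be set explicitly in application code. The sketch below is illustrative only: the value shown is the stock Spark 1.5.x default for the property, not the prefixes that MapR configures automatically for a given submit mode, and the application name is hypothetical.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object SharedPrefixesExample {
  def main(args: Array[String]): Unit = {
    // Illustrative value only: these are the stock Spark 1.5.x defaults, not the
    // prefixes that the MapR packaging sets based on the submit mode.
    val conf = new SparkConf()
      .setAppName("shared-prefixes-example")
      .set("spark.sql.hive.metastore.sharedPrefixes",
        "com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,oracle.jdbc")

    val sc = new SparkContext(conf)
    val hiveContext = new HiveContext(sc)

    // Any metastore-backed operation exercises the shared class loader.
    hiveContext.sql("SHOW TABLES").show()

    sc.stop()
  }
}
```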
Known Issues
- MAPR-17271: On secure clusters, the MapR Control System (MCS) does not display links for Spark-Master and Spark-HistoryServer.
- MAPR-19761: On a secure cluster, MapR does not support the Spark SQL Thrift JDBC server. When the cluster is secure, the Spark Thrift server will not start.
- MAPR-20263: On a secure cluster, MapR does not support submitting jobs that interact with the Hive metastore in yarn-cluster mode. When the cluster is secure, such jobs will not complete successfully.
- Spark versions up to and including 2.3.0 are affected by CVE-2018-1334, an Apache Spark local privilege escalation vulnerability.