Installing the Data Fabric Client on Windows (Non-FIPS)

Installing the HPE Ezmeral Data Fabric client makes it possible to access the file system from a Windows workstation.

Compatibility with Network Address Translation (NAT) Adapters

In VM environments, the Data Fabric client on Windows works with a NAT virtual adapter only when it is the sole virtual adapter configured for the VM. If you want to use more than one virtual adapter, use other (non-NAT) adapter types. If you configure multiple NAT adapters in your VM environment, your jobs and file-system operations will fail.

Use these steps to install the client:

  1. To use the client with Release 7.0.0, make sure that a supported distribution of Java 11 is installed on the Windows computer. See Java Support Matrix. To check the Java version, use this command in the Windows command prompt:
    java -version
  2. Create the \opt\mapr directory on your C: drive (or on another hard drive of your choosing). You can use Windows Explorer, or type the following at the command prompt:
    mkdir c:\opt\mapr
  3. Add the following environment variables:
    System Variable   Value
    JAVA_HOME         C:\jdk-11
                      NOTE The path that you set for the JAVA_HOME environment variable must not include spaces.
    MAPR_HOME         C:\opt\mapr
    PATH              %JAVA_HOME%\bin
                      %MAPR_HOME%\bin
                      %MAPR_HOME%\hadoop\hadoop-2.7.6\bin
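    If you prefer to set the variables from the command line, the built-in setx command writes them for future sessions. This is a sketch, assuming Java 11 is installed in C:\jdk-11; note that setx values take effect only in newly opened command prompts, and setx truncates PATH values longer than 1024 characters:

    ```shell
    :: Sketch: persist the three variables with setx (assumes Java in C:\jdk-11).
    setx JAVA_HOME "C:\jdk-11"
    setx MAPR_HOME "C:\opt\mapr"
    :: Literal paths are used here because %JAVA_HOME% and %MAPR_HOME% are not
    :: visible to the current session until a new command prompt is opened.
    setx PATH "%PATH%;C:\jdk-11\bin;C:\opt\mapr\bin;C:\opt\mapr\hadoop\hadoop-2.7.6\bin"
    ```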
  4. After adding the environment variables, close and reopen the command prompt so that the new values take effect.

  5. Download the client package archive:
    IMPORTANT To access the Data Fabric internet repository, you must specify the user name (email) and token of an HPE Passport account. For more information, see Using the HPE Ezmeral Token-Authenticated Internet Repository.
    1. Navigate to the Internet repository:
      https://package.ezmeral.hpe.com/releases/v<version>/windows/<package name>
    2. Download the mapr-client-7.0.0.0 package to C:\opt\mapr.
    3. Extract the archive by right-clicking the file and selecting Extract All....
    4. Specify C:\opt\mapr\ as the folder where the files are extracted. If you extract the files to a subfolder of C:\opt\mapr\, such as C:\opt\mapr\mapr-client-7.0.0.0.<timestamp>, the configure.bat command can return errors.
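    The extraction can also be done from PowerShell. This is a sketch that assumes the downloaded package is a .zip archive named with the <timestamp> placeholder used in this guide, extracted directly into C:\opt\mapr to avoid the subfolder problem noted above:

    ```shell
    # Sketch (PowerShell): extract the client archive directly into C:\opt\mapr so
    # the files do not land in a versioned subfolder. The archive name uses the
    # <timestamp> placeholder from this guide; substitute the real file name.
    Expand-Archive -Path "C:\opt\mapr\mapr-client-7.0.0.0.<timestamp>.zip" -DestinationPath "C:\opt\mapr"
    ```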
  6. At the command prompt, run configure.bat to configure the client.

    In the following examples:
    • -N specifies the cluster name.
    • -c (lowercase) specifies a client configuration.
    • -secure is added if the cluster is secure.
    • -C (uppercase) specifies the CLDB nodes.
    • -HS specifies the HistoryServer node.
    • 7222 is the default port for the CLDB node.
    To ensure that the client can connect in the event of a CLDB node failure, you can optionally specify all CLDB nodes. For details about the syntax, parameters, and behavior of configure.bat, see configure.sh.
    Secure cluster example
    server\configure.bat -N <cluster_name> -c -secure -C mynode01:7222,mynode02:7222,mynode03:7222
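    For comparison, a non-secure cluster uses the same command without the -secure option; the node names below are the same hypothetical examples:

    ```shell
    :: Non-secure cluster example (hypothetical node names; note the absence of -secure).
    server\configure.bat -N <cluster_name> -c -C mynode01:7222,mynode02:7222,mynode03:7222
    ```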
  7. To use this client with a secure cluster or clusters, copy the following files from the /opt/mapr/conf directory on the cluster to the C:\opt\mapr\conf directory on the client:
    • ssl_truststore
    • ssl-client.xml
    • maprtrustcreds.jceks
    • maprtrustcreds.conf
    If this client will connect to multiple clusters, you must merge the ssl_truststore files on the server by using the /opt/mapr/server/manageSSLKeys.sh tool, and then copy the merged file to C:\opt\mapr\conf on the client. For an example of merging the ssl_truststore files, see step 3 in Configuring Secure Clusters for Running Commands Remotely.
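    As a sketch of the merge step (run on a cluster node, not on the Windows client), assuming the second cluster's ssl_truststore has been staged at the hypothetical path /tmp/ssl_truststore_cluster2; see the manageSSLKeys.sh documentation for the exact syntax:

    ```shell
    # Sketch (run on a cluster node): merge a second cluster's truststore into this
    # cluster's ssl_truststore, then copy the merged file to the client.
    # /tmp/ssl_truststore_cluster2 is a hypothetical staging path for illustration.
    /opt/mapr/server/manageSSLKeys.sh merge /tmp/ssl_truststore_cluster2 /opt/mapr/conf/ssl_truststore
    ```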

    For more information about connecting to a secure cluster, see Managing Secure Clusters.

  8. On the Windows computer, create a ticket:
    maprlogin password -user <DataFabricUserName>
    This command creates a ticket for <DataFabricUserName>, typically at the following location:
    C:\Users\<WindowsUserName>\AppData\Local\Temp\maprticket_<WindowsUserName>
    NOTE If you intend to run MapReduce jobs as <DataFabricUserName>, set the MAPR_TICKETFILE_LOCATION system variable to C:\Users\<WindowsUserName>\AppData\Local\Temp\maprticket_<DataFabricUserName>.
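    The ticket creation and the optional MAPR_TICKETFILE_LOCATION setting can be sketched as follows; set (unlike setx) affects only the current session, and the placeholder names come from this step:

    ```shell
    :: Sketch: create a ticket, then point MAPR_TICKETFILE_LOCATION at it for the
    :: current session. <DataFabricUserName> and <WindowsUserName> are placeholders.
    maprlogin password -user <DataFabricUserName>
    set MAPR_TICKETFILE_LOCATION=C:\Users\<WindowsUserName>\AppData\Local\Temp\maprticket_<DataFabricUserName>
    ```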
  9. Use the hadoop fs -ls / command to check for connectivity to the cluster. For example:
    hadoop fs -ls /
    22/02/01 15:59:26 INFO util.log: Logging initialized @2631ms to org.eclipse.jetty.util.log.Slf4jLog
    Found 5 items
    drwxr-xr-x   - uid_1000 gid_1000          4 2022-01-28 12:19 /apps
    drwxr-xr-x   - uid_1000 gid_1000          0 2022-01-27 19:49 /opt
    drwxrwxrwx   - uid_1000 gid_1000          0 2022-01-27 19:46 /tmp
    drwxr-xr-x   - uid_1000 gid_1000          1 2022-01-27 19:49 /user
    drwxr-xr-x   - uid_1000 gid_1000          2 2022-01-27 19:49 /var
For more information about running Hadoop commands on Windows, see Running Hadoop Commands on a Mac and Windows Client.