Preparing Each Node
This section defines the minimum requirements for each node in your cluster.
Every node contributes to the cluster, so each node must be able to run data-fabric and Hadoop software. Nodes must meet minimum requirements for operating system, memory, disk resources, and installed software such as Java (see the preflight sketch after the table below). Including unsuitable nodes in a cluster is a major source of installation difficulty.
| Component | Requirements |
| --- | --- |
| CPU | 64-bit x86 |
| OS | Red Hat, Oracle Linux, CentOS, SUSE, or Ubuntu |
| Memory | 16 GB minimum; more in production |
| Disk | Raw, unformatted drives with no partitions |
| DNS | Resolvable hostname that can reach all other nodes |
| Users | Common users across all nodes; passwordless SSH (optional) |
| Java | Must run Java 11 |
| Other | NTP, syslog, PAM |
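Before running the installer, you can spot-check several of these requirements from the command line. The following Python sketch is illustrative only, not an official data-fabric tool: it assumes a Linux node (it reads `/proc/meminfo`), uses the thresholds from the table above (16 GB memory, Java 11), and checks only local hostname resolution rather than reachability of every peer.

```python
# Minimal preflight sketch, assuming a Linux node; not an official
# data-fabric tool. It spot-checks four rows of the table above: CPU
# architecture, total memory, the installed Java major version, and
# local hostname resolution. Peer reachability is left out for brevity.
import platform
import re
import shutil
import socket
import subprocess

MIN_MEM_GB = 16           # "16 GB minimum" from the table
REQUIRED_JAVA_MAJOR = 11  # "Must run Java 11"

def check_cpu() -> bool:
    # 64-bit x86 only.
    return platform.machine() in ("x86_64", "AMD64")

def check_memory() -> bool:
    # Read MemTotal from /proc/meminfo (Linux-specific).
    try:
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemTotal:"):
                    kb = int(line.split()[1])
                    return kb / (1024 * 1024) >= MIN_MEM_GB
    except FileNotFoundError:
        pass
    return False

def check_java() -> bool:
    # `java -version` prints its banner to stderr; parse the major version.
    if shutil.which("java") is None:
        return False
    banner = subprocess.run(["java", "-version"],
                            capture_output=True, text=True).stderr
    match = re.search(r'version "(\d+)', banner)
    return bool(match) and int(match.group(1)) == REQUIRED_JAVA_MAJOR

def check_dns() -> bool:
    # The node's own hostname must resolve; peers would be checked the same way.
    try:
        socket.gethostbyname(socket.gethostname())
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    checks = {"cpu": check_cpu, "memory": check_memory,
              "java": check_java, "dns": check_dns}
    for name, check in checks.items():
        print(f"{name}: {'OK' if check() else 'FAIL'}")
```

Run the sketch on every candidate node; any FAIL line points at a table row that the node does not yet satisfy.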
Use the subsequent sections as a checklist to make each candidate node suitable for its assigned roles. Install data-fabric software on each node that you identify as meeting the minimum requirements.