Using the Command Line to View Logs for Completed Applications

Describes how to view logs from the CLI.

  1. Use the maprcli job linklogs command to create centralized logs for completed applications. For example, the following command creates centralized logs for application_1434605941718_0001:
    maprcli job linklogs -jobid application_1434605941718_0001 -todir /logsdir
    The centralized log directory contains symbolic links that are organized by hostname and containerID.
  2. To determine where the logs are located, run the following command on the directory that contains the symlinks to the log files for a specific container:
    hadoop mfs -ls <todir>/<applicationID>/hosts/<hostName>/<containerID>
    For example, if you specified logsdir as the directory, you might issue a command similar to the following example. The system then displays the location of the log files:
    hadoop mfs -ls /logsdir/application_1434605941718_0001/hosts/
    Found 1 items
    lrwxrwxrwx U U U 3 root root 138 2015-06-18 05:50 0
    p 2068.40.262432
    The link location appears after the arrow.
  3. To determine the types of log files that are available for this container and the path to each available log file, run the following command:
    hadoop fs -ls <link location>
    For example:
    hadoop fs -ls ../../../../var/mapr/local/
    -rw-r-----   2 root root       2337 2015-06-18 05:48
    In this example, the path to the syslog is the only one displayed in the output. However, stdout or stderr logs may also be available, depending on what the application generated.
  4. Use one of the following commands to view the contents of a log file:
    1. To view the end of the log file, run hadoop fs -tail <path to log file>. For example:
      hadoop fs -tail ../../../../var/mapr/local/
    2. To view the entire log file, run hadoop fs -cat <path to log file>. For example:
      hadoop fs -cat ../../../../var/mapr/local/
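The steps above can be sketched as a single script. This is a dry-run sketch: it only prints the commands it would run, because maprcli and hadoop require a live MapR cluster. The application ID and target directory are the values from the example; the link-location and log-file paths are placeholders you fill in from the step 2 and step 3 output.

```shell
#!/bin/sh
# Dry-run sketch of the log-viewing workflow (assumes the example values;
# substitute your own application ID and target directory).
APP_ID="application_1434605941718_0001"
TODIR="/logsdir"

# Step 1: create centralized logs for the completed application.
echo "maprcli job linklogs -jobid $APP_ID -todir $TODIR"

# Step 2: list the per-host symlinks to find the link location.
echo "hadoop mfs -ls $TODIR/$APP_ID/hosts/"

# Step 3: list the link location (taken from the step 2 output)
# to see which log files exist for the container.
echo "hadoop fs -ls <link location>"

# Step 4: view the end of a log file, or the whole file.
echo "hadoop fs -tail <path to log file>"
echo "hadoop fs -cat <path to log file>"
```

Remove the echo wrappers to execute the commands for real on a cluster node.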