
HADOOP-13115. dynamic subcommand docs should talk about exit vs. continue program flow

Allen Wittenauer · 9 years ago
commit 55f1fec38d

+ 5 - 2
hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md

@@ -121,6 +121,7 @@ Prior to executing a subcommand, the primary scripts will check for the existanc
 function yarn_subcommand_hello
 {
   echo "$@"
+  exit $?
 }
 ```
 
@@ -159,7 +160,7 @@ Some key environment variables related to Dynamic Subcommands:
 
 * HADOOP\_CLASSNAME
 
-This is the name of the Java class to execute.
+This is the name of the Java class to use when program execution continues.
 
 * HADOOP\_SHELL\_EXECNAME
 
@@ -187,4 +188,6 @@ If this command can be executed as a daemon, set this to true.
 
 * HADOOP\_USER\_PARAMS
 
-This is the full content of the command line, prior to any parsing done. It will contain flags such as `--debug`.  It MAY NOT be manipulated.
+This is the full content of the command line, prior to any parsing done. It will contain flags such as `--debug`.  It MAY NOT be manipulated.
+
+The Apache Hadoop runtime facilities require that functions exit if no further processing is required.  For example, in the hello example above, Java and other facilities were not required, so a simple `exit $?` was sufficient.  However, if the function were to utilize `HADOOP\_CLASSNAME`, then program execution must continue so that Java, with the Apache Hadoop-specific parameters, will be launched against the given Java class. Another example would be the case of an unrecoverable error.  It is the function's responsibility to print an appropriate message (preferably using the hadoop_error API call) and exit appropriately.
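
To make the exit-versus-continue distinction concrete, the two patterns can be sketched as below. This is a minimal illustration, not Hadoop's actual dispatch code: the `yarn_subcommand_myclass` function name and the `org.example.MyTool` class are hypothetical, while `yarn_subcommand_hello` mirrors the hello example from the diff above.

```shell
# Pattern 1: the subcommand does everything itself, so it exits.
# Control never returns to the primary script.
function yarn_subcommand_hello
{
  echo "$@"
  exit $?
}

# Pattern 2 (hypothetical example): the subcommand only selects a Java
# class and returns normally. Because it does NOT exit, the primary
# script's program flow continues and launches Java against
# HADOOP_CLASSNAME with the Hadoop-specific parameters.
function yarn_subcommand_myclass
{
  HADOOP_CLASSNAME=org.example.MyTool   # illustrative class name
}
```

The key design point is simply whether control returns to the caller: `exit` ends processing inside the function, whereas falling off the end of the function hands control back so the normal Java launch path runs.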