
HADOOP-12457. [JDK8] Fix a failure of compiling common by javadoc. Contributed by Akira AJISAKA.

(cherry picked from commit ea6b183a1a649ad2874050ade8856286728c654c)

Conflicts:
	hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java
Tsuyoshi Ozawa committed 9 years ago
commit 1035637200

+ 3 - 0
hadoop-common-project/hadoop-common/CHANGES.txt

@@ -677,6 +677,9 @@ Release 2.8.0 - UNRELEASED
     HADOOP-12513. Dockerfile lacks initial 'apt-get update'.
     (Akihiro Suda via ozawa)
 
+    HADOOP-12457. [JDK8] Fix a failure of compiling common by javadoc.
+    (Akira AJISAKA via ozawa)
+
   OPTIMIZATIONS
 
     HADOOP-12051. ProtobufRpcEngine.invoke() should use Exception.toString()

+ 7 - 1
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/conf/Configuration.java

@@ -167,7 +167,13 @@ import com.google.common.base.Preconditions;
  * will be resolved to another property in this Configuration, while
  * <tt>${<i>user.name</i>}</tt> would then ordinarily be resolved to the value
  * of the System property with that name.
- * By default, warnings will be given to any deprecated configuration 
+ * <p>When <tt>conf.get("otherdir")</tt> is called, then <tt>${<i>env.BASE_DIR</i>}</tt>
+ * will be resolved to the value of the <tt>${<i>BASE_DIR</i>}</tt> environment variable.
+ * It supports <tt>${<i>env.NAME:-default</i>}</tt> and <tt>${<i>env.NAME-default</i>}</tt> notations.
+ * The former is resolved to "default" if <tt>${<i>NAME</i>}</tt> environment variable is undefined
+ * or its value is empty.
+ * The latter behaves the same way only if <tt>${<i>NAME</i>}</tt> is undefined.
+ * <p>By default, warnings will be given to any deprecated configuration 
  * parameters and these are suppressible by configuring
  * <tt>log4j.logger.org.apache.hadoop.conf.Configuration.deprecation</tt> in
  * log4j.properties file.
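
A minimal sketch of exercising the expansion behavior documented in the javadoc above. The property names ("basedir", "logdir", "datadir") and the fallback value "/tmp" are illustrative only; the ${env.NAME}, ${env.NAME:-default} and ${env.NAME-default} notations are the ones described in the patch.

import org.apache.hadoop.conf.Configuration;

public class EnvVarExpansionSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Plain environment-variable reference, as in the javadoc example.
    conf.set("basedir", "${env.BASE_DIR}");
    // ":-" form: falls back to "/tmp" if BASE_DIR is undefined or empty.
    conf.set("logdir", "${env.BASE_DIR:-/tmp}/logs");
    // "-" form: falls back to "/tmp" only if BASE_DIR is undefined.
    conf.set("datadir", "${env.BASE_DIR-/tmp}/data");

    // get() resolves the ${...} references at read time.
    System.out.println(conf.get("logdir"));
    System.out.println(conf.get("datadir"));
  }
}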

+ 1 - 1
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/shell/Delete.java

@@ -65,7 +65,7 @@ class Delete {
             "-[rR]:  Recursively deletes directories.\n" +
             "-skipTrash: option bypasses trash, if enabled, and immediately " +
             "deletes <src>.\n" +
-            "-safely: option requires safety confirmationif enabled, " +
+            "-safely: option requires safety confirmation, if enabled, " +
             "requires confirmation before deleting large directory with more " +
             "than <hadoop.shell.delete.limit.num.files> files. Delay is " +
             "expected when walking over large directory recursively to count " +

+ 1 - 1
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java

@@ -413,7 +413,7 @@ public abstract class Server {
    * if this request took too much time relative to other requests
    * we consider that as a slow RPC. 3 is a magic number that comes
    * from 3 sigma deviation. A very simple explanation can be found
-   * by searching for 68–95–99.7 rule. We flag an RPC as slow RPC
+   * by searching for 68-95-99.7 rule. We flag an RPC as slow RPC
    * if and only if it falls above 99.7% of requests. We start this logic
    * only once we have enough sample size.
    */
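
A minimal sketch of the 3-sigma idea the comment describes: keep a running mean and standard deviation of processing times and flag a request as slow only when it exceeds mean + 3 standard deviations, and only once enough samples have been seen. The class and constant names below are illustrative, not the actual Server.java code.

/** Illustrative 3-sigma slow-RPC check; not the real Hadoop implementation. */
class SlowRpcDetector {
  private static final int MIN_SAMPLE_SIZE = 1024; // arbitrary "enough samples" cutoff
  private long count;
  private double mean;
  private double m2; // running sum of squared deviations (Welford's algorithm)

  synchronized void add(double processingTimeMs) {
    count++;
    double delta = processingTimeMs - mean;
    mean += delta / count;
    m2 += delta * (processingTimeMs - mean);
  }

  /** True only if the time is more than 3 standard deviations above the mean. */
  synchronized boolean isSlow(double processingTimeMs) {
    if (count < MIN_SAMPLE_SIZE) {
      return false; // not enough samples yet to trust the statistics
    }
    double stdDev = Math.sqrt(m2 / (count - 1));
    return processingTimeMs > mean + 3 * stdDev;
  }
}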