
HDFS-11223. Fix typos in HttpFs documentations. Contributed by Yiqun Lin.

(cherry picked from commit 4c2cf5560f6d952cfa36ef656f0b04dc3150f8b3)
Akira Ajisaka · 8 years ago · parent · commit 0478597ea9

+ 1 - 1
hadoop-hdfs-project/hadoop-hdfs-httpfs/src/site/markdown/ServerSetup.md.vm

@@ -50,7 +50,7 @@ IMPORTANT: Replace `#HTTPFSUSER#` with the Unix user that will start the HttpFS
 Restart Hadoop
 --------------
 
 
-You need to restart Hadoop for the proxyuser configuration ot become active.
+You need to restart Hadoop for the proxyuser configuration to become active.
 
 
 Start/Stop HttpFS
 -----------------
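
For context, the hunk above sits in the proxyuser setup flow of ServerSetup.md.vm, right before the Start/Stop section. A minimal sketch of that restart-then-start sequence, assuming a stock Hadoop 2.x tarball layout; script names and locations are illustrative and can differ between releases:

```bash
# Restart the NameNode so the proxyuser settings added to core-site.xml are re-read
# (hadoop-daemon.sh assumes a Hadoop 2.x sbin layout)
$HADOOP_HOME/sbin/hadoop-daemon.sh stop namenode
$HADOOP_HOME/sbin/hadoop-daemon.sh start namenode

# Then start or stop the HttpFS server itself
$HADOOP_HOME/sbin/httpfs.sh start
$HADOOP_HOME/sbin/httpfs.sh stop
```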

+ 3 - 3
hadoop-hdfs-project/hadoop-hdfs-httpfs/src/site/markdown/index.md

@@ -15,7 +15,7 @@
 Hadoop HDFS over HTTP - Documentation Sets
 ==========================================
 
 
-HttpFS is a server that provides a REST HTTP gateway supporting all HDFS File System operations (read and write). And it is inteoperable with the **webhdfs** REST HTTP API.
+HttpFS is a server that provides a REST HTTP gateway supporting all HDFS File System operations (read and write). And it is interoperable with the **webhdfs** REST HTTP API.
 
 
 HttpFS can be used to transfer data between clusters running different versions of Hadoop (overcoming RPC versioning issues), for example using Hadoop DistCP.
 
 
@@ -23,9 +23,9 @@ HttpFS can be used to access data in HDFS on a cluster behind of a firewall (the
 
 
 HttpFS can be used to access data in HDFS using HTTP utilities (such as curl and wget) and HTTP libraries Perl from other languages than Java.
 
 
-The **webhdfs** client FileSytem implementation can be used to access HttpFS using the Hadoop filesystem command (`hadoop fs`) line tool as well as from Java applications using the Hadoop FileSystem Java API.
+The **webhdfs** client FileSystem implementation can be used to access HttpFS using the Hadoop filesystem command (`hadoop fs`) line tool as well as from Java applications using the Hadoop FileSystem Java API.
 
 
-HttpFS has built-in security supporting Hadoop pseudo authentication and HTTP SPNEGO Kerberos and other pluggable authentication mechanims. It also provides Hadoop proxy user support.
+HttpFS has built-in security supporting Hadoop pseudo authentication and HTTP SPNEGO Kerberos and other pluggable authentication mechanisms. It also provides Hadoop proxy user support.
 
 
 How Does HttpFS Works?
 ----------------------
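
For orientation beyond this typo fix, a hedged sketch of the access paths index.md describes above (curl, the `hadoop fs` command line, and DistCp over the webhdfs-compatible REST API). The host names, the default HttpFS port 14000, and the example paths and user are illustrative and not part of this commit:

```bash
# Read HDFS through HttpFS with a plain HTTP client; HttpFS speaks the webhdfs REST API
# (pseudo authentication via the user.name query parameter shown here)
curl "http://httpfs-host:14000/webhdfs/v1/user/foo?op=LISTSTATUS&user.name=foo"

# The same gateway through the Hadoop command line, via the webhdfs client FileSystem
hadoop fs -ls webhdfs://httpfs-host:14000/user/foo

# Cross-cluster copy over HttpFS, e.g. with DistCp, avoiding RPC version mismatches
hadoop distcp webhdfs://httpfs-host:14000/user/foo/src hdfs://dest-nn:8020/user/foo/dst
```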