
~~ Licensed under the Apache License, Version 2.0 (the "License");
~~ you may not use this file except in compliance with the License.
~~ You may obtain a copy of the License at
~~
~~   http://www.apache.org/licenses/LICENSE-2.0
~~
~~ Unless required by applicable law or agreed to in writing, software
~~ distributed under the License is distributed on an "AS IS" BASIS,
~~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~~ See the License for the specific language governing permissions and
~~ limitations under the License.
  ---
  Hadoop HDFS over HTTP ${project.version} - Server Setup
  ---
  ---
  ${maven.build.timestamp}

Hadoop HDFS over HTTP ${project.version} - Server Setup
  This page explains how to quickly set up HttpFS with Pseudo authentication
  against a Hadoop cluster with Pseudo authentication.
* Requirements

  * Java 6+

  * Maven 3+
* Install HttpFS

+---+
~ $ tar xzf httpfs-${project.version}.tar.gz
+---+
* Configure HttpFS

  By default, HttpFS assumes that Hadoop configuration files
  (<<<core-site.xml & hdfs-site.xml>>>) are in the HttpFS
  configuration directory.

  If this is not the case, add to the <<<httpfs-site.xml>>> file the
  <<<httpfs.hadoop.config.dir>>> property set to the location
  of the Hadoop configuration directory.
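
  For example, a minimal <<<httpfs-site.xml>>> pointing HttpFS at an external
  Hadoop configuration directory might look like the following sketch (the
  <<</etc/hadoop/conf>>> path is only an illustration):

+---+
<configuration>
  <property>
    <name>httpfs.hadoop.config.dir</name>
    <value>/etc/hadoop/conf</value>
  </property>
</configuration>
+---+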
* Configure Hadoop

  Edit Hadoop <<<core-site.xml>>> and define the Unix user that will
  run the HttpFS server as a proxyuser. For example:
+---+
  ...
  <property>
    <name>hadoop.proxyuser.#HTTPFSUSER#.hosts</name>
    <value>httpfs-host.foo.com</value>
  </property>
  <property>
    <name>hadoop.proxyuser.#HTTPFSUSER#.groups</name>
    <value>*</value>
  </property>
  ...
+---+
  IMPORTANT: Replace <<<#HTTPFSUSER#>>> with the Unix user that will
  start the HttpFS server.
* Restart Hadoop

  You need to restart Hadoop for the proxyuser configuration to become
  active.
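
  On a cluster managed with the stock Hadoop scripts, restarting HDFS might
  look like the following sketch (script locations depend on your installation
  and distribution):

+---+
~ $ sbin/stop-dfs.sh
~ $ sbin/start-dfs.sh
+---+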
* Start/Stop HttpFS

  To start/stop HttpFS use HttpFS's <<<bin/httpfs.sh>>> script. For example:

+---+
httpfs-${project.version} $ bin/httpfs.sh start
+---+
  NOTE: Invoking the script without any parameters lists all possible
  parameters (start, stop, run, etc.). The <<<httpfs.sh>>> script is a wrapper
  for Tomcat's <<<catalina.sh>>> script that sets the environment variables
  and Java System properties required to run the HttpFS server.
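
  Stopping the server uses the same script:

+---+
httpfs-${project.version} $ bin/httpfs.sh stop
+---+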
* Test HttpFS is working

+---+
~ $ curl -i "http://<HTTPFSHOSTNAME>:14000?user.name=babu&op=homedir"
HTTP/1.1 200 OK
Content-Type: application/json
Transfer-Encoding: chunked

{"homeDir":"http:\/\/<HTTPFS_HOST>:14000\/user\/babu"}
+---+
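
  HttpFS also serves the WebHDFS REST API under the <<</webhdfs/v1>>> path, so
  a directory listing can be requested the same way (hostname, path and user
  below are placeholders):

+---+
~ $ curl -i "http://<HTTPFSHOSTNAME>:14000/webhdfs/v1/user/babu?user.name=babu&op=LISTSTATUS"
+---+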
* Embedded Tomcat Configuration

  To configure the embedded Tomcat go to the <<<tomcat/conf>>> directory.

  HttpFS preconfigures the HTTP and Admin ports in Tomcat's <<<server.xml>>> to
  14000 and 14001.

  Tomcat logs are also preconfigured to go to HttpFS's <<<logs/>>> directory.
  The following environment variables (which can be set in HttpFS's
  <<<conf/httpfs-env.sh>>> script) can be used to alter those values:

  * HTTPFS_HTTP_PORT

  * HTTPFS_ADMIN_PORT

  * HTTPFS_LOG
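
  For example, <<<conf/httpfs-env.sh>>> could override all three as in the
  sketch below (the port numbers and log directory are illustrative values,
  not required ones):

+---+
# conf/httpfs-env.sh
export HTTPFS_HTTP_PORT=15000
export HTTPFS_ADMIN_PORT=15001
export HTTPFS_LOG=/var/log/httpfs
+---+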
* HttpFS Configuration

  HttpFS supports the following {{{./httpfs-default.html}configuration properties}}
  in HttpFS's <<<conf/httpfs-site.xml>>> configuration file.
* HttpFS over HTTPS (SSL)

  To configure HttpFS to work over SSL, edit the <<<httpfs-env.sh>>> script in the
  configuration directory, setting <<<HTTPFS_SSL_ENABLED>>> to <<<true>>>.
  In addition, the following 2 properties may be defined (shown with default
  values):

  * HTTPFS_SSL_KEYSTORE_FILE=${HOME}/.keystore

  * HTTPFS_SSL_KEYSTORE_PASS=password
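
  Putting these together, the SSL-related part of <<<conf/httpfs-env.sh>>> might
  look like this sketch (the keystore path and password must match the
  certificate you create below):

+---+
# conf/httpfs-env.sh
export HTTPFS_SSL_ENABLED=true
export HTTPFS_SSL_KEYSTORE_FILE=${HOME}/.keystore
export HTTPFS_SSL_KEYSTORE_PASS=password
+---+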
  In the HttpFS <<<tomcat/conf>>> directory, replace the <<<server.xml>>> file
  with the <<<ssl-server.xml>>> file.
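
  A minimal sketch of that swap, keeping a backup of the original file (the
  backup name is arbitrary):

+---+
~ $ cd tomcat/conf
~ $ mv server.xml server.xml.http
~ $ cp ssl-server.xml server.xml
+---+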
  You need to create an SSL certificate for the HttpFS server. As the
  <<<httpfs>>> Unix user, use the Java <<<keytool>>> command to create the
  SSL certificate:
+---+
$ keytool -genkey -alias tomcat -keyalg RSA
+---+
  You will be asked a series of questions in an interactive prompt. It will
  create the keystore file, which will be named <<<.keystore>>> and located in the
  <<<httpfs>>> user's home directory.
  The password you enter for "keystore password" must match the value of the
  <<<HTTPFS_SSL_KEYSTORE_PASS>>> environment variable set in the
  <<<httpfs-env.sh>>> script in the configuration directory.

  The answer to "What is your first and last name?" (i.e. "CN") must be the
  hostname of the machine where the HttpFS Server will be running.
  Start HttpFS. It should work over HTTPS.

  Using the Hadoop <<<FileSystem>>> API or the Hadoop FS shell, use the
  <<<swebhdfs://>>> scheme. Make sure the JVM is picking up the truststore
  containing the public key of the SSL certificate if using a self-signed
  certificate.
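
  For example, a listing over HTTPS with the Hadoop FS shell might look like
  the following sketch (hostname, path and truststore location are
  placeholders; the truststore option is only needed for a self-signed
  certificate):

+---+
~ $ export HADOOP_CLIENT_OPTS="-Djavax.net.ssl.trustStore=${HOME}/.keystore"
~ $ hadoop fs -ls swebhdfs://httpfs-host.foo.com:14000/user/babu
+---+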