
HDFS-4852. Merging change r1619967 from trunk to branch-2.

git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/branches/branch-2@1619968 13f79535-47bb-0310-9956-ffa450edef68
Chris Nauroth 10 years ago
Parent
Commit
b2d86ebf78
2 files changed, 20 insertions and 11 deletions
  1. +2 -0
      hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
  2. +18 -11
      hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm

+ 2 - 0
hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt

@@ -285,6 +285,8 @@ Release 2.6.0 - UNRELEASED
     HDFS-6829. DFSAdmin refreshSuperUserGroupsConfiguration failed in
     security cluster (zhaoyunjiong via Arpit Agarwal)

+    HDFS-4852. libhdfs documentation is out of date. (cnauroth)
+
 Release 2.5.0 - 2014-08-11

   INCOMPATIBLE CHANGES

+ 18 - 11
hadoop-hdfs-project/hadoop-hdfs/src/site/apt/LibHdfs.apt.vm

@@ -26,14 +26,17 @@ C API libhdfs
    (HDFS). It provides C APIs to a subset of the HDFS APIs to manipulate
    HDFS files and the filesystem. libhdfs is part of the Hadoop
    distribution and comes pre-compiled in
-   <<<${HADOOP_PREFIX}/libhdfs/libhdfs.so>>> .
+   <<<${HADOOP_HDFS_HOME}/lib/native/libhdfs.so>>> .  libhdfs is compatible with
+   Windows and can be built on Windows by running <<<mvn compile>>> within the
+   <<<hadoop-hdfs-project/hadoop-hdfs>>> directory of the source tree.
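   As a concrete restatement of the build note just added (nothing assumed
   beyond the directory named in the hunk), building libhdfs from a source
   checkout looks like:

----
  # Build libhdfs inside the HDFS module of a Hadoop source tree; the
  # documentation states plain 'mvn compile' is sufficient, including on Windows.
  cd hadoop-hdfs-project/hadoop-hdfs
  mvn compile
----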
 
 
 * The APIs

-   The libhdfs APIs are a subset of: {{{hadoop fs APIs}}}.
+   The libhdfs APIs are a subset of the
+   {{{../../api/org/apache/hadoop/fs/FileSystem.html}Hadoop FileSystem APIs}}.
 
 
    The header file for libhdfs describes each API in detail and is
-   available in <<<${HADOOP_PREFIX}/src/c++/libhdfs/hdfs.h>>>
+   available in <<<${HADOOP_HDFS_HOME}/include/hdfs.h>>>.
 
 
 * A Sample Program

@@ -55,24 +58,28 @@ C API libhdfs
               fprintf(stderr, "Failed to 'flush' %s\n", writePath);
              exit(-1);
        }
-       hdfsCloseFile(fs, writeFile);
+        hdfsCloseFile(fs, writeFile);
     }
 ----
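   The hunk above only touches the tail of the sample program; a minimal,
   self-contained libhdfs writer in the same spirit (the "default" connection
   target, file path, and buffer contents are illustrative, not taken from the
   patch) would look roughly like:

----
  /* Minimal libhdfs write sketch; compile flags and CLASSPATH setup are covered below. */
  #include "hdfs.h"              /* installed as ${HADOOP_HDFS_HOME}/include/hdfs.h */
  #include <fcntl.h>
  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>

  int main(int argc, char **argv) {
      const char *writePath = "/tmp/testfile.txt";
      hdfsFS fs = hdfsConnect("default", 0);   /* "default" picks up fs.defaultFS from the config on CLASSPATH */
      if (!fs) {
          fprintf(stderr, "Failed to connect to HDFS\n");
          exit(-1);
      }
      hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY | O_CREAT, 0, 0, 0);
      if (!writeFile) {
          fprintf(stderr, "Failed to open %s for writing!\n", writePath);
          exit(-1);
      }
      const char *buffer = "Hello, World!";
      hdfsWrite(fs, writeFile, (void *)buffer, strlen(buffer) + 1);
      if (hdfsFlush(fs, writeFile)) {
          fprintf(stderr, "Failed to 'flush' %s\n", writePath);
          exit(-1);
      }
      hdfsCloseFile(fs, writeFile);
      hdfsDisconnect(fs);
      return 0;
  }
----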
 
 
 * How To Link With The Library

-   See the Makefile for <<<hdfs_test.c>>> in the libhdfs source directory
-   (<<<${HADOOP_PREFIX}/src/c++/libhdfs/Makefile>>>) or something like:
-   <<<gcc above_sample.c -I${HADOOP_PREFIX}/src/c++/libhdfs -L${HADOOP_PREFIX}/libhdfs -lhdfs -o above_sample>>>
+   See the CMake file for <<<test_libhdfs_ops.c>>> in the libhdfs source
+   directory (<<<hadoop-hdfs-project/hadoop-hdfs/src/CMakeLists.txt>>>) or
+   something like:
+   <<<gcc above_sample.c -I${HADOOP_HDFS_HOME}/include -L${HADOOP_HDFS_HOME}/lib/native -lhdfs -o above_sample>>>
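   Building on the gcc line just added, a compile-and-run sketch could look
   like the following; the JVM library directory is an assumption that varies
   by JDK layout:

----
  # Compile against the installed header and native library (command from the doc above).
  gcc above_sample.c -I${HADOOP_HDFS_HOME}/include \
      -L${HADOOP_HDFS_HOME}/lib/native -lhdfs -o above_sample

  # libhdfs loads a JVM at runtime, so libjvm must be resolvable in addition
  # to libhdfs.so; the JVM library path below is an assumption and differs across JDKs.
  export LD_LIBRARY_PATH=${HADOOP_HDFS_HOME}/lib/native:${JAVA_HOME}/jre/lib/amd64/server
  ./above_sample   # also requires CLASSPATH; see Common Problems below
----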
 
 
 * Common Problems

    The most common problem is the <<<CLASSPATH>>> is not set properly when
    calling a program that uses libhdfs. Make sure you set it to all the
-   Hadoop jars needed to run Hadoop itself. Currently, there is no way to
-   programmatically generate the classpath, but a good bet is to include
-   all the jar files in <<<${HADOOP_PREFIX}>>> and <<<${HADOOP_PREFIX}/lib>>> as well
-   as the right configuration directory containing <<<hdfs-site.xml>>>
+   Hadoop jars needed to run Hadoop itself as well as the right configuration
+   directory containing <<<hdfs-site.xml>>>.  It is not valid to use wildcard
+   syntax for specifying multiple jars.  It may be useful to run
+   <<<hadoop classpath --glob>>> or <<<hadoop classpath --jar <path>>>> to
+   generate the correct classpath for your deployment.  See
+   {{{../hadoop-common/CommandsManual.html#classpath}Hadoop Commands Reference}}
+   for more information on this command.
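   One way to apply the classpath advice in this hunk before launching a
   libhdfs program (the program name is the illustrative above_sample from
   earlier):

----
  # The JVM embedded via JNI does not expand wildcard classpath entries, so
  # materialize the full jar list with the command referenced above and export it.
  export CLASSPATH=$(hadoop classpath --glob)
  ./above_sample
----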
 
 
 * Thread Safe