
HADOOP-18597. Simplify single node instructions for creating directories for Map Reduce. (#5305)

Signed-off-by: Ayush Saxena <ayushsaxena@apache.org>
Nikita Eshkeev, 2 years ago
commit d07356e60e

1 changed file with 1 addition and 2 deletions:

  hadoop-common-project/hadoop-common/src/site/markdown/SingleCluster.md.vm

hadoop-common-project/hadoop-common/src/site/markdown/SingleCluster.md.vm

@@ -157,8 +157,7 @@ The following instructions are to run a MapReduce job locally. If you want to ex
 
 4.  Make the HDFS directories required to execute MapReduce jobs:
-          $ bin/hdfs dfs -mkdir /user
-          $ bin/hdfs dfs -mkdir /user/<username>
+          $ bin/hdfs dfs -mkdir -p /user/<username>

 5.  Copy the input files into the distributed filesystem:
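For context on why the two commands can be collapsed into one: the `-p` flag of `hdfs dfs -mkdir` behaves like POSIX `mkdir -p`, creating any missing parent directories along the path and not failing if a directory already exists. A minimal sketch using plain local `mkdir` to show the same semantics (the `/tmp/mkdir_p_demo` path is purely illustrative):

```shell
# Without -p, mkdir fails when the parent does not yet exist,
# so two separate calls were needed (first /user, then /user/<username>).
# With -p, the whole chain is created in one idempotent call.
mkdir -p /tmp/mkdir_p_demo/user/alice   # creates demo dir, user/, and user/alice in one step
mkdir -p /tmp/mkdir_p_demo/user/alice   # running it again is harmless: no error
rm -rf /tmp/mkdir_p_demo                # clean up the demo directory
```

The same idempotence applies on HDFS, which is what makes the single `-mkdir -p /user/<username>` form in the diff above safe to re-run.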