
HADOOP-2908. PDF and HTML versions of a document that describes
the DFS Shell command. (Mahadev Konar via dhruba)



git-svn-id: https://svn.apache.org/repos/asf/hadoop/core/trunk@636902 13f79535-47bb-0310-9956-ffa450edef68

Dhruba Borthakur 17 years ago
parent
commit
6ceeb0df5a
2 changed files with 1241 additions and 0 deletions
  1. docs/hdfs_shell.html (+894, -0)
  2. docs/hdfs_shell.pdf (+347, -0)

+ 894 - 0
docs/hdfs_shell.html

@@ -0,0 +1,894 @@
+<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
+<html>
+<head>
+<META http-equiv="Content-Type" content="text/html; charset=UTF-8">
+<meta content="Apache Forrest" name="Generator">
+<meta name="Forrest-version" content="0.8">
+<meta name="Forrest-skin-name" content="pelt">
+<title>Hadoop Shell Commands</title>
+<link type="text/css" href="skin/basic.css" rel="stylesheet">
+<link media="screen" type="text/css" href="skin/screen.css" rel="stylesheet">
+<link media="print" type="text/css" href="skin/print.css" rel="stylesheet">
+<link type="text/css" href="skin/profile.css" rel="stylesheet">
+<script src="skin/getBlank.js" language="javascript" type="text/javascript"></script><script src="skin/getMenu.js" language="javascript" type="text/javascript"></script><script src="skin/fontsize.js" language="javascript" type="text/javascript"></script>
+<link rel="shortcut icon" href="images/favicon.ico">
+</head>
+<body onload="init()">
+<script type="text/javascript">ndeSetTextSize();</script>
+<div id="top">
+<!--+
+    |breadtrail
+    +-->
+<div class="breadtrail">
+<a href="http://www.apache.org/">Apache</a> &gt; <a href="http://hadoop.apache.org/">Hadoop</a> &gt; <a href="http://hadoop.apache.org/core/">Core</a><script src="skin/breadcrumbs.js" language="JavaScript" type="text/javascript"></script>
+</div>
+<!--+
+    |header
+    +-->
+<div class="header">
+<!--+
+    |start group logo
+    +-->
+<div class="grouplogo">
+<a href="http://hadoop.apache.org/"><img class="logoImage" alt="Hadoop" src="images/hadoop-logo.jpg" title="Apache Hadoop"></a>
+</div>
+<!--+
+    |end group logo
+    +-->
+<!--+
+    |start Project Logo
+    +-->
+<div class="projectlogo">
+<a href="http://hadoop.apache.org/core/"><img class="logoImage" alt="Hadoop" src="images/core-logo.gif" title="Scalable Computing Platform"></a>
+</div>
+<!--+
+    |end Project Logo
+    +-->
+<!--+
+    |start Search
+    +-->
+<div class="searchbox">
+<form action="http://www.google.com/search" method="get" class="roundtopsmall">
+<input value="hadoop.apache.org" name="sitesearch" type="hidden"><input onFocus="getBlank (this, 'Search the site with google');" size="25" name="q" id="query" type="text" value="Search the site with google">&nbsp; 
+                    <input name="Search" value="Search" type="submit">
+</form>
+</div>
+<!--+
+    |end search
+    +-->
+<!--+
+    |start Tabs
+    +-->
+<ul id="tabs">
+<li>
+<a class="unselected" href="http://hadoop.apache.org/core/">Project</a>
+</li>
+<li>
+<a class="unselected" href="http://wiki.apache.org/hadoop">Wiki</a>
+</li>
+<li class="current">
+<a class="selected" href="index.html">Hadoop 0.16 Documentation</a>
+</li>
+</ul>
+<!--+
+    |end Tabs
+    +-->
+</div>
+</div>
+<div id="main">
+<div id="publishedStrip">
+<!--+
+    |start Subtabs
+    +-->
+<div id="level2tabs"></div>
+<!--+
+    |end Endtabs
+    +-->
+<script type="text/javascript"><!--
+document.write("Last Published: " + document.lastModified);
+//  --></script>
+</div>
+<!--+
+    |breadtrail
+    +-->
+<div class="breadtrail">
+
+             &nbsp;
+           </div>
+<!--+
+    |start Menu, mainarea
+    +-->
+<!--+
+    |start Menu
+    +-->
+<div id="menu">
+<div onclick="SwitchMenu('menu_selected_1.1', 'skin/')" id="menu_selected_1.1Title" class="menutitle" style="background-image: url('skin/images/chapter_open.gif');">Documentation</div>
+<div id="menu_selected_1.1" class="selectedmenuitemgroup" style="display: block;">
+<div class="menuitem">
+<a href="index.html">Overview</a>
+</div>
+<div class="menuitem">
+<a href="quickstart.html">Quickstart</a>
+</div>
+<div class="menuitem">
+<a href="cluster_setup.html">Cluster Setup</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_design.html">HDFS Architecture</a>
+</div>
+<div class="menuitem">
+<a href="hdfs_user_guide.html">HDFS User Guide</a>
+</div>
+<div class="menupage">
+<div class="menupagetitle">HDFS Shell Guide</div>
+</div>
+<div class="menuitem">
+<a href="hdfs_permissions_guide.html">HDFS Permissions Guide</a>
+</div>
+<div class="menuitem">
+<a href="mapred_tutorial.html">Map-Reduce Tutorial</a>
+</div>
+<div class="menuitem">
+<a href="native_libraries.html">Native Hadoop Libraries</a>
+</div>
+<div class="menuitem">
+<a href="streaming.html">Streaming</a>
+</div>
+<div class="menuitem">
+<a href="hod.html">Hadoop On Demand</a>
+</div>
+<div class="menuitem">
+<a href="api/index.html">API Docs</a>
+</div>
+<div class="menuitem">
+<a href="http://wiki.apache.org/hadoop/">Wiki</a>
+</div>
+<div class="menuitem">
+<a href="http://wiki.apache.org/hadoop/FAQ">FAQ</a>
+</div>
+<div class="menuitem">
+<a href="http://hadoop.apache.org/core/mailing_lists.html">Mailing Lists</a>
+</div>
+</div>
+<div id="credit"></div>
+<div id="roundbottom">
+<img style="display: none" class="corner" height="15" width="15" alt="" src="skin/images/rc-b-l-15-1body-2menu-3menu.png"></div>
+<!--+
+  |alternative credits
+  +-->
+<div id="credit2"></div>
+</div>
+<!--+
+    |end Menu
+    +-->
+<!--+
+    |start content
+    +-->
+<div id="content">
+<div title="Portable Document Format" class="pdflink">
+<a class="dida" href="hdfs_shell.pdf"><img alt="PDF -icon" src="skin/images/pdfdoc.gif" class="skin"><br>
+        PDF</a>
+</div>
+<h1>Hadoop Shell Commands</h1>
+<div id="minitoc-area">
+<ul class="minitoc">
+<li>
+<a href="#DFShell"> DFShell </a>
+</li>
+<li>
+<a href="#cat"> cat </a>
+</li>
+<li>
+<a href="#chgrp"> chgrp </a>
+</li>
+<li>
+<a href="#chmod"> chmod </a>
+</li>
+<li>
+<a href="#chown"> chown </a>
+</li>
+<li>
+<a href="#copyFromLocal">copyFromLocal</a>
+</li>
+<li>
+<a href="#copyToLocal"> copyToLocal</a>
+</li>
+<li>
+<a href="#cp"> cp </a>
+</li>
+<li>
+<a href="#du">du</a>
+</li>
+<li>
+<a href="#dus"> dus </a>
+</li>
+<li>
+<a href="#expunge"> expunge </a>
+</li>
+<li>
+<a href="#get"> get </a>
+</li>
+<li>
+<a href="#getmerge"> getmerge </a>
+</li>
+<li>
+<a href="#ls"> ls </a>
+</li>
+<li>
+<a href="#lsr">lsr</a>
+</li>
+<li>
+<a href="#mkdir"> mkdir </a>
+</li>
+<li>
+<a href="#movefromLocal"> movefromLocal </a>
+</li>
+<li>
+<a href="#mv"> mv </a>
+</li>
+<li>
+<a href="#put"> put </a>
+</li>
+<li>
+<a href="#rm"> rm </a>
+</li>
+<li>
+<a href="#rmr"> rmr </a>
+</li>
+<li>
+<a href="#setrep"> setrep </a>
+</li>
+<li>
+<a href="#stat"> stat </a>
+</li>
+<li>
+<a href="#tail"> tail </a>
+</li>
+<li>
+<a href="#test"> test </a>
+</li>
+<li>
+<a href="#text"> text </a>
+</li>
+<li>
+<a href="#touchz"> touchz </a>
+</li>
+</ul>
+</div>
+		
+<a name="N1000D"></a><a name="DFShell"></a>
+<h2 class="h3"> DFShell </h2>
+<div class="section">
+<p>
+      The HDFS shell is invoked by 
+      <span class="codefrag">bin/hadoop dfs &lt;args&gt;</span>.
+      All the HDFS shell commands take path URIs as arguments. The URI format is <em>scheme://authority/path</em>. For HDFS the scheme is <em>hdfs</em>, and for the local filesystem the scheme is <em>file</em>. The scheme and authority are optional. If not specified, the default scheme specified in the configuration is used. An HDFS file or directory such as <em>/parent/child</em> can be specified as <em>hdfs://namenode:namenodeport/parent/child</em> or simply as <em>/parent/child</em> (given that your configuration is set to point to <em>namenode:namenodeport</em>). Most of the commands in the HDFS shell behave like the corresponding Unix commands. Differences are described with each command. Error information is sent to <em>stderr</em> and the output is sent to <em>stdout</em>. 
+  </p>
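+<p>For example, assuming your configuration points to <em>namenode:namenodeport</em> (the host, port, and path below are illustrative placeholders), the following two invocations are equivalent:</p>
+<ul>
+<li>
+<span class="codefrag"> bin/hadoop dfs -ls hdfs://namenode:namenodeport/parent/child </span>
+</li>
+<li>
+<span class="codefrag"> bin/hadoop dfs -ls /parent/child </span>
+</li>
+</ul>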
+</div>
+		
+<a name="N10035"></a><a name="cat"></a>
+<h2 class="h3"> cat </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -cat URI [URI &hellip;]</span>
+			
+</p>
+<p>
+		   Copies source paths to <em>stdout</em>. 
+		   </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2 
+		   </span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag">hadoop dfs -cat file:///file3 /user/hadoop/file4 </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:<br>
+		   
+<span class="codefrag"> Returns 0 on success and -1 on error. </span>
+</p>
+</div>
+		
+<a name="N10061"></a><a name="chgrp"></a>
+<h2 class="h3"> chgrp </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -chgrp [-R] GROUP URI [URI &hellip;]</span>
+			
+</p>
+<p>
+	    Change group association of files. With <span class="codefrag">-R</span>, make the change recursively through the directory structure. The user must be the owner of files, or else a super-user. Additional information is in the <a href="hdfs_permissions_guide.html">Permissions User Guide</a>.
+	    </p>
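+<p>Example (the group name and path below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -chgrp -R hadoopgroup /user/hadoop/dir1 </span>
+</li>
+</ul>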
+</div>
+		
+<a name="N10078"></a><a name="chmod"></a>
+<h2 class="h3"> chmod </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -chmod [-R] &lt;MODE[,MODE]... | OCTALMODE&gt; URI [URI &hellip;]</span>
+			
+</p>
+<p>
+	    Change the permissions of files. With <span class="codefrag">-R</span>, make the change recursively through the directory structure. The user must be the owner of the file, or else a super-user. Additional information is in the <a href="hdfs_permissions_guide.html">Permissions User Guide</a>.
+	    </p>
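+<p>Example (the modes and paths below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -chmod 644 /user/hadoop/file1 </span>
+</li>
+<li>
+<span class="codefrag"> hadoop dfs -chmod -R 755 /user/hadoop/dir1 </span>
+</li>
+</ul>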
+</div>
+		
+<a name="N1008F"></a><a name="chown"></a>
+<h2 class="h3"> chown </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -chown [-R] [OWNER][:[GROUP]] URI [URI ]</span>
+			
+</p>
+<p>
+	    Change the owner of files. With <span class="codefrag">-R</span>, make the change recursively through the directory structure. The user must be a super-user. Additional information is in the <a href="hdfs_permissions_guide.html">Permissions User Guide</a>.
+	    </p>
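+<p>Example (the owner, group, and path below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -chown -R hadoopuser:hadoopgroup /user/hadoop/dir1 </span>
+</li>
+</ul>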
+</div>
+		
+<a name="N100A6"></a><a name="copyFromLocal"></a>
+<h2 class="h3">copyFromLocal</h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -copyFromLocal &lt;localsrc&gt; URI</span>
+			
+</p>
+<p>Similar to the <a href="#putlink"><strong>put</strong></a> command, except that the source is restricted to a local file reference. </p>
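+<p>Example (the local and HDFS paths below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -copyFromLocal localfile /user/hadoop/hadoopfile </span>
+</li>
+</ul>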
+</div>
+		
+<a name="N100BB"></a><a name="copyToLocal"></a>
+<h2 class="h3"> copyToLocal</h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -copyToLocal [-ignorecrc] [-crc] URI &lt;localdst&gt;</span>
+			
+</p>
+<p> Similar to the <a href="#getlink"><strong>get</strong></a> command, except that the destination is restricted to a local file reference.</p>
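+<p>Example (the HDFS and local paths below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -copyToLocal /user/hadoop/file localfile </span>
+</li>
+</ul>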
+</div>
+		
+<a name="N100D0"></a><a name="cp"></a>
+<h2 class="h3"> cp </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -cp URI [URI &hellip;] &lt;dest&gt;</span>
+			
+</p>
+<p>
+	    Copies files from source to destination. This command also allows multiple sources, in which case the destination must be a directory.
+	    <br>
+	    Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -cp /user/hadoop/file1 /user/hadoop/file2</span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag"> Returns 0 on success and -1 on error.</span>
+			
+</p>
+</div>
+		
+<a name="N100FA"></a><a name="du"></a>
+<h2 class="h3">du</h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -du URI [URI &hellip;]</span>
+			
+</p>
+<p>
+	     Displays the aggregate length of files contained in the directory, or the length of a file in case it is just a file.<br>
+	     Example:<br>
+<span class="codefrag">hadoop dfs -du /user/hadoop/dir1 /user/hadoop/file1 hdfs://host:port/user/hadoop/dir1</span>
+<br>
+	     Exit Code:<br>
+<span class="codefrag"> Returns 0 on success and -1 on error. </span>
+<br>
+</p>
+</div>
+		
+<a name="N10115"></a><a name="dus"></a>
+<h2 class="h3"> dus </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -dus &lt;args&gt;</span>
+			
+</p>
+<p>
+	    Displays a summary of file lengths.
+	   </p>
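+<p>Example (the paths below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -dus /user/hadoop/dir1 /user/hadoop/file1 </span>
+</li>
+</ul>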
+</div>
+		
+<a name="N10125"></a><a name="expunge"></a>
+<h2 class="h3"> expunge </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -expunge</span>
+			
+</p>
+<p>Empty the Trash. Refer to <a href="hdfs_design.html">HDFS Design</a> for more information on the Trash feature.
+	   </p>
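+<p>Example (the command takes no arguments):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -expunge </span>
+</li>
+</ul>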
+</div>
+		
+<a name="N10139"></a><a name="get"></a>
+<h2 class="h3"> get </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -get [-ignorecrc] [-crc] &lt;src&gt; &lt;localdst&gt;</span>
+				
+<br>
+			
+</p>
+<p>
+	   Copy files to the local file system. Files that fail the CRC check may be copied with the  
+	   <span class="codefrag">-ignorecrc</span> option. Files and CRCs may be copied using the 
+	   <span class="codefrag">-crc</span> option.
+	  </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -get /user/hadoop/file localfile </span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -get hdfs://host:port/user/hadoop/file localfile</span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag"> Returns 0 on success and -1 on error. </span>
+			
+</p>
+</div>
+		
+<a name="N1016D"></a><a name="getmerge"></a>
+<h2 class="h3"> getmerge </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -getmerge &lt;src&gt; &lt;localdst&gt; [addnl]</span>
+			
+</p>
+<p>
+	  Takes a source directory and a destination file as input and concatenates files in src into the destination local file. Optionally <span class="codefrag">addnl</span> can be set to enable adding a newline character at the end of each file.  
+	  </p>
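+<p>Example (the source directory and local file below are illustrative placeholders):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -getmerge /user/hadoop/dir1 localfile </span>
+</li>
+</ul>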
+</div>
+		
+<a name="N10180"></a><a name="ls"></a>
+<h2 class="h3"> ls </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -ls &lt;args&gt;</span>
+			
+</p>
+<p>
+		 For a file returns stat on the file with the following format:<br>
+<span class="codefrag">filename &lt;number of replicas&gt; filesize modification_date modification_time permissions userid groupid</span>
+<br>
+	         For a directory it returns a list of its direct children, as in Unix.
+	         A directory is listed as: <br>
+<span class="codefrag">dirname &lt;dir&gt; modification_date modification_time permissions userid groupid</span>
+<br>
+	         Example:<br>
+<span class="codefrag">hadoop dfs -ls /user/hadoop/file1 /user/hadoop/file2 hdfs://host:port/user/hadoop/dir1 /nonexistentfile</span>
+<br>
+	         Exit Code:<br>
+<span class="codefrag"> Returns 0 on success and -1 on error. </span>
+<br>
+</p>
+</div>
+		
+<a name="N101A3"></a><a name="lsr"></a>
+<h2 class="h3">lsr</h2>
+<div class="section">
+<p>
+<span class="codefrag">Usage: hadoop dfs -lsr &lt;args&gt;</span>
+<br>
+	      Recursive version of <span class="codefrag">ls</span>. Similar to Unix <span class="codefrag">ls -R</span>.
+	      </p>
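+<p>Example (the path below is an illustrative placeholder):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -lsr /user/hadoop </span>
+</li>
+</ul>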
+</div>
+		
+<a name="N101B6"></a><a name="mkdir"></a>
+<h2 class="h3"> mkdir </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -mkdir &lt;paths&gt;</span>
+				
+<br>
+			
+</p>
+<p>
+	   Takes path URIs as arguments and creates directories. The behavior is much like Unix mkdir -p, creating parent directories along the path.
+	  </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag">hadoop dfs -mkdir /user/hadoop/dir1 /user/hadoop/dir2 </span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag">hadoop dfs -mkdir hdfs://host1:port1/user/hadoop/dir hdfs://host2:port2/user/hadoop/dir
+	  </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag">Returns 0 on success and -1 on error.</span>
+			
+</p>
+</div>
+		
+<a name="N101E3"></a><a name="movefromLocal"></a>
+<h2 class="h3"> movefromLocal </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -moveFromLocal &lt;src&gt; &lt;dst&gt;</span>
+			
+</p>
+<p>Displays a "not implemented" message.
+	   </p>
+</div>
+		
+<a name="N101F3"></a><a name="mv"></a>
+<h2 class="h3"> mv </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -mv URI [URI &hellip;] &lt;dest&gt;</span>
+			
+</p>
+<p>
+	    Moves files from source to destination. This command also allows multiple sources, in which case the destination needs to be a directory. Moving files across filesystems is not permitted.
+	    <br>
+	    Example:
+	    </p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -mv /user/hadoop/file1 /user/hadoop/file2</span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -mv hdfs://host:port/file1 hdfs://host:port/file2 hdfs://host:port/file3 hdfs://host:port/dir1</span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag"> Returns 0 on success and -1 on error.</span>
+			
+</p>
+</div>
+		
+<a name="N1021D"></a><a name="put"></a>
+<h2 class="h3"> put </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -put &lt;localsrc&gt; &lt;dst&gt;</span>
+			
+</p>
+<p>Copies the source from the local file system to the destination filesystem. Also reads input from stdin and writes it to the destination filesystem.<br>
+	   
+</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -put localfile /user/hadoop/hadoopfile</span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -put localfile hdfs://host:port/hadoop/hadoopfile</span>
+				
+</li>
+				
+<li>
+<span class="codefrag">hadoop dfs -put - hdfs://host:port/hadoop/hadoopfile</span>
+<br>Reads the input from stdin.</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag"> Returns 0 on success and -1 on error. </span>
+			
+</p>
+</div>
+		
+<a name="N1024E"></a><a name="rm"></a>
+<h2 class="h3"> rm </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -rm URI [URI &hellip;] </span>
+			
+</p>
+<p>
+	   Delete files specified as args. Only deletes files and empty directories; refer to rmr for recursive deletes.<br>
+	   Example:
+	   </p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -rm hdfs://host:port/file /user/hadoop/emptydir </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag"> Returns 0 on success and -1 on error.</span>
+			
+</p>
+</div>
+		
+<a name="N10272"></a><a name="rmr"></a>
+<h2 class="h3"> rmr </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -rmr URI [URI &hellip;]</span>
+			
+</p>
+<p>Recursive version of delete.<br>
+	   Example:
+	   </p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -rmr /user/hadoop/dir </span>
+				
+</li>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -rmr hdfs://host:port/user/hadoop/dir </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag"> Returns 0 on success and -1 on error. </span>
+			
+</p>
+</div>
+		
+<a name="N1029C"></a><a name="setrep"></a>
+<h2 class="h3"> setrep </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -setrep [-R] [-w] &lt;rep&gt; &lt;path&gt;</span>
+			
+</p>
+<p>
+	   Changes the replication factor of a file. The -R option recursively changes the replication factor of files within a directory.
+	  </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -setrep -w 3 -R /user/hadoop/dir1 </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:</p>
+<p>
+				
+<span class="codefrag">Returns 0 on success and -1 on error. </span>
+			
+</p>
+</div>
+		
+<a name="N102C1"></a><a name="stat"></a>
+<h2 class="h3"> stat </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -stat URI [URI &hellip;]</span>
+			
+</p>
+<p>
+	   Returns the stat information on the path.
+	   </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -stat path </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:<br>
+	   
+<span class="codefrag"> Returns 0 on success and -1 on error.</span>
+</p>
+</div>
+		
+<a name="N102E4"></a><a name="tail"></a>
+<h2 class="h3"> tail </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -tail [-f] URI </span>
+			
+</p>
+<p>
+	   Displays the last kilobyte of the file to stdout. The -f option can be used as in Unix.
+	   </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -tail pathname </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code: <br>
+	   
+<span class="codefrag"> Returns 0 on success and -1 on error.</span>
+</p>
+</div>
+		
+<a name="N10307"></a><a name="test"></a>
+<h2 class="h3"> test </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -test -[ezd] URI</span>
+			
+</p>
+<p>
+	   Options: <br>
+	   -e check to see if the file exists. Return 0 if true. <br>
+	   -z check to see if the file is zero length. Return 0 if true. <br>
+	   -d check to see if the path is a directory. Return 1 if it is a directory, else return 0. <br>
+</p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -test -e filename </span>
+				
+</li>
+			
+</ul>
+</div>
+		
+<a name="N1032A"></a><a name="text"></a>
+<h2 class="h3"> text </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -text &lt;src&gt;</span>
+				
+<br>
+			
+</p>
+<p>
+	   Takes a source file and outputs the file in text format. The allowed formats are zip and TextRecordInputStream.
+	  </p>
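+<p>Example (the path below is an illustrative placeholder):</p>
+<ul>
+<li>
+<span class="codefrag"> hadoop dfs -text /user/hadoop/file1 </span>
+</li>
+</ul>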
+</div>
+		
+<a name="N1033C"></a><a name="touchz"></a>
+<h2 class="h3"> touchz </h2>
+<div class="section">
+<p>
+				
+<span class="codefrag">Usage: hadoop dfs -touchz URI [URI &hellip;]</span>
+				
+<br>
+			
+</p>
+<p>
+	   Create a file of zero length.
+	   </p>
+<p>Example:</p>
+<ul>
+				
+<li>
+					
+<span class="codefrag"> hadoop dfs -touchz pathname </span>
+				
+</li>
+			
+</ul>
+<p>Exit Code:<br>
+	   
+<span class="codefrag"> Returns 0 on success and -1 on error.</span>
+</p>
+</div>
+	
+</div>
+<!--+
+    |end content
+    +-->
+<div class="clearboth">&nbsp;</div>
+</div>
+<div id="footer">
+<!--+
+    |start bottomstrip
+    +-->
+<div class="lastmodified">
+<script type="text/javascript"><!--
+document.write("Last Published: " + document.lastModified);
+//  --></script>
+</div>
+<div class="copyright">
+        Copyright &copy;
+         2007 <a href="http://www.apache.org/licenses/">The Apache Software Foundation.</a>
+</div>
+<!--+
+    |end bottomstrip
+    +-->
+</div>
+</body>
+</html>

File diff suppressed because it is too large
+ 347 - 0
docs/hdfs_shell.pdf

