
HADOOP-5671. Fix FNF exceptions when copying from old versions of
HftpFileSystem. Contributed by Tsz Wo (Nicholas), SZE


git-svn-id: https://svn.apache.org/repos/asf/hadoop/core/trunk@765427 13f79535-47bb-0310-9956-ffa450edef68

Christopher Douglas, 16 years ago
commit 86d28384de
2 files changed, 20 insertions and 1 deletion
  1. CHANGES.txt (+3, -0)
  2. src/tools/org/apache/hadoop/tools/DistCp.java (+17, -1)

+ 3 - 0
CHANGES.txt

@@ -394,6 +394,9 @@ Trunk (unreleased changes)
     HADOOP-5652. Fix a bug where in-memory segments are incorrectly retained in
     memory. (cdouglas)
 
+    HADOOP-5671. Fix FNF exceptions when copying from old versions of
+    HftpFileSystem. (Tsz Wo (Nicholas), SZE via cdouglas)
+
 Release 0.20.0 - Unreleased
 
   INCOMPATIBLE CHANGES

+ 17 - 1
src/tools/org/apache/hadoop/tools/DistCp.java

@@ -1171,9 +1171,25 @@ public class DistCp implements Tool {
       return false;
     }
 
+    //get src checksum
+    final FileChecksum srccs;
+    try {
+      srccs = srcfs.getFileChecksum(srcstatus.getPath());
+    } catch(FileNotFoundException fnfe) {
+      /*
+       * Two possible cases:
+       * (1) src existed once but was deleted between the time period that
+       *     srcstatus was obtained and the try block above.
+       * (2) srcfs does not support file checksum and (incorrectly) throws
+       *     FNFE, e.g. some previous versions of HftpFileSystem.
+       * For case (1), it is okay to return true since src was already deleted.
+       * For case (2), true should be returned.
+       */
+      return true;
+    }
+
     //compare checksums
     try {
-      final FileChecksum srccs = srcfs.getFileChecksum(srcstatus.getPath());
       final FileChecksum dstcs = dstfs.getFileChecksum(dststatus.getPath());
       //return true if checksum is not supported
       //(i.e. some of the checksums is null)
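
The change above can be sketched in isolation as follows. This is a minimal, self-contained illustration of the pattern, not Hadoop's actual API: `ChecksumSource` and `sameFile` are hypothetical stand-ins for `FileSystem.getFileChecksum` and DistCp's private `sameFile` method, and checksums are modeled as plain strings.

```java
import java.io.FileNotFoundException;
import java.util.Objects;

public class ChecksumSkipSketch {

  /** Stand-in for a filesystem that may (incorrectly) throw FNFE on checksum lookup. */
  interface ChecksumSource {
    String getFileChecksum(String path) throws FileNotFoundException;
  }

  /**
   * Returns true if the copy can be skipped. The source checksum is fetched
   * first; a FileNotFoundException there means either the source was deleted
   * after its status was obtained, or the source filesystem (e.g. an old
   * HftpFileSystem) wrongly throws FNFE when checksums are unsupported.
   * In both cases the copy is skipped, mirroring the HADOOP-5671 fix.
   */
  static boolean sameFile(ChecksumSource srcfs, ChecksumSource dstfs, String path) {
    final String srccs;
    try {
      srccs = srcfs.getFileChecksum(path);
    } catch (FileNotFoundException fnfe) {
      return true; // src gone, or checksums unsupported: skip the copy
    }

    final String dstcs;
    try {
      dstcs = dstfs.getFileChecksum(path);
    } catch (FileNotFoundException fnfe) {
      return false; // destination missing: must copy
    }

    // A null checksum means "not supported"; treat the files as the same,
    // as DistCp does when it cannot compare.
    if (srccs == null || dstcs == null) {
      return true;
    }
    return Objects.equals(srccs, dstcs);
  }
}
```

Fetching the source checksum in its own try block, before the comparison, is the essence of the patch: it isolates the misbehaving `getFileChecksum` call so the FNFE can be interpreted deliberately rather than propagating as a copy failure.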