HADOOP-1437. Add Eclipse plugin in contrib. (Eugene Hung and Christophe Taton via cutting)

git-svn-id: https://svn.apache.org/repos/asf/lucene/hadoop/trunk@566838 13f79535-47bb-0310-9956-ffa450edef68
Doug Cutting, 17 years ago
commit 3d7c91a3f4
84 files changed, 7919 additions and 5 deletions
  1. +3 -0    CHANGES.txt
  2. +7 -5    src/contrib/build-contrib.xml
  3. +7 -0    src/contrib/eclipse-plugin/.classpath
  4. +28 -0   src/contrib/eclipse-plugin/.project
  5. +262 -0  src/contrib/eclipse-plugin/.settings/org.eclipse.jdt.core.prefs
  6. +6 -0    src/contrib/eclipse-plugin/.settings/org.eclipse.jdt.ui.prefs
  7. +6 -0    src/contrib/eclipse-plugin/.settings/org.eclipse.wst.validation.prefs
  8. +28 -0   src/contrib/eclipse-plugin/META-INF/MANIFEST.MF
  9. +5 -0    src/contrib/eclipse-plugin/build.properties
  10. +44 -0  src/contrib/eclipse-plugin/build.xml
  11. +262 -0 src/contrib/eclipse-plugin/plugin.xml
  12. +32 -0  src/contrib/eclipse-plugin/resources/ConnectDFS.xml
  13. +62 -0  src/contrib/eclipse-plugin/resources/CreateProj.xml
  14. BIN     src/contrib/eclipse-plugin/resources/Elephant100x100.gif
  15. BIN     src/contrib/eclipse-plugin/resources/Elephant16x16.gif
  16. +121 -0 src/contrib/eclipse-plugin/resources/HelloWorld.xml
  17. BIN     src/contrib/eclipse-plugin/resources/MAP100x100.gif
  18. BIN     src/contrib/eclipse-plugin/resources/MAP16x15.gif
  19. +24 -0  src/contrib/eclipse-plugin/resources/RunProj.xml
  20. +25 -0  src/contrib/eclipse-plugin/resources/SetHadoopPath.xml
  21. +18 -0  src/contrib/eclipse-plugin/resources/Setup.xml
  22. BIN     src/contrib/eclipse-plugin/resources/drive100x100.gif
  23. BIN     src/contrib/eclipse-plugin/resources/drive16x16.gif
  24. BIN     src/contrib/eclipse-plugin/resources/driver.png
  25. BIN     src/contrib/eclipse-plugin/resources/driverwiz.png
  26. BIN     src/contrib/eclipse-plugin/resources/elephantblue16x16.gif
  27. BIN     src/contrib/eclipse-plugin/resources/files.gif
  28. BIN     src/contrib/eclipse-plugin/resources/hadoop.gif
  29. BIN     src/contrib/eclipse-plugin/resources/hadoop_small.gif
  30. BIN     src/contrib/eclipse-plugin/resources/job.gif
  31. BIN     src/contrib/eclipse-plugin/resources/map16x16.gif
  32. BIN     src/contrib/eclipse-plugin/resources/mapper16.png
  33. BIN     src/contrib/eclipse-plugin/resources/mapwiz.png
  34. BIN     src/contrib/eclipse-plugin/resources/projwiz.png
  35. BIN     src/contrib/eclipse-plugin/resources/reduce100x100.gif
  36. BIN     src/contrib/eclipse-plugin/resources/reduce16x16.gif
  37. BIN     src/contrib/eclipse-plugin/resources/reducer-16x16.gif
  38. BIN     src/contrib/eclipse-plugin/resources/reducer16.png
  39. BIN     src/contrib/eclipse-plugin/resources/reducewiz.png
  40. BIN     src/contrib/eclipse-plugin/resources/spite_overcloud.png
  41. BIN     src/contrib/eclipse-plugin/resources/spitesmall.gif
  42. BIN     src/contrib/eclipse-plugin/resources/spitesmall.png
  43. +67 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/Activator.java
  44. +95 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/HadoopPerspectiveFactory.java
  45. +80 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/JSchUtilities.java
  46. +146 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/MapReduceNature.java
  47. +99 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewDriverWizard.java
  48. +270 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewDriverWizardPage.java
  49. +412 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewMapReduceProjectWizard.java
  50. +189 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewMapperWizard.java
  51. +192 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewReducerWizard.java
  52. +43 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/PropertyTester.java
  53. +275 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/DfsAction.java
  54. +68 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/EditServerAction.java
  55. +76 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/NewServerAction.java
  56. +76 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/OpenNewMRClassWizardAction.java
  57. +49 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/OpenNewMRProjectAction.java
  58. +102 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/RunOnHadoopActionDelegate.java
  59. +177 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/ActionProvider.java
  60. +203 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSContentProvider.java
  61. +157 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DfsFile.java
  62. +324 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DfsFolder.java
  63. +202 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DfsPath.java
  64. +58 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/LaunchShortcut.java
  65. +182 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/LocalMapReduceLaunchTabGroup.java
  66. +37 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/MutexRule.java
  67. +264 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/SWTUserInfo.java
  68. +47 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/StartHadoopLaunchTabGroup.java
  69. +373 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/StartMapReduceServer.java
  70. +63 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/preferences/HadoopHomeDirPreferencePage.java
  71. +34 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/preferences/PreferenceConstants.java
  72. +33 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/preferences/PreferenceInitializer.java
  73. +100 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopJob.java
  74. +124 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopPathPage.java
  75. +683 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopServer.java
  76. +33 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/IJobListener.java
  77. +102 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/JarModule.java
  78. +445 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/DefineHadoopServerLocWizardPage.java
  79. +75 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/HadoopServerSelectionListContentProvider.java
  80. +28 -0  src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/IHadoopServerListener.java
  81. +235 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/RunOnHadoopWizard.java
  82. +229 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/ServerRegistry.java
  83. +383 -0 src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/view/servers/ServerView.java
  84. +149 -0 src/contrib/eclipse-plugin/todo.txt

+ 3 - 0
CHANGES.txt

@@ -28,6 +28,9 @@ Trunk (unreleased changes)
     HADOOP-1610.  Add metrics for failed tasks.
     (Devaraj Das via tomwhite)
 
+    HADOOP-1437.  Add Eclipse plugin in contrib.
+    (Eugene Hung and Christophe Taton via cutting)
+
   OPTIMIZATIONS
 
     HADOOP-1565.  Reduce memory usage of NameNode by replacing 

+ 7 - 5
src/contrib/build-contrib.xml

@@ -64,12 +64,13 @@
 
 
   <!-- to be overridden by sub-projects -->
+  <target name="check-contrib"/>
   <target name="init-contrib"/>
 
   <!-- ====================================================== -->
   <!-- Stuff needed by all targets                            -->
   <!-- ====================================================== -->
-  <target name="init">
+  <target name="init" depends="check-contrib" unless="skip.contrib">
     <echo message="contrib: ${name}"/>
     <mkdir dir="${build.dir}"/>
     <mkdir dir="${build.classes}"/>
@@ -83,7 +84,7 @@
   <!-- ====================================================== -->
   <!-- Compile a Hadoop contrib's files                       -->
   <!-- ====================================================== -->
-  <target name="compile" depends="init">
+  <target name="compile" depends="init" unless="skip.contrib">
     <echo message="contrib: ${name}"/>
     <javac
      encoding="${build.encoding}"
@@ -132,7 +133,7 @@
   <!-- ====================================================== -->
   <!-- Make a Hadoop contrib's jar                            -->
   <!-- ====================================================== -->
-  <target name="jar" depends="compile">
+  <target name="jar" depends="compile" unless="skip.contrib">
     <echo message="contrib: ${name}"/>
     <jar
       jarfile="${build.dir}/hadoop-${name}.jar"
@@ -144,7 +145,8 @@
   <!-- ====================================================== -->
   <!-- Make a Hadoop contrib's examples jar                   -->
   <!-- ====================================================== -->
-  <target name="jar-examples" depends="compile-examples">
+  <target name="jar-examples" depends="compile-examples"
+          if="examples.available" unless="skip.contrib">
     <echo message="contrib: ${name}"/>
     <jar jarfile="${build.dir}/hadoop-${name}-examples.jar">
       <fileset dir="${build.classes}">
@@ -154,7 +156,7 @@
     </jar>
   </target>
   
-  <target name="deploy" depends="jar, jar-examples">
+  <target name="deploy" depends="jar, jar-examples" unless="skip.contrib">
     <echo message="contrib: ${name}"/>
     <mkdir dir="${deploy.dir}"/>
     <copy file="${build.dir}/hadoop-${name}.jar" todir="${deploy.dir}"/>
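
The new check-contrib hook gives each sub-project a veto over its own build: a contrib overrides the empty target, sets skip.contrib when a prerequisite is missing, and every target guarded by unless="skip.contrib" then becomes a no-op. A minimal sketch of the pattern, using a hypothetical prerequisite property ("some.prereq" is a placeholder; the eclipse-plugin's build.xml below does exactly this, keyed on eclipse.home):

  <!-- Sketch for a contrib's build.xml: skip the whole module when
       the (hypothetical) property some.prereq is not defined. -->
  <target name="check-contrib" unless="some.prereq">
    <property name="skip.contrib" value="yes"/>
    <echo message="some.prereq unset: skipping ${name}"/>
  </target>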

+ 7 - 0
src/contrib/eclipse-plugin/.classpath

@@ -0,0 +1,7 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+	<classpathentry kind="src" path="src"/>
+	<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
+	<classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>
+	<classpathentry kind="output" path="bin"/>
+</classpath>

+ 28 - 0
src/contrib/eclipse-plugin/.project

@@ -0,0 +1,28 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+	<name>MapReduceTools</name>
+	<comment></comment>
+	<projects>
+	</projects>
+	<buildSpec>
+		<buildCommand>
+			<name>org.eclipse.jdt.core.javabuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+		<buildCommand>
+			<name>org.eclipse.pde.ManifestBuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+		<buildCommand>
+			<name>org.eclipse.pde.SchemaBuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+	</buildSpec>
+	<natures>
+		<nature>org.eclipse.pde.PluginNature</nature>
+		<nature>org.eclipse.jdt.core.javanature</nature>
+	</natures>
+</projectDescription>

+ 262 - 0
src/contrib/eclipse-plugin/.settings/org.eclipse.jdt.core.prefs

@@ -0,0 +1,262 @@
+#Wed Aug 15 11:41:44 PDT 2007
+eclipse.preferences.version=1
+instance/org.eclipse.core.net/org.eclipse.core.net.hasMigrated=true
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.unusedLocal=preserve
+org.eclipse.jdt.core.compiler.debug.lineNumber=generate
+org.eclipse.jdt.core.compiler.debug.localVariable=generate
+org.eclipse.jdt.core.compiler.debug.sourceFile=generate
+org.eclipse.jdt.core.formatter.align_type_members_on_columns=false
+org.eclipse.jdt.core.formatter.alignment_for_arguments_in_allocation_expression=16
+org.eclipse.jdt.core.formatter.alignment_for_arguments_in_enum_constant=16
+org.eclipse.jdt.core.formatter.alignment_for_arguments_in_explicit_constructor_call=16
+org.eclipse.jdt.core.formatter.alignment_for_arguments_in_method_invocation=16
+org.eclipse.jdt.core.formatter.alignment_for_arguments_in_qualified_allocation_expression=16
+org.eclipse.jdt.core.formatter.alignment_for_assignment=16
+org.eclipse.jdt.core.formatter.alignment_for_binary_expression=16
+org.eclipse.jdt.core.formatter.alignment_for_compact_if=16
+org.eclipse.jdt.core.formatter.alignment_for_conditional_expression=80
+org.eclipse.jdt.core.formatter.alignment_for_enum_constants=0
+org.eclipse.jdt.core.formatter.alignment_for_expressions_in_array_initializer=16
+org.eclipse.jdt.core.formatter.alignment_for_multiple_fields=16
+org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration=16
+org.eclipse.jdt.core.formatter.alignment_for_parameters_in_method_declaration=16
+org.eclipse.jdt.core.formatter.alignment_for_selector_in_method_invocation=16
+org.eclipse.jdt.core.formatter.alignment_for_superclass_in_type_declaration=16
+org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_enum_declaration=16
+org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_type_declaration=16
+org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration=16
+org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration=16
+org.eclipse.jdt.core.formatter.blank_lines_after_imports=1
+org.eclipse.jdt.core.formatter.blank_lines_after_package=1
+org.eclipse.jdt.core.formatter.blank_lines_before_field=1
+org.eclipse.jdt.core.formatter.blank_lines_before_first_class_body_declaration=0
+org.eclipse.jdt.core.formatter.blank_lines_before_imports=1
+org.eclipse.jdt.core.formatter.blank_lines_before_member_type=1
+org.eclipse.jdt.core.formatter.blank_lines_before_method=1
+org.eclipse.jdt.core.formatter.blank_lines_before_new_chunk=1
+org.eclipse.jdt.core.formatter.blank_lines_before_package=0
+org.eclipse.jdt.core.formatter.blank_lines_between_import_groups=1
+org.eclipse.jdt.core.formatter.blank_lines_between_type_declarations=1
+org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_anonymous_type_declaration=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_array_initializer=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_block=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_block_in_case=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_constructor_declaration=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_enum_constant=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_enum_declaration=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_method_declaration=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_switch=end_of_line
+org.eclipse.jdt.core.formatter.brace_position_for_type_declaration=end_of_line
+org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment=false
+org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment=false
+org.eclipse.jdt.core.formatter.comment.format_block_comments=true
+org.eclipse.jdt.core.formatter.comment.format_header=false
+org.eclipse.jdt.core.formatter.comment.format_html=true
+org.eclipse.jdt.core.formatter.comment.format_javadoc_comments=true
+org.eclipse.jdt.core.formatter.comment.format_line_comments=true
+org.eclipse.jdt.core.formatter.comment.format_source_code=true
+org.eclipse.jdt.core.formatter.comment.indent_parameter_description=false
+org.eclipse.jdt.core.formatter.comment.indent_root_tags=true
+org.eclipse.jdt.core.formatter.comment.insert_new_line_before_root_tags=insert
+org.eclipse.jdt.core.formatter.comment.insert_new_line_for_parameter=do not insert
+org.eclipse.jdt.core.formatter.comment.line_length=77
+org.eclipse.jdt.core.formatter.compact_else_if=true
+org.eclipse.jdt.core.formatter.continuation_indentation=2
+org.eclipse.jdt.core.formatter.continuation_indentation_for_array_initializer=2
+org.eclipse.jdt.core.formatter.format_guardian_clause_on_one_line=false
+org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header=true
+org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_constant_header=true
+org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_declaration_header=true
+org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_type_header=true
+org.eclipse.jdt.core.formatter.indent_breaks_compare_to_cases=true
+org.eclipse.jdt.core.formatter.indent_empty_lines=false
+org.eclipse.jdt.core.formatter.indent_statements_compare_to_block=true
+org.eclipse.jdt.core.formatter.indent_statements_compare_to_body=true
+org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_cases=true
+org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch=true
+org.eclipse.jdt.core.formatter.indentation.size=4
+org.eclipse.jdt.core.formatter.insert_new_line_after_annotation=insert
+org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer=do not insert
+org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing=insert
+org.eclipse.jdt.core.formatter.insert_new_line_before_catch_in_try_statement=do not insert
+org.eclipse.jdt.core.formatter.insert_new_line_before_closing_brace_in_array_initializer=do not insert
+org.eclipse.jdt.core.formatter.insert_new_line_before_else_in_if_statement=do not insert
+org.eclipse.jdt.core.formatter.insert_new_line_before_finally_in_try_statement=do not insert
+org.eclipse.jdt.core.formatter.insert_new_line_before_while_in_do_statement=do not insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_annotation_declaration=insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration=insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block=insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant=insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration=insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body=insert
+org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_after_and_in_type_parameter=insert
+org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator=insert
+org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation_type_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_binary_operator=insert
+org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_arguments=insert
+org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters=insert
+org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block=insert
+org.eclipse.jdt.core.formatter.insert_space_after_closing_paren_in_cast=insert
+org.eclipse.jdt.core.formatter.insert_space_after_colon_in_assert=insert
+org.eclipse.jdt.core.formatter.insert_space_after_colon_in_case=insert
+org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional=insert
+org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for=insert
+org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_allocation_expression=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_annotation=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_array_initializer=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_parameters=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_throws=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_constant_arguments=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_declarations=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_inits=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_parameters=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_throws=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_invocation_arguments=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_field_declarations=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_local_declarations=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_parameterized_type_reference=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_superinterfaces=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_arguments=insert
+org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_parameters=insert
+org.eclipse.jdt.core.formatter.insert_space_after_ellipsis=insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_parameters=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_brace_in_array_initializer=insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_allocation_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_annotation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_cast=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_catch=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_constructor_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_enum_constant=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_for=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_invocation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_parenthesized_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_switch=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_synchronized=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_while=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_prefix_operator=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_question_in_conditional=insert
+org.eclipse.jdt.core.formatter.insert_space_after_question_in_wildcard=do not insert
+org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_for=insert
+org.eclipse.jdt.core.formatter.insert_space_after_unary_operator=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_and_in_type_parameter=insert
+org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator=insert
+org.eclipse.jdt.core.formatter.insert_space_before_at_in_annotation_type_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_binary_operator=insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_brace_in_array_initializer=insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_allocation_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_annotation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_cast=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_catch=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_constructor_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_if=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_switch=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_synchronized=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_while=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_colon_in_assert=insert
+org.eclipse.jdt.core.formatter.insert_space_before_colon_in_case=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_colon_in_conditional=insert
+org.eclipse.jdt.core.formatter.insert_space_before_colon_in_default=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_colon_in_for=insert
+org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_allocation_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_annotation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_array_initializer=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_constant_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_explicitconstructorcall_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_increments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_parameters=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_invocation_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_field_declarations=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_local_declarations=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_superinterfaces=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_parameters=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_ellipsis=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_parameterized_type_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_arguments=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_annotation_type_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_array_initializer=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_block=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_constructor_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_constant=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_method_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_switch=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_type_declaration=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_type_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation_type_member_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_catch=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_enum_constant=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_for=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_invocation=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_switch=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_synchronized=insert
+org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_while=insert
+org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_return=insert
+org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw=insert
+org.eclipse.jdt.core.formatter.insert_space_before_postfix_operator=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_prefix_operator=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_question_in_conditional=insert
+org.eclipse.jdt.core.formatter.insert_space_before_question_in_wildcard=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_semicolon=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_for=do not insert
+org.eclipse.jdt.core.formatter.insert_space_before_unary_operator=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_braces_in_array_initializer=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_brackets_in_array_allocation_expression=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_constructor_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_enum_constant=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_declaration=do not insert
+org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_invocation=do not insert
+org.eclipse.jdt.core.formatter.keep_else_statement_on_same_line=false
+org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line=false
+org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line=false
+org.eclipse.jdt.core.formatter.keep_then_statement_on_same_line=false
+org.eclipse.jdt.core.formatter.lineSplit=77
+org.eclipse.jdt.core.formatter.never_indent_block_comments_on_first_column=false
+org.eclipse.jdt.core.formatter.never_indent_line_comments_on_first_column=false
+org.eclipse.jdt.core.formatter.number_of_blank_lines_at_beginning_of_method_body=0
+org.eclipse.jdt.core.formatter.number_of_empty_lines_to_preserve=1
+org.eclipse.jdt.core.formatter.put_empty_statement_on_new_line=true
+org.eclipse.jdt.core.formatter.tabulation.char=space
+org.eclipse.jdt.core.formatter.tabulation.size=2
+org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations=false
+org.eclipse.jdt.core.formatter.wrap_before_binary_operator=true

+ 6 - 0
src/contrib/eclipse-plugin/.settings/org.eclipse.jdt.ui.prefs

@@ -0,0 +1,6 @@
+#Tue Aug 14 19:41:15 PDT 2007
+eclipse.preferences.version=1
+formatter_profile=_Lucene
+formatter_settings_version=11
+instance/org.eclipse.core.net/org.eclipse.core.net.hasMigrated=true
+org.eclipse.jdt.ui.text.custom_code_templates=<?xml version\="1.0" encoding\="UTF-8" standalone\="no"?><templates/>

+ 6 - 0
src/contrib/eclipse-plugin/.settings/org.eclipse.wst.validation.prefs

@@ -0,0 +1,6 @@
+#Tue Aug 14 19:41:15 PDT 2007
+DELEGATES_PREFERENCE=delegateValidatorListorg.eclipse.wst.xsd.core.internal.validation.eclipse.XSDDelegatingValidator\=org.eclipse.wst.xsd.core.internal.validation.eclipse.Validator;org.eclipse.wst.wsdl.validation.internal.eclipse.WSDLDelegatingValidator\=org.eclipse.wst.wsdl.validation.internal.eclipse.Validator;
+USER_BUILD_PREFERENCE=enabledBuildValidatorListorg.eclipse.wst.xsd.core.internal.validation.eclipse.XSDDelegatingValidator;org.eclipse.jst.jsp.core.internal.validation.JSPContentValidator;org.eclipse.wst.html.internal.validation.HTMLValidator;org.eclipse.wst.xml.core.internal.validation.eclipse.Validator;org.eclipse.jst.jsf.validation.internal.appconfig.AppConfigValidator;org.eclipse.jst.jsp.core.internal.validation.JSPBatchValidator;org.eclipse.wst.dtd.core.internal.validation.eclipse.Validator;org.eclipse.wst.wsi.ui.internal.WSIMessageValidator;org.eclipse.wst.wsdl.validation.internal.eclipse.WSDLDelegatingValidator;org.eclipse.jst.jsf.validation.internal.JSPSemanticsValidator;
+USER_MANUAL_PREFERENCE=enabledManualValidatorListorg.eclipse.wst.xsd.core.internal.validation.eclipse.XSDDelegatingValidator;org.eclipse.jst.jsp.core.internal.validation.JSPContentValidator;org.eclipse.wst.html.internal.validation.HTMLValidator;org.eclipse.wst.xml.core.internal.validation.eclipse.Validator;org.eclipse.jst.jsf.validation.internal.appconfig.AppConfigValidator;org.eclipse.jst.jsp.core.internal.validation.JSPBatchValidator;org.eclipse.wst.dtd.core.internal.validation.eclipse.Validator;org.eclipse.wst.wsi.ui.internal.WSIMessageValidator;org.eclipse.wst.wsdl.validation.internal.eclipse.WSDLDelegatingValidator;org.eclipse.jst.jsf.validation.internal.JSPSemanticsValidator;
+USER_PREFERENCE=overrideGlobalPreferencesfalse
+eclipse.preferences.version=1

+ 28 - 0
src/contrib/eclipse-plugin/META-INF/MANIFEST.MF

@@ -0,0 +1,28 @@
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: MapReduce Tools for Eclipse
+Bundle-SymbolicName: org.apache.hadoop.eclipse;singleton:=true
+Bundle-Version: 1.0.4
+Bundle-Activator: org.apache.hadoop.eclipse.Activator
+Bundle-Localization: plugin
+Require-Bundle: org.eclipse.ui,
+ org.eclipse.core.runtime,
+ org.eclipse.jdt.launching,
+ org.eclipse.debug.core,
+ org.eclipse.jdt,
+ org.eclipse.jdt.core,
+ org.eclipse.core.resources,
+ org.eclipse.ui.ide,
+ org.eclipse.jdt.ui,
+ org.eclipse.debug.ui,
+ org.eclipse.jdt.debug.ui,
+ org.eclipse.core.expressions,
+ org.eclipse.ui.cheatsheets,
+ org.eclipse.ui.console,
+ org.eclipse.ui.navigator,
+ org.eclipse.core.filesystem,
+ org.eclipse.team.cvs.ssh2,
+ com.jcraft.jsch
+Eclipse-LazyStart: true
+Bundle-ClassPath: bin/
+Bundle-Vendor: Apache Hadoop

+ 5 - 0
src/contrib/eclipse-plugin/build.properties

@@ -0,0 +1,5 @@
+output.. = bin/
+bin.includes = META-INF/,\
+               plugin.xml,\
+               resources/,\
+               bin/

+ 44 - 0
src/contrib/eclipse-plugin/build.xml

@@ -0,0 +1,44 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+
+<project default="jar" name="eclipse-plugin">
+
+  <import file="../build-contrib.xml"/>
+
+  <path id="eclipse-sdk-jars">
+    <fileset dir="${eclipse.home}/plugins/">
+      <include name="org.eclipse.ui*.jar"/>
+      <include name="org.eclipse.jdt*.jar"/>
+      <include name="org.eclipse.core*.jar"/>
+      <include name="org.eclipse.equinox*.jar"/>
+      <include name="org.eclipse.debug*.jar"/>
+      <include name="org.eclipse.osgi*.jar"/>
+      <include name="org.eclipse.swt*.jar"/>
+      <include name="org.eclipse.jface*.jar"/>
+
+      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
+      <include name="com.jcraft.jsch*.jar"/>
+    </fileset> 
+  </path>
+
+  <!-- Override classpath to include Eclipse SDK jars -->
+  <path id="classpath">
+    <pathelement location="${build.classes}"/>
+    <path refid="eclipse-sdk-jars"/>
+  </path>
+
+  <!-- Skip building if eclipse.home is unset. -->
+  <target name="check-contrib" unless="eclipse.home">
+    <property name="skip.contrib" value="yes"/>
+    <echo message="eclipse.home unset: skipping eclipse plugin"/>
+  </target>
+
+  <!-- Override jar target to specify manifest -->
+  <target name="jar" depends="compile" unless="skip.contrib">
+    <jar
+      jarfile="${build.dir}/hadoop-${name}.jar"
+      basedir="${build.classes}"
+      manifest="META-INF/MANIFEST.MF"
+    />
+  </target>
+
+</project>
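
A plausible way to drive this conditional build is to supply the Eclipse SDK location as a property. The wrapper target below is a hypothetical sketch (the /opt/eclipse path is an assumption), roughly equivalent to running "ant -Declipse.home=/opt/eclipse jar" in this directory:

  <!-- Hypothetical wrapper target; /opt/eclipse stands in for a real
       Eclipse SDK install. Without eclipse.home, check-contrib sets
       skip.contrib and the jar target above becomes a no-op. -->
  <target name="plugin-jar">
    <ant antfile="build.xml" target="jar">
      <property name="eclipse.home" value="/opt/eclipse"/>
    </ant>
  </target>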

+ 262 - 0
src/contrib/eclipse-plugin/plugin.xml

@@ -0,0 +1,262 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<?eclipse version="3.2"?>
+<plugin>
+   <extension
+         point="org.eclipse.debug.core.launchConfigurationTypes">
+      <launchConfigurationType
+            delegate="org.apache.hadoop.eclipse.launch.StartMapReduceServer"
+            id="org.apache.hadoop.eclipse.launch.StartServer"
+            modes="run,debug"
+            name="Start Hadoop Server"
+            public="true"/> 
+   </extension>
+   <extension 
+         name="MapReduce Nature" 
+         id="org.apache.hadoop.eclipse.Nature"
+         point="org.eclipse.core.resources.natures">
+      <runtime>
+         <run class="org.apache.hadoop.eclipse.MapReduceNature"/>
+      </runtime>
+   </extension>
+   <extension 
+         point="org.eclipse.ui.ide.projectNatureImages">
+   </extension>
+   <extension
+         point="org.eclipse.ui.newWizards">
+      <primaryWizard id="org.apache.hadoop.eclipse.NewProjectWizard"/> 
+      <wizard
+            category="org.apache.hadoop.eclipse.category"
+            class="org.apache.hadoop.eclipse.NewMapReduceProjectWizard"
+            finalPerspective="org.apache.hadoop.eclipse.Perspective"
+            hasPages="true"
+            icon="resources/Elephant16x16.gif"
+            id="org.apache.hadoop.eclipse.NewProjectWizard"
+            name="MapReduce Project"
+            preferredPerspectives="org.apache.hadoop.eclipse.Perspective"
+            project="true"/>
+      <wizard
+            category="org.apache.hadoop.eclipse.category"
+            class="org.apache.hadoop.eclipse.NewMapperWizard"
+            icon="resources/mapper16.png"
+            id="org.apache.hadoop.eclipse.NewMapperWizard"
+            name="Mapper"
+            project="false"/>
+      <wizard
+            category="org.apache.hadoop.eclipse.category"
+            class="org.apache.hadoop.eclipse.NewReducerWizard"
+            icon="resources/reducer16.png"
+            id="org.apache.hadoop.eclipse.NewReducerWizard"
+            name="Reducer"/>
+      <wizard
+            category="org.apache.hadoop.eclipse.category"
+            class="org.apache.hadoop.eclipse.NewDriverWizard"
+            icon="resources/driver.png"
+            id="org.apache.hadoop.eclipse.NewDriverWizard"
+            name="MapReduce Driver"
+            project="false"/>
+      <category
+            id="org.apache.hadoop.eclipse.category"
+            name="MapReduce"/>
+   </extension>
+   <extension
+         point="org.eclipse.debug.ui.launchConfigurationTypeImages">
+      <launchConfigurationTypeImage
+            configTypeID="org.apache.hadoop.eclipse.launch.Local"
+            icon="resources/elephantblue16x16.gif"
+            id="Hadouken.launchConfigurationTypeImage1"/>
+   </extension>
+   <extension
+         point="org.eclipse.debug.ui.launchConfigurationTabGroups">
+      <launchConfigurationTabGroup
+            class="org.apache.hadoop.eclipse.launch.StartHadoopLaunchTabGroup"
+            id="org.apache.hadoop.eclipse.launch.StartHadoopLaunchTabGroup"
+            type="org.apache.hadoop.eclipse.launch.StartServer"/>
+     
+   </extension>
+
+   <extension
+         point="org.eclipse.ui.perspectives">
+      <perspective
+            class="org.apache.hadoop.eclipse.HadoopPerspectiveFactory"
+            icon="resources/elephantblue16x16.gif"
+            id="org.apache.hadoop.eclipse.Perspective"
+            name="MapReduce"/>
+   </extension>
+   <extension
+         point="org.eclipse.core.expressions.propertyTesters">
+      <propertyTester
+            class="org.apache.hadoop.eclipse.PropertyTester"
+            id="mapreduce.deployable"
+            namespace="mapreduce"
+            properties="deployable"
+            type="org.eclipse.core.resources.IResource"/>
+      <propertyTester
+            class="org.apache.hadoop.eclipse.PropertyTester"
+            id="mapreduce.server"
+            namespace="mapreduce"
+            properties="server"
+            type="org.eclipse.wst.server.core.IServer"/>
+   </extension>
+   <extension
+         point="org.eclipse.debug.ui.launchShortcuts">
+      <shortcut
+            class="org.apache.hadoop.eclipse.launch.LaunchShortcut"
+            icon="resources/elephantblue16x16.gif"
+            id="org.apache.hadoop.eclipse.launch.shortcut"
+            label="Run on Hadoop"
+            modes="run">
+         <contextualLaunch>
+
+           <enablement>
+             <with variable="selection">
+               <count value="1"/>
+               <iterate>
+                <or>
+               	  <test property="org.eclipse.jdt.launching.hasMain"/>
+               	  <and>
+               	     <test property="org.eclipse.jdt.launching.isContainer"/>
+               	     <test property="org.eclipse.jdt.launching.hasProjectNature" args="org.eclipse.jdt.core.javanature"/>
+               	     <test property="org.eclipse.jdt.launching.hasProjectNature" args="org.apache.hadoop.eclipse.Nature"/>               	     
+               	  </and>
+               	</or>
+               </iterate>
+               </with>
+           </enablement>
+  		 </contextualLaunch>
+         <perspective id="org.apache.hadoop.eclipse.Perspective"/>
+      </shortcut>
+   </extension>
+   <extension
+         point="org.eclipse.ui.views">
+      <category
+            id="com.ibm.hipods.mapreduce.views.category"
+            name="MapReduce Tools"/>
+      <view
+            allowMultiple="false"
+            category="com.ibm.hipods.mapreduce.views.category"
+            class="org.apache.hadoop.eclipse.view.servers.ServerView"
+            id="org.apache.hadoop.eclipse.view.servers"
+            name="MapReduce Servers"/>
+   </extension>
+   <extension
+         point="org.eclipse.ui.cheatsheets.cheatSheetContent">
+      <category
+            id="org.apache.hadoop.eclipse.cheatsheet.Examples"
+            name="MapReduce"/>
+      <cheatsheet
+            category="org.apache.hadoop.eclipse.cheatsheet.Examples"
+            composite="true"
+            contentFile="resources/HelloWorld.xml"
+            id="org.apache.hadoop.eclipse.cheatsheet"
+            name="Write a MapReduce application"/>
+   </extension>
+   <extension
+         point="org.eclipse.ui.navigator.navigatorContent">
+      <navigatorContent
+            activeByDefault="true"
+            contentProvider="org.apache.hadoop.eclipse.dfs.DFSContentProvider"
+            icon="resources/elephantblue16x16.gif"
+            id="org.apache.hadoop.eclipse.views.dfscontent"
+            labelProvider="org.apache.hadoop.eclipse.dfs.DFSContentProvider"
+            name="Distributed File Systems"
+            priority="highest"
+            providesSaveables="false">
+         <triggerPoints>
+         	<or>
+             <instanceof value="org.apache.hadoop.eclipse.dfs.DfsPath"/>
+             <adapt type="org.eclipse.core.resources.IResource">
+                <test
+                      forcePluginActivation="true"
+                      property="mapreduce.deployable"/>
+             </adapt>
+    	    </or>
+         </triggerPoints>
+         <actionProvider class="org.apache.hadoop.eclipse.dfs.ActionProvider">
+         </actionProvider>
+         <possibleChildren>
+         	<or>
+	            <instanceof value="org.apache.hadoop.eclipse.dfs.DfsPath"/>
+    	        <instanceof value="org.eclipse.wst.server.core.IServer"/>
+        	    <instanceof value="org.apache.hadoop.eclipse.dfs.DFSContentProvider$DFS"/>
+        	</or>
+         </possibleChildren>
+      </navigatorContent>
+   </extension>
+   <extension
+         point="org.eclipse.ui.navigator.viewer">
+      <viewer
+            viewerId="org.apache.hadoop.eclipse.dfs.DFSViewer">
+            
+          <popupMenu
+                allowsPlatformContributions="true"
+                id="org.apache.hadoop.eclipse.dfs.DFSViewer#PopupMenu">  
+             <insertionPoint name="group.new"/>
+             <insertionPoint
+                   name="group.open"
+                   separator="true"/>
+             <insertionPoint name="group.openWith"/>   
+             <insertionPoint name="group.edit"
+                   separator="true"/>   
+             <insertionPoint name="group.reorganize" />         
+             <insertionPoint
+                   name="group.port"
+                   separator="true"/>     
+             <insertionPoint
+                   name="group.build"
+                   separator="true"/> 
+             <insertionPoint
+                   name="group.generate"
+                   separator="true"/> 
+             <insertionPoint
+                   name="group.search"
+                   separator="true"/>              
+             <insertionPoint
+                   name="additions"
+                   separator="true"/>              
+             <insertionPoint
+                   name="group.properties"
+                   separator="true"/>
+          </popupMenu>
+            
+      </viewer>
+      <viewerContentBinding viewerId="org.eclipse.ui.navigator.ProjectExplorer">
+         <includes>
+            <contentExtension
+                  isRoot="false"
+                  pattern="org.apache.hadoop.eclipse.views.dfscontent"/>
+            <actionExtension pattern="org.apache.hadoop.eclipse.views.dfscontent.*"/>
+
+         </includes>
+      </viewerContentBinding>
+   </extension>
+   <extension
+         point="org.eclipse.core.filesystem.filesystems">
+      <filesystem scheme="dfs">
+         <run class="org.apache.hadoop.eclipse.dfs.FileSystem"/>
+      </filesystem>
+   </extension>
+   <extension
+         point="org.eclipse.ui.popupMenus">
+      <viewerContribution
+            id="Hadouken.viewerContribution1"
+            targetID="org.eclipse.ui.navigator.ProjectExplorer#PopupMenu"/>
+   </extension>
+   <extension
+         point="org.eclipse.ui.preferencePages">
+      <page
+            class="org.apache.hadoop.eclipse.preferences.HadoopHomeDirPreferencePage"
+            id="org.apache.hadoop.eclipse.preferences.HadoopHomeDirPreferencePage"
+            name="Hadoop Home Directory"/>
+   </extension>
+   <extension
+         point="org.eclipse.core.runtime.preferences">
+      <initializer class="org.apache.hadoop.eclipse.preferences.PreferenceInitializer"/>
+   </extension>
+ 
+</plugin>

+ 32 - 0
src/contrib/eclipse-plugin/resources/ConnectDFS.xml

@@ -0,0 +1,32 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<cheatsheet title="Set default Hadoop path tutorial">
+	<intro>
+		<description>
+			This tutorial informs you how to set the default Hadoop
+			directory for the plugin.
+		</description>
+	</intro>
+	<item title="Create MapReduce Cluster" skip="true">
+		<description>
+			Define a MapReduce cluster [if you have not done so already]
+			by opening the MapReduce Servers view and clicking on the
+			blue elephant in the upper right.
+
+			Use the following embedded command to create a new Hadoop Server:
+		</description>
+
+		<action pluginId="com.ibm.hipods.mapreduce"
+			class="org.apache.hadoop.eclipse.actions.NewServerAction" />
+	</item>
+	<item title="Open and Explore DFS Tree">
+
+		<description>
+			Project Explorer view shows an elephant icon for each defined
+			server.  Opening a server entry will open a connection to
+			the root of that server's DFS tree.  You can then explore the
+			DFS tree.
+		</description>
+
+	</item>
+</cheatsheet>

+ 62 - 0
src/contrib/eclipse-plugin/resources/CreateProj.xml

@@ -0,0 +1,62 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<cheatsheet title="MapReduce project creation tutorial">
+	<intro>
+		<description>
+			This tutorial guides you through the creation of a simple
+			MapReduce project with three MapReduce classes: a Mapper, a
+			Reducer, and a Driver.
+		</description>
+	</intro>
+	<item title="Open the MapReduce Perspective">
+		<action pluginId="org.eclipse.ui.cheatsheets"
+			class="org.eclipse.ui.internal.cheatsheets.actions.OpenPerspective"
+			param1="org.apache.hadoop.eclipse.Perspective" />
+		<description>
+			Select <b>Window->Open Perspective->MapReduce</b> in the menubar at
+			the top of the workbench. This step changes the perspective
+			to set up the Eclipse workbench for MapReduce development.
+		</description>
+	</item>
+	<item title="Create a MapReduce project" skip="true">
+		<action pluginId="com.ibm.hipods.mapreduce"
+			class="org.apache.hadoop.eclipse.actions.OpenNewMRProjectAction" />
+		<description>
+			The first thing you will need is a MapReduce Project. If you
+			already have a MapReduce project in your workspace that you
+			would like to use, you may skip this step by clicking the
+			"Click to Skip" button. If not, select <b>File->New->Project</b>
+			and choose MapReduce Project in the list. Complete the
+			subsequent pages as required.
+		</description>
+	</item>
+	<item title="Create a MapReduce package" skip="true">
+		<action pluginId="org.eclipse.jdt.ui"
+			class="org.eclipse.jdt.ui.actions.OpenNewPackageWizardAction" />
+		<description>
+			You should now have a MapReduce project in your workspace.
+			The next thing to do is creating a package. Use the Eclipse
+			tools by selecting <b>File -> New ->Package</b> action. Specify the
+			source folder (the project containing the package). Then,
+			give the package a name, such as "mapreduce.test", and click
+			the "Finish" button. If you already have a project with a
+			package you might as well skip this step.
+		</description>
+	</item>
+	<item title="Create the MapReduce application classes" skip="true">
+		<description>
+			Now you should be set up for creating your MapReduce
+			application.  The MapReduce application consists of three
+			classes: a Mapper class, a Reducer class and a Driver class.
+			In this step you will create the three classes.  Use the
+			class wizard by selecting <b>File -> New -> Class</b>.  
+			Repeat this for	every class.
+		</description>
+		<repeated-subitem values="Mapper,Reducer,Driver">
+			<subitem label="Create the class ${this}.">
+				<action pluginId="com.ibm.hipods.mapreduce"
+					class="org.apache.hadoop.eclipse.actions.OpenNewMRClassWizardAction"
+					param1="${this}" />
+			</subitem>
+		</repeated-subitem>
+	</item>
+</cheatsheet>
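
The three classes this cheat sheet has you create target the org.apache.hadoop.mapred API of this era. As a rough sketch only (the wizard-generated skeletons are not part of this commit, and the interface signatures changed in later 0.x releases), a word-count style Mapper looked something like:

  import java.io.IOException;
  import java.util.StringTokenizer;

  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.io.Writable;
  import org.apache.hadoop.io.WritableComparable;
  import org.apache.hadoop.mapred.MapReduceBase;
  import org.apache.hadoop.mapred.Mapper;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reporter;

  // Sketch of a pre-generics word-count Mapper: splits each input
  // line into tokens and emits a (word, 1) pair per token.
  public class WordCountMapper extends MapReduceBase implements Mapper {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    public void map(WritableComparable key, Writable value,
                    OutputCollector output, Reporter reporter)
        throws IOException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        output.collect(word, ONE);
      }
    }
  }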

BIN
src/contrib/eclipse-plugin/resources/Elephant100x100.gif


BIN
src/contrib/eclipse-plugin/resources/Elephant16x16.gif


+ 121 - 0
src/contrib/eclipse-plugin/resources/HelloWorld.xml

@@ -0,0 +1,121 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<compositeCheatsheet name="IBM MapReduce Tools for Eclipse">
+	<taskGroup name="Develop Hadoop Applications" kind="set">
+		<intro
+			href="http://www.alphaworks.ibm.com/tech/mapreducetools">
+			IBM(R) MapReduce Tools for Eclipse enables you to write
+			distributed applications based on the MapReduce paradigm
+			using the Apache Hadoop runtime. This cheat sheet will walk
+			you through the steps needed to write a MapReduce
+			application and run it on a Hadoop server.
+		</intro>
+		<onCompletion>
+
+		</onCompletion>
+		<taskGroup name="Initial Setup" kind="sequence" skip="true">
+			<intro>
+				This task takes you through the steps to setup the
+				Hadoop environment with the MapReduce Tools. If you
+				already have Hadoop installed and linked to Eclipse, you
+				can skip this task.
+			</intro>
+			<onCompletion>
+				Congratulations! You have now installed Hadoop on your
+				computer and linked it with the MapReduce Tools.
+			</onCompletion>
+			<task kind="cheatsheet"
+				name="Download and unzip Apache Hadoop distribution">
+				<intro>
+					Hadoop must be downloaded to a place where Eclipse
+					can access its libraries. This task covers the steps
+					needed to execute this task.
+				</intro>
+				<param name="showIntro" value="false" />
+				<param name="path" value="Setup.xml" />
+				<onCompletion>
+					The plugin currently supports Hadoop v0.7.2 through
+					0.12.2. Now click on the top-most link that you feel
+					comfortable installing.
+				</onCompletion>
+			</task>
+			<task kind="cheatsheet"
+				name="Specify path to Apache Hadoop distribution">
+				...
+				<intro>
+					This tutorial informs you how to set the default
+					Hadoop directory for the plugin.
+				</intro>
+				<param name="showIntro" value="false" />
+			 	<param name="path" value="SetHadoopPath.xml" />
+			</task>
+		</taskGroup>
+		<taskGroup name="Create and run a MapReduce project"
+			kind="sequence" skip="true">
+			<intro>
+				This section walks you through the steps to create and
+				run your MapReduce project.
+			</intro>
+
+			<task kind="cheatsheet" name="Create a MapReduce project"
+				skip="true">
+				<intro>
+					This tutorial guides you through the creation of a
+					simple MapReduce project with three MapReduce
+					classes: a Mapper, a Reducer, and a Driver.
+				</intro>
+				<param name="showIntro" value="false" />
+				<param name="path" value="CreateProj.xml" />
+				<onCompletion>
+					Congratulations! You have now mastered the steps for
+					creating a Hadoop project.
+				</onCompletion>
+			</task>
+			<task kind="cheatsheet"
+				name="Run a MapReduce application">
+				<param name="path" value="RunProj.xml" />
+				<onCompletion>
+					Congratulations! You have now mastered the steps for
+					implementing a Hadoop application.
+				</onCompletion>
+			</task>
+
+		</taskGroup>
+
+		<taskGroup name="Using a MapReduce cluster" kind="set"
+			skip="true">
+			<intro>
+				The MapReduce Tools for Eclipse plugin lets you 
+				browse and upload files to the DFS of a MapReduce cluster.
+			</intro>
+			<onCompletion>
+				Congratulations!  You have completed the tutorials on using a
+				MapReduce Cluster.
+			</onCompletion>
+			<task kind="cheatsheet"
+				name="Connect to a MapReduce cluster" skip="true">
+				<intro>
+					This tutorial explains how to show files in the DFS of a
+					MapReduce cluster.
+				</intro>
+				<param name="showIntro" value="false" />
+				<param name="path" value="ConnectDFS.xml" />
+			</task>
+			<task kind="cheatsheet" id="viewFiles"
+				name="Viewing file contents on the Hadoop Distributed File System (HDFS)">
+				<intro>
+					Simply double-click on any file in the DFS in the Project
+					Explorer view.
+				</intro>
+			</task>
+			<task kind="cheatsheet" 
+				name="Transfer files to the Hadoop Distributed File System (HDFS)">
+				<intro>
+					Right-click on an existing directory in the DFS.<br />
+					Choose the <b>Import from local directory</b> option.
+					<br />
+					Note that files can only be uploaded to the HDFS at this time.
+				</intro>
+			</task>
+		</taskGroup>
+	</taskGroup>
+</compositeCheatsheet>

BIN
src/contrib/eclipse-plugin/resources/MAP100x100.gif


BIN
src/contrib/eclipse-plugin/resources/MAP16x15.gif


+ 24 - 0
src/contrib/eclipse-plugin/resources/RunProj.xml

@@ -0,0 +1,24 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<cheatsheet title="MapReduce project run tutorial">
+	<intro>
+		<description>
+			This tutorial shows you how to run your newly created
+			MapReduce project in one of two ways: locally as a Java
+			Application, or on a Hadoop server.
+		</description>
+	</intro>
+	<item title="Run as Java Application">
+		<description>
+			To run your MapReduce application locally, right-click on
+			your Driver class in the Package Explorer and select <b>Run as
+			/ Java Application</b>.
+		</description>
+	</item>
+	<item title="Run on Hadoop Server">
+		<description>
+			To run your MapReduce application on a Hadoop server, right-click on
+			your Driver class in the Package Explorer and select <b>Run as
+			/ Run on Hadoop</b>.
+		</description>
+	</item>
+</cheatsheet>

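For reference, a driver of the shape this sheet assumes; it is essentially what the New Driver wizard (NewDriverWizardPage, later in this patch) generates. The class name, input/output paths, and Mapper/Reducer classes are illustrative:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class WordCountDriver {
  public static void main(String[] args) {
    JobClient client = new JobClient();
    JobConf conf = new JobConf(WordCountDriver.class);

    // output types of the job
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);

    // input and output are DIRECTORIES (not files)
    conf.setInputPath(new Path("src"));
    conf.setOutputPath(new Path("out"));

    conf.setMapperClass(WordCountMapper.class);
    conf.setReducerClass(WordCountReducer.class);

    client.setConf(conf);
    try {
      JobClient.runJob(conf);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}

"Run as / Java Application" executes this main() locally; "Run on Hadoop" submits the same driver to a cluster.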
+ 25 - 0
src/contrib/eclipse-plugin/resources/SetHadoopPath.xml

@@ -0,0 +1,25 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<cheatsheet title="Set default Hadoop path tutorial">
+	<intro>
+		<description>
+			This tutorial shows you how to set the default Hadoop
+			directory for the plugin.
+		</description>
+	</intro>
+	<item title="Open Plugin Preferences window">
+		<description>
+			To set the default Hadoop directory, open the plugin
+			preferences from the menu option
+			<b>Window > Preferences</b>.  <br />
+			Go to the <b>Hadoop Home Directory</b>
+			preference, and enter the installation directory there.
+
+			Use the following embedded command to open the Preferences
+			window:
+		</description>
+
+		<action pluginId="org.eclipse.jdt.ui"
+			class="org.eclipse.ui.internal.OpenPreferencesAction" />
+	</item>
+</cheatsheet>

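Programmatically, the value set on this preference page is read back through the plugin's preference store, mirroring updateHadoopDirLabelFromPreferences() in NewMapReduceProjectWizard later in this patch. A minimal sketch (the helper class name is hypothetical):

import org.apache.hadoop.eclipse.Activator;
import org.apache.hadoop.eclipse.preferences.PreferenceConstants;

public class HadoopHomeLookup {
  /** Returns the configured Hadoop home directory, or "" if unset. */
  public static String hadoopHome() {
    return Activator.getDefault().getPreferenceStore()
        .getString(PreferenceConstants.P_PATH);
  }
}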
+ 18 - 0
src/contrib/eclipse-plugin/resources/Setup.xml

@@ -0,0 +1,18 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<cheatsheet title="Open Browser">
+  <intro>
+    <description>This cheat sheet launches a browser to the Hadoop website.</description>
+  </intro>  
+  <item title="Open Browser">
+     <description>
+        Go to http://lucene.apache.org/hadoop/, and follow
+        links to download the latest stable distribution of
+        Hadoop.
+
+        Use the following embedded command to launch the Hadoop
+        web site in a browser.</description>
+     <command serialization=
+        "org.eclipse.ui.browser.openBrowser(url=http://lucene.apache.org/hadoop)"/>
+  </item>
+</cheatsheet>

BIN
src/contrib/eclipse-plugin/resources/drive100x100.gif


BIN
src/contrib/eclipse-plugin/resources/drive16x16.gif


BIN
src/contrib/eclipse-plugin/resources/driver.png


BIN
src/contrib/eclipse-plugin/resources/driverwiz.png


BIN
src/contrib/eclipse-plugin/resources/elephantblue16x16.gif


BIN
src/contrib/eclipse-plugin/resources/files.gif


BIN
src/contrib/eclipse-plugin/resources/hadoop.gif


BIN
src/contrib/eclipse-plugin/resources/hadoop_small.gif


BIN
src/contrib/eclipse-plugin/resources/job.gif


BIN
src/contrib/eclipse-plugin/resources/map16x16.gif


BIN
src/contrib/eclipse-plugin/resources/mapper16.png


BIN
src/contrib/eclipse-plugin/resources/mapwiz.png


BIN
src/contrib/eclipse-plugin/resources/projwiz.png


BIN
src/contrib/eclipse-plugin/resources/reduce100x100.gif


BIN
src/contrib/eclipse-plugin/resources/reduce16x16.gif


BIN
src/contrib/eclipse-plugin/resources/reducer-16x16.gif


BIN
src/contrib/eclipse-plugin/resources/reducer16.png


BIN
src/contrib/eclipse-plugin/resources/reducewiz.png


BIN
src/contrib/eclipse-plugin/resources/spite_overcloud.png


BIN
src/contrib/eclipse-plugin/resources/spitesmall.gif


BIN
src/contrib/eclipse-plugin/resources/spitesmall.png


+ 67 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/Activator.java

@@ -0,0 +1,67 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import org.apache.hadoop.eclipse.servers.ServerRegistry;
+import org.eclipse.ui.plugin.AbstractUIPlugin;
+import org.osgi.framework.BundleContext;
+
+/**
+ * The activator class controls the plug-in life cycle
+ */
+public class Activator extends AbstractUIPlugin {
+
+  // The plug-in ID
+  public static final String PLUGIN_ID =
+      "org.apache.hadoop.eclipse.Hadouken";
+
+  // The shared instance
+  private static Activator plugin;
+
+  /**
+   * The constructor
+   */
+  public Activator() {
+    plugin = this;
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void start(BundleContext context) throws Exception {
+    super.start(context);
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void stop(BundleContext context) throws Exception {
+    ServerRegistry.getInstance().dispose();
+    plugin = null;
+    super.stop(context);
+  }
+
+  /**
+   * Returns the shared instance
+   * 
+   * @return the shared instance
+   */
+  public static Activator getDefault() {
+    return plugin;
+  }
+
+}

+ 95 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/HadoopPerspectiveFactory.java

@@ -0,0 +1,95 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import org.eclipse.debug.ui.IDebugUIConstants;
+import org.eclipse.jdt.ui.JavaUI;
+import org.eclipse.ui.IFolderLayout;
+import org.eclipse.ui.IPageLayout;
+import org.eclipse.ui.IPerspectiveFactory;
+import org.eclipse.ui.console.IConsoleConstants;
+
+/**
+ * Creates links to the new MapReduce-based wizards and views for a MapReduce
+ * perspective
+ * 
+ */
+
+public class HadoopPerspectiveFactory implements IPerspectiveFactory {
+
+  public void createInitialLayout(IPageLayout layout) {
+    layout.addNewWizardShortcut("org.apache.hadoop.eclipse.NewDriverWizard");
+    layout.addNewWizardShortcut("org.apache.hadoop.eclipse.NewMapperWizard");
+    layout
+        .addNewWizardShortcut("org.apache.hadoop.eclipse.NewReducerWizard");
+
+    IFolderLayout left =
+        layout.createFolder("org.apache.hadoop.eclipse.perspective.left",
+            IPageLayout.LEFT, 0.2f, layout.getEditorArea());
+    left.addView("org.eclipse.ui.navigator.ProjectExplorer");
+
+    IFolderLayout bottom =
+        layout.createFolder("org.apache.hadoop.eclipse.perspective.bottom",
+            IPageLayout.BOTTOM, 0.7f, layout.getEditorArea());
+    bottom.addView(IPageLayout.ID_PROBLEM_VIEW);
+    bottom.addView(IPageLayout.ID_TASK_LIST);
+    bottom.addView(JavaUI.ID_JAVADOC_VIEW);
+    bottom.addView("org.apache.hadoop.eclipse.view.servers");
+    bottom.addPlaceholder(JavaUI.ID_SOURCE_VIEW);
+    bottom.addPlaceholder(IPageLayout.ID_PROGRESS_VIEW);
+    bottom.addPlaceholder(IConsoleConstants.ID_CONSOLE_VIEW);
+    bottom.addPlaceholder(IPageLayout.ID_BOOKMARKS);
+
+    IFolderLayout right =
+        layout.createFolder("org.apache.hadoop.eclipse.perspective.right",
+            IPageLayout.RIGHT, 0.8f, layout.getEditorArea());
+    right.addView(IPageLayout.ID_OUTLINE);
+    right.addView("org.eclipse.ui.cheatsheets.views.CheatSheetView");
+    // right.addView(layout.ID); .. cheat sheet here
+
+    layout.addActionSet(IDebugUIConstants.LAUNCH_ACTION_SET);
+    layout.addActionSet(JavaUI.ID_ACTION_SET);
+    layout.addActionSet(JavaUI.ID_CODING_ACTION_SET);
+    layout.addActionSet(JavaUI.ID_ELEMENT_CREATION_ACTION_SET);
+    layout.addActionSet(IPageLayout.ID_NAVIGATE_ACTION_SET);
+    layout.addActionSet(JavaUI.ID_SEARCH_ACTION_SET);
+
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewPackageCreationWizard");
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewClassCreationWizard");
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewInterfaceCreationWizard");
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewEnumCreationWizard");
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewAnnotationCreationWizard");
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewSourceFolderCreationWizard");
+    layout
+        .addNewWizardShortcut("org.eclipse.jdt.ui.wizards.NewSnippetFileCreationWizard");
+    layout.addNewWizardShortcut("org.eclipse.ui.wizards.new.folder");
+    layout.addNewWizardShortcut("org.eclipse.ui.wizards.new.file");
+    layout
+        .addNewWizardShortcut("org.eclipse.ui.editors.wizards.UntitledTextFileWizard");
+
+    // CheatSheetViewerFactory.createCheatSheetView().setInput("org.apache.hadoop.eclipse.cheatsheet");
+  }
+
+}

+ 80 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/JSchUtilities.java

@@ -0,0 +1,80 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import java.util.Properties;
+
+import org.eclipse.core.runtime.Platform;
+
+import com.jcraft.jsch.JSch;
+
+/**
+ * Creates a JSCH object so that we can use the JSCH methods for connecting to
+ * remote servers via SSH/SCP.
+ */
+
+public class JSchUtilities {
+
+  static String SSH_HOME_DEFAULT = null;
+  static {
+    String ssh_dir_name = ".ssh"; //$NON-NLS-1$
+
+    // Windows doesn't like files or directories starting with a dot.
+    if (Platform.getOS().equals(Platform.OS_WIN32)) {
+      ssh_dir_name = "ssh"; //$NON-NLS-1$
+    }
+
+    SSH_HOME_DEFAULT = System.getProperty("user.home"); //$NON-NLS-1$
+    if (SSH_HOME_DEFAULT != null) {
+      SSH_HOME_DEFAULT = SSH_HOME_DEFAULT + java.io.File.separator
+          + ssh_dir_name;
+    }
+  }
+
+  public synchronized static JSch createJSch() {
+
+    // IPreferenceStore store = CVSSSH2Plugin.getDefault().getPreferenceStore();
+    // String ssh_home = store.getString(SSH_HOME_DEFAULT);
+    String ssh_home = SSH_HOME_DEFAULT;
+
+    Properties props = new Properties();
+    props.setProperty("StrictHostKeyChecking", "no");
+
+    JSch jsch = new JSch();
+    JSch.setConfig(props);
+    /*
+     * JSch.setLogger(new Logger() { public boolean isEnabled(int level) {
+     * return true; }
+     * 
+     * public void log(int level, String message) { System.out.println("JSCH
+     * Level " + level + ": " + message); } });
+     */
+
+    try {
+      java.io.File file;
+      file = new java.io.File(ssh_home, "known_hosts"); //$NON-NLS-1$
+      jsch.setKnownHosts(file.getPath());
+    } catch (Exception e) {
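+      // known_hosts may not exist yet; continue without preloaded host keys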
+    }
+
+    return jsch;
+  }
+}

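A usage sketch for the factory above; the host name, user, and password are placeholders, and a real caller would obtain them from the server settings UI:

import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SshSketch {
  public static void main(String[] args) throws Exception {
    JSch jsch = JSchUtilities.createJSch();
    Session session = jsch.getSession("hadoop", "cluster.example.com", 22);
    session.setPassword("secret"); // or public-key auth via jsch.addIdentity(...)
    session.connect();
    // open exec/sftp channels here to run commands or copy files
    session.disconnect();
  }
}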
+ 146 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/MapReduceNature.java

@@ -0,0 +1,146 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import java.io.File;
+import java.io.FileFilter;
+import java.net.URL;
+import java.util.ArrayList;
+import java.util.Iterator;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.eclipse.core.resources.IProject;
+import org.eclipse.core.resources.IProjectNature;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.NullProgressMonitor;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.core.runtime.QualifiedName;
+import org.eclipse.jdt.core.IClasspathEntry;
+import org.eclipse.jdt.core.IJavaProject;
+import org.eclipse.jdt.core.JavaCore;
+
+/**
+ * Class to configure and deconfigure an Eclipse project with the MapReduce
+ * project nature.
+ */
+
+public class MapReduceNature implements IProjectNature {
+
+  public static final String ID = "org.apache.hadoop.eclipse.Nature";
+
+  private IProject project;
+
+  static Logger log = Logger.getLogger(MapReduceNature.class.getName());
+
+  /**
+   * Configures an Eclipse project as a MapReduce project by adding the
+   * Hadoop libraries to a project's classpath.
+   */
+  public void configure() throws CoreException {
+    String path =
+        project.getPersistentProperty(new QualifiedName(Activator.PLUGIN_ID,
+            "hadoop.runtime.path"));
+
+    File dir = new File(path);
+    final ArrayList<File> coreJars = new ArrayList<File>();
+    dir.listFiles(new FileFilter() {
+      public boolean accept(File pathname) {
+        String fileName = pathname.getName();
+
+        // get the hadoop core jar without touching test or examples
+        // older versions of hadoop don't use the word "core" -- eyhung
+        if ((fileName.indexOf("hadoop") != -1) && (fileName.endsWith("jar"))
+            && (fileName.indexOf("test") == -1)
+            && (fileName.indexOf("examples") == -1)) {
+          coreJars.add(pathname);
+        }
+
+        return false; // we don't care what this returns
+      }
+    });
+    File dir2 = new File(path + File.separatorChar + "lib");
+    if (dir2.exists() && dir2.isDirectory()) {
+      dir2.listFiles(new FileFilter() {
+        public boolean accept(File pathname) {
+          if ((!pathname.isDirectory())
+              && (pathname.getName().endsWith("jar"))) {
+            coreJars.add(pathname);
+          }
+
+          return false; // we don't care what this returns
+        }
+      });
+    }
+
+    // Add Hadoop libraries onto classpath
+    IJavaProject javaProject = JavaCore.create(getProject());
+    // Bundle bundle = Activator.getDefault().getBundle();
+    try {
+      IClasspathEntry[] currentCp = javaProject.getRawClasspath();
+      IClasspathEntry[] newCp =
+          new IClasspathEntry[currentCp.length + coreJars.size()];
+      System.arraycopy(currentCp, 0, newCp, 0, currentCp.length);
+
+      final Iterator i = coreJars.iterator();
+      int count = 0;
+      while (i.hasNext()) {
+        // for (int i = 0; i < s_coreJarNames.length; i++) {
+
+        final File f = (File) i.next();
+        // URL url = FileLocator.toFileURL(FileLocator.find(bundle, new
+        // Path("lib/" + s_coreJarNames[i]), null));
+        URL url = f.toURL();
+        log.finer("hadoop library url.getPath() = " + url.getPath());
+
+        newCp[newCp.length - 1 - count] =
+            JavaCore.newLibraryEntry(new Path(url.getPath()), null, null);
+        count++;
+      }
+
+      javaProject.setRawClasspath(newCp, new NullProgressMonitor());
+    } catch (Exception e) {
+      log.log(Level.SEVERE,
+          "Exception while adding Hadoop libraries to the project classpath", e);
+    }
+  }
+
+  /**
+   * Deconfigure a project from MapReduce status. Currently unimplemented.
+   */
+  public void deconfigure() throws CoreException {
+    // TODO Auto-generated method stub
+
+  }
+
+  /**
+   * Returns the project to which this project nature applies.
+   */
+  public IProject getProject() {
+    return this.project;
+  }
+
+  /**
+   * Sets the project to which this nature applies. Used when instantiating
+   * this project nature runtime.
+   */
+  public void setProject(IProject project) {
+    this.project = project;
+  }
+}

+ 99 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewDriverWizard.java

@@ -0,0 +1,99 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import org.eclipse.core.resources.IFile;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.jdt.core.IJavaElement;
+import org.eclipse.jdt.internal.ui.wizards.NewElementWizard;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.ui.INewWizard;
+import org.eclipse.ui.IWorkbench;
+
+/**
+ * Wizard for creating a new Driver class (a class that runs a MapReduce job).
+ * 
+ */
+
+public class NewDriverWizard extends NewElementWizard implements INewWizard,
+    IRunnableWithProgress {
+  private NewDriverWizardPage page;
+
+  /*
+   * @Override public boolean performFinish() { }
+   */
+  public void run(IProgressMonitor monitor) {
+    try {
+      page.createType(monitor);
+    } catch (CoreException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    } catch (InterruptedException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    }
+  }
+
+  public NewDriverWizard() {
+    setWindowTitle("New MapReduce Driver");
+  }
+
+  @Override
+  public void init(IWorkbench workbench, IStructuredSelection selection) {
+    super.init(workbench, selection);
+
+    page = new NewDriverWizardPage();
+    addPage(page);
+    page.setSelection(selection);
+  }
+
+  @Override
+  /**
+   * Performs any actions appropriate in response to the user having pressed the
+   * Finish button, or refuse if finishing now is not permitted.
+   */
+  public boolean performFinish() {
+    if (super.performFinish()) {
+      if (getCreatedElement() != null) {
+        selectAndReveal(page.getModifiedResource());
+        openResource((IFile) page.getModifiedResource());
+      }
+
+      return true;
+    } else {
+      return false;
+    }
+  }
+
+  @Override
+  /**
+   * Creates the new driver type once the wizard finishes.
+   */
+  protected void finishPage(IProgressMonitor monitor)
+      throws InterruptedException, CoreException {
+    this.run(monitor);
+  }
+
+  @Override
+  public IJavaElement getCreatedElement() {
+    return page.getCreatedType().getPrimaryElement();
+  }
+}

+ 270 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewDriverWizardPage.java

@@ -0,0 +1,270 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.eclipse;
+
+import java.io.IOException;
+import java.util.ArrayList;
+
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jdt.core.IType;
+import org.eclipse.jdt.core.JavaModelException;
+import org.eclipse.jdt.core.search.SearchEngine;
+import org.eclipse.jdt.ui.IJavaElementSearchConstants;
+import org.eclipse.jdt.ui.JavaUI;
+import org.eclipse.jdt.ui.wizards.NewTypeWizardPage;
+import org.eclipse.jface.dialogs.ProgressMonitorDialog;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.jface.window.Window;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Button;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.Event;
+import org.eclipse.swt.widgets.Label;
+import org.eclipse.swt.widgets.Listener;
+import org.eclipse.swt.widgets.Text;
+import org.eclipse.ui.dialogs.SelectionDialog;
+
+/**
+ * Pre-fills the new MapReduce driver class with a template.
+ * 
+ */
+
+public class NewDriverWizardPage extends NewTypeWizardPage {
+  private Button isCreateMapMethod;
+
+  private Text reducerText;
+
+  private Text mapperText;
+
+  private final boolean showContainerSelector;
+
+  public NewDriverWizardPage() {
+    this(true);
+  }
+
+  public NewDriverWizardPage(boolean showContainerSelector) {
+    super(true, "MapReduce Driver");
+
+    this.showContainerSelector = showContainerSelector;
+    setTitle("MapReduce Driver");
+    setDescription("Create a new MapReduce driver.");
+    try {
+      setImageDescriptor(ImageDescriptor.createFromURL((FileLocator
+          .toFileURL(FileLocator.find(Activator.getDefault().getBundle(),
+              new Path("resources/driverwiz.png"), null)))));
+    } catch (IOException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    }
+  }
+
+  public void setSelection(IStructuredSelection selection) {
+    initContainerPage(getInitialJavaElement(selection));
+    initTypePage(getInitialJavaElement(selection));
+  }
+
+  @Override
+  /**
+   * Creates the new type using the entered field values.
+   */
+  public void createType(IProgressMonitor monitor) throws CoreException,
+      InterruptedException {
+    super.createType(monitor);
+  }
+
+  @Override
+  protected void createTypeMembers(final IType newType, ImportsManager imports,
+      final IProgressMonitor monitor) throws CoreException {
+    super.createTypeMembers(newType, imports, monitor);
+    imports.addImport("org.apache.hadoop.fs.Path");
+    imports.addImport("org.apache.hadoop.io.Text");
+    imports.addImport("org.apache.hadoop.io.IntWritable");
+    imports.addImport("org.apache.hadoop.mapred.JobClient");
+    imports.addImport("org.apache.hadoop.mapred.JobConf");
+    imports.addImport("org.apache.hadoop.mapred.Reducer");
+    imports.addImport("org.apache.hadoop.mapred.Mapper");
+
+    /**
+     * TODO(jz) - move most code out of the runnable
+     */
+    getContainer().getShell().getDisplay().syncExec(new Runnable() {
+      public void run() {
+
+        String method = "public static void main(String[] args) {\n JobClient client = new JobClient();";
+        method += "JobConf conf = new JobConf("
+            + newType.getFullyQualifiedName() + ".class);\n\n";
+
+        method += "// TODO: specify output types\nconf.setOutputKeyClass(Text.class);\nconf.setOutputValueClass(IntWritable.class);\n\n";
+
+        method += "// TODO: specify input and output DIRECTORIES (not files)\nconf.setInputPath(new Path(\"src\"));\nconf.setOutputPath(new Path(\"out\"));\n\n";
+
+        if (mapperText.getText().length() > 0) {
+          method += "conf.setMapperClass(" + mapperText.getText()
+              + ".class);\n\n";
+        } else {
+          method += "// TODO: specify a mapper\nconf.setMapperClass(org.apache.hadoop.mapred.lib.IdentityMapper.class);\n\n";
+        }
+        if (reducerText.getText().length() > 0) {
+          method += "conf.setReducerClass(" + reducerText.getText()
+              + ".class);\n\n";
+        } else {
+          method += "// TODO: specify a reducer\nconf.setReducerClass(org.apache.hadoop.mapred.lib.IdentityReducer.class);\n\n";
+        }
+
+        method += "client.setConf(conf);\n";
+        method += "try {\n\tJobClient.runJob(conf);\n} catch (Exception e) {\n"
+            + "\te.printStackTrace();\n}\n";
+        method += "}\n";
+
+        try {
+          newType.createMethod(method, null, false, monitor);
+        } catch (JavaModelException e) {
+          // TODO Auto-generated catch block
+          e.printStackTrace();
+        }
+      }
+    });
+  }
+
+  public void createControl(Composite parent) {
+    // super.createControl(parent);
+
+    initializeDialogUnits(parent);
+    Composite composite = new Composite(parent, SWT.NONE);
+    GridLayout layout = new GridLayout();
+    layout.numColumns = 4;
+    composite.setLayout(layout);
+
+    createContainerControls(composite, 4);
+
+    createPackageControls(composite, 4);
+    createSeparator(composite, 4);
+    createTypeNameControls(composite, 4);
+
+    createSuperClassControls(composite, 4);
+    createSuperInterfacesControls(composite, 4);
+    createSeparator(composite, 4);
+
+    createMapperControls(composite);
+    createReducerControls(composite);
+
+    if (!showContainerSelector) {
+      setPackageFragmentRoot(null, false);
+      setSuperClass("java.lang.Object", false);
+      setSuperInterfaces(new ArrayList(), false);
+    }
+
+    setControl(composite);
+
+    setFocus();
+    handleFieldChanged(CONTAINER);
+
+    // setSuperClass("org.apache.hadoop.mapred.MapReduceBase", true);
+    // setSuperInterfaces(Arrays.asList(new String[]{
+    // "org.apache.hadoop.mapred.Mapper" }), true);
+  }
+
+  @Override
+  protected void handleFieldChanged(String fieldName) {
+    super.handleFieldChanged(fieldName);
+
+    validate();
+  }
+
+  private void validate() {
+    if (showContainerSelector) {
+      updateStatus(new IStatus[] { fContainerStatus, fPackageStatus,
+          fTypeNameStatus, fSuperClassStatus, fSuperInterfacesStatus });
+    } else {
+      updateStatus(new IStatus[] { fTypeNameStatus, });
+    }
+  }
+
+  private void createMapperControls(Composite composite) {
+    this.mapperText = createBrowseClassControl(composite, "Ma&pper:",
+        "&Browse...", "org.apache.hadoop.mapred.Mapper", "Mapper Selection");
+  }
+
+  private void createReducerControls(Composite composite) {
+    this.reducerText = createBrowseClassControl(composite, "&Reducer:",
+        "Browse&...", "org.apache.hadoop.mapred.Reducer", "Reducer Selection");
+  }
+
+  private Text createBrowseClassControl(final Composite composite,
+      final String string, String browseButtonLabel,
+      final String baseClassName, final String dialogTitle) {
+    Label label = new Label(composite, SWT.NONE);
+    GridData data = new GridData(GridData.FILL_HORIZONTAL);
+    label.setText(string);
+    label.setLayoutData(data);
+
+    final Text text = new Text(composite, SWT.SINGLE | SWT.BORDER);
+    GridData data2 = new GridData(GridData.FILL_HORIZONTAL);
+    data2.horizontalSpan = 2;
+    text.setLayoutData(data2);
+
+    Button browse = new Button(composite, SWT.NONE);
+    browse.setText(browseButtonLabel);
+    GridData data3 = new GridData(GridData.FILL_HORIZONTAL);
+    browse.setLayoutData(data3);
+    browse.addListener(SWT.Selection, new Listener() {
+      public void handleEvent(Event event) {
+        IType baseType;
+        try {
+          baseType = getPackageFragmentRoot().getJavaProject().findType(
+              baseClassName);
+
+          // edit this to limit the scope
+          SelectionDialog dialog = JavaUI.createTypeDialog(
+              composite.getShell(), new ProgressMonitorDialog(composite
+                  .getShell()), SearchEngine.createHierarchyScope(baseType),
+              IJavaElementSearchConstants.CONSIDER_CLASSES, false);
+
+          dialog.setMessage("&Choose a type:");
+          dialog.setBlockOnOpen(true);
+          dialog.setTitle(dialogTitle);
+          dialog.open();
+
+          if ((dialog.getReturnCode() == Window.OK)
+              && (dialog.getResult().length > 0)) {
+            IType type = (IType) dialog.getResult()[0];
+            text.setText(type.getFullyQualifiedName());
+          }
+        } catch (JavaModelException e) {
+          // TODO Auto-generated catch block
+          e.printStackTrace();
+        }
+      }
+    });
+
+    if (!showContainerSelector) {
+      label.setEnabled(false);
+      text.setEnabled(false);
+      browse.setEnabled(false);
+    }
+
+    return text;
+  }
+}

+ 412 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewMapReduceProjectWizard.java

@@ -0,0 +1,412 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import java.io.File;
+import java.io.FilenameFilter;
+import java.lang.reflect.InvocationTargetException;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.preferences.HadoopHomeDirPreferencePage;
+import org.apache.hadoop.eclipse.preferences.PreferenceConstants;
+import org.eclipse.core.resources.IProject;
+import org.eclipse.core.resources.IProjectDescription;
+import org.eclipse.core.resources.ResourcesPlugin;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.IConfigurationElement;
+import org.eclipse.core.runtime.IExecutableExtension;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.NullProgressMonitor;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.core.runtime.QualifiedName;
+import org.eclipse.core.runtime.SubProgressMonitor;
+import org.eclipse.jdt.ui.wizards.NewJavaProjectWizardPage;
+import org.eclipse.jface.dialogs.IDialogConstants;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.jface.preference.PreferenceDialog;
+import org.eclipse.jface.preference.PreferenceManager;
+import org.eclipse.jface.preference.PreferenceNode;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.jface.wizard.IWizardPage;
+import org.eclipse.jface.wizard.Wizard;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.events.SelectionEvent;
+import org.eclipse.swt.events.SelectionListener;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Button;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.DirectoryDialog;
+import org.eclipse.swt.widgets.Group;
+import org.eclipse.swt.widgets.Link;
+import org.eclipse.swt.widgets.Text;
+import org.eclipse.ui.IWorkbench;
+import org.eclipse.ui.IWorkbenchWizard;
+import org.eclipse.ui.PlatformUI;
+import org.eclipse.ui.dialogs.WizardNewProjectCreationPage;
+import org.eclipse.ui.wizards.newresource.BasicNewProjectResourceWizard;
+
+/**
+ * Wizard for creating a new MapReduce Project
+ * 
+ */
+
+public class NewMapReduceProjectWizard extends Wizard implements
+    IWorkbenchWizard, IExecutableExtension {
+  static Logger log = Logger.getLogger(NewMapReduceProjectWizard.class
+      .getName());
+
+  private HadoopFirstPage firstPage;
+
+  private NewJavaProjectWizardPage javaPage;
+
+  public NewDriverWizardPage newDriverPage;
+
+  private IConfigurationElement config;
+
+  public NewMapReduceProjectWizard() {
+    setWindowTitle("New MapReduce Project Wizard");
+  }
+
+  public void init(IWorkbench workbench, IStructuredSelection selection) {
+
+  }
+
+  @Override
+  public boolean canFinish() {
+    return firstPage.isPageComplete() && javaPage.isPageComplete()
+    // && ((!firstPage.generateDriver.getSelection())
+    // || newDriverPage.isPageComplete()
+    ;
+  }
+
+  @Override
+  public IWizardPage getNextPage(IWizardPage page) {
+    // if (page == firstPage
+    // && firstPage.generateDriver.getSelection()
+    // )
+    // {
+    // return newDriverPage; // if "generate mapper" checked, second page is
+    // new driver page
+    // }
+    // else
+    // {
+    IWizardPage answer = super.getNextPage(page);
+    if (answer == newDriverPage) {
+      // don't flip to the new driver page unless "generate driver" is checked
+      return null;
+    }
+    return answer;
+    // }
+  }
+
+  @Override
+  public IWizardPage getPreviousPage(IWizardPage page) {
+    if (page == newDriverPage) {
+      return firstPage; // newDriverPage, if it appears, is the second
+      // page
+    } else {
+      return super.getPreviousPage(page);
+    }
+  }
+
+  static class HadoopFirstPage extends WizardNewProjectCreationPage implements
+      SelectionListener {
+    public HadoopFirstPage() {
+      super("New Hadoop Project");
+
+      try {
+        setImageDescriptor(ImageDescriptor.createFromURL((FileLocator
+            .toFileURL(FileLocator.find(Activator.getDefault().getBundle(),
+                new Path("resources/projwiz.png"), null)))));
+      } catch (Exception e) {
+        // TODO Auto-generated catch block
+        e.printStackTrace();
+      }
+    }
+
+    private Link openPreferences;
+
+    private Button workspaceHadoop;
+
+    private Button projectHadoop;
+
+    private Text location;
+
+    private Button browse;
+
+    private String path;
+
+    public String currentPath;
+
+    private Button generateDriver;
+
+    @Override
+    public void createControl(Composite parent) {
+      super.createControl(parent);
+
+      setTitle("MapReduce Project");
+      setDescription("Create a MapReduce project.");
+
+      Group group = new Group((Composite) getControl(), SWT.NONE);
+      group.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+      group.setText("Hadoop MapReduce Library Installation Path");
+      GridLayout layout = new GridLayout(3, true);
+      layout.marginLeft = convertHorizontalDLUsToPixels(IDialogConstants.HORIZONTAL_MARGIN);
+      layout.marginRight = convertHorizontalDLUsToPixels(IDialogConstants.HORIZONTAL_MARGIN);
+      layout.marginTop = convertHorizontalDLUsToPixels(IDialogConstants.VERTICAL_MARGIN);
+      layout.marginBottom = convertHorizontalDLUsToPixels(IDialogConstants.VERTICAL_MARGIN);
+      group.setLayout(layout);
+
+      workspaceHadoop = new Button(group, SWT.RADIO);
+      GridData d = new GridData(GridData.BEGINNING, GridData.BEGINNING, false,
+          false);
+      d.horizontalSpan = 2;
+      workspaceHadoop.setLayoutData(d);
+      // workspaceHadoop.setText("Use default workbench Hadoop library
+      // location");
+      workspaceHadoop.setSelection(true);
+
+      updateHadoopDirLabelFromPreferences();
+
+      openPreferences = new Link(group, SWT.NONE);
+      openPreferences.setText("<a>Configure Hadoop install directory...</a>");
+      openPreferences.setLayoutData(new GridData(GridData.END, GridData.CENTER,
+          false, false));
+      openPreferences.addSelectionListener(this);
+
+      projectHadoop = new Button(group, SWT.RADIO);
+      projectHadoop.setLayoutData(new GridData(GridData.BEGINNING,
+          GridData.CENTER, false, false));
+      projectHadoop.setText("Specify Hadoop library location");
+
+      location = new Text(group, SWT.SINGLE | SWT.BORDER);
+      location.setText("");
+      d = new GridData(GridData.END, GridData.CENTER, true, false);
+      d.horizontalSpan = 1;
+      d.widthHint = 250;
+      d.grabExcessHorizontalSpace = true;
+      location.setLayoutData(d);
+      location.setEnabled(false);
+
+      browse = new Button(group, SWT.NONE);
+      browse.setText("Browse...");
+      browse.setLayoutData(new GridData(GridData.BEGINNING, GridData.CENTER,
+          false, false));
+      browse.setEnabled(false);
+      browse.addSelectionListener(this);
+
+      projectHadoop.addSelectionListener(this);
+      workspaceHadoop.addSelectionListener(this);
+
+      // generateDriver = new Button((Composite) getControl(), SWT.CHECK);
+      // generateDriver.setText("Generate a MapReduce driver");
+      // generateDriver.addListener(SWT.Selection, new Listener()
+      // {
+      // public void handleEvent(Event event) {
+      // getContainer().updateButtons(); }
+      // });
+    }
+
+    @Override
+    public boolean isPageComplete() {
+      boolean validHadoop = validateHadoopLocation();
+
+      if (!validHadoop && isCurrentPage()) {
+        setErrorMessage("Invalid Hadoop runtime specified; please click 'Configure Hadoop install directory' or fill in the library location field");
+      } else {
+        setErrorMessage(null);
+      }
+
+      return super.isPageComplete() && validHadoop;
+    }
+
+    private boolean validateHadoopLocation() {
+      FilenameFilter gotHadoopJar = new FilenameFilter() {
+        public boolean accept(File dir, String name) {
+          return (name.startsWith("hadoop") && name.endsWith(".jar")
+              && (name.indexOf("test") == -1) && (name.indexOf("examples") == -1));
+        }
+      };
+
+      if (workspaceHadoop.getSelection()) {
+        this.currentPath = path;
+      } else {
+        this.currentPath = location.getText();
+      }
+
+      File dir = new Path(this.currentPath).toFile();
+      // File.list() returns null if the path is not a directory
+      String[] jars = dir.isDirectory() ? dir.list(gotHadoopJar) : null;
+      return (jars != null) && (jars.length > 0);
+    }
+
+    private void updateHadoopDirLabelFromPreferences() {
+      path = Activator.getDefault().getPreferenceStore().getString(
+          PreferenceConstants.P_PATH);
+
+      if ((path != null) && (path.length() > 0)) {
+        workspaceHadoop.setText("Use default Hadoop");
+      } else {
+        workspaceHadoop.setText("Use default Hadoop (currently not set)");
+      }
+    }
+
+    public void widgetDefaultSelected(SelectionEvent e) {
+    }
+
+    public void widgetSelected(SelectionEvent e) {
+      if (e.getSource() == openPreferences) {
+        PreferenceManager manager = new PreferenceManager();
+        manager.addToRoot(new PreferenceNode("Hadoop Installation Directory",
+            new HadoopHomeDirPreferencePage()));
+        PreferenceDialog dialog = new PreferenceDialog(this.getShell(), manager);
+        dialog.create();
+        dialog.setMessage("Select Hadoop Installation Directory");
+        dialog.setBlockOnOpen(true);
+        dialog.open();
+
+        updateHadoopDirLabelFromPreferences();
+      } else if (e.getSource() == browse) {
+        DirectoryDialog dialog = new DirectoryDialog(this.getShell());
+        dialog
+            .setMessage("Select a Hadoop installation directory containing hadoop-X-core.jar");
+        dialog.setText("Select Hadoop Installation Directory");
+        String directory = dialog.open();
+
+        if (directory != null) {
+          location.setText(directory);
+
+          if (!validateHadoopLocation()) {
+            setErrorMessage("No Hadoop jar found in specified directory");
+          } else {
+            setErrorMessage(null);
+          }
+        }
+      } else if (projectHadoop.getSelection()) {
+        location.setEnabled(true);
+        browse.setEnabled(true);
+      } else {
+        location.setEnabled(false);
+        browse.setEnabled(false);
+      }
+
+      getContainer().updateButtons();
+    }
+  }
+
+  @Override
+  public void addPages() {
+    /*
+     * firstPage = new HadoopFirstPage(); addPage(firstPage ); addPage( new
+     * JavaProjectWizardSecondPage(firstPage) );
+     */
+
+    firstPage = new HadoopFirstPage();
+    javaPage = new NewJavaProjectWizardPage(ResourcesPlugin.getWorkspace()
+        .getRoot(), firstPage);
+    // newDriverPage = new NewDriverWizardPage(false);
+    // newDriverPage.setPageComplete(false); // ensure finish button
+    // initially disabled
+    addPage(firstPage);
+    addPage(javaPage);
+
+    // addPage(newDriverPage);
+  }
+
+  @Override
+  public boolean performFinish() {
+    try {
+      PlatformUI.getWorkbench().getProgressService().runInUI(
+          this.getContainer(), new IRunnableWithProgress() {
+            public void run(IProgressMonitor monitor) {
+              try {
+                monitor.beginTask("Create Hadoop Project", 300);
+
+                javaPage.getRunnable()
+                    .run(new SubProgressMonitor(monitor, 100));
+
+                // if( firstPage.generateDriver.getSelection())
+                // {
+                // newDriverPage.setPackageFragmentRoot(javaPage.getNewJavaProject().getAllPackageFragmentRoots()[0],
+                // false);
+                // newDriverPage.getRunnable().run(new
+                // SubProgressMonitor(monitor,100));
+                // }
+
+                IProject project = javaPage.getNewJavaProject().getResource()
+                    .getProject();
+                IProjectDescription description = project.getDescription();
+                String[] existingNatures = description.getNatureIds();
+                String[] natures = new String[existingNatures.length + 1];
+                for (int i = 0; i < existingNatures.length; i++) {
+                  natures[i + 1] = existingNatures[i];
+                }
+
+                natures[0] = MapReduceNature.ID;
+                description.setNatureIds(natures);
+
+                project.setPersistentProperty(new QualifiedName(
+                    Activator.PLUGIN_ID, "hadoop.runtime.path"),
+                    firstPage.currentPath);
+                project.setDescription(description, new NullProgressMonitor());
+
+                String[] natureIds = project.getDescription().getNatureIds();
+                for (int i = 0; i < natureIds.length; i++) {
+                  log.fine("Nature id # " + i + " > " + natureIds[i]);
+                }
+
+                monitor.worked(100);
+                monitor.done();
+
+                BasicNewProjectResourceWizard.updatePerspective(config);
+              } catch (CoreException e) {
+                // TODO Auto-generated catch block
+                log.log(Level.SEVERE, "CoreException thrown.", e);
+              } catch (InvocationTargetException e) {
+                // TODO Auto-generated catch block
+                e.printStackTrace();
+              } catch (InterruptedException e) {
+                // TODO Auto-generated catch block
+                e.printStackTrace();
+              }
+            }
+          }, null);
+    } catch (InvocationTargetException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    } catch (InterruptedException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    }
+
+    return true;
+  }
+
+  public void setInitializationData(IConfigurationElement config,
+      String propertyName, Object data) throws CoreException {
+    this.config = config;
+  }
+}

+ 189 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewMapperWizard.java

@@ -0,0 +1,189 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import java.io.IOException;
+import java.util.Arrays;
+
+import org.eclipse.core.resources.IFile;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jdt.core.IJavaElement;
+import org.eclipse.jdt.core.IType;
+import org.eclipse.jdt.internal.ui.wizards.NewElementWizard;
+import org.eclipse.jdt.ui.wizards.NewTypeWizardPage;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Button;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.ui.INewWizard;
+import org.eclipse.ui.IWorkbench;
+
+/**
+ * Wizard for creating a new Mapper class (a class that runs the Map portion of
+ * a MapReduce job). The class is pre-filled with a template.
+ * 
+ */
+
+public class NewMapperWizard extends NewElementWizard implements INewWizard,
+    IRunnableWithProgress {
+  private Page page;
+
+  public NewMapperWizard() {
+    setWindowTitle("New Mapper");
+  }
+
+  public void run(IProgressMonitor monitor) {
+    try {
+      page.createType(monitor);
+    } catch (CoreException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    } catch (InterruptedException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    }
+  }
+
+  @Override
+  public void init(IWorkbench workbench, IStructuredSelection selection) {
+    super.init(workbench, selection);
+
+    page = new Page();
+    addPage(page);
+    page.setSelection(selection);
+  }
+
+  public static class Page extends NewTypeWizardPage {
+    private Button isCreateMapMethod;
+
+    public Page() {
+      super(true, "Mapper");
+
+      setTitle("Mapper");
+      setDescription("Create a new Mapper implementation.");
+      try {
+        setImageDescriptor(ImageDescriptor.createFromURL((FileLocator
+            .toFileURL(FileLocator.find(Activator.getDefault().getBundle(),
+                new Path("resources/mapwiz.png"), null)))));
+      } catch (IOException e) {
+        // TODO Auto-generated catch block
+        e.printStackTrace();
+      }
+
+    }
+
+    public void setSelection(IStructuredSelection selection) {
+      initContainerPage(getInitialJavaElement(selection));
+      initTypePage(getInitialJavaElement(selection));
+    }
+
+    @Override
+    public void createType(IProgressMonitor monitor) throws CoreException,
+        InterruptedException {
+      super.createType(monitor);
+    }
+
+    @Override
+    protected void createTypeMembers(IType newType, ImportsManager imports,
+        IProgressMonitor monitor) throws CoreException {
+      super.createTypeMembers(newType, imports, monitor);
+      imports.addImport("java.io.IOException");
+      imports.addImport("org.apache.hadoop.io.WritableComparable");
+      imports.addImport("org.apache.hadoop.io.Writable");
+      imports.addImport("org.apache.hadoop.mapred.OutputCollector");
+      imports.addImport("org.apache.hadoop.mapred.Reporter");
+      newType
+          .createMethod(
+              "public void map(WritableComparable key, Writable values, OutputCollector output, Reporter reporter) throws IOException \n{\n}\n",
+              null, false, monitor);
+    }
+
+    public void createControl(Composite parent) {
+      // super.createControl(parent);
+
+      initializeDialogUnits(parent);
+      Composite composite = new Composite(parent, SWT.NONE);
+      GridLayout layout = new GridLayout();
+      layout.numColumns = 4;
+      composite.setLayout(layout);
+
+      createContainerControls(composite, 4);
+      createPackageControls(composite, 4);
+      createSeparator(composite, 4);
+      createTypeNameControls(composite, 4);
+      createSuperClassControls(composite, 4);
+      createSuperInterfacesControls(composite, 4);
+      // createSeparator(composite, 4);
+
+      setControl(composite);
+
+      setSuperClass("org.apache.hadoop.mapred.MapReduceBase", true);
+      setSuperInterfaces(Arrays
+          .asList(new String[] { "org.apache.hadoop.mapred.Mapper" }), true);
+
+      setFocus();
+      validate();
+    }
+
+    @Override
+    protected void handleFieldChanged(String fieldName) {
+      super.handleFieldChanged(fieldName);
+
+      validate();
+    }
+
+    private void validate() {
+      updateStatus(new IStatus[] { fContainerStatus, fPackageStatus,
+          fTypeNameStatus, fSuperClassStatus, fSuperInterfacesStatus });
+    }
+  }
+
+  @Override
+  public boolean performFinish() {
+    if (super.performFinish()) {
+      if (getCreatedElement() != null) {
+        openResource((IFile) page.getModifiedResource());
+        selectAndReveal(page.getModifiedResource());
+      }
+
+      return true;
+    } else {
+      return false;
+    }
+  }
+
+  @Override
+  protected void finishPage(IProgressMonitor monitor)
+      throws InterruptedException, CoreException {
+    this.run(monitor);
+  }
+
+  @Override
+  public IJavaElement getCreatedElement() {
+    return page.getCreatedType().getPrimaryElement();
+  }
+
+}

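One way the generated map() stub might be filled in: a word-count mapper against the 0.x mapred API used throughout this patch (the class name is illustrative):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class WordCountMapper extends MapReduceBase implements Mapper {
  private static final IntWritable one = new IntWritable(1);
  private final Text word = new Text();

  public void map(WritableComparable key, Writable values,
      OutputCollector output, Reporter reporter) throws IOException {
    // tokenize the input line and emit a <word, 1> pair per token
    StringTokenizer itr = new StringTokenizer(values.toString());
    while (itr.hasMoreTokens()) {
      word.set(itr.nextToken());
      output.collect(word, one);
    }
  }
}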
+ 192 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/NewReducerWizard.java

@@ -0,0 +1,192 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import java.io.IOException;
+import java.util.Arrays;
+
+import org.eclipse.core.resources.IFile;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jdt.core.IJavaElement;
+import org.eclipse.jdt.core.IType;
+import org.eclipse.jdt.internal.ui.wizards.NewElementWizard;
+import org.eclipse.jdt.ui.wizards.NewTypeWizardPage;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.ui.INewWizard;
+import org.eclipse.ui.IWorkbench;
+
+/**
+ * Wizard for creating a new Reducer class (a class that runs the Reduce portion
+ * of a MapReduce job). The class is pre-filled with a template.
+ * 
+ */
+
+public class NewReducerWizard extends NewElementWizard implements INewWizard,
+    IRunnableWithProgress {
+  private Page page;
+
+  public NewReducerWizard() {
+    setWindowTitle("New Reducer");
+  }
+
+  public void run(IProgressMonitor monitor) {
+    try {
+      page.createType(monitor);
+    } catch (CoreException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    } catch (InterruptedException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    }
+  }
+
+  @Override
+  public void init(IWorkbench workbench, IStructuredSelection selection) {
+    super.init(workbench, selection);
+
+    page = new Page();
+    addPage(page);
+    page.setSelection(selection);
+  }
+
+  public static class Page extends NewTypeWizardPage {
+    public Page() {
+      super(true, "Reducer");
+
+      setTitle("Reducer");
+      setDescription("Create a new Reducer implementation.");
+      try {
+        setImageDescriptor(ImageDescriptor.createFromURL((FileLocator
+            .toFileURL(FileLocator.find(Activator.getDefault().getBundle(),
+                new Path("resources/reducewiz.png"), null)))));
+      } catch (IOException e) {
+        // TODO Auto-generated catch block
+        e.printStackTrace();
+      }
+
+    }
+
+    public void setSelection(IStructuredSelection selection) {
+      initContainerPage(getInitialJavaElement(selection));
+      initTypePage(getInitialJavaElement(selection));
+    }
+
+    @Override
+    public void createType(IProgressMonitor monitor) throws CoreException,
+        InterruptedException {
+      super.createType(monitor);
+    }
+
+    @Override
+    protected void createTypeMembers(IType newType, ImportsManager imports,
+        IProgressMonitor monitor) throws CoreException {
+      super.createTypeMembers(newType, imports, monitor);
+      imports.addImport("java.io.IOException");
+      imports.addImport("org.apache.hadoop.io.WritableComparable");
+      imports.addImport("org.apache.hadoop.mapred.OutputCollector");
+      imports.addImport("org.apache.hadoop.mapred.Reporter");
+      imports.addImport("java.util.Iterator");
+      newType
+          .createMethod(
+              "public void reduce(WritableComparable _key, Iterator values, OutputCollector output, Reporter reporter) throws IOException \n{\n"
+                  + "\t// replace KeyType with the real type of your key\n"
+                  + "\tKeyType key = (KeyType) _key;\n\n"
+                  + "\twhile (values.hasNext()) {\n"
+                  + "\t\t// replace ValueType with the real type of your value\n"
+                  + "\t\tValueType value = (ValueType) values.next();\n\n"
+                  + "\t\t// process value\n" + "\t}\n" + "}\n", null, false,
+              monitor);
+    }
+
+    public void createControl(Composite parent) {
+      // super.createControl(parent);
+
+      initializeDialogUnits(parent);
+      Composite composite = new Composite(parent, SWT.NONE);
+      GridLayout layout = new GridLayout();
+      layout.numColumns = 4;
+      composite.setLayout(layout);
+
+      createContainerControls(composite, 4);
+      createPackageControls(composite, 4);
+      createSeparator(composite, 4);
+      createTypeNameControls(composite, 4);
+      createSuperClassControls(composite, 4);
+      createSuperInterfacesControls(composite, 4);
+      // createSeparator(composite, 4);
+
+      setControl(composite);
+
+      setSuperClass("org.apache.hadoop.mapred.MapReduceBase", true);
+      setSuperInterfaces(Arrays
+          .asList(new String[] { "org.apache.hadoop.mapred.Reducer" }), true);
+
+      setFocus();
+      validate();
+    }
+
+    @Override
+    protected void handleFieldChanged(String fieldName) {
+      super.handleFieldChanged(fieldName);
+
+      validate();
+    }
+
+    private void validate() {
+      updateStatus(new IStatus[] { fContainerStatus, fPackageStatus,
+          fTypeNameStatus, fSuperClassStatus, fSuperInterfacesStatus });
+    }
+  }
+
+  @Override
+  public boolean performFinish() {
+    if (super.performFinish()) {
+      if (getCreatedElement() != null) {
+        selectAndReveal(page.getModifiedResource());
+        openResource((IFile) page.getModifiedResource());
+      }
+
+      return true;
+    } else {
+      return false;
+    }
+  }
+
+  @Override
+  protected void finishPage(IProgressMonitor monitor)
+      throws InterruptedException, CoreException {
+    this.run(monitor);
+  }
+
+  @Override
+  public IJavaElement getCreatedElement() {
+    return (page.getCreatedType() == null) ? null : page.getCreatedType()
+        .getPrimaryElement();
+  }
+}

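The stub that createTypeMembers() generates above is deliberately non-generic (the mapred API of this era is not generified), so the user casts inside reduce(). Filled in for a sum-of-counts job, the generated class might end up looking like this sketch; the class name and the Text/IntWritable choices are illustrative, not part of the template:

    package examples; // hypothetical package

    import java.io.IOException;
    import java.util.Iterator;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.WritableComparable;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;

    public class SumReducer extends MapReduceBase implements Reducer {
      public void reduce(WritableComparable _key, Iterator values,
          OutputCollector output, Reporter reporter) throws IOException {
        Text key = (Text) _key; // "KeyType" from the template
        int sum = 0;
        while (values.hasNext()) {
          // "ValueType" from the template
          sum += ((IntWritable) values.next()).get();
        }
        output.collect(key, new IntWritable(sum));
      }
    }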
+ 43 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/PropertyTester.java

@@ -0,0 +1,43 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse;
+
+import java.util.logging.Logger;
+
+/**
+ * Property tester used for debugging: logs each evaluation and always returns true
+ */
+public class PropertyTester extends
+    org.eclipse.core.expressions.PropertyTester {
+
+  static Logger log = Logger.getLogger(PropertyTester.class.getName());
+
+  public PropertyTester() {
+  }
+
+  public boolean test(Object receiver, String property, Object[] args,
+      Object expectedValue) {
+    log.fine("Test property " + property + ", " + receiver.getClass());
+
+    // todo(jz) support test for deployable if module has hadoop nature etc.
+    return true;
+
+  }
+
+}

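As the trailing todo(jz) says, test() is a permissive placeholder. A sketch of the kind of check it might grow into, with the property name and the jar-suffix rule purely illustrative:

    public boolean test(Object receiver, String property, Object[] args,
        Object expectedValue) {
      log.fine("Test property " + property + ", " + receiver.getClass());

      if ("deployable".equals(property)) {
        // hypothetical rule: only jar-like resources are deployable
        return receiver.toString().endsWith(".jar");
      }
      return true; // keep the permissive default for everything else
    }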
+ 275 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/DfsAction.java

@@ -0,0 +1,275 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.actions;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.lang.reflect.InvocationTargetException;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.apache.hadoop.eclipse.dfs.DfsFile;
+import org.apache.hadoop.eclipse.dfs.DfsFolder;
+import org.apache.hadoop.eclipse.dfs.DfsPath;
+import org.eclipse.core.internal.runtime.AdapterManager;
+import org.eclipse.core.resources.IStorage;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.IPath;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jface.action.IAction;
+import org.eclipse.jface.dialogs.MessageDialog;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.ISelection;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.swt.widgets.DirectoryDialog;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.swt.widgets.Shell;
+import org.eclipse.ui.IObjectActionDelegate;
+import org.eclipse.ui.IPersistableElement;
+import org.eclipse.ui.ISharedImages;
+import org.eclipse.ui.IStorageEditorInput;
+import org.eclipse.ui.IWorkbenchPart;
+import org.eclipse.ui.PartInitException;
+import org.eclipse.ui.PlatformUI;
+
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.SftpException;
+
+public class DfsAction implements IObjectActionDelegate {
+
+  private ISelection selection;
+
+  private IWorkbenchPart targetPart;
+
+  /** {@inheritDoc} */
+  public void setActivePart(IAction action, IWorkbenchPart targetPart) {
+    this.targetPart = targetPart;
+  }
+
+  /** {@inheritDoc} */
+  public void run(IAction action) {
+
+    // Ignore non structured selections
+    if (!(this.selection instanceof IStructuredSelection))
+      return;
+
+    IStructuredSelection ss = (IStructuredSelection) selection;
+    String actionId = action.getActionDefinitionId();
+    try {
+      if (actionId.equals("dfs.delete"))
+        delete(ss);
+      else if (actionId.equals("dfs.open"))
+        open(ss);
+      else if (actionId.equals("dfs.put"))
+        put(ss);
+      else if (actionId.equals("dfs.refresh"))
+        refresh(ss);
+      else if (actionId.equals("dfs.get"))
+        get(ss);
+
+    } catch (Exception e) {
+      Shell shell = new Shell();
+      e.printStackTrace();
+      MessageDialog.openError(shell, "DFS Error",
+          "An error occurred while performing DFS operation: "
+              + e.getMessage());
+    }
+
+  }
+
+  /**
+   * Implement the import action (upload files from the current machine to
+   * HDFS)
+   * 
+   * @param selection the DFS folder(s) to upload the local directory into
+   * @throws SftpException
+   * @throws JSchException
+   * @throws InvocationTargetException
+   * @throws InterruptedException
+   */
+  private void put(IStructuredSelection selection) throws SftpException,
+      JSchException, InvocationTargetException, InterruptedException {
+
+    // Ask the user which local directory to upload
+    DirectoryDialog dialog =
+        new DirectoryDialog(Display.getCurrent().getActiveShell());
+    dialog.setText("Copy from local directory");
+    dialog.setMessage("Copy the local directory"
+        + " to the selected directories on the distributed filesystem");
+    String directory = dialog.open();
+
+    if (directory == null)
+      return;
+
+    for (DfsFolder folder : filterSelection(DfsFolder.class, selection))
+      folder.put(directory);
+  }
+
+  /**
+   * Implements the Download action from HDFS to the current machine
+   * 
+   * @param selection the DFS paths to download
+   * @throws SftpException
+   * @throws JSchException
+   */
+  private void get(IStructuredSelection selection) throws SftpException,
+      JSchException {
+
+    // Ask the user where to put the downloaded files
+    DirectoryDialog dialog =
+        new DirectoryDialog(Display.getCurrent().getActiveShell());
+    dialog.setText("Copy to local directory");
+    dialog.setMessage("Copy the selected files and directories from the "
+        + "distributed filesystem to a local directory");
+    String directory = dialog.open();
+
+    if (directory == null)
+      return;
+
+    for (DfsPath path : filterSelection(DfsPath.class, selection)) {
+      try {
+        path.downloadToLocalDirectory(directory);
+      } catch (Exception e) {
+        // keep going: report this path's failure and continue with the rest
+        e.printStackTrace();
+      }
+    }
+
+  }
+
+  /**
+   * Open the selected DfsPath in the editor window
+   * 
+   * @param selection
+   * @throws JSchException
+   * @throws IOException
+   * @throws PartInitException
+   * @throws InvocationTargetException
+   * @throws InterruptedException
+   */
+  private void open(IStructuredSelection selection) throws JSchException,
+      IOException, PartInitException, InvocationTargetException,
+      InterruptedException {
+
+    for (final DfsFile path : filterSelection(DfsFile.class, selection)) {
+
+      final InputStream data = path.open();
+      if (data == null)
+        continue;
+
+      final IStorage storage = new IStorage() {
+        public Object getAdapter(Class adapter) {
+          return AdapterManager.getDefault().getAdapter(this, adapter);
+        }
+
+        public boolean isReadOnly() {
+          return true;
+        }
+
+        public String getName() {
+          return path.toString();
+        }
+
+        public IPath getFullPath() {
+          return new Path(path.toString());
+        }
+
+        public InputStream getContents() throws CoreException {
+          return data;
+        }
+      };
+
+      IStorageEditorInput storageEditorInput = new IStorageEditorInput() {
+        public Object getAdapter(Class adapter) {
+          return null;
+        }
+
+        public String getToolTipText() {
+          return "";
+        }
+
+        public IPersistableElement getPersistable() {
+          return null;
+        }
+
+        public String getName() {
+          return path.toString();
+        }
+
+        public ImageDescriptor getImageDescriptor() {
+          return PlatformUI.getWorkbench().getSharedImages()
+              .getImageDescriptor(ISharedImages.IMG_OBJ_FILE);
+        }
+
+        public boolean exists() {
+          return true;
+        }
+
+        public IStorage getStorage() throws CoreException {
+          return storage;
+        }
+      };
+
+      targetPart.getSite().getWorkbenchWindow().getActivePage().openEditor(
+          storageEditorInput, "org.eclipse.ui.DefaultTextEditor");
+    }
+  }
+
+  private void refresh(IStructuredSelection selection) throws JSchException {
+    for (DfsPath path : filterSelection(DfsPath.class, selection))
+      path.refresh();
+
+  }
+
+  private void delete(IStructuredSelection selection) throws JSchException {
+    List<DfsPath> list = filterSelection(DfsPath.class, selection);
+    if (list.isEmpty())
+      return;
+
+    if (MessageDialog.openConfirm(null, "Confirm Delete from DFS",
+        "Are you sure you want to delete " + list + " from the DFS?")) {
+      for (DfsPath path : list)
+        path.delete();
+    }
+  }
+
+  /** {@inheritDoc} */
+  public void selectionChanged(IAction action, ISelection selection) {
+    this.selection = selection;
+  }
+
+  /**
+   * Extract the list of elements of type T from the structured selection
+   * 
+   * @param clazz the class T
+   * @param selection the structured selection
+   * @return the list of matching elements it contains
+   */
+  private <T> List<T> filterSelection(Class<T> clazz,
+      IStructuredSelection selection) {
+    List<T> list = new ArrayList<T>();
+    for (Object obj : selection.toList()) {
+      if (clazz.isInstance(obj)) {
+        list.add(clazz.cast(obj)); // checked cast instead of unchecked (T)
+      }
+    }
+    return list;
+  }
+
+}

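The filterSelection() helper above is a small reusable idiom: narrow a heterogeneous selection down to the elements of one type. A self-contained, runnable sketch of the same pattern on a plain List (all names illustrative):

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class FilterDemo {
      // Same pattern as DfsAction.filterSelection, minus the JFace types.
      static <T> List<T> filter(Class<T> clazz, List<?> items) {
        List<T> list = new ArrayList<T>();
        for (Object obj : items) {
          if (clazz.isInstance(obj)) {
            list.add(clazz.cast(obj)); // checked cast, no unchecked warning
          }
        }
        return list;
      }

      public static void main(String[] args) {
        List<Object> mixed = Arrays.<Object> asList("a", 1, "b", 2);
        System.out.println(filter(String.class, mixed)); // [a, b]
        System.out.println(filter(Integer.class, mixed)); // [1, 2]
      }
    }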
+ 86 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/EditServerAction.java

@@ -0,0 +1,86 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.actions;
+
+import java.io.IOException;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.apache.hadoop.eclipse.servers.DefineHadoopServerLocWizardPage;
+import org.apache.hadoop.eclipse.view.servers.ServerView;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jface.action.Action;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.wizard.Wizard;
+import org.eclipse.jface.wizard.WizardDialog;
+
+/**
+ * Editing server properties action
+ */
+public class EditServerAction extends Action {
+
+  private ServerView serverView;
+
+  public EditServerAction(ServerView serverView) {
+    this.serverView = serverView;
+
+    setText("Edit Hadoop Server");
+    try {
+      // TODO Edit server icon
+      setImageDescriptor(ImageDescriptor.createFromURL((FileLocator
+          .toFileURL(FileLocator.find(Activator.getDefault().getBundle(),
+              new Path("resources/hadoop_small.gif"), null)))));
+    } catch (IOException e) {
+      /* Ignore if no image */
+      e.printStackTrace();
+    }
+  }
+
+  @Override
+  public void run() {
+
+    final HadoopServer server = serverView.getSelectedServer();
+    if (server == null)
+      return;
+
+    WizardDialog dialog = new WizardDialog(null, new Wizard() {
+      private DefineHadoopServerLocWizardPage page =
+          new DefineHadoopServerLocWizardPage(server);
+
+      @Override
+      public void addPages() {
+        super.addPages();
+        setWindowTitle("Edit Hadoop Server Location");
+        addPage(page);
+      }
+
+      @Override
+      public boolean performFinish() {
+        return (page.performFinish() != null);
+      }
+    });
+
+    dialog.create();
+    dialog.setBlockOnOpen(true);
+    dialog.open();
+
+    super.run();
+  }
+}

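This action and NewServerAction below share the same idiom: wrap a single DefineHadoopServerLocWizardPage in an anonymous Wizard and block until the dialog closes. Reduced to a skeleton (MyPage stands in for any page type whose finish method returns null on failure):

    WizardDialog dialog = new WizardDialog(null, new Wizard() {
      private final MyPage page = new MyPage();

      @Override
      public void addPages() {
        super.addPages();
        setWindowTitle("...");
        addPage(page);
      }

      @Override
      public boolean performFinish() {
        return page.finish() != null;
      }
    });

    dialog.create();
    dialog.setBlockOnOpen(true); // open() returns only once the dialog closes
    dialog.open();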
+ 76 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/NewServerAction.java

@@ -0,0 +1,76 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.actions;
+
+import java.io.IOException;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.servers.DefineHadoopServerLocWizardPage;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jface.action.Action;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.wizard.Wizard;
+import org.eclipse.jface.wizard.WizardDialog;
+
+
+/**
+ * Action corresponding to creating a new MapReduce Server.
+ */
+
+public class NewServerAction extends Action {
+  public NewServerAction() {
+    setText("New Hadoop Server");
+    try {
+      // TODO decorate with + sign to indicate create
+      setImageDescriptor(ImageDescriptor.createFromURL((FileLocator
+          .toFileURL(FileLocator.find(Activator.getDefault().getBundle(),
+              new Path("resources/hadoop_small.gif"), null)))));
+    } catch (IOException e) {
+      /* Ignore if no image */
+      e.printStackTrace();
+    }
+  }
+
+  @Override
+  public void run() {
+    WizardDialog dialog = new WizardDialog(null, new Wizard() {
+      private DefineHadoopServerLocWizardPage page = new DefineHadoopServerLocWizardPage();
+
+      @Override
+      public void addPages() {
+        super.addPages();
+        setWindowTitle("New Hadoop Server Location");
+        addPage(page);
+      }
+
+      @Override
+      public boolean performFinish() {
+        return page.performFinish() != null;
+      }
+
+    });
+
+    dialog.create();
+    dialog.setBlockOnOpen(true);
+    dialog.open();
+
+    super.run();
+  }
+}

+ 78 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/OpenNewMRClassWizardAction.java

@@ -0,0 +1,78 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.actions;
+
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.NewDriverWizard;
+import org.apache.hadoop.eclipse.NewMapperWizard;
+import org.apache.hadoop.eclipse.NewReducerWizard;
+import org.eclipse.jface.action.Action;
+import org.eclipse.jface.viewers.StructuredSelection;
+import org.eclipse.jface.window.Window;
+import org.eclipse.jface.wizard.WizardDialog;
+import org.eclipse.ui.INewWizard;
+import org.eclipse.ui.IWorkbench;
+import org.eclipse.ui.PlatformUI;
+import org.eclipse.ui.cheatsheets.ICheatSheetAction;
+import org.eclipse.ui.cheatsheets.ICheatSheetManager;
+
+
+/**
+ * Action to open a new MapReduce Class.
+ */
+
+public class OpenNewMRClassWizardAction extends Action implements
+    ICheatSheetAction {
+
+  static Logger log = Logger.getLogger(OpenNewMRClassWizardAction.class
+      .getName());
+
+  public void run(String[] params, ICheatSheetManager manager) {
+
+    if ((params != null) && (params.length > 0)) {
+      IWorkbench workbench = PlatformUI.getWorkbench();
+      INewWizard wizard = getWizard(params[0]);
+      if (wizard == null)
+        return; // unknown wizard type; getWizard() has already logged it
+      wizard.init(workbench, new StructuredSelection());
+      WizardDialog dialog = new WizardDialog(PlatformUI.getWorkbench()
+          .getActiveWorkbenchWindow().getShell(), wizard);
+      dialog.create();
+      dialog.open();
+
+      // did the wizard succeed?
+      notifyResult(dialog.getReturnCode() == Window.OK);
+    }
+  }
+
+  private INewWizard getWizard(String typeName) {
+    if (typeName.equals("Mapper")) {
+      return new NewMapperWizard();
+    } else if (typeName.equals("Reducer")) {
+      return new NewReducerWizard();
+    } else if (typeName.equals("Driver")) {
+      return new NewDriverWizard();
+    } else {
+      log.severe("Invalid Wizard requested");
+      return null;
+    }
+  }
+
+}

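This action appears to be wired up from the cheat-sheet content added above (e.g. resources/HelloWorld.xml), which passes the wizard kind as the first parameter. Driven programmatically, the equivalent call would be:

    // "Mapper", "Reducer" or "Driver" selects the wizard to open;
    // the cheat-sheet manager argument is unused by this implementation.
    new OpenNewMRClassWizardAction().run(new String[] { "Reducer" }, null);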
+ 49 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/OpenNewMRProjectAction.java

@@ -0,0 +1,49 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.actions;
+
+import org.apache.hadoop.eclipse.NewMapReduceProjectWizard;
+import org.eclipse.jface.action.Action;
+import org.eclipse.jface.viewers.StructuredSelection;
+import org.eclipse.jface.window.Window;
+import org.eclipse.jface.wizard.WizardDialog;
+import org.eclipse.swt.widgets.Shell;
+import org.eclipse.ui.IWorkbench;
+import org.eclipse.ui.PlatformUI;
+
+
+/**
+ * Action to open a new MapReduce project.
+ */
+
+public class OpenNewMRProjectAction extends Action {
+
+  @Override
+  public void run() {
+    IWorkbench workbench = PlatformUI.getWorkbench();
+    Shell shell = workbench.getActiveWorkbenchWindow().getShell();
+    NewMapReduceProjectWizard wizard = new NewMapReduceProjectWizard();
+    wizard.init(workbench, new StructuredSelection());
+    WizardDialog dialog = new WizardDialog(shell, wizard);
+    dialog.create();
+    dialog.open();
+    // did the wizard succeed?
+    notifyResult(dialog.getReturnCode() == Window.OK);
+  }
+}

+ 102 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/actions/RunOnHadoopActionDelegate.java

@@ -0,0 +1,102 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.actions;
+
+import org.apache.hadoop.eclipse.server.JarModule;
+import org.apache.hadoop.eclipse.servers.RunOnHadoopWizard;
+import org.eclipse.core.resources.IResource;
+import org.eclipse.core.runtime.IAdaptable;
+import org.eclipse.jface.action.IAction;
+import org.eclipse.jface.dialogs.MessageDialog;
+import org.eclipse.jface.viewers.ISelection;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.jface.wizard.IWizard;
+import org.eclipse.jface.wizard.WizardDialog;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.swt.widgets.Shell;
+import org.eclipse.ui.actions.ActionDelegate;
+
+/**
+ * Allows a resource to be associated with the "Run on Hadoop" action in the
+ * Run menu. Only files, not directories, may be run on Hadoop; the file
+ * needs to have a main method. When the action is invoked, this delegate
+ * launches the Run on Hadoop wizard dialog.
+ */
+
+public class RunOnHadoopActionDelegate extends ActionDelegate {
+
+  private ISelection selection;
+
+  @Override
+  public void selectionChanged(IAction action, ISelection selection) {
+    this.selection = selection;
+  }
+
+  @Override
+  public void run(IAction action) {
+    if ((selection == null)
+        || (!(selection instanceof IStructuredSelection)))
+      return;
+
+    IStructuredSelection issel = (IStructuredSelection) selection;
+
+    if (issel.size() != 1)
+      return;
+
+    Object selected = issel.getFirstElement();
+    if (!(selected instanceof IAdaptable))
+      return;
+
+    IAdaptable adaptable = (IAdaptable) selected;
+
+    IResource resource = (IResource) adaptable.getAdapter(IResource.class);
+
+    // 63561: only allow run-on on file resources
+    if ((resource != null) && (resource.getType() == IResource.FILE)) {
+      RunOnHadoopWizard wizard =
+          new RunOnHadoopWizard(new JarModule(resource));
+
+      WizardDialog dialog = new Dialog(null, wizard);
+      dialog.create();
+      dialog.setBlockOnOpen(true);
+      dialog.open();
+
+      return;
+    }
+
+    MessageDialog
+        .openInformation(Display.getDefault().getActiveShell(),
+            "No Main method found",
+            "Please select a file with a main method to Run on a MapReduce server");
+  }
+
+  static class Dialog extends WizardDialog {
+    public Dialog(Shell parentShell, IWizard newWizard) {
+      super(parentShell, newWizard);
+    }
+
+    @Override
+    public void create() {
+      super.create();
+
+      ((RunOnHadoopWizard) getWizard())
+          .setProgressMonitor(getProgressMonitor());
+    }
+  }
+}

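The delegate narrows the selection to an IResource through the IAdaptable protocol rather than instanceof checks against concrete types, so it also works when views hand back wrapper objects. The narrowing step in isolation (selection is any IStructuredSelection):

    Object selected = selection.getFirstElement();
    if (selected instanceof IAdaptable) {
      IResource resource =
          (IResource) ((IAdaptable) selected).getAdapter(IResource.class);
      if ((resource != null) && (resource.getType() == IResource.FILE)) {
        // a concrete workspace file backs the selection
      }
    }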
+ 177 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/ActionProvider.java

@@ -0,0 +1,177 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.dfs;
+
+import java.util.HashMap;
+import java.util.Map;
+
+import org.eclipse.jface.action.Action;
+import org.eclipse.jface.action.IMenuManager;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.ISelection;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.ui.IActionBars;
+import org.eclipse.ui.ISharedImages;
+import org.eclipse.ui.PlatformUI;
+import org.eclipse.ui.actions.ActionFactory;
+import org.eclipse.ui.navigator.CommonActionProvider;
+import org.eclipse.ui.navigator.ICommonActionConstants;
+import org.eclipse.ui.navigator.ICommonActionExtensionSite;
+import org.eclipse.ui.navigator.ICommonMenuConstants;
+import org.eclipse.ui.plugin.AbstractUIPlugin;
+
+/**
+ * Provides view, delete, refresh and local up/download actions for the DFS tree
+ */
+
+public class ActionProvider extends CommonActionProvider {
+
+  private ICommonActionExtensionSite site;
+
+  private Map<String, ImageDescriptor> descriptors =
+      new HashMap<String, ImageDescriptor>();
+
+  public ActionProvider() {
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void init(ICommonActionExtensionSite site) {
+    super.init(site);
+    this.site = site;
+
+    descriptors
+        .put("dfs.delete", PlatformUI.getWorkbench().getSharedImages()
+            .getImageDescriptor(ISharedImages.IMG_TOOL_DELETE));
+    descriptors.put("dfs.refresh", AbstractUIPlugin
+        .imageDescriptorFromPlugin("org.eclipse.core.tools.resources",
+            "icons/refresh.gif"));
+    // NOTE(jz)
+    // pretty brittle, but worst case no image
+    // descriptors.put("dfs.put",
+    // NavigatorPlugin.imageDescriptorFromPlugin("org.eclipse.core.tools.resources",
+    // "icons/refresh.gif"));
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void fillActionBars(IActionBars actionBars) {
+    actionBars.setGlobalActionHandler(ActionFactory.DELETE.getId(),
+        new DfsAction("dfs.delete", "Delete"));
+    actionBars.setGlobalActionHandler(ActionFactory.REFRESH.getId(),
+        new DfsAction("dfs.refresh", "Refresh"));
+
+    if ((this.site != null)
+        && (this.site.getStructuredViewer().getSelection() instanceof IStructuredSelection)
+        && (((IStructuredSelection) this.site.getStructuredViewer()
+            .getSelection()).size() == 1)
+        && (((IStructuredSelection) this.site.getStructuredViewer()
+            .getSelection()).getFirstElement() instanceof DfsFile)) {
+      actionBars.setGlobalActionHandler(ICommonActionConstants.OPEN,
+          new DfsAction("dfs.open", "View"));
+    }
+
+    actionBars.updateActionBars();
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void fillContextMenu(IMenuManager menu) {
+    menu.appendToGroup(ICommonMenuConstants.GROUP_EDIT, new DfsAction(
+        "dfs.delete", "Delete"));
+    menu.appendToGroup(ICommonMenuConstants.GROUP_EDIT, new DfsAction(
+        "dfs.refresh", "Refresh"));
+
+    menu.appendToGroup(ICommonMenuConstants.GROUP_NEW, new DfsAction(
+        "dfs.get", "Download to local directory..."));
+
+    if (this.site == null)
+      return;
+
+    ISelection isel = this.site.getStructuredViewer().getSelection();
+    if (!(isel instanceof IStructuredSelection))
+      return;
+
+    IStructuredSelection issel = (IStructuredSelection) isel;
+    if (issel.size() != 1)
+      return;
+
+    Object element = issel.getFirstElement();
+
+    if (element instanceof DfsFile) {
+      menu.appendToGroup(ICommonMenuConstants.GROUP_OPEN, new DfsAction(
+          "dfs.open", "View"));
+
+    } else if (element instanceof DfsFolder) {
+      menu.appendToGroup(ICommonMenuConstants.GROUP_NEW, new DfsAction(
+          "dfs.put", "Import from local directory..."));
+    }
+  }
+
+  /**
+   * Proxy that forwards to org.apache.hadoop.eclipse.actions.DfsAction
+   */
+  public class DfsAction extends Action {
+
+    private final String actionDefinition;
+
+    private final String title;
+
+    public DfsAction(String actionDefinition, String title) {
+      this.actionDefinition = actionDefinition;
+      this.title = title;
+
+    }
+
+    @Override
+    public String getText() {
+      return this.title;
+    }
+
+    @Override
+    public ImageDescriptor getImageDescriptor() {
+      if (descriptors.containsKey(getActionDefinitionId())) {
+        return descriptors.get(getActionDefinitionId());
+      } else {
+        return null;
+      }
+    }
+
+    @Override
+    public String getActionDefinitionId() {
+      return actionDefinition;
+    }
+
+    @Override
+    public void run() {
+      org.apache.hadoop.eclipse.actions.DfsAction action =
+          new org.apache.hadoop.eclipse.actions.DfsAction();
+      action.setActivePart(this, PlatformUI.getWorkbench()
+          .getActiveWorkbenchWindow().getActivePage().getActivePart());
+      action.selectionChanged(this, site.getStructuredViewer()
+          .getSelection());
+      action.run(this);
+    }
+
+    @Override
+    public boolean isEnabled() {
+      return true;
+    }
+  }
+}

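fillActionBars() above rebinds the workbench's retargetable Delete and Refresh actions to DFS-specific handlers while the DFS view is active; updateActionBars() is what makes the rebinding take effect. The wiring in its minimal form (handler is any IAction, run from inside an IViewPart):

    IActionBars bars = getViewSite().getActionBars();
    bars.setGlobalActionHandler(ActionFactory.DELETE.getId(), handler);
    bars.updateActionBars(); // required for the new handler to take effect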
+ 203 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSContentProvider.java

@@ -0,0 +1,203 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.dfs;
+
+import java.io.IOException;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.apache.hadoop.eclipse.servers.IHadoopServerListener;
+import org.apache.hadoop.eclipse.servers.ServerRegistry;
+import org.eclipse.core.resources.ResourcesPlugin;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.IAdaptable;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.ILabelProvider;
+import org.eclipse.jface.viewers.ILabelProviderListener;
+import org.eclipse.jface.viewers.ITreeContentProvider;
+import org.eclipse.jface.viewers.StructuredViewer;
+import org.eclipse.jface.viewers.Viewer;
+import org.eclipse.swt.graphics.Image;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.ui.ISharedImages;
+import org.eclipse.ui.PlatformUI;
+import org.eclipse.ui.model.IWorkbenchAdapter;
+
+/**
+ * Content and label provider for the DFS tree view
+ */
+public class DFSContentProvider implements ITreeContentProvider,
+    ILabelProvider {
+
+  /**
+   * The viewer that displays this Tree content
+   */
+  private Viewer viewer;
+
+  private ImageDescriptor hadoopImage;
+
+  private ImageDescriptor folderImage;
+
+  private ImageDescriptor fileImage;
+
+  private ImageDescriptor dfsImage;
+
+  public DFSContentProvider() {
+    try {
+      hadoopImage =
+          ImageDescriptor.createFromURL((FileLocator.toFileURL(FileLocator
+              .find(Activator.getDefault().getBundle(), new Path(
+                  "resources/hadoop_small.gif"), null))));
+      dfsImage =
+          ImageDescriptor.createFromURL((FileLocator.toFileURL(FileLocator
+              .find(Activator.getDefault().getBundle(), new Path(
+                  "resources/files.gif"), null))));
+    } catch (IOException e) {
+      e.printStackTrace();
+      // images are optional; the providers fall back to default icons
+    }
+  }
+
+  public Object[] getChildren(Object parentElement) {
+    if (parentElement instanceof DummyWorkspace) {
+      return ResourcesPlugin.getWorkspace().getRoot().getProjects();
+    }
+    if (parentElement instanceof DFS) {
+      return ServerRegistry.getInstance().getServers().toArray();
+    } else if (parentElement instanceof HadoopServer) {
+      return new Object[] { new DfsFolder((HadoopServer) parentElement, "/",
+          viewer) };
+    } else if (parentElement instanceof DfsFolder) {
+      return ((DfsFolder) parentElement).getChildren();
+    }
+
+    return new Object[0];
+  }
+
+  public Object getParent(Object element) {
+    if (element instanceof DfsPath) {
+      return ((DfsPath) element).getParent();
+    } else if (element instanceof HadoopServer) {
+      return dfs;
+    } else {
+      return null;
+    }
+  }
+
+  public boolean hasChildren(Object element) {
+    return (element instanceof HadoopServer)
+        || (element instanceof DfsFolder) || (element instanceof DFS)
+        || (element instanceof DummyWorkspace);
+  }
+
+  public class DFS {
+    public DFS() {
+      ServerRegistry.getInstance().addListener(new IHadoopServerListener() {
+        public void serverChanged(final HadoopServer location, final int type) {
+          if (viewer != null) {
+            Display.getDefault().syncExec(new Runnable() {
+              public void run() {
+                if (type == ServerRegistry.SERVER_STATE_CHANGED) {
+                  ((StructuredViewer) viewer).refresh(location);
+                } else {
+                  ((StructuredViewer) viewer).refresh(ResourcesPlugin
+                      .getWorkspace().getRoot());
+                }
+              }
+            });
+          }
+        }
+      });
+    }
+
+    @Override
+    public String toString() {
+      return "MapReduce DFS";
+    }
+  }
+
+  private final DFS dfs = new DFS();
+
+  private final Object workspace = new DummyWorkspace();
+
+  private static class DummyWorkspace {
+    @Override
+    public String toString() {
+      return "Workspace";
+    }
+  };
+
+  public Object[] getElements(final Object inputElement) {
+    return ServerRegistry.getInstance().getServers().toArray();
+  }
+
+  public void dispose() {
+
+  }
+
+  public void inputChanged(Viewer viewer, Object oldInput, Object newInput) {
+    this.viewer = viewer;
+  }
+
+  public Image getImage(Object element) {
+    if (element instanceof DummyWorkspace) {
+      IWorkbenchAdapter a =
+          (IWorkbenchAdapter) ((IAdaptable) ResourcesPlugin.getWorkspace()
+              .getRoot()).getAdapter(IWorkbenchAdapter.class);
+      return a.getImageDescriptor(ResourcesPlugin.getWorkspace().getRoot())
+          .createImage();
+    } else if (element instanceof DFS) {
+      return dfsImage.createImage(true);
+    } else if (element instanceof HadoopServer) {
+      return hadoopImage.createImage(true);
+    } else if (element instanceof DfsFolder) {
+      return PlatformUI.getWorkbench().getSharedImages().getImageDescriptor(
+          ISharedImages.IMG_OBJ_FOLDER).createImage();
+    } else if (element instanceof DfsFile) {
+      return PlatformUI.getWorkbench().getSharedImages().getImageDescriptor(
+          ISharedImages.IMG_OBJ_FILE).createImage();
+    }
+
+    return null;
+  }
+
+  public String getText(Object element) {
+    if (element instanceof DummyWorkspace) {
+      IWorkbenchAdapter a =
+          (IWorkbenchAdapter) ((IAdaptable) ResourcesPlugin.getWorkspace()
+              .getRoot()).getAdapter(IWorkbenchAdapter.class);
+      return a.getLabel(ResourcesPlugin.getWorkspace().getRoot());
+    } else {
+      return element.toString();
+    }
+  }
+
+  public void addListener(ILabelProviderListener listener) {
+
+  }
+
+  public boolean isLabelProperty(Object element, String property) {
+    return false;
+  }
+
+  public void removeListener(ILabelProviderListener listener) {
+
+  }
+}

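One caveat in getImage() above: it calls createImage() on every invocation, and SWT Images hold OS resources that the caller must dispose. A common remedy, sketched here as hypothetical fragments of this class, is to cache one Image per descriptor and release the cache in dispose():

    private final Map<ImageDescriptor, Image> imageCache =
        new HashMap<ImageDescriptor, Image>();

    private Image cachedImage(ImageDescriptor descriptor) {
      Image image = imageCache.get(descriptor);
      if (image == null) {
        image = descriptor.createImage(true);
        imageCache.put(descriptor, image);
      }
      return image;
    }

    public void dispose() {
      for (Image image : imageCache.values())
        image.dispose();
      imageCache.clear();
    }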
+ 157 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DfsFile.java

@@ -0,0 +1,157 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.dfs;
+
+import java.io.BufferedInputStream;
+import java.io.BufferedOutputStream;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileOutputStream;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.lang.reflect.InvocationTargetException;
+
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.jface.dialogs.MessageDialog;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.ui.PlatformUI;
+
+import com.jcraft.jsch.ChannelExec;
+import com.jcraft.jsch.JSchException;
+
+/**
+ * File handling methods for the DFS
+ */
+public class DfsFile extends DfsPath {
+
+  public DfsFile(DfsPath parent, String path) {
+    super(parent, path);
+  }
+
+  /**
+   * Download and view the contents of a file in the DFS.
+   * NOTE: may not work well on files larger than 1 MB.
+   * 
+   * @return a FileInputStream for the file
+   */
+  public FileInputStream open() throws JSchException, IOException,
+      InvocationTargetException, InterruptedException {
+
+    File tempFile =
+        File.createTempFile("hadoop" + System.currentTimeMillis(), "tmp");
+    tempFile.deleteOnExit();
+
+    this.downloadToLocalFile(tempFile);
+
+    // file size greater than 1 MB
+    if (tempFile.length() > 1024 * 1024) {
+      boolean answer =
+          MessageDialog.openQuestion(null, "Show large file from DFS?",
+              "The file you are attempting to download from the DFS, "
+                  + this.getPath() + " is over 1MB in size. \n"
+                  + "Opening this file may cause performance problems."
+                  + " You can open the file with your favourite editor at "
+                  + tempFile.getAbsolutePath()
+                  + " (it's already saved there)."
+                  + " Continue opening the file in eclipse?");
+      if (!answer) {
+        return null;
+      }
+    }
+
+    return new FileInputStream(tempFile);
+  }
+
+  public void downloadToLocalFile(File localFile) throws JSchException,
+      IOException, InvocationTargetException, InterruptedException {
+
+    final ChannelExec exec =
+        exec(" dfs " + DfsFolder.s_whichFS
+            + " -cat " + getPath());
+
+    final OutputStream os =
+        new BufferedOutputStream(new FileOutputStream(localFile));
+
+    try {
+      PlatformUI.getWorkbench().getProgressService().busyCursorWhile(
+          new IRunnableWithProgress() {
+            public void run(IProgressMonitor monitor)
+                throws InvocationTargetException {
+              try {
+                monitor.beginTask("View file from Distributed File System",
+                    IProgressMonitor.UNKNOWN);
+                exec.connect();
+                BufferedInputStream stream =
+                    new BufferedInputStream(exec.getInputStream());
+
+                byte[] buffer = new byte[1024];
+                int bytes;
+
+                while ((bytes = stream.read(buffer)) >= 0) {
+                  if (monitor.isCanceled()) {
+                    os.close();
+                    return;
+                  }
+
+                  monitor.worked(1);
+                  os.write(buffer, 0, bytes);
+                }
+
+                monitor.done();
+              } catch (Exception e) {
+                throw new InvocationTargetException(e);
+              }
+            }
+          });
+    } finally {
+      if (exec.isConnected()) {
+        exec.disconnect();
+      }
+      os.close();
+    }
+  }
+
+  /** {@inheritDoc} */
+  @Override
+  public void downloadToLocalDirectory(String localDirectory)
+      throws InvocationTargetException, JSchException, InterruptedException,
+      IOException {
+
+    File dir = new File(localDirectory);
+    if (!dir.exists() || !dir.isDirectory())
+      return; // TODO display error message
+
+    File dfsPath = new File(this.getPath());
+    File destination = new File(dir, dfsPath.getName());
+
+    if (destination.exists()) {
+      boolean answer =
+          MessageDialog.openQuestion(null, "Overwrite existing local file?",
+              "The file you are attempting to download from the DFS "
+                  + this.getPath()
+                  + ", already exists in your local directory as "
+                  + destination + ".\n" + "Overwrite the existing file?");
+      if (!answer)
+        return;
+    }
+
+    this.downloadToLocalFile(destination);
+  }
+
+}

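downloadToLocalFile() above streams the output of a remote "dfs -cat" inside IProgressService.busyCursorWhile(), which blocks the caller while keeping the UI responsive and cancellable. The idiom in isolation:

    PlatformUI.getWorkbench().getProgressService().busyCursorWhile(
        new IRunnableWithProgress() {
          public void run(IProgressMonitor monitor)
              throws InvocationTargetException {
            monitor.beginTask("Long-running work", IProgressMonitor.UNKNOWN);
            // ... do the work, checking monitor.isCanceled() between steps
            monitor.done();
          }
        });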
+ 325 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DfsFolder.java

@@ -0,0 +1,325 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.dfs;
+
+import java.io.BufferedInputStream;
+import java.io.BufferedOutputStream;
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.lang.reflect.InvocationTargetException;
+import java.rmi.dgc.VMID;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Status;
+import org.eclipse.core.runtime.SubProgressMonitor;
+import org.eclipse.core.runtime.jobs.Job;
+import org.eclipse.jface.dialogs.ProgressMonitorDialog;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.jface.viewers.Viewer;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.ui.PlatformUI;
+
+import com.jcraft.jsch.ChannelExec;
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.SftpException;
+
+/**
+ * Folder handling methods for the DFS
+ */
+
+public class DfsFolder extends DfsPath {
+
+  protected final static String s_whichFS = ""; // "-fs local";
+
+  static Logger log = Logger.getLogger(DfsFolder.class.getName());
+
+  private Object[] children;
+
+  private boolean loading = false;
+
+  protected DfsFolder(HadoopServer location, String path, Viewer viewer) {
+    super(location, path, viewer);
+  }
+
+  private DfsFolder(DfsPath parent, String path) {
+    super(parent, path);
+  }
+
+  public Object[] getChildren() {
+    if (children == null) {
+      doRefresh();
+      return new Object[] { "Loading..." };
+    } else {
+      return children;
+    }
+  }
+
+  @Override
+  /**
+   * Forces a refresh of the items in the current DFS node
+   */
+  public void doRefresh() {
+    new Job("Refresh DFS Children") {
+      @Override
+      protected IStatus run(IProgressMonitor monitor) {
+        try {
+          ChannelExec channel =
+              exec(" dfs " + s_whichFS + " -ls " + getPath());
+          InputStream is = channel.getInputStream();
+          BufferedReader in =
+              new BufferedReader(new InputStreamReader(
+                  new BufferedInputStream(is)));
+
+          if (!channel.isConnected()) {
+            channel.connect();
+          }
+
+          try {
+            // initial "found n items" line ignorable
+            if (in.readLine() == null) {
+              children =
+                  new Object[] { "An error occurred: empty result from dfs -ls" };
+            }
+
+            String line;
+            List<DfsPath> children = new ArrayList<DfsPath>();
+            while ((line = in.readLine()) != null) {
+              String[] parts = line.split("\t");
+
+              for (int i = 0; i < parts.length; i++) {
+                log.fine(parts[i]);
+              }
+
+              if (parts[1].equals("<dir>")) {
+                children.add(new DfsFolder(DfsFolder.this, parts[0]));
+              } else {
+                children.add(new DfsFile(DfsFolder.this, parts[0]));
+              }
+            }
+
+            DfsFolder.this.children = children.toArray();
+
+            DfsFolder.super.doRefresh();
+
+            return Status.OK_STATUS;
+          } finally {
+            if (channel.isConnected()) {
+              channel.disconnect();
+            }
+          }
+        } catch (Exception e) {
+          e.printStackTrace();
+          return new Status(IStatus.ERROR, Activator.PLUGIN_ID, -1,
+              "Refreshing DFS node failed: " + e.getLocalizedMessage(), e);
+        }
+      }
+    }.schedule();
+  }
+
+  @Override
+  /**
+   * Does a recursive delete of the remote directory tree at this node.
+   */
+  public void delete() throws JSchException {
+    doExec("dfs " + s_whichFS + " -rmr " + getPath());
+  }
+
+  /**
+   * Upload a local directory and its contents to the remote DFS
+   * 
+   * @param directory source directory to upload
+   * @throws SftpException
+   * @throws JSchException
+   * @throws InvocationTargetException
+   * @throws InterruptedException
+   */
+  public void put(final String directory) throws SftpException,
+      JSchException, InvocationTargetException, InterruptedException {
+    ProgressMonitorDialog progress =
+        new ProgressMonitorDialog((Display.getCurrent() == null) ? null
+            : Display.getCurrent().getActiveShell());
+    progress.setCancelable(true);
+
+    PlatformUI.getWorkbench().getProgressService().busyCursorWhile(
+        new IRunnableWithProgress() {
+          public void run(IProgressMonitor monitor)
+              throws InvocationTargetException, InterruptedException {
+            String guid = new VMID().toString().replace(':', '_');
+
+            monitor.beginTask("Secure Copy", 100);
+            scp(directory, "/tmp/hadoop_scp_" + guid,
+                new SubProgressMonitor(monitor, 60));
+
+            try {
+              SubProgressMonitor sub = new SubProgressMonitor(monitor, 1);
+              if (monitor.isCanceled()) {
+                return;
+              }
+
+              final File dir = new File(directory);
+
+              sub.beginTask("Move files from staging server to DFS", 1);
+              ChannelExec exec =
+                  exec(" dfs " + s_whichFS
+                      + " -moveFromLocal /tmp/hadoop_scp_" + guid + " \""
+                      + getPath() + "/" + dir.getName() + "\"");
+              BufferedReader reader =
+                  new BufferedReader(new InputStreamReader(
+                      new BufferedInputStream(exec.getInputStream())));
+
+              if (!monitor.isCanceled()) {
+                exec.connect();
+                reader.readLine(); // block until the move command completes
+                sub.worked(1);
+              }
+
+              if (exec.isConnected()) {
+                exec.disconnect();
+              }
+
+              sub.done();
+
+              monitor.done();
+              doRefresh();
+            } catch (Exception e) {
+              log.log(Level.SEVERE, "", e);
+              throw new InvocationTargetException(e);
+            }
+          }
+
+          public void scp(String from, String to, IProgressMonitor monitor) {
+            File file = new File(from);
+            ChannelExec channel = null;
+
+            monitor.beginTask("scp from " + from + " to " + to, 100 * (file
+                .isDirectory() ? file.list().length + 1 : 1));
+
+            if (monitor.isCanceled()) {
+              return;
+            }
+
+            if (file.isDirectory()) {
+              // mkdir
+              try {
+                channel = (ChannelExec) getSession().openChannel("exec");
+                channel.setCommand(" mkdir " + to);
+                InputStream in = channel.getInputStream();
+                channel.connect();
+                // in.read(); // wait for a response, which
+                // we'll then ignore
+              } catch (JSchException e) {
+                // BUG(jz) abort operation and display error
+                throw new RuntimeException(e);
+              } catch (IOException e) {
+                throw new RuntimeException(e);
+              } finally {
+                if ((channel != null) && channel.isConnected()) {
+                  channel.disconnect();
+                }
+              }
+
+              monitor.worked(100);
+
+              String[] children = file.list();
+              for (int i = 0; i < children.length; i++) {
+                File child = new File(file, children[i]);
+
+                // recurse into each child entry
+                scp(child.getAbsolutePath(), to + "/"
+                    + children[i], new SubProgressMonitor(monitor, 100));
+              }
+            } else {
+              InputStream filein = null;
+
+              try {
+                channel = (ChannelExec) getSession().openChannel("exec");
+                (channel).setCommand("scp -p -t " + to);
+                BufferedOutputStream out =
+                    new BufferedOutputStream(channel.getOutputStream());
+                InputStream in = channel.getInputStream();
+                channel.connect();
+
+                if (in.read() == 0) {
+                  int step = (int) (100 / new File(from).length());
+                  out.write(("C0644 " + new File(from).length() + " "
+                      + new File(to).getName() + "\n").getBytes());
+                  out.flush();
+                  if (in.read() != 0) {
+                    throw new RuntimeException("Copy failed");
+                  }
+
+                  filein =
+                      new BufferedInputStream(new FileInputStream(from));
+
+                  byte[] buffer = new byte[1024];
+                  int bytes;
+                  while ((bytes = filein.read(buffer)) > -1) {
+                    if (monitor.isCanceled()) {
+                      return;
+                    }
+
+                    out.write(buffer, 0, bytes);
+                    monitor.worked(step);
+                  }
+
+                  out.write("\0".getBytes());
+                  out.flush();
+
+                  if (in.read() != 0) {
+                    throw new RuntimeException("Copy failed");
+                  }
+                  out.close();
+                } else {
+                  // problems with copy
+                  throw new RuntimeException("Copy failed");
+                }
+              } catch (JSchException e) {
+                e.printStackTrace();
+                throw new RuntimeException(e);
+              } catch (IOException e) {
+                throw new RuntimeException(e);
+              } finally {
+                if ((channel != null) && channel.isConnected()) {
+                  channel.disconnect();
+                }
+                try {
+                  if (filein != null) filein.close();
+                } catch (IOException e) {
+                }
+              }
+            }
+
+            monitor.done();
+          }
+        });
+  }
+}

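put() above uploads in two hops: the local tree is first copied to a staging path (/tmp/hadoop_scp_<guid>) by the hand-rolled scp() sender, then moved into the DFS with "bin/hadoop dfs -moveFromLocal". For reference, the wire exchange the scp branch drives, one file at a time:

    // client: scp -p -t <dest>              (start the remote scp sink)
    // server: 0x00                          (ready)
    // client: "C0644 <size> <basename>\n"   (mode, length, file name)
    // server: 0x00                          (header accepted)
    // client: <size> bytes of data, then a single 0x00
    // server: 0x00                          (file accepted)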
+ 202 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DfsPath.java

@@ -0,0 +1,202 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.eclipse.dfs;
+
+import java.io.BufferedInputStream;
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStreamReader;
+import java.lang.reflect.InvocationTargetException;
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.eclipse.core.runtime.IAdaptable;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Status;
+import org.eclipse.jface.viewers.StructuredViewer;
+import org.eclipse.jface.viewers.Viewer;
+import org.eclipse.swt.widgets.Display;
+
+import com.jcraft.jsch.ChannelExec;
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.Session;
+
+/**
+ * Base class for items (files and folders) in the DFS tree
+ */
+public class DfsPath implements IAdaptable {
+
+  private Session shell;
+
+  private HadoopServer location;
+
+  private String path;
+
+  private final Viewer viewer;
+
+  private DfsPath parent;
+
+  static Logger log = Logger.getLogger(DfsPath.class.getName());
+
+  public DfsPath(HadoopServer location, String path, Viewer viewer) {
+    this.location = location;
+    this.path = path;
+    this.viewer = viewer;
+  }
+
+  protected String getPath() {
+    return this.path;
+  }
+
+  protected ChannelExec exec(String command) throws JSchException {
+    ChannelExec channel = (ChannelExec) getSession().openChannel("exec");
+    channel.setCommand(location.getInstallPath() + "/bin/hadoop " + command);
+    channel.setErrStream(System.err);
+    // channel.connect();
+
+    return channel;
+  }
+
+  protected DfsPath(HadoopServer location, String path, Session shell,
+      Viewer viewer) {
+    this(location, path, viewer);
+
+    this.shell = shell;
+  }
+
+  protected DfsPath(DfsPath parent, String path) {
+    this(parent.location, path, parent.shell, parent.viewer);
+    this.parent = parent;
+  }
+
+  protected Session getSession() throws JSchException {
+    if (shell == null) {
+      // this.shell =
+      // JSchUtilities.createJSch().getSession(location.getUser(),
+      // location.getHostname());
+      this.shell = location.createSession();
+    }
+
+    if (!shell.isConnected()) {
+      shell.connect();
+    }
+
+    return shell;
+  }
+
+  protected void dispose() {
+    if ((this.shell != null) && this.shell.isConnected()) {
+      shell.disconnect();
+    }
+  }
+
+  @Override
+  public String toString() {
+    if ((path.length() < 1) || path.equals("/")) {
+      return "DFS @ " + location.getName();
+    } else {
+      String[] parts = path.split("/");
+      return parts[parts.length - 1];
+    }
+  }
+
+  protected void doExec(final String command) {
+    org.eclipse.core.runtime.jobs.Job job =
+        new org.eclipse.core.runtime.jobs.Job("DFS operation: " + command) {
+          @Override
+          protected IStatus run(IProgressMonitor monitor) {
+            ChannelExec exec = null;
+            monitor.beginTask("Execute remote dfs  command", 100);
+            try {
+              exec = exec(" " + command);
+              monitor.worked(33);
+
+              exec.connect();
+              monitor.worked(33);
+
+              BufferedReader reader =
+                  new BufferedReader(new InputStreamReader(
+                      new BufferedInputStream(exec.getInputStream())));
+              reader.readLine(); // TIDY(jz) -- drain the first response line
+              monitor.worked(34);
+
+              monitor.done();
+
+              refresh();
+
+              return Status.OK_STATUS;
+            } catch (Exception e) {
+              e.printStackTrace();
+              return new Status(IStatus.ERROR, Activator.PLUGIN_ID, -1,
+                  "DFS operation failed: " + e.getLocalizedMessage(), e);
+            } finally {
+              if (exec != null) {
+                exec.disconnect();
+              }
+            }
+          }
+        };
+
+    job.setUser(true);
+    job.schedule();
+  }
+
+  public void delete() throws JSchException {
+    doExec("dfs " + DfsFolder.s_whichFS + " -rm " + path);
+  }
+
+  public Object getParent() {
+    return parent;
+  }
+
+  public void refresh() {
+    if (parent != null) {
+      parent.doRefresh();
+    } else {
+      doRefresh();
+    }
+  }
+
+  protected void doRefresh() {
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        ((StructuredViewer) viewer).refresh(DfsPath.this);
+      }
+    });
+  }
+
+  public Object getAdapter(Class type) {
+    log.fine(type.toString());
+    return null;
+  }
+
+  /**
+   * Copy the DfsPath to the given local directory
+   * 
+   * @param directory the local directory
+   */
+  public void downloadToLocalDirectory(String directory)
+      throws InvocationTargetException, JSchException, InterruptedException,
+      IOException {
+
+    // Not implemented here; by default, do nothing
+  }
+
+}
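
For orientation, exec() and doExec() above follow the standard JSch exec-channel pattern used throughout the plugin. A minimal standalone sketch of the same flow; the host, user, password, and command strings are placeholders, not values from this patch:

    import java.io.BufferedReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;

    import com.jcraft.jsch.ChannelExec;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class ExecSketch {
      public static void main(String[] args) throws Exception {
        Session session = new JSch().getSession("hadoop", "cluster.example.com", 22);
        session.setPassword("secret");                    // placeholder credentials
        session.setConfig("StrictHostKeyChecking", "no"); // demo only
        session.connect();

        ChannelExec channel = (ChannelExec) session.openChannel("exec");
        channel.setCommand("/home/hadoop/hadoop/bin/hadoop dfs -ls /");
        channel.setErrStream(System.err);

        InputStream stdout = channel.getInputStream(); // fetch before connect()
        channel.connect();                             // starts the remote command

        BufferedReader reader = new BufferedReader(new InputStreamReader(stdout));
        String line;
        while ((line = reader.readLine()) != null) {
          System.out.println(line);
        }

        channel.disconnect();
        session.disconnect();
      }
    }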

+ 58 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/LaunchShortcut.java

@@ -0,0 +1,58 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.launch;
+
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.actions.RunOnHadoopActionDelegate;
+import org.eclipse.core.resources.IResource;
+import org.eclipse.debug.ui.ILaunchShortcut;
+import org.eclipse.jface.viewers.ISelection;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.jface.viewers.StructuredSelection;
+import org.eclipse.ui.IEditorPart;
+import org.eclipse.ui.actions.ActionDelegate;
+
+
+/**
+ * Add a shortcut "Run on Hadoop" to the Run menu
+ */
+
+public class LaunchShortcut implements ILaunchShortcut {
+  static Logger log = Logger.getLogger(LaunchShortcut.class.getName());
+
+  private ActionDelegate delegate = new RunOnHadoopActionDelegate();
+
+  public LaunchShortcut() {
+  }
+
+  public void launch(final ISelection selection, String mode) {
+    if (selection instanceof IStructuredSelection) {
+      delegate.selectionChanged(null, selection);
+      delegate.run(null);
+    }
+  }
+
+  public void launch(final IEditorPart editor, String mode) {
+    delegate.selectionChanged(null, new StructuredSelection(editor
+        .getEditorInput().getAdapter(IResource.class))); // hmm(jz)
+    // :-)
+    delegate.run(null);
+  }
+}
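
Note that the shortcut contributes no logic of its own: both launch() entry points forward the current selection to RunOnHadoopActionDelegate, so the Run-menu shortcut and the context-menu action share a single code path.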

+ 182 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/LocalMapReduceLaunchTabGroup.java

@@ -0,0 +1,182 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.launch;
+
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.debug.core.ILaunchConfiguration;
+import org.eclipse.debug.core.ILaunchConfigurationWorkingCopy;
+import org.eclipse.debug.ui.AbstractLaunchConfigurationTab;
+import org.eclipse.debug.ui.AbstractLaunchConfigurationTabGroup;
+import org.eclipse.debug.ui.CommonTab;
+import org.eclipse.debug.ui.ILaunchConfigurationDialog;
+import org.eclipse.debug.ui.ILaunchConfigurationTab;
+import org.eclipse.jdt.core.IType;
+import org.eclipse.jdt.core.JavaModelException;
+import org.eclipse.jdt.core.search.SearchEngine;
+import org.eclipse.jdt.debug.ui.launchConfigurations.JavaArgumentsTab;
+import org.eclipse.jdt.debug.ui.launchConfigurations.JavaClasspathTab;
+import org.eclipse.jdt.debug.ui.launchConfigurations.JavaJRETab;
+import org.eclipse.jdt.ui.IJavaElementSearchConstants;
+import org.eclipse.jdt.ui.JavaUI;
+import org.eclipse.jface.dialogs.ProgressMonitorDialog;
+import org.eclipse.jface.window.Window;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Button;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.Event;
+import org.eclipse.swt.widgets.Label;
+import org.eclipse.swt.widgets.Listener;
+import org.eclipse.swt.widgets.Text;
+import org.eclipse.ui.dialogs.SelectionDialog;
+
+/**
+ * 
+ * Handler for Local MapReduce job launches
+ * 
+ * TODO(jz) this may not be needed, as we almost always deploy to a remote
+ * server and not locally; where we do run locally, we may just be able to
+ * exec scripts without going through Java
+ * 
+ */
+public class LocalMapReduceLaunchTabGroup extends
+    AbstractLaunchConfigurationTabGroup {
+
+  public LocalMapReduceLaunchTabGroup() {
+    // TODO Auto-generated constructor stub
+  }
+
+  public void createTabs(ILaunchConfigurationDialog dialog, String mode) {
+    setTabs(new ILaunchConfigurationTab[] { new MapReduceLaunchTab(),
+        new JavaArgumentsTab(), new JavaJRETab(), new JavaClasspathTab(),
+        new CommonTab() });
+  }
+
+  public static class MapReduceLaunchTab extends AbstractLaunchConfigurationTab {
+    private Text combinerClass;
+
+    private Text reducerClass;
+
+    private Text mapperClass;
+
+    @Override
+    public boolean canSave() {
+      return true;
+    }
+
+    @Override
+    public boolean isValid(ILaunchConfiguration launchConfig) {
+      // TODO: return true only if all classes are of the proper types
+      return true;
+    }
+
+    public void createControl(final Composite parent) {
+      Composite panel = new Composite(parent, SWT.NONE);
+      GridLayout layout = new GridLayout(3, false);
+      panel.setLayout(layout);
+
+      Label mapperLabel = new Label(panel, SWT.NONE);
+      mapperLabel.setText("Mapper");
+      mapperClass = new Text(panel, SWT.SINGLE | SWT.BORDER);
+      createRow(parent, panel, mapperClass);
+
+      Label reducerLabel = new Label(panel, SWT.NONE);
+      reducerLabel.setText("Reducer");
+      reducerClass = new Text(panel, SWT.SINGLE | SWT.BORDER);
+      createRow(parent, panel, reducerClass);
+
+      Label combinerLabel = new Label(panel, SWT.NONE);
+      combinerLabel.setText("Combiner");
+      combinerClass = new Text(panel, SWT.SINGLE | SWT.BORDER);
+      createRow(parent, panel, combinerClass);
+
+      panel.pack();
+      setControl(panel);
+    }
+
+    private void createRow(final Composite parent, Composite panel,
+        final Text text) {
+      text.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+      Button button = new Button(panel, SWT.PUSH);
+      button.setText("Browse...");
+      button.addListener(SWT.Selection, new Listener() {
+        public void handleEvent(Event arg0) {
+          try {
+            SelectionDialog dialog = JavaUI.createTypeDialog(parent.getShell(),
+                new ProgressMonitorDialog(parent.getShell()), SearchEngine
+                    .createWorkspaceScope(),
+                IJavaElementSearchConstants.CONSIDER_CLASSES, false);
+            dialog.setMessage("Select Mapper type (implementing )");
+            dialog.setBlockOnOpen(true);
+            dialog.setTitle("Select Mapper Type");
+            dialog.open();
+
+            if ((dialog.getReturnCode() == Window.OK)
+                && (dialog.getResult().length > 0)) {
+              IType type = (IType) dialog.getResult()[0];
+              text.setText(type.getFullyQualifiedName());
+              setDirty(true);
+            }
+          } catch (JavaModelException e) {
+            // TODO Auto-generated catch block
+            e.printStackTrace();
+          }
+        }
+      });
+    }
+
+    public String getName() {
+      return "Hadoop";
+    }
+
+    public void initializeFrom(ILaunchConfiguration configuration) {
+      try {
+        mapperClass.setText(configuration.getAttribute(
+            "org.apache.hadoop.eclipse.launch.mapper", ""));
+        reducerClass.setText(configuration.getAttribute(
+            "org.apache.hadoop.eclipse.launch.reducer", ""));
+        combinerClass.setText(configuration.getAttribute(
+            "org.apache.hadoop.eclipse.launch.combiner", ""));
+      } catch (CoreException e) {
+        // TODO Auto-generated catch block
+        e.printStackTrace();
+        setErrorMessage(e.getMessage());
+      }
+    }
+
+    public void performApply(ILaunchConfigurationWorkingCopy configuration) {
+      configuration.setAttribute("org.apache.hadoop.eclipse.launch.mapper",
+          mapperClass.getText());
+      configuration.setAttribute(
+          "org.apache.hadoop.eclipse.launch.reducer", reducerClass
+              .getText());
+      configuration.setAttribute(
+          "org.apache.hadoop.eclipse.launch.combiner", combinerClass
+              .getText());
+    }
+
+    public void setDefaults(ILaunchConfigurationWorkingCopy configuration) {
+
+    }
+  }
+}
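
The tab persists its three class names as string attributes on the launch configuration. A launch delegate would read them back the same way; a minimal sketch (the delegate class and its body are illustrative, only the attribute keys come from the code above):

    import org.eclipse.core.runtime.CoreException;
    import org.eclipse.core.runtime.IProgressMonitor;
    import org.eclipse.debug.core.ILaunch;
    import org.eclipse.debug.core.ILaunchConfiguration;
    import org.eclipse.debug.core.model.ILaunchConfigurationDelegate;

    public class LocalLaunchSketch implements ILaunchConfigurationDelegate {
      public void launch(ILaunchConfiguration config, String mode,
          ILaunch launch, IProgressMonitor monitor) throws CoreException {
        // Read back the attributes stored by MapReduceLaunchTab.performApply().
        String mapper =
            config.getAttribute("org.apache.hadoop.eclipse.launch.mapper", "");
        String reducer =
            config.getAttribute("org.apache.hadoop.eclipse.launch.reducer", "");
        String combiner =
            config.getAttribute("org.apache.hadoop.eclipse.launch.combiner", "");
        // ... wire the class names into the job configuration here ...
      }
    }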

+ 37 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/MutexRule.java

@@ -0,0 +1,37 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.launch;
+
+import org.eclipse.core.runtime.jobs.ISchedulingRule;
+
+public class MutexRule implements ISchedulingRule {
+  private final String id;
+
+  public MutexRule(String id) {
+    this.id = id;
+  }
+
+  public boolean contains(ISchedulingRule rule) {
+    return (rule instanceof MutexRule) && ((MutexRule) rule).id.equals(id);
+  }
+
+  public boolean isConflicting(ISchedulingRule rule) {
+    return (rule instanceof MutexRule) && ((MutexRule) rule).id.equals(id);
+  }
+}
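
MutexRule is the standard Eclipse idiom for serializing background Jobs: two rules with the same id both contain and conflict with each other, so the platform never runs their jobs concurrently. A usage sketch; the job name and rule id are illustrative:

    import org.apache.hadoop.eclipse.launch.MutexRule;
    import org.eclipse.core.runtime.IProgressMonitor;
    import org.eclipse.core.runtime.IStatus;
    import org.eclipse.core.runtime.Status;
    import org.eclipse.core.runtime.jobs.Job;

    public class MutexRuleSketch {
      public static void scheduleSerialized() {
        Job upload = new Job("Upload jar to server") {
          @Override
          protected IStatus run(IProgressMonitor monitor) {
            // ... perform the upload ...
            return Status.OK_STATUS;
          }
        };
        // Jobs scheduled with equal rules run one at a time.
        upload.setRule(new MutexRule("hadoop.server.myserver"));
        upload.schedule();
      }
    }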

+ 264 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/SWTUserInfo.java

@@ -0,0 +1,264 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.launch;
+
+import org.eclipse.jface.dialogs.Dialog;
+import org.eclipse.jface.dialogs.MessageDialog;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.Control;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.swt.widgets.Label;
+import org.eclipse.swt.widgets.Shell;
+import org.eclipse.swt.widgets.Text;
+
+import com.jcraft.jsch.UIKeyboardInteractive;
+import com.jcraft.jsch.UserInfo;
+
+/**
+ * JSch user-info callback that retains user login information
+ */
+public abstract class SWTUserInfo implements UserInfo, UIKeyboardInteractive {
+
+  public SWTUserInfo() {
+  }
+
+  public String getPassphrase() {
+    return this.getPassword();
+  }
+
+  public abstract String getPassword();
+
+  public abstract void setPassword(String pass);
+
+  public void setPassphrase(String pass) {
+    this.setPassword(pass);
+  }
+
+  public boolean promptPassphrase(final String arg0) {
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        SWTUserInfo.this.setPassword(new PasswordDialog(null).prompt(arg0));
+      }
+    });
+
+    return this.getPassword() != null;
+  }
+
+  public boolean promptPassword(final String arg0) {
+    // check if password is already set to prevent the second session from
+    // querying again -- eyhung
+    // how to prevent bad passwords?
+    if (this.getPassword() == null) {
+      Display.getDefault().syncExec(new Runnable() {
+        public void run() {
+          Shell parent = Display.getDefault().getActiveShell();
+          String password = new PasswordDialog(parent).prompt(arg0);
+          SWTUserInfo.this.setPassword(password);
+        }
+      });
+    }
+    return this.getPassword() != null;
+  }
+
+  private boolean result;
+
+  public boolean promptYesNo(final String arg0) {
+
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        result =
+            MessageDialog.openQuestion(
+                Display.getDefault().getActiveShell(),
+                "SSH Question Dialog", arg0);
+      }
+    });
+
+    return result;
+  }
+
+  public void showMessage(final String arg0) {
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        MessageDialog.openInformation(null, "SSH Message", arg0);
+      }
+    });
+  }
+
+  private String[] interactiveAnswers;
+
+  /** {@inheritDoc} */
+  public String[] promptKeyboardInteractive(final String destination,
+      final String name, final String instruction, final String[] prompt,
+      final boolean[] echo) {
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        Shell parent = Display.getDefault().getActiveShell();
+        interactiveAnswers =
+            new KeyboardInteractiveDialog(parent).prompt(destination, name,
+                instruction, prompt, echo);
+      }
+    });
+    return interactiveAnswers;
+  }
+
+  /**
+   * Simple password prompting dialog
+   */
+  public static class PasswordDialog extends Dialog {
+    private Text text;
+
+    private String password;
+
+    private Label title;
+
+    private String message;
+
+    protected PasswordDialog(Shell parentShell) {
+      super(parentShell);
+    }
+
+    public String prompt(String message) {
+      this.setBlockOnOpen(true);
+      this.message = message;
+
+      if (this.open() == OK) {
+        return password;
+      } else {
+        return null;
+      }
+    }
+
+    @Override
+    protected void okPressed() {
+      this.password = text.getText();
+      super.okPressed();
+    }
+
+    @Override
+    protected Control createDialogArea(Composite parent) {
+      Composite panel = (Composite) super.createDialogArea(parent);
+      panel.setLayout(new GridLayout(2, false));
+      panel.setLayoutData(new GridData(GridData.FILL_BOTH));
+
+      title = new Label(panel, SWT.NONE);
+      GridData span2 = new GridData(GridData.FILL_HORIZONTAL);
+      span2.horizontalSpan = 2;
+      title.setLayoutData(span2);
+      title.setText(message);
+
+      getShell().setText(message);
+
+      Label label = new Label(panel, SWT.NONE);
+      label.setText("password");
+
+      text = new Text(panel, SWT.BORDER | SWT.SINGLE);
+      GridData data = new GridData(GridData.FILL_HORIZONTAL);
+      data.grabExcessHorizontalSpace = true;
+      text.setLayoutData(data);
+      text.setEchoChar('*');
+
+      return panel;
+    }
+  }
+
+  /**
+   * Keyboard interactive prompting dialog
+   */
+  public static class KeyboardInteractiveDialog extends Dialog {
+
+    private String destination;
+
+    private String name;
+
+    private String instruction;
+
+    private String[] prompt;
+
+    private boolean[] echo;
+
+    private Text[] text;
+
+    private String[] answer;
+
+    protected KeyboardInteractiveDialog(Shell parentShell) {
+      super(parentShell);
+    }
+
+    public String[] prompt(String destination, String name,
+        String instruction, String[] prompt, boolean[] echo) {
+
+      this.destination = destination;
+      this.name = name;
+      this.instruction = instruction;
+      this.prompt = prompt;
+      this.echo = echo;
+
+      this.setBlockOnOpen(true);
+
+      if (this.open() == OK)
+        return answer;
+      else
+        return null;
+    }
+
+    @Override
+    protected void okPressed() {
+      answer = new String[text.length];
+      for (int i = 0; i < text.length; ++i) {
+        answer[i] = text[i].getText();
+      }
+      super.okPressed();
+    }
+
+    @Override
+    protected Control createDialogArea(Composite parent) {
+      Composite panel = (Composite) super.createDialogArea(parent);
+      panel.setLayout(new GridLayout(2, false));
+      panel.setLayoutData(new GridData(GridData.FILL_BOTH));
+
+      Label title = new Label(panel, SWT.NONE);
+      GridData span2 = new GridData(GridData.FILL_HORIZONTAL);
+      span2.horizontalSpan = 2;
+      title.setLayoutData(span2);
+      title.setText(destination + ": " + name);
+
+      getShell().setText(instruction);
+
+      text = new Text[prompt.length];
+
+      for (int i = 0; i < text.length; ++i) {
+        Label label = new Label(panel, SWT.NONE);
+        label.setText("password");
+
+        text[i] = new Text(panel, SWT.BORDER | SWT.SINGLE);
+        GridData data = new GridData(GridData.FILL_HORIZONTAL);
+        data.grabExcessHorizontalSpace = true;
+        text[i].setLayoutData(data);
+        if (!echo[i])
+          text[i].setEchoChar('*');
+      }
+
+      return panel;
+    }
+  }
+
+}
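
HadoopServer, later in this patch, supplies the concrete getPassword()/setPassword() pair when it installs an SWTUserInfo on a JSch session, so the dialogs above only appear the first time JSch actually needs a credential.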

+ 47 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/StartHadoopLaunchTabGroup.java

@@ -0,0 +1,47 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.launch;
+
+import org.eclipse.debug.ui.AbstractLaunchConfigurationTabGroup;
+import org.eclipse.debug.ui.CommonTab;
+import org.eclipse.debug.ui.ILaunchConfigurationDialog;
+import org.eclipse.debug.ui.ILaunchConfigurationTab;
+import org.eclipse.jdt.debug.ui.launchConfigurations.JavaArgumentsTab;
+import org.eclipse.jdt.debug.ui.launchConfigurations.JavaClasspathTab;
+import org.eclipse.jdt.debug.ui.launchConfigurations.JavaJRETab;
+
+/**
+ * Create the tab group for the dialog window for starting a Hadoop job.
+ */
+
+public class StartHadoopLaunchTabGroup extends
+    AbstractLaunchConfigurationTabGroup {
+
+  public StartHadoopLaunchTabGroup() {
+    // TODO Auto-generated constructor stub
+  }
+
+  /**
+   * TODO(jz) consider the appropriate tabs for this case
+   */
+  public void createTabs(ILaunchConfigurationDialog dialog, String mode) {
+    setTabs(new ILaunchConfigurationTab[] { new JavaArgumentsTab(),
+        new JavaJRETab(), new JavaClasspathTab(), new CommonTab() });
+  }
+}

+ 373 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/StartMapReduceServer.java

@@ -0,0 +1,373 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.launch;
+
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.OutputStream;
+import java.rmi.dgc.VMID;
+import java.util.Map;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.apache.hadoop.eclipse.server.JarModule;
+import org.apache.hadoop.eclipse.servers.ServerRegistry;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Status;
+import org.eclipse.core.runtime.SubProgressMonitor;
+import org.eclipse.debug.core.ILaunch;
+import org.eclipse.debug.core.ILaunchConfiguration;
+import org.eclipse.debug.core.model.ILaunchConfigurationDelegate;
+import org.eclipse.jface.dialogs.MessageDialog;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.graphics.Color;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.ui.console.ConsolePlugin;
+import org.eclipse.ui.console.IConsole;
+import org.eclipse.ui.console.IOConsoleOutputStream;
+import org.eclipse.ui.console.MessageConsole;
+import org.eclipse.ui.console.MessageConsoleStream;
+
+import com.jcraft.jsch.Channel;
+import com.jcraft.jsch.ChannelExec;
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.Session;
+
+/**
+ * Transfer a jar file and run it on the specified MapReduce server.
+ */
+
+public class StartMapReduceServer implements ILaunchConfigurationDelegate {
+
+  private static final Logger log = Logger.getLogger(StartMapReduceServer.class
+      .getName());
+
+  private static final int SSH_FAILED_CODE = 999;
+
+  private static final IStatus SSH_FAILED_STATUS1 = new Status(IStatus.ERROR,
+      Activator.PLUGIN_ID, SSH_FAILED_CODE,
+      "SSH Connection to hadoop server failed", null);
+
+  private static final IStatus SSH_FAILED_STATUS2 = new Status(IStatus.ERROR,
+      Activator.PLUGIN_ID, SSH_FAILED_CODE,
+      "SSH Connection to start SCP failed", null);
+
+  private static final IStatus SSH_FAILED_STATUS3 = new Status(IStatus.ERROR,
+      Activator.PLUGIN_ID, SSH_FAILED_CODE,
+      "SCP Connection to hadoop server failed", null);
+
+  private static final int TIMEOUT = 15000;
+
+  private Color black;
+
+  private Color red;
+
+  public StartMapReduceServer() {
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        black = Display.getDefault().getSystemColor(SWT.COLOR_BLACK);
+        red = Display.getDefault().getSystemColor(SWT.COLOR_RED);
+      }
+    });
+  }
+
+  static int checkAck(InputStream in) throws IOException {
+    int b = in.read();
+    // b may be 0 for success,
+    // 1 for error,
+    // 2 for fatal error,
+    // -1 for end of stream
+    if (b == 0) {
+      return b;
+    }
+    if (b == -1) {
+      log.info("checkAck returned -1");
+      return b;
+    }
+
+    if ((b == 1) || (b == 2)) {
+      StringBuffer sb = new StringBuffer();
+      int c;
+      do {
+        c = in.read();
+        sb.append((char) c);
+      } while (c != '\n');
+
+      if (b == 1) { // error
+        System.out.print(sb.toString());
+      }
+      if (b == 2) { // fatal error
+        System.out.print(sb.toString());
+      }
+    }
+    return b;
+  }
+
+  /**
+   * Send the file and launch the hadoop job.
+   */
+  public void launch(ILaunchConfiguration configuration, String mode,
+      ILaunch launch, IProgressMonitor monitor) throws CoreException {
+    Map attributes = configuration.getAttributes();
+
+    log.log(Level.FINE, "Preparing hadoop launch", configuration);
+
+    String hostname = configuration.getAttribute("hadoop.host", "");
+    int serverid = configuration.getAttribute("hadoop.serverid", 0);
+    String user = configuration.getAttribute("hadoop.user", "");
+    String path = configuration.getAttribute("hadoop.path", "");
+
+    String dir = ensureTrailingSlash(path);
+
+    log.log(Level.FINER, "Computed Server URL", new Object[] { dir, user,
+        hostname });
+
+    HadoopServer server = ServerRegistry.getInstance().getServer(serverid);
+
+    try {
+      Session session = server.createSession();
+      // session.setTimeout(TIMEOUT);
+
+      log.log(Level.FINER, "Connected");
+
+      /*
+       * COMMENTED(jz) removing server start/stop support for now if (!
+       * attributes.containsKey("hadoop.jar")) { // start or stop server if(
+       * server.getServerState() == IServer.STATE_STARTING ) { String command =
+       * dir + "bin/start-all.sh"; execInConsole(session, command); } else if(
+       * server.getServerState() == IServer.STATE_STOPPING ) { String command =
+       * dir + "bin/stop-all.sh"; execInConsole(session, command); } }
+       */
+
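+      // if (false): placeholder left by the start/stop support commented out above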
+      if (false) {
+      } else {
+        FileInputStream fis = null;
+        String jarFile, remoteFile = null;
+
+        if (attributes.containsKey("hadoop.jar")) {
+          jarFile = (String) attributes.get("hadoop.jar");
+        } else {
+          String memento = (String) attributes.get("hadoop.jarrable");
+          JarModule fromMemento = JarModule.fromMemento(memento);
+          jarFile = fromMemento.buildJar(new SubProgressMonitor(monitor, 100))
+              .toString();
+        }
+
+        if (jarFile.lastIndexOf('/') > 0) {
+          remoteFile = jarFile.substring(jarFile.lastIndexOf('/') + 1);
+        } else if (jarFile.lastIndexOf('\\') > 0) {
+          remoteFile = jarFile.substring(jarFile.lastIndexOf('\\') + 1);
+        }
+
+        // exec 'scp -p -t <remoteFile>' remotely
+
+        String command = "scp -p -t " + remoteFile;
+        Channel channel = session.openChannel("exec");
+        ((ChannelExec) channel).setCommand(command);
+
+        // get I/O streams for remote scp
+        OutputStream out = channel.getOutputStream();
+        final InputStream in = channel.getInputStream();
+
+        channel.connect();
+
+        if (checkAck(in) != 0) {
+          throw new CoreException(SSH_FAILED_STATUS1);
+        }
+
+        // send "C0644 filesize filename", where filename should not
+        // include '/'
+        long filesize = (new File(jarFile)).length();
+        command = "C0644 " + filesize + " ";
+        if (jarFile.lastIndexOf('/') > 0) {
+          command += jarFile.substring(jarFile.lastIndexOf('/') + 1);
+        } else {
+          command += jarFile;
+        }
+
+        command += "\n";
+        out.write(command.getBytes());
+        out.flush();
+        if (checkAck(in) != 0) {
+          throw new CoreException(SSH_FAILED_STATUS2);
+        }
+
+        // send the content of jarFile
+        fis = new FileInputStream(jarFile);
+        byte[] buf = new byte[1024];
+        while (true) {
+          int len = fis.read(buf, 0, buf.length);
+          if (len <= 0) {
+            break;
+          }
+          out.write(buf, 0, len); // out.flush();
+        }
+
+        fis.close();
+        fis = null;
+        // send '\0'
+        buf[0] = 0;
+        out.write(buf, 0, 1);
+        out.flush();
+        if (checkAck(in) != 0) {
+          throw new CoreException(SSH_FAILED_STATUS3);
+        }
+        out.close();
+        channel.disconnect();
+
+        // move the jar file to a temp directory
+        String jarDir = "/tmp/hadoopjar"
+            + new VMID().toString().replace(':', '_');
+        command = "mkdir " + jarDir + ";mv " + remoteFile + " " + jarDir;
+        channel = session.openChannel("exec");
+        ((ChannelExec) channel).setCommand(command);
+        channel.connect();
+        channel.disconnect();
+
+        session.disconnect();
+
+        // we create a new session with a zero timeout to prevent the
+        // console stream
+        // from stalling -- eyhung
+        final Session session2 = server.createSessionNoTimeout();
+
+        // now remotely execute hadoop with the just sent-over jarfile
+        command = dir + "bin/hadoop jar " + jarDir + "/" + remoteFile;
+        log.fine("Running command: " + command);
+        execInConsole(session2, command, jarDir + "/" + remoteFile);
+
+        // the jar file is not deleted anymore, but placed in a temp dir
+        // -- eyhung
+      }
+    } catch (final JSchException e) {
+      e.printStackTrace();
+      Display.getDefault().syncExec(new Runnable() {
+        public void run() {
+          MessageDialog.openError(Display.getDefault().getActiveShell(),
+              "Problems connecting to MapReduce Server", 
+              e.getLocalizedMessage());
+        }
+      });
+    } catch (IOException e) {
+      // TODO Auto-generated catch block
+      e.printStackTrace();
+    }
+  }
+
+  /**
+   * Show the job output in the console.
+   * @param session The SSH session object
+   * @param command The command to run remotely
+   * @param jarFile The jar file containing the classes for the Hadoop job
+   * @throws JSchException
+   */
+  private void execInConsole(final Session session, final String command,
+      final String jarFile) throws JSchException {
+    final ChannelExec channel = (ChannelExec) session.openChannel("exec");
+
+    final MessageConsole console = new MessageConsole("Hadoop: " + command,
+        null);
+    final MessageConsoleStream stream = console.newMessageStream();
+
+    final IOConsoleOutputStream out = console.newOutputStream();
+    final IOConsoleOutputStream err = console.newOutputStream();
+
+    out.setColor(black);
+    err.setColor(red);
+
+    ConsolePlugin.getDefault().getConsoleManager().addConsoles(
+        new IConsole[] { console });
+    ConsolePlugin.getDefault().getConsoleManager().showConsoleView(console);
+
+    channel.setCommand(command);
+    channel.setInputStream(null);
+
+    channel.connect();
+    new Thread() {
+      @Override
+      public void run() {
+        try {
+
+          BufferedReader hadoopOutput = new BufferedReader(
+              new InputStreamReader(channel.getInputStream()));
+
+          String stdoutLine;
+          while ((stdoutLine = hadoopOutput.readLine()) != null) {
+            out.write(stdoutLine);
+            out.write('\n');
+          }
+
+          channel.disconnect();
+
+          // meaningless call meant to prevent console from being
+          // garbage collected -- eyhung
+          console.getName();
+          ChannelExec channel2 = (ChannelExec) session.openChannel("exec");
+          channel2.setCommand("rm -rf "
+              + jarFile.substring(0, jarFile.lastIndexOf("/")));
+          log.fine("Removing temp file "
+              + jarFile.substring(0, jarFile.lastIndexOf("/")));
+          channel2.connect();
+          channel2.disconnect();
+
+        } catch (Exception e) {
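+          // ignored: output pumping is best-effort; on failure the
+          // console simply stops updating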
+        }
+      }
+    }.start();
+
+    new Thread() {
+      @Override
+      public void run() {
+        try {
+
+          BufferedReader hadoopErr = new BufferedReader(new InputStreamReader(
+              channel.getErrStream()));
+
+          String stderrLine;
+          while ((stderrLine = hadoopErr.readLine()) != null) {
+            err.write(stderrLine);
+            err.write('\n');
+          }
+
+        } catch (Exception e) {
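+          // ignored: stderr pumping is best-effort, like the stdout
+          // thread above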
+        }
+      }
+    }.start();
+
+  }
+
+  private String ensureTrailingSlash(String dir) {
+    if (!dir.endsWith("/")) {
+      dir += "/";
+    }
+    return dir;
+  }
+
+}
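
For orientation, the transfer above is the classic scp "sink" protocol: open "scp -p -t <file>" on the remote side, wait for a zero ack byte, send the header "C0644 <filesize> <filename>\n", wait for an ack, stream the file bytes, terminate with a single zero byte, and wait for a final ack. checkAck() decodes the ack convention: 0 is success, 1 an error, 2 a fatal error, with errors followed by a newline-terminated message.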

+ 63 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/preferences/HadoopHomeDirPreferencePage.java

@@ -0,0 +1,63 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.eclipse.preferences;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.eclipse.jface.preference.DirectoryFieldEditor;
+import org.eclipse.jface.preference.FieldEditorPreferencePage;
+import org.eclipse.ui.IWorkbench;
+import org.eclipse.ui.IWorkbenchPreferencePage;
+
+/**
+ * This class represents a preference page that is contributed to the
+ * Preferences dialog. By subclassing <samp>FieldEditorPreferencePage</samp>,
+ * we can use the field support built into JFace that allows us to create a
+ * page that is small and knows how to save, restore and apply itself.
+ * <p>
+ * This page is used to modify preferences only. They are stored in the
+ * preference store that belongs to the main plug-in class. That way,
+ * preferences can be accessed directly via the preference store.
+ */
+
+public class HadoopHomeDirPreferencePage extends FieldEditorPreferencePage
+    implements IWorkbenchPreferencePage {
+
+  public HadoopHomeDirPreferencePage() {
+    super(GRID);
+    setPreferenceStore(Activator.getDefault().getPreferenceStore());
+    setTitle("MapReduce Tools");
+    setDescription("MapReduce Preferences");
+  }
+
+  /**
+   * Creates the field editors. Field editors are abstractions of the common
+   * GUI blocks needed to manipulate various types of preferences. Each field
+   * editor knows how to save and restore itself.
+   */
+  @Override
+  public void createFieldEditors() {
+    addField(new DirectoryFieldEditor(PreferenceConstants.P_PATH,
+        "&Hadoop main directory:", getFieldEditorParent()));
+
+  }
+
+  /** {@inheritDoc} */
+  public void init(IWorkbench workbench) {
+  }
+
+}

+ 34 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/preferences/PreferenceConstants.java

@@ -0,0 +1,34 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.preferences;
+
+/**
+ * Constant definitions for plug-in preferences
+ */
+public class PreferenceConstants {
+
+  public static final String P_PATH = "pathPreference";
+
+  // public static final String P_BOOLEAN = "booleanPreference";
+  //
+  // public static final String P_CHOICE = "choicePreference";
+  //
+  // public static final String P_STRING = "stringPreference";
+  //	
+}
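
Code elsewhere in the plugin reads the stored path back through the plug-in's preference store; a minimal sketch (the helper class is illustrative):

    import org.apache.hadoop.eclipse.Activator;
    import org.apache.hadoop.eclipse.preferences.PreferenceConstants;

    public class PreferenceSketch {
      /** Returns the configured Hadoop main directory, or "" if unset. */
      public static String hadoopHomeDir() {
        return Activator.getDefault().getPreferenceStore()
            .getString(PreferenceConstants.P_PATH);
      }
    }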

+ 33 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/preferences/PreferenceInitializer.java

@@ -0,0 +1,33 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.preferences;
+
+import org.eclipse.core.runtime.preferences.AbstractPreferenceInitializer;
+
+/**
+ * Class used to initialize default preference values.
+ */
+public class PreferenceInitializer extends AbstractPreferenceInitializer {
+
+  /** {@inheritDoc} */
+  @Override
+  public void initializeDefaultPreferences() {
+  }
+
+}
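
The initializer body is empty in this version; a typical implementation would seed defaults into the same store. A sketch, not part of the patch:

    import org.apache.hadoop.eclipse.Activator;
    import org.apache.hadoop.eclipse.preferences.PreferenceConstants;
    import org.eclipse.core.runtime.preferences.AbstractPreferenceInitializer;
    import org.eclipse.jface.preference.IPreferenceStore;

    public class PreferenceInitializerSketch extends AbstractPreferenceInitializer {
      @Override
      public void initializeDefaultPreferences() {
        IPreferenceStore store = Activator.getDefault().getPreferenceStore();
        // Default to an empty path until the user configures one.
        store.setDefault(PreferenceConstants.P_PATH, "");
      }
    }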

+ 100 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopJob.java

@@ -0,0 +1,100 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.server;
+
+/**
+ * Helper class to pretty-print the status of a Hadoop job running on a
+ * MapReduce server.
+ */
+
+public class HadoopJob {
+  String name;
+  
+  /**
+   * Hadoop Job Id (useful to kill the job)
+   */
+  String jobId;
+
+  boolean completed;
+
+  String totalMaps;
+
+  String totalReduces;
+
+  String completedMaps;
+
+  String completedReduces;
+
+  String mapPercentage;
+
+  String reducePercentage;
+
+  private HadoopServer server;
+
+  public HadoopJob(HadoopServer server) {
+    this.server = server;
+  }
+
+  public void print() {
+    System.out.println("Job name = " + name);
+    System.out.println("Job id = " + jobId);
+    System.out.println("Job total maps = " + totalMaps);
+    System.out.println("Job completed maps = " + completedMaps);
+    System.out.println("Map percentage complete = " + mapPercentage);
+    System.out.println("Job total reduces = " + totalReduces);
+    System.out.println("Job completed reduces = " + completedReduces);
+    System.out.println("Reduce percentage complete = " + reducePercentage);
+    System.out.flush();
+  }
+
+  public String getId() {
+    return this.name;
+  }
+  
+  public String getJobId() {
+    return this.jobId;
+  }
+  
+  public boolean isCompleted() {
+    return this.completed;
+  }
+
+  @Override
+  public boolean equals(Object o) {
+    return (o instanceof HadoopJob) && ((HadoopJob) o).name.equals(name);
+  }
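+
+  /** hashCode kept consistent with equals(): both key on the job name. */
+  @Override
+  public int hashCode() {
+    return name.hashCode();
+  }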
+
+  public String getState() {
+    return (!completed) ? "Running" : "Completed";
+  }
+
+  public String getStatus() {
+    StringBuffer s = new StringBuffer();
+
+    s.append("Maps : " + completedMaps + "/" + totalMaps);
+    s.append(" (" + mapPercentage + ")");
+    s.append("  Reduces : " + completedReduces + "/" + totalReduces);
+    s.append(" (" + reducePercentage + ")");
+
+    return s.toString();
+  }
+
+  public HadoopServer getServer() {
+    return this.server;
+  }
+}

+ 124 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopPathPage.java

@@ -0,0 +1,124 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.server;
+
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.swt.graphics.Image;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.ui.IEditorInput;
+import org.eclipse.ui.IEditorPart;
+import org.eclipse.ui.IEditorSite;
+import org.eclipse.ui.IPropertyListener;
+import org.eclipse.ui.IWorkbenchPartSite;
+import org.eclipse.ui.PartInitException;
+
+public class HadoopPathPage implements IEditorPart {
+
+  public IEditorInput getEditorInput() {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public IEditorSite getEditorSite() {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public void init(IEditorSite site, IEditorInput input)
+      throws PartInitException {
+    // TODO Auto-generated method stub
+
+  }
+
+  public void addPropertyListener(IPropertyListener listener) {
+    // TODO Auto-generated method stub
+
+  }
+
+  public void createPartControl(Composite parent) {
+    // TODO Auto-generated method stub
+
+  }
+
+  public void dispose() {
+    // TODO Auto-generated method stub
+
+  }
+
+  public IWorkbenchPartSite getSite() {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public String getTitle() {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public Image getTitleImage() {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public String getTitleToolTip() {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public void removePropertyListener(IPropertyListener listener) {
+    // TODO Auto-generated method stub
+
+  }
+
+  public void setFocus() {
+    // TODO Auto-generated method stub
+
+  }
+
+  public Object getAdapter(Class adapter) {
+    // TODO Auto-generated method stub
+    return null;
+  }
+
+  public void doSave(IProgressMonitor monitor) {
+    // TODO Auto-generated method stub
+
+  }
+
+  public void doSaveAs() {
+    // TODO Auto-generated method stub
+
+  }
+
+  public boolean isDirty() {
+    // TODO Auto-generated method stub
+    return false;
+  }
+
+  public boolean isSaveAsAllowed() {
+    // TODO Auto-generated method stub
+    return false;
+  }
+
+  public boolean isSaveOnCloseNeeded() {
+    // TODO Auto-generated method stub
+    return false;
+  }
+
+}

+ 683 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopServer.java

@@ -0,0 +1,683 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.server;
+
+import java.io.BufferedInputStream;
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.StringReader;
+import java.net.HttpURLConnection;
+import java.net.MalformedURLException;
+import java.net.Socket;
+import java.net.URL;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.TreeMap;
+import java.util.Vector;
+import java.util.logging.Logger;
+
+import javax.net.SocketFactory;
+
+import org.apache.hadoop.eclipse.JSchUtilities;
+import org.apache.hadoop.eclipse.launch.SWTUserInfo;
+import org.apache.hadoop.eclipse.servers.ServerRegistry;
+import org.eclipse.core.runtime.CoreException;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.IStatus;
+import org.eclipse.core.runtime.Status;
+import org.eclipse.core.runtime.jobs.Job;
+import org.eclipse.debug.core.DebugPlugin;
+import org.eclipse.debug.core.ILaunchConfiguration;
+import org.eclipse.debug.core.ILaunchConfigurationType;
+import org.eclipse.debug.core.ILaunchConfigurationWorkingCopy;
+import org.eclipse.debug.core.model.ILaunchConfigurationDelegate;
+import org.eclipse.debug.ui.DebugUITools;
+import org.eclipse.swt.widgets.Display;
+
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.Session;
+
+/**
+ * Methods for defining and interacting with a Hadoop MapReduce server
+ */
+
+public class HadoopServer {
+
+  private static final int JOB_TRACKER_PORT = 50030;
+
+  private PingJob ping;
+
+  protected static final long PING_DELAY = 1500;
+
+  /**
+   * Location of Hadoop jars on the server
+   */
+  private String installPath;
+
+  /**
+   * User name to use to connect to the server
+   */
+  private String userName;
+
+  /**
+   * Host name of the hadoop server
+   */
+  private String hostName;
+
+  private String password;
+
+  // state and status - transient
+  private transient String state = "";
+
+  private transient Map<String, HadoopJob> jobs =
+      Collections.synchronizedMap(new TreeMap<String, HadoopJob>());
+
+  private transient List<JarModule> jars =
+      Collections.synchronizedList(new ArrayList<JarModule>());
+
+  /**
+   * User-defined name for the server (set from Eclipse)
+   */
+  private String name;
+
+  // the machine that we are tunneling through to get to the Hadoop server
+
+  /**
+   * Host name of the tunneling machine
+   */
+  private String tunnelHostName;
+
+  /**
+   * User name to use to connect to the tunneling machine
+   */
+  private String tunnelUserName;
+
+  private String tunnelPassword;
+
+  static Logger log = Logger.getLogger(HadoopServer.class.getName());
+
+  public HadoopServer(String uri, String name) {
+    this.name = name;
+
+    String[] hostInfo = uri.split(":");
+    String[] loginInfo = hostInfo[0].split("@");
+
+    installPath = hostInfo[1];
+    userName = loginInfo[0];
+    hostName = loginInfo[1];
+  }
+
+  public HadoopServer(String uri, String name, String tunnelVia,
+      String tunnelUserName) {
+    this(uri, name);
+    this.tunnelHostName = tunnelVia;
+    this.tunnelUserName = tunnelUserName;
+  }
+
+  /**
+   * Create an SSH session with no timeout
+   * 
+   * @return Session object with no timeout
+   * @throws JSchException
+   */
+  public Session createSessionNoTimeout() throws JSchException {
+    return createSession(0);
+  }
+
+  /**
+   * Create an SSH session with no timeout (currently equivalent to
+   * {@link #createSessionNoTimeout()})
+   * 
+   * @return Session object with no timeout
+   * @throws JSchException
+   */
+  public Session createSession() throws JSchException {
+    return createSession(0);
+  }
+
+  /**
+   * Creates an SSH session with a specified timeout
+   * 
+   * @param timeout the amount of time before the session expires
+   * @return Returns the created session object representing the SSH session.
+   * @throws JSchException
+   */
+  public Session createSession(int timeout) throws JSchException {
+    if (tunnelHostName == null) {
+      Session session =
+          JSchUtilities.createJSch().getSession(userName, hostName, 22);
+      session.setUserInfo(new SWTUserInfo() {
+        @Override
+        public String getPassword() {
+          return password;
+        }
+
+        @Override
+        public void setPassword(String pass) {
+          HadoopServer.this.password = pass;
+        }
+
+      });
+      if (!session.isConnected()) {
+        try {
+          session.connect();
+        } catch (JSchException jse) {
+          if (jse.getMessage().equals("Auth fail"))
+            this.password = null;
+          throw jse;
+        }
+      }
+
+      if (timeout > -1) {
+        // honor the requested timeout on the direct connection too,
+        // matching the tunneled branch below
+        session.setTimeout(timeout);
+      }
+      return session;
+    } else {
+      createSshTunnel();
+
+      Session session =
+          JSchUtilities.createJSch().getSession(userName, "localhost",
+              tunnelPort);
+      session.setUserInfo(new SWTUserInfo() {
+        @Override
+        public String getPassword() {
+          return HadoopServer.this.password;
+        }
+
+        @Override
+        public void setPassword(String pass) {
+          HadoopServer.this.password = pass;
+        }
+      });
+      if (!session.isConnected()) {
+        try {
+          session.connect();
+        } catch (JSchException jse) {
+          if (jse.getMessage().equals("Auth fail"))
+            this.password = null;
+          throw jse;
+        }
+      }
+      if (timeout > -1) {
+        session.setTimeout(timeout);
+      }
+      return session;
+    }
+  }
+
+  private Session createTunnel(int port) throws JSchException {
+    Session tunnel;
+
+    tunnelPort = -1;
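+    // Probe for a free local port: bind an ephemeral socket, record its
+    // port, and close it again (up to five attempts).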
+    for (int i = 0; !((i > 4) || (tunnelPort > -1)); i++) {
+      try {
+        Socket socket = SocketFactory.getDefault().createSocket();
+        socket.bind(null);
+        tunnelPort = socket.getLocalPort();
+        socket.close();
+      } catch (IOException e) {
+        // ignore, retry
+      }
+    }
+
+    if (tunnelPort == -1) {
+      throw new JSchException("No free local port found to bound to");
+    }
+
+    tunnel =
+        JSchUtilities.createJSch().getSession(tunnelUserName,
+            tunnelHostName, 22);
+    tunnel.setTimeout(0);
+    tunnel.setPortForwardingL(tunnelPort, hostName, port);
+    tunnel.setUserInfo(new SWTUserInfo() {
+      @Override
+      public String getPassword() {
+        return tunnelPassword;
+      }
+
+      @Override
+      public void setPassword(String password) {
+        tunnelPassword = password;
+      }
+    });
+    try {
+      tunnel.connect();
+    } catch (JSchException jse) {
+      if (jse.getMessage().equals("Auth fail"))
+        this.tunnelPassword = null;
+      throw jse;
+    }
+
+    return tunnel;
+  }
+
+  private void createSshTunnel() throws JSchException {
+    if ((sshTunnel != null) && sshTunnel.isConnected()) {
+      sshTunnel.disconnect();
+    }
+
+    sshTunnel = createTunnel(22);
+  }
+
+  private void createHttpTunnel(int port) throws JSchException {
+    if ((httpTunnel == null) || !httpTunnel.isConnected()) {
+      httpTunnel = createTunnel(port);
+    }
+  }
+
+  public String getHostName() {
+    if ((tunnelHostName != null) && (tunnelHostName.length() > 0)) {
+      return "localhost";
+    }
+
+    return hostName;
+  }
+
+  public void setHostname(String hostname) {
+    this.hostName = hostname;
+  }
+
+  /**
+   * Gets the path where the hadoop jars are stored.
+   * 
+   * @return String containing the path to the hadoop jars.
+   */
+  public String getInstallPath() {
+    return installPath;
+  }
+
+  /**
+   * Sets the path where the hadoop jars are stored.
+   * 
+   * @param path The directory where the hadoop jars are stored.
+   */
+  public void setPath(String path) {
+    this.installPath = path;
+  }
+
+  public String getUserName() {
+    return userName;
+  }
+
+  public void setUser(String user) {
+    this.userName = user;
+  }
+
+  public String getPassword() {
+    return password;
+  }
+
+  public void setPassword(String password) {
+    log.fine("Server password set to " + password);
+    this.password = password;
+  }
+
+  @Override
+  public String toString() {
+    return this.userName + "@" + this.hostName + ":" + this.installPath;
+  }
+
+  public String getName() {
+    return this.name;
+  }
+
+  /**
+   * Returns the URL for the Job Tracker (default is port 50030)
+   * 
+   * @return URL for the Job Tracker
+   * @throws MalformedURLException
+   */
+  public URL getJobTrackerUrl() throws MalformedURLException {
+    if (tunnelHostName == null) {
+      return new URL("http://" + getHostName() + ":" + JOB_TRACKER_PORT
+          + "/jobtracker.jsp");
+    } else {
+      try {
+        createHttpTunnel(JOB_TRACKER_PORT);
+
+        String port = httpTunnel.getPortForwardingL()[0].split(":")[0];
+        return new URL("http://localhost:" + port + "/jobtracker.jsp");
+      } catch (JSchException e) {
+        // BUG(jz) -- need to display error here
+        return null;
+      }
+    }
+  }
+
+  public String getState() {
+    return state;
+  }
+
+  public Object[] getChildren() {
+    /*
+     * List all elements that should be present in the Server window (all
+     * servers and all jobs running on each server)
+     */
+    checkPingJobRunning();
+    Collection<Object> collection =
+        new ArrayList<Object>(this.jobs.values());
+    collection.addAll(jars);
+    return collection.toArray();
+  }
+
+  private synchronized void checkPingJobRunning() {
+    if (ping == null) {
+      ping = new PingJob();
+      ping.setSystem(true);
+      ping.schedule();
+    }
+  }
+
+  private HashSet<IJobListener> jobListeners = new HashSet<IJobListener>();
+
+  private Session sshTunnel;
+
+  private Session httpTunnel;
+
+  private int tunnelPort;
+
+  private int id;
+
+  public void addJobListener(IJobListener l) {
+    jobListeners.add(l);
+  }
+
+  protected void fireJobChanged(HadoopJob job) {
+    for (IJobListener listener : jobListeners) {
+      listener.jobChanged(job);
+    }
+  }
+
+  protected void fireJobAdded(HadoopJob job) {
+    for (IJobListener listener : jobListeners) {
+      listener.jobAdded(job);
+    }
+  }
+
+  protected void fireJarPublishStart(JarModule jar) {
+    for (IJobListener listener : jobListeners) {
+      listener.publishStart(jar);
+    }
+  }
+
+  protected void fireJarPublishDone(JarModule jar) {
+    for (IJobListener listener : jobListeners) {
+      listener.publishDone(jar);
+    }
+  }
+
+  public void runJar(JarModule jar, IProgressMonitor monitor) {
+    log.fine("Run Jar: " + jar);
+    ILaunchConfigurationType launchConfigType =
+        DebugPlugin.getDefault().getLaunchManager()
+            .getLaunchConfigurationType(
+                "org.apache.hadoop.eclipse.launch.StartServer");
+
+    jars.add(jar);
+    fireJarPublishStart(jar);
+
+    try {
+      ILaunchConfiguration[] matchingConfigs =
+          DebugPlugin.getDefault().getLaunchManager()
+              .getLaunchConfigurations(launchConfigType);
+      ILaunchConfiguration launchConfig = null;
+
+      // TODO(jz) allow choosing the correct config; for now use the
+      // single existing one, or create a new one
+      if (matchingConfigs.length == 1) {
+        launchConfig = matchingConfigs[0];
+      } else {
+        launchConfig =
+            launchConfigType
+                .newInstance(null, DebugPlugin.getDefault()
+                    .getLaunchManager()
+                    .generateUniqueLaunchConfigurationNameFrom(
+                        "Run Hadoop Jar"));
+      }
+
+      ILaunchConfigurationWorkingCopy copy =
+          launchConfig
+              .copy("Run " + jar.getName() + " on " + this.getName());
+
+      // COMMENTED(jz) - perform the jarring in the launch delegate now
+      // copy.setAttribute("hadoop.jar",
+      // jar.buildJar(monitor).toString());
+
+      copy.setAttribute("hadoop.jarrable", jar.toMemento());
+      copy.setAttribute("hadoop.host", this.getHostName());
+      copy.setAttribute("hadoop.user", this.getUserName());
+      copy.setAttribute("hadoop.serverid", this.id);
+      copy.setAttribute("hadoop.path", this.getInstallPath());
+      ILaunchConfiguration saved = copy.doSave();
+
+      // NOTE(jz) getDelegate(String) became deprecated in 3.3, replaced
+      // with the getDelegates (plural) method; as the new method is
+      // marked experimental, leaving as-is for now
+      ILaunchConfigurationDelegate delegate =
+          launchConfigType.getDelegate("run");
+      // only support run for now
+      DebugUITools.launch(saved, "run");
+    } catch (CoreException e) {
+      // TODO(jz) autogen
+      e.printStackTrace();
+    } finally {
+      jars.remove(jar);
+      fireJarPublishDone(jar);
+    }
+  }
+
+  public class PingJob extends Job {
+    public PingJob() {
+      super("Get MapReduce server status");
+    }
+
+    @Override
+    protected IStatus run(IProgressMonitor monitor) {
+      HttpURLConnection connection = null;
+
+      try {
+        connection = (HttpURLConnection) getJobTrackerUrl().openConnection();
+        connection.connect();
+
+        String previousState = state;
+
+        if (connection.getResponseCode() == 200) {
+          state = "Started";
+
+          StringBuffer string = new StringBuffer();
+          byte[] buffer = new byte[1024];
+          InputStream in =
+              new BufferedInputStream(connection.getInputStream());
+          int bytes = 0;
+          while ((bytes = in.read(buffer)) != -1) {
+            string.append(new String(buffer, 0, bytes));
+          }
+
+          HadoopJob[] jobData = getJobData(string.toString());
+          for (int i = 0; i < jobData.length; i++) {
+            HadoopJob job = jobData[i];
+            if (jobs.containsKey(job.getId())) {
+              updateJob(job);
+            } else {
+              addJob(job);
+            }
+          }
+        } else {
+          state = "Stopped";
+        }
+
+        if (!state.equals(previousState)) {
+          ServerRegistry.getInstance().stateChanged(HadoopServer.this);
+        }
+      } catch (Exception e) {
+        state = "Stopped (Connection Error)";
+      }
+
+      schedule(PING_DELAY);
+      return Status.OK_STATUS;
+    }
+  }
+
+  private void updateJob(final HadoopJob data) {
+    jobs.put(data.getId(), data);
+    // TODO(jz) only if it has changed
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        fireJobChanged(data);
+      }
+    });
+  }
+
+  private void addJob(final HadoopJob data) {
+    jobs.put(data.getId(), data);
+
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        fireJobAdded(data);
+      }
+    });
+  }
+
+  /**
+   * Parse the job tracker data to display currently running and completed
+   * jobs.
+   * 
+   * @param jobTrackerHtml The HTML returned from the Job Tracker port
+   * @return an array of HadoopJob objects containing job status info
+   */
+  public HadoopJob[] getJobData(String jobTrackerHtml) {
+    try {
+      Vector<HadoopJob> jobsVector = new Vector<HadoopJob>();
+
+      BufferedReader in =
+          new BufferedReader(new StringReader(jobTrackerHtml));
+
+      String inputLine;
+
+      boolean completed = false;
+      while ((inputLine = in.readLine()) != null) {
+        // stop once we reach failed jobs (which are after running and
+        // completed jobs)
+        if (inputLine.indexOf("Failed Jobs") != -1) {
+          break;
+        }
+
+        if (inputLine.indexOf("Completed Jobs") != -1) {
+          completed = true;
+        }
+
+        // skip lines without data (stored in a table)
+        if (!inputLine.startsWith("<tr><td><a")) {
+          // log.debug (" > " + inputLine, verbose);
+          continue;
+        }
+
+        HadoopJob jobData = new HadoopJob(HadoopServer.this);
+
+        String[] values = inputLine.split("</td><td>");
+
+        String jobId = values[0].trim();
+        String realJobId =
+            jobId.substring(jobId.lastIndexOf("_") + 1, jobId
+                .lastIndexOf("_") + 5);
+        String name = values[2].trim();
+        if (name.equals("&nbsp;")) {
+          name = "(untitled)";
+        }
+        jobData.name = name + "(" + realJobId + ")";
+        jobData.jobId = "job_" + realJobId;
+        jobData.completed = completed;
+
+        jobData.mapPercentage = values[3].trim();
+        jobData.totalMaps = values[4].trim();
+        jobData.completedMaps = values[5].trim();
+        jobData.reducePercentage = values[6].trim();
+        jobData.totalReduces = values[7].trim();
+        jobData.completedReduces =
+            values[8].substring(0, values[8].indexOf("<")).trim();
+
+        jobsVector.addElement(jobData);
+      }
+
+      in.close();
+
+      // convert the vector to an array
+      return jobsVector.toArray(new HadoopJob[jobsVector.size()]);
+    } catch (Exception e) {
+      e.printStackTrace();
+    }
+
+    return null;
+  }
+
+  public void dispose() {
+    if ((sshTunnel != null) && sshTunnel.isConnected()) {
+      sshTunnel.disconnect();
+    }
+
+    if ((httpTunnel != null) && httpTunnel.isConnected()) {
+      httpTunnel.disconnect();
+    }
+  }
+
+  public String getTunnelHostName() {
+    return tunnelHostName;
+  }
+
+  public String getTunnelUserName() {
+    return tunnelUserName;
+  }
+
+  public void setId(int i) {
+    this.id = i;
+  }
+
+  public void setName(String newName) {
+    this.name = newName;
+  }
+
+  public void setURI(String newURI) {
+    String[] hostInfo = newURI.split(":");
+    String[] loginInfo = hostInfo[0].split("@");
+
+    installPath = hostInfo[1];
+    userName = loginInfo[0];
+    hostName = loginInfo[1];
+  }
+  
+  public void setTunnel(String tunnelHostName, String tunnelUserName) {
+    this.tunnelHostName = tunnelHostName;
+    this.tunnelUserName = tunnelUserName;
+  }
+  
+  /**
+   * Returns whether this server uses SSH tunneling or not
+   * @return whether this server uses SSH tunneling or not
+   */
+  public boolean useTunneling() {
+    return (this.tunnelHostName != null);
+  }
+  
+}
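
A caller-side sketch of the session plumbing above, for orientation. The
URI and variable names are hypothetical, createSession() may throw
JSchException, and passwords are collected through SWTUserInfo:

    // Direct connection (no tunnel); the URI encodes user@host:installPath.
    HadoopServer server =
        new HadoopServer("hadoop@namenode:/opt/hadoop", "My Cluster");
    Session session = server.createSession();
    ChannelExec exec = (ChannelExec) session.openChannel("exec");
    exec.setCommand(server.getInstallPath() + "/bin/hadoop version");
    exec.connect();
    // ... read the channel's InputStream, then clean up ...
    session.disconnect();
    server.dispose(); // closes any SSH/HTTP tunnels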

+ 33 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/IJobListener.java

@@ -0,0 +1,33 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.server;
+
+/**
+ * Interface for updating/adding jobs to the MapReduce Server view.
+ */
+
+public interface IJobListener {
+  void jobChanged(HadoopJob job);
+
+  void jobAdded(HadoopJob job);
+
+  void publishStart(JarModule jar);
+
+  void publishDone(JarModule jar);
+}
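
A minimal implementation sketch (the class name is hypothetical):
listeners register through HadoopServer.addJobListener() and are invoked
on the SWT display thread by the server's ping job.

    public class LoggingJobListener implements IJobListener {
      public void jobChanged(HadoopJob job) {
        System.out.println("job changed: " + job.getId());
      }

      public void jobAdded(HadoopJob job) {
        System.out.println("job added: " + job.getId());
      }

      public void publishStart(JarModule jar) {
        System.out.println("publishing " + jar.getName());
      }

      public void publishDone(JarModule jar) {
        System.out.println("published " + jar.getName());
      }
    }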

+ 102 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/JarModule.java

@@ -0,0 +1,102 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.server;
+
+import java.io.File;
+import java.util.logging.Logger;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.eclipse.core.resources.IResource;
+import org.eclipse.core.resources.ResourcesPlugin;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.jdt.core.ICompilationUnit;
+import org.eclipse.jdt.core.IJavaElement;
+import org.eclipse.jdt.core.IJavaProject;
+import org.eclipse.jdt.core.IType;
+import org.eclipse.jdt.core.JavaCore;
+import org.eclipse.jdt.ui.jarpackager.IJarExportRunnable;
+import org.eclipse.jdt.ui.jarpackager.JarPackageData;
+
+
+/**
+ * Methods for interacting with the jar file containing the 
+ * Mapper/Reducer/Driver classes for a MapReduce job.
+ */
+
+public class JarModule {
+  static Logger log = Logger.getLogger(JarModule.class.getName());
+
+  private final IResource resource;
+
+  public JarModule(IResource resource) {
+    this.resource = resource;
+  }
+
+  /**
+   * Create the jar file containing all the MapReduce job classes.
+   */
+  public File buildJar(IProgressMonitor monitor) {
+    log.fine("Build jar");
+    JarPackageData jarrer = new JarPackageData();
+
+    jarrer.setExportJavaFiles(true);
+    jarrer.setExportClassFiles(true);
+    jarrer.setExportOutputFolders(true);
+    jarrer.setOverwrite(true);
+
+    Path path;
+
+    try {
+      IJavaProject project = (IJavaProject) resource.getProject().getNature(
+          JavaCore.NATURE_ID);
+      // TODO(jz) check this is the case before letting this method get called
+      Object element = resource.getAdapter(IJavaElement.class);
+      IType type = ((ICompilationUnit) element).findPrimaryType();
+      jarrer.setManifestMainClass(type);
+      path = new Path(new File(Activator.getDefault().getStateLocation()
+          .toFile(), resource.getProject().getName() + "_project_hadoop_"
+          + resource.getName() + "_" + System.currentTimeMillis() + ".jar")
+          .getAbsolutePath());
+      jarrer.setJarLocation(path);
+
+      jarrer.setElements(resource.getProject().members(IResource.FILE));
+      IJarExportRunnable runnable = jarrer.createJarExportRunnable(null);
+      runnable.run(monitor);
+    } catch (Exception e) {
+      e.printStackTrace();
+      throw new RuntimeException(e);
+    }
+
+    return path.toFile();
+  }
+
+  public String getName() {
+    return resource.getProject().getName() + "/" + resource.getName();
+  }
+
+  public static JarModule fromMemento(String memento) {
+    return new JarModule(ResourcesPlugin.getWorkspace().getRoot().findMember(
+        Path.fromPortableString(memento)));
+  }
+
+  public String toMemento() {
+    return resource.getFullPath().toPortableString();
+  }
+}
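
The memento pair above is a plain round-trip through a portable
workspace path; it is how runJar() hands the jar to the launch delegate
via the "hadoop.jarrable" attribute. A short sketch (the resource
variable is an assumption):

    JarModule jar = new JarModule(someResource);
    String memento = jar.toMemento(); // portable workspace path
    JarModule restored = JarModule.fromMemento(memento);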

+ 445 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/DefineHadoopServerLocWizardPage.java

@@ -0,0 +1,445 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.servers;
+
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStreamReader;
+import java.lang.reflect.InvocationTargetException;
+
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.jface.dialogs.IMessageProvider;
+import org.eclipse.jface.dialogs.ProgressMonitorDialog;
+import org.eclipse.jface.operation.IRunnableWithProgress;
+import org.eclipse.jface.wizard.WizardPage;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Button;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.swt.widgets.Event;
+import org.eclipse.swt.widgets.Group;
+import org.eclipse.swt.widgets.Label;
+import org.eclipse.swt.widgets.Listener;
+import org.eclipse.swt.widgets.Text;
+
+import com.jcraft.jsch.ChannelExec;
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.Session;
+
+/**
+ * Wizard for defining the location of a Hadoop server
+ */
+
+public class DefineHadoopServerLocWizardPage extends WizardPage {
+
+  /**
+   * User-defined name for the hadoop server
+   */
+  private Text serverName;
+
+  /**
+   * Host name of the Hadoop server
+   */
+  private Text hostName;
+
+  /**
+   * Location of Hadoop jars on the server
+   */
+  private Text installPath;
+
+  /**
+   * User name to use to connect to the Hadoop server
+   */
+  private Text userName;
+
+  private Button useSSHTunnel;
+
+  /**
+   * User name to use to connect to the tunneling machine
+   */
+  private Text tunnelUserName;
+
+  /**
+   * Host name of the tunneling machine
+   */
+  private Text tunnelHostName;
+
+  /**
+   * HadoopServer instance currently edited by this form (null if we create a
+   * new one)
+   */
+  private HadoopServer editedServer;
+
+  /**
+   * Constructor to create a new Hadoop server
+   */
+  public DefineHadoopServerLocWizardPage() {
+    super("Hadoop Server", "Define Hadoop Server Location", null);
+    this.editedServer = null;
+  }
+
+  /**
+   * Constructor to edit the parameters of an existing Hadoop server
+   * 
+   * @param server
+   */
+  public DefineHadoopServerLocWizardPage(HadoopServer server) {
+    super("Hadoop Server", "Edit Hadoop Server Location", null);
+    this.editedServer = server;
+  }
+
+  /**
+   * Fill the server values from the form values
+   * 
+   * @return the HadoopServer instance created or updated from the form values
+   */
+  private HadoopServer defineServerFromValues() {
+    String uri =
+        userName.getText() + "@" + hostName.getText() + ":"
+            + installPath.getText();
+
+    if (editedServer == null) {
+      // Create and register the new HadoopServer
+      this.editedServer =
+          new HadoopServer(uri, serverName.getText(), (useSSHTunnel
+              .getSelection()) ? tunnelHostName.getText() : null,
+              (useSSHTunnel.getSelection()) ? tunnelUserName.getText()
+                  : null);
+      ServerRegistry.getInstance().addServer(this.editedServer);
+
+    } else {
+
+      // Update values of the already existing HadoopServer
+      editedServer.setName(this.serverName.getText());
+      editedServer.setURI(uri);
+      if (useSSHTunnel.getSelection())
+        editedServer.setTunnel(tunnelHostName.getText(), tunnelUserName
+            .getText());
+      else
+        editedServer.setTunnel(null, null);
+
+      ServerRegistry.getInstance().stateChanged(this.editedServer);
+    }
+
+    return this.editedServer;
+  }
+
+  /**
+   * Fill the form values from the server instance
+   */
+  private void defineValuesFromServer() {
+    if (this.editedServer == null) {
+      // Setup values for a new empty instance
+      // Do nothing: setting values here could trigger listeners
+      /*
+       * serverName.setText(""); userName.setText(""); hostName.setText("");
+       * installPath.setText(""); useSSHTunnel.setSelection(false);
+       * tunnelHostName.setText(""); tunnelUserName.setText("");
+       */
+    } else {
+      // Setup values from the server instance
+      serverName.setText(editedServer.getName());
+      userName.setText(editedServer.getUserName());
+      hostName.setText(editedServer.getHostName());
+      installPath.setText(editedServer.getInstallPath());
+      if (editedServer.useTunneling()) {
+        useSSHTunnel.setSelection(true);
+        tunnelHostName.setText(editedServer.getTunnelHostName());
+        tunnelUserName.setText(editedServer.getTunnelUserName());
+      } else {
+        useSSHTunnel.setSelection(false);
+        tunnelHostName.setText("");
+        tunnelUserName.setText("");
+      }
+    }
+  }
+
+  /**
+   * Performs any actions appropriate in response to the user having pressed
+   * the Finish button, or refuse if finishing now is not permitted.
+   * 
+   * @return Object containing information about the Hadoop server
+   */
+  public HadoopServer performFinish() {
+    try {
+      return defineServerFromValues();
+    } catch (Exception e) {
+      e.printStackTrace();
+      setMessage("Invalid server location values", IMessageProvider.ERROR);
+      return null;
+    }
+  }
+
+  /**
+   * Validates whether Hadoop exists at the specified server location
+   * 
+   */
+  private void testLocation() {
+    ProgressMonitorDialog dialog = new ProgressMonitorDialog(getShell());
+    dialog.setOpenOnRun(true);
+
+    try {
+      final HadoopServer location = defineServerFromValues();
+
+      try {
+        dialog.run(true, false, new IRunnableWithProgress() {
+          public void run(IProgressMonitor monitor)
+              throws InvocationTargetException, InterruptedException {
+            Session session = null;
+            try {
+              session = location.createSession();
+              try {
+                ChannelExec channel =
+                    (ChannelExec) session.openChannel("exec");
+                channel.setCommand(location.getInstallPath()
+                    + "/bin/hadoop version");
+                BufferedReader response =
+                    new BufferedReader(new InputStreamReader(channel
+                        .getInputStream()));
+                channel.connect();
+                final String versionLine = response.readLine();
+
+                if ((versionLine != null)
+                    && versionLine.startsWith("Hadoop")) {
+                  Display.getDefault().syncExec(new Runnable() {
+                    public void run() {
+                      setMessage("Found " + versionLine,
+                          IMessageProvider.INFORMATION);
+                    }
+                  });
+                } else {
+                  Display.getDefault().syncExec(new Runnable() {
+                    public void run() {
+                      setMessage("No Hadoop Found in this location",
+                          IMessageProvider.WARNING);
+                    }
+                  });
+                }
+              } finally {
+                session.disconnect();
+                location.dispose();
+              }
+            } catch (final JSchException e) {
+              Display.getDefault().syncExec(new Runnable() {
+                public void run() {
+                  System.err.println(e.getMessage());
+                  setMessage("Problems connecting to server: "
+                      + e.getLocalizedMessage(), IMessageProvider.WARNING);
+                }
+              });
+            } catch (final IOException e) {
+              Display.getDefault().syncExec(new Runnable() {
+                public void run() {
+                  setMessage("Problems communicating with server: "
+                      + e.getLocalizedMessage(), IMessageProvider.WARNING);
+                }
+              });
+            } catch (final Exception e) {
+              Display.getDefault().syncExec(new Runnable() {
+                public void run() {
+                  setMessage("Errors encountered connecting to server: "
+                      + e.getLocalizedMessage(), IMessageProvider.WARNING);
+                }
+              });
+            } finally {
+              if (session != null) {
+                session.disconnect();
+              }
+            }
+          }
+
+        });
+      } catch (InvocationTargetException e) {
+        // TODO Auto-generated catch block
+        e.printStackTrace();
+      } catch (InterruptedException e) {
+        // TODO Auto-generated catch block
+        e.printStackTrace();
+      }
+    } catch (Exception e) {
+      setMessage("Invalid server location", IMessageProvider.WARNING);
+      return;
+    }
+  }
+
+  @Override
+  /**
+   * Location is not complete (and finish button not available) until a
+   * hostname is specified.
+   */
+  public boolean isPageComplete() {
+    return hostName.getText().length() > 0;
+  }
+
+  public boolean canFinish() {
+    return isPageComplete();
+  }
+
+  /**
+   * Create the overall wizard
+   */
+  public void createControl(Composite parent) {
+
+    setTitle("Define Hadoop Server Location");
+    setDescription("Define the location of a Hadoop server for running MapReduce applications.");
+    Composite panel = new Composite(parent, SWT.NONE);
+    GridLayout layout = new GridLayout();
+    layout.numColumns = 3;
+    layout.makeColumnsEqualWidth = false;
+    panel.setLayout(layout);
+    panel.setLayoutData(new GridData(GridData.FILL_BOTH));
+
+    Label serverNameLabel = new Label(panel, SWT.NONE);
+    serverNameLabel.setText("&Server name:");
+
+    serverName = new Text(panel, SWT.SINGLE | SWT.BORDER);
+    GridData data = new GridData(GridData.FILL_HORIZONTAL);
+    serverName.setLayoutData(data);
+    serverName.setText("Hadoop Server");
+
+    new Label(panel, SWT.NONE).setText(" ");
+
+    // serverName.addModifyListener(this);
+
+    Label hostNameLabel = new Label(panel, SWT.NONE);
+    hostNameLabel.setText("&Hostname:");
+
+    hostName = new Text(panel, SWT.SINGLE | SWT.BORDER);
+    hostName.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+    hostName.addListener(SWT.Modify, new Listener() {
+      public void handleEvent(Event e) {
+        refreshButtons();
+      }
+    });
+
+    // COMMENTED(jz) seems to cause issues on my version of eclipse
+    // hostName.setText(server.getHost());
+
+    // hostName.addModifyListener(this);
+
+    new Label(panel, SWT.NONE).setText(" ");
+
+    Label installationPathLabel = new Label(panel, SWT.NONE);
+    installationPathLabel.setText("&Installation directory:");
+
+    installPath = new Text(panel, SWT.SINGLE | SWT.BORDER);
+    installPath.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+
+    // installationPath.addModifyListener(this);
+
+    new Label(panel, SWT.NONE).setText(" ");
+
+    Label usernameLabel = new Label(panel, SWT.NONE);
+    usernameLabel.setText("&Username:");
+
+    // new Label(panel, SWT.NONE).setText(" ");
+
+    userName = new Text(panel, SWT.SINGLE | SWT.BORDER);
+    userName.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+    // username.addModifyListener(this);
+
+    Label spacer = new Label(panel, SWT.NONE);
+    spacer.setText(" ");
+
+    Label spacer2 = new Label(panel, SWT.NONE);
+    spacer2.setText(" ");
+
+    /*
+     * Label label = new Label(panel, SWT.NONE); GridData data2 = new
+     * GridData(); data2.horizontalSpan = 2; label.setLayoutData(data2);
+     * label.setText("Example: user@host:/path/to/hadoop");
+     */
+
+    Group sshTunnelGroup = new Group(panel, SWT.NONE);
+    GridData gridData = new GridData(GridData.FILL_HORIZONTAL);
+    gridData.horizontalSpan = 3;
+    sshTunnelGroup.setLayoutData(gridData);
+    sshTunnelGroup.setLayout(new GridLayout(2, false));
+    useSSHTunnel = new Button(sshTunnelGroup, SWT.CHECK);
+    useSSHTunnel.setText("Tunnel Connections");
+    GridData span2 = new GridData(GridData.FILL_HORIZONTAL);
+    span2.horizontalSpan = 2;
+    useSSHTunnel.setLayoutData(span2);
+    Label label = new Label(sshTunnelGroup, SWT.NONE);
+    label.setText("Tunnel via");
+    tunnelHostName = new Text(sshTunnelGroup, SWT.BORDER | SWT.SINGLE);
+    tunnelHostName.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+
+    Label label2 = new Label(sshTunnelGroup, SWT.NONE);
+    label2.setText("Tunnel username");
+    tunnelUserName = new Text(sshTunnelGroup, SWT.BORDER | SWT.SINGLE);
+    tunnelUserName.setLayoutData(new GridData(GridData.FILL_HORIZONTAL));
+
+    Listener refreshButtonsListener = new Listener() {
+      public void handleEvent(Event event) {
+        refreshButtons();
+      }
+    };
+    useSSHTunnel.addListener(SWT.Selection, refreshButtonsListener);
+    tunnelHostName.setEnabled(useSSHTunnel.getSelection());
+    tunnelUserName.setEnabled(useSSHTunnel.getSelection());
+
+    ((GridLayout) sshTunnelGroup.getLayout()).marginBottom = 20;
+
+    Label label4 = new Label(panel, SWT.NONE);
+    GridData span3 = new GridData(GridData.FILL_HORIZONTAL);
+    span3.horizontalSpan = 3;
+    label4.setLayoutData(span3);
+
+    final Button validate = new Button(panel, SWT.NONE);
+    validate.setText("&Validate location");
+    validate.addListener(SWT.Selection, new Listener() {
+      public void handleEvent(Event e) {
+        testLocation();
+      }
+    });
+
+    new Label(panel, SWT.NONE).setText(" ");
+
+    setControl(panel);
+
+    defineValuesFromServer();
+  }
+
+  public void refreshButtons() {
+    if (useSSHTunnel == null)
+      return;
+
+    if (tunnelHostName != null)
+      tunnelHostName.setEnabled(useSSHTunnel.getSelection());
+    if (tunnelUserName != null)
+      tunnelUserName.setEnabled(useSSHTunnel.getSelection());
+
+    getContainer().updateButtons();
+  }
+}
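
The "Validate location" button above runs
"<install path>/bin/hadoop version" over SSH and accepts any response
whose first line starts with "Hadoop"; for example, a hypothetical
healthy response would begin:

    Hadoop 0.14.0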

+ 75 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/HadoopServerSelectionListContentProvider.java

@@ -0,0 +1,75 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.servers;
+
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.eclipse.jface.viewers.IContentProvider;
+import org.eclipse.jface.viewers.ILabelProviderListener;
+import org.eclipse.jface.viewers.IStructuredContentProvider;
+import org.eclipse.jface.viewers.ITableLabelProvider;
+import org.eclipse.jface.viewers.Viewer;
+import org.eclipse.swt.graphics.Image;
+
+
+/**
+ * Provider that enables selection of a predefined Hadoop server.
+ */
+
+public class HadoopServerSelectionListContentProvider implements
+    IContentProvider, ITableLabelProvider, IStructuredContentProvider {
+  public void dispose() {
+
+  }
+
+  public void inputChanged(Viewer viewer, Object oldInput, Object newInput) {
+
+  }
+
+  public Image getColumnImage(Object element, int columnIndex) {
+    return null;
+  }
+
+  public String getColumnText(Object element, int columnIndex) {
+    if (element instanceof HadoopServer) {
+      if (columnIndex == 0) {
+        return ((HadoopServer) element).getName();
+      } else if (columnIndex == 1) {
+        return ((HadoopServer) element).toString();
+      }
+    }
+
+    return element.toString();
+  }
+
+  public void addListener(ILabelProviderListener listener) {
+
+  }
+
+  public boolean isLabelProperty(Object element, String property) {
+    return false;
+  }
+
+  public void removeListener(ILabelProviderListener listener) {
+
+  }
+
+  public Object[] getElements(Object inputElement) {
+    return ServerRegistry.getInstance().getServers().toArray();
+  }
+}

+ 28 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/IHadoopServerListener.java

@@ -0,0 +1,28 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.servers;
+
+import org.apache.hadoop.eclipse.server.HadoopServer;
+
+/**
+ * Interface for monitoring server changes
+ */
+public interface IHadoopServerListener {
+  void serverChanged(HadoopServer location, int type);
+}
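
A sketch of how this interface is consumed, using the event constants
defined in ServerRegistry below:

    ServerRegistry.getInstance().addListener(new IHadoopServerListener() {
      public void serverChanged(HadoopServer location, int type) {
        if (type == ServerRegistry.SERVER_ADDED) {
          System.out.println("server added: " + location.getName());
        }
      }
    });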

+ 235 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/RunOnHadoopWizard.java

@@ -0,0 +1,235 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.servers;
+
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.apache.hadoop.eclipse.server.JarModule;
+import org.eclipse.core.runtime.IProgressMonitor;
+import org.eclipse.jface.viewers.TableViewer;
+import org.eclipse.jface.wizard.Wizard;
+import org.eclipse.jface.wizard.WizardPage;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.events.SelectionEvent;
+import org.eclipse.swt.events.SelectionListener;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.layout.GridLayout;
+import org.eclipse.swt.widgets.Button;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.Label;
+import org.eclipse.swt.widgets.Table;
+import org.eclipse.swt.widgets.TableColumn;
+
+/**
+ * Wizard for publishing a job to a Hadoop server.
+ */
+
+public class RunOnHadoopWizard extends Wizard implements SelectionListener {
+
+  private DefineHadoopServerLocWizardPage createNewPage;
+
+  private MainPage mainPage;
+
+  private final JarModule jar;
+
+  private boolean complete = false;
+
+  private IProgressMonitor progressMonitor;
+
+  public RunOnHadoopWizard(JarModule jar) {
+    this.jar = jar;
+    setForcePreviousAndNextButtons(true);
+    setNeedsProgressMonitor(true);
+    setWindowTitle("Run on Hadoop");
+  }
+
+  @Override
+  public void addPages() {
+    super.addPages();
+    mainPage = new MainPage();
+    addPage(mainPage);
+    createNewPage = new DefineHadoopServerLocWizardPage();
+    addPage(createNewPage);
+  }
+
+  @Override
+  /**
+   * Performs any actions appropriate in response to the user having pressed
+   * the Finish button, or refuse if finishing now is not permitted.
+   */
+  public boolean performFinish() {
+    HadoopServer location = null;
+    if (mainPage.createNew.getSelection()) {
+      location = createNewPage.performFinish();
+    } else if (mainPage.table.getSelection().length == 1) {
+      location = (HadoopServer) mainPage.table.getSelection()[0].getData();
+    }
+
+    if (location != null) {
+      location.runJar(jar, progressMonitor);
+
+      return true;
+    }
+
+    return false;
+  }
+
+  public void refreshButtons() {
+    getContainer().updateButtons();
+  }
+
+  @Override
+  /**
+   * Allows finish when an existing server is selected or when a new server
+   * location is defined
+   */
+  public boolean canFinish() {
+
+    if (mainPage.chooseExisting.getSelection()
+        && (mainPage.table.getSelectionCount() > 0)) {
+      return true;
+    }
+    return createNewPage.isPageComplete();
+  }
+
+  public class MainPage extends WizardPage {
+
+    private Button createNew;
+
+    private Table table;
+
+    public Button chooseExisting;
+
+    public MainPage() {
+      super("Select or define server to run on");
+      setTitle("Select Hadoop Server");
+      setDescription("Select a Hadoop Server to run on.");
+    }
+
+    @Override
+    public boolean canFlipToNextPage() {
+      return createNew.getSelection();
+    }
+
+    public void createControl(Composite parent) {
+      Composite control = new Composite(parent, SWT.NONE);
+      control.setLayout(new GridLayout(4, false));
+
+      Label label = new Label(control, SWT.FILL);
+      label.setText("Select a Hadoop Server to run on.");
+      GridData data = new GridData(GridData.FILL_BOTH);
+      data.grabExcessVerticalSpace = false;
+      data.horizontalSpan = 4;
+      label.setLayoutData(data);
+
+      createNew = new Button(control, SWT.RADIO);
+      createNew.setText("Define a new Hadoop server location");
+      createNew.setLayoutData(data);
+      createNew.addSelectionListener(RunOnHadoopWizard.this);
+
+      createNew.setSelection(true);
+
+      chooseExisting = new Button(control, SWT.RADIO);
+      chooseExisting
+          .setText("Choose an existing server from the list below");
+      chooseExisting.setLayoutData(data);
+      chooseExisting.addSelectionListener(RunOnHadoopWizard.this);
+
+      chooseExisting.addSelectionListener(new SelectionListener() {
+
+        public void widgetSelected(SelectionEvent e) {
+          if (chooseExisting.getSelection()
+              && (table.getSelectionCount() == 0)) {
+            if (table.getItems().length > 0) {
+              table.setSelection(0);
+            }
+          }
+        }
+
+        public void widgetDefaultSelected(SelectionEvent e) {
+        }
+
+      });
+
+      Composite serverList = new Composite(control, SWT.NONE);
+      GridData span = new GridData(GridData.FILL_BOTH);
+      span.horizontalSpan = 4;
+      serverList.setLayoutData(span);
+      GridLayout layout = new GridLayout(4, false);
+      layout.marginTop = 12;
+      serverList.setLayout(layout);
+
+      table =
+          new Table(serverList, SWT.SINGLE | SWT.H_SCROLL | SWT.V_SCROLL
+              | SWT.FULL_SELECTION);
+      table.setHeaderVisible(true);
+      table.setLinesVisible(true);
+      GridData d = new GridData(GridData.FILL_HORIZONTAL);
+      d.horizontalSpan = 4;
+      d.heightHint = 300;
+      table.setLayoutData(d);
+
+      TableColumn nameColumn = new TableColumn(table, SWT.SINGLE);
+      nameColumn.setText("Name");
+      nameColumn.setWidth(160);
+
+      TableColumn hostColumn = new TableColumn(table, SWT.SINGLE);
+      hostColumn.setText("Location");
+      hostColumn.setWidth(200);
+
+      table.addSelectionListener(new SelectionListener() {
+        public void widgetSelected(SelectionEvent e) {
+          chooseExisting.setSelection(true);
+          createNew.setSelection(false); // shouldn't be necessary,
+          // but got a visual bug once
+
+          refreshButtons();
+        }
+
+        public void widgetDefaultSelected(SelectionEvent e) {
+
+        }
+      });
+
+      TableViewer viewer = new TableViewer(table);
+      HadoopServerSelectionListContentProvider provider =
+          new HadoopServerSelectionListContentProvider();
+      viewer.setContentProvider(provider);
+      viewer.setLabelProvider(provider);
+      viewer.setInput(new Object()); // don't care, get from singleton
+      // server registry
+
+      setControl(control);
+    }
+  }
+
+  public void widgetDefaultSelected(SelectionEvent e) {
+    // TODO Auto-generated method stub
+
+  }
+
+  public void widgetSelected(SelectionEvent e) {
+    refreshButtons();
+  }
+
+  public void setProgressMonitor(IProgressMonitor progressMonitor) {
+    this.progressMonitor = progressMonitor;
+  }
+}
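
A hedged sketch of opening this wizard; the shell, jar module, and
monitor variables are assumptions, and the plug-in's actual launch path
may differ:

    RunOnHadoopWizard wizard = new RunOnHadoopWizard(jarModule);
    wizard.setProgressMonitor(monitor); // used when publishing the jar
    WizardDialog dialog = new WizardDialog(shell, wizard);
    dialog.open(); // performFinish() runs the jar on the chosen server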

+ 229 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/servers/ServerRegistry.java

@@ -0,0 +1,229 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.servers;
+
+import java.io.BufferedReader;
+import java.io.BufferedWriter;
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.FileReader;
+import java.io.FileWriter;
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+
+/**
+ * Registry for storing Hadoop Servers
+ */
+public class ServerRegistry {
+
+  private static final ServerRegistry INSTANCE = new ServerRegistry();
+
+  public static final int SERVER_ADDED = 0;
+
+  public static final int SERVER_REMOVED = 1;
+
+  public static final int SERVER_STATE_CHANGED = 2;
+
+  private ServerRegistry() {
+  }
+
+  private List<HadoopServer> servers;
+
+  private Set<IHadoopServerListener> listeners =
+      new HashSet<IHadoopServerListener>();
+
+  public static ServerRegistry getInstance() {
+    return INSTANCE;
+  }
+
+  public List<HadoopServer> getServers() {
+    return Collections.unmodifiableList(getServersInternal());
+  }
+
+  /**
+   * Returns the list of currently defined servers. The list is read from the
+   * file if it is not in memory.
+   * 
+   * @return the list of hadoop servers
+   */
+  private List<HadoopServer> getServersInternal() {
+
+    if (servers == null) {
+      servers = new ArrayList<HadoopServer>();
+
+      File store =
+          Activator.getDefault().getStateLocation().append("SERVERS.txt")
+              .toFile();
+
+      if (!store.exists()) {
+        try {
+          store.createNewFile();
+        } catch (IOException e) {
+          // pretty fatal error here - we can't save or restore
+          throw new RuntimeException(e);
+        }
+      }
+
+      BufferedReader reader = null;
+      try {
+        reader = new BufferedReader(new FileReader(store));
+        String line;
+        while ((line = reader.readLine()) != null) {
+          try {
+            String[] parts = line.split("\t");
+            if (parts.length == 1) {
+              String location = parts[0];
+              parts = new String[] { location, "Hadoop Server" };
+            }
+
+            // a full entry has four fields: URI, name, tunnel host, tunnel user
+            if (parts.length >= 4) {
+              servers.add(new HadoopServer(parts[0], parts[1], parts[2],
+                  parts[3]));
+            } else {
+              servers.add(new HadoopServer(parts[0], parts[1]));
+            }
+
+            servers.get(servers.size() - 1).setId(servers.size() - 1);
+
+          } catch (Exception e) {
+            // TODO(jz) show message and ignore - still want rest of
+            // servers if we can get them
+            e.printStackTrace();
+          }
+        }
+      } catch (FileNotFoundException e) {
+        e.printStackTrace();
+      } catch (IOException e) {
+        // TODO(jz) show message and ignore - may have corrupt
+        // configuration
+        e.printStackTrace();
+      } finally {
+        if (reader != null) {
+          try {
+            reader.close();
+          } catch (IOException e) {
+            /* nothing we can do */
+          }
+        }
+      }
+    }
+
+    return servers;
+  }
+
+  public synchronized void removeServer(HadoopServer server) {
+    getServersInternal().remove(server);
+    fireListeners(server, SERVER_REMOVED);
+    save();
+  }
+
+  public synchronized void addServer(HadoopServer server) {
+    getServersInternal().add(server);
+    fireListeners(server, SERVER_ADDED);
+    save();
+  }
+
+  /**
+   * Save the list of servers to the plug-in configuration file, currently
+   * SERVERS.txt in
+   * <workspace-dir>/.metadata/.plugins/org.apache.hadoop.eclipse/SERVERS.txt
+   */
+  private synchronized void save() {
+    File store =
+        Activator.getDefault().getStateLocation().append("SERVERS.txt")
+            .toFile();
+    BufferedWriter writer = null;
+
+    if (!store.exists()) {
+      try {
+        store.createNewFile();
+      } catch (IOException e) {
+        // pretty fatal error here - we can't save or restore
+        throw new RuntimeException(e);
+      }
+    }
+
+    try {
+      writer = new BufferedWriter(new FileWriter(store));
+      int i = 0;
+      for (HadoopServer server : servers) {
+        server.setId(i++);
+        writer.append(server.toString() + "\t" + server.getName());
+        if (server.getTunnelHostName() != null) {
+          writer.append("\t" + server.getTunnelHostName() + "\t"
+              + server.getTunnelUserName());
+        }
+        writer.newLine();
+      }
+    } catch (IOException e) {
+      // TODO(jz) show error message
+      e.printStackTrace();
+    } finally {
+      if (writer != null) {
+        try {
+          writer.close();
+        } catch (IOException e) {
+          /* nothing we can do */
+        }
+      }
+    }
+  }
+
+  public void addListener(IHadoopServerListener l) {
+    synchronized (listeners) {
+      listeners.add(l);
+    }
+  }
+
+  private void fireListeners(HadoopServer location, int kind) {
+    synchronized (listeners) {
+      for (IHadoopServerListener listener : listeners) {
+        listener.serverChanged(location, kind);
+      }
+    }
+  }
+
+  public void stateChanged(HadoopServer job) {
+    fireListeners(job, SERVER_STATE_CHANGED);
+  }
+
+  public void removeListener(IHadoopServerListener l) {
+    synchronized (listeners) {
+      listeners.remove(l);
+    }
+  }
+
+  public void dispose() {
+    for (HadoopServer server : getServers()) {
+      server.dispose();
+    }
+  }
+
+  public HadoopServer getServer(int serverid) {
+    return servers.get(serverid);
+  }
+
+}
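
For clarity, save() writes one tab-separated line per server: the
"user@host:installPath" URI, the display name, and, when tunneling is
configured, the tunnel host and tunnel user. Hypothetical examples, with
<TAB> standing for a literal tab character:

    hadoop@namenode:/opt/hadoop<TAB>My Cluster
    hadoop@namenode:/opt/hadoop<TAB>My Cluster<TAB>gateway<TAB>hadoop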

+ 383 - 0
src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/view/servers/ServerView.java

@@ -0,0 +1,383 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.eclipse.view.servers;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.hadoop.eclipse.Activator;
+import org.apache.hadoop.eclipse.actions.EditServerAction;
+import org.apache.hadoop.eclipse.actions.NewServerAction;
+import org.apache.hadoop.eclipse.server.HadoopJob;
+import org.apache.hadoop.eclipse.server.HadoopServer;
+import org.apache.hadoop.eclipse.server.IJobListener;
+import org.apache.hadoop.eclipse.server.JarModule;
+import org.apache.hadoop.eclipse.servers.IHadoopServerListener;
+import org.apache.hadoop.eclipse.servers.ServerRegistry;
+import org.eclipse.core.runtime.FileLocator;
+import org.eclipse.core.runtime.Path;
+import org.eclipse.debug.internal.ui.DebugPluginImages;
+import org.eclipse.debug.ui.IDebugUIConstants;
+import org.eclipse.jface.action.Action;
+import org.eclipse.jface.action.IAction;
+import org.eclipse.jface.resource.ImageDescriptor;
+import org.eclipse.jface.viewers.IContentProvider;
+import org.eclipse.jface.viewers.ILabelProviderListener;
+import org.eclipse.jface.viewers.ISelection;
+import org.eclipse.jface.viewers.IStructuredContentProvider;
+import org.eclipse.jface.viewers.IStructuredSelection;
+import org.eclipse.jface.viewers.ITableLabelProvider;
+import org.eclipse.jface.viewers.ITreeContentProvider;
+import org.eclipse.jface.viewers.ITreeSelection;
+import org.eclipse.jface.viewers.TreeViewer;
+import org.eclipse.jface.viewers.Viewer;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.graphics.Image;
+import org.eclipse.swt.layout.GridData;
+import org.eclipse.swt.widgets.Composite;
+import org.eclipse.swt.widgets.Display;
+import org.eclipse.swt.widgets.Tree;
+import org.eclipse.swt.widgets.TreeColumn;
+import org.eclipse.ui.IViewSite;
+import org.eclipse.ui.PartInitException;
+import org.eclipse.ui.actions.ActionFactory;
+import org.eclipse.ui.part.ViewPart;
+
+import com.jcraft.jsch.Channel;
+import com.jcraft.jsch.ChannelExec;
+import com.jcraft.jsch.JSchException;
+import com.jcraft.jsch.Session;
+
+/**
+ * Code for displaying/updating the MapReduce Servers view panel
+ */
+public class ServerView extends ViewPart implements IContentProvider,
+    IStructuredContentProvider, ITreeContentProvider, ITableLabelProvider,
+    IJobListener, IHadoopServerListener {
+
+  /**
+   * This object is the root content for this content provider
+   */
+  private static final Object CONTENT_ROOT = new Object();
+
+  private final IAction DELETE = new DeleteAction();
+
+  private final IAction PROPERTIES = new EditServerAction(this);
+
+  private final IAction NEWSERVER = new NewServerAction();
+
+  private Map<String, Image> images = new HashMap<String, Image>();
+
+  private TreeViewer viewer;
+
+  public ServerView() {
+  }
+
+  /* @inheritDoc */
+  @Override
+  public void init(IViewSite site) throws PartInitException {
+    super.init(site);
+
+    try {
+      images.put("hadoop", ImageDescriptor.createFromURL(
+          (FileLocator.toFileURL(FileLocator.find(Activator.getDefault()
+              .getBundle(), new Path("resources/hadoop_small.gif"), null))))
+          .createImage(true));
+      images.put("job", ImageDescriptor.createFromURL(
+          (FileLocator.toFileURL(FileLocator.find(Activator.getDefault()
+              .getBundle(), new Path("resources/job.gif"), null))))
+          .createImage(true));
+    } catch (IOException e) {
+      e.printStackTrace();
+    }
+  }
+
+  /* @inheritDoc */
+  @Override
+  public void dispose() {
+    for (Image image : images.values()) {
+      image.dispose();
+    }
+
+    ServerRegistry.getInstance().removeListener(this);
+
+    images.clear();
+  }
+
+  /**
+   * Creates the columns for the view
+   */
+  @Override
+  public void createPartControl(Composite parent) {
+    Tree main =
+        new Tree(parent, SWT.SINGLE | SWT.FULL_SELECTION | SWT.H_SCROLL
+            | SWT.V_SCROLL);
+    main.setHeaderVisible(true);
+    main.setLinesVisible(false);
+    main.setLayoutData(new GridData(GridData.FILL_BOTH));
+
+    TreeColumn serverCol = new TreeColumn(main, SWT.SINGLE);
+    serverCol.setText("Server");
+    serverCol.setWidth(185);
+    serverCol.setResizable(true);
+
+    TreeColumn locationCol = new TreeColumn(main, SWT.SINGLE);
+    locationCol.setText("Location");
+    locationCol.setWidth(185);
+    locationCol.setResizable(true);
+
+    TreeColumn stateCol = new TreeColumn(main, SWT.SINGLE);
+    stateCol.setText("State");
+    stateCol.setWidth(95);
+    stateCol.setResizable(true);
+
+    TreeColumn statusCol = new TreeColumn(main, SWT.SINGLE);
+    statusCol.setText("Status");
+    statusCol.setWidth(300);
+    statusCol.setResizable(true);
+
+    viewer = new TreeViewer(main);
+    viewer.setContentProvider(this);
+    viewer.setLabelProvider(this);
+    viewer.setInput(CONTENT_ROOT); // don't care
+
+    getViewSite().setSelectionProvider(viewer);
+    getViewSite().getActionBars().setGlobalActionHandler(
+        ActionFactory.DELETE.getId(), DELETE);
+
+    getViewSite().getActionBars().getToolBarManager().add(PROPERTIES);
+
+    // getViewSite().getActionBars().getToolBarManager().add(new
+    // StartAction());
+    getViewSite().getActionBars().getToolBarManager().add(NEWSERVER);
+  }
+
+  // NewServerAction moved to actions package for cheat sheet access --
+  // eyhung
+
+  public class DeleteAction extends Action {
+    @Override
+    public void run() {
+      ISelection selection =
+          getViewSite().getSelectionProvider().getSelection();
+      if (selection instanceof IStructuredSelection) {
+        Object selItem =
+            ((IStructuredSelection) selection).getFirstElement();
+
+        if (selItem instanceof HadoopServer) {
+          HadoopServer location = (HadoopServer) selItem;
+          ServerRegistry.getInstance().removeServer(location);
+
+        } else if (selItem instanceof HadoopJob) {
+
+          // kill the job
+          HadoopJob job = (HadoopJob) selItem;
+          HadoopServer server = job.getServer();
+          String jobId = job.getJobId();
+
+          if (job.isCompleted())
+            return;
+
+          try {
+            Session session = server.createSession();
+
+            String command =
+                server.getInstallPath() + "/bin/hadoop job -kill " + jobId;
+            Channel channel = session.openChannel("exec");
+            ((ChannelExec) channel).setCommand(command);
+            channel.connect();
+            channel.disconnect();
+
+            session.disconnect();
+          } catch (JSchException e) {
+            e.printStackTrace();
+          }
+        }
+      }
+    }
+  }
+
+  public static class StartAction extends Action {
+    public StartAction() {
+      setText("Start");
+
+      // NOTE(jz) - all below from internal api, worst case no images
+      setImageDescriptor(DebugPluginImages
+          .getImageDescriptor(IDebugUIConstants.IMG_ACT_RUN));
+    }
+  }
+
+  /* @inheritDoc */
+  @Override
+  public void setFocus() {
+
+  }
+
+  /* @inheritDoc */
+  public void serverChanged(HadoopServer location, int type) {
+    Display.getDefault().syncExec(new Runnable() {
+      public void run() {
+        ServerView.this.viewer.refresh();
+      }
+    });
+  }
+
+  /* @inheritDoc */
+  public void inputChanged(final Viewer viewer, Object oldInput,
+      Object newInput) {
+    if (oldInput == CONTENT_ROOT)
+      ServerRegistry.getInstance().removeListener(this);
+    if (newInput == CONTENT_ROOT)
+      ServerRegistry.getInstance().addListener(this);
+  }
+
+  /** {@inheritDoc} */
+  public Object[] getElements(Object inputElement) {
+    return ServerRegistry.getInstance().getServers().toArray();
+  }
+
+  /** {@inheritDoc} */
+  public Object[] getChildren(Object parentElement) {
+    if (parentElement instanceof HadoopServer) {
+      // register this view for job updates whenever a server is expanded
+      ((HadoopServer) parentElement).addJobListener(this);
+
+      return ((HadoopServer) parentElement).getChildren();
+    }
+
+    return null;
+  }
+
+  /** {@inheritDoc} */
+  public Object getParent(Object element) {
+    if (element instanceof HadoopServer) {
+      return CONTENT_ROOT;
+    } else if (element instanceof HadoopJob) {
+      return ((HadoopJob) element).getServer();
+    }
+    return null;
+  }
+
+  /** {@inheritDoc} */
+  public boolean hasChildren(Object element) {
+    /* Only server entries have children */
+    return (element instanceof HadoopServer);
+  }
+
+  /** {@inheritDoc} */
+  public void addListener(ILabelProviderListener listener) {
+    // label provider listeners are not supported
+  }
+
+  /** {@inheritDoc} */
+  public boolean isLabelProperty(Object element, String property) {
+    return false;
+  }
+
+  /** {@inheritDoc} */
+  public void removeListener(ILabelProviderListener listener) {
+    // label provider listeners are not supported
+  }
+
+  /** {@inheritDoc} */
+  public Image getColumnImage(Object element, int columnIndex) {
+    if ((columnIndex == 0) && (element instanceof HadoopServer)) {
+      return images.get("hadoop");
+    } else if ((columnIndex == 0) && (element instanceof HadoopJob)) {
+      return images.get("job");
+    }
+    return null;
+  }
+
+  /** {@inheritDoc} */
+  public String getColumnText(Object element, int columnIndex) {
+    if (element instanceof HadoopServer) {
+      HadoopServer server = (HadoopServer) element;
+
+      switch (columnIndex) {
+        case 0:
+          return server.getName();
+        case 1:
+          return server.getHostName().toString();
+        case 2:
+          return server.getState();
+        case 3:
+          return "";
+      }
+    } else if (element instanceof HadoopJob) {
+      HadoopJob job = (HadoopJob) element;
+
+      switch (columnIndex) {
+        case 0:
+          return job.getId();
+        case 1:
+          return "";
+        case 2:
+          return job.getState();
+        case 3:
+          return job.getStatus();
+      }
+    } else if (element instanceof JarModule) {
+      JarModule jar = (JarModule) element;
+
+      switch (columnIndex) {
+        case 0:
+          return jar.toString();
+        case 1:
+          return "Publishing jar to server..";
+        case 2:
+          return "";
+      }
+    }
+
+    return null;
+  }
+
+  public void jobAdded(HadoopJob job) {
+    viewer.refresh();
+  }
+
+  public void jobChanged(HadoopJob job) {
+    viewer.refresh(job);
+  }
+
+  public void publishDone(JarModule jar) {
+    viewer.refresh();
+  }
+
+  public void publishStart(JarModule jar) {
+    viewer.refresh();
+  }
+
+  /**
+   * Returns the currently selected server, or null if there is no selection
+   * or the selection is not a server.
+   * 
+   * @return the currently selected server entry, or null
+   */
+  public HadoopServer getSelectedServer() {
+    ITreeSelection selection = (ITreeSelection) viewer.getSelection();
+    Object first = selection.getFirstElement();
+    if (first instanceof HadoopServer) {
+      return (HadoopServer) first;
+    }
+    return null;
+  }
+
+}

+ 149 - 0
src/contrib/eclipse-plugin/todo.txt

@@ -0,0 +1,149 @@
+-- DONE --------------------------
+	* Pref wizard page for hadoop libraries (eugene) -- DONE
+	* running wrong jar bug (julz) -- not using WTP any more DONE
+	* DFS only for hadoop servers (julz) -- DONE
+	* allow per-project hadoop dir, moved selection of hadoop path to first page of wizard (julz) -- DONE
+	* allow creation of new driver as part of new project wizard (julz) -- DONE
+	* BUG: ssh console sometimes drops (eugene) -- DONE
+	* Server Selection wizard - finish button should not be clickable if radio is on create server (eugene) -- DONE
+	* module icons for jar and job (dennis) -- DONE (sort of)
+
+--- Bugs ---
+
+	* Server Selection wizard has identical name and location -- 
+	
+--- Features ----
+
+	* Limit type searches in driver wizard to current project (eugene) 
+	
+	* new.. dialogs on mapred perspective (julz) 
+	
+	* show cheat sheet, more wizardy goodness (julz) 
+
+
+--- Documentation ---
+
+	* cheat sheets (dennis)
+
+
+--- Testing ---
+
+	* test on mac osx (julz)
+
+
+--- Everything ------------------
+
+* Run/Debug.. on Hadoop runs the project on a local hadoop cloud. This will involve finding
+	the appropriate Map/Reduce classes; as a first pass I suggest we have the user specify these in the Run.. dialog.
+	This task therefore breaks down to at least:
+	
+	* hadoop new proj. size
+	
+	* generate mapper/reducer screen on new project wizard
+	* title bar, titles on new X wizards
+	* hadoop perspective show cheat sheet
+	* status on server view
+	* double click on jobs, go to associated console
+	* icons for jobs
+	
+	* copy resources directory or similar to dfs, allow configurable resources directory
+	
+	* test installation directory on new server screen (i.e. ssh in and check the files are present -- see the sketch after this list)
+	
+	* (Eugene) if server has user:pass@hostname in location, ssh the file over and run it on the remote hadoop client
+	
+	* (Daniel) make the launch-on-local-hadoop scenario work properly; deploy the jar to the server when run
+	
+	* (Julz) read info from port 50030 to show jobs running on the server (see the sketch after this section)
+
+	* contribute Format action for fs, suggest this when a server is first created
+	
+	* Possibly DFS navigator view?
+
+	*	(and to specify input and output? - how should we handle this?)
+	
+	* Restrict browse classes dialog above to subclass of Mapper, Reducer etc., add proposals to text fields
+	
+	* Make launch dialog look pretty
+	
+	* Run the specified Mapper and Reducer on a local server
+	
+	* Allow the user to Run on a server defined in a servers view (i.e. so you can run locally, and on cloud A or B with the same settings)
+	
+	* Allow the user to configure the hadoop server from this view as appropriate
+	
+	* When the job runs, keep the tracker interface and put it into a view in the perspective (see next task!) so the user
+	can track the state
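+
+	A minimal sketch of the installation test above: it reuses the existing
+	HadoopServer.createSession()/getInstallPath() helpers and the same JSch
+	exec pattern as ServerView; the method name checkInstallDir is
+	hypothetical:
+
+	boolean checkInstallDir(HadoopServer server) throws JSchException {
+	  Session session = server.createSession();
+	  try {
+	    ChannelExec channel = (ChannelExec) session.openChannel("exec");
+	    // exit status is 0 iff bin/hadoop exists and is executable
+	    channel.setCommand("test -x " + server.getInstallPath() + "/bin/hadoop");
+	    channel.connect();
+	    while (!channel.isClosed()) {
+	      try { Thread.sleep(100); } catch (InterruptedException ie) { break; }
+	    }
+	    int status = channel.getExitStatus();
+	    channel.disconnect();
+	    return status == 0;
+	  } finally {
+	    session.disconnect();
+	  }
+	}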
+
+* Add a Hadouken perspective with
+	* the Hadoop targets view (analogous to servers view in WTP project)
+	
+	* the running jobs view which shows the status of running jobs
+	
+	* a Current Lesson/API panel showing html text from the lecturer?
+	
+	* any jazz stuff?
+	
+* JUnit support, specify expected inputs and outputs and run on server, collecting results and presenting a unified view
+ similar to the JUnit component.
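+
+	A rough sketch for the port-50030 item above: it pulls the JobTracker
+	status page over HTTP (jobtracker.jsp is the stock Hadoop web UI page);
+	the method name fetchJobTrackerStatus is hypothetical, and parsing the
+	returned HTML into job entries is left open:
+
+	String fetchJobTrackerStatus(HadoopServer server) throws IOException {
+	  // the stock JobTracker web UI listens on port 50030
+	  URL url =
+	      new URL("http://" + server.getHostName() + ":50030/jobtracker.jsp");
+	  BufferedReader in =
+	      new BufferedReader(new InputStreamReader(url.openStream()));
+	  StringBuffer page = new StringBuffer();
+	  String line;
+	  while ((line = in.readLine()) != null) {
+	    page.append(line).append('\n');
+	  }
+	  in.close();
+	  return page.toString(); // raw HTML; extracting the job table is TODO
+	}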
+-- Current priorities ------------
+
+ ... Dennis, maybe you could move stuff from the list above up here?
+
+	* auto-focus on main on X wizards, auto show newly created stuff
+	* on new driver screen, specify mapper (allow creation for bonus points)
+	* remove browse button