Hadoop "failed on connection exception: java.net.ConnectException: Connection refused"

dbl001

I'm trying to run a Hadoop command in local mode. I'm running on Mac OS X 10.10.5 and I'm getting an error while putting a file into HDFS. Here's the error message from my Hadoop command:

 $ sudo hadoop fs -put HG00103.mapped.ILLUMINA.bwa.GBR.low_coverage.20120522.bam /usr/ds/genomics
    Password:
    15/09/25 10:10:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    put: Call From BlueMeanie/10.0.1.5 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Here are the details of my system:

$ java -version
java version "1.8.0_05"
Java(TM) SE Runtime Environment (build 1.8.0_05-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.5-b02, mixed mode)

$ hadoop version
Hadoop 2.3.0
Subversion http://svn.apache.org/repos/asf/hadoop/common -r 1567123
Compiled by jenkins on 2014-02-11T13:40Z
Compiled with protoc 2.5.0
From source with checksum dfe46336fbc6a044bc124392ec06b85
This command was run using /Users/davidlaxer/hadoop-2.3.0/share/hadoop/common/hadoop-common-2.3.0.jar

$ cat /etc/hosts
##
# Host Database
#
# localhost is used to configure the loopback interface
# when the system is booting.  Do not change this entry.
##
127.0.0.1   localhost
10.0.1.5    BlueMeanie
255.255.255.255 broadcasthost
::1 localhost
fe80::1%lo0 localhost

$ telnet 10.1.1.5 9000
Trying 10.1.1.5...
^C
$ telnet localhost 9000
Trying ::1...
telnet: connect to address ::1: Connection refused
Trying 127.0.0.1...
telnet: connect to address 127.0.0.1: Connection refused
Trying fe80::1...
telnet: connect to address fe80::1: Connection refused
telnet: Unable to connect to remote host
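"Connection refused" on localhost:9000 means no process is accepting connections on that port, i.e. the NameNode isn't actually up. A quicker check than telnet, assuming the stock OS X `nc` and `lsof` tools are available:

```shell
# Is anything listening on the NameNode port? nc -z exits 0 only if
# the TCP connect succeeds, so "refused" here means nothing is bound.
nc -z localhost 9000 && echo "port 9000 open" || echo "port 9000 refused"

# Show the process bound to 9000, if any (empty output = nothing listening).
lsof -nP -iTCP:9000 -sTCP:LISTEN
```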

$ env | grep HADOOP
HADOOP_HOME=/Users/dbl/hadoop-2.3.0/
HADOOP_CONF_DIR=/Users/dbl/hadoop-2.3.0/etc

$ cat core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

$ cat hdfs-site.xml 
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
$ cat yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
$ cat mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

$ sbin/start-dfs.sh
Starting namenodes on [localhost]
2015-09-25 16:36:54,540 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Unable: ssh: Could not resolve hostname Unable: nodename nor servname provided, or not known
native-hadoop: ssh: Could not resolve hostname native-hadoop: nodename nor servname provided, or not known
[... one "ssh: Could not resolve hostname ..." line per word of the warning above, trimmed ...]
localhost: namenode running as process 99664. Stop it first.
cat: /Users/davidlaxer/hadoop-2.3.0/etc/hadoop/conf/slaves: No such file or directory
Starting secondary namenodes [0.0.0.0]
2015-09-25 16:39:26,863 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[... same "Could not resolve hostname" noise, trimmed ...]
0.0.0.0: secondarynamenode running as process 99006. Stop it first.
Dimitris Fasarakis Hilliard

Well, running in single-node mode doesn't require you to start the NameNode, DataNode, et al.

Single Node (Standalone) mode works out of the box with a standard Hadoop installation, provided you set fs.defaultFS to file:///, meaning your local file system.
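In other words, for standalone mode your core-site.xml would look something like this (a sketch, not the poster's actual file):

```xml
<!-- core-site.xml for standalone/local mode: paths resolve against
     the local filesystem, so no NameNode needs to be running. -->
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>file:///</value>
    </property>
</configuration>
```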


If you want to run in pseudo-distributed mode (which I'm guessing you intended, given your configuration and the fact that you ran start-dfs.sh), you must also remember that communication between the daemons is performed over ssh, so you need to:

  • Edit your sshd_config file (after installing ssh and backing sshd_config up)
  • Add Port 9000 (and, I believe, Port 8020) to it.

Then restart ssh and check whether you can connect to localhost via ssh. This is probably what the weird messages you got when starting the NameNode and DataNode are all about.
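Concretely, the usual way to get the passwordless ssh-to-localhost login that start-dfs.sh relies on (a sketch for OS X; it assumes Remote Login is enabled under System Preferences > Sharing):

```shell
# Generate a passphrase-less key pair if you don't already have one,
# then authorize it for logins to this machine.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Sanity check: this should print "ok" without prompting for a password.
ssh localhost echo ok
```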
