
Datanode process not running in Hadoop

  • I set up and configured a multi-node Hadoop cluster using this tutorial.

    When I type in the start-all.sh command, it shows all the processes initializing properly as follows:

    starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-namenode-jawwadtest1.out
    jawwadtest1: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-jawwadtest1.out
    jawwadtest2: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-jawwadtest2.out
    jawwadtest1: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-secondarynamenode-jawwadtest1.out
    starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-jobtracker-jawwadtest1.out
    jawwadtest1: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-jawwadtest1.out
    jawwadtest2: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-jawwadtest2.out

     

    However, when I type the jps command, I get the following output:

    31057 NameNode
    4001 RunJar
    6182 RunJar
    31328 SecondaryNameNode
    31411 JobTracker
    32119 Jps
    31560 TaskTracker

     

    As you can see, there is no DataNode process running. I tried configuring a single-node cluster but got the same problem. Would anyone have any idea what could be going wrong here? Are there any configuration files that are not mentioned in the tutorial or that I may have overlooked? I am new to Hadoop and a bit lost; any help would be greatly appreciated.
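
    To dig further, the DataNode's detailed .log file can be tailed directly (the path below is assumed to match the log directory shown in the start-up output above):

    tail -n 100 /usr/local/hadoop/logs/hadoop-root-datanode-jawwadtest1.log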

    EDIT: hadoop-root-datanode-jawwadtest1.log:

    STARTUP_MSG:   args = []
    STARTUP_MSG:   version = 1.0.3
    STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/$
    ************************************************************/
    2012-08-09 23:07:30,717 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loa$
    2012-08-09 23:07:30,734 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapt$
    2012-08-09 23:07:30,735 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
    2012-08-09 23:07:30,736 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
    2012-08-09 23:07:31,018 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapt$
    2012-08-09 23:07:31,024 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
    2012-08-09 23:07:32,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to $
    2012-08-09 23:07:37,949 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: $
            at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(Data$
            at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransition$
            at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNo$
            at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java$
            at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNod$
            at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode($
            at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataN$
            at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.$
            at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1$
    
    2012-08-09 23:07:37,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: S$
    /************************************************************
    SHUTDOWN_MSG: Shutting down DataNode at jawwadtest1/198.101.220.90
    ************************************************************/



      January 4, 2022 1:25 PM IST
  • Follow these steps and your DataNode will start again (a command-line sketch of the same procedure follows the list).

    1. Stop DFS (stop-dfs.sh).
    2. Open hdfs-site.xml.
    3. Remove the data-directory and name-directory properties (dfs.data.dir and dfs.name.dir on Hadoop 1.x) from hdfs-site.xml and format the NameNode again.
    4. Then remove the hadoopdata directory, add the data-directory and name-directory properties back to hdfs-site.xml, and format the NameNode once more.
    5. Then start DFS again.
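
    A rough command-line version of these steps, assuming the Hadoop 1.x layout from the question above (HADOOP_HOME at /usr/local/hadoop and a data directory named hadoopdata; adjust the paths and property values to match your own hdfs-site.xml):

    # 1. Stop DFS.
    /usr/local/hadoop/bin/stop-dfs.sh

    # 2-3. Edit hdfs-site.xml, temporarily remove the dfs.name.dir /
    #      dfs.data.dir properties, then reformat the NameNode.
    vi /usr/local/hadoop/conf/hdfs-site.xml
    /usr/local/hadoop/bin/hadoop namenode -format

    # 4. Remove the old data directory (name and location assumed here),
    #    put the dfs.name.dir / dfs.data.dir properties back into
    #    hdfs-site.xml, and format the NameNode once more.
    rm -rf /usr/local/hadoop/hadoopdata
    vi /usr/local/hadoop/conf/hdfs-site.xml
    /usr/local/hadoop/bin/hadoop namenode -format

    # 5. Start DFS again and confirm that a DataNode process shows up in jps.
    /usr/local/hadoop/bin/start-dfs.sh
    jps

    Keep in mind that formatting the NameNode wipes the HDFS metadata, so this procedure is only appropriate on a fresh cluster with no data you need to keep.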
      January 5, 2022 2:19 PM IST