hdfs - Hadoop basics - error while creating directory -
I have started learning Hadoop and am getting the below error while creating a new folder:
    vm4learning@vm4learning:~/installations/hadoop-1.2.1/bin$ ./hadoop fs -mkdir helloworld
    Warning: $HADOOP_HOME is deprecated.
    15/06/14 19:46:35 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
Please help. Below are the NameNode logs:
    2015-06-14 22:01:08,158 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory /home/vm4learning/installations/hadoop-1.2.1/data/dfs/name does not exist
    2015-06-14 22:01:08,161 ERROR org.apache.hadoop.hdfs.server.namenode.FSNamesystem: FSNamesystem initialization failed.
    org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /home/vm4learning/installations/hadoop-1.2.1/data/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
            at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:304)
            at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:104)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:427)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:395)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:299)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:569)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1479)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
    2015-06-14 22:01:08,182 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /home/vm4learning/installations/hadoop-1.2.1/data/dfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
            at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:304)
            at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:104)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:427)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:395)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:299)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:569)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1479)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
    2015-06-14 22:01:08,185 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at vm4learning/192.168.1.102
    ************************************************************/
Before trying to create a directory, you should make sure your Hadoop installation is correct. Do this with the jps command, and look for any process that is missing.
In your case, the NameNode isn't up.
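As a quick illustration, on a healthy single-node Hadoop 1.x installation the jps output should list all five daemons, roughly like this (the process IDs will differ on your machine):

    $ jps
    2345 NameNode
    2468 DataNode
    2591 SecondaryNameNode
    2714 JobTracker
    2837 TaskTracker
    2960 Jps

If NameNode is missing from this list, the NameNode log is the place to look, as you did.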
As you can see in the logs, it appears the storage folders haven't been created. Create them like this:
    mkdir -p $HADOOP_HOME/dfs/name
    mkdir -p $HADOOP_HOME/dfs/name/data
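These commands assume $HADOOP_HOME is exported in your shell; if it isn't, set it first. The path below is the one from your logs, so adjust it if your layout differs:

    # hypothetical value based on the path in your NameNode logs
    export HADOOP_HOME=/home/vm4learning/installations/hadoop-1.2.1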
Then specify the following in hdfs-site.xml:
    <property>
      <name>dfs.data.dir</name>
      <value>/usr/local/hadoop/dfs/name/data</value>
      <final>true</final>
    </property>
    <property>
      <name>dfs.name.dir</name>
      <value>/usr/local/hadoop/dfs/name</value>
      <final>true</final>
    </property>
Finally, reinitialize Hadoop, and remember to format the NameNode before doing anything else.
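A minimal sketch of that reinitialization on Hadoop 1.x (be aware that formatting wipes any existing HDFS metadata, which is fine on a fresh install):

    $HADOOP_HOME/bin/hadoop namenode -format   # recreates dfs.name.dir; destroys any existing HDFS metadata
    $HADOOP_HOME/bin/start-all.sh              # starts NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
    jps                                        # verify that all five daemons are now running

After that, the original ./hadoop fs -mkdir helloworld command should succeed.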