Created May 19, 2014 10:27
HDP 2 Windows Startup Log - URISyntaxException in NameNode/DataNode
2014-05-19 15:46:53,169 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG: host = DEVWIN/192.50.50.212
STARTUP_MSG: args = []
STARTUP_MSG: version = 2.2.0.2.0.6.0-0009
STARTUP_MSG: classpath = {a_whole_list_of_jars}
STARTUP_MSG: build = git@github.com:hortonworks/hadoop-monarch.git -r b845729d6990bc11889a5bdefaf6d2221ef9e6d1; compiled by 'jenkins' on 2013-12-21T02:17Z
STARTUP_MSG: java = 1.7.0_55
************************************************************/
2014-05-19 15:46:55,157 ERROR org.apache.hadoop.hdfs.server.common.Util: Syntax error in URI C:\Lab\Apps\horton\data\hdfs\dn. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: C:\Lab\Apps\horton\data\hdfs\dn
at java.net.URI$Parser.fail(URI.java:2829)
at java.net.URI$Parser.checkChars(URI.java:3002)
at java.net.URI$Parser.parse(URI.java:3039)
at java.net.URI.<init>(URI.java:595)
at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
at org.apache.hadoop.hdfs.server.datanode.DataNode.getStorageDirs(DataNode.java:1648)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1638)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
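The failure above is straightforward to reproduce outside of Hadoop: `Util.stringAsURI` hands the configured directory string to `java.net.URI`, and a raw Windows path like `C:\Lab\Apps\horton\data\hdfs\dn` is not a valid URI — the `C:` prefix is read as a URI scheme and the backslash at index 2 is an illegal character. A minimal sketch (class name `UriCheck` is just for illustration):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriCheck {
    public static void main(String[] args) {
        // The raw Windows path as it appears in the log: "C:" parses as a
        // URI scheme, and the backslash that follows is rejected.
        String rawPath = "C:\\Lab\\Apps\\horton\\data\\hdfs\\dn";
        try {
            new URI(rawPath);
            System.out.println("parsed (unexpected)");
        } catch (URISyntaxException e) {
            // Same message as the DataNode log:
            // Illegal character in opaque part at index 2: C:\Lab\Apps\horton\data\hdfs\dn
            System.out.println(e.getMessage());
        }

        // The same directory written as a file: URI parses cleanly.
        try {
            URI ok = new URI("file:///C:/Lab/Apps/horton/data/hdfs/dn");
            System.out.println(ok.getScheme()); // file
        } catch (URISyntaxException e) {
            System.out.println("unexpected: " + e);
        }
    }
}
```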
2014-05-19 15:46:53,371 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = DEVWIN/192.50.50.212
STARTUP_MSG: args = []
STARTUP_MSG: version = 2.2.0.2.0.6.0-0009
STARTUP_MSG: classpath = {a_whole_list_of_jars}
STARTUP_MSG: build = git@github.com:hortonworks/hadoop-monarch.git -r b845729d6990bc11889a5bdefaf6d2221ef9e6d1; compiled by 'jenkins' on 2013-12-21T02:17Z
STARTUP_MSG: java = 1.7.0_55
************************************************************/
2014-05-19 15:46:59,174 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-05-19 15:46:59,291 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2014-05-19 15:46:59,342 INFO org.apache.hadoop.http.HttpServer: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
2014-05-19 15:46:59,344 INFO org.apache.hadoop.http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2014-05-19 15:46:59,344 INFO org.apache.hadoop.http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2014-05-19 15:46:59,344 INFO org.apache.hadoop.http.HttpServer: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2014-05-19 15:46:59,355 INFO org.apache.hadoop.http.HttpServer: dfs.webhdfs.enabled = true
2014-05-19 15:46:59,361 INFO org.apache.hadoop.http.HttpServer: Added filter 'SPNEGO' (class=org.apache.hadoop.hdfs.web.AuthFilter)
2014-05-19 15:46:59,376 INFO org.apache.hadoop.http.HttpServer: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2014-05-19 15:46:59,481 INFO org.apache.hadoop.http.HttpServer: Jetty bound to port 50070
2014-05-19 15:46:59,481 INFO org.mortbay.log: jetty-6.1.26
2014-05-19 15:46:59,660 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: 'signature.secret' configuration not set, using a random value as secret
2014-05-19 15:46:59,692 INFO org.mortbay.log: Started SelectChannelConnector@DEVWIN:50070
2014-05-19 15:46:59,692 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Web-server up at: DEVWIN:50070
2014-05-19 15:46:59,711 ERROR org.apache.hadoop.hdfs.server.common.Util: Syntax error in URI C:\Lab\Apps\horton\data\hdfs\nn. Please check hdfs configuration.
java.net.URISyntaxException: Illegal character in opaque part at index 2: C:\Lab\Apps\horton\data\hdfs\nn
at java.net.URI$Parser.fail(URI.java:2829)
at java.net.URI$Parser.checkChars(URI.java:3002)
at java.net.URI$Parser.parse(URI.java:3039)
at java.net.URI.<init>(URI.java:595)
at org.apache.hadoop.hdfs.server.common.Util.stringAsURI(Util.java:48)
at org.apache.hadoop.hdfs.server.common.Util.stringCollectionAsURIs(Util.java:98)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getStorageDirs(FSNamesystem.java:1119)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNamespaceDirs(FSNamesystem.java:1074)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkConfiguration(FSNamesystem.java:508)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:563)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:443)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:491)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:684)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:669)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1254)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1320)
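Both traces point at the same root cause: the storage directories were configured as raw backslash paths. Assuming they are set in hdfs-site.xml under the standard Hadoop 2.x properties `dfs.namenode.name.dir` and `dfs.datanode.data.dir`, a likely remedy is to write the Windows paths with forward slashes, or as explicit `file:` URIs so they parse unambiguously — a sketch, not verified against this cluster:

```xml
<!-- hdfs-site.xml: forward slashes (or file:/// URIs) are valid URI syntax,
     unlike C:\Lab\...\hdfs\nn, which fails to parse at startup. -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///C:/Lab/Apps/horton/data/hdfs/nn</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///C:/Lab/Apps/horton/data/hdfs/dn</value>
</property>
```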