MySQL to Sqoop - Connection refused: java.net.ConnectException


I am trying to import data from my local MySQL database into HDFS using Sqoop, but whenever I run the command it throws a connection refused exception.

sqoop import --connect jdbc:mysql://localhost/sqoop --username root --password 123@ajith --table mysql_sqoop --m 1

But it fails with a connection refused error.

Everything is installed on my local machine.

The error output is attached below:

Warning: /usr/lib/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/lib/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/04 14:44:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/04 14:44:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/05/04 14:44:50 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/04 14:44:50 INFO tool.CodeGenTool: Beginning code generation
16/05/04 14:44:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_sqoop` AS t LIMIT 1
16/05/04 14:44:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_sqoop` AS t LIMIT 1
16/05/04 14:44:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hduser_/compile/21bf2271e2878039d8e7c32486f8b7b7/mysql_sqoop.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/05/04 14:44:52 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hduser_/compile/21bf2271e2878039d8e7c32486f8b7b7/mysql_sqoop.jar
16/05/04 14:44:52 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/05/04 14:44:52 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/05/04 14:44:52 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/05/04 14:44:52 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/05/04 14:44:52 INFO mapreduce.ImportJobBase: Beginning import of mysql_sqoop
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/05/04 14:44:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/04 14:44:52 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/05/04 14:44:53 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/05/04 14:44:53 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/05/04 14:44:53 ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
 at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
 at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
 at org.apache.hadoop.ipc.Client.call(Client.java:1479)
 at org.apache.hadoop.ipc.Client.call(Client.java:1412)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
 at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
 at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
 at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
 at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
 at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
 at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
 at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
 at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
 at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
 at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
 at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
 at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
 at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
 at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
 at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
 at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
 at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
 at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
 at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
 at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
 at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
 at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
 at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
 at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
 at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.net.ConnectException: Connection refused
 at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
 at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
 at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
 at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
 at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
 at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
 at org.apache.hadoop.ipc.Client.call(Client.java:1451)
 ... 40 more

3 Answers

Please restart Hadoop and try again.
./stop-dfs.sh
./stop-yarn.sh
./start-dfs.sh
./start-yarn.sh
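
After restarting, you can confirm the NameNode is actually reachable before re-running the Sqoop job (a minimal sketch; it assumes the Hadoop client is on your PATH):

# probe the exact address the refused call in the stack trace was made to
hadoop fs -ls hdfs://localhost:9000/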


Can you try this?

sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password 123@ajith --table mysql_sqoop --m 1

I think you should add the port 3306.
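
The stack trace shows the refused call is to localhost:9000 (the HDFS NameNode), so if the same error persists it can also help to check the MySQL side on 3306 separately; a rough sketch (database name and credentials are taken from the question):

# should list the mysql_sqoop table if the JDBC half of the job is fine
mysql -h 127.0.0.1 -P 3306 -u root -p -e "USE sqoop; SHOW TABLES;"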


I tried that, but the same error occurs again. Please see the error message: ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused - John Simon
When I run hadoop fs -ls hdfs://localhost:9000/ I get the following error: 16/05/05 10:23:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable ls: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused - John Simon
You should use the correct address to access HDFS. Do you know what it is? You can list the files in your HDFS like this: hadoop fs -ls xxxx - cdhit
hadoop fs -ls / Found 2 items drwxr-xr-x - hduser_ supergroup 0 2016-05-03 15:44 /system drwxr-xr-x - hduser_ supergroup 0 2016-05-03 15:37 /user - John Simon
Please check your HDFS configuration file, for example: <property> <name>fs.default.name</name> <value>hdfs://hadoop:8020</value> </property> Could you pass this value via the Sqoop parameter --hadoop-home xxx and try again? - cdhit
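
A quick way to see which NameNode address the client is actually configured with (a sketch, assuming a standard layout under $HADOOP_HOME):

# print the configured default filesystem (fs.defaultFS / fs.default.name)
hdfs getconf -confKey fs.defaultFS

# or inspect the property directly in core-site.xml
grep -A 2 'fs.default' $HADOOP_HOME/etc/hadoop/core-site.xml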


Use the jps command to check whether all the daemons are running; if a daemon service has not started, you can get a connection refused error.

To start all the services, use start-all.sh (note: it is deprecated, but it will start everything). If you want to start particular services individually, use the corresponding scripts.
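
For a single-node setup that roughly looks like the following (a sketch; daemon names can differ slightly between Hadoop versions, and the paths assume the scripts live under $HADOOP_HOME/sbin):

# expect to see NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
jps

# start HDFS and YARN individually if any of them are missing
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh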

