Installing the SVN plugin for Eclipse on Ubuntu 12.04 commonly runs into two problems:
- Failed to load JavaHL Library.
- Incompatible JavaHL library loaded[…]
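Both messages point at the native JavaHL bindings being absent, or the wrong version for the installed SVN plugin. A minimal sketch of the common Ubuntu fix, assuming Subclipse and the distribution's libsvn-java package (the library path and the eclipse.ini location are assumptions to verify locally):

# Install the native JavaHL bindings for Subversion
sudo apt-get install libsvn-java

# Point Eclipse at the native library by appending a JVM argument to
# eclipse.ini (arguments after -vmargs go at the end of the file).
# /usr/lib/jni is the usual Ubuntu location of libsvnjavahl-1.so, and
# /path/to/eclipse is a placeholder for your install directory.
echo "-Djava.library.path=/usr/lib/jni" >> /path/to/eclipse/eclipse.ini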
Besides building with sbt/sbt assembly, the Spark source can also be compiled with Maven. The steps are as follows:
1. Configure Maven's parameters
[…]
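The excerpt is cut off at the code block, but step 1 normally amounts to raising Maven's memory limits before invoking the build. A sketch, with values taken from the Spark build documentation of that era rather than from this post:

# Maven needs extra heap/permgen, or the Spark build fails with OutOfMemoryError
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Then build, selecting the target Hadoop/YARN version (versions illustrative)
mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package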
Running select count(*) from table_name in the Hive CLI on a DataNode fails with:
java.io.IOException: java.net.ConnectException: Call From Slave7.Hadoop/192.168.8.207 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
	at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:331)
	at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:416)
	at org.apache.hadoop.mapred.YARNRunner.getJobStatus(YARNRunner.java:522)
	at org.apache.hadoop.mapreduce.Cluster.getJob(Cluster.java:183)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:580)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:578)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at org.apache.hadoop.mapred.JobClient.getJobUsingCluster(JobClient.java:578)
	at org.apache.hadoop.mapred.JobClient.getJob(JobClient.java:596)
	at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:288)
	at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:547)
	at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:426)
	at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:616)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
From […]
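The explanation is truncated, but the trace itself names the likely culprit: port 10020 is the default for mapreduce.jobhistory.address, so Hive's job client is trying to reach a MapReduce JobHistory Server that is not running (and the 0.0.0.0 target suggests the address was never configured in mapred-site.xml). A sketch of the usual fix, assuming a standard Hadoop 2.x layout and reusing the cluster's Master.Hadoop hostname:

# In mapred-site.xml, point clients at a real history-server address
# instead of 0.0.0.0, e.g.:
#   mapreduce.jobhistory.address = Master.Hadoop:10020

# Then start the JobHistory Server on that node
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver

# Confirm it is up before re-running the Hive query
jps | grep JobHistoryServer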
Sqoop failed while migrating data from Oracle into Hive:
[hadoop@Master ~]$ sqoop import --connect jdbc:oracle:thin:@192.168.6.77:1521:orcl --username micmiu -P --table T_DEMO --warehouse-dir /user/sqoop --hive-import --create-hive-table
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Enter password:
14/04/11 11:30:18 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
14/04/11 11:30:18 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
14/04/11 11:30:18 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --hive-home
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --hive-overwrite
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --create-hive-table
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --hive-table
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --hive-partition-key
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --hive-partition-value
14/04/11 11:30:18 WARN tool.BaseSqoopTool:      --map-column-hive
14/04/11 11:30:18 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
14/04/11 11:30:18 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
14/04/11 11:30:18 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
14/04/11 11:30:18 WARN tool.BaseSqoopTool: from command line to remove this warning.
14/04/11 11:30:18 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
14/04/11 11:30:18 INFO tool.BaseSqoopTool: hive-partition-value and --map-column-hive options are
14/04/11 11:30:18 INFO tool.BaseSqoopTool: are also valid for HCatalog imports and exports
14/04/11 11:30:18 INFO manager.SqlManager: Using default fetchSize of 1000
14/04/11 11:30:18 INFO tool.CodeGenTool: Beginning code generation
14/04/11 11:30:19 INFO manager.OracleManager: Time zone has been set to GMT
14/04/11 11:30:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM T_DEMO t WHERE 1=0
14/04/11 11:30:19 ERROR tool.ImportTool: Imported Failed: Attempted to generate class with no columns!
The cause lies in the --username parameter: the value micmiu[…]
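The explanation is cut off, but this is a well-known Sqoop/Oracle pitfall: Oracle stores unquoted identifiers in uppercase, while Sqoop looks up the column metadata case-sensitively, so a lowercase user or table name matches no columns and code generation aborts with "Attempted to generate class with no columns!". Assuming that is where the post was heading, the fix is simply to uppercase the Oracle user (and table) name:

# Same import, with the Oracle user and table name uppercased
sqoop import \
  --connect jdbc:oracle:thin:@192.168.6.77:1521:orcl \
  --username MICMIU -P \
  --table T_DEMO \
  --warehouse-dir /user/sqoop \
  --hive-import --create-hive-table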
Sqoop is a tool for moving data back and forth between Hadoop (Hive, HBase) and relational databases: it can import data from an RDBMS (e.g. MySQL, Oracle, Postgres) into Hadoop[…]
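The excerpts above show the import direction; for completeness, a sketch of the reverse (export) direction, with every connection detail here hypothetical:

# Export an HDFS directory back into a relational table
# (hostname, database, table, and path are placeholders)
sqoop export \
  --connect jdbc:mysql://db.example.com/testdb \
  --username dbuser -P \
  --table demo_out \
  --export-dir /user/sqoop/demo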
After integrating HBase with Hive, running a CREATE TABLE statement in the Hive shell fails with:
14/03/28 16:41:59 ERROR exec.DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: Not a host:port pair: PBUF
Master.Hadoop��ظ�(
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:602)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3661)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:252)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.IllegalArgumentException: Not a host:port pair: PBUF
Errors like this are generally caused by: the <HIVE_HOME>[…]
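The excerpt ends mid-path, but the "Not a host:port pair: PBUF" signature is characteristic of an HBase client/server version mismatch: HBase 0.96+ serializes region locations with protocol buffers (hence the PBUF bytes leaking into the message), while the older hbase jars that ship under <HIVE_HOME>/lib still expect a plain host:port string. Assuming that is the situation here, a sketch of the usual remedy is to replace Hive's bundled HBase jars with the ones from the running cluster (paths and jar names are illustrative):

# Remove the stale HBase jars bundled with Hive
rm $HIVE_HOME/lib/hbase-*.jar

# Copy in the client jars that match the running HBase cluster
cp $HBASE_HOME/lib/hbase-client-*.jar \
   $HBASE_HOME/lib/hbase-common-*.jar \
   $HBASE_HOME/lib/hbase-protocol-*.jar \
   $HIVE_HOME/lib/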