Hive: a JDBC Example
Published: 2021-10-09

1) The file test4.txt in the local directory /home/hadoop/test has the following content (fields on each line are separated by a tab):

[hadoop@master test]$ sudo vim test4.txt
1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark
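If you would rather generate the file than type it into an editor, a small stand-alone Java sketch can produce the same tab-separated content. This is just an illustration: the rows mirror the listing above, but the output path (current directory) and the class name `MakeTestFile` are my own choices, not from the original post.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;

public class MakeTestFile {
    public static void main(String[] args) throws IOException {
        // each line is "key<TAB>value", matching the '\t' field delimiter
        // that the Hive table will be created with later in the post
        List<String> rows = Arrays.asList(
                "1\tdajiangtai", "2\thadoop", "3\thive", "4\thbase", "5\tspark");
        Path out = Paths.get("test4.txt"); // demo path; the article uses /home/hadoop/test/test4.txt
        Files.write(out, rows);
        System.out.println(Files.readAllLines(out).size() + " rows written");
    }
}
```

Copy or move the resulting file to /home/hadoop/test on the HiveServer2 node so the later `load data local inpath` step can find it.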

2) Start hiveserver2:

[hadoop@master test]$ cd ${HIVE_HOME}/bin
[hadoop@master bin]$ ll
total 32
-rwxr-xr-x 1 hadoop hadoop  881 Jan 30  2015 beeline
drwxr-xr-x 3 hadoop hadoop 4096 May 14 23:28 ext
-rwxr-xr-x 1 hadoop hadoop 7311 Jan 30  2015 hive
-rwxr-xr-x 1 hadoop hadoop 1900 Jan 30  2015 hive-config.sh
-rwxr-xr-x 1 hadoop hadoop  885 Jan 30  2015 hiveserver2
-rwxr-xr-x 1 hadoop hadoop  832 Jan 30  2015 metatool
-rwxr-xr-x 1 hadoop hadoop  884 Jan 30  2015 schematool
[hadoop@master bin]$ ./hiveserver2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hive1.0.0/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

3) Program code

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Hive {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver"; // Hive JDBC driver class
    private static String url = "jdbc:hive2://master:10000/default"; // HiveServer2 connection URL; HiveServer2 is the new service shipped with Hive 0.11.0 and later
    private static String user = "hadoop"; // a user with permission to operate on HDFS
    private static String password = "";   // in non-secure mode the query runs as the given user and the password is ignored
    private static String sql = "";
    private static ResultSet res;

    public static void main(String[] args) {
        try {
            Class.forName(driverName); // load the HiveServer2 driver
            Connection conn = DriverManager.getConnection(url, user, password); // connect to the database given by the URL
            Statement stmt = conn.createStatement();

            // name of the table to create
            String tableName = "testHiveDriverTable";

            // Step 1: drop the table if it already exists
            sql = "drop table if exists " + tableName;
            stmt.execute(sql);

            // Step 2: create the table
            sql = "create table " + tableName
                    + " (key int, value string) row format delimited fields terminated by '\t' STORED AS TEXTFILE";
            stmt.execute(sql);

            // run "show tables"
            sql = "show tables '" + tableName + "'";
            res = stmt.executeQuery(sql);
            if (res.next()) {
                System.out.println(res.getString(1));
            }

            // run "describe table"
            sql = "describe " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1) + "\t" + res.getString(2));
            }

            // run "load data into table"; the path is local to the node running HiveServer2
            String filepath = "/home/hadoop/test/test4.txt";
            sql = "load data local inpath '" + filepath + "' into table " + tableName;
            stmt.execute(sql);

            // run "select * query"
            sql = "select * from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }

            // run a "regular hive query"; this one is compiled into a MapReduce job
            sql = "select count(*) from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1));
            }

            conn.close();
            conn = null;
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(1);
        }
    }
}
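One weakness of the listing above is that the Statement and ResultSet are never closed, and any exception leaks the connection. On Java 7+ the same flow can be wrapped in try-with-resources so everything closes automatically. This is a sketch under the same assumed URL, user, and table name; the classpath guard and class name `HiveTryWithResources` are my additions so the snippet degrades gracefully when the hive-jdbc jar is absent.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveTryWithResources {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.hive.jdbc.HiveDriver"); // same driver as the listing
        } catch (ClassNotFoundException e) {
            System.out.println("hive-jdbc not on classpath, skipping");
            return;
        }
        // resources declared here are closed in reverse order, even if a query throws
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://master:10000/default", "hadoop", "");
             Statement stmt = conn.createStatement();
             ResultSet res = stmt.executeQuery("select * from testHiveDriverTable")) {
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
```

The try-with-resources form removes the need for the explicit `conn.close()` and `conn = null` bookkeeping in the original program.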

4) Run the program (right-click → Run As → Run on Hadoop)

Running it directly at this point fails with an error; the fix is covered in the next post.

The run log looks like this:

2018-05-24 09:25:52,416 INFO  [org.apache.hive.jdbc.Utils] - Supplied authorities: master:10000
2018-05-24 09:25:52,418 INFO  [org.apache.hive.jdbc.Utils] - Resolved authority: master:10000
2018-05-24 09:25:52,508 INFO  [org.apache.hive.jdbc.HiveConnection] - Will try to open client transport with JDBC Uri: jdbc:hive2://master:10000/default
2018-05-24 09:25:52,509 DEBUG [org.apache.thrift.transport.TSaslTransport] - opening transport org.apache.thrift.transport.TSaslClientTransport@3834d63f
2018-05-24 09:25:52,529 DEBUG [org.apache.thrift.transport.TSaslClientTransport] - Sending mechanism name PLAIN and initial response of length 17
2018-05-24 09:25:52,533 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status START and payload length 5
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status COMPLETE and payload length 17
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Start message handled
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Main negotiation loop complete
2018-05-24 09:25:52,534 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: SASL Client receiving last message
2018-05-24 09:25:52,536 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Received message with status COMPLETE and payload length 0
... (repetitive TSaslTransport "reading/writing data length" DEBUG lines omitted) ...
2018-05-24 09:25:53,551 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: testhivedrivertable
... (TSaslTransport DEBUG lines omitted) ...
2018-05-24 09:25:53,746 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: key    int
2018-05-24 09:25:53,746 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: value    string
... (TSaslTransport DEBUG lines omitted) ...
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 1    dajiangtai
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 2    hadoop
2018-05-24 09:25:55,329 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 3    hive
2018-05-24 09:25:55,330 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 4    hbase
2018-05-24 09:25:55,330 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 5    spark
... (polling DEBUG lines while the count(*) MapReduce job runs, 09:25:55 to 09:26:33, omitted) ...
2018-05-24 09:26:33,052 DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 5
2018-05-24 09:26:33,103 DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: reading data length: 40

Output of "show tables":

testhivedrivertable

Output of "describe table":

key    int
value    string

Output of "select * query":

1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark

Alternatively, verify the results from the Hive CLI on the cluster:

hive> show tables;
OK
copy_student1
copy_student2
copy_student3
copy_student4
employee
group_gender_agg
group_gender_sum
group_test
index_test
index_tmp
partition_test
student1
student2
test
test_view
testhivedrivertable
user
Time taken: 0.153 seconds, Fetched: 17 row(s)
hive> desc testhivedrivertable;
OK
key                     int
value                   string
Time taken: 0.184 seconds, Fetched: 2 row(s)
hive> select * from testhivedrivertable;
OK
1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark
Time taken: 0.346 seconds, Fetched: 5 row(s)

That covers the main content of this section. It reflects my own learning process, and I hope it offers some guidance. If it helped, please give it a like; if not, please bear with me, and do point out any mistakes. Follow me to get future updates as soon as they are posted. Thanks!

Reprinted from: https://blog.csdn.net/py_123456/article/details/80430041
