Hive Access Interfaces
Hive provides three client access interfaces:
1) Hive CLI (Hive Command Line): clients can work directly from the command line.
2) hwi (Hive Web Interface): Hive provides a more intuitive web UI.
3) hiveserver: Hive exposes a Thrift service; Thrift clients currently exist for C++/Java/PHP/Python/Ruby.
Below we try each of these three access methods in turn.
1. Hive CLI
Simply typing the hive command enters CLI mode:
[cloud@cloud01 lib]$ hive
Hive history file=/tmp/cloud/hive_job_log_cloud_201110311056_1009535967.txt
hive> show tables;
OK
testhivedrivertable
Time taken: 3.038 seconds
hive> select * from testhivedrivertable;
OK
Time taken: 0.905 seconds
hive> quit;
[cloud@cloud01 lib]$

For more command-line options, see the official wiki page Hive Cli.
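Besides the interactive prompt, the CLI can also run statements non-interactively, which is convenient for scripting. A minimal sketch (the query is the one from the session above; the script path /tmp/queries.hql is just a made-up example):

# run a single statement and exit
hive -e "select * from testhivedrivertable"
# run statements from a script file; -S (silent mode) suppresses progress messages
hive -S -f /tmp/queries.hql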
2. Hive hwi
Hive hwi provides a more intuitive web interface and is easier to use.
1) Start hive hwi
[cloud@cloud01 ~]$ hive --service hwi
11/10/31 10:14:11 INFO hwi.HWIServer: HWI is starting up
11/10/31 10:14:11 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
11/10/31 10:14:11 INFO mortbay.log: jetty-6.1.14
11/10/31 10:14:11 INFO mortbay.log: Extract jar:file:/data/cloud/hive-0.7.1/lib/hive-hwi-0.7.1.war!/ to /tmp/Jetty_0_0_0_0_9999_hive.hwi.0.7.1.war__hwi__.hf8ccz/webapp
11/10/31 10:14:12 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999

2) Access Hive through hwi
My Hive is deployed on the machine 10.46.169.101, and hwi uses the default port 9999, so typing http://10.46.169.101:9999/hwi/ into a browser is all it takes to reach it (screenshot of the hwi page omitted here).
For more information on hwi, see the official wiki page hwi.
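The address and port that hwi listens on come from Hive's configuration. A minimal sketch of overriding them in $HIVE_HOME/conf/hive-site.xml before starting the service; to the best of my knowledge the relevant property names are hive.hwi.listen.host and hive.hwi.listen.port, and the values below are only examples:

<property>
  <name>hive.hwi.listen.host</name>
  <value>0.0.0.0</value>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
</property>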
3. hiveserver
Hive exposes its service to clients via Thrift. The Thrift bindings currently cover several languages (C++/Java/PHP/Python/Ruby) and can be found under the src/service/src directory of the Hive release. Hive also ships JDBC and ODBC drivers, which greatly simplify building applications on top of Hive. I tested the JDBC driver using the official example.
1) Start hiveserver
[cloud@cloud01 ~]$ hive --service hiveserver
Starting Hive Thrift Server

2) Create a new Java project named Hive0.7.1Test in Eclipse.
3) Add the jar files under $HIVE_HOME/lib to the project's build path.
4) Hive tables are stored on HDFS, so the Hadoop core jar must also be added to the build path; my Hadoop version is 0.20.1. (A command-line alternative to the Eclipse setup is sketched at the end of this section.)
5) Create a new class using the code from the official wiki, as follows:
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
            System.exit(1);
        }
        Connection con = DriverManager.getConnection("jdbc:hive://10.46.169.101:10000/default", "", "");
        Statement stmt = con.createStatement();
        String tableName = "testHiveDriverTable";
        stmt.executeQuery("drop table " + tableName);
        ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");
        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }
        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }
        // load data into table
        // NOTE: filepath has to be local to the hive server
        // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
        String filepath = "/tmp/a.txt";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        // select * query
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
        }
        // regular hive query
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
    }
}

6) Compile and run. The console output is as follows:

2011-10-31 11:21:31,703 WARN [main] conf.Configuration(175): DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
Running: show tables 'testHiveDriverTable'
testhivedrivertable
Running: describe testHiveDriverTable
key	int
value	string
Running: load data local inpath '/tmp/a.txt' into table testHiveDriverTable
Exception in thread "main" java.sql.SQLException: Query returned non-zero code: 10, cause: FAILED: Error in semantic analysis: Line 1:23 Invalid path '/tmp/a.txt': No files matching path file:/tmp/a.txt
    at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
    at com.zte.allen.hive.HiveJdbcClient.main(HiveJdbcClient.java:53)

An exception is thrown because the load source /tmp/a.txt cannot be found. This does not affect the rest of the test, and in hwi you can already see that the table testhivedrivertable has been created. For more about hiveserver, see the official wiki page Setting up Hive Server; there is also a page here describing how the various clients (CLI, Java, PHP, Python, ODBC, Thrift, etc.) access Hive.
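The load failure above goes away if the data file exists on the machine where hiveserver runs before the client executes. A minimal sketch of creating such a file, following the note in the code that /tmp/a.txt is a ctrl-A (\001) separated file with two fields per line (the sample rows are made up):

# run on the hiveserver host (10.46.169.101 in my setup)
printf '1\x01hello\n2\x01world\n' > /tmp/a.txt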
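For readers who prefer not to use Eclipse, steps 2) to 6) can also be done from the command line by putting the Hive jars and the Hadoop core jar on the classpath. A rough sketch; HADOOP_HOME and the exact core jar name are assumptions that depend on your installation:

export HIVE_HOME=/data/cloud/hive-0.7.1
export HADOOP_HOME=/data/cloud/hadoop-0.20.1    # assumed path, adjust to your setup
CLASSPATH=.:$HIVE_HOME/lib/*:$HADOOP_HOME/hadoop-0.20.1-core.jar
javac HiveJdbcClient.java
java -cp "$CLASSPATH" HiveJdbcClient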