Connecting to Hive from Java via JDBC
Articles in the Hive series
- Basic Hive table operations
- Collection data types in Hive
- Hive dynamic partitioning explained
- Importing data into ORC-format tables in Hive
- Connecting to Hive from Java via JDBC
- Accessing Hive through HiveServer2
- Connecting to Hive from Spring Boot for self-service data retrieval
- Linking Hive with HBase tables
- How to use Hive UDFs
- Text tokenization in Hive with UDFs
- Using the Hive window function row_number
- Zipper tables in the data warehouse
Connecting to Hive from Java to run some statistical analysis is completely routine work. This article covers two approaches: a direct Java connection and MyBatis integration.
1. Direct Java connection
Connecting directly from Java works just like connecting to MySQL; only the driverName differs.
import java.io.Closeable;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class HiveJdbcConnect implements Closeable {
    // Hive JDBC driver; URLs take the form jdbc:hive2://host:port/db
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";
    private String url;
    private String user;
    private String password;
    private Connection conn = null;

    public HiveJdbcConnect(String url, String user, String password) {
        this.url = url;
        this.user = user;
        this.password = password;
        try {
            init();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void init() throws Exception {
        Class.forName(driverName);
        conn = DriverManager.getConnection(url, user, password);
    }

    public Connection getConnection() {
        return conn;
    }

    @Override
    public void close() throws IOException {
        // Release the underlying JDBC connection
        try {
            if (conn != null) conn.close();
        } catch (SQLException e) {
            throw new IOException(e);
        }
    }
}
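Since the class implements Closeable, it works with try-with-resources. A minimal usage sketch; the URL, database, and credentials below are placeholders to be replaced with your own HiveServer2 settings:

// Placeholder host, database, and credentials; adjust to your cluster
try (HiveJdbcConnect hive = new HiveJdbcConnect(
        "jdbc:hive2://localhost:10000/default", "hive", "hive")) {
    Connection conn = hive.getConnection();
    // ... issue queries against conn here ...
} catch (IOException e) {
    e.printStackTrace();
}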
Test case:
@Test
public void test() {
    // Load connection settings from hive-config.properties (the author's own config helper)
    HiveConfigModel hiveConfigModel = ConfigureContext.getInstance("hive-config.properties")
            .addClass(HiveConfigModel.class)
            .getModelProperties(HiveConfigModel.class);
    try {
        Connection conn = DriverManager.getConnection(hiveConfigModel.getUrl(),
                hiveConfigModel.getUsername(), hiveConfigModel.getPassword());
        String sql = "show tables";
        PreparedStatement preparedStatement = conn.prepareStatement(sql);
        ResultSet rs = preparedStatement.executeQuery();
        List<String> tables = new ArrayList<>();
        while (rs.next()) {
            tables.add(rs.getString(1));
        }
        System.out.println(tables);
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
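Parameterized queries follow the usual JDBC pattern as well. A short sketch, given an open Connection conn; the table user_log and its dt partition column are hypothetical:

// user_log and its dt partition column are hypothetical; substitute your own table
String sql = "select count(1) from user_log where dt = ?";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setString(1, "2024-01-01");
    try (ResultSet rs = ps.executeQuery()) {
        if (rs.next()) {
            System.out.println(rs.getLong(1)); // row count for that partition
        }
    }
}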
2. MyBatis integration
The required Maven dependencies (only the main ones are listed, to keep this short):
<dependency>
    <groupId>org.mybatis</groupId>
    <artifactId>mybatis</artifactId>
    <version>3.2.7</version>
</dependency>
<!-- hadoop dependency -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.5</version>
</dependency>
<!-- hive-jdbc -->
<!-- https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.2.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
        </exclusion>
    </exclusions>
</dependency>
The MyBatis configuration file:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD Config 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
    <environments default="production">
        <environment id="production">
            <transactionManager type="JDBC"/>
            <dataSource type="POOLED">
                <property name="driver" value="org.apache.hive.jdbc.HiveDriver"/>
                <property name="url" value="jdbc:hive2://localhost:10000/db"/>
                <property name="username" value="hive"/>
                <property name="password" value="hive"/>
            </dataSource>
        </environment>
    </environments>
    <mappers>
        <mapper resource="mapper/hive/hm2/ActiveHelperMapper.xml"/>
    </mappers>
</configuration>
The mapper file (the namespace must match the mapper interface's fully qualified name so that sqlSession.getMapper can resolve it):
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="***">
    <select id="***" resultType="Integer" useCache="false" parameterType="String">
        select count(1) from tableName where dt = #{dateTime}
    </select>
</mapper>
The sqlSessionFactory part:
import java.io.IOException;
import java.io.InputStream;
import org.apache.ibatis.io.Resources;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class HiveSqlSessionFactory {

    private static final Logger log = LoggerFactory.getLogger(HiveSqlSessionFactory.class);
    private static final Object o = new Object();
    private static HiveSqlSessionFactory hiveSqlSessionFactory;
    private SqlSessionFactory sqlSessionFactory;

    public static HiveSqlSessionFactory getInstance() {
        synchronized (o) {
            if (hiveSqlSessionFactory == null) {
                hiveSqlSessionFactory = new HiveSqlSessionFactory();
            }
            return hiveSqlSessionFactory;
        }
    }

    private HiveSqlSessionFactory() {
        try {
            init();
        } catch (Exception e) {
            log.error("init sqlSessionError->" + e.getMessage());
        }
    }

    private void init() throws IOException {
        // Build the factory once from the MyBatis config on the classpath
        SqlSessionFactoryBuilder builder = new SqlSessionFactoryBuilder();
        InputStream inputStream = Resources.getResourceAsStream("sqlConfig.xml");
        this.sqlSessionFactory = builder.build(inputStream);
        log.info("sqlSession init->sqlConfig.xml");
    }

    public SqlSessionFactory getSqlSessionFactory() {
        synchronized (o) {
            return sqlSessionFactory;
        }
    }
}
The mapper interface:
import org.apache.ibatis.annotations.Param;

public interface ActiveHelperMapper {
    int getHelperCount(@Param("dateTime") String dateTime);
}
The DAO layer:
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class ActiveHelperMapperImpl implements ActiveHelperMapper {

    private static SqlSessionFactory sqlSessionFactory =
            HiveSqlSessionFactory.getInstance().getSqlSessionFactory();

    @Override
    public int getHelperCount(String dateTime) {
        // try-with-resources closes the session even if the query throws
        try (SqlSession sqlSession = sqlSessionFactory.openSession()) {
            ActiveHelperMapper activeHelperMapper = sqlSession.getMapper(ActiveHelperMapper.class);
            return activeHelperMapper.getHelperCount(dateTime);
        }
    }
}
Test case:
@Test
public void getHelperCountTest() {
    // DateUtils.getDayDelayFormat is the author's helper; here it presumably formats yesterday's date
    String dateTime = DateUtils.getDayDelayFormat(new Date(), -1, "yyyy-MM-dd");
    ActiveHelperMapperImpl activeHelperMapper = new ActiveHelperMapperImpl();
    System.out.println(activeHelperMapper.getHelperCount(dateTime));
}
Author: 柯广的网络日志
WeChat official account: Java大数据与数据仓库