
Big Data Comprehensive Exercise


This is a comprehensive exercise covering the following stages:

1. Data preprocessing

2. Loading the data into Hive

3. Analyzing the data

4. Saving the results to a database

5. Querying and displaying the results
The data format tables and a sample record are given below; please read the data description before working on the corresponding tasks.

Data description:

Table 1-1 Video table (fields: videoId, uploader, age, category, length, views, rate, ratings, comments, relatedId)

Table 1-2 User table (fields: uploader, videos, friends)

Raw data:

qR8WRLrO2aQ:mienge:406:People & Blogs:599:2788:5:1:0:4UUEKhr6vfA:zvDPXgPiiWI:TxP1eXHJQ2Q:k5Kb1K0zVxU:hLP_mJIMNFg:tzNRSSTGF4o:BrUGfqJANn8:OVIc-mNxqHc:gdxtKvNiYXc:bHZRZ-1A-qk:GUJdU6uHyzU:eyZOjktUb5M:Dv15_9gnM2A:lMQydgG1N2k:U0gZppW_-2Y:dUVU6xpMc6Y:ApA6VEYI8zQ:a3_boc9Z_Pc:N1z4tYob0hM:2UJkU2neoBs

Preprocessed data:

qR8WRLrO2aQ:mienge:406:People,Blogs:599:2788:5:1:0:4UUEKhr6vfA,zvDPXgPiiWI,TxP1eXHJQ2Q,k5Kb1K0zVxU,hLP_mJIMNFg,tzNRSSTGF4o,BrUGfqJANn8,OVIc-mNxqHc,gdxtKvNiYXc,bHZRZ-1A-qk,GUJdU6uHyzU,eyZOjktUb5M,Dv15_9gnM2A,lMQydgG1N2k,U0gZppW_-2Y,dUVU6xpMc6Y,ApA6VEYI8zQ,a3_boc9Z_Pc,N1z4tYob0hM,2UJkU2neoBs

1. Preprocess the raw data into the format shown in the preprocessed sample above.
Looking at the raw data, you can see that the fields are separated by ":". A video can belong to several categories, which are separated by "&" with a space on each side, and a video can likewise have several related videos, which are also separated by ":". To make the later analysis easier, we first reorganize and clean the data.
That is: separate the categories within each record by "," (dropping the surrounding spaces), and also separate the multiple related-video ids by ",", as sketched in the helper below.
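As a minimal sketch of this transformation (assuming the layout described above: nine fixed ":"-separated fields followed by a variable number of related-video ids), the plain-Java helper below shows the per-line cleaning rule. The class and method names are illustrative; in the original exercise this logic would typically sit in the map step of a MapReduce preprocessing job.

public class VideoETL {

    /**
     * Cleans one raw record: nine fixed ":"-separated fields followed by a
     * variable number of related-video ids. Returns null for malformed lines
     * so callers can filter them out.
     */
    public static String clean(String line) {
        String[] fields = line.split(":");
        if (fields.length < 9) {
            return null; // fewer than the nine mandatory fields: drop the record
        }
        // Field 4 (index 3) is the category list: "People & Blogs" -> "People,Blogs"
        fields[3] = fields[3].replace(" & ", ",");
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            sb.append(fields[i]);
            if (i < fields.length - 1) {
                // the fixed fields stay ":"-separated; related ids are re-joined with ","
                sb.append(i < 9 ? ':' : ',');
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String raw = "qR8WRLrO2aQ:mienge:406:People & Blogs:599:2788:5:1:0"
                + ":4UUEKhr6vfA:zvDPXgPiiWI:TxP1eXHJQ2Q";
        System.out.println(clean(raw));
        // qR8WRLrO2aQ:mienge:406:People,Blogs:599:2788:5:1:0:4UUEKhr6vfA,zvDPXgPiiWI,TxP1eXHJQ2Q
    }
}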

2. Load the preprocessed data into Hive

2.1 Create the database and tables
Create a database named: video
Create the raw-data tables:
Video table: video_ori    User table: video_user_ori
Create the ORC-format tables:
Video table: video_orc    User table: video_user_orc
The statements that create the raw tables are given below.

Create the video_ori video table:

create table video_ori(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>)
row format delimited
fields terminated by ':'
collection items terminated by ','
stored as textfile;

Create the video_user_ori user table:

create table video_user_ori(
uploader string,
videos int,
friends int)
row format delimited
fields terminated by ','
stored as textfile;

Write the ORC-format create statements:
Create the video_orc table:

create table video_orc(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>)
row format delimited
fields terminated by ':'
collection items terminated by ','
stored as orc;

Create the video_user_orc table:

create table video_user_orc(
uploader string,
videos int,
friends int)
row format delimited
fields terminated by ','
stored as orc;

2.2 Load the preprocessed video data into the raw table video_ori, and the raw user data into video_user_ori.
Write the load statements:

video_ori:
load data local inpath '/video.txt' overwrite into table video_ori;
video_user_ori:
load data local inpath '/user.txt' overwrite into table video_user_ori;

2.3 Query the raw tables and insert the data into the corresponding ORC tables.
Write the insert statements:

video_orc:
INSERT INTO TABLE video_orc SELECT * FROM video_ori;

video_user_orc:
INSERT INTO TABLE video_user_orc SELECT * FROM video_user_ori;

3. Run HiveQL queries against the loaded data
3.1 From the video table, select the videos with a rating of 5 and save the result to /export/rate.txt.
Write the SQL statement:

hive -e 'select * from video.video_orc where rate=5' > /export/rate.txt

3.2 From the video table, select the videos with more than 100 comments and save the result to /export/comments.txt.
Write the SQL statement:

hive -e 'select * from video.video_orc where comments>100' > /export/comments.txt

4. Save the Hive results to HBase
4.1 Create the corresponding Hive external tables. (The files written by hive -e in step 3 are tab-separated, which is why the tables below use '\t' as the field delimiter.)
Write the statement that creates the rate external table:

create external table rate(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>)
row format delimited
fields terminated by '\t'
stored as textfile;

Write the statement that creates the comments external table:

create external table comments(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>)
row format delimited
fields terminated by '\t'
stored as textfile;

4.2 Load the result data from step 3 into the external tables
Write the load statement for the rate table:

load data local inpath '/export/rate.txt' overwrite into table rate;

Write the load statement for the comments table:

load data local inpath '/export/comments.txt' overwrite into table comments;

4.3 Create Hive-managed tables mapped to HBase
The statements for this step are given below.
The Hive tables rate and comments map to the HBase tables hbase_rate and hbase_comments, respectively. Note that hbase.columns.mapping must map the first Hive column (videoId) to the HBase row key via :key, with one mapping entry per Hive column.
Create the hbase_rate table and its mapping:

create table hbase_rate(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>
)
stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
with serdeproperties ("hbase.columns.mapping" =
":key,info:uploader,info:age,info:category,info:length,info:views,info:rate,info:ratings,info:comments,info:relatedId")
tblproperties ("hbase.table.name" = "hbase_rate");

Create the hbase_comments table and its mapping:

create table hbase_comments(
videoId string,
uploader string,
age int,
category array<string>,
length int,
views int,
rate float,
ratings int,
comments int,
relatedId array<string>
)
stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
with serdeproperties ("hbase.columns.mapping" =
":key,info:uploader,info:age,info:category,info:length,info:views,info:rate,info:ratings,info:comments,info:relatedId")
tblproperties ("hbase.table.name" = "hbase_comments");

4.4 Write the insert overwrite ... select statement that populates the hbase_rate table:

INSERT OVERWRITE TABLE hbase_rate SELECT * FROM rate;

Write the insert overwrite ... select statement that populates the hbase_comments table:

INSERT OVERWRITE TABLE hbase_comments SELECT * FROM comments;

5. Query through the HBase API
5.1 Using the HBase API, scan the hbase_rate table from startRowKey=1 to endRowKey=100 and print the results.
5.2 Using the HBase API, query only the values of the comments column of the hbase_comments table.

package demo05;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import org.testng.annotations.Test;

/**
 * @projectName DataSplit
 * @title: HBaseApi
 * Created by FengYuan on 2020/01/02 21:26
 **/
@SuppressWarnings("all")
public class HBaseApi {

    @Test
    public void scan() throws Exception {
        // Connect to the cluster; HBaseConfiguration.create() loads the HBase defaults
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "node01:2181,node02:2181");
        Connection connection = ConnectionFactory.createConnection(conf);
        // Open the table
        Table hbase_rate = connection.getTable(TableName.valueOf("hbase_rate"));
        // Range scan over [startRowKey=1, endRowKey=100), as required by task 5.1
        Scan scan = new Scan();
        scan.setStartRow("1".getBytes());
        scan.setStopRow("100".getBytes());

        ResultScanner scanner = hbase_rate.getScanner(scan);
        // Each Result is one row (possibly spanning several column families and columns)
        for (Result result : scanner) {
            System.out.println(Bytes.toString(result.getRow()));
            System.out.println(Bytes.toString(result.getValue("info".getBytes(), "age".getBytes())));
        }
        // Release resources
        scanner.close();
        connection.close();
    }

    @Test
    public void searchData() throws Exception {
        // Connect to the cluster
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "node01:2181,node02:2181");
        Connection connection = ConnectionFactory.createConnection(conf);
        // Open the table
        Table hbase_comments = connection.getTable(TableName.valueOf("hbase_comments"));
        // Restrict the scan to the single column we need: info:comments
        Scan scan = new Scan();
        scan.addColumn("info".getBytes(), "comments".getBytes());
        ResultScanner scanner = hbase_comments.getScanner(scan);
        for (Result result : scanner) {
            // Iterate over the cells of the row (only info:comments is returned here)
            for (Cell cell : result.rawCells()) {
                System.out.println(Bytes.toString(CellUtil.cloneFamily(cell)) + ":"
                        + Bytes.toString(CellUtil.cloneQualifier(cell)) + ":"
                        + Bytes.toString(CellUtil.cloneValue(cell)));
            }
        }
        // Release resources
        scanner.close();
        connection.close();
    }
}