Flink 1.11.1 integration with Iceberg 0.11 (hadoop_catalog) - (1)

Confirm that the CDH environment and the Flink cluster are ready (a quick check is sketched after the list below):

  1. CDH 6.3.2
  2. Flink 1.11.1
  3. Scala 2.11
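
A quick way to confirm the versions, assuming the hadoop and flink commands are already on the PATH (they are added in the /etc/profile configuration further below); the exact output format may differ:

hadoop version                         # expect a CDH 6.3.2 Hadoop build
flink --version                        # expect Version: 1.11.1
ls $FLINK_HOME/lib/flink-dist_*.jar    # the _2.11 suffix confirms the Scala 2.11 build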

Required packages:

  1. The Iceberg runtime package for Flink (iceberg-flink-runtime):

https://repo.maven.apache.org/maven2/org/apache/iceberg/iceberg-flink-runtime/0.11.1/iceberg-flink-runtime-0.11.1.jar

  2. If you need to connect to Kafka, also download the flink-sql-connector-kafka package.

Place the packages above into Flink's lib directory and restart the Flink cluster.
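
A minimal sketch of downloading the jars and restarting a standalone cluster, assuming FLINK_HOME points at the Flink 1.11.1 installation configured below; the flink-sql-connector-kafka artifact name and version are assumptions (pick the build matching Flink 1.11.1 / Scala 2.11), and the restart command depends on how your cluster is deployed:

cd $FLINK_HOME/lib

# Iceberg runtime for Flink (the link from step 1 above)
wget https://repo.maven.apache.org/maven2/org/apache/iceberg/iceberg-flink-runtime/0.11.1/iceberg-flink-runtime-0.11.1.jar

# Optional: Kafka SQL connector (artifact/version assumed for Flink 1.11.1 / Scala 2.11)
wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-kafka_2.11/1.11.1/flink-sql-connector-kafka_2.11-1.11.1.jar

# Restart a standalone cluster so the new jars are picked up
$FLINK_HOME/bin/stop-cluster.sh && $FLINK_HOME/bin/start-cluster.sh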

Environment configuration:

  1. /etc/profile configuration

# Access HDFS as the hdfs user; point at the CDH parcel's Hadoop installation
export HADOOP_USER_NAME=hdfs
export HADOOP_HOME=/opt/cloudera/parcels/CDH/lib/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
# Expose the Hadoop jars to Flink (Flink 1.11 no longer bundles Hadoop)
export HADOOP_CLASSPATH=`hadoop classpath`

# Java, Scala and Flink installation paths
export SCALA_HOME=/usr/java/scala
export FLINK_HOME=/opt/flink-1.11.1
export JAVA_HOME=/usr/java/default
export CLASSPATH=$JAVA_HOME/lib/
export PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:${FLINK_HOME}/bin:$HADOOP_HOME/bin

Run source /etc/profile to apply the new environment variables.
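
With the jars in place and the environment sourced, the setup can be smoke-tested by starting the Flink SQL client; this is only a sketch, assuming the embedded client mode and a running Flink cluster:

# Start the Flink SQL client (picks up HADOOP_CLASSPATH and the jars in $FLINK_HOME/lib)
$FLINK_HOME/bin/sql-client.sh embedded

Inside the SQL client, an Iceberg hadoop_catalog can then be created along these lines (the warehouse path below is a placeholder to adapt to your HDFS layout):

CREATE CATALOG hadoop_catalog WITH (
  'type'='iceberg',
  'catalog-type'='hadoop',
  'warehouse'='hdfs://nameservice1/warehouse/iceberg',
  'property-version'='1'
);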