shell scheduled-task script
程序员文章站
2022-07-06 10:46:31
Run `crontab -e` to open the scheduled-task editor; editing works the same way as in vim.
# For example, you can run a backup of all your user accounts
# at 5 a.m. every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
At 8 a.m. every day, change into the work_dir directory, run the test.sh script as a background job, and write the log to logs/log_2020-08-12.log:
00 08 * * * cd /home/zz/work_dir && nohup bash -x test.sh > logs/log_$(date +'\%Y-\%m-\%d').log 2>&1 &
Note: the logs folder must be created manually under work_dir beforehand, otherwise the redirect fails and no log file is produced.
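A minimal sketch of that setup step (the path is the one from the crontab example above; adjust it to your environment):

```shell
# Create the logs directory ahead of time; `mkdir -p` succeeds
# even if the directory already exists.
work_dir=/home/zz/work_dir
mkdir -p "$work_dir/logs"
echo "logs directory ready: $work_dir/logs"
```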
A demo of test.sh:
#!/bin/bash
# Accept an optional date argument; default to today's date.
if [ $# -eq 1 ]
then
    current_dt=$1
elif [ $# -gt 1 ]
then
    echo "too many parameters!"
    exit 1
else
    current_dt=$(date "+%Y-%m-%d")
fi
echo ${current_dt}
# Reference dates relative to current_dt
last_dt=$(date -d "1 day ago $current_dt" +%Y-%m-%d)
last_2dt=$(date -d "2 day ago $current_dt" +%Y-%m-%d)
last_7dt=$(date -d "7 day ago $current_dt" +%Y-%m-%d)
last_30dt=$(date -d "30 day ago $current_dt" +%Y-%m-%d)
echo ${last_dt}
echo ${last_2dt}
echo ${last_7dt}
echo ${last_30dt}
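The four `date -d` calls above rely on GNU date's relative-date parsing, where "N day ago" combined with a base date yields the date N days earlier. A minimal standalone sketch with a fixed, illustrative base date:

```shell
# GNU date: "<base date> N day ago" gives the date N days before base.
base=2020-08-12   # illustrative value; the script uses $current_dt
prev_day=$(date -d "$base 1 day ago" +%Y-%m-%d)
prev_week=$(date -d "$base 7 day ago" +%Y-%m-%d)
echo "$prev_day"    # 2020-08-11
echo "$prev_week"   # 2020-08-05
```

Note this syntax is GNU-specific; BSD/macOS date uses `-v` adjustment flags instead.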
/app/hadoop/hive/bin/hive -e "
insert overwrite table table_name partition(dt)
select id
,min(fupload_dt) as fupload_dt --first upload date
,max(lupload_dt) as lupload_dt --most recent upload date
,'${last_dt}' as pt
from(
select id
,fupload_dt
,lupload_dt
,dt
from table_name
where dt='${last_2dt}'
union all
select id
,dt as fupload_dt
,dt as lupload_dt
,dt
from XX
where dt='${last_dt}'
)t
group by id
;
"
# keep only the last 7 days of data
/app/hadoop/hive/bin/hive -e "
alter table table_name drop partition(dt<'${last_7dt}');
"
echo "---insert table_name done!"
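The two hive calls above run unconditionally, so a failed INSERT would still be followed by the partition drop. A sketch of one way to stop on failure (the `run_step` helper is hypothetical, not part of the original script):

```shell
#!/bin/bash
set -euo pipefail   # abort on any unchecked failing command

# Hypothetical wrapper: run a command, print a done/FAILED line,
# and propagate the command's exit status.
run_step() {
    local name=$1; shift
    if "$@"; then
        echo "---$name done!"
    else
        echo "---$name FAILED" >&2
        return 1
    fi
}

# In the real script the wrapped command would be, e.g.:
#   run_step "insert table_name" /app/hadoop/hive/bin/hive -e "insert overwrite ..."
run_step "insert table_name" true
```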