
A Worked Example of Deleting Duplicate Data in PostgreSQL


1. Create the table

/*
 Navicat Premium Data Transfer

 Source Server         : localhost
 Source Server Type    : PostgreSQL
 Source Server Version : 110012
 Source Host           : localhost:5432
 Source Catalog        : postgres
 Source Schema         : public

 Target Server Type    : PostgreSQL
 Target Server Version : 110012
 File Encoding         : 65001

 Date: 30/07/2021 10:10:04
*/


-- ----------------------------
-- Table structure for test
-- ----------------------------
DROP TABLE IF EXISTS "public"."test";
CREATE TABLE "public"."test" (
  "id" int4 NOT NULL,
  "name" varchar(255) COLLATE "pg_catalog"."default" DEFAULT NULL,
  "age" int4 DEFAULT NULL
);

-- ----------------------------
-- Records of test
-- ----------------------------
INSERT INTO "public"."test" VALUES (1, 'da', 1);
INSERT INTO "public"."test" VALUES (2, 'da', 12);
INSERT INTO "public"."test" VALUES (3, 'dd', 80);
INSERT INTO "public"."test" VALUES (4, 'dd', 80);
INSERT INTO "public"."test" VALUES (5, 'd1', 13);

-- ----------------------------
-- Primary key structure for table test
-- ----------------------------
ALTER TABLE "public"."test" ADD CONSTRAINT "test_pkey" PRIMARY KEY ("id");
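
With the dump applied, a quick check confirms the seed data: five rows, with the names 'da' and 'dd' each appearing twice.

-- Sanity check of the seed data.
SELECT * FROM test ORDER BY id;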

2. Find duplicates by name

First, see which names are duplicated:

SELECT name, count(1) FROM test GROUP BY name HAVING count(1) > 1;

Output:

name    count
da          2
dd          2
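
If you want to see the full duplicated rows rather than only the offending names, one variant (a sketch using a window function, assuming the same test table as above) is:

-- Attach a per-name count to every row, then keep rows whose name occurs more than once.
SELECT id, name, age
FROM (
  SELECT *, count(*) OVER (PARTITION BY name) AS cnt
  FROM test
) t
WHERE cnt > 1;

On the seed data this returns rows 1 through 4, since 'da' and 'dd' each occur twice.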

3. Delete all duplicated rows

Note that the duplicated names are first queried out into a derived table, and the delete then filters against it:

DELETE FROM test
WHERE name IN (
  SELECT t.name
  FROM ( SELECT name, count(1) FROM test GROUP BY name HAVING count(1) > 1 ) t
);
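
PostgreSQL can express the same delete without the derived table by joining the table to itself with DELETE ... USING (a sketch; like the query above, it removes every copy of a duplicated name):

-- Delete any row whose name also appears on a different row.
DELETE FROM test a
USING test b
WHERE a.name = b.name
  AND a.id <> b.id;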

4. Keep one row per name

Here is a preview of the rows that will survive: among duplicated names, we keep the row with the largest id. (The unique name 'd1' does not show up below, because its only row is also its group's min(id); the actual delete in step 5 does keep it.)

SELECT *
FROM test
WHERE id NOT IN (
  SELECT min( id ) AS id FROM test GROUP BY name
);
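
A preview that also lists the surviving unique rows (a sketch using row_number(), which ranks each name group by id descending and keeps the first row) is:

-- One row per name: the row with the largest id in its group.
SELECT id, name, age
FROM (
  SELECT *, row_number() OVER (PARTITION BY name ORDER BY id DESC) AS rn
  FROM test
) t
WHERE rn = 1;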

5. Delete the duplicates

DELETE FROM test
WHERE id NOT IN (
  SELECT t.id
  FROM ( SELECT max( id ) AS id FROM test GROUP BY name ) t
);
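
The same keep-the-largest-id cleanup can also be written as a self-join (a sketch using DELETE ... USING): any row that has a same-name partner with a larger id is removed, which leaves exactly the max-id row in each name group.

-- Delete rows that have a same-name row with a larger id.
DELETE FROM test a
USING test b
WHERE a.name = b.name
  AND a.id < b.id;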

This concludes the worked example of deleting duplicate data in PostgreSQL. For more on removing duplicate rows in PostgreSQL, please search earlier articles or browse the related articles below, and thank you for your continued support!