A must-have for Python crawlers: build your own IP proxy pool and stop fearing IP bans.
Why use proxy IPs
When crawling, many sites deploy anti-crawling measures, the most common of which is limiting how many requests a single IP may make. Once your local IP has been banned by a site, you may need to switch to a proxy to keep crawling.
Development approach
1. Use the local IP to scrape the first batch of seed proxy IPs
Scraping proxy IPs from a proxy-listing site is itself a crawling job, and sending too many requests in a short time will get us blocked. So we use the local IP to grab the first batch of proxy IPs, then switch to those proxies to scrape further batches.
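As a rough sketch of this bootstrap idea (the actual implementation lives in GetProxyIP.py and may differ; parse_proxies here stands for a hypothetical callable that extracts (ip, port) pairs from the page HTML):

import requests

def bootstrap(first_url, next_url, headers, parse_proxies):
    # The very first request goes out over the local IP -- no proxies argument
    seed_html = requests.get(first_url, headers = headers, timeout = 5).text
    seed_proxies = parse_proxies(seed_html)  # hypothetical parser returning (ip, port) pairs
    # Later requests are routed through one of the freshly scraped proxies
    ip, port = seed_proxies[0]
    proxies = {"http": "http://%s:%s" % (ip, port),
               "https": "https://%s:%s" % (ip, port)}
    return requests.get(next_url, headers = headers, proxies = proxies, timeout = 5).text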
2. Validate the seed proxy IPs and store the valid ones in the database
We create two tables in the IP.db database: proxy_ip_table (stores every scraped IP, used to check that the scraping itself works) and validation_ip_table (stores every IP that passes validation, used to track proxy validity).
The proxy IPs obtained in step 1 are written to validation_ip_table once they pass the check; the check is implemented as follows:
def ip_validation(self, ip):
    # Check anonymity: a non-high-anonymity proxy still gives away your real IP
    anonymity_flag = False
    if "高匿" in str(ip):  # "高匿" is the listing-page label for a high-anonymity proxy
        anonymity_flag = True
    IP = str(ip[0]) + ":" + str(ip[1])
    url = "http://httpbin.org/get"  # site used to test whether the proxy works
    # Map both schemes so the proxy is actually used for the http:// test URL
    proxies = {"http": "http://" + IP, "https": "https://" + IP}
    headers = FakeHeaders().random_headers_for_validation()
    # Check availability
    validation_flag = True
    response = None
    try:
        response = requests.get(url = url, headers = headers, proxies = proxies, timeout = 5)
    except requests.RequestException:
        validation_flag = False
    if response is None:
        validation_flag = False
    if anonymity_flag and validation_flag:
        return True
    else:
        return False
3. Build the list of URLs to visit and scrape them in a loop; each scraped ip_list is validated and then written to the database
We build the list of URLs to visit:
self.URLs = [ "https://www.xicidaili.com/nn/%d" % (index + 1) for index in range(100)]
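From there the crawl loop is conceptually simple. A minimal sketch, assuming a hypothetical fetch_ip_list() helper that scrapes one listing page and returns (IP, PORT, ADDRESS, TYPE, PROTOCOL) tuples (the real loop lives in GetProxyIP.py and may differ in detail):

from DatabaseTable import IPPool

def crawl_all(self):
    for url in self.URLs:
        ip_list = fetch_ip_list(url)  # hypothetical scraper for one listing page
        IPPool("proxy_ip_table").insert(ip_list)  # keep everything that was scraped
        valid = [ip for ip in ip_list if self.ip_validation(ip)]
        if valid:
            IPPool("validation_ip_table").insert(valid)  # keep only proxies that passed validation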
Included modules
1. RandomHeaders.py
Builds random request headers that mimic different browsers. Usage:
from RandomHeaders import FakeHeaders
# return request headers for the xici proxy site
xici_headers = FakeHeaders().random_headers_for_xici()
2. DatabaseTable.py
Provides table creation plus insert, delete, and select operations on the database. Usage:
from DatabaseTable import IPPool
tablename = "proxy_ip_table"
# tablename can also be validation_ip_table
IPPool(tablename).create()  # create the table
IPPool(tablename).select(random_flag = False)
# returns one random record when random_flag = True, otherwise all records
IPPool(tablename).delete(delete_all = True)  # delete all records
3. GetProxyIP.py
The core code; several functions cover the different tasks:
- Build the tables, scrape IPs, and store them in the database, all from scratch
from GetProxyIP import Crawl
Crawl().original_run()
- When there are not enough proxy IPs, scrape the pages in a given url_list and add the qualifying IPs to the pool
from GetProxyIP import Crawl
# other sites that provide proxy IPs
url_kuaidaili = ["https://www.kuaidaili.com/free/inha/%d" % (index + 1) for index in range(10,20)]
Crawl().get_more_run(url_kuaidaili)
- When the pool has gone unused for a long time, re-check the stored IPs and delete the ones that no longer work (a sketch of such a pass follows below)
from GetProxyIP import Crawl
Crawl().proxy_ip_validation()
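A minimal sketch of what such a re-validation pass could look like, built on the ip_validation() method and the IPPool helpers shown in this article (the actual proxy_ip_validation() may differ in detail):

from DatabaseTable import IPPool

def revalidate_pool(self):
    pool = IPPool("validation_ip_table")
    for record in pool.select(random_flag = False):  # all stored (IP, PORT, ...) rows
        if not self.ip_validation(record):
            pool.delete(IP = record)  # delete() only uses the first element, the IP string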
Selected code
1. RandomHeaders.py
Provides random request headers that mimic browser visits to get past basic anti-crawling checks.
# -*- coding: utf-8 -*-
"""
Created on Tue Jan 29 10:36:28 2019
@author: YANG
Purpose: generate random request headers that mimic different browsers
"""
import random
from fake_useragent import UserAgent

class FakeHeaders(object):
    """
    Generate random request headers
    """
    def __init__(self):
        # A pool of User-Agent strings covering common desktop browsers
        self.__UA = [
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0;",
"Mozilla/5.0 (Windows; U; Windows NT 5.1) Gecko/20070803 Firefox/1.5.0.12",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.11 TaoBrowser/2.0 Safari/536.11",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser)",
"Opera/9.27 (Windows NT 5.2; U; zh-cn)",
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)",
"Opera/9.80 (Windows NT 5.1; U; zh-cn) Presto/2.9.168 Version/11.50",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.9.2.1000 Chrome/39.0.2146.0Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.87 Safari/537.36",
"Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E)",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 BIDUBrowser/8.3 Safari/537.36",
"Mozilla/4.0 (compatible; MSIE 6.0; ) Opera/UCWEB7.0.2.37/28/999",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
"Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; 360SE)",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.57.2 (KHTML, like Gecko) Version/5.1.7 Safari/534.57.2",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.154 Safari/537.36 LBBROWSER",
"Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.89 Safari/537.1",
"Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2486.0 Safari/537.36 Edge/13.10586",
"Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SE 2.X MetaSr 1.0; SE 2.X MetaSr 1.0; .NET CLR 2.0.50727; SE 2.X MetaSr 1.0)",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.87 Safari/537.36 OPR/37.0.2178.32",
"Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; InfoPath.3; rv:11.0) like Gecko",
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; LBBROWSER)",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
"Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
"Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.84 Safari/535.11 SE 2.X MetaSr 1.0",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80 Safari/537.36 Core/1.47.277.400 QQBrowser/9.4.7658.400",
"Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 UBrowser/5.6.12150.8 Safari/537.36",
"Mozilla/5.0 (Windows; U; Windows NT 5.1) Gecko/20070309 Firefox/2.0.0.3",
"Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; SE 2.X MetaSr 1.0)",
"Mozilla/4.0 (compatible; MSIE 12.0",
"Mozilla/5.0 (Windows; U; Windows NT 5.2) Gecko/2008070208 Firefox/3.0.1",
"Mozilla/5.0 (Windows NT 5.2) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.122 Safari/534.30",
"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36",
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 10.0; WOW64; Trident/7.0; Touch; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; Tablet PC 2.0)",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.71 Safari/537.1 LBBROWSER",
"Mozilla/5.0 (Windows NT 10.0; WOW64; rv:38.0) Gecko/20100101 Firefox/38.0",
"Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0)",
"Mozilla/5.0 (Windows NT 5.1; rv:44.0) Gecko/20100101 Firefox/44.0",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 Safari/537.36 TheWorld 7",
"Mozilla/5.0 (Windows NT 6.1; rv,2.0.1) Gecko/20100101 Firefox/4.0.1",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36 SE2.X MetaSr 1.0",
        ]
        # The User-Agent identifies the browser type/version, OS, and rendering engine

    def random_headers_for_xici(self):
        headers = {
            "User-Agent": UserAgent().random,  # pick a random UA
            "Accept-Language": "zh-CN,zh;q=0.9,en-US;q=0.8,en;q=0.7",
            "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8",
            "Accept-Encoding": "gzip, deflate, br",
            "Cache-Control": "max-age=0",
            "Connection": "keep-alive",
            "Host": "www.xicidaili.com",
            "Upgrade-Insecure-Requests": "1"
        }
        return headers

    def random_headers_for_validation(self):
        headers = {
            "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8",
            "Accept-Encoding": "gzip, deflate",
            "Accept-Language": "zh-CN,zh;q=0.9",
            "Connection": "close",
            "Host": "httpbin.org",
            "Upgrade-Insecure-Requests": "1",
            "User-Agent": UserAgent().random
        }
        return headers

if __name__ == "__main__":
    print("20 random headers:")
    for i in range(20):
        print(FakeHeaders().random_headers_for_xici())
2. DatabaseTable.py
Provides the database layer; the IPs are stored in the IP.db database.
import sqlite3  # lets Python programs use a SQLite database
import time

class IPPool(object):
    # Database that stores the proxy IPs; it contains the two tables
    # proxy_ip_table and validation_ip_table.
    # The insert statement below matches the schema used in create().
    def __init__(self, table_name):
        # The class is initialised with the name of the table to operate on
        self.__table_name = table_name
        self.__database_name = "IP.db"  # IPPool works on the IP.db database

    def create(self):
        # Table-creation statement
        conn = sqlite3.connect(self.__database_name, isolation_level = None)
        conn.execute(
            "create table if not exists %s(IP CHAR(20) UNIQUE, PORT INTEGER, ADDRESS CHAR(50), TYPE CHAR(50), PROTOCOL CHAR(50))"
            % self.__table_name)
        print("Created table %s in IP.db" % self.__table_name)

    def insert(self, ip):
        conn = sqlite3.connect(self.__database_name, isolation_level = None)
        # isolation_level is the transaction isolation level: by default you have to
        # commit() yourself; setting it to None auto-commits every statement
        for one in ip:
            conn.execute(
                "insert or ignore into %s(IP, PORT, ADDRESS, TYPE, PROTOCOL) values (?,?,?,?,?)"
                % (self.__table_name),
                (one[0], one[1], one[2], one[3], one[4]))
        conn.commit()  # explicit commit; redundant because isolation_level is already None
        conn.close()

    def select(self, random_flag = False):
        # Connect to the database
        conn = sqlite3.connect(self.__database_name, isolation_level = None)
        cur = conn.cursor()  # the cursor receives the query results
        if random_flag:
            # When random_flag is True, fetch a single random record
            cur.execute(
                "select * from %s order by random() limit 1"
                % self.__table_name)
            result = cur.fetchone()
        else:
            cur.execute("select * from %s" % self.__table_name)
            result = cur.fetchall()
        cur.close()
        conn.close()
        return result

    def delete(self, IP = ('1', 1, '1', '1', '1'), delete_all = False):
        conn = sqlite3.connect(self.__database_name, isolation_level = None)
        if not delete_all:
            n = conn.execute("delete from %s where IP=?" % self.__table_name,
                             (IP[0],))
            # the trailing comma is required for a one-element tuple
            print("Deleted", n.rowcount, "record(s)")
        else:
            n = conn.execute("delete from %s" % self.__table_name)
            print("Deleted all records,", n.rowcount, "row(s) in total")
        conn.close()
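Once the tables are populated, using the pool comes down to drawing a random validated proxy and handing it to requests. A minimal usage sketch (assuming validation_ip_table already contains records):

import requests
from DatabaseTable import IPPool

record = IPPool("validation_ip_table").select(random_flag = True)  # one random validated proxy
ip, port = record[0], record[1]
proxies = {"http": "http://%s:%s" % (ip, port),
           "https": "https://%s:%s" % (ip, port)}
response = requests.get("http://httpbin.org/get", proxies = proxies, timeout = 5)
print(response.status_code)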