In the Scrapy framework, the Downloader Middlewares sit between the Engine and the Downloader. Requests sent by a spider pass through the Engine to the Downloader for downloading, and once the download completes, the response travels back through the Engine to the spider for parsing. A downloader middleware can therefore perform operations before a request reaches the Downloader and before a response reaches the Engine, such as setting a proxy or modifying request headers. These operations are implemented through two methods:
- process_request(self, request, spider): called before the request is sent to the Downloader
- process_response(self, request, response, spider): called before the response is sent to the Engine
process_request
Parameters:
- request: the Request object being sent
- spider: the spider that issued the request
Return values:
- None: Scrapy continues processing this request, running the corresponding methods of the other middlewares until the appropriate Downloader handler is called
- a Response object: Scrapy does not call the remaining process_request methods and returns this Response directly; the process_response methods of the enabled middlewares are still called for it
- a Request object: the original request is dropped and the returned request is scheduled for download instead
- an exception is raised: process_exception is called
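The None return path above can be sketched in a few lines. This is a hypothetical proxy-setting middleware (the class name and proxy addresses are placeholders, not from the original text), exercised with a bare stand-in for scrapy.Request so it runs without Scrapy installed:

```python
import random

class RandomProxyMiddleware:
    """Hypothetical sketch: pick a proxy before each download."""
    PROXIES = ['http://10.0.0.1:8080', 'http://10.0.0.2:8080']  # placeholder addresses

    def process_request(self, request, spider):
        request.meta['proxy'] = random.choice(self.PROXIES)
        return None  # None -> Scrapy keeps running the remaining middlewares


# Minimal stand-in for scrapy.Request, just to exercise the hook:
class FakeRequest:
    def __init__(self, url):
        self.url = url
        self.meta = {}


req = FakeRequest('http://www.httpbin.org/user-agent')
RandomProxyMiddleware().process_request(req, spider=None)
print(req.meta['proxy'])  # one of the two placeholder proxies
```

In a real project the middleware would live in middlewares.py and be enabled via DOWNLOADER_MIDDLEWARES.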
process_response
Parameters:
- request: the Request object that produced the response
- response: the Response object returned by the Downloader
- spider: the spider the response is intended for
Return values:
- a Response object: the response is passed on to the other middlewares and then to the spider
- a Request object: the middleware chain is halted and the returned request is rescheduled for download
- an exception is raised: the errback of the request is called; if no errback exists, the exception propagates
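The Response-vs-Request return logic can be illustrated with plain stand-ins instead of Scrapy objects (class names here are illustrative). Returning the request re-queues the download; returning the response passes it onward:

```python
class RetryServerErrorMiddleware:
    """Hypothetical sketch: re-schedule the download on a server error."""
    def process_response(self, request, response, spider):
        if response.status >= 500:
            request.dont_filter = True  # let the scheduler accept the duplicate
            return request              # a Request return re-schedules the download
        return response                 # a Response return passes it to the next middleware / spider


# Minimal stand-ins so the method can be exercised without Scrapy:
class FakeRequest:
    def __init__(self, url):
        self.url = url
        self.dont_filter = False


class FakeResponse:
    def __init__(self, status):
        self.status = status


mw = RetryServerErrorMiddleware()
req = FakeRequest('http://www.httpbin.org/user-agent')
print(type(mw.process_response(req, FakeResponse(503), None)).__name__)  # FakeRequest
print(type(mw.process_response(req, FakeResponse(200), None)).__name__)  # FakeResponse
```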
Setting a random request header
Using a random request header keeps the server from seeing the same header on every request. This can be implemented in a Downloader Middleware:
settings.py
The following settings still need to be configured:
- ROBOTSTXT_OBEY: set to False. When True, the crawler obeys the robots exclusion protocol: it fetches robots.txt first and skips any URLs the file disallows
- DEFAULT_REQUEST_HEADERS: the default request headers; a User-Agent can be added here so requests appear to come from a browser rather than a crawler
- DOWNLOAD_DELAY: the delay between downloads, to avoid requesting too quickly
- DOWNLOADER_MIDDLEWARES: enables the middleware defined in middlewares.py
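A minimal settings.py fragment covering the four items above might look like this (the project and class names in the middleware path are illustrative; use the ones generated for your project):

```python
# settings.py (fragment) -- project/class names below are placeholders
ROBOTSTXT_OBEY = False

DEFAULT_REQUEST_HEADERS = {
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language': 'en',
}

DOWNLOAD_DELAY = 1  # seconds between consecutive downloads

DOWNLOADER_MIDDLEWARES = {
    # 543 is the conventional priority for a project's own downloader middleware
    'header_demo.middlewares.HeaderDownloaderMiddleware': 543,
}
```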
spider
# -*- coding: utf-8 -*-
import scrapy


class HeaderSpider(scrapy.Spider):
    name = 'header'
    allowed_domains = ['httpbin.org']
    start_urls = ['http://www.httpbin.org/user-agent']

    def parse(self, response):
        print(response.text)
        yield scrapy.Request(url=self.start_urls[0], dont_filter=True)
middlewares.py
import random


class HeaderDownloaderMiddleware:  # use the class name generated for your project
    USERAGENTS = [
        'Opera/9.80 (X11; Linux x86_64; U; pl) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (X11; Linux i686; U; it) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 6.1; U; zh-cn) Presto/2.6.37 Version/11.00',
        'Opera/9.80 (Windows NT 6.1; U; pl) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 6.1; U; ko) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 6.1; U; fi) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 6.1; U; en-GB) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 6.1 x64; U; en) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 6.0; U; en) Presto/2.7.39 Version/11.00',
        'Opera/9.80 (Windows NT 5.1; U; ru) Presto/2.7.39 Version/11.00',
        'Opera/9.80 (Windows NT 5.1; U; MRA 5.5 (build 02842); ru) Presto/2.7.62 Version/11.00',
        'Opera/9.80 (Windows NT 5.1; U; it) Presto/2.7.62 Version/11.00',
        'Mozilla/5.0 (Windows NT 6.0; U; ja; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 11.00',
        'Mozilla/5.0 (Windows NT 5.1; U; pl; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 11.00',
        'Mozilla/5.0 (Windows NT 5.1; U; de; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 11.00',
        'Mozilla/4.0 (compatible; MSIE 8.0; X11; Linux x86_64; pl) Opera 11.00',
        'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; fr) Opera 11.00',
        'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; ja) Opera 11.00',
        'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; pl) Opera 11.00',
    ]

    def process_request(self, request, spider):
        # Pick a random User-Agent for every outgoing request
        request.headers['User-Agent'] = random.choice(self.USERAGENTS)
        return None
In the file above only the process_request method needs to be written. The output is:
2020-05-25 17:29:33 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.httpbin.org/user-agent> (referer: http://www.httpbin.org/user-agent)
{
"user-agent": "Opera/9.80 (Windows NT 6.1 x64; U; en) Presto/2.7.62 Version/11.00"
}
2020-05-25 17:29:34 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.httpbin.org/user-agent> (referer: http://www.httpbin.org/user-agent)
{
"user-agent": "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; ja) Opera 11.00"
}
2020-05-25 17:29:34 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.httpbin.org/user-agent> (referer: http://www.httpbin.org/user-agent)
{
"user-agent": "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; pl) Opera 11.00"
}
2020-05-25 17:29:35 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.httpbin.org/user-agent> (referer: http://www.httpbin.org/user-agent)
{
"user-agent": "Opera/9.80 (Windows NT 6.1 x64; U; en) Presto/2.7.62 Version/11.00"
}
2020-05-25 17:29:36 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://www.httpbin.org/user-agent> (referer: http://www.httpbin.org/user-agent)
{
"user-agent": "Opera/9.80 (X11; Linux x86_64; U; pl) Presto/2.7.62 Version/11.00"
}
As the output shows, the User-Agent differs across requests; in other words, the request header is chosen at random.