Python3 Web Scraping Study Notes C15【Basic Usage of Proxies】


Python3 Web Scraping Study Notes, Chapter 15:【Basic Usage of Proxies】


【15.1】Introduction to Proxies

Most websites have anti-scraping mechanisms: if the same IP sends too many requests within a certain period, the server will refuse access and simply ban that IP. Setting a proxy solves this problem. There are many free and paid proxies available online, such as Xici Proxy (西刺代理), Quanwang Proxy IP (全網代理 IP), and Kuaidaili (快代理). All that is needed to configure a proxy is the proxy's IP address and port number. If proxy software (e.g. ShadowsocksR, also known as SSR) is installed on the machine, it usually creates a local HTTP or SOCKS proxy service, and that proxy can be used directly as well.

【15.2】Using a Proxy with the urllib Library

from urllib.error import URLError
from urllib.request import ProxyHandler, build_opener

proxy = '127.0.0.1:1080'
proxy_handler = ProxyHandler({
    'http': 'http://' + proxy,
    'https': 'https://' + proxy
})
opener = build_opener(proxy_handler)
try:
    response = opener.open('http://httpbin.org/get')
    print(response.read().decode('utf8'))
except URLError as e:
    print(e.reason)

http://httpbin.org/get is a request-testing site. The proxy is set up with ProxyHandler, which takes a dictionary whose keys are protocol types and whose values are proxy addresses. The proxy is written as proxy = '127.0.0.1:1080', where 127.0.0.1 is the IP address and 1080 is the port number; here it means the local proxy software has already created a proxy service on local port 1080. The proxy address must be prefixed with the http or https scheme: when the requested URL uses the http protocol, ProxyHandler automatically uses the http proxy, and likewise, when the requested URL uses the https protocol, ProxyHandler automatically uses the https proxy. Passing the ProxyHandler object to build_opener() creates an opener, and calling its open() method with a URL accesses that link through the proxy. The result is a JSON response whose origin field shows the client's IP as seen by the server:

{
  "args": {}, 
  "headers": {
    "Accept-Encoding": "identity", 
    "Host": "httpbin.org", 
    "User-Agent": "Python-urllib/3.6"
  }, 
  "origin": "168.70.60.141, 168.70.60.141", 
  "url": "https://httpbin.org/get"
}
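
Since the origin field reflects the exit IP, a quick way to confirm the proxy is actually being used is to parse the JSON response and inspect origin directly. A minimal sketch, assuming the same hypothetical local proxy at 127.0.0.1:1080 as above:

import json
from urllib.error import URLError
from urllib.request import ProxyHandler, build_opener

proxy = '127.0.0.1:1080'  # hypothetical local proxy from the example above
opener = build_opener(ProxyHandler({
    'http': 'http://' + proxy,
    'https': 'https://' + proxy
}))
try:
    # httpbin returns JSON; origin is the IP the request appeared to come from
    data = json.loads(opener.open('http://httpbin.org/get').read().decode('utf-8'))
    print('Exit IP reported by httpbin:', data['origin'])
except URLError as e:
    print(e.reason)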

If the proxy requires authentication, simply prepend the username and password to the proxy address:

from urllib.error import URLError
from urllib.request import ProxyHandler, build_opener

proxy = 'username:password@127.0.0.1:1080'
proxy_handler = ProxyHandler({
    'http': 'http://' + proxy,
    'https': 'https://' + proxy
})
opener = build_opener(proxy_handler)
try:
    response = opener.open('http://httpbin.org/get')
    print(response.read().decode('utf8'))
except URLError as e:
    print(e.reason)

If the proxy is a SOCKS5 proxy, the socks module (provided by the PySocks package) is needed. The proxy is set up as follows:

Background: SOCKS5 is a proxy protocol that acts as an intermediary between a front-end machine and a server communicating over TCP/IP. It allows front-end machines on an internal network to reach servers on the Internet, or simply makes the communication more secure.

import socks
import socket
from urllib import request
from urllib.error import URLError

socks.set_default_proxy(socks.SOCKS5, '127.0.0.1', 1080)
socket.socket = socks.socksocket
try:
    response = request.urlopen('http://httpbin.org/get')
    print(response.read().decode('utf-8'))
except URLError as e:
    print(e.reason)

【15.3】Using a Proxy with the requests Library

To use a proxy with the requests library, simply pass the proxies parameter:

import requests

proxy = '127.0.0.1:1080'
proxies = {
    'http': 'http://' + proxy,
    'https': 'https://' + proxy
}
try:
    response = requests.get('http://httpbin.org/get', proxies=proxies)
    print(response.text)
except requests.exceptions.ConnectionError as e:
    print('Error', e.args)

Output:

{
  "args": {}, 
  "headers": {
    "Accept": "*/*", 
    "Accept-Encoding": "gzip, deflate", 
    "Host": "httpbin.org", 
    "User-Agent": "python-requests/2.22.0"
  }, 
  "origin": "168.70.60.141, 168.70.60.141", 
  "url": "https://httpbin.org/get"
}
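
With requests the same origin check is even shorter, because response.json() parses the body directly. A minimal sketch, again assuming the hypothetical local proxy at 127.0.0.1:1080:

import requests

proxy = '127.0.0.1:1080'  # hypothetical local proxy
proxies = {
    'http': 'http://' + proxy,
    'https': 'https://' + proxy
}
try:
    # origin should show the proxy's exit IP rather than the local machine's IP
    origin = requests.get('http://httpbin.org/get', proxies=proxies, timeout=10).json()['origin']
    print('Exit IP reported by httpbin:', origin)
except requests.exceptions.RequestException as e:
    print('Error', e.args)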

Likewise, if the proxy requires authentication, simply prepend the username and password to the proxy address:

import requests

proxy = 'username:password@127.0.0.1:1080'
proxies = {
    'http': 'http://' + proxy,
    'https': 'https://' + proxy
}
try:
    response = requests.get('http://httpbin.org/get', proxies=proxies)
    print(response.text)
except requests.exceptions.ConnectionError as e:
    print('Error', e.args)

If the proxy is a SOCKS5 proxy, either the requests[socks] extra (installed with pip install requests[socks]) or the socks module is needed. When using requests[socks], the proxy is set as follows:

import requests

proxy = '127.0.0.1:1080'
proxies = {
    'http': 'socks5://' + proxy,
    'https': 'socks5://' + proxy
}
try:
    response = requests.get('http://httpbin.org/get', proxies=proxies)
    print(response.text)
except requests.exceptions.ConnectionError as e:
    print('Error', e.args)

When using the socks module, the proxy is set as follows (note that this approach applies the proxy globally):

import requests
import socks
import socket

socks.set_default_proxy(socks.SOCKS5, '127.0.0.1', 1080)
socket.socket = socks.socksocket
try:
    response = requests.get('http://httpbin.org/get')
    print(response.text)
except requests.exceptions.ConnectionError as e:
    print('Error', e.args)
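
Because this patches socket.socket for the whole process, every later connection in the program will go through the SOCKS5 proxy. If that is not desired, one option is to keep a reference to the original socket class and restore it afterwards. A minimal sketch, assuming the same local SOCKS5 proxy:

import socket

import requests
import socks

# Keep a reference to the original socket class before patching it globally
default_socket = socket.socket

socks.set_default_proxy(socks.SOCKS5, '127.0.0.1', 1080)
socket.socket = socks.socksocket
try:
    print(requests.get('http://httpbin.org/get').text)
except requests.exceptions.ConnectionError as e:
    print('Error', e.args)
finally:
    # Undo the global patch so later connections are not proxied
    socket.socket = default_socket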

【15.4】Using a Proxy with Selenium

【15.4.1】Chrome

from selenium import webdriver

proxy = '127.0.0.1:1080'
chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--proxy-server=http://' + proxy)
path = r'F:\PycharmProjects\Python3爬蟲\chromedriver.exe'
browser = webdriver.Chrome(executable_path=path, chrome_options=chrome_options)
browser.get('http://httpbin.org/get')

The proxy is configured through ChromeOptions and passed via the chrome_options parameter when the Chrome object is created. After visiting the target URL, the following information is displayed:

{
  "args": {}, 
  "headers": {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3", 
    "Accept-Encoding": "gzip, deflate", 
    "Accept-Language": "zh-CN,zh;q=0.9", 
    "Host": "httpbin.org", 
    "Upgrade-Insecure-Requests": "1", 
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36"
  }, 
  "origin": "168.70.60.141, 168.70.60.141", 
  "url": "https://httpbin.org/get"
}
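
To read the origin programmatically instead of eyeballing the rendered page, the JSON can be pulled out of the <pre> element that Chrome uses to display the raw response. A minimal sketch, assuming the Selenium 3 style API and the chromedriver path used above:

import json

from selenium import webdriver

proxy = '127.0.0.1:1080'  # hypothetical local proxy
chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--proxy-server=http://' + proxy)
path = r'F:\PycharmProjects\Python3爬蟲\chromedriver.exe'
browser = webdriver.Chrome(executable_path=path, chrome_options=chrome_options)
try:
    browser.get('http://httpbin.org/get')
    # Chrome renders the raw JSON response inside a <pre> element
    data = json.loads(browser.find_element_by_tag_name('pre').text)
    print('Exit IP reported by httpbin:', data['origin'])
finally:
    browser.quit()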

For an authenticated proxy, the setup is as follows:

from selenium import webdriver
from selenium.webdriver.chrome.options import Options
import zipfile

ip = '127.0.0.1'
port = 1080
username = 'username'
password = 'password'

manifest_json = """{"version":"1.0.0","manifest_version": 2,"name":"Chrome Proxy","permissions": ["proxy","tabs","unlimitedStorage","storage","<all_urls>","webRequest","webRequestBlocking"],"background": {"scripts": ["background.js"]
    }
}
"""

background_js ="""
var config = {
        mode: "fixed_servers",
        rules: {
          singleProxy: {
            scheme: "http",
            host: "%(ip) s",
            port: %(port) s
          }
        }
      }

chrome.proxy.settings.set({value: config, scope: "regular"}, function() {});

function callbackFn(details) {
    return {
        authCredentials: {username: "%(username)s",
            password: "%(password)s"
        }
    }
}

chrome.webRequest.onAuthRequired.addListener(
            callbackFn,
            {urls: ["<all_urls>"]},
            ['blocking']
)
""" % {'ip': ip, 'port': port, 'username': username, 'password': password}

plugin_file = 'proxy_auth_plugin.zip'
with zipfile.ZipFile(plugin_file, 'w') as zp:
    zp.writestr("manifest.json", manifest_json)
    zp.writestr("background.js", background_js)
chrome_options = Options()
chrome_options.add_argument("--start-maximized")
path = r'F:\PycharmProjects\Python3爬蟲\chromedriver.exe'
chrome_options.add_extension(plugin_file)
browser = webdriver.Chrome(executable_path=path, chrome_options=chrome_options)
browser.get('http://httpbin.org/get')

To set up an authenticated proxy, a manifest.json configuration file and a background.js script are created locally and packaged as a Chrome extension. After the code runs, a proxy_auth_plugin.zip file is generated locally to hold the current configuration.

【15.4.2】PhantomJS

The proxy can be set with the service_args parameter, i.e. PhantomJS command-line arguments:

from selenium import webdriver

service_args = [
    '--proxy=127.0.0.1:1080',
    '--proxy-type=http'
]
path = r'F:\PycharmProjects\Python3爬蟲\phantomjs-2.1.1\bin\phantomjs.exe'
browser = webdriver.PhantomJS(executable_path=path, service_args=service_args)
browser.get('http://httpbin.org/get')
print(browser.page_source)

Output:

<html><head></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
  "args": {}, 
  "headers": {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8", 
    "Accept-Encoding": "gzip, deflate", 
    "Accept-Language": "zh-CN,en,*", 
    "Host": "httpbin.org", 
    "User-Agent": "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/538.1 (KHTML, like Gecko) PhantomJS/2.1.1 Safari/538.1"
  }, 
  "origin": "168.70.60.141, 168.70.60.141", 
  "url": "https://httpbin.org/get"
}
</pre></body></html>

If the proxy requires authentication, just add the --proxy-auth option to service_args:

from selenium import webdriver

service_args = [
    '--proxy=127.0.0.1:1080',
    '--proxy-type=http',
    '--proxy-auth=username:password'
]
path = r'F:\PycharmProjects\Python3爬蟲\phantomjs-2.1.1\bin\phantomjs.exe'
browser = webdriver.PhantomJS(executable_path=path, service_args=service_args)
browser.get('http://httpbin.org/get')
print(browser.page_source)