Scrapy not sending cookies with POST request

3

I want to use Scrapy to submit a POST request, but it is not sending cookies in the request headers.

Setup

Running under OS X. I created a virtualenv, ran pip install Scrapy, and then generated a default spider:

(hotlanesbot)tollspider $ scrapy startproject vai66tolls
(hotlanesbot)tollspider $ cd vai66tolls/
(hotlanesbot)vai66tolls $ scrapy genspider vai66tolls-spider vai66tolls.com

I then enabled cookie debugging in settings.py:

COOKIES_DEBUG = True
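For reference, these are the cookie-related settings involved; COOKIES_ENABLED defaults to True, so listing it is only to make the expected state explicit:

```python
# settings.py -- cookie handling in Scrapy
COOKIES_ENABLED = True   # default: CookiesMiddleware tracks and resends cookies
COOKIES_DEBUG = True     # log every Set-Cookie received and every Cookie sent
```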

Code

The spider code is very basic: parse the site, then POST the form and handle the response in parse_eb. The contents of vai66tolls_spider.py:

# -*- coding: utf-8 -*-
import scrapy
from scrapy.http.cookies import CookieJar

class Vai66tollsSpiderSpider(scrapy.Spider):
    name = 'vai66tolls-spider'
    allowed_domains = ['vai66tolls.com']
    start_urls = ['http://vai66tolls.com/']

    def parse(self, response):
        filename = "/tmp/body.html"
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)

        self.log('Initial Response headers: (%s)' % response.headers)

        # look for "cookie" things in response headers
        poss_cookies = response.headers.getlist('Set-Cookie')
        self.log('Set-Cookie?: (%s)' % poss_cookies)

        poss_cookies = response.headers.getlist('Cookie')
        self.log('Cookie?: (%s)' % poss_cookies)

        poss_cookies = response.headers.getlist('cookie')
        self.log('cookie?: (%s)' % poss_cookies)

        # Parse Eastbound
        r = scrapy.FormRequest.from_response(
            response,
            callback=self.parse_eb,
            )

        yield r

    def parse_eb(self, response):
        filename = "/tmp/eb.txt"
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)
        self.log('Request headers: %s' % response.request.headers)
        self.log('Request cookies: %s' % response.request.cookies)

You can view it on GitHub here.

Output

I'm running the spider with:

(hotlanesbot)vai66tolls $ scrapy crawl vai66tolls-spider

In the log output I see the "Received cookies" DEBUG statement, but not the "Sending cookies to" message I would expect from the documentation / CookiesMiddleware.

Here is a larger snippet of the output:

2018-01-10 08:50:35 [scrapy.core.engine] INFO: Spider opened
2018-01-10 08:50:35 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-01-10 08:50:35 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-01-10 08:50:35 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://vai66tolls.com/robots.txt> from <GET http://vai66tolls.com/robots.txt>
2018-01-10 08:50:35 [scrapy.core.engine] DEBUG: Crawled (404) <GET https://vai66tolls.com/robots.txt> (referer: None)
2018-01-10 08:50:35 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET https://vai66tolls.com/> from <GET http://vai66tolls.com/>
2018-01-10 08:50:35 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://vai66tolls.com/> (referer: None)
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Saved file /tmp/body.html
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Initial Response headers: ({'X-Powered-By': ['ASP.NET'], 'X-Aspnet-Version': ['4.0.30319'], 'Server': ['Microsoft-IIS/10.0'], 'Cache-Control': ['private'], 'Date': ['Wed, 10 Jan 2018 13:50:35 GMT'], 'Content-Type': ['text/html; charset=utf-8']})
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Set-Cookie?: ([])
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Cookie?: ([])
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: cookie?: ([])
2018-01-10 08:50:35 [scrapy.downloadermiddlewares.cookies] DEBUG: Received cookies from: <200 https://vai66tolls.com/>
Set-Cookie: ASP.NET_SessionId=im3zxr01stwmr02z0cisggbl; path=/; HttpOnly

2018-01-10 08:50:35 [scrapy.core.engine] DEBUG: Crawled (200) <POST https://vai66tolls.com/> (referer: https://vai66tolls.com/)
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Saved file /tmp/eb.txt
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Request headers: {'Accept-Language': ['en'], 'Accept-Encoding': ['gzip,deflate'], 'Accept': ['text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'], 'User-Agent': ['Scrapy/1.5.0 (+https://scrapy.org)'], 'Referer': ['https://vai66tolls.com/'], 'Content-Type': ['application/x-www-form-urlencoded']}
2018-01-10 08:50:35 [vai66tolls-spider] DEBUG: Request cookies: {}
2018-01-10 08:50:35 [scrapy.core.engine] INFO: Closing spider (finished)

(Not shown is a line indicating that scrapy.downloadermiddlewares.cookies.CookiesMiddleware is included in the downloader middlewares.)

By contrast, if I monitor the initial request with Chrome's developer tools, I see these response headers:

cache-control:private
content-length:7289
content-type:text/plain; charset=utf-8
date:Tue, 09 Jan 2018 04:38:57 GMT
server:Microsoft-IIS/10.0
status:200
x-aspnet-version:4.0.30319
x-powered-by:ASP.NET

And for the subsequent form submission, the developer tools report these request headers:

:authority:vai66tolls.com
:method:POST
:path:/
:scheme:https
accept:*/*
accept-encoding:gzip, deflate, br
accept-language:en-US,en;q=0.9
cache-control:no-cache
content-length:4480
content-type:application/x-www-form-urlencoded; charset=UTF-8
cookie:ASP.NET_SessionId=up5ygvcjzjalnw2z1r1e0qeg
origin:https://vai66tolls.com
pragma:no-cache
referer:https://vai66tolls.com/
user-agent:Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36
x-microsoftajax:Delta=true
x-requested-with:XMLHttpRequest

Also from Chrome I can generate a curl request that works correctly. Using that curl request, I confirmed that removing the cookies from the headers is enough to prevent the correct response from coming back. In other words, there may be other required form data I need to send, but without the cookies it definitely fails.

Questions

  1. Why isn't Scrapy including the cookie in the request headers?
  2. Is there a way to manually grab the cookies Scrapy has received, so I can add them to FormRequest.from_response()?

Based on your log, your initial request (i.e. start_urls) does not receive any cookies, so subsequent requests won't include them either. Maybe you should use start_requests instead of start_urls, with the same user agent, headers, etc. as your curl request, and see whether you can get cookies on the initial response. - Gallaecio
2 Answers

2
Please check that COOKIES_ENABLED is set to True in your settings. As for your second question, you should be able to extract the cookies from the Response object's headers:
cookies = response.headers.getlist('Set-Cookie')

Now you can manually insert them into the FormRequest, passing them as arguments to the from_response method. I believe you can use the Request object's cookies argument, or the headers argument directly (headers={'Cookie': xxx}).
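If Set-Cookie headers do come back, a small stdlib sketch of turning them into a dict suitable for the cookies argument (the helper name and the sample header are illustrative; the cookie value is the one from the crawl log above):

```python
from http.cookies import SimpleCookie

def cookies_from_headers(set_cookie_values):
    """Parse raw Set-Cookie header values into a {name: value} dict."""
    jar = SimpleCookie()
    for raw in set_cookie_values:
        # response.headers.getlist() in Scrapy returns bytes; decode first
        if isinstance(raw, bytes):
            raw = raw.decode('latin-1')
        jar.load(raw)
    return {name: morsel.value for name, morsel in jar.items()}

# Example with a header like the one in the crawl log:
headers = [b'ASP.NET_SessionId=im3zxr01stwmr02z0cisggbl; path=/; HttpOnly']
print(cookies_from_headers(headers))
# {'ASP.NET_SessionId': 'im3zxr01stwmr02z0cisggbl'}
```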


Unfortunately, "Set-Cookie" does not appear to be in the Response headers. I updated the original question to show this: I updated the code to check for it, and added Chrome's response headers. - steamfarmer

-1

I solved the problem myself using the answer here. It's better to handle cookies with the cookies argument than with the headers argument; for some reason, cookies set through headers tend to be handled incorrectly.

request_with_cookies = Request(url="http://...",cookies={'country': 'UY'})
