class documentation

class ReppyRobotParser(RobotParser):

Robots.txt parser backend based on the Reppy library.

Class Method from_crawler: Parse the content of a robots.txt file as bytes. This must be a class method. It must return a new instance of the parser backend.
Method __init__: Undocumented
Method allowed: Return True if user_agent is allowed to crawl url, otherwise return False.
Instance Variable rp: Undocumented
Instance Variable spider: Undocumented
@classmethod
def from_crawler(cls, crawler, robotstxt_body):
Parse the content of a robots.txt file as bytes. This must be a class method. It must return a new instance of the parser backend.
Parameters
    crawler (scrapy.crawler.Crawler instance): crawler which made the request
    robotstxt_body (bytes): content of a robots.txt file.
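The contract above only requires that from_crawler be a class method returning a new backend instance. A minimal sketch of a backend satisfying that contract (ExampleRobotParserBackend is a hypothetical class used only for illustration, not the library's code):

    # Hypothetical backend used only to illustrate the from_crawler contract.
    class ExampleRobotParserBackend:
        def __init__(self, robotstxt_body, spider):
            self.spider = spider
            self.robotstxt_body = robotstxt_body

        @classmethod
        def from_crawler(cls, crawler, robotstxt_body):
            # Outside a running crawl there may be no crawler, and hence no spider.
            spider = None if crawler is None else getattr(crawler, "spider", None)
            return cls(robotstxt_body, spider)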
def __init__(self, robotstxt_body, spider):

Undocumented

def allowed(self, url, user_agent):
Return True if user_agent is allowed to crawl url, otherwise return False.
Parameters
    url (str): Absolute URL
    user_agent (str): User agent
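A usage sketch for allowed(), assuming ReppyRobotParser is importable from scrapy.robotstxt (the import path is an assumption and depends on the Scrapy version) and that the reppy package is installed:

    from scrapy.robotstxt import ReppyRobotParser  # import path assumed

    robotstxt_body = (
        b"User-agent: *\n"
        b"Disallow: /admin/\n"
    )

    # spider may be None when the parser is built outside a running crawl.
    parser = ReppyRobotParser(robotstxt_body, spider=None)

    parser.allowed("https://example.com/index.html", "mybot")   # expected: True
    parser.allowed("https://example.com/admin/panel", "mybot")  # expected: False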
rp = Undocumented

spider = Undocumented
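rp and spider are undocumented here; a plausible sketch of __init__ (an assumption, not verbatim library code) is that spider is simply stored on the instance and rp holds the ruleset parsed by reppy, to which allowed() delegates:

    from reppy.robots import Robots

    class ReppyRobotParserSketch:
        def __init__(self, robotstxt_body, spider):
            self.spider = spider                        # assumed: stored for later use; may be None
            self.rp = Robots.parse("", robotstxt_body)  # assumed: parsed reppy ruleset

        def allowed(self, url, user_agent):
            # Delegate the decision to the parsed ruleset.
            return self.rp.allowed(url, user_agent)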