I want to do some web crawling with Scrapy and Python. I have found a few code examples on the internet where Selenium is used together with Scrapy.

I don't know much about Selenium, only that it automates web tasks: a browser actually opens and performs the actions. But I don't want a real browser window to open; I want everything to happen from the command line.

Can I do that with Selenium and Scrapy?
You can use Selenium with PyVirtualDisplay, at least on Linux:
from pyvirtualdisplay import Display
from selenium import webdriver

# Start a virtual X display so the browser has somewhere to render
# without opening a visible window.
display = Display(visible=0, size=(1024, 768))
display.start()
browser = webdriver.Chrome()
Update: PhantomJS is abandoned; you can now run headless browsers directly, such as Firefox and Chrome.
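A minimal sketch of the headless-Chrome route, assuming Selenium 4+ and a matching chromedriver on your PATH. The `--headless=new` flag is the modern form (Chrome 109+); older Chrome versions use plain `--headless`. The `example.com` URL is just a placeholder:

```python
def headless_chrome_flags():
    """Chrome flags that suppress the visible window.

    '--headless=new' requires Chrome 109+; for older versions
    substitute plain '--headless'.
    """
    return ["--headless=new", "--disable-gpu", "--window-size=1024,768"]


if __name__ == "__main__":
    # Launching the driver needs selenium installed and a chromedriver
    # matching your Chrome version, so the import lives down here.
    from selenium import webdriver

    options = webdriver.ChromeOptions()
    for flag in headless_chrome_flags():
        options.add_argument(flag)

    driver = webdriver.Chrome(options=options)
    driver.get("https://example.com")
    print(driver.title)  # the page loads with no window on screen
    driver.quit()
```

Because the whole browser runs headless, this works over SSH or in a terminal-only environment, with no virtual display needed.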
Use PhantomJS instead.

You can do

browser = webdriver.PhantomJS()

in Selenium v2.32.0.