If you are not familiar with Scrapy or how it works, we suggest reading the Scrapy documentation before continuing development with this tool.
To get started with Scrapy, you will first need to install it using one of the methods described in its documentation. Check here for more information.
Once you have Scrapy up and running, create your project folder if you have not done so already:
scrapy startproject yourprojectname
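For reference, the command generates a project skeleton along these lines (file names may differ slightly between Scrapy versions):

yourprojectname/
    scrapy.cfg
    yourprojectname/
        __init__.py
        items.py
        middlewares.py
        pipelines.py
        settings.py
        spiders/
            __init__.py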
Once the project directory is set up, you can deploy our middleware:
- Open a terminal window.
- Navigate to the main directory of your project folder using
cd yourprojectname
- Download our proxy middleware using the following command:
curl https://raw.githubusercontent.com/Smartproxy/Scrapy-Middleware/master/smartproxy_auth.py > smartproxy_auth.py
- You should now see the smartproxy_auth.py file in your project folder.
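For reference, a middleware like this is simply a Scrapy downloader middleware that attaches the proxy endpoint and a Proxy-Authorization header to every outgoing request. The simplified sketch below illustrates the idea using the SMARTPROXY_* settings configured in the next step; refer to the downloaded smartproxy_auth.py for the actual implementation.

import base64

class ProxyMiddleware(object):
    def process_request(self, request, spider):
        # Read the proxy credentials and endpoint from settings.py
        user = spider.settings.get('SMARTPROXY_USER')
        password = spider.settings.get('SMARTPROXY_PASSWORD')
        endpoint = spider.settings.get('SMARTPROXY_ENDPOINT')
        port = spider.settings.get('SMARTPROXY_PORT')

        # Route the request through the proxy endpoint
        request.meta['proxy'] = 'http://{}:{}'.format(endpoint, port)

        # Authenticate with HTTP Basic auth
        credentials = '{}:{}'.format(user, password).encode()
        request.headers['Proxy-Authorization'] = b'Basic ' + base64.b64encode(credentials)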
To start using our middleware for proxy authentication, you'll need to configure a few project settings.
Doing so is very simple:
- Using a file manager, navigate to your project folder; you should see the settings.py file inside the project directory.
- Edit the settings.py file using an editor of your choice and add the following properties at the bottom:
DOWNLOADER_MIDDLEWARES = {
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
'yourprojectname.smartproxy_auth.ProxyMiddleware': 100,
}
SMARTPROXY_USER = 'username' ## Smartproxy Username (Sub-user)
SMARTPROXY_PASSWORD = 'password' ## Password for your user
SMARTPROXY_ENDPOINT = 'gate.smartproxy.com' ## Endpoint you'd like to use
SMARTPROXY_PORT = '7000' ## Port of the endpoint you are using.
- In
DOWNLOADER_MIDDLEWARES
change the yourprojectname part of the line to the name of your project.
- Make sure that you enter your account details as well as the proxy details within quotation marks ('').
- Save the file.
Once all that is done, all of your spiders will go through our proxies. If you are not sure how to set up a spider, take a look here.
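If you just want to confirm that traffic is flowing through the proxy, a minimal spider like the one below will do; the spider name, file name, and the httpbin.org test URL are only illustrative and are not part of our middleware.

# yourprojectname/spiders/ip_check.py
import scrapy

class IpCheckSpider(scrapy.Spider):
    name = 'ip_check'
    # httpbin.org/ip echoes the IP the request came from, so the
    # response should show a proxy IP rather than your own
    start_urls = ['https://httpbin.org/ip']

    def parse(self, response):
        self.logger.info(response.text)

Run it from the project directory with:

scrapy crawl ip_check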
Email - sales@smartproxy.com
Live chat 24/7