How to Use a Proxy with Python Requests

Requests is the most popular HTTP library for Python, making it easy to send HTTP requests. However, when scraping or sending a high volume of requests, you may get blocked or throttled. Using a proxy with Requests is an effective way to prevent this.

In this comprehensive guide, you'll learn:

  • What proxies are and why they help avoid blocks
  • How to set up proxies in Python Requests
  • Where to find free and paid proxies
  • How to authenticate proxies
  • Rotating proxies to avoid detection
  • Using sessions for better performance
  • Integration with proxy services like ZenRows

By the end, you'll know how to configure and use a proxy seamlessly with Python Requests. Let's dive in!

Why Use a Proxy with Python Requests?

Here are the main benefits of using proxies with Python Requests:

Avoid Blocks and Throttling

Websites often block IP addresses that send too many requests. This is common when web scraping or sending automated requests. Using a proxy hides your real IP, making requests appear to come from a different location and helping you avoid blocks.

Bypass Geographic Restrictions

Some sites restrict content based on location. A proxy in another country lets you access geo-restricted content.

Increase Privacy and Anonymity

Your real IP reveals information about you. Proxies add a layer of privacy and anonymity to your web requests.

Overcome Corporate Firewalls

Proxies may help access sites blocked by corporate firewalls and network restrictions.

Load Balance Requests

Spreading requests over multiple proxies distributes the load placed on target servers, which reduces the chance of throttling.

So in summary, proxies are essential for reliably sending automated requests at scale.

How Proxies Work

A proxy server acts as an intermediary for web requests. When you send a request, it first goes to the proxy server, which then makes the request to the target website on your behalf.

The proxy sends back the response from the website to you. So the website only sees the IP of the proxy rather than your real IP address.

This hides your identity: to the target site, requests appear to originate from the proxy's IP.

Now let's see how to use proxies with Python Requests.

Using a Proxy in Python Requests

The Requests library makes it very easy to send requests through a proxy. Here's how:

First, install Requests:

pip install requests

Then import Requests:

import requests

Define a dictionary containing proxies to use for HTTP and HTTPS requests:

proxies = {
  'http': 'http://10.10.1.10:3128',
  'https': 'http://10.10.1.10:1080'
}

The keys http and https indicate the protocol, while the value contains the proxy IP/host and port.

Finally, make a request using the proxies parameter:

response = requests.get('https://example.com', proxies=proxies)

This routes your HTTP requests through the proxy 10.10.1.10 on port 3128, and HTTPS requests through port 1080.
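
To confirm the proxy is actually in use, you can ask an IP echo service such as https://httpbin.org/ip which address it sees. With the placeholder proxy above swapped for a working one, it should report the proxy's IP rather than your own:

import requests

# Same placeholder proxy addresses as above
proxies = {
    'http': 'http://10.10.1.10:3128',
    'https': 'http://10.10.1.10:1080'
}

# Without the proxy, the service reports your real IP
print(requests.get('https://httpbin.org/ip').json())

# Through the proxy, it should report the proxy's IP instead
print(requests.get('https://httpbin.org/ip', proxies=proxies).json())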

And that's all you need to start using a proxy with Python Requests!

Now let's go over where to find proxies to use.

Where to Find Proxies

There are two main types of proxies: free public proxies and paid private proxies. Let's discuss both options.

Free Public Proxies

Free public proxies are easy to find, but have significant downsides:

  • Often slow and unreliable
  • Usually have usage limits imposed
  • Already banned on many sites due to abuse
  • No control over quality or locations

Still, free proxies may be useful for personal or very light usage. Many websites publish regularly updated lists of public proxies.

Check these lists frequently, as public proxies go offline regularly. You can also scrape the lists and test each candidate yourself to automate finding the latest working ones, as in the sketch below.
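
Here is a minimal sketch of such a check. It assumes you already have a list of candidate proxy addresses (the ones below are placeholders) and keeps only those that respond within a few seconds:

import requests

# Placeholder candidates - replace with addresses from a public proxy list
candidates = [
    'http://203.0.113.1:8080',
    'http://203.0.113.2:3128',
]

working = []
for proxy in candidates:
    try:
        # Ask an IP echo service through the proxy with a short timeout
        requests.get('https://httpbin.org/ip',
                     proxies={'http': proxy, 'https': proxy},
                     timeout=5)
        working.append(proxy)
    except requests.RequestException:
        # Dead, slow, or blocked proxy - skip it
        pass

print(working)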

Overall free proxies are fine for personal browsing but lack the quality and reliability needed for automated scripts sending a high volume of requests.

Paid Private Proxies

For best performance, you'll want to use paid private proxies. The advantages of private proxies include:

  • Fast connection speeds
  • High uptime and reliability
  • Avoid blocks since they are not overused
  • Available in specific cities/countries
  • Some services offer automatic IP rotation
  • Support for SOCKS proxies
  • Usage limits suitable for automation

Some top paid proxy providers include:

  • Brightdata – Enterprise proxy service with global residential IPs
  • Smartproxy – Optimized for web scraping with static IPs
  • Storm Proxies – Rotating proxies delivering high anonymity

Paid proxies start around $50/month for small plans. The investment is well worth it for reliable bots and scrapers.

We'll be using paid proxies in the examples going forward.

Authenticating Proxies

Some paid proxy services require you to authenticate before you can use their proxies.

The main authentication methods are:

  • Username/Password – Set a username and password for the proxy
  • IP Whitelisting – Only allow requests from specific IP addresses
  • API Key – Use a unique API key to identify yourself

To authenticate in Python Requests, include the username and password in the proxy URL:

proxies = {
    'http': 'http://username:password@proxy.example.com:8080',
    'https': 'http://username:password@proxy.example.com:8080'
}

Or for API key authentication:

proxies = {
    'http': 'http://your_api_key@proxy.example.com:8080',
    'https': 'http://your_api_key@proxy.example.com:8080'
}
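
One detail to watch: if your username or password contains characters like @ or :, percent-encode it before placing it in the proxy URL. The credentials and host below are placeholders:

from urllib.parse import quote
import requests

username = 'user'
password = quote('p@ss:word', safe='')  # percent-encode special characters

proxy = f'http://{username}:{password}@proxy.example.com:8080'
proxies = {'http': proxy, 'https': proxy}

response = requests.get('https://example.com', proxies=proxies)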

Consult your proxy provider's docs for specifics on how to authenticate.

Now let's discuss rotating proxies.

Rotating Proxies

An effective technique to further avoid blocks is rotating proxies. This cycles through different proxy IP addresses for each request.

To implement this in Python, we can create a list of proxies, then randomly select one per request:

import random
import requests

# Placeholder proxy addresses - replace with your own proxies
proxies = [
    'http://proxy1.example.com:8080',
    'http://proxy2.example.com:8080',
    'http://proxy3.example.com:8080'
]

url = 'https://example.com'

# Choose a random proxy for this request
proxy = random.choice(proxies)

response = requests.get(url, proxies={'http': proxy, 'https': proxy})

This helps distribute requests across many IPs to appear more human and avoid pattern detection.
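
You can take this a step further by retrying with a different proxy whenever one fails. Here is a minimal sketch, again using placeholder proxy addresses:

import random
import requests

proxies = [
    'http://proxy1.example.com:8080',
    'http://proxy2.example.com:8080',
    'http://proxy3.example.com:8080'
]

def get_with_rotation(url, max_attempts=3):
    # Try up to max_attempts different proxies before giving up
    for proxy in random.sample(proxies, k=min(max_attempts, len(proxies))):
        try:
            return requests.get(url,
                                proxies={'http': proxy, 'https': proxy},
                                timeout=10)
        except requests.RequestException:
            continue  # That proxy failed - try the next one
    raise RuntimeError('All proxies failed')

response = get_with_rotation('https://example.com')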

Some paid proxy services like Oxylabs and Storm Proxies offer automatically rotating proxies. This saves you the effort of implementing IP cycling yourself.

Using Sessions

When making multiple requests, it's advantageous to use a Requests Session.

Sessions allow connection pooling, where multiple requests are sent over a single persistent connection. Reusing connections improves performance compared to opening a new connection for each request.

Here is how to use a proxy with a Requests Session:

import requests

session = requests.Session() 

# Configure session proxies
session.proxies = {
  'http': 'http://10.10.1.10:3128',
  'https': 'http://10.10.1.10:1080'  
}

# Requests will now use the proxy
response = session.get('https://www.website.com')

This runs all requests made through the Session over the defined proxy server.
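
Because the proxy is set on the session itself, every request made with that session is routed through it, and connections are reused across requests. A short usage sketch with a placeholder proxy and URLs:

import requests

urls = ['https://example.com/page1', 'https://example.com/page2']

# The with-block closes pooled connections when done
with requests.Session() as session:
    session.proxies = {
        'http': 'http://10.10.1.10:3128',
        'https': 'http://10.10.1.10:1080'
    }
    for url in urls:
        response = session.get(url, timeout=10)
        print(url, response.status_code)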

Using ZenRows

ZenRows provides a proxy API optimized for web scraping. It handles proxy rotation and residential IPs for you.

To use ZenRows with Python Requests:

  1. Sign up for a free account

  2. Copy your unique ZenRows API key

  3. Configure the proxy like so:

proxy = 'http://<api_key>:@proxy.zenrows.com:8001'

response = requests.get('https://example.com', proxies={
    'http': proxy,
    'https': proxy 
}, verify=False)
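
Note that verify=False makes Requests emit an InsecureRequestWarning on every call. If you accept that trade-off, you can silence the warning:

import urllib3

# Suppress the warning triggered by verify=False
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)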

This integrates seamlessly with Requests, offloading proxy management to ZenRows.

Other Tips for Proxies in Python

Here are some other helpful proxy tips:

  • Debug Errors – If your code stops working, check that the proxy is still active, and catch and log exceptions such as requests.exceptions.ProxyError.
  • Use Proxy Pools – Maintain a pool of proxies and cycle through them to maximize uptime.
  • Disable SSL Verification – Set verify=False to skip SSL checks that may fail due to proxies, but only when you accept the security trade-off.
  • Use SOCKS Proxies – For SOCKS support, install requests[socks] and use a socks5:// URL scheme (see the sketch after this list).
  • Cloud Proxies – Services like ScraperAPI and Oxylabs provide cloud-based proxies to avoid infrastructure headaches.

And those are some of the best practices for running Python requests through proxies.

Conclusion

I hope this guide provided a comprehensive overview of using proxies with Python Requests.

Here's a quick recap of what you learned:

  • Why proxies help – Avoid blocks, bypass geographic restrictions, increase privacy
  • Setting up proxies – Define proxy dict pointing to HTTP and HTTPS proxies
  • Free vs paid proxies – Free proxies are slow and unreliable, paid ones are better for automation
  • Authentication methods – Add username/password or API keys to proxy URL
  • Rotating proxies – Randomly select proxies to distribute requests
  • Using sessions – Reuse connections for better performance
  • Tools like ZenRows – Handle proxy management for you

You're now ready to start integrating proxies into your Python requests scripts to prevent blocks at scale.
