Why don't proxies work when switching from requests to Selenium?

I tried other solutions here on Stack Overflow, but none of them worked for me.

I'm trying to configure Selenium with a proxy. The proxy worked with the requests library, using this command:

requests.get('https://stackoverflow.com', proxies={'https':'https://3.0.32.21:22881'})

It gave me a valid status code, which means it's working:

<Response [200]>
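
For reference, a fuller version of that check might look like this (just a sketch; httpbin.org/ip is only an example IP-echo endpoint I'm using for illustration, not something from my original test):

import requests

# Send both schemes through the proxy and hit an IP-echo endpoint to confirm
# the exit IP is the proxy's address, not my own.
proxies = {
    'http': 'http://3.0.32.21:22881',
    'https': 'https://3.0.32.21:22881',
}
resp = requests.get('https://httpbin.org/ip', proxies=proxies, timeout=10)
print(resp.status_code, resp.json())  # should report the proxy's IP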

But when I try the same proxy with Selenium, it doesn't work at all. Here is what I tried for the Selenium proxy configuration:

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

PROXY = "3.0.32.21:22881"

# Start from the default Chrome capabilities and enable performance logging
capabilities = DesiredCapabilities.CHROME.copy()
capabilities['goog:loggingPrefs'] = {'performance': 'ALL'}

# Route HTTP, FTP and SSL traffic through the same proxy
capabilities['proxy'] = {
    "httpProxy": PROXY,
    "ftpProxy": PROXY,
    "sslProxy": PROXY,
    "noProxy": None,
    "proxyType": "MANUAL",
    "class": "org.openqa.selenium.Proxy",
    "autodetect": False}

driver = webdriver.Chrome('chromedriver', desired_capabilities=capabilities)
driver.get("https://stackoverflow.com")

Here is a screenshot of the automated browser session as well: [screenshot omitted]

Is there something wrong with my Selenium configuration?

Thanks

1 Answer

BEST ANSWER

I had a similar issue; for me, switching to the Firefox driver solved it.
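
A minimal sketch of that Firefox route (Selenium 3 style, using FirefoxProfile preferences; the host/port are the same example proxy from the question, split into separate preferences):

    from selenium import webdriver

    PROXY_HOST, PROXY_PORT = "3.0.32.21", 22881

    profile = webdriver.FirefoxProfile()
    profile.set_preference("network.proxy.type", 1)  # 1 = manual proxy configuration
    profile.set_preference("network.proxy.http", PROXY_HOST)
    profile.set_preference("network.proxy.http_port", PROXY_PORT)
    profile.set_preference("network.proxy.ssl", PROXY_HOST)
    profile.set_preference("network.proxy.ssl_port", PROXY_PORT)

    driver = webdriver.Firefox(firefox_profile=profile)
    driver.get("https://stackoverflow.com")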

If you want to stick to Chrome, you can try this approach instead, passing the proxy on the command line:

    from selenium import webdriver

    PROXY = "3.0.32.21:22881"
    options = webdriver.ChromeOptions()
    options.add_argument(f'--proxy-server={PROXY}')  # Chrome's own proxy switch
    driver = webdriver.Chrome('chromedriver', options=options)
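
Either way, a quick way to confirm the proxy is actually in effect is to load an IP-echo page in the driven browser (a sketch; httpbin.org/ip is just an example endpoint, not part of the original answer):

    driver.get("https://httpbin.org/ip")
    print(driver.page_source)  # should contain 3.0.32.21 if traffic goes through the proxy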