rootviii / proxy_requests

a class that uses scraped proxies to make http GET/POST requests (Python requests)

License: MIT License

Python 100.00%
http http-get http-getter http-proxy http-proxy-middleware proxy proxy-list proxy-requests proxy-server python python-requests python3 recursion recursion-problem requests requests-module webscraper webscraper-api webscraping

proxy_requests's People

Contributors: rootviii

proxy_requests's Issues

Checking for Cloudflare Captcha

Thanks for providing this great script! I was about to make something similar before finding this.

I noticed two things when using the code: ReadTimeout was not in the list of errors, and it could be a good idea to check whether the returned data is a Cloudflare captcha page.

I've added this bit of code on my side and it seems to do the job well
(added to the set_request_data function):

    if "why_captcha_headline" in req.text:
        raise CloudFlareCaptcha

I then created a CloudFlareCaptcha exception class and added it to the list of errors.

When this error is raised, it is caught in each of the get/post functions, which then retry.
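The check described above can be sketched end to end. The CloudFlareCaptcha name and the "why_captcha_headline" marker string come from the comment itself; the set_request_data function below is a simplified stand-in for the library's method, not its real implementation.

```python
# Sketch of the captcha check described above. CloudFlareCaptcha and the
# "why_captcha_headline" marker come from the comment; set_request_data here
# is a simplified stand-in for the library's method of the same name.

class CloudFlareCaptcha(Exception):
    """Raised when a proxied response looks like a Cloudflare captcha page."""


def set_request_data(req):
    # req is assumed to be a requests.Response-like object with a .text attribute
    if "why_captcha_headline" in req.text:
        raise CloudFlareCaptcha("response is a Cloudflare captcha page")
    return req.text
```

With CloudFlareCaptcha added to the tuple of caught exceptions, a captcha response is treated like any other failed attempt and the next proxy is tried.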

Reduced size of sockets list

Rather a question than an issue: I read that you reduced the number of sockets parsed from sslproxies. I'm curious why, as I first thought the regex was faulty.

not working

I am trying to scrape a site which is blocked by my ISP.

My code:

    from proxy_requests import ProxyRequests

    r = ProxyRequests(url)
    r.get()

Output:

    Unable to make proxied request... Please check the validity of https://httpbin.org/ip

KeyError: 'request'

Python 3.9.1 (default, Dec 13 2020, 11:55:53)

    from proxy_requests import ProxyRequests
    url = "https://api.ipify.org/"
    r = ProxyRequests(url)
    print(r)
--------------------------------------------------
KeyError         Traceback (most recent call last)
<ipython-input-9-c6b8a6a9416b> in <module>
----> 1 print(r)

~/.local/lib/python3.9/site-packages/proxy_requests/proxy_requests.py in __str__(self)
    202 
    203     def __str__(self):
--> 204         return str(self.rdata['request'])
    205 
    206 

KeyError: 'request'
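The traceback suggests print(r) was called before r.get() ever ran (or succeeded): __str__ reads self.rdata['request'], a key that is only populated by a successful request. A minimal sketch of that failure mode, assuming that behavior (the class below is a stand-in, not the real implementation):

```python
# Minimal stand-in reproducing the failure mode: __str__ reads a key that
# only exists after a successful get(), so printing the object before any
# request raises KeyError: 'request'. This mirrors the traceback above;
# it is not the real ProxyRequests class.

class ProxyRequestsSketch:
    def __init__(self, url):
        self.url = url
        self.rdata = {}  # empty until a request succeeds

    def get(self):
        # the real get() fetches self.url through a proxy; simulate success here
        self.rdata['request'] = '<response body>'

    def __str__(self):
        return str(self.rdata['request'])
```

Calling r.get() (and checking it succeeded) before print(r) avoids the error.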

[Feature request]: get_with_headers

It is crucial to send a User-Agent header for some sites to work. I spent an hour debugging, trying to find what was wrong, and it turned out that there is no get method with header support:

    def get_with_headers(self):
        if len(self.sockets) > 0:
            current_socket = self.sockets.pop(0)
            proxies = {"http": "http://" + current_socket, "https": "https://" + current_socket}
            try:
                request = requests.get(self.url, timeout=3.0, proxies=proxies, headers=self.headers)
                self.request = request.text
                self.headers = request.headers
                self.status_code = request.status_code
                self.proxy_used = current_socket
            except Exception:
                print('working...')
                self.get_with_headers()

proxies are repeating

Is there a way to use every proxy once?
My code is like:

    from proxy_requests import ProxyRequests

    while True:
        r = ProxyRequests("Censored")
        r.post({"Censored"})
        print(r)
        print(r.get_status_code())
        print(r.get_proxy_used())

and the output is like this:

    91.92.80.25:40487
    "CENSORED"
    200

    117.206.83.26:41960
    "CENSORED"
    400

    117.206.83.26:41960
    "CENSORED"
    400

    117.206.83.26:41960
    "CENSORED"

Proxy is always the same

I've noticed that if I call ProxyRequests 10 times, I always get the same first proxy from the list. Is it possible to get a random proxy from the list?

Not Working Python 3.9

Not working on the recent Python version, 3.9.
It makes the request, then replies with the error "Proxy Pool Empty",
and doesn't use any proxy or return any source or JSON.

HTTP proxies in the list

There are proxies in the list which are HTTP-only, and I assume they get skipped.

It would be better to either filter out the HTTP proxies or add support for them, so that they are not silently skipped.

request get

It would be useful to pass extra data to a GET request, for example:

    data = {'key': 1, "key2": 2}
    r = ProxyRequests(url)
    r.set_headers(headers)
    r.get_with_headers()
    r.get(data)
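Under the hood, requests.get(url, params=data) URL-encodes the dict into the query string, so a get variant accepting extra data would only need to forward a params keyword to requests. The encoding itself is plain urlencode, shown here with only the standard library (the URL and data values are illustrative):

```python
from urllib.parse import urlencode

# requests.get(url, params=data) appends the encoded dict as a query string;
# the encoding step is just urlencode. URL and data here are illustrative.
data = {'key': 1, 'key2': 2}
full_url = 'https://httpbin.org/ip?' + urlencode(data)
```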

response URL

Hi,
I am missing the possibility to get the response URL, like response.request.url in Python's Requests library. I can't find this info in the headers or in the HTML text.
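In requests itself the final URL (after any redirects) is available as response.url, so the wrapper could store it alongside the other response fields it already keeps. A hedged sketch of that extension; the class and attribute names below are hypothetical, not part of proxy_requests:

```python
# Hypothetical extension (not in proxy_requests): record response.url so the
# caller can read the final, post-redirect URL of a proxied request.

class ResponseUrlRecorder:
    def __init__(self):
        self.response_url = None
        self.request = None

    def store(self, response):
        # response is assumed to be requests.Response-like, with .url and .text
        self.response_url = response.url
        self.request = response.text
```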
