Questions tagged [grequests]
GRequests allows you to use Requests with Gevent to make asynchronous HTTP requests easily.
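A minimal sketch of the pattern this tag covers. The grequests calls appear only in comments (they follow the library's documented map/size interface); the executable part shows how to handle map()'s documented behavior of returning None for failed requests:

```python
# With grequests installed, the canonical asynchronous fetch is:
#   import grequests
#   reqs = (grequests.get(u, timeout=5) for u in urls)
#   responses = grequests.map(reqs, size=10)  # results come back in request order
# map() puts None in the slot of any request that failed, so post-processing
# usually looks like this (simulated offline here with stand-in objects):
urls = ["https://a.example", "https://b.example", "https://c.example"]
responses = [object(), None, object()]  # stand-in: the second request failed
succeeded = [u for u, r in zip(urls, responses) if r is not None]
failed = [u for u, r in zip(urls, responses) if r is None]
```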
139 questions
0 votes · 0 answers · 9 views
Can processing of request/grequest responses be run concurrently in Python2?
I'm writing a Python2 program that pulls a large amount of JSON data from a remote CouchDB database. The data is indexed by timestamp: I pass a startkey and an endkey, and the database returns all ...
-1 votes · 1 answer · 70 views
How can I send a LOT of requests to DIFFERENT sites and get responses?
How can I send a lot of requests to DIFFERENT sites? I have a database of sites (1kk) and need to check whether they are alive or not; conditionally, if you just do it through grequests (Python) chunks ...
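Several questions under this tag boil down to the same pattern: batch a huge URL list and cap concurrency. A hedged sketch, with the grequests calls (its documented map/size API) confined to comments so the helper itself is plain Python:

```python
def chunked(seq, size):
    """Split seq into consecutive batches of at most `size` items."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

# With grequests installed, a liveness check over a large site list could
# then run batch by batch, with `size=` capping open connections:
#   for batch in chunked(sites, 1000):
#       reqs = (grequests.head(u, timeout=5) for u in batch)
#       for url, resp in zip(batch, grequests.map(reqs, size=100)):
#           alive = resp is not None and resp.ok
batches = chunked(list(range(10)), 4)
```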
1 vote · 0 answers · 25 views
Django grequests batch call fails
I have 100 APIs to ping in Django. I tried to use grequests to do a batch call; the batch-call code looks like this:
class BatchCall:
    def __init__(self, urls, BO_urls):
        self.urls = urls
        ...
1 vote · 0 answers · 63 views
Conflict between grequests / gevent and sshtunnel
I'm working on a web scraper that uses grequests, and then sshtunnel to store the data in a database.
However, importing both grequests and sshtunnel causes "Error reading SSH protocol banner" cannot ...
0 votes · 1 answer · 163 views
How to send thousands of HTTP requests using grequests?
I need to request all the review pages for a company on Glassdoor, and in certain cases, there can be thousands of pages. I am trying to use grequests to do this, but I found that when I sent more ...
0 votes · 1 answer · 140 views
How to get JSON data from a POST in the grequests (async requests) library in Python
So I'm trying to make an async-requests thing and I can't get the JSON data from a POST:
import grequests as requests
headers = {SOME HEADERS}
data = {
    SOME DATA...
}
r = requests.post(
    "...
0 votes · 1 answer · 388 views
Part of the requests delay the whole process in grequests
I have 600 URLs to request. When I use grequests, I found that sometimes it finishes very fast, within 10 seconds, but sometimes it just gets stuck (never reaching the statement that prints 'done').
Here is my ...
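A stall like this usually means some requests carry no timeout, so one slow server blocks the whole map(). A sketch, assuming grequests forwards requests' (connect, read) timeout tuple (it passes keyword arguments through to requests):

```python
# With grequests, giving every request a deadline prevents one slow
# server from stalling the batch; timed-out requests come back as None:
#   reqs = (grequests.get(u, timeout=(3.05, 10)) for u in urls)
#   responses = grequests.map(reqs, size=20)
def with_timeouts(urls, connect=3.05, read=10):
    """Build (url, timeout) request specs so every request has a deadline."""
    return [(u, (connect, read)) for u in urls]

specs = with_timeouts(["https://a.example", "https://b.example"])
```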
0 votes · 1 answer · 507 views
Why can't I extract data from the response using grequests?
import grequests
import time
start_time = time.time()
sites = ['https://facebook.com' for x in range(5)]
data = {'a': 'b'}
responses = [grequests.get(u, data=data) for u in sites]
for response in ...
1 vote · 1 answer · 476 views
How to avoid 429 error when using grequests?
I want to make a parser using "grequests", but as expected, due to the many requests, I get a 429 error.
import grequests
sites = [f"https://fantlab.ru/work{i}" for i in range(1, 10)]...
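The usual answer to a 429: cap concurrency with map()'s size argument and back off before retrying. A sketch of an exponential backoff schedule (the grequests call in the comment assumes its documented size parameter):

```python
def backoff_delays(retries, base=1.0, cap=60.0):
    """Exponential backoff schedule, in seconds, for retrying HTTP 429s."""
    return [min(cap, base * (2 ** n)) for n in range(retries)]

# The first defence with grequests is limiting in-flight requests:
#   responses = grequests.map(reqs, size=5)
# and on a 429, sleeping per this schedule (or the server's Retry-After
# header, if present) before re-issuing only the failed requests.
delays = backoff_delays(5)
```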
1 vote · 1 answer · 1k views
Enabling gevent debugging in Google Colab
I want to use the grequests library in a Google Colab notebook; however, upon importing it (and patching it using gevent.monkey), the program spits out a random number of the following warning:
It ...
0 votes · 0 answers · 140 views
How to stop requests while analyzing 1000 links?
How can I stop "grequests" once I have found the "target file"? I need to analyze about 1000 links, but only one of them is the right one.
If I have found that link after, e.g., 10 URLs, I need to stop ...
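Early exit is a good fit for grequests.imap, which yields responses as they complete; breaking out of the loop stops consuming further results (requests already in flight may still finish). A sketch with the network part confined to comments:

```python
def first_match(results, is_target):
    """Return the first result matching is_target, stopping iteration early."""
    for r in results:
        if is_target(r):
            return r
    return None

# With grequests, feed imap a lazy generator so unconsumed requests are
# never issued after we return (in-flight ones may still complete):
#   responses = grequests.imap((grequests.get(u) for u in urls), size=10)
#   hit = first_match(responses, lambda r: "target file" in r.text)
hit = first_match(iter(range(1000)), lambda r: r == 10)
```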
0 votes · 1 answer · 294 views
grequests module: how can I simply print the URL and response?
Can anyone please explain how I can split the results to get just the simple URL and response?
I have tried so many times with no luck; for now I can only print something like:
50
0.4110674999999999
........, [<...
1 vote · 1 answer · 262 views
How to Scrape Product Pages using Python grequests and BeautifulSoup
from bs4 import BeautifulSoup
import grequests
import pandas as pd

# STEP 1: Create List of URLs from main archive page
def get_urls():
    urls = []
    for x in range(1,3):
        urls.append(...
0 votes · 0 answers · 185 views
Asynchronous API Requests
I am trying to send multiple requests to an API using grequests, but I am getting no response when I map the list. The API has a limit of about 1000 requests, so it cannot be that I am being denied.
...
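When map() yields nothing but Nones, the failures are being swallowed; grequests' documented exception_handler hook exposes them. A minimal sketch (the handler itself is plain Python and runs offline):

```python
errors = []

def on_error(request, exception):
    """Collect each failure; returning None leaves None in that response slot."""
    errors.append((request, exception))
    return None

# Passed to grequests (assumed documented signature):
#   responses = grequests.map(reqs, exception_handler=on_error)
# Afterwards `errors` shows whether requests timed out, were refused,
# or were rejected by the API, instead of failing silently.
on_error("req-1", ValueError("simulated failure"))
```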
1 vote · 1 answer · 304 views
How to parse the responses from grequests faster?
I want to web-scrape multiple URLs and parse them as quickly as possible, but the for loop is not fast enough; is there a way to do this, maybe with something asynchronous, multiprocessing, or multithreading?
import ...
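One way to overlap network wait with parsing is to consume grequests.imap, which yields each response as soon as it arrives, instead of waiting for map() to finish the whole batch. A sketch with a stand-in iterable so the file runs without the library:

```python
def parse_all(responses, parse):
    """Parse each response as soon as it becomes available."""
    return [parse(r) for r in responses]

# With grequests, this overlaps I/O with parsing:
#   parsed = parse_all(grequests.imap(reqs, size=20), parse_page)
# The parsing itself is still sequential; a CPU-heavy parser would need
# multiprocessing on top, since gevent only helps with I/O-bound work.
out = parse_all(iter(["<a>", "<b>"]), len)
```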