ConnectionError: HTTPConnectionPool(host='en.wikipedia.org', port=80): Max retries exceeded with url: /w/api.php?list=search&srprop=&srlimit=10&limit=10&srsearch=Barack&format=json&action=query (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000023606F8AB38>: Failed to establish a new connection... #303
Comments
Hi YixinNJU:
@hawbox
Hello, I'm also from China and I ran into this problem too. Thank you very much for your warm reply. I used a global HTTPS proxy and downgraded urllib3 to 1.25.11 as you said, but I still can't solve this issue. Have you solved it? Please help.
Hello, I have solved this error with your help. I added the code below in the '_wiki_request' function of wikipedia.py.
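The actual snippet from that comment wasn't captured in this thread. A minimal sketch of that kind of fix, assuming a local proxy (the address below is a placeholder, not from the original): route the single `requests.get` call inside `_wiki_request` through an explicit `proxies` mapping.

```python
# Hypothetical sketch; the commenter's original code was not captured.
# Simplified from wikipedia.py's _wiki_request: the only change is the
# `proxies=` argument on the requests.get call, so all API traffic goes
# through a local proxy instead of a direct (blocked) connection.
import requests

API_URL = 'http://en.wikipedia.org/w/api.php'

# Placeholder address; use whatever your local proxy actually listens on.
PROXIES = {
    'http': 'http://127.0.0.1:7890',
    'https': 'http://127.0.0.1:7890',
}

def _wiki_request(params):
    params['format'] = 'json'
    headers = {'User-Agent': 'wikipedia (https://github.com/goldsmith/Wikipedia/)'}
    r = requests.get(API_URL, params=params, headers=headers, proxies=PROXIES)
    return r.json()
```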
Hello
like here:
Or, what I do is add the proxy's certificate, so that it lets me out (the page is not blocked, but we need to go through the proxy).
You could place the arguments at other points, but I chose this one. That way, I covered my whole virtual environment.
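A minimal sketch of the certificate approach described above, assuming a proxy that intercepts HTTPS; the bundle path below is a placeholder. Setting `REQUESTS_CA_BUNDLE` once (for example in the virtual environment's activate script) covers every `requests` call made from that environment, which matches the "covered my virtual environment" choice.

```python
# Hypothetical sketch: trust the proxy's CA certificate so requests
# accepts the proxy's intercepted HTTPS connections. The .pem path is
# a placeholder for wherever your proxy's CA bundle lives.
import os
import requests

# Option 1: environment variable, picked up by requests automatically
# for every call made in this process / virtual environment.
os.environ['REQUESTS_CA_BUNDLE'] = '/path/to/proxy-ca.pem'

# Option 2: pass the bundle explicitly on a single call instead.
r = requests.get('https://en.wikipedia.org/w/api.php',
                 params={'action': 'query', 'format': 'json'},
                 verify='/path/to/proxy-ca.pem')
```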
Hello, I followed the steps above but still can't connect to the wiki site. I'm working on a lab server and have already exported the http proxy globally; I also tried downgrading urllib3 and modifying the '_wiki_request' function, but none of these worked. Any advice would be appreciated.
Hello, have you solved it?
I cannot use almost any function of the wikipedia module. It always returns a connection error.
My original code is:
wikipedia.search("Barack")
It gives me the error:
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    wikipedia.search("Barack")
  File "F:\Anaconda3\anaconda3\lib\site-packages\wikipedia\util.py", line 28, in __call__
    ret = self._cache[key] = self.fn(*args, **kwargs)
  File "F:\Anaconda3\anaconda3\lib\site-packages\wikipedia\wikipedia.py", line 103, in search
    raw_results = _wiki_request(search_params)
  File "F:\Anaconda3\anaconda3\lib\site-packages\wikipedia\wikipedia.py", line 737, in _wiki_request
    r = requests.get(API_URL, params=params, headers=headers)
  File "F:\Anaconda3\anaconda3\lib\site-packages\requests\api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "F:\Anaconda3\anaconda3\lib\site-packages\requests\api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "F:\Anaconda3\anaconda3\lib\site-packages\requests\sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "F:\Anaconda3\anaconda3\lib\site-packages\requests\sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "F:\Anaconda3\anaconda3\lib\site-packages\requests\adapters.py", line 516, in send
    raise ConnectionError(e, request=request)
ConnectionError: HTTPConnectionPool(host='en.wikipedia.org', port=80): Max retries exceeded with url: /w/api.php?list=search&srprop=&srlimit=10&limit=10&srsearch=Barack&format=json&action=query (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000023606F8AB38>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.'))
```
I am new to scraping data, and I did not find any answer that directly addresses this problem. Do I need to modify any of the scripts in the traceback? I'd appreciate your help!