----- Bot: Logs manager init -----
----- Bot: Task DownloadMediaWiki started -----
----- Bot: command: wikiteam3dumpgenerator --xml --images --xmlrevisions http://wiki.rosalab.ru/ru/index.php/%D0%97%D0%B0%D0%B3%D0%BB%D0%B0%D0%B2%D0%BD%D0%B0%D1%8F_%D1%81%D1%82%D1%80%D0%B0%D0%BD%D0%B8%D1%86%D0%B0 -----
No suitable dump found at Internet Archive
Checking API... http://wiki.rosalab.ru/ru/api.php
API is OK: http://wiki.rosalab.ru/ru/api.php
Checking index.php... http://wiki.rosalab.ru/ru/index.php
check_index(): Trying Special:Random...
POST http://wiki.rosalab.ru/ru/index.php {'title': 'Special:Random'} 302
GET http://wiki.rosalab.ru/ru/index.php/%D0%A2%D0%B5%D1%81%D1%82%D0%B8%D1%80%D0%BE%D0%B2%D0%B0%D0%BD%D0%B8%D0%B5_20-24_%D0%B0%D0%BF%D1%80%D0%B5%D0%BB%D1%8F {'title': 'Special:Random'} 200
index.php available probability: 90% (0.9)
index.php is OK
No --path argument provided. Defaulting to: [working_directory]/[domain_prefix]-[date]-wikidump
Which expands to: ./wiki.rosalab.ru_ru-20231106-wikidump
--delay is the default value of 1.5
There will be a 1.5 second delay between HTTP calls in order to keep the server from timing you out.
If you know that this is unnecessary, you can manually specify '--delay 0.0'.
#########################################################################
# Welcome to DumpGenerator 4.1.4 by WikiTeam3 (GPL v3)                  #
# More info at: https://github.com/saveweb/wikiteam3                    #
# Copyright (C) 2011-2023 WikiTeam developers                           #
#########################################################################
Analysing http://wiki.rosalab.ru/ru/api.php
Trying generating a new dump into a new directory...
http://wiki.rosalab.ru/ru/api.php
Getting the XML header from the API
Export test via the API failed. Wiki too old? Trying without xmlrevisions.
http://wiki.rosalab.ru/ru/api.php
Delay 7.0s: req retry (500) .
Delay 7.0s: req retry (500) /
Delay 9.0s: req retry (500) .
Delay 9.0s: req retry (500) /
Delay 9.0s: req retry (500) -
Delay 13.0s: req retry (500) .
Delay 13.0s: req retry (500) /
Delay 13.0s: req retry (500) -
Delay 13.0s: req retry (500) \
Delay 21.0s: req retry (500) .
Delay 21.0s: req retry (500) /
Delay 21.0s: req retry (500) -
Delay 21.0s: req retry (500) \
Delay 21.0s: req retry (500) .
Delay 21.0s: req retry (500) /
urllib3.exceptions.ResponseError: too many 500 error responses

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 941, in urlopen
    return self.urlopen(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 941, in urlopen
    return self.urlopen(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 941, in urlopen
    return self.urlopen(
  [Previous line repeated 2 more times]
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 931, in urlopen
    retries = retries.increment(method, url, response=response, _pool=self)
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/cli/cli.py", line 367, in increment
    return super(CustomRetry, self).increment(method=method, url=url, *args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/urllib3/util/retry.py", line 515, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='wiki.rosalab.ru', port=80): Max retries exceeded with url: /ru/index.php?title=Special%3AExport&pages=Main_Page&action=submit&offset=1&limit=1000 (Caused by ResponseError('too many 500 error responses'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/wikiteam3dumpgenerator", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/__init__.py", line 5, in main
    DumpGenerator()
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/generator.py", line 105, in __init__
    DumpGenerator.createNewDump(config=config, other=other)
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/generator.py", line 136, in createNewDump
    generate_XML_dump(config=config, session=other["session"])
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/xmldump/xml_dump.py", line 100, in generate_XML_dump
    header, config = getXMLHeader(config=config, session=session)
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/xmldump/xml_header.py", line 124, in getXMLHeader
    header, config = getXMLHeader(config=config, session=session)
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/xmldump/xml_header.py", line 70, in getXMLHeader
    [
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/xmldump/xml_header.py", line 70, in <listcomp>
    [
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/page/xmlexport/page_xml_export.py", line 139, in getXMLPageWithExport
    xml = getXMLPageCore(params=params, config=config, session=session)
  File "/usr/local/lib/python3.10/dist-packages/wikiteam3/dumpgenerator/dump/page/xmlexport/page_xml_export.py", line 93, in getXMLPageCore
    r = session.post(
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 510, in send
    raise RetryError(e, request=request)
requests.exceptions.RetryError: HTTPConnectionPool(host='wiki.rosalab.ru', port=80): Max retries exceeded with url: /ru/index.php?title=Special%3AExport&pages=Main_Page&action=submit&offset=1&limit=1000 (Caused by ResponseError('too many 500 error responses'))
----- Bot: Task DownloadMediaWiki finished -----
----- Bot: Exit code: 1 -----