As an alternative to downloading huge patches every time or using UberWilliam's Go PA patcher, I've written my own patcher in Python. I've already mentioned it in other threads, but I think it's better if I stop distracting people from the other discussions going on there. Henceforth, I will only accept feedback, feature requests, and bug reports in this very thread - unless of course you prefer using GitHub for that. You can find the official repository at https://github.com/pa-pyrus/papatcher. For Arch Linux there is an AUR package as well, though it's not maintained by me.
I know it is fairly new, but I'd really like to bump the Python requirement to 3.4. Does anyone have a major problem with that? Currently we're at 3.3 because of open mode 'x'. People on Arch and Ubuntu 14.04 (and probably Mint) should be fine, as well as everyone using the development versions of Debian, Fedora and openSUSE. If I don't get negative feedback, I'll probably make the switch in a couple of days.
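For reference, open mode 'x' (exclusive creation, new in Python 3.3) is what currently sets the minimum version: it creates the file and raises FileExistsError if it already exists. A minimal illustration, not code taken from the patcher - the filename and data are placeholders:

Code:
data = b"..."  # placeholder for downloaded bundle contents

# Exclusive creation: write the cached file only if it isn't there yet,
# instead of checking for existence first and racing against another run.
try:
    with open("bundle.cache", "xb") as cache_file:
        cache_file.write(data)
except FileExistsError:
    pass  # someone else already wrote this file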
Just got a weird patcher crash when updating to the most recent PTE:

Code:
Traceback (most recent call last):
  File "./papatcher.py", line 448, in <module>
    if not patcher.patch():
  File "./papatcher.py", line 242, in patch
    if not completed.result():
  File "/usr/lib/python3.4/concurrent/futures/_base.py", line 395, in result
    return self.__get_result()
  File "/usr/lib/python3.4/concurrent/futures/_base.py", line 354, in __get_result
    raise self._exception
  File "/usr/lib/python3.4/concurrent/futures/thread.py", line 54, in run
    result = self.fn(*self.args, **self.kwargs)
  File "./papatcher.py", line 282, in _download_bundle
    cache_file.write(response.read())
  File "/usr/lib/python3.4/http/client.py", line 512, in read
    s = self._safe_read(self.length)
  File "/usr/lib/python3.4/http/client.py", line 664, in _safe_read
    raise IncompleteRead(b''.join(s), amt)
http.client.IncompleteRead: IncompleteRead(97457 bytes read, 1743381 more expected)

Though after a restart the update finished just fine.
As nobody objected, I just bumped the Python requirement to 3.4 and made SSL mandatory again. For now, 3.4 is only necessary for ssl.create_default_context and os.cpu_count, so if you wish to support older versions as well, feel free to fork the repo. Please tell me if that crash happens again. If it wasn't just a glitch, I'll have the patcher retry downloading a file when it encounters that exception.
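To give an idea of what that retry could look like, here's a rough sketch that catches IncompleteRead and also shows the two 3.4 APIs mentioned above. It's not the patcher's actual code; the helper names (download_with_retry, download_all) and the attempt count are made up for illustration:

Code:
import os
import ssl
from concurrent.futures import ThreadPoolExecutor
from http.client import IncompleteRead
from urllib.request import urlopen

# ssl.create_default_context (new in 3.4) gives verified HTTPS without manual setup.
SSL_CONTEXT = ssl.create_default_context()

def download_with_retry(url, cache_path, attempts=3):
    """Fetch url into cache_path, retrying if the transfer is cut short."""
    for attempt in range(attempts):
        try:
            with urlopen(url, context=SSL_CONTEXT) as response:
                data = response.read()
            with open(cache_path, "wb") as cache_file:
                cache_file.write(data)
            return True
        except IncompleteRead:
            if attempt == attempts - 1:
                raise
    return False

def download_all(urls_and_paths):
    # os.cpu_count (also new in 3.4) sizes the pool of download workers.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        futures = [pool.submit(download_with_retry, url, path)
                   for url, path in urls_and_paths]
        return all(future.result() for future in futures)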
Thanks, this is great! Unfortunately I have a slow connection and am sometimes not sure if it's still downloading or just timed out - so I quickly hacked in some progress meters for the ongoing downloads. It's not particularly nice, but it should work. (I don't have git on this computer yet, so I've just attached it here.)
I'm currently looking into using PycURL instead of urllib or http.client for downloading files. While that adds a dependency on an external library, it should quite easily allow for faster, more stable downloads with optional rate limiting and progress indicators. I'll keep you posted.
I just pushed an update based on PycURL. That means you'll have to install python-pycurl. This change adds a progress meter and rate limiting (-r/--ratelimit). I also replaced os.path with pathlib. I had to change quite a bit for that, so keep an eye out for bugs.
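In case anyone wants to see the general shape of it, here's a stripped-down sketch of a PycURL download with a progress callback and an optional rate limit. The actual code in the repository differs; the function name and arguments here are just for illustration:

Code:
from pathlib import Path
import pycurl

def download(url, target: Path, ratelimit=0):
    """Download url into target, printing progress; ratelimit is bytes/s (0 = unlimited)."""
    def progress(dl_total, dl_now, ul_total, ul_now):
        if dl_total:
            print("\r%6.1f%% of %d bytes" % (dl_now * 100.0 / dl_total, dl_total), end="")

    target.parent.mkdir(parents=True, exist_ok=True)
    with target.open("wb") as out:
        curl = pycurl.Curl()
        curl.setopt(pycurl.URL, url)
        curl.setopt(pycurl.WRITEDATA, out)
        curl.setopt(pycurl.FOLLOWLOCATION, True)
        # Enable the progress callback (NOPROGRESS defaults to on, i.e. no callback).
        curl.setopt(pycurl.NOPROGRESS, False)
        curl.setopt(pycurl.PROGRESSFUNCTION, progress)
        if ratelimit:
            curl.setopt(pycurl.MAX_RECV_SPEED_LARGE, ratelimit)
        try:
            curl.perform()
        finally:
            curl.close()
    print()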
Nice, I'll test it when the next patch comes out. cURL is pretty much universal, so as far as dependencies go it's not something I'd worry about.
Code:
curl.setopt(pycurl.TIMEOUT, 300)

Probably not a good idea. People with slower connections will never be able to download the larger bundles.
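For what it's worth, libcurl also has a stall detector that fits this case better than a hard cap: LOW_SPEED_LIMIT/LOW_SPEED_TIME abort only when the transfer rate stays below a threshold for a given time, so large bundles on slow links can still finish. Just a sketch of the idea, not a proposed patch - the thresholds are arbitrary:

Code:
import pycurl

curl = pycurl.Curl()
# Instead of a hard 300 s cap, abort only when the transfer actually stalls:
# less than 1 KiB/s for 60 seconds straight.
curl.setopt(pycurl.LOW_SPEED_LIMIT, 1024)
curl.setopt(pycurl.LOW_SPEED_TIME, 60)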
I just pushed a change that makes the patcher conform to the XDG Base Directory specification. That means it now uses $HOME/.local/share and $HOME/.cache for application data and cached files, respectively. The subfolder in both cases is named Planetary Annihilation. You may want to remove the now obsolete directory $HOME/.local/Uber Entertainment/PA.
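For those curious how the directories are resolved: the spec also honours the XDG_DATA_HOME and XDG_CACHE_HOME environment variables, falling back to the defaults above when they're unset. Roughly like this - a sketch of the idea, not the exact code in the repository:

Code:
import os
from pathlib import Path

APP_NAME = "Planetary Annihilation"

# XDG Base Directory lookup: use the environment variable if set,
# otherwise fall back to the defaults from the specification.
home = Path(os.path.expanduser("~"))
data_home = Path(os.environ.get("XDG_DATA_HOME") or home / ".local" / "share")
cache_home = Path(os.environ.get("XDG_CACHE_HOME") or home / ".cache")

DATA_DIR = data_home / APP_NAME    # application data
CACHE_DIR = cache_home / APP_NAME  # downloaded bundle cache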