FS#58015 - makepkg could use all CPU cores during package compression
Attached to Project:
Arch Linux
Opened by Daniel Quinn (danielquinn) - Thursday, 29 March 2018, 07:53 GMT
Last edited by Christian Hesse (eworm) - Thursday, 29 March 2018, 08:22 GMT
Details
Description:
I don't use the AUR very often, but when I do, it's for rather large packages like pycharm-professional, with an installed size of ~700 MB. For packages like this, the `Compressing package...` step takes a surprisingly long time, so I dug around to see just what was going on:

1. It's building a tar.xz archive.
2. It's not using `--threads=0`, so compression uses only one of my 16 cores :-(

If the Bash code could be modified to do something like this, compression would go a lot faster, and people with more than one core (most people these days) could make better use of their machines:

    tar --use-compress-program='xz --threads=0' -cf projectname.tar.xz path/

Steps to reproduce:
1. makepkg -i
2. Wait :-)
Closed by Christian Hesse (eworm)
Thursday, 29 March 2018, 08:22 GMT
Reason for closing: Not a bug
Additional comments about closing: Can be configured in makepkg.conf
Comment by Daniel M. Capella (polyzen) - Thursday, 29 March 2018, 08:02 GMT
https://wiki.archlinux.org/index.php/Makepkg#Utilizing_multiple_cores_on_compression
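For reference, the wiki page linked above amounts to overriding the compression command in makepkg.conf rather than patching makepkg itself. A minimal sketch of that configuration (COMPRESSXZ is the documented makepkg.conf array for the xz invocation; `--threads=0` requires xz ≥ 5.2):

```shell
# /etc/makepkg.conf (or a per-user copy, e.g. ~/.makepkg.conf)
# Override the default single-threaded xz command so that
# package compression uses all available CPU cores.
COMPRESSXZ=(xz -c -z - --threads=0)
```

With this in place, `makepkg` picks up the multi-threaded command automatically whenever PKGEXT ends in `.tar.xz`; no change to the build scripts is needed.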
Comment by Daniel Quinn (danielquinn) - Thursday, 29 March 2018, 08:12 GMT
I'm so sorry, I was careful to search for existing bugs, but
didn't look for my answers there. Thanks for pointing it out. You
might as well close this :-(