FS#26772 - Optimize heavy usage of "file" in makepkg
Attached to Project:
Pacman
Opened by Jason William Walton (jasonww) - Monday, 07 November 2011, 00:58 GMT
Last edited by Allan McRae (Allan) - Monday, 20 February 2012, 05:56 GMT
Details
Packaging large software should be faster.
Optimize the usage of file(1) and other external commands to reduce the time of the packaging process. Consider blacklisting certain directories in the obvious areas (e.g. /usr/include during stripping) to speed up the process.
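The blacklist idea can be sketched with find's -prune. This is only an illustration of the approach the report suggests; "pkgdir" and the blacklisted path are hypothetical names, not actual makepkg code.

```shell
#!/bin/sh
# Sketch only: skip a blacklisted directory (here usr/include) instead of
# feeding every file in the package tree to file(1). The tree below is a
# throwaway fixture; makepkg does not work this way verbatim.
set -e
pkgdir=$(mktemp -d)
mkdir -p "$pkgdir/usr/bin" "$pkgdir/usr/include"
printf '#!/bin/sh\necho hi\n' > "$pkgdir/usr/bin/keep"
printf 'int x;\n' > "$pkgdir/usr/include/skip.h"

# -prune stops find from descending into the blacklisted directory, so
# file(1) is never invoked on headers that cannot need stripping anyway.
found=$(find "$pkgdir" -path "$pkgdir/usr/include" -prune -o -type f -print)
echo "$found"
# The surviving list would then be handed to file(1), e.g.:
#   echo "$found" | xargs -r file --mime-type

rm -rf "$pkgdir"
```

The saving comes purely from invoking file(1) on fewer paths; the find traversal itself is cheap either way.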
This task depends upon
Closed by Allan McRae (Allan)
Monday, 20 February 2012, 05:56 GMT
Reason for closing: No response
Additional comments about closing: Will be reopened if numbers demonstrating useful speed benefits are provided.
Comment by Allan McRae (Allan) -
Monday, 07 November 2011, 02:20 GMT
I'd like to see some numbers here. I remember there being little
difference between scanning a whitelist of directories (the old
approach) and the entire package (the current approach), given that
all files examined by "file" will already be in the system cache at
that stage...
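One way to produce the numbers being asked for is a crude comparison on a synthetic tree. Everything below (the layout, the file counts, the whitelist choice) is fabricated for illustration and has nothing to do with any real package.

```shell
#!/bin/sh
# Crude benchmark sketch: compare a full scan of a synthetic package tree
# against a whitelisted subdirectory. All names and counts are made up.
set -e
pkgdir=$(mktemp -d)
mkdir -p "$pkgdir/usr/bin" "$pkgdir/usr/share/doc"
i=1
while [ "$i" -le 50 ]; do
  printf '#!/bin/sh\n' > "$pkgdir/usr/bin/prog$i"
  printf 'docs\n' > "$pkgdir/usr/share/doc/readme$i"
  i=$((i + 1))
done

full_count=$(find "$pkgdir" -type f | wc -l)          # current approach
white_count=$(find "$pkgdir/usr/bin" -type f | wc -l) # old approach

# With file(1) installed, the two approaches can be timed directly:
#   time find "$pkgdir" -type f -exec file {} + > /dev/null
#   time find "$pkgdir/usr/bin" -type f -exec file {} + > /dev/null

echo "full scan: $full_count files, whitelist scan: $white_count files"
rm -rf "$pkgdir"
```

If the whole tree is already in the page cache, as suggested above, the two timings may well come out nearly identical, which is exactly what the requested numbers would settle.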
Comment by Dan McGee (toofishes) -
Monday, 07 November 2011, 15:12 GMT
There is "big stuff", and there is "lots of files", and from what I
can tell the latter could be much more of a problem. Can you be more
specific, with an actual example including size and file count
numbers, please?