Historical bug tracker for the Pacman package manager.
The pacman bug tracker has moved to GitLab:
https://gitlab.archlinux.org/pacman/pacman/-/issues
This tracker remains open for interaction with historical bugs during the transition period. Any new bug reports will be closed without further action.
FS#11076 - makepkg download fails with special chars in URL
Attached to Project:
Pacman
Opened by Allan McRae (Allan) - Thursday, 31 July 2008, 14:02 GMT
Last edited by Xavier (shining) - Wednesday, 06 August 2008, 06:22 GMT
Summary and Info:
makepkg fails to download sources when there are special characters in the URL, e.g. the mythbrowser package, which has the source URL: http://www.mythtv.org/modules.php?name=Downloads&d_op=getit&lid=136&foo=/mythplugins-0.21.tar.bz2
This task depends upon
Closed by Xavier (shining)
Wednesday, 06 August 2008, 06:22 GMT
Reason for closing: Fixed
Additional comments about closing: fixed in commit 9bc799ec7b1
local proto=$(echo $netfile | sed 's|://.*||')
from get_downloadclient() should be replaced with
local proto=$(echo "$url" | sed 's|://.*||')
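A minimal sketch of why the quoting matters (the URL here is the one from the report; the snippet is illustrative, not the actual makepkg code): when the variable is expanded unquoted, characters like '?' and '&' in the URL are subject to pathname expansion and word splitting before sed ever sees them.

```shell
# Quoted expansion: '?' and '&' in the URL reach sed literally,
# so stripping everything from '://' onward yields the protocol.
url='http://www.mythtv.org/modules.php?name=Downloads&d_op=getit&lid=136'
proto=$(echo "$url" | sed 's|://.*||')
echo "$proto"   # prints: http
```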
Also,
local dlcmd=$(echo "$dlagent" | sed "s|%o|$file.part|" | sed "s|%u|$netfile|")
from get_downloadcmd() fails due to sed's handling of '&' in the right-hand side of the expression.
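The '&' pitfall can be demonstrated in isolation. This is a hedged sketch (the variable names mirror makepkg's, but the values are made up): in a sed replacement, a bare '&' stands for the entire matched text, so substituting a URL that contains '&' corrupts the result unless the URL is escaped first.

```shell
dlagent='/usr/bin/wget -O %o %u'
netfile='http://example.com/dl.php?a=1&b=2'

# Naive substitution: every '&' in $netfile turns into '%u' (the matched text).
bad=$(echo "$dlagent" | sed "s|%u|$netfile|")
echo "$bad"    # ...dl.php?a=1%ub=2 -- corrupted

# Backslash-escape '&', '\' and the '|' delimiter before substituting.
escaped=$(printf '%s\n' "$netfile" | sed 's/[&|\\]/\\&/g')
good=$(echo "$dlagent" | sed "s|%u|$escaped|")
echo "$good"   # ...dl.php?a=1&b=2 -- intact
```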
the cl-asdf package in AUR (http://aur.archlinux.org/packages.php?do_Details=1&ID=18800&O=&L=&C=&K=&SB=&SO=&PP=&do_Orphans=&SeB=) mysteriously fails.
The makepkg line that fails is '$(get_downloadcmd "$dlclient" "$netfile" "$file") || ret=$?'. The "get_downloadcmd()" routine works fine (calling it by itself echoes a command which I can run in an interactive bash session with no errors), but somehow the line as a whole manages to fail with this message: "/usr/bin/wget: option requires an argument -- 'O'".
FYI, the output of get_downloadcmd() for this specific package is: '/usr/bin/wget -c -t 3 --waitretry=3 -O asdf.lisp?revision=1.123.part http://cclan.cvs.sourceforge.net/*checkout*/cclan/asdf/asdf.lisp?revision=1.123';
What makepkg (from pacman 3.2.0-1) tells me (after I put the URL in single quotes):
% makepkg -sfic
==> Making package: ut 451-4 i686 (Sun Aug 3 22:04:14 CEST 2008)
==> Checking Runtime Dependencies...
==> Checking Buildtime Dependencies...
==> Retrieving Sources...
==> ERROR: There is no agent set up to handle URLs. Check /etc/makepkg.conf.
Aborting...
-> Downloading ?what=dl&catid=6&gameid=51&filename=unreal.tournament_436-multilanguage.run...
==> ERROR: There is no agent set up to handle URLs. Check /etc/makepkg.conf.
Aborting...
-> Downloading ?what=dl&catid=6&gameid=51&filename=unreal.tournament_436-multilanguage.goty.run...
-> Using cached copy of UTPGPatch451.tar.bz2
-> Found ut.desktop in build dir
-> Found base.list in build dir
==> WARNING: Integrity checks (md5) are missing or incomplete.
==> Extracting Sources...
==> ERROR: Unable to find source file ?what=dl&catid=6&gameid=51&filename=unreal.tournament_436-multilanguage.run for extraction.
Aborting...
I tested ut and cl-asdf with it and it worked fine.
However, the filenames are awful with these stupid URLs (?what=dl&catid=6&gameid=51&filename=unreal.tournament_436-multilanguage.run and asdf.lisp?revision=1.123), but well, it has always been like that.
Thanks Xavier, I did not know about nullglob. Now I know why they call you "veteran programmer extraordinaire" :)
"Thanks to Henning Garus for pointing out that nullglob was problematic with
urls containing expansion char like '?'."
http://archlinux.org/pipermail/pacman-dev/2008-July/012715.html
So I guess Henning Garus aka Gars deserves that title more than me :)
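The nullglob interaction mentioned above can be sketched as follows (an illustration, not the original makepkg code): with nullglob enabled, an unquoted word containing '?' that matches no file expands to nothing, so the URL silently vanishes from the command line, which is one way wget can end up seeing '-O' with no argument.

```shell
url='http://example.com/dl.php?a=1&foo=bar'

# Unquoted expansion under nullglob: the URL is treated as a glob
# pattern, matches no file, and disappears (0 words survive).
n_unquoted=$(bash -c 'shopt -s nullglob; set -- $1; echo $#' _ "$url")

# Quoted expansion: globbing is suppressed, the URL survives as 1 word.
n_quoted=$(bash -c 'shopt -s nullglob; set -- "$1"; echo $#' _ "$url")

echo "unquoted=$n_unquoted quoted=$n_quoted"
```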