Community Packages

Please read this before reporting a bug:
http://wiki.archlinux.org/index.php/Reporting_Bug_Guidelines

Do NOT report bugs when a package is just outdated, or it is in the AUR. Use the 'flag out of date' link on the package page, or the Mailing List.

REPEAT: Do NOT report bugs for outdated packages!

FS#65675 - [restic] 0.9.6-2 operations fail on CIFS mounted repo

Attached to Project: Community Packages
Opened by Phil Lewellen (Underhillian) - Sunday, 01 March 2020, 22:13 GMT
Last edited by freswa (frederik) - Thursday, 19 March 2020, 23:28 GMT
Task Type Bug Report
Category Packages
Status Closed
Assigned To Christian Rebischke (Shibumi)
Morten Linderud (Foxboron)
Architecture x86_64
Severity Medium
Priority Normal
Reported Version
Due in Version Undecided
Due Date Undecided
Percent Complete 100%
Votes 0
Private No

Details

Description:

Prune and rebuild-index (at least) operations fail under 0.9.6-2. The operations succeed using both the previous Arch version (0.9.6-1) and the latest compiled version from the restic download page (v0.9.6-123-g95da6c1c_linux_amd64). Other operations (e.g. backup) appear to work normally.

The problem appears limited to repos accessed over a CIFS mount. Local repos seem unaffected.

Additional info:
* package version(s): restic 0.9.6-2, linux 5.5.7-arch1-1, glibc 2.31-1

Here is the error history from an attempted prune:

repository 0e454656 opened successfully, password is correct
counting files in repo
List(data) returned error, retrying after 552.330144ms: readdirent: interrupted system call
building new index for repo
Load(<data/216ea5f2d2>, 591, 4564169) returned error, retrying after 720.254544ms: open /mnt/nas/redacted/reponame/data/21/216ea5f2d21b458a7b913609ddef2a6ac4788b4bad5481b2916558d2ce1bef04: interrupted system call
Load(<data/27a2788568>, 591, 4698699) returned error, retrying after 582.280027ms: open /mnt/nas/redacted/reponame/data/27/27a2788568aeb93f264d00a6b201a083a0ca25379eb78190c62c33d26f7323fc: interrupted system call
Load(<data/2bec719a2b>, 591, 4742880) returned error, retrying after 468.857094ms: read /mnt/nas/redacted/reponame/data/2b/2bec719a2bb19764e756c63806447f738101aba18bbdad19f7124c74e78fc3eb: interrupted system call
Load(<data/457e74b780>, 591, 4417316) returned error, retrying after 462.318748ms: read /mnt/nas/redacted/reponame/data/45/457e74b780aed8f75c6169042641279ae886e70ffe7d518c64a8104edafb2002: interrupted system call
Load(<data/488a01577b>, 591, 4447883) returned error, retrying after 593.411537ms: open /mnt/nas/redacted/reponame/data/48/488a01577bd183bca81b616b2873ca65cfed22db636f02f982e7bfd99049412a: interrupted system call
Load(<data/4e5e29bbb7>, 591, 4581078) returned error, retrying after 282.818509ms: open /mnt/nas/redacted/reponame/data/4e/4e5e29bbb7061c21fec165d643e18dabdc6cfd3315a881b334b3bc8607dec88a: interrupted system call
Load(<data/5abc29df2c>, 591, 4563962) returned error, retrying after 328.259627ms: read /mnt/nas/redacted/reponame/data/5a/5abc29df2c4922d9d210b26fe358300980428f9d041201764f43bcc810526f3e: interrupted system call
List(data) returned error, retrying after 298.484759ms: lstat /mnt/nas/redacted/reponame/data/5d: interrupted system call
[0:04] 41.24% 5993 / 14533 packs
repository contains 5993 packs (33054 blobs) with 27.052 GiB
processed 33054 blobs: 0 duplicate blobs, 0 B duplicate
load all snapshots
find data that is still in use for 9 snapshots
tree 26ead64a6120bcc0eda123d5bdc07bd3c655ce51b29ad147864fad6e21f8913a not found in repository
github.com/restic/restic/internal/repository.(*Repository).LoadTree
github.com/restic/restic/internal/repository/repository.go:713
github.com/restic/restic/internal/restic.FindUsedBlobs
github.com/restic/restic/internal/restic/find.go:11
main.pruneRepository
github.com/restic/restic/cmd/restic/cmd_prune.go:191
main.runPrune
github.com/restic/restic/cmd/restic/cmd_prune.go:85
main.glob..func18
github.com/restic/restic/cmd/restic/cmd_prune.go:25
github.com/spf13/cobra.(*Command).execute
github.com/spf13/cobra@v0.0.3/command.go:762
github.com/spf13/cobra.(*Command).ExecuteC
github.com/spf13/cobra@v0.0.3/command.go:852
github.com/spf13/cobra.(*Command).Execute
github.com/spf13/cobra@v0.0.3/command.go:800
main.main
github.com/restic/restic/cmd/restic/main.go:86
runtime.main
runtime/proc.go:203
runtime.goexit
runtime/asm_amd64.s:1373

Steps to reproduce:

Perform restic prune on a remote repo mounted as a CIFS share.
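A concrete reproduction might look like the following sketch; the share name, mount point, credentials file, and repository path are placeholders for illustration, not values taken from this report:

```shell
# Mount the NAS share over CIFS (all paths here are hypothetical)
sudo mount -t cifs //nas/backups /mnt/nas -o credentials=/etc/cifs-creds

# Run prune against a repository on the CIFS mount. With restic
# 0.9.6-2 (built with Go 1.14), this fails with the
# "interrupted system call" retries shown in the log above.
restic -r /mnt/nas/reponame prune
```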

Closed by freswa (frederik)
Thursday, 19 March 2020, 23:28 GMT
Reason for closing: Upstream
Comment by Christian Rebischke (Shibumi) - Sunday, 08 March 2020, 22:53 GMT
Hi,
did you report this upstream?
Comment by Phil Lewellen (Underhillian) - Monday, 09 March 2020, 13:05 GMT
I did not. As I mentioned, one of the things I tried was installing the latest (at the time) pre-compiled build from upstream (v0.9.6-123-g95da6c1c_linux_amd64). This resolved the issue. I concluded that either the problem did not originate upstream, or (if it did) it has since been resolved. Either way, I didn't see a reason to report upstream.

FWIW, I didn't find any mention of the problem (or of a resolution) in the restic issues database. Since the latest restic works, this further led me to think it must be an Arch issue.

Thanks.
Comment by Morten Linderud (Foxboron) - Monday, 09 March 2020, 13:07 GMT
The problem is (if anything) related to go 1.14 changes. The prebuilt binaries might still be based on the 1.13 compiler. It's still an upstream issue.
Comment by Phil Lewellen (Underhillian) - Tuesday, 10 March 2020, 11:26 GMT
Indeed your diagnosis is correct. I built restic from source using the current Arch go release (2:1.14.1) and reproduced the problem.

I am new to bug reporting. Would it be better to report this upstream to restic or to go? (I know very little about go, so not sure I could report the issue intelligently there).

Thanks.
Comment by Phil Lewellen (Underhillian) - Thursday, 12 March 2020, 13:35 GMT
I took this up on the restic forum. It appears very likely that, as suggested there by @MichaelEischer, my problem is "a side effect of asynchronous preemptions in go 1.14." He provided a link to the go changelog (https://golang.org/doc/go1.14#runtime) where there is some relevant commentary.

Without repeating the forum discussion, my personal solution to this will be to get out of the CIFS business with restic and use a different backend (e.g. rest-server). I can imagine that the issue might come up again for someone who doesn't have this option, but I'm convinced this isn't an Arch bug after all and ask that the report be closed.

Thanks for the help.
Comment by Phil Lewellen (Underhillian) - Thursday, 19 March 2020, 21:12 GMT
@MichaelEischer further suggested a simple workaround with some explanation (see https://forum.restic.net/t/prune-fails-on-cifs-repo-using-go-1-14-build/2579/10?u=underhillian) which resolves the issue. It appears from that discussion that there may in fact be an issue with the kernel.
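The linked forum post has the details of the workaround that was suggested there. Separately, one mitigation documented in the Go 1.14 release notes for programs affected by asynchronous preemption is to disable it via GODEBUG; shown here as an illustration (with a placeholder repository path), not as a quote of the forum advice:

```shell
# Disable Go 1.14's asynchronous preemption for this invocation only.
# This avoids the signal-triggered "interrupted system call" errors,
# at the cost of losing the new preemption behaviour.
GODEBUG=asyncpreemptoff=1 restic -r /mnt/nas/reponame prune
```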
