FS#63048 - Conky segfaults shortly after starting

Attached to Project: Arch Linux
Opened by Nick Boughton (tiberiouse) - Sunday, 30 June 2019, 21:14 GMT
Last edited by Gaetan Bisson (vesath) - Monday, 16 September 2019, 18:26 GMT
Task Type Bug Report
Category Packages: Extra
Status Closed
Assigned To Gaetan Bisson (vesath)
Architecture All
Severity Low
Priority Normal
Reported Version
Due in Version Undecided
Due Date Undecided
Percent Complete 100%
Votes 0
Private No

Details

Description:
Conky starts but segfaults moments later. The config below had worked for months until a recent update.

Additional info:
* package version(s)
1.11.4-1

* config and/or log files etc.
conky.config = {
alignment = 'top_right',
background = true,
border_width = 0,
cpu_avg_samples = 2,
default_color = 'lightgrey',
default_outline_color = 'white',
default_shade_color = 'white',
double_buffer = true,
draw_borders = false,
draw_graph_borders = false,
draw_outline = false,
draw_shades = false,
extra_newline = false,
font = 'Ubuntu Mono:size=10',
gap_x = 15,
gap_y = 35,
minimum_width = 100,
maximum_width = 100,
net_avg_samples = 2,
no_buffers = true,
out_to_console = false,
out_to_ncurses = false,
out_to_stderr = false,
out_to_x = true,
own_window = true,
own_window_class = 'Conky',
own_window_type = 'desktop',
own_window_transparent = true,
own_window_argb_visual = true,
show_graph_range = false,
show_graph_scale = false,
stippled_borders = 0,
update_interval = 1.0,
uppercase = false,
use_spacer = 'none',
use_xft = true,
template0 = [[${font Symbols Nerd Font}$font \1 ${alignr} ${fs_used_perc \1}%
${fs_bar 4 \1}]],
}

conky.text = [[
${font Symbols Nerd Font}⏻$font${alignr}$uptime

${font Symbols Nerd Font}$font${alignr}$battery
${battery_bar 4}

${template0 /}
${template0 /home}
${font Symbols Nerd Font}$font swap $alignr$swapperc%
${swapbar 4}

${font Symbols Nerd Font}說$font$alignr${wireless_essid wlp1s0}
${font Symbols Nerd Font}$font$alignr${upspeed wlp1s0}
${upspeedgraph wlp1s0 10,0 25928f f06068 -t}
${font Symbols Nerd Font}$font$alignr${downspeed wlp1s0}
${downspeedgraph wlp1s0 10,0 25928f f06068 -t}

${font Symbols Nerd Font}$font $running_processes/$processes${alignr}${cpu cpu0}%
${cpugraph cpu0 10,0 25928f f06068 -t}

${font Symbols Nerd Font}$font $memeasyfree$alignr$memperc%
${memgraph 10,0 25928f f06068 -t}
#${alignc}${font Symbols Nerd Font:size=40}$font
]]
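(Editor's note, not part of the original report.) The `template0` definition above uses `\1` as an argument placeholder: `${template0 /}` and `${template0 /home}` in `conky.text` substitute the mount point for each `\1`. Conky performs this expansion internally; the sketch below only emulates it with `sed` to show what the expanded text looks like.

```shell
# Emulate conky's template argument substitution for ${template0 /home}.
# This is an illustration only; conky does this expansion itself.
template='${font Symbols Nerd Font}$font \1 ${alignr} ${fs_used_perc \1}%'
expanded=$(printf '%s\n' "$template" | sed 's|\\1|/home|g')
printf '%s\n' "$expanded"
```

Both occurrences of `\1` are replaced, yielding a per-filesystem usage line for `/home`.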

* link to upstream bug report, if any

Steps to reproduce:
Start conky with the above config.
This task depends upon

Closed by Gaetan Bisson (vesath)
Monday, 16 September 2019, 18:26 GMT
Reason for closing: No response
Comment by Nick Boughton (tiberiouse) - Sunday, 30 June 2019, 21:15 GMT
Segfault output from journalctl -xe:

Jun 30 22:05:06 bigby systemd-coredump[5482]: Process 5462 (conky) of user 1000 dumped core.

Stack trace of thread 5462:
#0 0x000055c57841690b _Z20draw_each_line_innerPcii (conky)
#1 0x000055c5784170d2 _ZL9draw_textv (conky)
#2 0x000055c5784177e8 _ZL10draw_stuffv (conky)
#3 0x000055c57841a4dd _Z9main_loopv (conky)
#4 0x000055c57840d35a main (conky)
#5 0x00007f9e9f354ee3 __libc_start_main (libc.so.6)
#6 0x000055c57841245e _start (conky)

Stack trace of thread 5466:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)

Stack trace of thread 5472:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)

Stack trace of thread 5468:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)

Stack trace of thread 5469:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)

Stack trace of thread 5470:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)

Stack trace of thread 5471:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)

Stack trace of thread 5467:
#0 0x00007f9ea07e0d16 do_futex_wait.constprop.0 (libpthread.so.0)
#1 0x00007f9ea07e0e18 __new_sem_wait_slow.constprop.0 (libpthread.so.0)
#2 0x000055c5784504f6 _ZN5conky4priv13callback_base13start_routineEv (conky)
#3 0x00007f9e9f5dbcd4 execute_native_thread_routine (libstdc++.so.6)
#4 0x00007f9ea07d857f start_thread (libpthread.so.0)
#5 0x00007f9e9f42a0e3 __clone (libc.so.6)
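(Editor's note, not part of the original report.) The frames in the trace are mangled C++ symbols; they can be decoded with `c++filt`, which ships with binutils. For example, the faulting frame from thread 5462:

```shell
# Demangle the symbol of the frame where the segfault occurred.
# c++filt is part of binutils.
demangled=$(echo '_Z20draw_each_line_innerPcii' | c++filt)
printf '%s\n' "$demangled"   # a C++ function taking (char*, int, int)
```

This shows the crash is inside conky's text-drawing path (`draw_each_line_inner`), consistent with the `draw_text`/`draw_stuff` callers above it.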
Comment by Gaetan Bisson (vesath) - Tuesday, 02 July 2019, 21:08 GMT
Could you please report this upstream? And if you could bisect the exact commit that triggers this segfault between the previous release that worked for you and the current one, that would be great. Sorry I cannot do more, but I'm travelling with limited Internet access for a while still. Cheers.
