Regression on USB DRM device hotplug
Xorg crashes when I hotplug a DisplayLink USB 3 dock.
Setup: Xorg at commit 3340ddf3 on Ubuntu 19.04, with the proprietary DisplayLink driver stack installed (the open-source EVDI kernel driver creates a normal DRM device node; the closed-source userspace component acts as the sink).
Hotplugging and unplugging the dock works fine if it was already plugged in when Xorg started up, i.e. if the DRM device node pre-exists.
When the DRM device node does not pre-exist and the dock is hotplugged, Xorg crashes with:
```
(EE) Segmentation fault at address 0x3
(EE)
Fatal server error:
(EE) Caught signal 11 (Segmentation fault). Server aborting
```
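For anyone reproducing: one way to open the crash in GDB afterwards, assuming systemd-coredump is collecting core dumps (an assumption; stock Ubuntu uses apport), is:

```
coredumpctl gdb Xorg
```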
The GDB backtrace is:
```
#7 0x00007f716f2c5f60 in <signal handler called> () at /lib/x86_64-linux-gnu/libc.so.6
#8 0x00007f716f32aa1a in __strcmp_sse2_unaligned () at ../sysdeps/x86_64/multiarch/strcmp-sse2-unaligned.S:31
#9 0x00005646e4960425 in xf86ProbeOutputModes (scrn=scrn@entry=0x5646e55e9560, maxX=8192, maxX@entry=0, maxY=8192, maxY@entry=0)
    at ../hw/xfree86/modes/xf86Crtc.c:1852
#10 0x00005646e49692b7 in xf86RandR12GetInfo12 (pScreen=0x5646e5790b00, rotations=<optimised out>) at ../hw/xfree86/modes/xf86RandR12.c:1694
#11 0x00005646e488a6f5 in RRGetInfo (pScreen=0x5646e5790b00, force_query=<optimised out>) at ../randr/rrinfo.c:196
#12 0x00005646e491bb31 in ospoll_wait (ospoll=0x5646e55c0350, timeout=<optimised out>) at ../os/ospoll.c:657
#13 0x00005646e4915673 in WaitForSomething (are_ready=0) at ../os/WaitFor.c:208
#14 0x00005646e484cdac in Dispatch () at ../include/list.h:220
#15 0x00005646e4850fe6 in dix_main (argc=2, argv=0x7fff855253f8, envp=<optimised out>) at ../dix/main.c:274
#16 0x00007f716f2a8b6b in __libc_start_main (main=
    0x5646e4814220 <main>, argc=2, argv=0x7fff855253f8, init=<optimised out>, fini=<optimised out>, rtld_fini=<optimised out>, stack_end=0x7fff855253e8)
    at ../csu/libc-start.c:308
#17 0x00005646e481425a in _start () at ../hw/xfree86/modes/xf86Crtc.c:2045
```
Inspecting frame #9 in GDB:
```
(gdb)
#9 0x00005646e4960425 in xf86ProbeOutputModes (scrn=scrn@entry=0x5646e55e9560, maxX=8192, maxX@entry=0, maxY=8192, maxY@entry=0)
    at ../hw/xfree86/modes/xf86Crtc.c:1852
1852            if (!strcmp(preferred_mode, mode->name)) {
(gdb) print preferred_mode
$1 = 0x3 <error: Cannot access memory at address 0x3>
(gdb) print mode->name
$2 = 0x5646e5cbaed0 "1366x768"
(gdb) print *scrn->display
$4 = {frameX0 = 0, frameY0 = 0, virtualX = 1366, virtualY = 768, depth = 24, fbbpp = 32, weight = {red = 0, green = 0, blue = 0}, blackColour = {
    red = 4294967295, green = 4294967295, blue = 4294967295}, whiteColour = {red = 4294967295, green = 4294967295, blue = 4294967295}, defaultVisual = -1,
    modes = 0x5646e55d8a40, options = 0x0}
(gdb) print scrn->display->modes
$5 = (const char **) 0x5646e55d8a40
(gdb) print *scrn->display->modes
$6 = 0x3 <error: Cannot access memory at address 0x3>
```
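So the fallback source for preferred_mode, *scrn->display->modes, is itself the garbage pointer 0x3, and strcmp() faults dereferencing it. For context, here is a paraphrased sketch of what I understand the code around xf86Crtc.c:1852 to do (reconstructed from the session above, not verbatim upstream code):

```c
/* Paraphrased sketch of xf86ProbeOutputModes() around xf86Crtc.c:1852;
 * not the verbatim upstream code. */
const char *preferred_mode =
    xf86GetOptValString(output->options, OPTION_PREFERRED_MODE);

/* Fallback: the first mode name configured for this screen.
 * This is the pointer that prints as 0x3 in the session above. */
if (!preferred_mode && scrn->display->modes && *scrn->display->modes)
    preferred_mode = *scrn->display->modes;

for (DisplayModePtr mode = output->probed_modes; mode; mode = mode->next) {
    /* Line 1852: strcmp() dereferences preferred_mode (0x3) and faults. */
    if (preferred_mode && !strcmp(preferred_mode, mode->name)) {
        mode->type |= M_T_PREFERRED;
        break;
    }
}
```

In other words, the strcmp() is only downstream damage: the modes list in the hotplugged screen's display subsection apparently never gets validly populated.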
This is a regression compared to Xorg 1.20.4 in Ubuntu (with the Ubuntu patch stack), where this use case works just fine. Testing without the Ubuntu patch stack is not meaningful, because I would then lack the patch that autobinds GPUs to the screen, which means nothing visible happens when a new DRM device appears.
Here is the full Xorg.0.log from that run. Curiously, the first time it mentions card1 at all is its removal:
```
[ 94.218] (II) config/udev: removing GPU device /sys/devices/platform/evdi.0/drm/card1 /dev/dri/card1
```
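That is, there is no earlier line for card1 being added. To check whether the kernel and udev actually emit an add event for card1 on hotplug, the DRM uevents can be watched from another terminal with plain udevadm (nothing DisplayLink-specific assumed):

```
udevadm monitor --kernel --udev --subsystem-match=drm
```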