#StackBounty: #graphics Graphical distortions on high res / high refresh rate monitor

Bounty: 100

Ever since upgrading to a 1440p 144 Hz monitor, Ubuntu has been displaying strange graphical distortions. Here is one example:

graphical distortions on Ubuntu

This doesn’t happen on Windows, therefore I think this is an OS-specific or software problem and not a hardware defect. It also didn’t happen with my old 1080p monitor, and I haven’t changed any settings since changing monitors (besides upping the refresh rate).

I’m not sure how to go about troubleshooting this, so any help is appreciated.

Note: I had to take a photo with my phone because screenshotting it didn’t capture the distortions.

Output of lspci -k | grep -EA3 'VGA|3D|Display':

01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Tahiti XT [Radeon HD 7970/8970 OEM / R9 280X]
    Subsystem: Micro-Star International Co., Ltd. [MSI] Tahiti XT [Radeon HD 7970/8970 OEM / R9 280X]
    Kernel driver in use: radeon
    Kernel modules: radeon, amdgpu
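The Tahiti chip in this output is a GCN 1.0 ("Southern Islands") part, and as the last line shows, both the older radeon driver (currently in use) and amdgpu are available for it. As a hedged experiment rather than a confirmed fix, it may be worth checking the kernel log for display errors while the distortion is visible, and then trying amdgpu's experimental Southern Islands support:

```shell
# Look for display-engine errors logged around the time the distortion appears
journalctl -b | grep -iE 'drm|radeon|amdgpu'

# To test the amdgpu driver on this GCN 1.0 card, add these kernel
# parameters to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub and
# run "sudo update-grub" before rebooting:
#   radeon.si_support=0 amdgpu.si_support=1
```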

Get this bounty!!!

#StackBounty: #linux #graphics #multi-monitor #intel-graphics #dock-station Dual monitor issues Linux Mint Thinkpad Docking station

Bounty: 100

The setup

  • T460 ThinkPad (Intel HD Graphics) docked with the lid closed,
    connected to two external monitors: one via DVI, the other via HDMI

The problem

  • During boot, up until the login screen, both monitors work fine. But as soon as I sign in, the monitor connected via DVI goes blank about 80% of the time.

My workaround solutions to fix this problem

  • Going back to the login screen via Ctrl+Alt+Backspace and signing in again will almost always fix it.
  • Physically removing and reinserting the cable will almost always fix it.


  • When the problem occurs after logging in, the monitor still seems to be recognized, as I can move my mouse into its screen area, but the monitor claims it is waiting for a signal.
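The cable replug workaround can usually be reproduced in software by forcing the output to be re-probed with xrandr, which is easier to script. The output name below (`DVI-I-1`) is an assumption; run xrandr first to see the real name:

```shell
# List connected outputs to find the DVI output's actual name
xrandr

# Turn the (assumed) DVI output off and back on to force a re-probe
xrandr --output DVI-I-1 --off
xrandr --output DVI-I-1 --auto
```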

When did the problem occur?

  • After moving, going from three monitors to two, and reinstalling Linux Mint to set up LUKS encryption

I’m pretty green when it comes to diagnosing graphics problems. I would be happy to paste logs, but I’m not sure which ones would be helpful. Please let me know which logs might be worth taking a look at, or any other suggestions you may have!


#StackBounty: #performance #graphics #fractals #julia Julia set in Julia (and other fractals)

Bounty: 50

I’ve made the following to draw a series of frames zooming into a fractal.

using Distributed, FileIO, ImageCore, Images, ImageView

@everywhere function mandel_pow(z::Complex, p::Number)
    c = z
    maxiter::UInt8 = 255
    for n = 1:maxiter
        if abs2(z) > 4
            return n - 1
        end
        z = z^p + c
    end
    return maxiter
end

@everywhere function mandel(z)
    mandel_pow(z, 2)
end

@everywhere function julia_pow(z::Complex, c::Complex, p::Number)
    maxiter::UInt8 = 255
    for n = 1:maxiter
        if abs2(z) > 4
            return n - 1
        end
        z = z^p + c
    end
    return maxiter
end

@everywhere function julia(z, c)
    julia_pow(z, c, 2)
end

function gen_frame(min_coord::Complex,
                   max_coord::Complex, res, fn)
    x_res, y_res = res
    arr = @distributed hcat for i in range(real(min_coord), real(max_coord), length=x_res)
        col = Vector{UInt8}(undef, y_res)
        for (j, y) in enumerate(range(imag(max_coord), imag(min_coord), length=y_res))
            col[j] = fn(i + y*im)
        end
        col
    end
end

function zoom_in(image, zoom, scale)
    y_res, x_res = size(image)
    zoom = zoom/2
    xmin = 1 + round(Int, x_res*(.5 - zoom))
    xmax = round(Int, x_res*(.5 + zoom))
    ymin = 1 + round(Int, y_res*(.5 - zoom))
    ymax = round(Int, y_res*(.5 + zoom))
    v = @view image[ymin:ymax, xmin:xmax]
    imresize(v, div.(size(image), scale))
end

function draw_zoom(center, half_size, fps, seconds, fn)
    res = (1920, 1080)
    scale = 2

    for t in 0:seconds
        zoom = scale^convert(Float64, -t)
        base = normedview(gen_frame(center - half_size*zoom,
                                    center + half_size*zoom,
                                    res .* scale, fn))
        for i in 0:fps
            zoom = scale^(-i/fps)
            save("mandel" * lpad(string(i + fps*t), 4, '0') * ".jpg", zoom_in(base, zoom, scale))
        end
        println(" done.")
    end
end

c = -0.5-0im
half_size = 1.5+1im
draw_zoom(c, half_size, 1, 1, mandel)

It can draw 10 seconds of frames at 30 fps in about a minute using all cores of my computer.
I’m specifically looking for ways to improve how gen_frame is parallelized, and ideally to make saving the zoomed-in frames happen while the next computation is being done.
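On the last point, overlapping the saving with the next computation: the usual shape is to hand each finished frame to a single background worker and immediately start computing the next one. Below is a minimal sketch of that pattern, in Python for brevity and with stand-in functions rather than the real gen_frame/save calls; in Julia the same shape can be built with Threads.@spawn or remotecall:

```python
from concurrent.futures import ThreadPoolExecutor

def compute_frame(t):
    # stand-in for the expensive gen_frame call
    return [t * i for i in range(4)]

def save_frame(frame, name):
    # stand-in for the save(...) call; returns the name so we can track it
    return name

saved = []
with ThreadPoolExecutor(max_workers=1) as pool:
    pending = None
    for t in range(3):
        frame = compute_frame(t)            # compute in the foreground
        if pending is not None:
            saved.append(pending.result())  # block only if the save is slow
        pending = pool.submit(save_frame, frame, "mandel%04d.jpg" % t)
    saved.append(pending.result())          # drain the last save
```

With a single worker, at most one frame is held in memory waiting to be saved, so compute and disk I/O overlap without unbounded buffering.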


#StackBounty: #algorithms #optimization #graphics Algorithm or Method to reconstruct 3D world coordinates from 2D pixels with some know…

Bounty: 100

I have an algorithm that can recognize the 2D pixel locations of certain 3D points within a 2D image. Ultimately we are interested in computing a real-world horizontal distance between these points. See the example below.

Example image

We have some known a-priori information about the world and the camera.
First of all, we know the real vertical distances between the two pairs of points, and this is always a fixed value in the real world (e.g. 2 meters). We hope to use this information as a reference value to scale the distances between other points in the world.

Other assumptions that we have (and hence may be used in the algorithm) are

  • known corresponding world distances between points (vertically)
  • the detected points lie in the same plane
  • the camera’s optical axis is approximately perpendicular to this plane
  • focal length f
  • field of view (angular)
  • more than 1 image taken at potentially different heights

Our current method of solving this (i.e. obtaining the real-world horizontal distance) is to linearly extrapolate the known vertical reference distances to fit the pixel locations.
This should (and theoretically will) work perfectly given the above assumptions. However, if a pixel location is off by 1 or 2 pixels, the error can propagate to ~20 mm, depending on the height. Not to mention other real-world variations such as camera angle and object height offsets, which may yield too big an error. Also, this computation does not use all the information available: we only need one of the two known distances and only one image to perform it.

Now I’m wondering if we can’t approach this problem in a better way, by combining:

  • multiple images,
  • the information of the focal length/angle,
  • and potentially multiple reference distances within the image.

Research suggests that there are algorithms like Kanade-Tomasi and back projection, but I’m not sure whether they are relevant to us, because they use different assumptions (like a constant distance to objects?).

Anyway, I myself was thinking in terms of a least-squares approach, but I’m not sure how to parametrize the model such that it predicts a 3D reconstruction (or real-world distances) given the knowns (focal length etc.).

So I’m seeking a push in the right direction that can help solve our prediction/computation problem, given the above assumptions.
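As a concrete starting point for the least-squares idea: under the stated assumption that the plane is roughly perpendicular to the optical axis, world distance is proportional to pixel distance, d_world ≈ s · d_pix. Rather than taking the scale s from a single reference, every known vertical reference can vote on s in a least-squares sense, which averages out 1 or 2 px of detection noise. A minimal sketch with illustrative numbers (not taken from the question):

```python
def estimate_scale(pixel_lengths, world_lengths):
    # least-squares slope through the origin: s = sum(p*w) / sum(p*p)
    num = sum(p * w for p, w in zip(pixel_lengths, world_lengths))
    den = sum(p * p for p in pixel_lengths)
    return num / den

# two vertical references, both known to be 2.0 m in the world,
# measured as slightly different pixel lengths due to detection noise
s = estimate_scale([400.0, 398.0], [2.0, 2.0])

# horizontal world distance implied by a 600 px horizontal pixel distance
horizontal = s * 600.0
```

The same structure extends to multiple images: stack the (pixel, world) pairs from each image into the two lists before solving, so every measurement constrains the single unknown.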


#StackBounty: #nvidia #graphics #19.04 How to enable NVIDIA?

Bounty: 50

Note: Even after I followed the explanation in another question (my question was flagged as a possible duplicate of it), nothing about what I am detailing below changed.

I know there are myriad questions and how-to’s about installing and using NVIDIA graphics card on Ubuntu, but my case is a bit different (probably due to using Ubuntu 19.04).

My laptop has a GeForce GTX 1050 NVIDIA graphics card besides the “default” Intel graphics card. (I am using the laptop’s own display and nothing is connected to the HDMI port.)

$ lspci -k | grep -A 2 -i "VGA"
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 630 (Mobile)
    Subsystem: Tongfang Hongkong Limited UHD Graphics 630 (Mobile)
    Kernel driver in use: i915
01:00.0 VGA compatible controller: NVIDIA Corporation GP107M [GeForce GTX 1050 Mobile] (rev a1)
    Subsystem: Tongfang Hongkong Limited GP107M [GeForce GTX 1050 Mobile]
    Kernel driver in use: nvidia

I already have installed the latest recommended NVIDIA driver (probably several weeks ago!):

$ sudo ubuntu-drivers devices
== /sys/devices/pci0000:00/0000:00:01.0/0000:01:00.0 ==
modalias : pci:v000010DEd00001C8Dsv00001D05sd00001042bc03sc00i00
vendor   : NVIDIA Corporation
model    : GP107M [GeForce GTX 1050 Mobile]
driver   : nvidia-driver-390 - distro non-free
driver   : nvidia-driver-418 - distro non-free recommended
driver   : xserver-xorg-video-nouveau - distro free builtin

$ sudo apt-get install nvidia-driver-418
Reading package lists... Done
Building dependency tree       
Reading state information... Done
nvidia-driver-418 is already the newest version (418.56-0ubuntu1).
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.

$ prime-select query

However, System Settings | Details | About shows that the following graphics processor is active:

Intel® UHD Graphics 630 (Coffeelake 3x8 GT2)

I get the following output from nvidia-smi command:

$ nvidia-smi
Wed May 29 19:17:55 2019       
| NVIDIA-SMI 418.56       Driver Version: 418.56       CUDA Version: 10.1     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce GTX 1050    Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   33C    P8    N/A /  N/A |      2MiB /  4040MiB |      0%      Default |

| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|  No running processes found                                                 |

The GPU seems idle.

And the nvidia-settings command just displays a simple window like this:

(screenshot of the nvidia-settings window)

Considering all this, I believe that the Intel graphics card, and not the NVIDIA graphics card, is currently active in my system.

Purging and reinstalling the NVIDIA driver does not help.

Certainly I am missing something. But what?
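On Ubuntu, the active GPU on PRIME laptops is chosen with prime-select (from the nvidia-prime package), and the `prime-select query` output above appears to be empty. A hedged thing to try is setting the profile explicitly:

```shell
# Show the current PRIME profile (normally prints "nvidia" or "intel")
prime-select query

# Switch rendering to the NVIDIA GPU; a reboot (or at least logging
# out and back in) is required for the change to take effect
sudo prime-select nvidia
```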


#StackBounty: #drivers #nvidia #graphics #xorg #gpu Using P104-100 GPU on Ubuntu (as single or second GPU)

Bounty: 50

What I have:

GA-B250-Fintech motherboard

Ubuntu 18.04 (but I’m ready to reinstall to any version if it will work)

$ uname -r

And P104-100 video card.

$ lspci | grep NVIDIA
01:00.0 3D controller: NVIDIA Corporation GP104 [P104-100] (rev a1)

gcc --version
gcc (Ubuntu 7.4.0-9ubuntu1~18.04.york0) 7.4.0

The problem:
every time I install the NVIDIA drivers, Ubuntu stops loading, usually with the message:

started user manager uid 121.

If I take out the P104-100 and plug in a GTX 1060, everything works correctly:

~$ nvidia-smi
Mon May 13 22:56:17 2019       
| NVIDIA-SMI 418.74       Driver Version: 418.74       CUDA Version: 10.1     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce GTX 106...  Off  | 00000000:01:00.0 Off |                  N/A |
|  0%   42C    P8     6W / 120W |      0MiB /  6078MiB |      0%      Default |

| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|  No running processes found                                                 |

Then I tried plugging in the P104-100 as a second card in another slot, and Ubuntu again stops loading: it freezes with the message

Started GNOME Display Manager.

P.S. Using lshw -C display I found that both cards have
physical id: 0
Could that be the reason?

How can I install both cards on Ubuntu?
I think that if I successfully install the P104-100 as a single card, the same solution should work for installing both of them.

P.S. I found some suggestions about editing xorg.conf for use with multiple GPUs (for example), but they did not help.
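For what it's worth, the `physical id: 0` overlap in lshw is normal (it is relative to each card's bridge, not global); the PCI bus addresses from lspci (`01:00.0` etc.) are what distinguish the cards. Since the P104-100 is a headless mining card with no display outputs, one hedged approach is to pin Xorg to the display-capable card by BusID so the P104-100 is left for compute only. The values below are examples, written to /tmp for review first; take the real BusID from lspci before copying the file to /etc/X11/xorg.conf:

```shell
# Sketch of an xorg.conf Device section pinning the display to one card.
# BusID "PCI:1:0:0" corresponds to lspci address 01:00.0 (example only).
cat > /tmp/xorg.conf.sketch <<'EOF'
Section "Device"
    Identifier "DisplayGPU"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection
EOF
```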


#StackBounty: #boot #graphics #uefi What is safe graphics mode?

Bounty: 50

On Ubuntu 19.04 and all its official flavors, I observed that if I boot the LiveUSB in UEFI mode, there are options for the live session and for installation with “safe graphics” written beside them. However, these options aren’t available in legacy mode.

(photo of the boot menu showing the “safe graphics” options)

So, what is the purpose of this safe graphics mode? Is there any specific reason why it was included in 19.04 and not in previous releases such as 16.04, 18.04 and 18.10?


#StackBounty: #drivers #18.04 #graphics #intel-graphics Screen flickering as I type

Bounty: 50

I have a laptop with 18.04 installed; this problem also occurred when I used 16.04 before.

My screen blinks when I start typing. It blinks so briefly that I could not capture the moment of it, and the blinking also happens randomly, with no pattern at all.

I have tried suggestions from various sources, including:

  1. Updating the kernel. I am using kernel 4.19 now; before this I was using kernel 4.5 or 4.6, which did not work either.
  2. Updating the graphics driver, running the commands from https://askubuntu.com/a/215967. This doesn’t seem to work either; the flickering still happens.

How do I stop this from happening? It is not terribly distracting, because it lasts only a fraction of a second, but if I can stop it I would rather do so.

I don’t have a dedicated graphics card; I’m using Intel graphics at the moment. I think it’s Intel HD 3000.

sudo lshw -C video | grep product result.

sudo lshw -C video | grep product
product: 3rd Gen Core processor Graphics Controller

uname -r result.

$ uname -r

If you want the output of further commands, I’d be happy to include it.
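One experiment that might be worth a try (an assumption, not a confirmed fix for this laptop): typing-triggered flicker on older Intel chips is sometimes cured by the xf86-video-intel driver's TearFree option. The sketch writes the config to /tmp for review; copy it to /etc/X11/xorg.conf.d/ and restart X to test, and simply delete it again if it does not help:

```shell
# Sketch of a config snippet enabling TearFree for the intel driver
# (only applies when xf86-video-intel is actually in use)
cat > /tmp/20-intel.conf <<'EOF'
Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    Option     "TearFree" "true"
EndSection
EOF
```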



#StackBounty: #drivers #nvidia #graphics #video #gpu Video quality problem with Nvidia

Bounty: 50

I have a GTX 1060 6GB GPU on my desktop connected to a 4K monitor. There is no image quality or resolution error, but when there is fast movement in a video, the image becomes pixelated.

I installed the driver and CUDA toolkit 10.1 from the official website.

With a Radeon GPU on the same machine, the video plays smoothly with crisp images.

Do you have any hint as to what the source of this problem could be?

Mon Apr  1 15:17:38 2019       
| NVIDIA-SMI 418.39       Driver Version: 418.39       CUDA Version: 10.1     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce GTX 106...  On   | 00000000:42:00.0  On |                  N/A |
|  0%   52C    P0    28W / 120W |   1957MiB /  6077MiB |     25%      Default |

| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|    0      1560      G   /usr/lib/xorg/Xorg                           587MiB |
|    0      1782      G   /usr/bin/gnome-shell                         529MiB |
|    0     59194      G   ...quest-channel-token=6090805687932963293   557MiB |
|    0    101293      G   ...equest-channel-token=272009698120777591   280MiB |
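Pixelation that appears only during fast motion often points at tearing or missed vsync rather than a decoding fault. One hedged experiment is to force the NVIDIA driver's full composition pipeline; the setting resets at reboot unless it is saved to xorg.conf, so it is safe to test:

```shell
# Enable the full composition pipeline on the current mode; the
# "nvidia-auto-select" token is the driver's default mode name
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```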


#StackBounty: #drivers #graphics #amd-graphics #amdgpu #amdgpu-pro How to uninstall AMDGPU?

Bounty: 50

I have a fresh install of 18.04 on my desktop with a Radeon Vega 64 GPU. Since I installed the official AMD GPU driver, I have had a collection of problems.

For a week, the mouse would start lagging after a few hours. Now, the entire screen freezes after a few hours and there is no graphics output; if I turn the monitor off and on again, there is no signal. However, the machine still works as a web server.

I uninstalled the AMD GPU driver with the command,


but then Ubuntu does not boot into the graphical interface (it freezes before the purple screen).

I tried to blacklist the driver, according to this answer.

Again, Ubuntu does not come up.

It is frustrating, as my machine is practically useless because of this stupid driver. I am thinking of buying an Nvidia GPU.

Do you know a safe method to get rid of the AMD GPU driver? I don’t remember any of these problems with the fresh install of Ubuntu 18.04; I believe the native firmware worked much better.
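For what it's worth, the AMDGPU-PRO stack ships its own uninstall script, and with the graphical stack broken everything can be run from a text console (Ctrl+Alt+F3). A hedged sketch, since package names can vary between driver releases:

```shell
# Preferred: the uninstall script installed by the AMD driver package
amdgpu-pro-uninstall

# Fallback if the script is already gone: purge the packages and
# make sure the stock open-source stack is back in place
sudo apt purge 'amdgpu-pro*'
sudo apt install --reinstall xserver-xorg-video-amdgpu
sudo apt autoremove
```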
