[lxc-users] How to share a dual nvidia cards between two LXC

Eric Espino eric.espino at rocketmail.com
Thu Sep 4 18:44:28 UTC 2014

Guillaume Thouvenin <guillaume.thouvenin at ...> writes:

> Hello,
>   I have a card with two nvidia GPUs. Currently I'm using it in one 
> LXC. I compiled the nvidia drivers from the official nvidia web site in 
> the container. I created /dev/nvidia0, /dev/nvidia1 and /dev/nvidiactl 
> devices into the container. From the container I can start an X server 
> on :0. Then I'm using TurboVNC and virtualGL to use the 3D graphics 
> capabilities of the card.
>   As I have two GPUs I'd like to dedicate one GPU to a container and 
> the other one to other container. My approach is to compile the nvidia 
> drivers in both containers, create /dev/nvidia0, /dev/nvidiactl into 
> one container and /dev/nvidia1, /dev/nvidiactl into the other 
> container. Then I should be able to start an X server in both 
> containers. The main problem I have is that both containers try to use 
> display :0 even if I start one with xinit -display :2
> So I'd like to know if this approach seems doable, and if people who 
> have already achieved this can share their configuration for cgroups, 
> tty and the nvidia devices.
> ...
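For anyone following along, the device nodes Guillaume describes can be created inside each container's rootfs with mknod. This is only a sketch of the usual setup: the nvidia character devices use major number 195, the per-GPU minors are 0 and 1, and nvidiactl conventionally uses minor 255 (check `ls -l /dev/nvidia*` on the host to confirm the numbers on your system). Run as root inside each container:

```
# Container A: first GPU only
mknod -m 666 /dev/nvidia0   c 195 0
mknod -m 666 /dev/nvidiactl c 195 255

# Container B: second GPU only
mknod -m 666 /dev/nvidia1   c 195 1
mknod -m 666 /dev/nvidiactl c 195 255
```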

Hello Guillaume:

Quick questions:
1. The original post was made in 2013; did you ever solve the problem of 
assigning each GPU to a different container?
2. Are the two GPUs interconnected with an SLI bridge?
3. Without a solution in place, what happens when you boot up a second 
container that requires a GPU (assuming that the first container also 
requires access to a GPU)?
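As a starting point for the cgroup side of the configuration Guillaume asked about, each container's LXC config could restrict access so that only its assigned GPU (plus nvidiactl) is visible. This is a hedged sketch using the LXC 1.x config keys; the minor numbers (0/1 for the GPUs, 255 for nvidiactl) are the usual nvidia defaults and should be verified against /dev on the host:

```
# Container A (in its lxc config file): deny all nvidia devices,
# then allow only /dev/nvidia0 and /dev/nvidiactl
lxc.cgroup.devices.deny  = c 195:* rwm
lxc.cgroup.devices.allow = c 195:0 rwm
lxc.cgroup.devices.allow = c 195:255 rwm

# Container B: same, but allow /dev/nvidia1 instead
lxc.cgroup.devices.deny  = c 195:* rwm
lxc.cgroup.devices.allow = c 195:1 rwm
lxc.cgroup.devices.allow = c 195:255 rwm
```

With only one GPU visible per container, each X server should then bind to its own card; the display-number clash (:0 vs :2) would still need to be handled via each X server's own config.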

