
primus's Introduction

primus

Primus is a shared library that provides OpenGL and GLX APIs and implements low-overhead local-only client-side OpenGL offloading via GLX forking, similar to VirtualGL. It intercepts GLX calls and redirects GL rendering to a secondary X display, presumably driven by a faster GPU. On swapping buffers, rendered contents are read back using a PBO and copied onto the drawable it was supposed to be rendered on in the first place. For more information, refer to [technotes.md](https://github.com/amonakov/primus/blob/master/technotes.md).

To use, install a distribution package or build from source, and launch applications through the primusrun wrapper script.
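
For example, assuming the common glxgears and glxinfo demo utilities are installed, a quick sanity check might look like:

    primusrun glxgears
    primusrun glxinfo | grep 'OpenGL renderer'

The renderer string should name the discrete GPU rather than the integrated one.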

In distributions

Building for multilib (32-bit + 64-bit) systems

LIBDIR=lib make && CXX=g++\ -m32 LIBDIR=lib32 make

Adjust the LIBDIR variables above as appropriate for your distribution (reflecting how the /usr/lib* directories are named):

  • Arch needs lib and lib32 as above

  • Gentoo needs lib64 and lib32

  • RPM-based may need lib64 and lib

  • Debian (with multiarch) needs lib/x86_64-linux-gnu and lib/i386-linux-gnu

  • Ubuntu (with multiarch) seems rather inconsistent. The dynamic linker expands $LIB to x86_64-linux-gnu/i386-linux-gnu (without lib/), but Nvidia drivers are installed into /usr/lib{,32}/nvidia-current. Something like the following is needed:

      export PRIMUS_libGLd='/usr/lib/$$LIB/mesa/libGL.so.1'
      LIBDIR=x86_64-linux-gnu make
      LIBDIR=i386-linux-gnu CXX=g++\ -m32 make
      unset PRIMUS_libGLd
    

    Starting from 13.04, Ubuntu needs the same LIBDIR paths as Debian (with the leading lib/); consequently, the lib/ in PRIMUS_libGLd above should be omitted, since $LIB presumably supplies it already.

    Furthermore, libnvidia-tls.so is not present in the default shared library search directories. Uncomment the corresponding line in primusrun.
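
For instance, on Debian or Ubuntu 13.04 with multiarch, the build invocations would become (a sketch following the list above; adjust paths to your system):

    LIBDIR=lib/x86_64-linux-gnu make
    CXX=g++\ -m32 LIBDIR=lib/i386-linux-gnu make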

Issues under compositing WMs

Since compositing hurts performance, invoking primus when a compositing WM is active is not recommended. If you need to use primus with compositing and see flickering or bad performance, synchronizing primus' display thread with the application's rendering thread may help (can anyone investigate why?):

PRIMUS_SYNC=1 primusrun ...

This makes primus display the previously rendered frame. Alternatively, with PRIMUS_SYNC=2 primus will display the latest rendered frame, trading frame rate for reduced visual latency.
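
For example, to compare the two modes with any offloaded command:

    PRIMUS_SYNC=1 primusrun ...   # display the previously rendered frame
    PRIMUS_SYNC=2 primusrun ...   # display the latest rendered frame (lower frame rate, reduced latency)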

FAQ

Q: Performance does not exceed 60 fps, I was getting more with optirun/VirtualGL.
A: This is the effect of vblank synchronisation. For benchmarking, you can use vblank_mode=0 primusrun ..., but in practice this will probably only waste power, as your LCD panel does not display more than 60 frames per second anyway.
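
For example, to benchmark without the vblank cap (assuming the glxspheres demo is installed):

    vblank_mode=0 primusrun glxspheres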

primus's People

Contributors

amonakov, andreas-schwab, bundyo, karolherbst, ralfjung, vincent-c

primus's Issues

Fullscreen application is black 1 frame out of 2 using KWin

It looks like it renders only 1 frame out of 2 when using KWin with desktop effects (XRender and OpenGL). I checked the option "suspend desktop effects for fullscreen windows" and it didn't fix the issue. I tried some games like Minecraft or Bit.Trip Runner, and all of them appear to have this bug.

Slow performance and freezes running CoDMW

I get the following unexpectedly low frames per second reported by the CoDMW application (arcade mission 'War Pig') running at 1600x900 under Ubuntu 12.10 (see also comments on Bumblebee-Project/Bumblebee#241):

primus: 15-16 fps
optirun: 25-32 fps

and I occasionally get graphics freezes of 1-5 seconds where the video stops but the game continues running in the background (e.g. when the video starts up again I sometimes find I have been shot).

primus does not report any profiling information for this application, and using PRIMUS_SYNC=1 (or 2) doesn't make any difference.

Looking at the CPU usage, it appears that under primus one of the cores is running at approx 40%, whereas under optirun it is running at around 90%.

I'm running on a 64 bit OS with primus's libGL.so.1 built for i386.

Firefox WebGL: segfault

If I try running Firefox with primusrun, it looks fine, but if I load any page requiring WebGL or access to the graphics card, it segfaults, whereas it runs correctly with optirun (VGL).

The terminal output is just:
Segfault (core dumped)

No relevant output in Xorg.8.log, nor in syslog.

Is there anything useful I can provide (something with gdb, for example)?

TF2 crash with multicore rendering

When using primusrun, TF2 will crash randomly with multicore rendering. It'll most likely crash at the end of the server connection, or when a player gets killed. It works fine with optirun.

Thread backtraces from gdb:
http://pastebin.com/2FNucqzh

Here is an example console output after the crash:
intel_do_flush_locked failed: Erreur d'entrée/sortie
client callback thread error
client callback thread error
Uploading dump (in-process) [proxy '']
/tmp/dumps/crash_20121206165044_1.dmp
pipe write 15679/4024863552 failed 32 - remote process seems dead
32success = yes
response: CrashID=bp-81532fb9-5676-4c88-99e8-ac93a2121206

The first line can be translated as:
intel_do_flush_locked failed: Input/Output error

And this crashes my desktop environment (Cinnamon); I am unable to restart it. If I try to, it always prints the same error (intel_do_flush_locked failed: Input/Output error) and just closes. I could restart it by making it run on the Nvidia card with "optirun cinnamon", though.

I am running Ubuntu 12.10, Nvidia 310.19 driver with the xorg-edgers PPA.

If I disable multicore rendering (mat_queue_mode 0), everything works fine, but I get lower framerates.

primus is capped by VSync from the intel card

$ primusrun glxspheres

Polygons in scene: 62464
Visual ID of window: 0xa2
Context is Direct
OpenGL Renderer: GeForce GT 525M/PCIe/SSE2
primus: sorry, not implemented: glXUseXFont
61.495434 frames/sec - 64.358661 Mpixels/sec
60.271691 frames/sec - 63.077941 Mpixels/sec
60.278291 frames/sec - 63.084849 Mpixels/sec
60.258595 frames/sec - 63.064235 Mpixels/sec

$ optirun glxspheres

Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce GT 525M/PCIe/SSE2
121.773730 frames/sec - 127.443515 Mpixels/sec
127.029911 frames/sec - 132.944424 Mpixels/sec
122.118546 frames/sec - 127.804385 Mpixels/sec
123.875380 frames/sec - 129.643018 Mpixels/sec

segfault

./primusrun glxinfo
loading /usr/lib64/opengl/nvidia/lib/libGL.so.1: 0x7f853b8fe000
loading /usr/lib/libGL.so.1: 0xfd5ab0
name of display: :0.0
glXCreateContext
Segmentation fault

sys libs:
mesa-master
xf86-video-intel-2.20.5 (on sna)
libdrm-2.4.39
libX11-1.5.0

uname -a:
Linux karols-gentoo 3.5.3-gentoo #1 SMP PREEMPT Mon Aug 27 00:17:22 CEST 2012 x86_64 Intel(R) Core(TM) i7-3610QM CPU @ 2.30GHz GenuineIntel GNU/Linux

Nvidia GPU: GT630M

do you need more information?
It would be nice to know which system you are using, so it is easier to look for errors, because on your system it seems to work.

EDIT:
This line is failing:

GLXContext actx = primus.afns.glXCreateNewContext(primus.adpy, *acfgs, GLX_RGBA_TYPE, shareList, direct);

Can't run primus with nvidia-experimental-310.

Hello,

I tried to run primus with nvidia-experimental-310 but I failed.
I changed PRIMUS_libGLa to: '/usr/lib/nvidia-experimental-310/libGL.so.1:/usr/lib32/nvidia-experimental-310/libGL.so.1', and tried to fire up primusrun.

primusrun glxgears
primus: fatal: failed to load any of the libraries: /usr/lib/nvidia-experimental-310/libGL.so.1:/usr/lib32/nvidia-experimental-310/libGL.so.1
libnvidia-tls.so.310.14: cannot open shared object file: No such file or directory
/usr/lib32/nvidia-experimental-310/libGL.so.1: wrong ELF class: ELFCLASS32

I don't know where the problem is, but the library libnvidia-tls.so.310.14 is present in /usr/lib/nvidia-experimental-310/.

Any thoughts?
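
By analogy with the libnvidia-tls note in the README above, one possible (untested) workaround would be to add the driver directories to the loader search path before invoking primusrun, for example:

    export LD_LIBRARY_PATH=/usr/lib/nvidia-experimental-310:/usr/lib32/nvidia-experimental-310${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}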

correct way to add primusrun to this sh file?

So I'm trying to get all my Steam games to run via primusrun.

Amnesia seems to use this:

#!/bin/sh

if [ "$(arch)" = "x86_64" ]; then
    ./checklibs.sh libs64 Amnesia.bin64
    ./Amnesia.bin64
else
    ./checklibs.sh libs Amnesia.bin
    ./Amnesia.bin
fi

What would be the correct way to add the command?
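
One plausible (untested) approach is to prefix the two game invocations inside the script with primusrun, i.e. replace ./Amnesia.bin64 and ./Amnesia.bin with:

    primusrun ./Amnesia.bin64
    primusrun ./Amnesia.bin

respectively. Since primusrun essentially just sets up environment variables and exec's its arguments, launching the whole launcher script through primusrun should work equally well, as the child processes inherit that environment.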

Sometimes black screen while playing Team Fortress 2 on native Steam Client

Hey Guys,

I have a little problem while playing Team Fortress 2 on my Linux box.
First I have to say that the problem doesn't exist when playing with the optirun command instead of primusrun.

Steps to reproduce:

  1. Start Steam with
    $ primusrun steam
  2. Play Team Fortress 2 for a while
  3. After a while the screen will go black for 1-10 secs
  4. After that the game will run normally for a few seconds
  5. Again black screen

System Specs:

Alienware M11xR3
Gentoo Linux
$ uname -a
Linux GentooM11x 3.7.1-gentoo #1 SMP PREEMPT Thu Dec 20 11:37:49 CET 2012 x86_64 Intel(R) Core(TM) i7-2637M CPU @ 1.70GHz GenuineIntel GNU/Linux
GPU: NVIDIA Corporation GF108 [GeForce GT 540M]
nvidia Driver Version: 310.19

If there's any way I can help you debug my system, let me know.

regards Leo

primusrun does not work on LFS system

Hello!
Bumblebee 3.0.1, Nvidia 313.18 (GeForce 620M + intel hd 3000), Mesa 9.0.2.
Primusrun does not work, but optirun is ok.
Some examples:

$ primusrun glxgears
Error: couldn't get an RGB, Double-buffered visual
$ primusrun glxgears -display :8
libGL error: failed to load driver: swrast
libGL error: Try again with LIBGL_DEBUG=verbose for more details.
glxgears: libglfork.cpp:502: void match_fbconfigs(Display*, XVisualInfo*, __GLXFBConfigRec***, __GLXFBConfigRec***): Assertion `ncfg' failed.
Aborted
$ optirun glxgears
4868 frames in 5.0 seconds = 973.558 FPS
4819 frames in 5.0 seconds = 963.719 FPS
5046 frames in 5.0 seconds = 1009.149 FPS
5071 frames in 5.0 seconds = 1014.160 FPS
$ glxgears 
Error: couldn't get an RGB, Double-buffered visual

primusrun:
http://pastebin.com/gH5c2SUB

Where is the problem?
Thanks!

Can't seem to get primus working on Fedora 16

After trying every way I can think of, I can't get primus to work even though optirun works well.
Any help for my F16 installation would be greatly appreciated. For anyone who has gotten it to work: what were your exact steps to get it working?

I fully updated everything and pulled the most recent copy of primus as of 12-16-2012:
removed the old bbswitch and am now running 0.5, also running bumblebee 3.0.1;
installed the packages listed below. Are there any others that should be installed or looked at?

Linux localhost 3.6.7-4.fc16.x86_64 #1 SMP Tue Nov 20 20:33:31 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

$ rpm -qa |grep kernel |grep 3.6.7
kernel-tools-3.6.7-4.fc16.x86_64
kernel-tools-libs-3.6.7-4.fc16.x86_64
kernel-devel-3.6.7-4.fc16.x86_64
kernel-headers-3.6.7-4.fc16.x86

$ rpm -qa |grep VirtualGL
VirtualGL-2.3.2-20121002.x86_64
VirtualGL-2.3.2-20121002.i386
VirtualGL-debuginfo-2.3.2-20121002.x86_64

$ rpm -qa |grep nvidia
kmod-nvidia-3.6.7-4.fc16.x86_64-304.64-1.fc16.x86_64
nvidia-xconfig-1.0-20.fc16.x86_64
nvidia-settings-1.0-22.fc16.x86_64
xorg-x11-drv-nvidia-304.64-3.fc16.x86_64
akmod-nvidia-304.64-1.fc16.x86_64
xorg-x11-drv-nvidia-libs-304.64-3.fc16.x86_64

$ rpm -qa |grep mesa
mesa-dri-drivers-7.11.2-3.fc16.x86_64
mesa-dri-drivers-dri1-7.11.2-3.fc16.x86_64
mesa-libEGL-devel-7.11.2-3.fc16.x86_64
mesa-libEGL-7.11.2-3.fc16.x86_64
mesa-libGLw-6.5.1-9.fc15.x86_64
mesa-libGL-7.11.2-3.fc16.i686
mesa-libGLES-devel-7.11.2-3.fc16.x86_64
mesa-libGL-devel-7.11.2-3.fc16.x86_64
mesa-libOSMesa-7.11.2-3.fc16.x86_64
mesa-dri-filesystem-7.11.2-3.fc16.i686
mesa-dri-drivers-7.11.2-3.fc16.i686
mesa-libGLES-7.11.2-3.fc16.x86_64
mesa-libGLU-7.11.2-3.fc16.x86_64
mesa-debuginfo-7.11.2-3.fc16.x86_64
mesa-libGL-7.11.2-3.fc16.x86_64
mesa-demos-7.10-5.20101028.fc16.x86_64
mesa-libGLw-debuginfo-6.5.1-9.fc15.x86_64
mesa-dri-filesystem-7.11.2-3.fc16.x86_64
mesa-libOSMesa-devel-7.11.2-3.fc16.x86_64
mesa-demos-debuginfo-7.10-5.20101028.fc16.x86_64
mesa-libGLU-devel-7.11.2-3.fc16.x86_64
mesa-libGLw-devel-6.5.1-9.fc15.x86_64
mesa-libGLU-7.11.2-3.fc16.i686

---------GPU TESTS-------

Intel i7 internal video card

$ vblank_mode=0 glxgears
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
24240 frames in 5.0 seconds = 4847.902 FPS
23846 frames in 5.0 seconds = 4769.086 FPS
24608 frames in 5.0 seconds = 4921.437 FPS

optirun - verified running against the Optimus GPU (nvidia-settings); the Nvidia card clocks up and gets warmer, and dmesg shows the card starting up.

$ optirun glxgears
927 frames in 5.0 seconds = 185.291 FPS
984 frames in 5.0 seconds = 196.531 FPS
995 frames in 5.0 seconds = 198.842 FPS
955 frames in 5.0 seconds = 190.843 FPS

Oddly, when primus can't find the primus lib it gets similar numbers to the Intel i7, which indicates to me it's falling back to the i7. This is confirmed in the nvidia-settings control panel by looking at the GPU temperature, clock speed and RAM usage: the card is sitting idle (also, dmesg DOESN'T show the card starting up).

Just running make in the primus directory seems to cause primus to use the Intel i7 GPU. Having tried every compile option I could think of, I can't seem to get primus to use the Nvidia GPU at all. The config below is the closest I could get to working: I could see the usual pause to load the Nvidia GPU, but the glxgears window would flash for a split second before throwing the following error.

primusrun as configured and compiled below:

$ ./primusrun glxgears
X Error of failed request: BadWindow (invalid Window parameter)
Major opcode of failed request: 137 (NV-GLX)
Minor opcode of failed request: 4 ()
Resource id in failed request: 0x200003
Serial number of failed request: 42
Current serial number in output stream: 42

contents of makefile and primusrun:

http://pastebin.com/eHCXx1Ui

Using primus with nouveau

Hi,

How should I configure primus to run it with nouveau? If I switch bumblebee to nouveau, primus does not work anymore, saying NV-GLX is missing.

Looking at the way I've built primus, it looks like it's only compatible with the Nvidia proprietary driver. How could I build it for nouveau support, or better, for both?

Multisampling AA limited by iGPU capabilities

Couldn't get Freespace 2 working unless I turned the AA down to 4x. It will run, but flickers constantly. Optirun can handle 8x and 16x (ignoring framerate).

FXAA seems to work fine.

Bumblebee + primusrun

When using primusrun, Bumblebeed writes this to syslog:
bumblebeed[1000]: Could not read data! Error: Connection reset by peer

Is this an error on the primus side or the Bumblebee side?

Slow performance on some video modes

I was getting consistently very slow performance on my Zenbook UX32VD (the FullHD one) of about 8-9 fps at 1920x1080 in Trine 2, regardless of the game settings. I tried changing the resolution to other values and still the performance was subpar. Then I discovered that if the resolution width was different from the standard resolutions (for instance if I set 1919x1079), FPS almost tripled to about 25. The same goes for lower resolutions.

Unfortunately this also affects fullscreen and I can't change the resolution there. Is this a known issue?

primus won't quit with bastion sometimes

After playing Bastion for some time, primus won't quit Bastion anymore (the background sound keeps playing after quitting).

I got this warning: "primus: warning: dropping a frame to avoid deadlock"

But sometimes I am getting also this error:
"X Error of failed request: BadWindow (invalid Window parameter)
Major opcode of failed request: 40 (X_TranslateCoords)
Resource id in failed request: 0x4000002
Serial number of failed request: 7050
Current serial number in output stream: 7050

Native stacktrace:

Segmentation fault"

but then Bastion is closed.

Packages for Ubuntu 12.10

I see your PPA only has Precise; any chance of packages for 12.10? I really dislike compiling, it's quite confusing.

primusrun fails to set higher resolution than available (while optirun works)

If I set up CoD MW2 so that:

  • winecfg is set to emulate a 1600x900 screen
  • CoD MW2 is set up for 1920x1080

and then I try running CoD MW2 with primusrun, the game freezes when it tries to set the resolution. There are no errors from primusrun, and only this error from wine:

err:d3d:context_create wglSwapIntervalEXT failed to set swap interval 1 for context 0x173dc1f0, last error 0x591

However, if I run this using optirun, it works - I just lose the window borders.

primusrun fails on Ubuntu 13.04

I just updated to Ubuntu 13.04, and primusrun no longer works correctly.

primusrun glxgears gives the following result:
Xlib: extension "NV-GLX" missing on display ":0".

TF2 gives an error about a missing extension.

It works fine if I do the following export though:
export PRIMUS_libGL=/usr/lib/x86_64-linux-gnu/primus/
(or i386-linux-gnu for TF2)

EDIT:

The problem might still be that $LIB is not working after all.

I could solve the problem by editing primusrun, and changing:
PRIMUS_libGL=${PRIMUS_libGL:-/usr/lib/'$LIB'/primus}
to
PRIMUS_libGL=${PRIMUS_libGL:-'/usr/lib/x86_64-linux-gnu/primus:/usr/lib/i386-linux-gnu/primus'}

Any idea why $LIB isn't working?

Wine (Crossover) 32bit game does not start on 64bit Ubuntu system

Hi

when I start primus with glxspheres everything works fine:

> LIBGL_DEBUG=verbose primusrun glxspheres
Polygons in scene: 62464
libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/tls/i965_dri.so
libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/i965_dri.so
libGL: Can't open configuration file /etc/drirc: No such file or directory.
libGL: Can't open configuration file /home/userhome/.drirc: No such file or directory.
Visual ID of window: 0xaf
libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/tls/i965_dri.so
libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/i965_dri.so
libGL: Can't open configuration file /etc/drirc: No such file or directory.
libGL: Can't open configuration file /home/userhome/.drirc: No such file or directory.
Context is Direct
OpenGL Renderer: NVS 5200M/PCIe/SSE2
primus: sorry, not implemented: glXUseXFont
61.318186 frames/sec - 68.431095 Mpixels/sec
59.876914 frames/sec - 66.822636 Mpixels/sec
59.874329 frames/sec - 66.819751 Mpixels/sec

But when I want to start the Limbo game, which uses Wine (Crossover) under the hood, it tries to load i386 versions of the Intel driver and fails:

> LIBGL_DEBUG=verbose primusrun /opt/limbo/launch-limbo.sh
libGL: OpenDriver: trying /usr/lib/i386-linux-gnu/dri/tls/i965_dri.so
libGL: OpenDriver: trying /usr/lib/i386-linux-gnu/dri/i965_dri.so
libGL error: dlopen /usr/lib/i386-linux-gnu/dri/i965_dri.so failed (/usr/lib/i386-linux-gnu/dri/i965_dri.so: cannot open shared object file: No such file or directory)
libGL: OpenDriver: trying ${ORIGIN}/dri/tls/i965_dri.so
libGL: OpenDriver: trying ${ORIGIN}/dri/i965_dri.so
libGL error: dlopen ${ORIGIN}/dri/i965_dri.so failed (${ORIGIN}/dri/i965_dri.so: cannot open shared object file: No such file or directory)
libGL: OpenDriver: trying /usr/lib/dri/tls/i965_dri.so
libGL: OpenDriver: trying /usr/lib/dri/i965_dri.so
libGL error: dlopen /usr/lib/dri/i965_dri.so failed (/usr/lib/dri/i965_dri.so: cannot open shared object file: No such file or directory)
libGL error: unable to load driver: i965_dri.so
libGL error: driver pointer missing
libGL error: failed to load driver: i965
primus: fatal: failed to acquire direct rendering context for display thread

Why is the Intel driver needed anyway? Primus should use the dedicated Nvidia graphics card, correct? Can anyone help?

Christoph

Support multiple directories in PRIMUS_libGL{a,d}

Currently, you have to pass an absolute path (defined as "first character is a slash") to PRIMUS_libGL{a,d}. The 32/64-bit "problem" was solved by using /usr/$LIB/nvidia/libGL.so.1, but this also has issues.

On Ubuntu (as you mentioned in the README), $LIB gets expanded to x86_64-linux-gnu or i386-linux-gnu (https://bugs.launchpad.net/ubuntu/+source/eglibc/+bug/993955) instead of the usual lib32 and lib64. It isn't even consistent with library paths, e.g. /usr/lib32/nvidia-current/libGL.so.1 vs /usr/lib/i386-linux-gnu/libGL.so (note host triplet vs plain lib32 name).

To work around these issues, I propose to augment the PRIMUS_libGL{a,d} variables with support for multiple files (or directories, in which case /libGL.so.1 is simply appended). These would then be separated by colons (:), as illustrated below.
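
For illustration, a multi-entry value might then look like this (hypothetical, mirroring the driver paths already used elsewhere on this page):

    export PRIMUS_libGLa='/usr/lib/nvidia-current/libGL.so.1:/usr/lib32/nvidia-current/libGL.so.1'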

Furthermore, if the name passed to dlopen contains a slash (instead of "begins with a slash"), it is already interpreted as a relative or absolute path instead of a name to be searched. This could be changed while we are at it.

Thoughts?

Add OpenGL 3.x and 4.x support

Hi, I really like your work, but is it possible to add support for creating OpenGL 3.x and 4.x contexts?
In fact, I use OpenGL > 3.2 with VirtualGL, but I would like to use primus!

The problem: glXGetProcAddressARB( "glXCreateContextAttribsARB" ) returns a NULL pointer :(
Can you add it?

Below, a test program, working fine with VirtualGL, and on PCs without Optimus too (compile with g++ main.cpp -o test -lGL -lX11)

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <GL/gl.h>
#include <GL/glx.h>

#include <iostream>


typedef GLXContext (*glXCreateContextAttribsARBProc)(Display*, GLXFBConfig, GLXContext, Bool, const int*);


int main()
{
    Display *display = XOpenDisplay(0);

    if (!display)
    {
        printf( "Failed to open X display\n" );
        exit(1);
    }

    static int visual_attribs[] =
    {
        GLX_X_RENDERABLE    , True,
        GLX_DRAWABLE_TYPE   , GLX_WINDOW_BIT,
        GLX_RENDER_TYPE     , GLX_RGBA_BIT,
        GLX_X_VISUAL_TYPE   , GLX_TRUE_COLOR,
        GLX_RED_SIZE        , 8,
        GLX_GREEN_SIZE      , 8,
        GLX_BLUE_SIZE       , 8,
        GLX_ALPHA_SIZE      , 8,
        GLX_DEPTH_SIZE      , 24,
        GLX_STENCIL_SIZE    , 8,
        GLX_DOUBLEBUFFER    , True,
        None
    };

    int glx_major, glx_minor;

    // FBConfigs were added in GLX version 1.3.
    if ( !glXQueryVersion( display, &glx_major, &glx_minor ) || ( ( glx_major == 1 ) && ( glx_minor < 3 ) ) || ( glx_major < 1 ) )
    {
        printf( "Invalid GLX version" );
        exit(1);
    }

    printf( "Getting matching framebuffer configs\n" );

    int fbcount;
    GLXFBConfig *fbc = glXChooseFBConfig( display, DefaultScreen( display ), 
    visual_attribs, &fbcount );

    if ( !fbc || fbcount < 1)
    {
        printf( "Failed to retrieve a framebuffer config\n" );
        exit(1);
    }



    GLXFBConfig bestFbc = fbc[0];



    // Get a visual
    XVisualInfo *vi = glXGetVisualFromFBConfig( display, bestFbc );

    XSetWindowAttributes swa;
    Colormap cmap;
    swa.colormap = cmap = XCreateColormap( display,RootWindow( display, vi->screen ), vi->visual, AllocNone );
    swa.background_pixmap = None ;
    swa.border_pixel      = 0;
    swa.event_mask        = StructureNotifyMask;


    printf( "Creating window\n" );
    Window win = XCreateWindow(display, 
                               RootWindow( display, vi->screen ), 
                               0, 0, 100, 100, 0, vi->depth, 
                               InputOutput, 
                               vi->visual, 
                               CWBorderPixel|CWColormap|CWEventMask, 
                               &swa );

    if (!win)
    {
        printf( "Failed to create window.\n" );
        exit(1);
    }

    // Done with the visual info data
    XFree( vi );

    XStoreName( display, win, "OpenGL Window" );
    XMapWindow( display, win );






    glXCreateContextAttribsARBProc glXCreateContextAttribsARB = 0;
    glXCreateContextAttribsARB = (glXCreateContextAttribsARBProc)
    glXGetProcAddressARB( (const GLubyte *) "glXCreateContextAttribsARB" );

    GLXContext ctx = 0;

    if (!glXCreateContextAttribsARB)
    {
        printf( "glXCreateContextAttribsARB() not found\n" );
        exit(1);
    }


    else
    {
        int context_attribs[] =
        {
            GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
            GLX_CONTEXT_MINOR_VERSION_ARB, 3,
            None
        };

        printf( "Creating context\n" );
        ctx = glXCreateContextAttribsARB( display, bestFbc, 0, True, context_attribs );


        XSync( display, False );
        if (!ctx)
        {
            printf( "Failed to create GL 3.3 context\n" );
            exit(1);
        }
    }



    glXMakeCurrent( display, win, ctx );


    const GLubyte* oglVersion = glGetString(GL_VERSION);
    std::cout << "OpenGL version : " << oglVersion << std::endl;



    // Frame 1 :

    glClearColor ( 0, 0.5, 1, 1 );
    glClear ( GL_COLOR_BUFFER_BIT );

    if( glGetError() != GL_NO_ERROR)
        std::cout << "Error OpenGL frame 1" << std::endl;

    glXSwapBuffers ( display, win );

    sleep( 1 );





    glXMakeCurrent( display, 0, 0 );
    glXDestroyContext( display, ctx );

    XDestroyWindow( display, win );
    XFreeColormap( display, cmap );
    XCloseDisplay( display );
}

Line 108 is the problem!

Thanks a lot for your support!

X_GLXVendorPrivate not supported?

Hi all!

I'm trying to run X-Plane by using primusrun, since it works well with Bumblebee and I would like to see if I could get some benefits from primus.

I've compiled from sources under Ubuntu 12.04, with Linux kernel 3.7 and a working bumblebee installation, using the instructions on the provided README file.
The test with glxspheres works properly; however, when I start X-Plane with this command line:

PRIMUS_libGL=${PRIMUS_libGL}:/usr/lib/nvidia-current:/usr/lib32/nvidia-current \
PRIMUS_SYNC=1 \
vblank_mode=0 \
primusrun ./X-Plane-i686

I'm getting this error:

Xlib:  extension "NV-GLX" missing on display ":0".
X Error of failed request:  GLXUnsupportedPrivateRequest
  Major opcode of failed request:  153 (GLX)
  Minor opcode of failed request:  16 (X_GLXVendorPrivate)
  Serial number of failed request:  44
  Current serial number in output stream:  46

Do you have any clue about this error? Perhaps I'm missing something, or simply the required functionality is not supported by primus?

Thanks Patrick

N.B. I've noticed an interesting reduction in CPU load when running glxspheres with primusrun vs running it with optirun. That seems really promising! ;-)

Segmentation fault when trying to run minetest

primusrun minetest 
primus: loading /usr/$LIB/nvidia-bumblebee/libGL.so.1
primus: loading /usr/$LIB/libGL.so.1
Irrlicht Engine version 1.7.3
Linux 3.5.4-1-ck #1 SMP PREEMPT Sat Sep 15 06:23:50 EDT 2012 x86_64
Creating X window...
Visual chosen: : 152
X Error: BadDrawable (invalid Pixmap or Window parameter)
From call : unknown
X Error: BadDrawable (invalid Pixmap or Window parameter)
From call : unknown
X Error: BadDrawable (invalid Pixmap or Window parameter)
From call : unknown
Using renderer: OpenGL 4.2.0
GeForce GT 525M/PCIe/SSE2: NVIDIA Corporation
OpenGL driver version is 1.2 or better.
GLSL version: 4.2
Loaded texture: /usr/share/minetest/textures/base/pack/menubg.png
Loaded texture: /usr/share/minetest/textures/base/pack/menulogo.png
Segmentation fault

Trying to use Nuke (7.0v4) with primus causes crash

Trying to launch Nuke (made by TheFoundry) with primusrun causes it to crash:

 primusrun ./Nuke7.0 
Nuke 7.0v4, 64 bit, built Jan 18 2013.
Copyright (c) 2013 The Foundry Visionmongers Ltd.  All Rights Reserved.
Fontconfig warning: "/etc/fonts/conf.d/50-user.conf", line 9: reading configurations from ~/.fonts.conf is deprecated.
Vertex shader for simpleShaderProg (MainVertexShader & PositionOnlyVertexShader) failed to compile
Fragment shader for simpleShaderProg (MainFragmentShader & ShockingPinkSrcFragmentShader) failed to compile
QGLShaderProgram: shader programs are not supported 
Segmentation fault (core dumped)

Nuke does run with optirun.

Running Linux Mint 14 64bit, nVidia 630m, 310.14 drivers

Bash completion

Is it possible to make bash completion available for primusrun? Even though primusrun doesn't actually take any options, it's still handy to be able to auto-complete commands, just like when using optirun. Thanks!

amd/intel setup

Hello,
I am trying to use primus with an Intel/AMD setup:

00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 12)
01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Madison [Radeon HD 5000M Series]

I use open source radeon driver(xf86-video-ati 1:7.0.0).

  • glxinfo works, seems like it returns valid info: direct rendering yes, OpenGL version 3.0 (Intel supports only 2.1)
  • glxgears works.
  • minecraft
    crashes with xmlconfig.c:1033: driQueryOptioni: Assertion cache->info[i].name != ((void *)` (which seems to be in mesa)
  • unigine-heaven (oilrush):
    works! fps rises from 5 (intel) to 17.
  • simple OpenGL test from the Qt Creator examples (slightly modified to use shaders):
    QGLShaderProgram: shader programs are not supported

Program received signal SIGSEGV, Segmentation fault.

primusrun gdb -batch -ex run -ex bt glxspheres
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
Polygons in scene: 62464
Visual ID of window: 0x94
Context is Indirect
[New Thread 0x7ffff0627700 (LWP 2742)]
[New Thread 0x7fffefe26700 (LWP 2743)]
OpenGL Renderer: GeForce GT 540M/PCIe/SSE2
primus: sorry, not implemented: glXUseXFont

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7fffefe26700 (LWP 2743)]
0x00007ffff70aede4 in ?? () from /lib/x86_64-linux-gnu/libc.so.6
#0 0x00007ffff70aede4 in ?? () from /lib/x86_64-linux-gnu/libc.so.6
#1 0x00007ffff593dd54 in ?? () from /usr/lib/nvidia-current/libGL.so.1
#2 0x00007ffff5950bc3 in ?? () from /usr/lib/nvidia-current/libGL.so.1
#3 0x00007ffff7bc6cb6 in ?? () from /usr/lib/x86_64-linux-gnu/primus/libGL.so.1
#4 0x00007ffff6d4ae9a in start_thread () from /lib/x86_64-linux-gnu/libpthread.so.0
#5 0x00007ffff7053cbd in clone () from /lib/x86_64-linux-gnu/libc.so.6
#6 0x0000000000000000 in ?? ()

Fixed: I tried to configure primus to work with nvidia-experimental-310, but forgot one line.
It's all good now.

Trying to use RV with primus causes crash

RV (from Tweak Software) causes a crash when trying to launch with primus:

primusrun rv
/usr/local/tweak/rv-Linux-x86-64-3.12.19/bin/rv.bin
Version 3.12.19, built on Nov 26 2012 at 13:53:00 (64bit). (L)
Copyright (c) 2008-2011 Tweak Software. All rights reserved.
Fontconfig warning: "/etc/fonts/conf.d/50-user.conf", line 9: reading configurations from ~/.fonts.conf is deprecated.
/usr/local/tweak/rv-Linux-x86-64-3.12.19/bin/rv.bin: symbol lookup error: /usr/local/tweak/rv-Linux-x86-64-3.12.19/bin/rv.bin: undefined symbol: glGetProgramivARB

Same machine/specs as other crash error.

Not finding libGL or libnvidia-tls on Fedora

Currently trying to compile on Fedora, though I've also had problems with the openSUSE package that is provided in rpm form.

As the README states, my two lib directories are lib64 and lib, so I changed the command to read:
LIBDIR=lib64 make && CXX=g++\ -m32 LIBDIR=lib make

Here's the full output of that command if you need it:

$ LIBDIR=lib64 make && CXX=g++\ -m32 LIBDIR=lib make
mkdir -p lib64
g++ -Wall -g -Werror=missing-declarations -DBUMBLEBEE_SOCKET='/var/run/bumblebee.socket' -DPRIMUS_SYNC='0' -DPRIMUS_VERBOSE='1' -DPRIMUS_DISPLAY=':8' -DPRIMUS_LOAD_GLOBAL='libglapi.so.0' -DPRIMUS_libGLa='/usr/$LIB/nvidia-bumblebee/libGL.so.1' -DPRIMUS_libGLd='/usr/$LIB/libGL.so.1' -fvisibility=hidden -fPIC -shared -Wl,-Bsymbolic -o lib64/libGL.so.1 libglfork.cpp -lX11 -lpthread -lrt
libglfork.cpp:850:2: warning: #warning Enabled workarounds for applications demanding more than promised by the OpenGL ABI [-Wcpp]
mkdir -p lib
g++ -m32 -Wall -g -Werror=missing-declarations -DBUMBLEBEE_SOCKET='/var/run/bumblebee.socket' -DPRIMUS_SYNC='0' -DPRIMUS_VERBOSE='1' -DPRIMUS_DISPLAY=':8' -DPRIMUS_LOAD_GLOBAL='libglapi.so.0' -DPRIMUS_libGLa='/usr/$LIB/nvidia-bumblebee/libGL.so.1' -DPRIMUS_libGLd='/usr/$LIB/libGL.so.1' -fvisibility=hidden -fPIC -shared -Wl,-Bsymbolic -o lib/libGL.so.1 libglfork.cpp -lX11 -lpthread -lrt
libglfork.cpp:850:2: warning: #warning Enabled workarounds for applications demanding more than promised by the OpenGL ABI [-Wcpp]

After running glxgears:

$ ./primusrun glxgears
primus: fatal: failed to load any of the libraries: /usr/$LIB/nvidia-bumblebee/libGL.so.1
libnvidia-tls.so.310.19: cannot open shared object file: No such file or directory
[robert@Robert-Laptop primus]$ 

contents of /usr/lib/nvidia-bumblebee:

$ ls /usr/lib/nvidia-bumblebee/
libcuda.so         libGL.so.310.19               libnvidia-encode.so.1       libnvidia-opencl.so.1       libOpenCL.so.1.0.0  tls
libcuda.so.1       libnvcuvid.so                 libnvidia-encode.so.310.19  libnvidia-opencl.so.310.19  libvdpau_nvidia.so  vdpau
libcuda.so.310.19  libnvcuvid.so.1               libnvidia-glcore.so.310.19  libnvidia-tls.so.310.19     libvdpau.so
libGL.la           libnvcuvid.so.310.19          libnvidia-ml.so             libOpenCL.so                libvdpau.so.1
libGL.so           libnvidia-compiler.so.310.19  libnvidia-ml.so.1           libOpenCL.so.1              libvdpau.so.310.19
libGL.so.1         libnvidia-encode.so           libnvidia-ml.so.310.19      libOpenCL.so.1.0            libvdpau_trace.so

Contents of /usr/lib64/nvidia-bumblebee:

$ ls /usr/lib64/nvidia-bumblebee/
libcuda.so         libGL.so.310.19               libnvidia-encode.so.1       libnvidia-opencl.so.1       libOpenCL.so.1.0.0  tls
libcuda.so.1       libnvcuvid.so                 libnvidia-encode.so.310.19  libnvidia-opencl.so.310.19  libvdpau_nvidia.so  vdpau
libcuda.so.310.19  libnvcuvid.so.1               libnvidia-glcore.so.310.19  libnvidia-tls.so.310.19     libvdpau.so         xorg
libGL.la           libnvcuvid.so.310.19          libnvidia-ml.so             libOpenCL.so                libvdpau.so.1
libGL.so           libnvidia-compiler.so.310.19  libnvidia-ml.so.1           libOpenCL.so.1              libvdpau.so.310.19
libGL.so.1         libnvidia-encode.so           libnvidia-ml.so.310.19      libOpenCL.so.1.0            libvdpau_trace.so

Optirun reportedly runs everything fine as well (using the nvidia drivers):

$ optirun glxinfo
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: VirtualGL
server glx version string: 1.4
server glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_SGI_make_current_read, GLX_SGIX_fbconfig, 
    GLX_SGIX_pbuffer, GLX_SUN_get_transparent_index, GLX_ARB_create_context
client glx vendor string: VirtualGL
client glx version string: 1.4
client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_SGI_make_current_read, GLX_SGIX_fbconfig, 
    GLX_SGIX_pbuffer, GLX_SUN_get_transparent_index, GLX_ARB_create_context
GLX version: 1.4
GLX extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_SGI_make_current_read, GLX_SGIX_fbconfig, 
    GLX_SGIX_pbuffer, GLX_SUN_get_transparent_index, GLX_ARB_create_context
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GT 630M/PCIe/SSE2
OpenGL version string: 4.3.0 NVIDIA 310.19
OpenGL shading language version string: 4.30 NVIDIA via Cg compiler

I originally installed nvidia-bumblebee from this source:
http://techies.ncsu.edu/wiki/bumblebee-nvidia

I hope that's enough information; I sincerely hope it's not as trivial as the error seems to be.

mesa is loaded by runtime dlopen calls

primus doesn't work with libGL.so loaded by dlopen calls.

This could be solved with an overridden dlopen function within primus itself. Because in general I don't like this idea, we should find other ways to handle this.

This occurs for many important libraries, for example wine, mono and SDL.

I will build a little demo for this issue later.

Games occasionally showing black screen only

Amnesia is not always starting properly when using primusrun: every once in a while (usually only the first time I try after booting the machine), the entire screen is blacked out. After some time, I can hear the menu sound, but nothing is shown. I have to switch to a vtty and kill it from the outside.
The primus output shows that all (or at least, really many) frames are "dropped to prevent deadlock". There's not a single line reporting display fps, but there are a bunch of render fps lines.

I've also seen TrackMania (in wine) showing the same behaviour once, but did not check the primus output then - and I never saw that before. I don't know if that was the same issue.

World of Warcraft under Wine with the opengl renderer fails to detect compatible graphics card

Hi,
Running WoW under Wine with the OpenGL renderer (with the -opengl command-line argument) with primusrun results in the game giving the following error:
"Your 3D accelerator card is not supported by World of Warcraft. Please install a 3D accelerator card with dual-TMU support."

The game's d3d renderer works just fine, and with optirun both renderers work fine.
Amonakov asked on IRC that I add the output of "strings -a Wow.exe | grep -i wgl"
so here it is: pastebin.com/KXtZitr9

Attempting to use Viewport 2.0 in Maya causes crash

Switching to Viewport 2.0 ("Renderer" menu --> "Viewport 2.0") causes Maya to crash

 primusrun maya
Fontconfig warning: "/etc/fonts/conf.d/50-user.conf", line 9: reading configurations from ~/.fonts.conf is deprecated.

Xlib:  extension "NV-GLX" missing on display ":0".
primus: sorry, not implemented: glXUseXFont

maya encountered a fatal error

Signal: 11 (Unknown Signal)
Stack trace:
  /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f2f492e44a0]
  OGSMayaVramQuery::queryVramOGL()
  OGSMayaVramQuery::OGSMayaVramQuery()
  OGSMayaVramQuery::getInstance()
  OGSMayaVramQuery::queryVram()
  OGSRenderer::getTotalGPUMemory()
  OGSRenderer::initializeOGSDevice(OGS::Objects::UString*, int)
  OGSMayaRenderer::initialize(bool, unsigned int, int, void*, void*, void*, bool)
  OGSMayaBridge::CreateOGSRenderer()
  OGSMayaBaseRenderer::initialize()
  OGSViewportRenderer::initialize()
  TidleRefreshCmd::refreshOGSRender(T3dView*, bool, bool, TdisplayAppearance, bool, bool, bool)
  TidleRefreshCmd::refresh(T3dView*, TdisplayAppearance, bool, bool, bool, bool, TrefreshType, bool, bool)
  TidleRefreshCmd::doIt(Tevent const&)
  TeventHandler::doIdles()
  QObject::event(QEvent*)
  QidleTimer::event(QEvent*)
  QApplicationPrivate::notify_helper(QObject*, QEvent*)
  QApplication::notify(QObject*, QEvent*)
  QmayaApplication::notify(QObject*, QEvent*)
  QCoreApplication::notifyInternal(QObject*, QEvent*)
...

Primus and VirtualBox update

Hi there. I just read Bumblebee-Project/Bumblebee#278 and was testing some things based on that discussion. I noticed via dmesg that bbswitch was powering up and then quickly powering the Nvidia card back down when I was trying ' primusrun vboxmanage startvm "Windows XP" ' and virtualbox --startvm "Windows XP". That post mentions root is needed to make it work, so I tried some tricks with symbolic links that for whatever reason didn't work. So I imported the vbox disk... sudo virtualbox. I added root to the bumblebee, video, audio groups, etc. - probably some things not needed, but for experiment's sake to cover all bases. This is what I found:

[ 41.597638] bbswitch: enabling discrete graphics
[ 42.095936] pci 0000:01:00.0: power state changed by ACPI to D0
[ 42.234697] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=none,decodes=none:owns=none
[ 42.234903] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 310.19 Thu Nov 8 00:52:03 PST 2012
[ 44.123729] NVRM: GPU at 0000:01:00: GPU-025de124-7fa3-daeb-b978-a80a7ff395c5
[ 50.043047] EMT-0[943]: segfault at 7fcff4424718 ip 00007fcff7d8a3e3 sp 00007fcfdcf70be0 error 4 in ld-2.16.so[7fcff7d81000+f000]
[ 94.369910] bbswitch: disabling discrete graphics
[ 94.382319] pci 0000:01:00.0: Refused to change power state, currently in D0
[ 94.382716] pci 0000:01:00.0: power state changed by ACPI to D3cold

Optirun works, however. (It's funny, I posted a while back that I couldn't get optirun to work with Windozey things).

I'm not sure how to diagnose the segfault so I just thought I'd offer the findings here.

Primus uses Intel graphics

> glxspheres 
Polygons in scene: 62464
Visual ID of window: 0xa2
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) Sandybridge Mobile 
60.110172 frames/sec - 67.082952 Mpixels/sec
60.112984 frames/sec - 67.086091 Mpixels/sec
60.115761 frames/sec - 67.089190 Mpixels/sec
60.134923 frames/sec - 67.110574 Mpixels/sec
^C
> optirun glxspheres 
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce GT 540M/PCIe/SSE2
104.932059 frames/sec - 117.104178 Mpixels/sec
108.305792 frames/sec - 120.869264 Mpixels/sec
105.142943 frames/sec - 117.339525 Mpixels/sec
^C[ 2781.569451] [WARN]Received Interrupt signal.
> primusrun glxspheres 
Polygons in scene: 62464
Visual ID of window: 0xa2
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) Sandybridge Mobile 
60.365576 frames/sec - 67.367983 Mpixels/sec
60.114843 frames/sec - 67.088165 Mpixels/sec
60.122392 frames/sec - 67.096589 Mpixels/sec
^C

X Error when application quits

This error shows up for some, but not all applications, and I can't make it reproducible: When the application quits, this (or a similar) error is shown on the console:

X Error of failed request: BadDrawable (invalid Pixmap or Window parameter)
Major opcode of failed request: 137 (DRI2)
Minor opcode of failed request: 7 (DRI2GetBuffersWithFormat )
Resource id in failed request: 0x540005d
Serial number of failed request: 3300
Current serial number in output stream: 3300

This does not happen when I use optirun instead of primusrun.

Problem running primusrun

I'm finally (at last) trying primusrun, but after managing to build it, I'm currently facing this error:

./primusrun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Xlib:  extension "NV-GLX" missing on display ":0".
X Error of failed request:  BadAlloc (insufficient resources for operation)
  Major opcode of failed request:  154 (GLX)
  Minor opcode of failed request:  3 (X_GLXCreateContext)
  Serial number of failed request:  25
  Current serial number in output stream:  26

Anything I've missed, or an explanation for that?

crash with oil-rush

Today I tried oil-rush with primus, but I am getting this error while the game is loading (a black window appears, but nothing more):

X Error of failed request: BadDrawable (invalid Pixmap or Window parameter)
Major opcode of failed request: 151 (DRI2)
Minor opcode of failed request: 8 (DRI2SwapBuffers )
Resource id in failed request: 0x4600002
Serial number of failed request: 29
Current serial number in output stream: 30

I tried, but I failed

I like the idea of this, but maybe I am too silly to try it out (keep in mind that I have a working bumblebee configuration here):

./primusrun glxinfo
loading /usr/lib/nvidia-bumblebee/libGL.so.1: 0x7f8f7c5af000
glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): 
glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"): glXGetProcAddress("glDrawPixels"):

is all I get

sys libs:
mesa-master
xf86-video-intel-2.20.5 (on sna)
libdrm-2.4.39

uname -a:
Linux karols-gentoo 3.5.3-gentoo #1 SMP PREEMPT Mon Aug 27 00:17:22 CEST 2012 x86_64 Intel(R) Core(TM) i7-3610QM CPU @ 2.30GHz GenuineIntel GNU/Linux

Nvidia GPU: GT630M

Do you need more information?

Support for VDPAU?

I'm finding that Primus is too similar to VirtualGL.

That is, they share the same flaw. The transport method only recognizes GLX calls.

VDPAU will not work unless the app window that requires it is rendered on the Nvidia X server before being displayed on the Intel X server.

The solution is buffer sharing such as dma_buf, but this is currently not possible with the proprietary driver.

What does exist are CPU-bound solutions such as hybrid-windump, which provide a crude way to achieve VDPAU, but the windowing is really awful.

Would it be possible to process calls for libvdpau using a similar method?

Warcraft III Frozen Throne crashes when changing resolution

When changing the resolution of Warcraft III Frozen Throne (using wine 1.5.13) in primusrun, the game crashes. The console shows:

X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 11 (X_GLXSwapBuffers)
Serial number of failed request: 1037
Current serial number in output stream: 1038

With optirun, changing the resolution is working fine.
