Why does my OpenGL Version String not correspond with my core profile version string?

I am attempting to run some Unity3D games, which require "OpenGL core profile 3.2 or later for OpenGL Core renderer" (according to the games' own output).



As far as I know, I have OpenGL 3.3. However, my glxinfo output is very, very confusing:



glxinfo | grep "OpenGL"
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: llvmpipe (LLVM 6.0, 256 bits)
OpenGL core profile version string: 3.3 (Core Profile) Mesa 18.2.0-devel
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.1 Mesa 18.2.0-devel
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 18.2.0-devel
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
OpenGL ES profile extensions:


So I have a "core profile version" of 3.3 (more than Unity3D requires), but my plain "version" is only 3.1? Why is this, and what can I do about it?



I have an Intel Core i5-3320M, and am using Mesa 18.2 with the i915 driver.
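For reference, this is how I'm reading the number out of that output. A small sketch: the heredoc stands in for live output, so on a real system you would pipe `glxinfo` into the `sed` command instead:

```shell
# Pull the core profile version number out of glxinfo output.
# The heredoc below is the captured sample from above; on a live system:
#   glxinfo | sed -n 's/^OpenGL core profile version string: \([0-9.]*\).*/\1/p'
core_ver=$(sed -n 's/^OpenGL core profile version string: \([0-9.]*\).*/\1/p' <<'EOF'
OpenGL core profile version string: 3.3 (Core Profile) Mesa 18.2.0-devel
OpenGL version string: 3.1 Mesa 18.2.0-devel
EOF
)
echo "core profile: $core_ver"   # core profile: 3.3
```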







      asked May 23 at 23:22









Leo Tindall

1113
2 Answers























          up vote
          2
          down vote













The core profile version string reports the highest OpenGL version your card and drivers can provide in a core profile context. This is usually the number that matters.

You can generally ignore the plain OpenGL version string: it reports the compatibility profile version, which Mesa exposes at a lower version than the core profile.

However, the specific issue you are having seems to be that your system is actually using the llvmpipe software renderer rather than hardware acceleration for your GPU. The vendor string should be something like Intel Open Source Technology Center, with a renderer string like Mesa DRI Intel(R) Ivybridge Mobile. You may need to make sure the intel-microcode package is installed.
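One quick way to confirm this is to look for llvmpipe in the renderer string. A small sketch (the sample line stands in for real `glxinfo` output, which you would pipe in on a live system):

```shell
# Report whether glxinfo output (on stdin) shows the llvmpipe software
# renderer instead of a hardware driver.
check_renderer() {
  if grep -q "renderer string: llvmpipe"; then
    echo "software rendering (llvmpipe) in use"
  else
    echo "hardware rendering"
  fi
}

# Sample line from the question; on a live system: glxinfo | check_renderer
echo "OpenGL renderer string: llvmpipe (LLVM 6.0, 256 bits)" | check_renderer
```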






          • But then why do games that require 3.2 fail on my system, when I have 3.3?
            – Leo Tindall
            May 24 at 12:30










          • But I'm not running under a VM! I have Xubuntu 18.04 installed on a T430s.
            – Leo Tindall
            May 25 at 16:34











          • OK, so the problem is that the intel driver is not actually being used here, perhaps due to lack of microcode or such.
            – dobey
            May 25 at 20:02










• OK, I installed intel-microcode but, after rebooting, my system is still using llvmpipe.
  – Leo Tindall
  May 27 at 21:36
            edited May 25 at 20:01

























            answered May 24 at 2:25









dobey

31.8k33484
            up vote
            0
            down vote













Turns out this was due to a chain of bugs in the Ubuntu 18.04 upgrade process.

Bug 1, Bug 2, and Bug 3 caused any system with libegl installed to fall back to llvmpipe (software rendering).

libegl, in turn, was incorrectly installed on systems that didn't need it, because libnvidia-gl-390 depends on it.

libnvidia-gl-390 was incorrectly required on many systems. Uninstalling it fixed the issue.
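A sketch of checking for the package and suggesting the removal described above (assumes dpkg/apt on Ubuntu 18.04; the sample status line, including its version number, is hypothetical and stands in for real `dpkg -l` output):

```shell
# Suggest the fix above if libnvidia-gl-390 is installed.
# $1 should be the output of:  dpkg -l libnvidia-gl-390 2>/dev/null
suggest_fix() {
  if echo "$1" | grep -q "^ii"; then
    echo "run: sudo apt purge libnvidia-gl-390"
  else
    echo "libnvidia-gl-390 not installed"
  fi
}

# Sample dpkg status line (hypothetical version, for illustration only):
suggest_fix "ii  libnvidia-gl-390  390.48-0ubuntu3  amd64  NVIDIA OpenGL libraries"
```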






                answered May 27 at 22:43









Leo Tindall

1113