Why does my OpenGL Version String not correspond with my core profile version string?
I am attempting to run some Unity3D games, which require "OpenGL core profile 3.2 or later for OpenGL Core renderer" (according to the games' own output).

As far as I know, I have OpenGL 3.3. However, my glxinfo output is very, very confusing:

```
$ glxinfo | grep "OpenGL"
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: llvmpipe (LLVM 6.0, 256 bits)
OpenGL core profile version string: 3.3 (Core Profile) Mesa 18.2.0-devel
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.1 Mesa 18.2.0-devel
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 18.2.0-devel
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
OpenGL ES profile extensions:
```

So I have a "core profile version" of 3.3 (greater than what Unity3D requires), but my "version" is only 3.1? Why is this, and what can I do about it?

I have an Intel Core i5-3320M, and am using Mesa 18.2 with the i915 driver.
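For reference, the two version strings can be compared without the full extension dump. This is a minimal diagnostic sketch using only standard mesa-utils options (glxinfo's -B flag prints brief output); the exact strings will vary by machine:

```sh
# Brief glxinfo output: device info and version strings only, no extension lists
glxinfo -B

# Note: "direct rendering: Yes" alone is not proof of hardware acceleration;
# llvmpipe (a software rasterizer) also reports Yes, so check the
# renderer string as well
glxinfo | grep -E "direct rendering|renderer string"
```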
Tags: games, intel-graphics, opengl
asked May 23 at 23:22 by Leo Tindall
2 Answers
The core profile version is the version of GL you can use on your card with your drivers. That is usually what really matters.

The OpenGL version string you can just ignore; it is the compatibility profile version, which Mesa's drivers only implement up to an older version than the core profile.

However, the specific issue you are having seems to be that your system is actually using the llvmpipe software renderer rather than hardware acceleration for your actual GPU. The vendor string should be something like Intel Open Source Technology Center, with a renderer string like Mesa DRI Intel(R) Ivybridge Mobile. You may need to make sure you have the intel-microcode package installed.
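A minimal sketch of how to verify which driver and renderer are actually in use, relying only on standard tools (the Xorg log path can differ depending on how the display server is started):

```sh
# Kernel side: which driver is bound to the GPU?
# On this hardware, expect "Kernel driver in use: i915"
lspci -nnk | grep -iA3 "VGA"

# Userspace side: which renderer did Mesa select?
# An Intel string here means hardware acceleration; "llvmpipe" means software
glxinfo | grep "OpenGL renderer string"

# The Xorg log records which driver was loaded
# (may be ~/.local/share/xorg/Xorg.0.log when Xorg runs rootless)
grep -iE "intel|modeset|llvmpipe" /var/log/Xorg.0.log | head
```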
answered May 24 at 2:25, edited May 25 at 20:01, by dobey

But then why do games that require 3.2 fail on my system, when I have 3.3? – Leo Tindall, May 24 at 12:30

But I'm not running under a VM! I have Xubuntu 18.04 installed on a T430s. – Leo Tindall, May 25 at 16:34

OK, so the problem is that the intel driver is not actually being used here, perhaps due to lack of microcode or such. – dobey, May 25 at 20:02

OK, I installed intel-microcode but, after rebooting, my system is still using llvmpipe. – Leo Tindall, May 27 at 21:36
Turns out this was due to a chain of bugs in the Ubuntu 18.04 upgrade process.

Bug 1, Bug 2, and Bug 3 caused any system with libegl installed to fall back to llvmpipe, i.e. software rendering. libegl, in turn, was incorrectly installed on systems that didn't need it, because libnvidia-gl-390 depends on it, and libnvidia-gl-390 itself was incorrectly pulled in on many systems. Uninstalling libnvidia-gl-390 fixes the issue.
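A minimal sketch of the corresponding check and fix on Ubuntu 18.04, assuming the package name given above (libnvidia-gl-390):

```sh
# Check whether the stray NVIDIA GL package is installed
dpkg -l | grep libnvidia-gl

# If present, remove it (purge also deletes its configuration), then reboot
sudo apt purge libnvidia-gl-390
sudo reboot

# After the reboot, the renderer string should name the Intel GPU, not llvmpipe
glxinfo | grep "OpenGL renderer string"
```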
answered May 27 at 22:43 by Leo Tindall