

Amnesia uses wrong Graphics Card
Steezy Offline
Junior Member

Posts: 10
Threads: 1
Joined: Sep 2010
Reputation: 0
#1
Solved: 8 Years, 1 Month ago Amnesia uses wrong Graphics Card

I posted this in another crash thread a couple of pages in, but I figured a dedicated thread wouldn't get overlooked as quickly and would be more useful for other people who have the same problem.

My problem is that the game uses my onboard Intel chip instead of my NVIDIA card, even though the NVIDIA card is displayed in the boot menu; when I look in the hpl.txt, the detected card is an "Intel® HD Graphics". How can I force the game to use the appropriate card?

Here are a couple of screenshots.

boot menu:

[Image: kevynb66.jpg]

error:

[Image: gdntiqly.jpg]

hpl.txt:
Quote:-------- THE HPL ENGINE LOG ------------
Engine build ID 20100818114615

Creating Engine Modules
--------------------------------------------------------
Creating graphics module
Creating system module
Creating resource module
Creating input module
Creating sound module
Creating physics module
Creating ai module
Creating gui module
Creating generate module
Creating haptic module
Creating scene module
--------------------------------------------------------

Initializing Resources Module
--------------------------------------------------------
Creating loader handlers
Creating resource managers
Adding loaders to handlers
--------------------------------------------------------

Initializing Graphics Module
--------------------------------------------------------
Setting video mode: 1024 x 768 - 32 bpp
Init Glew...OK
Setting up OpenGL
Vendor: Intel
Renderer: Intel® HD Graphics
Version: 2.1.0 - Build 8.15.10.2182
Max texture image units: 16
Max texture coord units: 8
Max user clip planes: 6
Two sided stencil: 1
Vertex Buffer Object: 1
Anisotropic filtering: 1
Max Anisotropic degree: 2
Multisampling: 1
Texture compression: 1
Texture compression S3TC: 1
Auto generate MipMaps: 1
Render to texture: 1
Max draw buffers: 8
Max color render targets: 8
Packed depth-stencil: 1
Texture float: 1
GLSL Version: 1.20 - Intel Build 8.15.10.2182
ShaderModel 2: 1
ShaderModel 3: 1
ShaderModel 4: 0
OGL ATIFragmentShader: 0
ATTENTION: System does not support const arrays in glsl!

I have an ASUS notebook with Windows 7, the aforementioned NVIDIA GT 325M, 4 GB RAM and a Core i5-450M.

Please, somebody look into this; I would love to finally be able to play the game.
09-10-2010, 10:03 AM
Cyborg34572 Offline
Junior Member

Posts: 30
Threads: 1
Joined: Sep 2010
Reputation: 0
#2
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

Have you tried disabling the onboard graphics chipset in Device Manager, so that your video card will be the one used?
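
If Device Manager gives you trouble, the same thing can be tried from an elevated command prompt with Microsoft's devcon tool. A rough sketch (VEN_8086 is Intel's PCI vendor ID; check the IDs the first command lists before disabling anything, and keep the enable line handy to undo it):

Quote:devcon hwids =display
devcon disable =display "*VEN_8086*"
devcon enable =display "*VEN_8086*"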
(This post was last modified: 09-10-2010, 12:50 PM by jens.)
09-10-2010, 11:14 AM
Steezy Offline
Junior Member

Posts: 10
Threads: 1
Joined: Sep 2010
Reputation: 0
#3
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

If I try to disable it in Device Manager, the screen turns black and I have to restart the notebook manually. After the reboot the card is listed as deactivated, but the game then shows a "GDI Generic" card in the boot menu instead of my NVIDIA (as far as I can tell that is Windows' built-in software renderer, which takes over when no real driver is active), and I get the same error when I try to start the game.

Here is the hpl.txt:
Quote:-------- THE HPL ENGINE LOG ------------
Engine build ID 20100818114615

Creating Engine Modules
--------------------------------------------------------
Creating graphics module
Creating system module
Creating resource module
Creating input module
Creating sound module
Creating physics module
Creating ai module
Creating gui module
Creating generate module
Creating haptic module
Creating scene module
--------------------------------------------------------

Initializing Resources Module
--------------------------------------------------------
Creating loader handlers
Creating resource managers
Adding loaders to handlers
--------------------------------------------------------

Initializing Graphics Module
--------------------------------------------------------
Setting video mode: 1024 x 768 - 32 bpp
Init Glew...OK
Setting up OpenGL
Vendor: Microsoft Corporation
Renderer: GDI Generic
Version: 1.1.0
Max texture image units: 3
Max texture coord units: 4
Max user clip planes: 6
Two sided stencil: 0
Vertex Buffer Object: 0
Anisotropic filtering: 0
Multisampling: 0
Texture compression: 0
Texture compression S3TC: 0
Auto generate MipMaps: 0
Render to texture: 0
Max draw buffers: 13
Max color render targets: 21
Packed depth-stencil: 0
Texture float: 0
GLSL Version: (null)
ShaderModel 2: 0
ShaderModel 3: 1
ShaderModel 4: 0
OGL ATIFragmentShader: 0
09-10-2010, 11:28 AM
Cyborg34572 Offline
Junior Member

Posts: 30
Threads: 1
Joined: Sep 2010
Reputation: 0
#4
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

Hmm, go to your laptop's website and download/install its video card drivers, and see if that helps.
(This post was last modified: 09-10-2010, 12:51 PM by jens.)
09-10-2010, 11:42 AM
Steezy Offline
Junior Member

Posts: 10
Threads: 1
Joined: Sep 2010
Reputation: 0
#5
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

Already tried updating my drivers; it didn't help. I think I need one of the devs to figure out how to force the game to use the right card.
09-10-2010, 12:54 PM
Thomas Offline
Frictional Games

Posts: 2,634
Threads: 184
Joined: Apr 2006
Reputation: 68
#6
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

See A.5:
http://frictionalgames.com/forum/thread-3754.html

Perhaps that can fix it!
09-10-2010, 01:42 PM
Steezy Offline
Junior Member

Posts: 10
Threads: 1
Joined: Sep 2010
Reputation: 0
#7
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

Uhm, besides the fact that you already suggested this approach and it didn't work the last time, you could read the responses above and start from there.

Again, deactivating my card doesn't work.

At first it shows the Intel card in the hpl.txt instead of my NVIDIA; after deactivating the Intel chip, it shows some "GDI Generic" thing as the used card in the hpl.txt.

You can see everything in my posts above, with screenshots & logs.

And there is no hardware switch to turn the card off, if that's the next thing you would suggest.

There has to be some other way. This really needs to be patched.
09-10-2010, 02:22 PM
Thomas Offline
Frictional Games

Posts: 2,634
Threads: 184
Joined: Apr 2006
Reputation: 68
#8
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

Did you try updating the drivers for the Intel card? (I could not see that you had written this.)

I am not sure this is anything I can fix in a patch, as I do not know of any reliable way of choosing the card at start-up.

If you are using a laptop, then perhaps contacting the support of the company that made it might help?
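
The only engine-side trick I have seen mentioned is an exported global that NVIDIA's dual-GPU drivers are supposed to check when deciding which card runs an application. A minimal sketch of the idea (the symbol name comes from NVIDIA's Optimus documentation, and I have not verified that current drivers honor it):

Quote:// In the game executable (not a DLL): exporting this global asks
// NVIDIA Optimus drivers to run the application on the discrete GPU.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

Until something like that ships in a patch, a per-application profile for Amnesia.exe in the NVIDIA control panel (a "preferred graphics processor" setting, if your driver version offers it) is probably the more practical workaround.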
09-10-2010, 03:03 PM
Mayron Offline
Junior Member

Posts: 14
Threads: 0
Joined: Feb 2008
Reputation: 0
#9
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

What about editing the config files in My Documents/Amnesia? I can't check right now, but I'm sure there were several config options, like the sound device etc. Maybe you can force it there?

Try changing the default display device in the BIOS. Disabling the Intel chip in Windows is one way, but the BIOS is the really sure way :D

. what has been seen, cannot be unseen .
09-10-2010, 03:54 PM
Steezy Offline
Junior Member

Posts: 10
Threads: 1
Joined: Sep 2010
Reputation: 0
#10
Solved: 8 Years, 1 Month ago RE: Amnesia uses wrong Graphics Card

Just checked my BIOS and looked for a default display device or something similar, but to be honest I couldn't find anything.

Here is my main config file, but I don't think there is anything in it to force the game to use the other card.

Quote:<Main ShowMenu="true" ExitMenuDirectly="false" SaveConfig="true" DefaultProfileName="dev_user" StartLanguage="english.lang" UpdateLogActive="false" />
<Graphics TextureQuality="0" TextureFilter="2" TextureAnisotropy="16.000000" Gamma="1.000000" Shadows="1" SSAOActive="true" SSAOSamples="16" ShadowQuality="2" ShadowResolution="2" WorldReflection="true" SSAOResolution="0" EdgeSmooth="true" ParallaxEnabled="true" ParallaxQuality="0" />
<Engine LimitFPS="true" />
<Screen Width="1024" Height="768" FullScreen="true" Vsync="false" />
<Physics PhysicsAccuracy="2" UpdatesPerSec="60.000000" />
<Sound Device="0" />


This is getting frustrating, but thanks for your time.
09-10-2010, 04:35 PM



