Try the latest one from February/March 2008, which you can find in the forum
cheers
yes those work great for me also; try the il2setup tool and run with the maximum setting instead of assigning a custom option or video card
and don't forget to install the Nvidia tweak, this will help
Best settings for GTX series:
In my case: a GTX 280 and a C2D E6750 @ 3.6 GHz, with 4 GB DDR2 @ 900 MHz
I use the 130308 dll
The problem with Nvidia cards isn't the conf.ini or the dlls.
It is the individual settings in the graphics card's own driver.
I spent a lot of time with 2 different Nvidia cards in my IL-2 career to get the most out of them.
The standard settings applied in the driver are imho BS.
If needed, or if your graphics look like shit, I can have a look at my "old system with Nvidia" at home (8800 GTX; driver version not present at work here) and post my driver settings, as I changed back to ATI in my newest system (a change dedicated to IL-2).
You can also save a lot of FPS in the "modern AA" settings; there is almost no visible difference between 4x and 6x ingame anymore, only in FPS, so the decision was clear to me (note: using ATI now, don't know about the new AA settings on the GTX 280 series).
The dlls, by the way, are just an extension to the main IL-2 setup that lets you choose a conf.ini configuration.
But if peybolman is fine with his graphics on his NV system with a GTX 280, he could perhaps post his driver settings.
Referring to the conf.ini:
I use TexCompress=2 (on an HD 4890 / i7 / 12 GB RAM, blabla), as you don't notice the difference at all but it gives a lot of FPS back.
If you use the HD effects mod, I would change back to HardwareShaders=1 and Forest=2 (in the end you see no difference above 1000 m altitude).
If you customize your conf.ini by hand in Notepad, set VideoSetupId=17 (which means custom); otherwise the values will be changed back to VideoSetupId=2 on the next startup.
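To make that concrete, here is a minimal sketch of how those entries could look, assuming the usual [Render_OpenGL] section of an IL-2 1946 conf.ini (section name and placement may differ in your install; the arrows are just annotations for this post, don't paste them into the file):

[Render_OpenGL]
TexCompress=2      <- compressed textures: no visible difference, a lot of FPS back
HardwareShaders=1  <- set to 1 if you run the HD effects mod
Forest=2           <- reduced forest detail: no visible difference above ~1000 m
VideoSetupId=17    <- 17 = custom: stops the game resetting hand edits back to preset 2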
Hello again,
FOR NVIDIA GTX ....
Since Nvidia drivers v190.62, it looks like Threaded Optimization in the Nvidia OpenGL options finally works properly, so you can use both CPU cores if you have a C2D or a quad CPU. Before this version it did not work well; apparently this is due to a new OpenGL driver implementation. Remember to switch Threaded Optimization to On in the Nvidia OpenGL options. Also set ProcessAffinityMask=3 for 2-core CPUs and ProcessAffinityMask=15 for quad-core CPUs (see the sketch below).
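For anyone wondering where the 3 and 15 come from: ProcessAffinityMask is a plain bitmask where bit N enables core N, the same convention as the Windows SetProcessAffinityMask call. A quick sketch, assuming the entry sits in the [rts] section as in a typical IL-2 1946 conf.ini (check where it lives in your own file):

[rts]
ProcessAffinityMask=3

ProcessAffinityMask=1  -> binary 0001 -> core 0 only
ProcessAffinityMask=2  -> binary 0010 -> core 1 only
ProcessAffinityMask=3  -> binary 0011 -> cores 0 and 1 (dual core)
ProcessAffinityMask=15 -> binary 1111 -> cores 0 to 3 (quad core)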
The test:
Before, with Nvidia drivers v186.18: if Threaded Optimization was set to On, I had FPS problems and continuous stutters, no matter what ProcessAffinityMask was set.
If I switched Threaded Optimization to Off, there were no FPS problems or 2-second stutters, but Windows Task Manager showed 50% CPU usage. That meant one core worked at 100% for il2fb.exe and the other core sat at 0%, making a total of 50% CPU usage.
And here is the point: it made no difference what ProcessAffinityMask was set to. The only thing that changed was whether it was core 0 or core 1 doing the work for il2fb.exe.
After Nvidia drivers v190.62:
1st - With Threaded Optimization set to On in the Nvidia OpenGL options, I don't have any of the problems I had with v186.18 or earlier. Now it works fine: no FPS problems or 2-second stutters.
2nd - If you set ProcessAffinityMask=1 or ProcessAffinityMask=2 you still get 50% CPU usage as before (core 0 at 100%, core 1 at 0%), but if you set ProcessAffinityMask=3 or ProcessAffinityMask=15 you will see CPU usage go above 50%, usually staying around 70-90%. So now core 0 is at 100% for il2fb.exe and core 1 is at 60-80%, also for il2fb.exe.
PLEASE, TEST WITH ProcessAffinityMask=3 and check in Windows Task Manager (select Always On Top in the Task Manager options) what CPU % usage you get for il2fb.exe.