|
Post by lala on Nov 15, 2007 19:38:20 GMT
Wow, I was surprised to read about some differences for games using DX10. I was just reading a review at Hexus of recent good cards being tested with DX10. The surprise for me is that DX10 improves the graphics slightly, but at the cost of nearly halving the framerate. Seems that SLI and Crossfire are going to be needed for the latest DX10 games on large monitors. Scroll down the page at the link to compare the DX9 and DX10 performance: www.hexus.net/content/item.php?item=10415&page=9
Crysis had a really low framerate on the demo.
Cheers Lala
|
|
|
Post by chirikov on Nov 16, 2007 2:42:26 GMT
they only had one game that was tested in both dx9 and dx10. the evidence is not conclusive that dx10 is at fault here; it could be poor programming on the developer's part.
the low framerates aren't as important as the fact that they are incredibly variable anyway, imo. according to their intro, they had to run the tests 'countless times' to come up with numbers.
|
|
|
Post by lala on Nov 16, 2007 12:24:12 GMT
"they only had one game that was tested in both dx9 and dx10. evidence is not conclusive that dx10 is at fault here; it could be poor programming on the developers part. the low framerates arent as important as the fact that they are incredibly variable anyways imo. according to their intro, they had to run tests 'countless times' to come up with numbers"

Actually, you're totally wrong there. Four games, one of which only had a partial map to play (Crysis), and if you read the article, to get a consistent result they had to set up the benchmark system for a countless number of runs. One game was done on both DX9 and DX10, while the other three were pure DX10. And they ran the games so many times that you do get an average, so it's not, as you say, "so variable" as to be meaningless. Re-run a test often enough and you have enough data to provide a result in this case. Hexus is well respected by those who work at the top end of computer technology; they do know what they are doing.

Just a quick quote to show you did not read the bit of the article you're quoting: notice they also say "eliminated the majority of the variance" and "four games".

Edit: The reason for my initial post is that what's quite noticeable at the moment is that both AMD and NVIDIA are running substantially slower with DX10 for the four games around today. So we are seeing DX10 games, but both companies are having performance issues that may in time be improved (by them and Microsoft). The question, though, is how much the performance will improve, especially if you like using 24+ inch monitors. One can no longer assume that a single near-top-of-the-range gfx card will run DX10 well, and the cost of playing with the best visuals now needs to truly take Crossfire and SLI more into consideration.

Cheers Lala
|
|
|
Post by Enrith on Nov 16, 2007 15:22:02 GMT
I bought Crysis yesterday and I must say the graphics kick ass!! I'm running it on a DX10 NVIDIA card and an Intel dual core with Vista as the OS. I'm not noticing any major drop in fps, and the game runs smoothly. The test might prove that DX10 runs slower, but when you sit down and play you don't notice it at all; the gfx is INSANE and the level of detail is amazing. Enrith
|
|
|
Post by lala on Nov 16, 2007 17:14:33 GMT
"I bought Crysis last day and I must say the graphics kick ass!! I'm running it on a DX10 NVIDEA card and Intel dual core with Vista as OS. I'm not noticing any major drop of fps and the game runs smooth. The test might prove that DX10 runs slower but when you sit down and play you don't notice it at all, the gfx is INSANE and the level of details is amazing."

What resolution and detail settings do you use, En? Also, I'd love to know the gfx card (hoping it's not a top-range NVIDIA, or that detail settings don't have to be high on DX10 for it to look great). Just curious, as I am looking to use an above-24-inch monitor.

Cheers bud Lala
|
|
|
Post by lala on Nov 16, 2007 17:56:47 GMT
Just been reading other forums and reviews; seems you are a bit lucky there, En, as many are having performance way below a nice 45+ frames per second.
Cheers Lala
|
|
|
Post by Enrith on Nov 16, 2007 18:40:59 GMT
Yeah, I have read that as well, lala, and what I meant to say is that the drop in fps is not very noticeable when you sit down and play the game; it still runs smoothly. At least for me. I'm running Crysis on an 8800 GTS card with 640MB of RAM on the gfx card. Res at 1600 x 1200 and all settings on High (Very High is max), using a 22" widescreen monitor. Also went with 8x anti-aliasing, where 32 or 64 is max, I don't remember. I'll take a look at what fps I'm running at when I get home from work. And I didn't mean to say you are wrong, lala; I simply meant that the drop in fps you talk about is not something very noticeable for the average player. Enrith
|
|
|
Post by chirikov on Nov 16, 2007 19:32:48 GMT
your original statement read "The surprise for me is that DX10 improves the graphics slightly but at the cost of nearly halving the framerate." given that only one game was actually tested in both dx9 and dx10, my response was that it is not conclusive.
running the tests countless times does provide numbers, but it was not the numbers i was doubting (i did not call them meaningless either). a stable framerate is far more important than being able to hit 100+ some of the time; i was pointing to vista's highly variable framerates as an issue.
i did read the article, and my comments were not about the validity of the article.
hexus attributes nvidia's performance lead to their closer relationship with game developers; i myself don't really know. all i know is that i'm going to stick to cranking down the quality and using more cost-efficient graphics cards rather than trying to run everything at full blast.
|
|
|
Post by lala on Nov 16, 2007 19:48:37 GMT
"Yeah, I have read that as well lala, and what I meant to say is that the drop of fps is not very noticeable when you sit down and play the game, it still run smoothly. At least for me. I'm running Crysis on a 8800 GTS card with 640mb ram on the gfx card. Res on 1600 x 1200 and all settings to High (Very High is max) Using a 22" Wide screen monitor. Also went 8x anti aliasing were 32 or 64 is max, I don't remember."

Still impressive though, bud. Out of interest, have you tried maximum settings to see if there is a visual difference for eye candy? I wonder if a physics engine card would assist the framerate; it would be cheaper than going for 2x 8800 640MB cards.

Cheers Lala
|
|
|
Post by lala on Nov 16, 2007 19:54:13 GMT
chirikov, the variation that you're hung up on is only 15%. While that is quite a bit, on 35 fps the variation is roughly +-5 fps, so there won't be a dramatic drop in framerate; hence Enrith saying he doesn't see that big a slowdown (makes sense). Hexus comments on this because they are creating a benchmark, not measuring gameplay perception. This is backed up by Enrith saying he has not noticed the variable slowdown.
The trend I am pointing out is how both AMD and NVIDIA lose a bucketload of fps switching from DX9 to DX10, especially at higher resolutions and better settings.
Cheers Lala
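Just to show the arithmetic behind that 15% figure: the numbers below are the rough ones from this thread (about 35 fps average, about 15% run-to-run variation), not Hexus's raw data, so treat this as a back-of-the-envelope sketch.

```python
# Back-of-the-envelope check of the variance discussed above.
# Assumed numbers from the thread: ~35 fps average, ~15% run-to-run variation.
mean_fps = 35.0
variation = 0.15

band = mean_fps * variation              # absolute swing in fps
low, high = mean_fps - band, mean_fps + band

print(f"swing: +/-{band:.2f} fps, range {low:.2f}-{high:.2f} fps")
# With these assumed numbers the swing works out to about +/-5 fps (5.25 exactly).
```

So on a 35 fps average, a 15% variation means the framerate wanders between roughly 30 and 40 fps, which squares with someone not noticing it in normal play.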
|
|
|
Post by chirikov on Nov 16, 2007 20:16:01 GMT
15% is a lot if you play any fps seriously. not so bad in something like nwn, but in places where twitch matters you'll find your aim just slightly off all the time. i can consciously feel the framerate changes without looking at the numbers if it gets too drastic, but it affects me subconsciously regardless.
enrith might not notice an fps drop, more power to him. he is only a single data point though, and he might not care so much about his framerate. hexus benchmarked framerates because there is quite literally no way to benchmark perception (it's all in the head).
again you say amd/nvidia lose tons of fps moving to dx10, and i repeat: there was only one game tested in both dx9 and dx10. you have no way of knowing whether that game performed more poorly due to dx10, or a myriad of other factors such as poor developer programming, hardware incompatibilities, driver issues, etc. find a whole list of games with their dx9 and dx10 framerates side by side and then you can make that kind of statement.
|
|
|
Post by lala on Nov 16, 2007 20:54:19 GMT
"15% is a lot if you play any fps seriously... again you say amd/nvidia loses tons of fps moving to dx10, and i repeat: there was only one game tested in both dx9 and dx10. you have no way of knowing whether that game performed more poorly due to dx10, or a myriad of other factors such as poor developer programming, hardware incompatibilities, driver issues, etc."

I see your point, but ALL the games had framerates in the same sort of region, nowhere near where they are when you check DX9 games using those cards. What makes the test more interesting is that it's two different manufacturers; that means if drivers are the problem, it's BOTH manufacturers. Furthermore, I have yet to see a DX9 game using those top-end gfx cards drop below 80 fps, whereas looking at the DX10 results you're very lucky to even get to 40 fps at similar settings. I would be interested if you can find a review of a modern game tested on an NVIDIA 8800 that gets less than 80 fps at 1680x1050 using DX9.

Cheers Lala
|
|
|
Post by chirikov on Nov 16, 2007 21:53:46 GMT
|
|
|
Post by Enrith on Nov 16, 2007 22:44:48 GMT
"15% is a lot if you play any fps seriously.... enrith might not notice a fps drop, more power to him. he is only a single data point tho, and he might not care so much about his framerate."

As I said, I'm definitely an average player when it comes to FPS, and I don't notice the small drop in fps much. Enrith
|
|
|
Post by lala on Nov 17, 2007 2:00:02 GMT
Thanks for the link. A few things struck me as really weird with their testing. The average fps is dragged down by three games: one was, funnily enough, Crysis; another is Supreme Commander, which is surprising to be so low as the gfx is not amazing; and then Stalker, which I have seen has problems according to reviews and forums. To show why the test is really quirky, check out the other games in their batch: Oblivion hits 90 fps and the others are rather nifty as well.

Any idea why they went with what seem like strange resolutions? A 24 inch monitor is 1920x1200, a 22 inch is 1680x1050, and a 30 inch is 2560x1600, I think (someone will correct me, I am sure). Yet they did not seem to use those settings in the test at Tom's.

I've got a feeling you and me are totally going to disagree; that's fine. However, looking at both articles, it's interesting that the DX10 trend I commented on can be seen by comparing Crysis across the two reviews: DX9 at Tom's and DX10 at Hexus. The only resolution that can be compared for Crysis, though, is 1280x1024, as yet again they seem to have gone quirky with the slightly higher resolution test. In the DX10 version it runs at around half the speed, which ties in with my original argument.

Cheers, and thanks for the link
Lala
|
|