I have an NVIDIA GTX 980 Ti
-
I'm uploading the first benchmarks I've done.


A Fire Strike run:

Metro 2033, 1080p, ultra, PhysX off

Metro: Last Light, 1080p, ultra, PhysX off

I'm uploading Tomb Raider runs, at stock with the 1150 MHz base clock (1354 MHz boost) and with the OC at 1500 MHz + 450 MHz on the memory (as a note, during half the benchmark the boost dropped to 1487 MHz).
The settings are the same as Guru3D's (and at 1440p):
"This particular test has the following enabled:
DX11
Ultra Quality mode
FXAA enabled
16x AF enabled
Hair Quality Normal (TressFX disabled)
Tessellation On
SSAO Ultra"

Here are mine, the first one at stock:

With OC:

I've tested BF4 at native 1440p with the resolution scale at 200% and everything on ultra, and the damn thing ran at 40 to 50 fps, without antialiasing of course. Then I wanted to take it to the extreme, so I set 5120 x 2880 without antialiasing, on medium settings with the scale at 200%, which would be like 10K, and it ran at 17 to 20 fps… ratataaaa. The surprising thing is that the 6144 MB of VRAM were enough even under those extreme conditions, and that's with the memory overclocked 1400 MHz above its stock clock.
You can look for any jagged edges.... [qmparto]
5K with scale at 200% (that is, equivalent to 10K), high/medium settings, no AA:
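The "10K-equivalent" figure checks out with quick arithmetic. A minimal Python sketch, assuming the resolution scale slider multiplies each axis by scale/100 (so 200% means 4x the pixels):

```python
# Rough pixel-count math for the resolution-scale claims above.
# Assumption: 200% scale doubles each axis, i.e. 4x total pixels.
def render_pixels(width, height, scale_pct):
    # Each axis is multiplied by scale/100 before rasterization.
    return int(width * scale_pct / 100) * int(height * scale_pct / 100)

mp_1440p_200 = render_pixels(2560, 1440, 200) / 1e6  # 1440p at 200% -> 5120x2880
mp_5k_200 = render_pixels(5120, 2880, 200) / 1e6     # "5K" at 200% -> 10240x5760

print(f"1440p @ 200%: {mp_1440p_200:.1f} MP")  # 14.7 MP
print(f"5K @ 200%:    {mp_5k_200:.1f} MP")     # 59.0 MP
```

That 59 megapixel figure is the "almost 60 megapixels per frame" mentioned later in the thread.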




These are at 5K, scale at 100% (that is, native 5K), everything ultra, MSAA 4x:


For those who doubt whether 6144 MB of VRAM is enough at 4K… you can see I didn't even reach 6 GB under extreme 5K conditions with antialiasing, or at 10K on medium settings without AA. Before you run out of VRAM you drop to 15/20 fps and it becomes unplayable, not for lack of memory but because the chip's graphics power can't give any more, poor thing... :nono:
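For intuition on why even a 10K-equivalent render doesn't blow past 6 GB: the render targets themselves are fairly small, and the bulk of VRAM goes to textures, which don't grow with output resolution. A back-of-the-envelope Python sketch, with an assumed RGBA8 (4 bytes/pixel) buffer format rather than BF4's actual pipeline:

```python
# Back-of-the-envelope size of one render target at these resolutions.
# Assumption: simple RGBA8 buffer, 4 bytes per pixel; real engines add
# G-buffers, depth, shadow maps, etc., but textures dominate VRAM use.
def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20  # MiB

# One color buffer at the 10K-equivalent render resolution:
print(f"10240x5760 RGBA8: {buffer_mb(10240, 5760):.0f} MB")  # 225 MB
```

Even several such buffers add up to well under a gigabyte, which is consistent with the measured usage staying below 6 GB.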
I leave that for the defenders of the 12 GB Titans who said that 6 GB is not enough... :risitas:
-
Could you run Tomb Raider at 1080p to see which one gives better results, so I can compare? XD
-
The settings are the same as Guru3D's (and at 1080p):
"This particular test has the following enabled:
DX11
Ultra Quality mode
FXAA enabled
16x AF enabled
Hair Quality Normal (TressFX disabled)
Tessellation On
SSAO Ultra"

Here they are, the first one at stock:

With OC at 1500 MHz + 450 on the memory:

-
But what an animal you are, Ciclito…. :osvaisacagar:
Almost 60 Megapixels per frame. "Sometimes I see pixels"... :ugly:
-
Yes :ugly: me too... I wanted to check it because of a discussion I had with people who had the Titan X and said that 6 GB was scarce... I already knew that wasn't true because I came from the Titan Black and a year and a half of playing at native 4K... but the memory on the Black didn't clock up as much as this one does... so I wanted to retest this game under the same conditions and confirm that it holds up even better thanks to that extra memory OC.
Zoom in and look for any jagged edges :ugly: they look perfect, and that's in JPG; if I captured them at BMP quality they'd take up 50 MB each....:sisi:
-
I'm going to post mine, on air cooling:



67% more, not bad.

By the way, Guru3D gets 192 fps on the reference card and 221 on the G1; I don't know how you get 231...
http://www.guru3d.com/index.php?ct=articles&action=file&id=16319
And then at 2K you get 9 fps less.
-
My G1 Gaming does 1354 MHz with Boost 2.0 from its 1150 MHz base clock. We'd have to see what boost Guru3D's sample hit, surely somewhat lower than mine. Hence the difference.
The 154 fps is clearly a mistake on their part, especially at 1440p. Look at it: their reference card goes from 139 fps to 154, and I'm telling you that you don't get there by raising the base clock 150 MHz.
I go from 1354 MHz without memory OC to 1500 MHz + 450 on the memory, and I gain 17 fps. Also note that their benchmark is at 1920 x 1200 and mine at 1920 x 1080; that also counts in my favor.
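A quick sketch of the clock deltas behind that argument, with the numbers taken from the posts above (treating linear fps-per-MHz scaling as an optimistic ceiling; real gains are usually smaller):

```python
# Percentage gains for the clock figures quoted in the thread.
# First-order ceiling: fps scales at most linearly with core clock.
def pct_gain(old_mhz, new_mhz):
    return (new_mhz - old_mhz) / old_mhz * 100

print(f"Boost 2.0 over base (1150 -> 1354): {pct_gain(1150, 1354):.1f}%")  # 17.7%
print(f"OC over stock boost (1354 -> 1500): {pct_gain(1354, 1500):.1f}%")  # 10.8%
```

Since cards boost to different clocks out of the box, two "stock" 980 Ti G1s can legitimately land at different fps, which is the point about Guru3D's sample.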
-
That's what I was thinking, but since you get fewer fps at 2K it really surprises me. The 980 Ti bug is biting me, but I'm resisting, which is bad ;D
-
3DMark 11 Performance, single 980 Ti (stock BIOS)

-
Batman: Arkham Knight 2160p/1440p/1080p max settings benchmark, 980 Ti G1 Gaming

Batman: Arkham Knight 980 Ti G1 Gaming gameplay.
The game doesn't run very smoothly and eats up a lot of VRAM, even at 1080p.

Salu2
-


Fire Strike 1.1, SLI 980 Ti


