Battle of the titans: "780 Ti vs 290X", first results.
-
So, in your experience, how can I solve my problem? With a 3-way?
The first step would be to throw my CPU in the trash, but I'm in love with the new Rampage IV Black Edition and I don't want to give it up. I know you're not a big fan of LGA2011, but I want a hexa-core CPU.
Then I'll consider boosting the graphics part.
A hug
PS: Mind you, I'm calling this a problem when it really isn't one; playing everything maxed out, just without cranking MSAA, is a joy. But I want that little extra.
I would listen to ELP3: your issue is more a lack of GPU power to drive that super-resolution with MSAA than a real shortage of VRAM. I'm not saying that just because he has experience with these configurations, but because the 4K tests carried out across reviews (a pixel load similar to your monitor setup) back him up when interpreted correctly.
In those tests the GTX 780 and Ti don't seem especially "hurt" relative to the 290/X. Yes, in some specific cases the latter show a bit more muscle (they gain ground, which doesn't necessarily mean they beat the Ti), but I'm practically certain that is down to the potential of Hawaii's ROPs in games that lean heavily on them, not to the extra VRAM.
To reach this conclusion I have taken into account these factors:
1.- Comparing the 780 and Ti against the Titan at 4K, there is no apparent performance loss from using 3 GB instead of 6 GB.
2.- Comparing GK110 against Hawaii, in most cases the resolution itself makes no difference, and in the cases where there does seem to be one, the Titan also loses "oomph" (again, this doesn't imply any "defeat", though there may be isolated cases; simply that the advantages or parity sometimes fade). Ergo, there is no relationship between VRAM amount and performance; or rather, in current games, even with a super-configuration like this one, there is no direct relationship to whether a card has 3, 4 or 6 GB of VRAM.
Muscle usually runs out long before VRAM does. Look precisely at Tomb Raider, at what happens even using FXAA:
And here with an SLI:
There isn't enough muscle to run this game with SSAA enabled, since it means doubling or quadrupling the total pixels to render, and triple 1200p already approaches the pixel count of 4K (and with SSAA far exceeds it). Note that they also disabled TressFX for this test (which consumes plenty of VRAM on its own, but even more compute than VRAM). Quite simply, a 2-way SLI of GK110s or Hawaiis at 4K is barely enough for this game. The same applies in your case: 3-way could give you some headroom, but honestly I don't know if it would be enough. Enough to play with FXAA and TressFX enabled, perhaps, but not to pile SSAA on top. Nothing current runs this game at those resolutions like that, and not because of the amount of VRAM, but for lack of power.
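A quick back-of-the-envelope check of the pixel loads being compared (assuming "4K" here means 3840x2160 UHD; the resolutions are the ones from this thread):

```python
# Pixel-count comparison for the setups discussed above.
# Assumption: "4K" = 3840x2160 (UHD). SSAA multiplies the number of
# rendered pixels by its factor (2x or 4x) before downsampling.

surround = 5760 * 1200   # triple 1920x1200 Surround/Eyefinity
uhd_4k = 3840 * 2160     # 4K UHD

print(surround)          # 6912000 pixels
print(uhd_4k)            # 8294400 pixels
print(surround * 2)      # with 2xSSAA: 13824000 rendered pixels
print(surround * 4)      # with 4xSSAA: 27648000 rendered pixels
```

So plain triple-1200p is roughly 83% of the 4K pixel load, but the moment 2xSSAA goes on it is already well past it, which is why the reviews at 4K are a reasonable proxy for this monitor configuration.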
Quote: Ambition gets the better of you; with that tremendous resolution you still want more... :llorar:
-
Well, it's just that Tomb Raider doesn't run too badly for me with SSAA (as you can see, the cards have no OC in these tests):
Tomb Raider 5760x1200 - 2xSSAA - 16xAF - Ultra settings - TressFX Hair

Tomb Raider 5760x1200 - FXAA - 16xAF - Ultra settings - TressFX Hair

Tomb Raider 5760x1200 - 16xAF - Ultra settings - Normal Hair

Although ELP is quite right that benchmarks are one thing and actually playing is quite another.
Best regards -
Yeah, but with this game specifically, if it doesn't stay above 50 fps you notice it; not stuttering, but not perfectly fluid either. It has to stay above that rate to give the full experience.
Not to mention that the built-in benchmark has little or nothing to do with the real in-game experience, which unfortunately usually has areas performing worse than the benchmark.
Regards.
-
1. As for the copying, I think the only things I've posted are the links, and I changed what I thought needed changing; since nobody said anything about it here, I posted it.
Other gentlemen here have also posted copies from N3D and nobody has said anything to them, of course, because they say what you all want to hear.
2. About what you said, ELP3: I never believed what they said about you on N3D, and I joined this forum to learn about the R9s' performance firsthand from you.
3. **Never, at any point, have I taken comments from this forum to N3D.**
But there are people here who do bring them over from N3D to laugh at people.
As you yourself said on N3D, they're a bunch of talibans and you can't speak ill of AMD... no, no, no.
Here is where you can't say anything about Nvidia, and here is where there are more talibans.
I laugh at how impartial you all are; you only have to see how you've jumped down my throat.
It gives me second-hand embarrassment to see people who try to fool others, and fool themselves, claiming to be impartial.
If Krampak thinks he has to ban me from the forum, then he will, but to this day I haven't disrespected anyone on this forum, and here they keep disrespecting me.
Anyway, staying out of the threads where the Nvidia talibans hang out is enough for me.
Don't bother replying; I'm done reading anything more in this thread. What I've already read is more than enough to see what the movie is about.
ELP3, you knew perfectly well how the R9s performed; you only tested them to show off your pro-Nvidia fanaticism and laugh at AMD... full stop.
Regards, champions.
PS: now I know why they say what they say on N3D; no wonder you don't want to go in, or they won't let you.
Sent from my GT-N7100 using Tapatalk 2
I wasn't going to answer you, as the others did, but I will, and in an easy way that won't mean I dirty MYSELF with the filth you're trying to drag between forums: just by quoting you and exposing you in front of everyone, your falseness and your total lack of sincerity:
http://foro.noticias3d.com/vbulletin/showthread.php?t=420213&p=4992424&viewfull=1#post4992424
No wonder they label ELP3 a Nvidia partisan, and not just him but many more on Hardlimit who are all "yes, master... yes, you're the best... yes, you hold the whole truth."
Today I posted a link on that forum of Nvidia talibans... the same ones who say N3D is a forum of AMD talibans, yes yes, that's what they say... I posted the temperatures, price and performance of the Ti against the R9 290 and the 780.
They literally jumped down my throat, and the same happened every time I commented anything about AMD.
Like I said, ELP3 and company, pro-Nvidia talibans, and as for their spies on this forum, let them know: I know they're collecting comments for the taliban-vidia crowd.
You can see from a league away that ELP3 bought the R9s just to criticize AMD and then worship his 3 Titans; come on, you can see their true colors from a kilometre away.
I'm not setting foot in talibania again, no way, at least not in the Nvidia threads.
At least here on N3D you can talk, and then over there they say we're AMD talibans and that they're oh-so-impartial, hahaha... I'm cracking up.
The problem is that you think people here don't read N3D, or don't know about the comments you made over there before strolling around here. The problem is that you think we were born yesterday. You're looking to stir up trouble between forums, and you came to this forum, and to this specific thread, just to be a nuisance, as soon as you found out that ELP3 was spending his time on a thread devoted to testing 290Xs in Crossfire.
You didn't come with good intentions, that's obvious; nor are you remotely mature for someone who claims to be in his forties. I'll add nothing more, but nothing you're doing is acceptable. Anyone who wants to can come in here, but... to get along, not to mess with everyone else. And mess around you certainly have, with quite a few people, going back and forth between forums and gossiping from one side to the other; you've even been given a warning on N3D for this very attitude.
-
Damn, how good and what a bargain, hahaha. When ELP3 finds out, he'll brand him with a 290X, leaving a mark with its 98-degree temperature xD.
-
If you want a hexa-core, wait for Haswell-E; it shouldn't be long now. It just doesn't make sense to stay on LGA2011 without an X89: everything they put on X79 will just be patches, while Haswell-E should be on par with Intel's new architecture. X79 is Sandy Bridge-era, that is, like a P67 or a Z68.
It looks like Intel is abandoning that platform, and that's why they're not even releasing an X89. With Haswell-E they'll surely move to X99, with native SATA, USB 3, PCIe 3, all native; right now, even though X79 works with PCIe 3, Intel only guarantees PCIe 2.
Anyway, Haswell-E will be the first with DDR4, and I don't think it will be worth the price. The shame is that they won't put a 6-core on socket 1150, because then few would buy that 2011-3; nor will X79 be compatible with Haswell-E processors. Intel's monopoly stinks; with competition we would advance faster, they wouldn't split the platforms so much, and they wouldn't hold back processors. P.S. Juanjo, I gave you a negative for being a hypocrite; that's how the world goes, everyone pitted against everyone else.
regards
-
The most prudent thing would be to wait for Maxwell; don't lose your head (it seems those could come with 5 GB).
Don't lose your head over the hexa-core either; wait for X99 and 8 cores. Maybe there will even be a motherboard that supports DDR3 and DDR4 simultaneously.
Regards.
-
The RIV BE and 4930K decision is made; I know it's an old platform, but I want to tinker and test for myself the performance difference between 4 and 6 cores.
Haswell-E is still almost a year away; that's a long time :). Next year we'll upgrade again if we can.
Regarding the graphics, well, I don't know. I'll see.
Thanks a lot guys.
Best regards
I wouldn't do it... X79 isn't worth it in cost or performance, at least until X89 arrives.
If you really want to tinker, you're better off with an Ivy or a Haswell despite their disastrous temps: they perform better in games right now, it's a native PCIe 3.0 platform and, above all, much cheaper.
As for running filters in Surround: either you get another GTX 780 or forget it... and even with 3 you can't run SSAA... holding a fluid MSAA x4 would already be quite something...
Best regards
-
I do believe it... it's the only way for Nvidia to price it at 1000€ again.
http://videocardz.com/47530/nvidia-geforce-gtx-780-ti-also-special-edition
That card will be released at the end of December with 6 GB of memory at a price 175 dollars higher, and that's as far as it will go; they won't release a 12 GB version as rumored. After that comes the new series. Time will tell.
-
I'll finally get my hands on one tomorrow, but only for "tinkering"...
Again, for technical reasons beyond my control, and beyond my real-world know-how, which is zero... my rig is K.O.
Now I'm going to set up a killer 4-way of Titans... with the SLI master, with 2 EVGA SCs that will eat those Tis for breakfast... Jotolillo knows this well... ;)
And meanwhile, to avoid lugging the iPad around and to keep myself entertained, a 780 Ti is arriving, which won't be staying with me unless it's the 8th wonder of the world, which it isn't.
I'll also say, as a warning to sailors, that I don't spend all day buying and returning things. But thank God I've earned some reputation in this little world; for many that's unfair and I respect them, but for some big manufacturers and wholesalers it's just the opposite, and they want me to try their new products. That's why I can often, though not always, have them early. In fact I could have had the 780 Ti today, but since I had more important things to do, like looking after my work and my family, I left it for tomorrow.
In principle it's an internal analysis, but I don't mind sharing how it goes. I see many people have already ordered it, which is a bit incredible because they're not exactly giving them away, but it seems good products sell well despite the high markup, so you'll find plenty firsthand tomorrow.
Best regards.
P.S. Jotole!!! I want my 4-way of TITANs at 1350 MHz!!! I'm stuck with Minesweeper on the laptop and the iPad... ;)
-
You just can't sit still, hahaha... :facepalm:
Let's see what impressions the GTX 780 Ti leaves you, and be fair with it. I know it won't be a significant change compared to your battery of Titans, but judge it in its lonely "smallness" and its 3 GB of VRAM: how it fights and how hard it bites. Especially how it clocks up, to see the potential of the B1 stepping.
Regards.
-
P.S. Jotole!!! I want my 4-way of TITANs at 1350 MHz!!! I'm stuck with Minesweeper on the laptop and the iPad... ;)
Hahaha, we'll do what we can. ;) But I see you at 1400 :ugly:
Regards...
-
This post has been deleted! -
Well, I've gone and done it.
As always, ASICs and my luck don't mix... even though this is supposed to be a new revision and all, the ASIC of this Gigabyte is only 70%. Basically that means it's useless for OC unless you pour in tons of voltage, and here's the curious part: Nvidia has, incredibly, unlocked the voltage on these 780 Tis... up to 1.3 V... but it's absolutely useless, since the TDP hasn't been raised. So one thing cancels out the other, and if you push the OC or the voltage hard it throttles spectacularly. Besides, setting the memory to 7 GHz makes consumption skyrocket very noticeably. It draws a little more than the Titan.
The card is a real gem: cool-running, with terrific potential. But that's all it is, potential... because right now there's no justification for Titan or 780 owners, much less those with unlocked BIOSes and voltage mods, to get this Ti.
A Titan in those conditions, at matched clocks, performs even better than the Ti. Which logically isn't normal, given that in my judgment the theoretical gap is at most 5-7%.
When BIOSes come out that let it draw what it really needs under OC, we'll see a splendid card. Right now it's a card "tuned" by default to stay first in benchmarks, on top of the very good build quality, acoustics and temperatures of these already familiar reference GK110s. But that's it... I expected more, honestly...
That said, the frequency drop is most pronounced in power-hungry synthetics. In games there should be no problem holding about 1200 MHz with stock voltage, with laughably good temps and noise, which keeps it unbeatable... but I want more... so we'll have to wait for custom BIOSes to really see this card's performance squeezed out.
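The way the unlocked voltage cancels against the fixed power limit can be sketched numerically. This is a hypothetical illustration, not measured data: it assumes dynamic GPU power scales roughly as V²·f, and the voltage and wattage figures below are made up for the example.

```python
# Hypothetical sketch of power-limit throttling. Dynamic power scales
# roughly as P ~ C * V^2 * f; with a fixed power cap (TDP), raising
# the voltage forces the sustainable clock DOWN, which is why an
# unlocked voltage is useless without a raised power limit.
# All numbers are illustrative assumptions, not measurements.

BASE_V = 1.162      # assumed stock voltage (volts)
BASE_F = 1150.0     # assumed stock boost clock (MHz)
POWER_CAP = 250.0   # assumed board power limit (watts)

# Calibrate the constant so the stock point sits exactly at the cap.
C = POWER_CAP / (BASE_V ** 2 * BASE_F)

def max_clock_at(voltage):
    """Highest clock (MHz) that still fits under the power cap."""
    return POWER_CAP / (C * voltage ** 2)

print(round(max_clock_at(1.162)))  # stock voltage -> 1150 MHz
print(round(max_clock_at(1.3)))    # 1.3 V -> ~919 MHz sustainable
```

Under this toy model, dialing in the full 1.3 V costs roughly 20% of sustainable clock instead of gaining anything, which matches the observed behavior: more voltage plus an unchanged TDP just means harder throttling.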
Meanwhile... I'll stick with my TITAN...;)
Here's an example.
I was aiming for 1250 MHz... and as you can see, the Power Target peaked at 112 when the card only allows 106, which caused brutal drops down to 1189 MHz. Still, although it spent practically no time at 1250 MHz (I'd say it averaged around 1230 MHz), the graphics score is good. Bear in mind I'm on Ivy, a 4-core, so I'll always lose overall against 2011.
But for the ASIC it has, at stock voltage (I didn't even bother unlocking it) and with the memory pushed hard on frequency, I say again it's an acceptable result.
Well, now that I'm poking at it, I must say I'm impressed.
Although obviously it hasn't held 1300 MHz for even 3 s... it's true that the minimum has been 1221 MHz, staying above 1260 MHz the whole time... which is surprising since no voltage was added, only core clock, and yet it had better minimum clocks than the previous run with less core... maybe we should sit down and figure out how this new 780 Ti works... it doesn't seem to be just a TDP problem... it is one, but there's something more... the card always drops by the same amount regardless of the core clock or the power draw at that moment... interesting...
The truth is you have to bow to the evidence... they really pull:
Here, despite dropping to 1211 MHz... it's pulling the same as a Titan at 1290 MHz with a BIOS mod... and without any extra voltage on the 780 Ti... by the way, Skynet has just released a BIOS mod... ;D
Regards.
-
What a failure the voltage unlock is; if in the end there's still a consumption limit, it's worth nothing. I also thought the B1 revision would mean a more refined process and better ASICs, but it looks like more of the same. Perhaps if OC models come out they should bin the best ASICs to make sure they don't drop frequency; that, or let it draw more.
Regards
-
It's an A1, clear as day... nothing B1 about it... there you have the GPU-Z..
Regards.
-
The ASIC has nothing to do with the stepping; they are two totally different things. Besides, GPU-Z is not known for reading ASIC quality correctly on new models, still less when there's a stepping change.
And it seems to me you don't see many Titans/GTX 780s that reach not just 1250 MHz at stock voltage (I've seen several reviews consistent with ELP3's data, hitting between that and 1300 MHz peak at stock voltage), but even 1200 MHz. I insist on the "at stock voltage" part.
The ASIC, read correctly, tells us about the OC potential of a given chip, but only within a GIVEN STEPPING; it tells us nothing about capabilities across different steppings. Different chips and revisions, different ASIC behaviors. The ASIC of a GK110 A1 has nothing to do with the ASIC of a GK104 A2 (the only one that exists, I believe).
Understand that very slight die revisions produce subversions like A0, A1, A2, etc., and somewhat larger changes bump to the next letter: B0, B1, etc. In this case the difference is important enough to warrant a new letter, so it's clearly more than the difference between GK104 A1 and GK104 A2 (the first of which we never even saw in real products, though it supposedly already had "commercial grade"). The ASIC-quality game applies within the same chip and stepping; it's what the BIOS uses to determine each unit's real maximum Boost (the only consistent relationship seen across every GK104 A2 model with the reference BIOS: the better the ASIC, the higher the maximum boost, and, related though less directly, the higher the OC potential).
Obviously every stepping will have better and worse units, but besides GPU-Z measuring however it pleases, at least initially (my GTX 670 went from an 80%+ ASIC to something like 67% with later GPU-Z revisions :ugly:), the rules of the game only apply within "the family".
As an example: no matter how bad a G0 (Q6600) was, it rarely failed to out-overclock even the best B3 it faced; it could be among "the worst" of its stepping and still be in another world compared to the B3.
-
ELP3, it seems unbelievable that you take GPU-Z seriously. You know perfectly well that the stepping, die-size and transistor-count data come from databases internal to the program, not from actual "detection"; as always, a later, revised version of GPU-Z will be needed for it to "detect" (which it doesn't) features like the stepping, something it is genuinely incapable of doing, since it relies on databases. The ASIC it possibly does "detect", through some low-level graphics functionality that returns a numeric value representing it, but even then GPU-Z has to interpret it and put it on a correct scale for each new model that comes out (not per manufacturer model, but per GPU/variant). That's why this value changes between GPU-Z versions on new cards: the scale is readjusted according to what they believe the "real" ASIC is as more cases are seen. As for the info on GPU-Z's first tab... it doesn't even show you a unit's real maximum boost, only the model's official boost value (you can deduce the real maximum boost from the ASIC and the boost tables, but apparently that interpretation is too complex for W1zzard to integrate).


