Test of the Real Nvidia GTX Titan: Single, SLI, Tri-SLI and 4-Way SLI
-
Geltops, a card with a 128-bit bus and VRAM at 2 GHz has a bandwidth of 32,000 MB/s, and a card with a 256-bit bus and VRAM at 1 GHz has a bandwidth of 32,000 MB/s; that is, the same…
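The arithmetic behind that equivalence can be sketched like this (a minimal illustration; the 128-bit/2 GHz and 256-bit/1 GHz figures are the ones from the post, treating the clocks as effective data rates):

```python
# Memory bandwidth = (bus width in bytes) x (effective memory clock).
def bandwidth_mb_s(bus_bits: int, clock_mhz: int) -> int:
    """Bandwidth in MB/s for a given bus width and effective VRAM clock in MHz."""
    return (bus_bits // 8) * clock_mhz

# 128-bit bus with VRAM at 2 GHz (2000 MHz effective):
print(bandwidth_mb_s(128, 2000))  # 32000 MB/s
# 256-bit bus with VRAM at 1 GHz (1000 MHz effective):
print(bandwidth_mb_s(256, 1000))  # 32000 MB/s
```

Halving the bus width while doubling the memory clock leaves the product, and therefore the bandwidth, unchanged.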

Regards.
Possibly, for computing, he's more interested in SP performance, or in the card not being capped for GPGPU; that's why with a 580 or a 7970 he might get better results than with a 680, although it wouldn't be a matter of having a wider bus: it would be that they have more GPGPU power.
Depending on the program he uses, a 580 might do better than a 680; in games that wouldn't happen.
Regards.
-
Is it known if any water block will be released for the Titan?
-
@josele.:
is it known if any water block will be released for the Titan?
There are already water blocks for TITAN:
EK first with GeForce GTX Titan Full-Cover water block | EKWaterBlocks
Even Arctic has released a hybrid with liquid cooling:
Professional Review » Inno3D iChill GTX Titan Accelero Hybrid
Regards.
-
Hello:
First of all, I want to make clear that I'm not looking to contradict anyone; no confrontations or bad vibes, please.
Mainly because I can't hold a candle to you in technical knowledge (many of you already know this).
I CAN ONLY speak from my experience….......
I have run IDENTICAL work units of the same project on 256-bit cards and on 384-bit cards, and the 384-bit cards finished first.
Was the time difference overwhelming, exaggerated? I would say NO, but it was noticeable.
Let's put the times in an imaginary example:
256-bit card: 1 minute 55 seconds.
384-bit card: 1 minute 30 seconds.
For me, YES, there is a difference.................
Regards.
Nope, no bad vibes at all, but some people claim that performance is better purely because of the bus width, and that's not the case.
You are explaining it yourself with your experience: same work units, but the 384-bit cards finish first, simply because they probably have more bandwidth. If they used a slower type of memory than the 256-bit cards, you might have seen the opposite.
We have to understand that bus width HELPS to get more bandwidth (or more total memory when it's needed: it's easier to fit more memory on cards with a wide bus), and that is how it sometimes translates into better performance. But by itself it doesn't define the quality of an architecture; that is defined by other parameters.
-
Do you think a GTX 285 would finish the job before a Titan/Tesla K20X? The first has a 512-bit bus and the latter have a 384-bit bus…
By the way, I don't know which cards you tested, but as far as I know the same GPU has never existed with two data buses of different widths. The closest case is G80 vs G92 (8800 GTX vs 9800 GTX), and even then they are different chips despite being quite similar, and the wider bus of the G80 gave it greater bandwidth. In short, what is certain is that no GPU with computing capabilities has been sold with two different buses and the same bandwidth (it wouldn't make sense to give a GPU a wider, more expensive bus just to end up with the same bandwidth as a narrower one).
-
Well, I had never noticed that detail, and YOU are completely right…..........
It is impossible for the bandwidth to be the same with different data buses.
Regards.
-
Not impossible, but very unlikely: either very different types of memory are used in the "same" chips with different bus widths, or the memory bandwidths will be clearly different (there is a big gap between a 512-, 384- and 256-bit bus to compensate with frequency alone). In other words, when a manufacturer releases a chip with a given bus, it's to squeeze that bus properly; it wouldn't make sense for them to be able to use a narrower bus with the same bandwidth, because that would go against their manufacturing costs. One of the main reasons for the bus reduction when moving from GT200 to GF1x0 is that, by changing the memory type from GDDR3 to GDDR5, they could increase the bandwidth somewhat while using a narrower bus, cutting costs while increasing performance.
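The GT200 to GF1x0 transition mentioned above can be put in rough numbers. A sketch using approximate reference specs (the effective memory clocks below are my assumption from the stock GTX 285 and GTX 480 configurations, not figures from the thread):

```python
def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Bandwidth in GB/s = bytes per transfer x effective transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# GTX 285 (GT200): 512-bit bus, GDDR3 at ~2484 MHz effective.
gt200 = bandwidth_gb_s(512, 2484)   # ~159 GB/s
# GTX 480 (GF100): 384-bit bus, GDDR5 at ~3696 MHz effective.
gf100 = bandwidth_gb_s(384, 3696)   # ~177 GB/s
print(round(gt200, 1), round(gf100, 1))
```

The narrower 384-bit bus still ends up with more bandwidth thanks to the faster GDDR5, which is exactly the cost-cutting move described above.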
-
ELP3, I imagine you've already played quite a bit with the 4 Titans as you were saying. What average and minimum fps do you get in games at 1600p? Seeing that it's the first graphics card certified for 4096x2160 that you've tested, have you tried using downsampling to reach that resolution? ;D
Regards.
-
Constant 60 fps with v-sync, with everything I've played at the absolute maximum: Metro on Very High with DOF and PhysX, 4xMSAA; Sleeping Dogs on Extreme; Max Payne everything Very High and 8xMSAA; and even Tomb Raider, although that one doesn't run quite right, with some loading stutters that are quite annoying at times.
I haven't tried downsampling… I also don't know if my monitor supports it..
Regards.
-
It's a matter of trying, but it seems to me the limit isn't on the monitor side but on the graphics card. I just tried it on my 1080p TV and got 2560x1600, and I also tried it on a 19" Samsung 940s monitor (1280x1024 native) and it also gives 2560x1600. I tried Crysis 3 and the Resident Evil 6 benchmark and it works, although the resolution that suits my TV is 2560x1440.
Or a combination of both (monitor and GPU), because from what I read the Dell U2711 does 2160p.
I'll leave you the tutorial, but I think it could work at 2160p:
Downsampling, a simple method for making your pc-games look better. - NeoGAF
Regards.
-
Mine is a 3011. And all the previous downsampling I tried were unsuccessful.
-
You have to try it with the latest drivers, and it works.
I just tested it on another TV and got 3840x2160 at 30 Hz (DVI-to-HDMI); via VGA it stays at 1600p.
You can see clearly in the Resident Evil 6 benchmark how the fps drop:
at 1080p - scene 1: 36 fps - scene 2: 30 fps
at 1600p - scene 1: 24 fps - scene 2: 17 fps
at 2160p - scene 1: 14 fps - scene 2: 9 fps
Regards.
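For reference, the pixel counts behind those resolutions (the fps figures are the ones quoted above; the pixel arithmetic is just illustration):

```python
# Pixels rendered per frame at each benchmark resolution.
resolutions = {"1080p": (1920, 1080), "1600p": (2560, 1600), "2160p": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)  # {'1080p': 2073600, '1600p': 4096000, '2160p': 8294400}

# 2160p pushes exactly 4x the pixels of 1080p, which lines up with the
# scene-1 fps dropping from 36 to 14 (slightly better than a 4x drop).
print(pixels["2160p"] / pixels["1080p"])  # 4.0
```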
-
That downsampling is great; it works well on my 24" at 2560x1440! In games where you can't scale the UI it's a pain because everything gets so small, but the quality improves considerably. I've tried higher resolutions but the screen just goes black.
-
ELP3, try a 2800x1700 downsample. You can do that without any problems; I have the same monitor as you, and I can't go any higher.
-
I have a 27" HP 2710m and it works at 2560x1440
-
ELP3, I'll leave you a link to achieve 3840x2160 even in 16:10 format:
Downsampling – A full guide to achieve 3840x2160 resolution – NVIDIA only - ScreenArchery Wiki
This weekend I will run some benchmarks at 1080p, 1440p and 2160p and upload the results.
Regards.
-
Hello fellow forum members.
First of all, congratulations to ELP3 for the effort of sharing your impressions and experiences with all of us. What beasts those TITANs are….....
How nice it must be to play in surround with them, in games like Crysis and Tomb Raider.... :wall:
I'm very happy that you have beaten another 3DMark record, just as last year you did with our beloved 4-way 680s. Although I don't post much due to lack of time, I try to follow you by reading all your posts...........
A hug.
-
More than a monitor problem, it will be a problem with the standards each resolution must meet in terms of available video-output bandwidth, even if the resolution is completely fictitious. Let me explain:
Supposedly, if you enable display scaling on the GPU rather than on the screen itself, there is no real change in resolution compared to the screen's native one, and therefore no additional demand on the screen.
What may differ is the synchronization, and the fact that the system internally evaluates whether resolutions can or cannot be used given the available bandwidth (MHz), depending on how the screen is connected. Resolutions like 4K exceed what dual-link DVI can carry, even though this is not a real limitation of the screen or of the link to it, since the screen keeps working at its native resolution (the downsampling happens on the graphics card, not on the screen, and everything sent to the screen is at native resolution).
Even so, the system applies these checks even when the scaling is done on the GPU, and it may "not be able" to use certain resolutions because of cabling issues, although technically it isn't using that higher resolution (what can change is how the information is sent to the screen, which can cause some of the problems seen with certain types of screens and cabling). I suppose using DisplayPort or something similar would make reaching 4K easier.
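A rough sanity check of that dual-link DVI limit (the ~330 MHz pixel-clock cap is the DVI spec's 2x165 MHz per link pair; the 10% blanking overhead factor is an assumption typical of reduced-blanking timings):

```python
# Dual-link DVI tops out at 2 x 165 MHz = 330 MHz pixel clock.
DL_DVI_MAX_MHZ = 330.0

def pixel_clock_mhz(width: int, height: int, refresh_hz: int, blanking: float = 1.10) -> float:
    """Approximate required pixel clock, with a reduced-blanking overhead factor."""
    return width * height * refresh_hz * blanking / 1e6

print(pixel_clock_mhz(3840, 2160, 60))  # ~547 MHz: over the DL-DVI limit
print(pixel_clock_mhz(3840, 2160, 30))  # ~274 MHz: fits, hence 4K at 30 Hz over DVI
print(pixel_clock_mhz(2560, 1600, 60))  # ~270 MHz: 1600p at 60 Hz also fits
```

This matches the earlier report in the thread: 3840x2160 only worked at 30 Hz over a DVI-to-HDMI connection, while 2560x1600 at 60 Hz runs fine on dual-link DVI.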
-
Indeed, I think that's where it's headed. DVI can't give much more... Anyway, with 2560x1600 and being able to pile on as much MSAA, SSAA, etc. as games support, Max Payne 3 with 8xMSAA at 1600p looks truly spectacular, so downsampling isn't exactly something that keeps me up at night.. ;)
Regards.
-
I imagine so; there must be quite a difference between native 1600p on an IPS monitor and downsampling on a 1080p monitor or TV, I suppose because of the pixel size. I got 2160p on two TVs via DVI-to-HDMI but never achieved quality as good as via VGA, and the latest-generation games run at 10 fps at that resolution on the GTX 580.
Anyway, this downsampling thing is amazing; I don't want to give up 1440p anymore.
I would have liked to get your impression of your Titans at that resolution, although from what I read the person who tested on a 4K TV couldn't get the Titan SLI to work.
Regards.