
    First tests of Nvidia's 980 and 970

    • Javisoft (Veteranos HL) @ELP3

      Yes, I realized the comparison was with OC at more or less the same stress level; with Kepler, when you add voltage, consumption goes up quite a bit. But of course, multiply that by 4 and damn XD. In consumption terms, I bow to Nvidia.
    • fjavi @Patagonico

        @Patagonico:

        I already bought it; the decision was made long ago, I wanted an 8-core. And everything you say is true: I have no real need, not even the 2500K falls short for me. But as I've been doing for a while now, I don't treat it as an expense but as an investment, because this could blow up at any moment. Not to go to extremes, but let me explain: by year's end the dollar is expected to be at least $10, components are priced at the dollar rate, and it keeps rising fast. To give you an example, the dollar costs $8.50 in my country, but you have a monthly purchase limit; if you want to buy beyond that you have to do it outside the official market at $15, a 70% gap.

        Right now components are being charged at 8.70 for the moment, but there is talk of not letting imported products in and adding more tariffs.

        If you import it, as happened to me with the monitor, you pay the parallel rate of $15. It's as if, where you pay 1 to 1 with the dollar, you went from 1 to 1.70.

        In short, I want to treat myself at least with the CPU, which will last me a long time.

        Regards.

        Quite a beast of a machine you're going to build. In any case, everywhere people talk about the economy and whether the dollar may collapse; sometimes it's better not to live worrying about it.

        But if you can treat yourself, you'll have a machine that should last you a long time. We'll enjoy what we can while we can.

        Regards

        @wwwendigo:

        Want to bet he means at STOCK?

        The typical trap. For someone who demands equal clocks, or using the maximum OC potential, between completely different chips, it's curious that he frames the comparison however it suits him, this time between models that do use the same chip.

        The maths don't lie: in the silicon of both dies the only difference is one disabled SMX, no more, no less. That means the MAXIMUM difference in raw power is exactly this:

        14 SMXs vs 15 SMXs. That is, 14/15; in other words, if performance depends ENTIRELY on the SMXs, whether through massive shader use or texture accesses via the TMUs, about 7% extra power at equal clocks. Not 10% (if the Titan is 9% slower, the Ti is 10% faster), which doesn't come out of any possible calculation from the real specs. That's without counting that, if an application is to some degree limited by ROPs, by geometry, etc., that one extra blessed SMX on the Ti will matter little or not at all. Nor that real performance in real applications is always lower than the theoretical potential per specs between flavours of the same chip (a very graphic example of this is the comparative performance of the GTX 970 vs 980: with many more SMMs cut, the real difference is not that big, even though the GTX 980's clocks are also a bit higher).

        Outside the GPU, the difference is in the stock memory clocks, higher on the Ti, but given that GK110 has no bandwidth problems to speak of, I very much doubt this "unlocks any hidden performance"; we're not talking about a GK104 here.

        So either you compare stock performance, or he's falsifying the information again, exaggerating towards whichever side suits him. Which doesn't matter anyway: a Titan at 1450 that a Ti beats by 10% (1.1) at equal clocks (ha, and ha) would be equivalent to a Ti at... 1450/1.1, i.e. 1318 MHz. More even than his own Ti-at-1300 vs 980-at-1400 comparison. Even taking the case most extremely favourable to his theory, it doesn't hold.
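The arithmetic above is easy to sanity-check in a few lines (a quick sketch; the 14/15 SMX ratio and the 1450 MHz / 10% figures all come from the post itself, assuming performance scales linearly with active SMX count):

```python
# Theoretical scaling between two cuts of the same chip (GK110),
# assuming performance scales linearly with active SMX count.
TITAN_SMX = 14
TI_SMX = 15

smx_ratio = TI_SMX / TITAN_SMX          # ~1.071 -> about 7% more raw power, max
print(f"Max theoretical Ti advantage: {smx_ratio - 1:.1%}")

# The claimed 10% advantage at equal clocks, applied to a Titan at 1450 MHz:
titan_clock_mhz = 1450
claimed_advantage = 1.10
equivalent_ti_clock = titan_clock_mhz / claimed_advantage
print(f"Equivalent Ti clock: {equivalent_ti_clock:.0f} MHz")  # ~1318 MHz
```

So even granting the (unsupported) 10% figure, the "equivalent" Ti clock lands at about 1318 MHz, as the post argues.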

        I don't think we should overcomplicate this. For me it's positive that these cards come out: there's more to choose from and prices drop on the others. Personally the price gap from the 970 to the 980 seems large to me, but I suppose they have their commercial strategy, since putting a 980 at 450€ forces them to sell the 780 Ti much cheaper. Maybe it's not only Nvidia's doing, since they've discontinued those; maybe the partners also push so they don't have to drop prices much.

        Then there's the upgrade question: performance-wise it's not interesting for people with a GK110, nor for anyone with an R9 290 or 290X, though each person must weigh whether they want lower consumption, less noise, or just want to tinker.

        For people on a 570, 480 or 580 it's very positive to have the option of a 970: they'll gain a lot of performance and improve a lot in consumption, temperature, noise and the new extras these Maxwells bring. That's why it's positive that new things come out.

        That's why I say that for me they're not a rip-off. They don't advance performance much, but it's good that they launch because prices move on the rest, and above all the 970, for price/performance, consumption and OC, really is an important step forward: for what a 4 GB 770 has been costing you can now have a 970, which is a clear change for the better.

        Now I see it said that between a 290 and a 970 the difference could be 1€ a month in consumption. Maybe so, but since I've had the 780 I've saved some 130€ because I haven't needed as much air conditioning; with two 480s you had to run it high, otherwise it was like a sauna. These things count, and with Maxwell it will be even better.

        Regards

        • Patagonico @fjavi

          @ELP3:

          Please leave the arguments for other forums, or have them in private, because maybe there are people interested in first-hand data on the GTX 900 :wall: EDIT: I see you're handling it well, so much the better ;D

          Patagónico, very good news… PEAK consumption at the wall, from the power strip (a basic one, nothing special), of the whole system, monitor, headphones etc. included, and mind you, with an OC of 1470 MHz on all 4, running 3 rounds of Unigine!!!

          4770K at 4600 MHz

          I wasn't far off… 1200 W less consumption, 20% more performance.

          Regards.

          Temps are good for 4 cards together: none goes over 78º at 1470 MHz with an acceptable, not too aggressive fan profile.

          Incredible. When the Fermis came out we talked about reactors, and now 1000 W with 4 GPUs; if someone had told me this a while ago I wouldn't have believed it.

          Will you put them under water? I suppose they could reach much more by removing the voltage limit, or do you think that with a 6+6 power connector they've already hit their maximum?

          Regards.

          @fjavi:

          Quite a beast of a machine you're going to build. In any case, everywhere people talk about the economy and whether the dollar may collapse; sometimes it's better not to live worrying about it.

          But if you can treat yourself, you'll have a machine that should last you a long time. We'll enjoy what we can while we can.

          Regards

          It's not living worried, just being able to afford it and not thinking about it like before, because you know that what you buy today will be more expensive tomorrow, not cheaper. The clearest example was the monitor: in my country you can't get a 4K one, so I had it shipped in. It arrived in June, took a month, and there were problems with customs too. When I ordered it the dollar was at $10.80; the following month, when I picked it up, I had to pay at $11.15; and if I wanted to buy it today I'd have to pay at $15.20. Roughly 40% more expensive than 4 months ago.

          Anyone with a little money is changing their TV, their car, and so it goes.

          The machine is going to be a beast. Graphics cards are easy to change, but the CPU, the longer it lasts the better; I don't like assembling and disassembling.

          Regards.

          • fjavi @Patagonico

            I would never put these graphics cards under water; I don't put any under water, but these even less so. They perform well with their stock cooler, they don't run very hot, and they're more convenient to tinker with. Not to mention how expensive it is to fit blocks on each one, and then in a few months better cards come out and you're back to square one.

            Besides, maybe with these you don't see the gains you could see with a Titan under water; the other one could take more voltage and go higher, but here everything remains to be seen.

            Just for not having to take apart the water loop every time you remove a card, I don't even consider it.

            Regards

            • Patagonico @fjavi

              @fjavi:

              I would never put these graphics cards under water; I don't put any under water, but these even less so. They perform well with their stock cooler, they don't run very hot, and they're more convenient to tinker with. Not to mention how expensive it is to fit blocks on each one, and then in a few months better cards come out and you're back to square one.

              Besides, maybe with these you don't see the gains you could see with a Titan under water; the other one could take more voltage and go higher, but here everything remains to be seen.

              Just for not having to take apart the water loop every time you remove a card, I don't even consider it.

              Regards

              I think the same way, but there are people who like to push them to the limit; skynet3 is working on the BIOS:

              NVIDIA GTX 980 Owner's Club

              Now, I also think the 20 nm Maxwells will probably come out soon. I suppose the blocks will be compatible, and probably most (if not all) of the people who bought this 980 in 3-4 GPU setups to tinker will switch to the 20 nm ones, and they'll already have the gear to put them under water.

              Yesterday I installed the 344.11 drivers (which supposedly support the 700 series) and my machine went crazy: it no longer recognized the DisplayPort and I had to do some juggling to go back to 340.52.

              Today I found that there's already a 344.16 beta driver for these 980/970s:

              Drivers | GeForce

              Regards.

              • ELP3 @Patagonico

                Now that I'm playing with the 4 already installed, I'm having problems with V-sync in many games.

                My DELL 3011 runs at 59.9 Hz, more or less, and in games like Metro Redux, Splinter Cell Blacklist, etc. it isn't able to sync the image to those frames. They dance between 58 and 62, and it creates a tremendous feeling of constant lag. It doesn't matter what you touch in the control panel (there are more options now): not the v-sync in the game options, not the adaptive one in the control panel, nothing. The only thing that fixes it is capping the fps at 60 with RTSS; then it runs well, but honestly I don't like that at all. I don't know if incorporating G-Sync has something to do with it, or if it's because of having 4 graphics cards in, or whatever.

                Regards.

                • wwwendigo @Javisoft


                  @fjavi:

                  I don't think we should overcomplicate this. For me it's positive that these cards come out: there's more to choose from and prices drop on the others. Personally the price gap from the 970 to the 980 seems large to me, but I suppose they have their commercial strategy, since putting a 980 at 450€ forces them to sell the 780 Ti much cheaper. Maybe it's not only Nvidia's doing, since they've discontinued those; maybe the partners also push so they don't have to drop prices much.

                  Then there's the upgrade question: performance-wise it's not interesting for people with a GK110, nor for anyone with an R9 290 or 290X, though each person must weigh whether they want lower consumption, less noise, or just want to tinker.

                  For people on a 570, 480 or 580 it's very positive to have the option of a 970: they'll gain a lot of performance and improve a lot in consumption, temperature, noise and the new extras these Maxwells bring. That's why it's positive that new things come out.

                  That's why I say that for me they're not a rip-off. They don't advance performance much, but it's good that they launch because prices move on the rest, and above all the 970, for price/performance, consumption and OC, really is an important step forward: for what a 4 GB 770 has been costing you can now have a 970, which is a clear change for the better.

                  Now I see it said that between a 290 and a 970 the difference could be 1€ a month in consumption. Maybe so, but since I've had the 780 I've saved some 130€ because I haven't needed as much air conditioning; with two 480s you had to run it high, otherwise it was like a sauna. These things count, and with Maxwell it will be even better.

                  Regards

                  The consumption-and-cost thing amuses me, because that estimate I've seen in some places as justification is a half-lie (it would be, by the way, the same lie that could have been used in the Fermi era as an argument in Fermi's favour, and yet back then it wasn't usually seen as an argument, quite the opposite, even though Fermi had something in its favour to justify the excess: the highest absolute performance).

                  Consumption is lower not only while gaming, but on the desktop and in media playback. Granted, the per-hour drip is much smaller there in watt-hours spent, but it's far more continuous over time on systems that see heavy use, and most importantly, it all adds up on top of the rest. Except for AMD's "deep idle" when the cards stop sending a signal to the monitor (and there are no desktop applications in the foreground; the conditions are not exactly few), everywhere else it's a continuous extra drip of consumption, in multi-monitor setups, etc.

                  The best part of these architectures shows when they're used "rationally" and consumption starts to diverge a lot from the sustained maximums in benchmarks. If you play with vsync or similar, the differences can be very noticeable. The difference between my GTX 560 Ti and a GTX 670 in maximum consumption wasn't important, usually even a bit higher on the GTX 670, but looking at how I actually played, with vsync or whatever, differences showed up in Kepler's favour thanks to boost and a lower demand for peak performance to produce X frames. These Maxwells seem to follow that path. I'm sure they improve consumption, but above all in this more realistic situation, rather than in top consumption figures, even ones measured in "realistic" scenarios that don't represent how many people play (top-graphics games without vsync to max out the cards).

                  For example, in multi-monitor the difference between equivalent Nvidia and AMD cards can easily be 40 W. That's a huge difference: imagine a machine used 5 hours a day outside of games, and those consumptions matter as much as or more than gaming itself (200 Wh extra per day). And if someone takes up Folding or is a mining nut (highly inadvisable in my view, a way of losing more money in energy costs than you gain), then the bill balloons; there a GM204 doesn't just save, the differences are abysmal. In plain "normal" idle the differences, though small, are like drops of water: 10 W here, 10 W there… little by little.

                  I'd easily count some 2€ a month of difference between ONE GM204 and a Hawaii, in an enthusiast machine with many hours of use and not only gaming. With multi-monitor it could even be a bit more, 3€ easily. It's not all about counting consumption in games; there's more. That won't make anyone rich or poor, but it can mean 20-30€ spent per year. And that's already part or all of what you might spend or save between different models that may look cheaper. Now give a card an average life of 3 years and it's a total worth counting.
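The desktop/multi-monitor part of that estimate is simple to reproduce (a sketch; the 40 W delta and 5 h/day are from the post, while the 0.20 €/kWh electricity price is my assumption, since the post gives no rate):

```python
# Back-of-the-envelope cost of a 40 W multi-monitor idle difference,
# 5 hours/day, at an ASSUMED 0.20 EUR per kWh (the post gives no rate).
delta_w = 40
hours_per_day = 5
price_eur_per_kwh = 0.20

daily_wh = delta_w * hours_per_day                      # 200 Wh/day, as in the post
monthly_eur = daily_wh / 1000 * 30 * price_eur_per_kwh  # ~1.2 EUR/month
yearly_eur = monthly_eur * 12

print(f"{daily_wh} Wh/day -> {monthly_eur:.2f} EUR/month, {yearly_eur:.1f} EUR/year")
```

That covers only the desktop share; adding the in-game difference is what pushes the total towards the 2-3€/month the post estimates.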

                  @Javisoft:

                  When I talk about maximum frequency in OC or boost, I mean, as I say, reference cards with the maximum available (and locked) voltage; I think that's the fair comparison ;D.

                  The thing is, I look at it from the perspective of my cards and you from yours. I understand your position, and I know Nvidia on 20 nm is going to build a proper bomb, but accepting that the average OC sits around 1500, we're back to practically a tie in fps between the two: a few up in game X and a few down in game Y.

                  If you get a good ASIC and a good chip, then your 1600 or more. 2 GHz has already been seen with LN2 from what I read, I don't remember where, I think it was Kingpin, I have to look it up; let's see if they beat the 18400 points in the famous 3DMark XD.

                  The price issue has clearly benefited the new or undecided buyer, who sees the 970 as a bargain and the ATIs falling and adjusting prices.

                  Nvidia has really slammed its fist on the table in consumption with top performance; AMD is going to have to work very hard to offer better performance/price, because in consumption I don't think they can get close to the greens.

                  By the way, a pleasure talking with you. I read your blog from time to time and the reviews you keep doing; all very interesting, really ;).

                  Regards.

                  Yes, the technical tie is almost inevitable. At stock the GTX 980 is somewhat better, but not by much; with OC, even though it has a lot in reserve, we know the 780 Ti does too, and things tend to even out. In the end it depends on how the unit or units you get turn out.

                  As a personal note, I don't believe, or don't take for granted, that a GTX 900 will exceed or even reach 1500, because I know how OC in reviews goes and what having "good luck" means. I would expect, and be quite happy with, getting past 1400 on a GM204, the same way I'd be content to reach or pass 1200 on a GK110.

                  I'm extremely picky about pseudo-stable OCs that run very well in "almost everything" and for many hours, with the typical exception of game X or whatever, which while going "almost fine" ends up exposing weaknesses hidden from the other tests. That's why my OCs are usually low; that, and that luck doesn't usually accompany us common mortals… :ffu:

                  Of 3 GK104s, none got past 1300 for me (though there was one that perhaps had the potential, a pity it was a reference GTX 670), and one even struggled to hold 1200 fully stable. Same with a GK106; and before that with Fermi, again, hard to get past 1 GHz on a GF114, not at acceptable voltages. Etc.

                  Nor am I a friend of using voltages very different from reference, because along that road consumption usually goes to hell. In the case of my old GTX 560, the damn thing, if you started raising voltages and such to get past 1 GHz, would easily raise the system's consumption 100 W, I think it even grazed 150 W extra (mind you, wall consumption, not at the PSU output, which is somewhat lower). Even if I could keep it cool, it seemed crazy to me to maintain that kind of OC just to gain 2-5% performance entirely at the cost of consumption. That's why I also don't pay much attention to OCs with heavy voltage fiddling, though I may be interested in knowing a GPU's theoretical potential.

                  Obviously, I see things from the standpoint of someone coming "with nothing" similar in their machine, where there is a benefit, above all looking at the GTX 970 :ugly:, which is the one that really impresses me. The 980, though, is much more expensive (its "right" price, looking at the GTX 970, should be around 400€, 450€ tops, for the extra power it offers, and well paid at that). And whenever I think about OC potential I count conservatively, so I don't build much extra expectation on it.

                  Obviously, interest in the GTX 900 from a user of Nvidia's top GTX 700 cards should be nil or minimal, at least from a performance standpoint.

                  • wwwendigo @ELP3

                    @ELP3:

                    Please leave the arguments for other forums, or have them in private, because maybe there are people interested in first-hand data on the GTX 900 :wall: EDIT: I see you're handling it well, so much the better ;D

                    Patagónico, very good news… PEAK consumption at the wall, from the power strip (a basic one, nothing special), of the whole system, monitor, headphones etc. included, and mind you, with an OC of 1470 MHz on all 4, running 3 rounds of Unigine!!!

                    4770K at 4600 MHz

                    I wasn't far off… 1200 W less consumption, 20% more performance.

                    Regards.

                    Temps are good for 4 cards together: none goes over 78º at 1470 MHz with an acceptable, not too aggressive fan profile.

                    Impressive to say the least: 4 cards plugged in and you don't reach 1000 W. When I think that I exceeded 550 W at the wall with a single card like a GTX 560 Ti (that model came without power control, and it went wild, even more with OC and overvoltage in Alan Wake), although under different setup conditions, it leaves me stunned.

                    Actually that consumption makes "sense", but of course, between power supply inefficiencies and other stories we're maybe talking about some 850 W drawn from the PSU by the system; that is, a quad-SLI that could run fine on a quality 1000 W PSU (a bit outside a PSU's optimal zone to have that much of its total loaded, but still acceptable).
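That wall-to-DC conversion is easy to sketch (the ~1000 W wall reading is from the post; the ~85% efficiency is an assumed, typical figure for PSUs of the era):

```python
# Wall power vs DC power delivered by the PSU, assuming ~85% efficiency
# (an assumed, typical figure; the post only gives the ~1000 W wall reading).
wall_w = 1000
efficiency = 0.85

dc_load_w = wall_w * efficiency      # ~850 W actually drawn from the PSU outputs
load_fraction = dc_load_w / 1000     # fraction of a 1000 W PSU's rating in use
print(f"DC load: {dc_load_w:.0f} W ({load_fraction:.0%} of a 1000 W unit)")
```

Running a PSU at ~85% of its rating is above the usual 50-70% sweet spot, which is exactly the "a bit out of the optimal zone, but still acceptable" caveat in the post.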

                    Now that I'm playing with the 4 already installed, I'm having problems with V-sync in many games.

                    My DELL 3011 runs at 59.9 Hz, more or less, and in games like Metro Redux, Splinter Cell Blacklist, etc. it isn't able to sync the image to those frames. They dance between 58 and 62, and it creates a tremendous feeling of constant lag. It doesn't matter what you touch in the control panel (there are more options now): not the v-sync in the game options, not the adaptive one in the control panel, nothing. The only thing that fixes it is capping the fps at 60 with RTSS; then it runs well, but honestly I don't like that at all. I don't know if incorporating G-Sync has something to do with it, or if it's because of having 4 graphics cards in, or whatever.

                    Regards.

                    What a mess. Maybe it's an SLI thing, or maybe there's an early-days problem like what happened with the GK104 at launch with vsync, especially with the then-experimental adaptive vsync. Let's see if it gets resolved soon, or if you track down the cause of the problem in your case.

                    • Javisoft (Veteranos HL) @wwwendigo

                      Try running them without boost; maybe the lag comes from drops in MHz. You can try adaptive v-sync, maybe that helps; otherwise I'd blame the drivers, which aren't polished for these cards yet.

                      • Patagonico @ELP3

                        @ELP3:

                        Now that I'm playing with the 4 already installed, I'm having problems with V-sync in many games.

                        My DELL 3011 runs at 59.9 Hz, more or less, and in games like Metro Redux, Splinter Cell Blacklist, etc. it isn't able to sync the image to those frames. They dance between 58 and 62, and it creates a tremendous feeling of constant lag. It doesn't matter what you touch in the control panel (there are more options now): not the v-sync in the game options, not the adaptive one in the control panel, nothing. The only thing that fixes it is capping the fps at 60 with RTSS; then it runs well, but honestly I don't like that at all. I don't know if incorporating G-Sync has something to do with it, or if it's because of having 4 graphics cards in, or whatever.

                        Regards.

                        Did you try the new 344.16 drivers? Perhaps they've fixed the problem:

                        Drivers | GeForce

                        Regards.

                        • ELP3 @Patagonico

                          @Patagonico:

                          Did you try the new 344.16 drivers? Perhaps they've fixed the problem:

                          Drivers | GeForce

                          Regards.

                          Yes, the same thing happens.

                          It's not able to hold the fps pinned at 60; instead it leaves you dancing between 58 and 62 fps, which creates a lot of stutter. I've already reported it on the driver forums, but since people don't notice it, they think it's a scaling problem with the 3rd and 4th cards, when it has absolutely nothing to do with that. It's simply that it doesn't keep the fps locked to the monitor's Hz. It has a workaround, as I say: capping at 60 fps on screen with RivaTuner. But I don't think it's acceptable to resort to third-party programs when I had never had this problem with Nvidia. I imagine the people who actually have to look at driver incident reports will pay attention. What I don't know is whether it only happens with multi-GPU or also with a single card.

                          Regards
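Conceptually, an external frame cap of the kind RTSS applies is just a fixed-interval render loop (purely an illustration; real limiters hook the GPU's present call and use high-resolution waits rather than sleeping like this):

```python
import time

# Minimal sketch of what an external frame limiter does conceptually:
# pad each iteration of a render loop out to a fixed time slot.
TARGET_FPS = 60
frame_time = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def render_frame():
    pass  # placeholder for the actual rendering work

start = time.perf_counter()
for _ in range(3):  # a few frames for demonstration
    frame_start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - frame_start
    if elapsed < frame_time:
        time.sleep(frame_time - elapsed)  # wait out the rest of the slot
total = time.perf_counter() - start
print(f"3 frames in {total:.3f}s")
```

Capping *below* the refresh rate like this keeps the GPU from oscillating around it, which is why the RTSS cap smooths out the 58-62 fps dance described above.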

                          • Patagonico @ELP3

                            @ELP3:

                            Yes, the same thing happens.

                            It's not able to hold the fps pinned at 60; instead it leaves you dancing between 58 and 62 fps, which creates a lot of stutter. I've already reported it on the driver forums, but since people don't notice it, they think it's a scaling problem with the 3rd and 4th cards, when it has absolutely nothing to do with that. It's simply that it doesn't keep the fps locked to the monitor's Hz. It has a workaround, as I say: capping at 60 fps on screen with RivaTuner. But I don't think it's acceptable to resort to third-party programs when I had never had this problem with Nvidia. I imagine the people who actually have to look at driver incident reports will pay attention. What I don't know is whether it only happens with multi-GPU or also with a single card.

                            Regards

                            These 344.11 drivers are not working well. I installed them yesterday to see if Dead Rising 3 improved, and in the middle of the installation the DisplayPort screen started to flicker. I finished the installation and got a message that there was an error in the system; it restarted and I had no monitor any more, it kept flickering, turning on and off. I managed to boot into safe mode, but through the TV, not the monitor. I uninstalled them, put the 340.52 back, and everything works correctly.

                            We'll have to wait.

                            Regards.

                            • fjavi @Patagonico

                              @Patagonico:

                              These 344.11 drivers are not working well. Yesterday I installed them to see if Dead Rising 3 would improve, and during installation the DisplayPort screen started flickering. I finished the installation and got a message that there was an error on the computer; it restarted, and I had no monitor, my DisplayPort output kept flickering, turning on and off. I was able to boot into safe mode, but only through the TV, not the monitor. I uninstalled them and installed the 340.52 drivers, and everything is working correctly.

                              We'll have to wait.

                              Regards.

                              They don't fail for me, although I don't have anything connected via DP; via HDMI they don't fail, so I'm leaving them installed. I installed them without even restarting; I have restarted since, and everything works.

                              Regards

                              @ELP3:

                              Yes, it's the same for me.

                              It's not able to stabilize the 60 fps. Instead, it leaves it fluctuating between 58 to 62 fps. This creates a lot of stutter. I've already commented on this in the driver forums, but since people don't realize, they think it's a problem with the scaling of the 3rd and 4th graphics when it has absolutely nothing to do with it. It's simply that it doesn't keep the fps fixed with the monitor's HZ. This has a solution as I say, by blocking 60 fps on the screen with Riva Tuner. But I don't think it's acceptable to use third-party programs when I've never had this problem with NVIDIA. I imagine that those who really have to look at the incidents in the drivers will pay attention. What I don't know is if it only happens with multi-gpu or with any single one.

                              Regards

                              Isn't Manuel G around? I mean, the driver must still be very green for all this; now that they've added more features to the new ones, they probably have quite a mess getting everything to work. They should make a separate driver for these cards and unify them later.

                              Regards

                              1 Respuesta Última respuesta Responder Citar 0
                              • H Desconectado
                                h2omadrid
                                Última edición por

                                Congratulations ELP3 for the gadgets.

They put up good bench results, but what interests me are the games. You've had both chips, this one and the GK110; have you been able to test enough to comment on how Maxwell compares to your TITANs in terms of stability, leaving aside the issue of v-sync not pinning the 60 fps?

                                Regards and thanks.

PS: some beta drivers have come out; let's see if they fix the 60 fps problem.

                                Sent from my iPhone using Tapatalk

                                ELP3E 1 Respuesta Última respuesta Responder Citar 0
                                • ELP3E Desconectado
                                  ELP3 @h2omadrid
                                  Última edición por

                                  @h2omadrid:

                                  Congratulations ELP3 on the gadgets.

They put up good bench results, but what interests me are the games. You've had both chips, this one and the GK110; have you been able to test enough to comment on how Maxwell compares to your TITANs in terms of stability, leaving aside the issue of v-sync not pinning the 60 fps?

                                  Regards and thanks.

PS: beta drivers have come out; let's see if they fix the 60 fps problem.

                                  Sent from my iPhone using Tapatalk

                                  Hello.

In games the general rule is that they're a bit more powerful than the TITANs, usually by about 15-20%. Examples:

Tomb Raider, completely maxed out at my resolution (1600p) with 4xSSAA: it was practically impossible for the TITANs to hold 60 fps in the cutscenes rendered with the game's engine, and these cards manage it. The Witcher 2 with the hyperreal profile is exactly the same: these hold a constant 60 fps in cutscenes where the others couldn't reach it. However, Metro 2033 Redux runs worse on these than on the TITANs, noticeably worse, around 5-7 fps. Yet Last Light Redux runs better... it's curious.

And the v-sync issue, although solvable, is very important to me. We're talking about the thing that generates stuttering, what I hate most in the world in multi-GPU. So it's a very important point to keep in mind.

                                  Regards.

PS: Ah! I forgot to mention one thing: now in SLI (or at least with 4 cards) each card no longer has its own boost. Instead they all sync, but to the card with the worst boost. In my case that's a bit of a pain, since the good ones have 20 or 30 MHz of extra boost that gets lost along the way. And if one of them drops boost, whether due to temperature or TDP, the others do the same... etc.
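The boost behaviour described here reduces to taking the minimum across the group. A tiny hypothetical illustration (the clock values are made up, not measured):

```python
def synced_boost(per_card_boost_mhz):
    """Under the new SLI scheme every card in the group runs at the
    boost clock of the weakest card, so headroom on the better cards
    is simply lost."""
    floor = min(per_card_boost_mhz)
    return [floor] * len(per_card_boost_mhz)

# Four cards; the better ones lose their extra 20-30 MHz of boost.
boosts = [1380, 1410, 1405, 1392]        # hypothetical per-card boosts
print(synced_boost(boosts))              # → [1380, 1380, 1380, 1380]
```

The same logic applies when one card throttles for temperature or TDP: its lower clock becomes the new floor for all of them.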

                                  F 1 Respuesta Última respuesta Responder Citar 0
                                  • F Desconectado
                                    fjavi @ELP3
                                    Última edición por

                                    @ELP3:

                                    Hello.

In games the general rule is that they're a bit more powerful than the TITANs, usually by about 15-20%. Examples:

Tomb Raider, completely maxed out at my resolution (1600p) with 4xSSAA: it was practically impossible for the TITANs to hold 60 fps in the cutscenes rendered with the game's engine, and these cards manage it. The Witcher 2 with the hyperreal profile is exactly the same: these hold a constant 60 fps in cutscenes where the others couldn't reach it. However, Metro 2033 Redux runs worse on these than on the TITANs, noticeably worse, around 5-7 fps. Yet Last Light Redux runs better... it's curious.

And the v-sync issue, although solvable, is very important to me. We're talking about the thing that causes stuttering, what I hate most in the world in multi-GPU. So it's a very important point to keep in mind.

                                    Regards.

PS: Ah! I forgot to mention one thing: now in SLI (or at least with 4 cards) each card no longer has its own boost. Instead they all sync, but to the card with the worst boost. In my case that's a bit of a pain, since the good ones have 20 or 30 MHz of extra boost that gets lost along the way. And if one of them drops boost, whether due to temperature or TDP, the others do the same... etc.

Before, they also synchronized, but at the frequencies of the master card; that was before Kepler and boost. Now it's the other way around: it puts them all at the frequency of the worst card, and with 4 cards it's noticeable.

I like them better synchronized like this, but of course if you get a bad one it drags the rest down, it takes performance away from all of them. If you get good ones, then syncing the frequencies is fine, because according to the drivers they'll get something more stable out of it.

The v-sync issue is a problem, although I suppose they'll fix it with drivers. They should have released exclusive drivers for these cards, tried to fix those things quickly, and then unified them.

People are going crazy wanting to replace a 780 Ti or an SLI of 780s, when they could perfectly well wait a while and upgrade to a better Maxwell. They only have to look at the TDP: the 970 has 145W, even less than a GTX 660 at 150W, and it performs twice as well, with OC possibly more than double since the 660 doesn't clock up. It's clear that as soon as they release something at 200 or 220W it will be a beast, and that's when it gets interesting to replace a GK110, because it should give a significant margin; and whatever the difference is at 28nm, at 20nm it may be even greater.

                                    Regards

                                    ELP3E 1 Respuesta Última respuesta Responder Citar 0
                                    • ELP3E Desconectado
                                      ELP3 @fjavi
                                      Última edición por

                                      @fjavi:

Before, they also synchronized, but at the frequencies of the master card; that was before Kepler and boost. Now it's the other way around: it puts them all at the frequency of the worst card, and with 4 cards it's noticeable.

I like them better synchronized like this, but of course if you get a bad one it drags the rest down, it takes performance away from all of them. If you get good ones, then syncing the frequencies is fine; depending on the drivers they release, they'll get something more stable out of it.

The v-sync issue is a problem, although I suppose they'll fix it with drivers. They should have released exclusive drivers for these cards, tried to fix those things quickly, and then unified them.

People are going crazy wanting to replace a 780 Ti or an SLI of 780s, when they could perfectly well wait a while and upgrade to a better Maxwell. They only have to look at the TDP: the 970 has 145W, even less than a GTX 660 at 150W, and it performs twice as well, with OC possibly more than double since the 660 doesn't clock up. It seems that as soon as they release something at 200 or 220W it will be a beast, and that's when it gets interesting to replace a GK110, because it should give a significant margin; and whatever the difference is at 28nm, at 20nm it may be even greater.

                                      Regards

People will do what they want. But it's silly to swap a GTX 780, TITAN, or 780 Ti for these.

Undoubtedly these perform better and consume much less, but the change makes no sense. I did it simply because one of my TITANs died, and by now repairing it isn't worth it, nor can you get a replacement.

The jump to 20nm has to be huge. There the leap in percentage terms will be substantial.

Now, of course, for someone starting from zero, a 970 is undoubtedly the best price/performance option. Not to say the only one.

Regards.

                                      F 1 Respuesta Última respuesta Responder Citar 0
                                      • F Desconectado
                                        fjavi @ELP3
                                        Última edición por

                                        @ELP3:

People will do what they want. But it's silly to swap a GTX 780, TITAN, or 780 Ti for these.

Undoubtedly these perform better and consume much less, but the change makes no sense. I did it simply because one of my TITANs died, and by now repairing it isn't worth it, nor can you get a replacement.

The jump to 20nm has to be huge. There the leap in percentage terms will be substantial.

Now, of course, for someone starting from zero, a 970 is undoubtedly the best price/performance option. Not to say the only one.

Regards.

Of course, for people with a 480, 570, 580, even a 660, it's a very big step up, and they'll improve a lot in temperature, noise and power consumption; that goes even for some 600-series cards.

At 20nm the step should be even better, though I fear they'll make it quite expensive; they'll say the process is costly and perhaps release it drop by drop. That's why they might release something at 28nm with a 200 or 220W TDP, because it seems they can stretch this architecture well and it will be more profitable than Kepler.

The bad thing is that it should also pay off for buyers, but they'll probably release something bigger at 28nm and rebrand these as, for example, a GTX 170 and GTX 160, or GTX 1600, depending on what they want to call them.

Regards

                                        ELP3E 1 Respuesta Última respuesta Responder Citar 0
                                        • ELP3E Desconectado
                                          ELP3 @fjavi
                                          Última edición por

                                          @fjavi:

Of course, for people with a 480, 570, 580, even a 660, it's a very big step up, and they'll improve a lot in temperature, noise and power consumption; that goes even for some 600-series cards.

At 20nm the step should be even better, though I fear they'll make it quite expensive; they'll say the process is costly and perhaps release it drop by drop. That's why they might release something at 28nm with a 200 or 220W TDP, because it seems they can stretch this architecture well and it will be more profitable than Kepler.

The bad thing is that it should also pay off for buyers, but they'll probably release something bigger at 28nm and rebrand these as, for example, a GTX 170 and GTX 160, or GTX 1600, depending on what they want to call them.

Regards

And from a 680 too. We're talking about a card that, with OC, takes on even the dual 690 without breaking a sweat.

                                          F 1 Respuesta Última respuesta Responder Citar 0
                                          • F Desconectado
                                            fjavi @ELP3
                                            Última edición por

                                            @ELP3:

And from a 680 too. We're talking about a card that, with OC, takes on even the dual 690 without breaking a sweat.

Of course, I said 600 series; Nvidia has compared them against the 680. What I was referring to was more the price ranges and the potential buyers: some people won't spend €500 or more on a card, so I think the ones that will sell a lot are the 970 and, when it comes out, the 960.

The 980 will have its market, but where they'll sell a lot is with the 970, and if it comes out well priced the 960 will sweep the sales.

But in Nvidia's comparison tables they've put the 680, which is the card they think people will upgrade from to these: 600 series and below, although in the end many 700-series owners, with cards like the 760 and 770, will upgrade too.

PS: you can tell very few units came in, which is why prices went up. As soon as stock arrives from all brands, prices will stabilize and drop a bit.

                                            http://www.wipoid.com/tarjetas-graficas-nvidia/4139-msi-geforce-gtx-970-gaming-4gb-gddr5-4719072365752.html
                                            http://www.wipoid.com/tarjetas-graficas-nvidia/4149-zotac-geforce-gtx-970-4895173605369.html
                                            http://www.wipoid.com/tarjetas-graficas-nvidia/4153-evga-geforce-gtx-970-acx-20-4gb-gddr5-4250812406569.html

Until the 25th or 26th this can be considered a paper launch; very few units have arrived.

                                            Regards

                                            ELP3E 1 Respuesta Última respuesta Responder Citar 0
