-
Hello hardlimiters,
I have several doubts about these two formats and I was hoping someone could point me in the right direction:
I recently discovered that my monitor (a Hanns-G big mess of a 19") has a DVI connection, but I'm not sure whether this type of connection is meant for watching things in HD or is simply a VGA prepared for LCD monitors.
The thing is that my card (an nVidia GeForce 8600GT) has 3 outputs: HDMI, DVI and VGA (I have it connected by VGA because I didn't even know the monitor had other inputs). ¬¬
From what I've seen the output of the card is DVI M1-DA, and the input of the monitor is DVI-D.
The questions are:
- Can I connect my monitor to the DVI input and gain image quality and/or sharpness?
- Should I connect HDMI-to-DVI or DVI-to-DVI?
- What type of cable would I need?
I would like to know which option is best for me, and whether I'll actually gain image quality, before spending money on a cable that might not work…
Thank you very much in advance, all kinds of suggestions are accepted! :ugly:
-
Hello buddy
The truth is that you won't gain anything between VGA and DVI. With HDMI you would gain, but since your monitor doesn't have an HDMI output, there's nothing you can do.
Best regards.
-
The truth is that you won't gain anything between VGA and DVI. With HDMI you would gain, but since your monitor doesn't have an HDMI output, there's nothing you can do.
I see it the other way around… going from DVI to HDMI, unless you need the audio, you won't notice anything; but from VGA to DVI we're talking about going from an analog image (with its respective conversions) to a digital one. Generally, DVI has always produced sharper images in 2D (especially text).
At least that's how I understood it xD There's no reason to be using VGA when you have a DVI connection available.
-
Hello buddy
The truth is that you won't gain anything between VGA and DVI. With HDMI you would gain, but since your monitor doesn't have an HDMI output, there's nothing you can do.
Best regards.
Ummm… Sylver, it's input, not output: the monitor receives the signal, it doesn't emit it.
-
Alright, alright, don't eat me alive... xD I'll tell you: honestly, I haven't noticed a difference between VGA and DVI after trying it out. However, when I switched from VGA or DVI to HDMI, I did notice one. That's my experience so far xD KEEP IN MIND that it also depends largely on the monitor, because a 19" TFT from any old brand that's a few years old is not the same as a 21" LG LCD (not to go too far) that's barely 15 months old (just to give some example values).
Best regards
-
If you have DVI on both ends… which cable will you need? It sounds like a question from a late-night quiz show, for €100, come on guys, call now! :ugly:
Well, you will probably notice a little more sharpness, but don't expect too much, because it's more noticeable at higher resolutions. P.S. As Krampak says, DVI can carry an analog video signal (like VGA) and a digital one (like HDMI), doubled in the case of dual link, and even USB in some cases; what it can't carry is digital audio like HDMI does.
-
Hello buddy
The truth is that you won't gain anything between VGA and DVI. With HDMI you would gain, but since your monitor doesn't have an HDMI output, there's nothing you can do.
Best regards.
¬¬ VGA has lower quality. DVI and HDMI have the same theoretical quality, since both are digital signals, so the real quality depends on the cable (depending on the cable's quality it either works well or works badly; when the signal is digital there is no middle ground, and paying extra for a cable is silly).
Personally I prefer DVI: even though it has the same quality as HDMI, the latter tends to make worse contact in the connectors and tends to fail; besides, anything carrying the words 'high definition' gets an extra markup just because.
Alright, alright, don't eat me alive... xD I'll tell you: honestly, I haven't noticed a difference between VGA and DVI after trying it out. However, when I switched from VGA or DVI to HDMI, I did notice one. That's my experience so far xD KEEP IN MIND that it also depends largely on the monitor, because a 19" TFT from any old brand that's a few years old is not the same as a 21" LG LCD (not to go too far) that's barely 15 months old (just to give some example values).
Best regards
The fact that you see a better image over HDMI than over DVI has a name... have you heard of the placebo effect? :troll:
If you have DVI on both ends… which cable will you need? It sounds like a question from a late-night quiz show, for €100, come on guys, call now! :ugly:

what did you say? :troll:
-
¬¬ VGA has lower quality. DVI and HDMI have the same theoretical quality, since both are digital signals, so the real quality depends on the cable (depending on the cable's quality it either works well or works badly; when the signal is digital there is no middle ground, and paying extra for a cable is silly).
Personally I prefer DVI: even though it has the same quality as HDMI, the latter tends to make worse contact in the connectors and tends to fail; besides, anything carrying the words 'high definition' gets an extra markup just because.
The fact that you see a better image over HDMI than over DVI has a name... have you heard of the placebo effect? :troll:

what did you say? :troll:
What a smart guy :ugly:
I don't think there's a huge difference between VGA and DVI, and as for the placebo effect… put two identical monitors side by side, connected to the same graphics card, one over DVI and the other over HDMI, with exactly the same resolution settings... then you tell me. Try it, go on :troll:
-
Well, I'm sorry to tell you that the cable is the same :risitas:
There is a difference: take any 1080p or higher monitor and you will clearly see the difference in sharpness.

-
With my monitor it is noticeable, although I suppose a good screen with a good converter would make it less so.
-
Well, all opinions are appreciated; I didn't expect so much activity in a single day... :fuckyea: Thanks, everyone!
Although I'm still not clear on whether the cable is the same in all cases or whether it has to have different pins. :ugly:
I talked to the sales assistant at the store in my neighborhood and he told me that an HDMI-to-DVI cable is a risky bet, because half the time they don't work or the screen shows up in black and white, which is usually a trick the Chinese use to deceive people and charge more for the HD acronym (verbatim, I swear).
In short, in my case he recommends DVI-to-DVI, since I'm not going to get any more quality out of my crappy monitor anyway...
So this afternoon I'll go get a DVI-to-DVI cable, taking a little drawing of the pins that both the card and the monitor have... and we'll see what they tell me, whether they have it or not.
If they have the cable I need, I'll report back on whether I've noticed an improvement...
-
I got a dual-link cable because it was the same price, but if I had found a single-link one a few euros cheaper, I would have saved that money, because you don't need dual link for 1080p.
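To put some numbers on that, here's a rough back-of-the-envelope sketch in Python; the timing values are the standard CEA-861 ones for 1080p60, nothing measured from anyone's setup in this thread:

```python
# Why single-link DVI is enough for 1080p60: compare the pixel clock the
# mode needs against the 165 MHz single-link limit.
h_total = 2200   # horizontal pixels per line, including blanking (CEA-861)
v_total = 1125   # lines per frame, including blanking (CEA-861)
refresh = 60     # Hz

pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(f"1080p60 needs a ~{pixel_clock_mhz:.1f} MHz pixel clock")   # ~148.5 MHz
print(f"fits in a 165 MHz single link: {pixel_clock_mhz <= 165}")  # True
```

148.5 MHz is under the 165 MHz single-link ceiling, so dual link only starts to matter above 1920×1200 or at high refresh rates.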
If you're buying it in a physical store, I don't think they'll have any problem with you returning it.
-
Don't overthink it or beat around the bush:
DVI-to-DVI is clearly superior in quality and sharpness to VGA-to-VGA.
VGA connector (D-sub 15-pin):
But this type of connector, which works quite well for CRT monitors, is not capable of providing sufficient image quality on TFT monitors and other similar types.
This is because, regardless of the type of graphics card, the connection to the monitor is made in analog form.
The color depth is defined simply by voltage, so in theory an SVGA or VGA monitor (of the CRT, or cathode ray tube, type) has practically no limit on the number of colors it can display.
The brightness of each color is determined by varying the intensity of the beam as it moves along the corresponding line.
But this does not work the same way on a TFT monitor, which, as we know, is the kind mostly used today.
That is because this type of screen does not use the cathode ray system; instead it works with a matrix of pixels, and a brightness value must be assigned to each of them. This is done by the decoder, which samples the input voltage at regular intervals.
This system poses a problem when both the transmitting source (in this case the card) and the receiver (in this case the TFT monitor) are digital, as it forces the samples to be taken at the very center of each pixel to avoid noise and color distortion.
Among other things, this means that both the tone and the brightness of a pixel can be affected by those of the pixels around it.
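A little toy model of that sampling problem (my own illustration, not from any spec; the numbers are made up): the monitor's converter reads the analog voltage at regular intervals, and if the sample point drifts off the pixel center, what it reads is a blend of neighboring pixels.

```python
# Toy model of analog (VGA-style) sampling on a TFT: a mis-phased sample
# point blends each pixel with its neighbor, smearing sharp edges.
def sample(pixels, phase):
    """Sample each pixel at offset `phase` past its center
    (0.0 = dead center, 0.5 = halfway to the next pixel)."""
    out = []
    for i in range(len(pixels)):
        nxt = pixels[min(i + 1, len(pixels) - 1)]
        out.append((1 - phase) * pixels[i] + phase * nxt)
    return out

row = [0, 255, 0, 255, 0]   # a sharp black/white, text-like pattern
print(sample(row, 0.0))     # [0.0, 255.0, 0.0, 255.0, 0.0] -> perfect
print(sample(row, 0.3))     # [76.5, 178.5, 76.5, 178.5, 0.0] -> grays, soft edges
```

Over a digital DVI link the exact values travel as bits, so there is no phase to get wrong, which is why text tends to look crisper.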
DVI connector:
The DVI format does this differently: since it is a digital format, the brightness of each pixel is transmitted as a binary code.
This means that when a TFT screen works over a DVI connection and at its native resolution (remember that TFT screens have a native resolution, which is where they deliver their maximum quality), each output pixel corresponds to one pixel on the screen, so every pixel keeps its full color, quality and brightness.
Obviously, for this to happen both elements (graphics card and monitor) must have digital connections (DVI or HDMI).
DVI connectors are capable of transmitting both analog and digital signals in one of their variants (DVI-I), which is the one used by graphics cards.
There are three types:
DVI-D transmits only digital.
DVI-A transmits only analog.
DVI-I transmits both digital and analog signal.
In turn, the DVI-D and DVI-I types can be dual (DL, or Dual Link), that is, they can carry two links.
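Since the thread keeps coming back to which plug fits which, here is that compatibility logic as a minimal sketch (my own summary of the list above; the `DVI_TYPES` table and `can_link` function are just illustrative names):

```python
# What each DVI variant carries, per the three types listed above.
DVI_TYPES = {
    "DVI-D": {"digital": True,  "analog": False},   # digital only
    "DVI-A": {"digital": False, "analog": True},    # analog only
    "DVI-I": {"digital": True,  "analog": True},    # integrated: both
}

def can_link(source: str, sink: str) -> bool:
    """True if the two connector types share at least one signal type."""
    s, d = DVI_TYPES[source], DVI_TYPES[sink]
    return (s["digital"] and d["digital"]) or (s["analog"] and d["analog"])

# The case from this thread: a card's DVI-I output into a monitor's
# DVI-D input works, and it works digitally.
print(can_link("DVI-I", "DVI-D"))   # True
```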
HDMI connector:
HDMI (High-Definition Multimedia Interface) is currently the most widely used type of connector and, of course, the newest.
The main difference from the other types, and in particular from DVI, is that apart from transmitting the digital video signal it is also capable of transmitting audio.
And both without compression. This connection offers a bandwidth of almost 5 gigabits per second, which is why it is used to send high-definition signals: 1920×1080 pixels (1080i, 1080p) or 1280×720 pixels (720p).
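To see why almost 5 Gbit/s is enough for 1080p, a quick back-of-the-envelope sketch (standard figures, nothing specific to this thread):

```python
# Raw 1080p60 video versus what a single TMDS link can carry.
width, height, fps, bpp = 1920, 1080, 60, 24
raw_gbps = width * height * fps * bpp / 1e9   # ~2.99 Gbit/s of pixel data

# A single HDMI/DVI link runs 3 TMDS channels at up to 165 MHz; TMDS puts
# 10 bits on the wire for every 8 bits of pixel data.
wire_gbps = 165e6 * 10 * 3 / 1e9              # 4.95 Gbit/s on the wire
data_gbps = 165e6 * 8 * 3 / 1e9               # 3.96 Gbit/s of payload

print(f"raw 1080p60: ~{raw_gbps:.2f} Gbit/s")
print(f"link payload: ~{data_gbps:.2f} Gbit/s, wire rate: ~{wire_gbps:.2f} Gbit/s")
```

The 4.95 Gbit/s wire rate is where the "almost 5 gigabits per second" figure comes from, and the roughly 3 Gbit/s of raw 1080p60 fits comfortably inside the link's payload capacity.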
There are three types of HDMI connectors:
The usual HDMI connector is type A, which has 19 pins and is backward compatible with a single DVI link; it is the one used by modern LCD monitors and graphics cards.
This means that a DVI source can be connected to an HDMI monitor, or vice versa, by means of an appropriate adapter.
The HDMI type B connector has 29 pins and is hardly widespread at present, as it was designed for resolutions higher than the 1080p format (1920×1080 pixels).
And the type C connector is the same as type A but smaller; it is to type A what mini-USB is to USB.
Within the types of HDMI we find three specifications:
HDMI 1.0 (December 2002).
A single digital audio/video connection cable with a maximum bitrate of 4.9 Gbit/s. Supports up to 165 Mpixels/s in video mode (1080p at 60 Hz, or UXGA) and 8-channel/192 kHz/24-bit audio.
HDMI 1.2 (August 2005).
This specification adds support for One Bit Audio (used in Super Audio CDs) with up to 8 channels, and makes the HDMI Type A connector available for PC use.
HDMI 1.3 (June 2006).
Bandwidth is increased to 340 MHz (10.2 Gbit/s), and support for Dolby TrueHD and DTS-HD is added.
TrueHD and DTS-HD are lossless audio formats used in HD DVD and Blu-ray Disc.
This specification also introduces a new mini-connector format for camcorders.
I THINK IT IS CLEAR
regards
-
I think the copy&paste is getting out of hand :risitas: