VGA vs DVI connection to monitor



 
 
#1 - January 21st 11, 09:55 PM, posted to rec.photo.digital
N[_9_]

On 22/01/2011, DaveS wrote:
The stuff I find using Google doesn't seem to be authoritative in any way on
this question.

I am receiving a new monitor (Dell U2311h) next week, and it can be connected
by various types of cable. My graphics card can use VGA or DVI. The question
is, will I experience any benefit by paying for and using a DVI connection
over using the included VGA connection?

I'm interested specifically in responses relating to photo editing.

Dave S.


Why do you think the DVI connection will cost you more money? There'll
be a DVI cable in the box.


#2 - January 22nd 11, 05:57 AM, posted to rec.photo.digital
N[_9_]

On 22/01/2011, DaveS wrote:
On 1/21/2011 3:55 PM, N wrote:
On 22/01/2011, DaveS wrote:
[]


Why do you think the DVI connection will cost you more money? There'll
be a DVI cable in the box.



I set out to prove you wrong, but I stand corrected:
What's in the Box

Monitor with stand
Power Cable
DVI Cable
VGA Cable (attached to the monitor)
Drivers and Documentation media
USB upstream cable
Quick Setup Guide
Safety Information

I believe that I have purchased and set up LCD monitors for others where
there was a DVI connector but no cable. Clearly, there is no cost for me to
find out for myself if there is any visible difference with this monitor.

Dave S.


My new Dell PC at work has a video card with 3 DisplayPort connections.

http://en.wikipedia.org/wiki/List_of_display_interfaces


#3 - January 22nd 11, 07:17 AM, posted to rec.photo.digital
Floyd L. Davidson

DaveS wrote:
Clearly, there is no cost for me
to find out for myself if there is any visible difference with this monitor.


Whether you think you can see it on any given displayed
image or not, use the DVI connection.

I won't go so far as to say digital data is vastly
better than analog data, but it is certainly better and
you get significantly improved precision. Another point
is that with age the VGA interface will drift far more
than will the DVI interface.
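
In Python terms, a minimal sketch of that drift point, assuming a simple gain/offset model with invented figures (none of the numbers or function names come from the thread): drift in the analog path shifts every level the monitor recovers, while a digital link simply hands over the original 8-bit code.

# Hypothetical illustration: analog gain/offset drift vs a bit-exact digital link.
def vga_with_drift(code, gain=1.03, offset=0.01):
    """Model a VGA path: 8-bit code -> 0..1 voltage -> drifted voltage -> 8-bit level."""
    voltage = code / 255.0                         # DAC in the graphics card
    drifted = voltage * gain + offset              # assumed aging/temperature drift
    return max(0, min(255, round(drifted * 255)))  # level the monitor actually displays

def dvi(code):
    """Model a DVI path: the 8-bit code arrives unchanged (ignoring link errors)."""
    return code

for code in (16, 128, 240):
    print(code, "-> VGA with drift:", vga_with_drift(code), " DVI:", dvi(code))

With the assumed 3% gain and 1% offset error, every level lands a few counts away from where the image file put it; that is the kind of slow shift an analog path can accumulate and a digital one cannot.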

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)
#4 - January 24th 11, 01:17 AM, posted to rec.photo.digital
Robert Coe

On Fri, 21 Jan 2011 22:17:18 -0900, (Floyd L. Davidson)
wrote:
: DaveS wrote:
: Clearly, there is no cost for me
: to find out for myself if there is any visible difference with this monitor.
:
: Whether you think you can see it on any given displayed
: image or not, use the DVI connection.
:
: I won't go so far as to say digital data is vastly
: better than analog data, but it is certainly better and
: you get significantly improved precision. Another point
: is that with age the VGA interface will drift far more
: than will the DVI interface.

DVI might be slightly more resistant to RF interference, especially if the
cable is long. But in normal use, it's very unlikely that you'll be able to
see any difference in image quality. That said, there's no reason not to take
Floyd's advice: if your card supports DVI, you might as well use it.

I have two dual-monitor setups at work, one of which uses one monitor on DVI
and one on VGA. On that setup, I can see a slight color difference between the
two monitors, but not enough to be annoying. On the setup with two DVI
monitors connected to the same video card, the colors look identical (given
identical settings of the monitors, of course).

Bob
#5 - January 24th 11, 03:25 AM, posted to rec.photo.digital
Floyd L. Davidson

Robert Coe wrote:
On Fri, 21 Jan 2011 22:17:18 -0900, (Floyd L. Davidson)
wrote:
[]

DVI might be slightly more resistant to RF interference, especially if the
cable is long. But in normal use, it's very unlikely that you'll be able to
see any difference in image quality. That said, there's no reason not to take
Floyd's advice: if your card supports DVI, you might as well use it.


In normal use it should be an obvious difference. A
digital interface sends a specific discrete value to the
monitor. It is the exact same value each time, and is
calculated from the value in the digital image file. It
doesn't change, and has the same accuracy each time.

The VGA interface has to convert the digital value to an
analog value, and then the monitor has to use the
timing of a dot clock to pick out the precise time that
the right value is made available. It is not nearly as
precise as the process used by the digital interface.
It can never be as accurate.
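
A rough sketch of that dot-clock point, using a crude waveform model (the rise time, pixel values, and sampling phases are all invented for illustration): if the monitor samples the analog line a little late near a sharp edge, it recovers a blend of neighbouring pixels, whereas a digital interface delivers the original codes untouched.

# Hypothetical illustration: sampling a VGA-like waveform with an off-phase dot clock.
def analog_level(pixels, t, rise=0.2):
    """Voltage of a crude VGA waveform at time t (in pixel periods); a finite
    rise time blends adjacent pixel levels near each transition."""
    i = max(0, min(len(pixels) - 1, int(t)))
    frac = t - i
    if frac > 1 - rise and i + 1 < len(pixels):
        mix = (frac - (1 - rise)) / rise           # how far into the transition we are
        return pixels[i] * (1 - mix) + pixels[i + 1] * mix
    return float(pixels[i])

pixels = [0, 0, 255, 255, 0]                       # a sharp black/white edge
for phase in (0.5, 0.9):                           # well-centred vs late sampling point
    sampled = [round(analog_level(pixels, i + phase)) for i in range(len(pixels))]
    print("sampling phase", phase, "->", sampled)
print("digital link        ->", pixels)            # DVI: codes arrive unchanged

Sampled at the centre of each pixel period the edge comes back clean; sampled late it comes back as 0, 128, 255, 128, 0, i.e. smeared. Real monitors auto-adjust clock and phase, so in practice the error is usually smaller than this toy model suggests.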

I have two dual-monitor setups at work, one of which uses one monitor on DVI
and one on VGA. On that setup, I can see a slight color difference between the
two monitors, but not enough to be annoying.


But it *is* different! The difference is error.

On the setup with two DVI
monitors connected to the same video card, the colors look identical (given
identical settings of the monitors, of course).


No error.

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)

#6 - January 24th 11, 07:49 AM, posted to rec.photo.digital
Eric Stevens

On Sun, 23 Jan 2011 18:25:34 -0900, (Floyd L.
Davidson) wrote:

Robert Coe wrote:
On Fri, 21 Jan 2011 22:17:18 -0900,
(Floyd L. Davidson)
wrote:
[]

I have two dual-monitor setups at work, one of which uses one monitor on DVI
and one on VGA. On that setup, I can see a slight color difference between the
two monitors, but not enough to be annoying.


But it *is* different! The difference is error.

On the setup with two DVI
monitors connected to the same video card, the colors look identical (given
identical settings of the monitors, of course).


No error.


Unless the monitors are calibrated, it might be two different errors.



Eric Stevens
#8 - January 24th 11, 12:50 PM, posted to rec.photo.digital
David J Taylor[_16_]

In normal use it should be an obvious difference. A
digital interface sends a specific discrete value to the
monitor. It is the exact same value each time, and is
calculated from the value in the digital image file. It
doesn't change, and has the same accuracy each time.

[]
--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)


Maybe it /should/, but in practice it does not (at least on correctly
adjusted monitors).

Cheers,
David

#9 - January 24th 11, 01:24 PM, posted to rec.photo.digital
Floyd L. Davidson

"David J Taylor" wrote:
In normal use it should be an obvious difference. A
digital interface sends a specific discrete value to the
monitor. It is the exact same value each time, and is
calculated from the value in the digital image file. It
doesn't change, and has the same accuracy each time.

[]

Maybe it /should/, but in practice it does not (at least on correctly
adjusted monitors).


I don't agree with your statement at all. In practice
with a digital interface it sends *exactly* the same
value every time.

The problem for the analog interface is that it isn't
exactly the same every time.

And that of course is precisely the distinction between
digital and analog when it is affected by noise. The
digital system can function with a much lower SNR than
can an analog system. It's fundamental.
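
A small sketch of that SNR point, with invented noise figures (nothing here is from the thread): the same additive noise that lands directly on an analog level barely threatens a two-level digital signal, because each bit only has to be told apart from its opposite state.

# Hypothetical illustration: one noisy channel, analog vs digital transmission.
import random
random.seed(1)

def noisy(v, sigma):
    return v + random.gauss(0, sigma)              # additive Gaussian channel noise

value = 200                                        # an 8-bit pixel level to send
sigma = 0.03                                       # assumed noise, in full-scale units

# Analog path: the level itself rides on the wire, so the noise lands on the value.
analog_rx = round(max(0.0, min(1.0, noisy(value / 255.0, sigma))) * 255)

# Digital path: each bit is sent as 0.0 or 1.0 and recovered by thresholding at 0.5.
bits = [(value >> i) & 1 for i in range(8)]
rx_bits = [1 if noisy(b, sigma) > 0.5 else 0 for b in bits]
digital_rx = sum(b << i for i, b in enumerate(rx_bits))

print("sent", value, "| analog received", analog_rx, "| digital received", digital_rx)

At this noise level the digital value comes back exact (a bit error would need the noise to cross half the signal swing), while the analog value typically comes back several counts off. Push the noise high enough and the digital link fails too, but it fails much later.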

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)
#10 - January 24th 11, 03:47 PM, posted to rec.photo.digital
David J Taylor[_16_]

"Floyd L. Davidson" wrote in message
...
"David J Taylor" wrote:
[]


I don't agree with your statement at all. In practice
with a digital interface it sends *exactly* the same
value every time.

The problem for the analog interface is that it isn't
exactly the same every time.

And that of course is precisely the distinction between
digital and analog when it is affected by noise. The
digital system can function with a much lower SNR than
can an analog system. It's fundamental.

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)


Yes, you can get the "right" value into the monitor, but the issues of
drift and calibration inside the monitor are just the same as with an
analogue input monitor. I find that, in practice, drift of the analogue
components in a VGA interface isn't an issue, and neither have I seen VGA
signals affected by electrical noise even on moderate cable runs. Perhaps
I've been lucky!

Cheers,
David

 



