"Exposure" vs "Digitization



 
 
  #1   August 7th 05, 01:13 PM

After spending some time under the hood, so to speak, of RAW capture and
data, I find it increasingly difficult to use the term "exposure" to
refer to the relative degree of photon saturation in a JPEG or RAW at a
given ISO. The analog of slide-film exposure is actually the analog
exposure on the sensor; the ISO settings of a digital camera are like
setting different ranges of exposure in a slide to be digitized by a
scanner.

Why, then, do we call utilizing the specified range "exposure"? I often
substitute the word "digitized" in this context, but it draws strange
reactions from some people.
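
To put rough numbers on the scanner analogy, here is a minimal Python
sketch; the full-well capacity and gain values are made up, and a 12-bit
ADC is assumed:

# Illustrative sketch only: made-up full-well capacity, 12-bit ADC assumed.
FULL_WELL = 40000   # hypothetical sensor saturation, in electrons
ADC_MAX = 4095      # 12-bit RAW ceiling

def raw_value(electrons, iso, base_iso=100):
    """Map the same analog signal (electrons) to a RAW level at a given ISO gain."""
    gain = iso / base_iso                # amplification relative to base ISO
    scale = ADC_MAX / FULL_WELL          # counts per electron at base ISO
    return min(ADC_MAX, round(electrons * scale * gain))

signal = 2500   # the same analog "exposure" on the sensor in every case
for iso in (100, 400, 1600):
    print(iso, raw_value(signal, iso))   # 100 -> 256, 400 -> 1024, 1600 -> 4095

# The photon count never changes; only the slice of the analog range that
# gets spread across the 4096 RAW levels does, much like choosing the
# digitization range on a scanner.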
--


John P Sheehy

  #2   August 7th 05, 03:03 PM
Gregory Blank

In article ,
wrote:

> After spending some time under the hood, so to speak, of RAW capture and
> data, I find it increasingly difficult to use the term "exposure" to
> refer to the relative degree of photon saturation in a JPEG or RAW at a
> given ISO. The analog to slide film exposure is actually the analog
> exposure on the sensor; the ISO settings of the digital camera are like
> setting different ranges of exposure in a slide to be digitized by a
> scanner.
>
> Why then, do we call utilizing the specified range "exposure". I often
> substitute the word "digitized" in this context, but it draws strange
> reactions from some people.


In truth it's not about exposure, analog or digital... it's selective
contrast determination and what can be recorded within the parameters of
the hardware. That is, it's more or less about what you wish to drop or
pick up when you choose analog or digital, but only within the narrow
range given by the maker of the film or the maker of the camera.

Perhaps it's more an issue of word choice for people less able to grasp
the concept. But you are exposing the sensor to light, so you are making
an exposure.

To answer you quite directly: for lack of a better description, and to
be concise.

--
LF Website @
http://members.verizon.net/~gregoryblank

"To announce that there must be no criticism of the President,
or that we are to stand by the President, right or wrong,
is not only unpatriotic and servile, but is morally treasonable
to the American public."--Theodore Roosevelt, May 7, 1918
  #3   August 7th 05, 03:24 PM

In message ,
Gregory Blank wrote:

> In article ,
> wrote:
>
> > After spending some time under the hood, so to speak, of RAW capture and
> > data, I find it increasingly difficult to use the term "exposure" to
> > refer to the relative degree of photon saturation in a JPEG or RAW at a
> > given ISO. The analog to slide film exposure is actually the analog
> > exposure on the sensor; the ISO settings of the digital camera are like
> > setting different ranges of exposure in a slide to be digitized by a
> > scanner.
> >
> > Why then, do we call utilizing the specified range "exposure". I often
> > substitute the word "digitized" in this context, but it draws strange
> > reactions from some people.
>
> In truth its not about exposure, analog or digital....its selective
> contrast determination and what can be recorded within the parameters of
> the hardware. That is; its More or less about what you wish to drop or
> pick up when you select to use analog or digital. But only within a
> narrow reference as given by the maker of film or the maker of the
> camera.
>
> Perhaps its more an issue of word choice for people less able to grasp
> the concept. But you are exposing the sensor to light, so you are making
> an exposure.
>
> To answer you quite directly: for lack of using a better description
> and to be concise.


If someone decides that "ISO 100 gives the best quality" and gets an
image that utilizes only 1/16th of the RAW values available, they would
have had a much better image if they had the camera set to ISO 1600 with
the same aperture and shutter speed. I have a hard time saying that
they "under-exposed" the image; it makes more sense to say that they
under-digitized it (quantized it) by using too low of an ISO.
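
The arithmetic behind the 1/16th case, assuming a 12-bit RAW file:

# Rough arithmetic for the 1/16th-of-full-scale case; a 12-bit RAW file
# (values 0..4095) is assumed here.
ADC_MAX = 4095

levels_at_iso_100 = ADC_MAX // 16    # signal peaks at ~1/16 of full scale
levels_at_iso_1600 = ADC_MAX         # same light, 16x more gain before the ADC

print(levels_at_iso_100, levels_at_iso_1600)   # 255 vs 4095 distinct levels
# Up to four bits of tonal precision are given up by digitizing at ISO 100,
# even though aperture, shutter speed, and photon count are identical.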
--


John P Sheehy

  #4   August 7th 05, 03:59 PM
Gregory Blank

In article ,
wrote:

> > To answer you quite directly: for lack of using a better description
> > and to be concise.
>
> If someone decides that "ISO 100 gives the best quality" and gets an
> image that utilizes only 1/16th of the RAW values available, they would
> have had a much better image if they had the camera set to ISO 1600 with
> the same aperture and shutter speed. I have a hard time saying that
> they "under-exposed" the image; it makes more sense to say that they
> under-digitized it (quantized it) by using too low of an ISO.



That's a rather extreme example, and it seems unlikely.

That is: is it better for noise, range, and color? Or does one make the
choice to keep two and drop one from the equation? Because if it's better
for all three, you would only need one ISO setting and no supplemental
light sources.

You add flash or lights as needed to make the ISO 100 image. Digital
does not solve the problems that exist beyond the scope of the camera,
such as lighting, and it likely never will. Lighting is a separate set
of issues and requires knowledge. I can't seem to state this enough to
people; it's something schools should teach.

--
LF Website @
http://members.verizon.net/~gregoryblank

"To announce that there must be no criticism of the President,
or that we are to stand by the President, right or wrong,
is not only unpatriotic and servile, but is morally treasonable
to the American public."--Theodore Roosevelt, May 7, 1918
  #5   August 7th 05, 04:56 PM

In message ,
Gregory Blank wrote:

> In article ,
> wrote:
>
> > > To answer you quite directly: for lack of using a better description
> > > and to be concise.
>
> > If someone decides that "ISO 100 gives the best quality" and gets an
> > image that utilizes only 1/16th of the RAW values available, they would
> > have had a much better image if they had the camera set to ISO 1600 with
> > the same aperture and shutter speed. I have a hard time saying that
> > they "under-exposed" the image; it makes more sense to say that they
> > under-digitized it (quantized it) by using too low of an ISO.
>
> Thats a rather extreme example,... & it seems unlikely.


If you think that's unlikely, you haven't been reading people's posts,
or DPReview. The problem of people under-digitizing at ISO 100 is
epidemic, because of the myth that ISO settings cause noise.

"Why is the sky so noisy in my ISO 100 picture" is a common question.
Of course, it is not just sensor-noisy, it's also highly posterized as
well, and would have looked better if taken at a higher ISO setting,
even with the same aperture and shutter speed. If they were using a
tripod, of course, they could have had a good digitization at a higher
absolute exposure at a lower ISO. I personally don't use ISO 100 very
often, but aim for ISO 200 if I can do it with a full digitization.
Blooming looms just above RAW value 4095 at ISO 100. In my experiments,
the trade-off between sterile posterization and noise indicates that
there is very little value in using ISO 100 over ISO 200 on my Canon
20D. The shadows are of approximately equal worth.
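
A sketch of that trade-off; the gain and read-noise figures below are
invented for illustration, not measurements from the 20D:

# Shadow noise, referred back to electrons at the sensor, for two ISO settings.
# The gain (electrons per RAW count) and read-noise numbers are made up to
# show the shape of the trade-off only.
import math

def shadow_noise_electrons(read_noise_e, gain_e_per_adu):
    """Combine read noise with the quantization noise of the ADC step."""
    quant_noise_e = gain_e_per_adu / math.sqrt(12)   # std. dev. of the rounding error
    return math.hypot(read_noise_e, quant_noise_e)

for iso, gain, read_noise in ((100, 12.0, 17.0), (200, 6.0, 13.0)):
    print(iso, round(shadow_noise_electrons(read_noise, gain), 1))
# With read noise dominating the quantization step at both settings, the
# ISO 200 shadows come out at least as clean in electron terms, which is
# the "very little value in ISO 100 over ISO 200" point above.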

> That is: is it better for noise, range and color? Or does one make the
> choice to keep two and drop one from the equation? Because if its better
> for all three you would only need one ISO setting and not any
> supplemental light sources.
>
> You add flash/or lights as needed to make the Iso 100 image.


Not in available light photography, you don't.

> Digital
> does not solve the problems that exist beyond the scope of the camera-
> lighting. & more likely It never will. Lighting is separate set of
> issues and require knowledge.


Optimal lighting is different for digital and film. Color film
generally wants to see sunlight or tungsten, depending on the film.
Most digitals have neither sunlight nor tungsten as their native white
balance. With RGB Bayer cameras, the native balances generally run
toward magenta-to-pink lighting. My Canon DSLRs get the best images
with lighting that is a stop stronger red than green, and a half stop
stronger blue than green.
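
Converting those stops into linear ratios (nothing measured, just the
arithmetic):

# Converting the stop figures above into linear light ratios; these are just
# the stated stops, not measured data for any particular body.
red_vs_green = 2 ** 1.0     # one stop stronger red    -> 2.0x the green level
blue_vs_green = 2 ** 0.5    # half a stop stronger blue -> ~1.41x the green level
print(red_vs_green, round(blue_vs_green, 2))
# The idea being that such lighting lands the R, G, and B raw channels at
# similar levels, so none of them has to be pushed (and its noise with it)
# in the white-balance step.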

> I can't seem to state this enough to
> people, its something schools should teach


It should be taught specific to digital, but I doubt that there are many
teachers who know the difference.

Available-light photography can only be improved by maximizing exposure
without clipping, or using filters over the lens if there is enough
light.

--


John P Sheehy

  #6   August 7th 05, 07:13 PM
Gregory Blank

In article ,
wrote:

> > Thats a rather extreme example,... & it seems unlikely.
>
> If you think that's unlikely, you haven't been reading people's posts,
> or DPReview. The problem of people under-digitizing at ISO 100 is
> epidemic, because of the myth that ISO settings cause noise.


Up until this moment, the only posts I have read regarding this thread
are yours and mine. Hey, underexposure is underexposure. On film you get
nothing.


"Why is the sky so noisy in my ISO 100 picture" is a common question.
Of course, it is not just sensor-noisy, it's also highly posterized as
well, and would have looked better if taken at a higher ISO setting,
even with the same aperture and shutter speed. If they were using a
tripod, of course, they could have had a good digitization at a higher
absolute exposure at a lower ISO. I personally don't use ISO 100 very
often, but aim for ISO 200 if I can do it with a full digitization.
Blooming looms just above RAW value 4095 at ISO 100. In my experiments,
the trade-off between sterile posterization and noise indicates that
there is very little value in using ISO 100 over ISO 200 on my Canon
20D. The shadows are of approximately equal worth.

That is: is it better for noise, range and color? Or does one make the
choice to keep two and drop one from the equation? Because if its better
for all three you would only need one ISO setting and not any
supplemental light sources.


You add flash/or lights as needed to make the Iso 100 image.


Not in available light photography, you don't.


So knowledge of basic photography is a good thing if one wants to make
good pictures.

> > Digital
> > does not solve the problems that exist beyond the scope of the camera-
> > lighting. & more likely It never will. Lighting is separate set of
> > issues and require knowledge.
>
> Optimal lighting is different for digital and film.


To a degree, maybe, but it's fairly close for slide film and the sensor.

> Color film
> generally wants to see sunlight or tungsten, depending on the film.
>
> Most digitals have neither sunlight nor tungsten as their native white
> balance. The native balances generally run from magenta to pink
> lighting with RGB bayer cameras. My Canon DSLRs get the best images
> with lighting that is a stop stronger red than green, and a half stop
> stronger blue than green.


Like I stated prior, it's relative to what the maker imparts.

> > I can't seem to state this enough to
> > people, its something schools should teach
>
> Should be taught specific to digital, but I doubt that there are many
> teachjers who know the difference.


It should be taught specific to what works; it incorporates many
disciplines. Across video, film, and digital, the principles of lighting
adjustment should be part of many curricula, including architectural
design. Maybe they are, but I am just stating that it's an important
branch of making pictures.

> Available-light photography can only be improved by maximizing exposure
> without clipping, or using filters over the lens if there is enough
> light.


It all boils down to understanding, that is, knowing when to use that
filter and when not. It's the same for film images.

--
LF Website @
http://members.verizon.net/~gregoryblank

"To announce that there must be no criticism of the President,
or that we are to stand by the President, right or wrong,
is not only unpatriotic and servile, but is morally treasonable
to the American public."--Theodore Roosevelt, May 7, 1918
  #9   August 7th 05, 09:04 PM
Jeremy Nixon

wrote:

> If someone decides that "ISO 100 gives the best quality" and gets an
> image that utilizes only 1/16th of the RAW values available, they would
> have had a much better image if they had the camera set to ISO 1600 with
> the same aperture and shutter speed. I have a hard time saying that
> they "under-exposed" the image; it makes more sense to say that they
> under-digitized it (quantized it) by using too low of an ISO.


The thing is, you're the only person I have *ever* seen talk in terms of
"absolute exposure", or compare different ISOs at the same aperture and
shutter speed. I'm not saying that's not a valid way to think about it,
I just don't see how it could be useful to me. ISO 100 *does* give the
best quality, as long as you don't underexpose it, and it's a given that
a proper exposure at an elevated ISO rating is better than underexposing
at ISO 100.

Now, there is another angle to the whole thing, and that is the 12-bit
A/D conversion. We've discussed previously how the dynamic range of
current sensors is not limited by the sensor's capability, but rather
by the A/D conversion. This presents an interesting situation. Imagine
that a camera used 16-bit A/D conversion. Imagine that the extra range
actually *did* use all of the data available from the sensor. You now
have a situation where higher ISO settings are meaningless, and the
camera would have to be marketed as (for example) ISO 100 with *no*
higher settings. Imagine the outcry!

The simple fact that higher ISO settings exist and are useful tells
us that the A/D conversion is not using all of the data the sensor is
providing. Higher ISOs are accomplished by amplifying the signal.
If you can usefully amplify the signal to ISO 800, that means there
was a signal there in the first place to amplify, one that *could*
have been used at ISO 100, but was ignored at that setting. If no
data from the sensor were ignored, there would be nothing left to
amplify, and ISO 200 would just be ISO 100 with one stop less of
range and no actual advantage whatsoever. That is, it *would* be
better to underexpose at ISO 100 and then push it in processing, to
avoid the amplification step.

So, it seems that either 16-bit A/D conversion is more complicated
to put into a camera than it sounds, or we are having our dynamic
range artificially limited in order to allow camera manufacturers
to say that their cameras can go to ISO 800 or whatever.

Of course, if the sensor can provide more range than a 16-bit
conversion would need, then there would still be room for higher
ISO settings.
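
A toy model of that argument, with hypothetical numbers: a sensor whose
analog range would need about 16 bits, read out through a 12-bit
conversion.

# Toy model only: a hypothetical sensor with ~16 bits of usable analog range,
# digitized through a 12-bit A/D converter.
SENSOR_LEVELS = 2 ** 16   # distinguishable analog levels the sensor could deliver
ADC_LEVELS = 2 ** 12      # what a 12-bit conversion can actually record

def recorded_codes(signal_fraction, iso_gain):
    """Distinct output codes for a signal occupying `signal_fraction` of the
    sensor's analog range, after `iso_gain` amplification ahead of the ADC."""
    amplified = min(1.0, signal_fraction * iso_gain)   # clips at the ADC ceiling
    return int(amplified * ADC_LEVELS)

dim_signal = 1 / 8   # a scene that only reaches 1/8 of sensor saturation
print(recorded_codes(dim_signal, 1))   # ISO 100: 512 codes
print(recorded_codes(dim_signal, 8))   # ISO 800: 4096 codes for the same light

# A 16-bit conversion of the full analog range would already give that dim
# signal 8192 codes at base ISO, so there would be nothing left for the ISO
# amplifier to add; that is the point about higher settings becoming
# meaningless.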

--
Jeremy |
  #10   August 7th 05, 05:18 PM
Alan Browne

wrote:

> After spending some time under the hood, so to speak, of RAW capture and
> data, I find it increasingly difficult to use the term "exposure" to
> refer to the relative degree of photon saturation in a JPEG or RAW at a
> given ISO. The analog to slide film exposure is actually the analog
> exposure on the sensor; the ISO settings of the digital camera are like
> setting different ranges of exposure in a slide to be digitized by a
> scanner.
>
> Why then, do we call utilizing the specified range "exposure". I often
> substitute the word "digitized" in this context, but it draws strange
> reactions from some people.


This is photography, so photographic terms apply.

There is nothing wrong with the word exposure for digital capture.
After all, the sensor is exposed to light for a period of time, during
that time the sensor sites 'charge up' from the exposure, and then the
data is recorded.

From Webster's:
4: a piece or section of sensitized material (as film) on which an
exposure is or can be made ("36 exposures per roll")

While they state film, the "material" can be anything that is sensitive
to photons, including the photosites that make up the sensor array in
the camera.

Regarding RAW processing, it is analogous (at a high enough level) to
the adjustments one might make in the darkroom (pushing, pulling,
burning, dodging, pre-flashing the paper ... etc.)

For that matter, the same applies to scanners.

You may be right about the term "digitization", but you draw strange
looks because it is not the familiar term. And there really is nothing
wrong with the term exposure. That word says it all: time × aperture.
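
A quick numeric illustration of "time × aperture" (settings chosen
arbitrarily):

# Relative exposure goes as t / N^2, so equal EV means equal exposure.
import math

def exposure_value(f_number, shutter_s):
    """EV = log2(N^2 / t); equal EV means the same light reaches the sensor."""
    return math.log2(f_number ** 2 / shutter_s)

for n, t in ((8, 1/125), (5.6, 1/250), (4, 1/500)):
    print(f"f/{n} at 1/{round(1/t)} s -> EV {exposure_value(n, t):.1f}")
# All three land at roughly EV 13: different time/aperture pairs, one exposure.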

Cheers,
Alan

--
-- r.p.e.35mm user resource:
http://www.aliasimages.com/rpe35mmur.htm
-- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
-- [SI] gallery & rulz: http://www.pbase.com/shootin
-- e-meil: Remove FreeLunch.
 



