A Photography forum. PhotoBanter.com


low light



 
 
  #71  
Old March 9th 07, 01:28 PM posted to rec.photo.digital,alt.photography
Roger N. Clark (change username to rnclark)

Paul Rubin wrote:
"Roger N. Clark (change username to rnclark)" writes:
Here is pushing some other limits with a Canon 1D Mark II:

Night and Low Light Photography with Digital Cameras
http://www.clarkvision.com/photoinfo...ht.photography


Wow, neat. The super-high-ISO examples have very visible horizontal
banding--what happens if you take that out with a notch filter?


I have not tried that.

But all this talk about banding is a little
misinformed in my opinion. John Sheehy seems to be saying that
the obvious banding at the low end is evidence for
non-photon noise sources. While true, one must look at the
level of the banding. For example, examine Figure 5 on the above web
page. Set 5 in Figure 5 shows banding at a similar level to the signal
in panels A and B (the left-most two squares). But look at the table:
the photon count per pixel is only 1.2 in panel A and 0.8 in panel B!
The read noise is 3.9 electrons, so the pattern noise is
about 1/4 of the read noise. The problem is that our eyes plus
brain are very good at picking out patterns, whether that pattern
is below random noise or embedded in other patterns.

It would be interesting to try some filtering on the images.

Is there a feasible way to remove the Bayer filter from a DSLR sensor?


I do not know.

What about shorter exposures at super ISO's?


Figure 12 on the above web page is a 1/20 second exposure at equivalent
ISO = 320,000. Do you want faster than that? It is simply a matter of photons
per pixel per exposure. I would not think faster exposures with
similar photons/pixel would appear any different. Longer exposures at lower
light levels would not appear any different either, until noise from
dark current starts to show. Dark current noise is the square root
of the accumulated dark signal, and the 1D Mark II's dark current under the
temperatures used was around 0.03 electron/second. So a 10 second exposure
would add about 0.5 electron of noise to the 3.9 electron read noise. Thermal
noise equals read noise after about 5 minutes.
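
For anyone who wants to plug in their own numbers, the arithmetic is simple
(a rough sketch in Python using the dark current and read noise figures above;
the crossover time is very sensitive to sensor temperature):

import numpy as np

dark_current = 0.03    # electrons/second, 1D Mark II at the temperatures used
read_noise   = 3.9     # electrons

for t in (10, 60, 300, 600):                    # exposure time, seconds
    dark_noise = np.sqrt(dark_current * t)      # noise = sqrt(accumulated dark signal)
    total = np.sqrt(read_noise**2 + dark_noise**2)
    print(f"{t:4d} s: dark current noise {dark_noise:.2f} e-, total {total:.2f} e-")

# dark current noise equals read noise when dark_current * t = read_noise**2
print("crossover at about", round(read_noise**2 / dark_current), "seconds")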

Roger
  #72  
Old March 9th 07, 01:49 PM posted to rec.photo.digital,alt.photography
acl

On Mar 9, 2:28 pm, "Roger N. Clark (change username to rnclark)"
wrote:
Paul Rubin wrote:
"Roger N. Clark (change username to rnclark)" writes:
Here is pushing some other limits with a Canon 1D Mark II:


Night and Low Light Photography with Digital Cameras
http://www.clarkvision.com/photoinfo...ht.photography


Wow, neat. The super-high-ISO examples have very visible horizontal
banding--what happens if you take that out with a notch filter?


I have not tried that.

But all this talk about banding is a little
misinformed in my opinion. John Sheehy seems to be saying that
the obvious banding at the low end is evidence for
non-photon noise sources. While true, one must look at the
level of the banding. For example, examine Figure 5 on the above web
page. Set 5 in Figure 5 shows banding at a similar level to the signal
in panels A and B (the left-most two squares). But look at the table:
the photon count per pixel is only 1.2 in panel A and 0.8 in panel B!
The read noise is 3.9 electrons, so the pattern noise is
about 1/4 of the read noise. The problem is that our eyes plus
brain are very good at picking out patterns, whether that pattern
is below random noise or embedded in other patterns.


Actually I read his posts as saying, not that photon shot noise is
less important than you say in absolute terms, but that he finds
banding more disturbing. It seems to be a perceptual judgement; he
doesn't appear to be claiming anything quantitatively different from
what you say, just that it bothers him.

For what it's worth, I personally also find patterned noise much more
disturbing than random noise (I really don't mind random noise all
that much unless it gets to very high levels; of course it complicates
my postprocessing but that is another story). It also seems that
this patterned noise is more obvious to some people than to others:
I have prints from pushed high-ISO shots with patterned noise I find very
disturbing (it jumps out at me immediately and is perceptually almost as
strong as the image itself), while my wife and a couple of friends don't
notice it until I point it out, and then seem to be unaware of it unless
they consciously decide to "see" it. I can't avoid seeing it at all.
It seems to depend on the person; maybe this is part of the confusion
(or maybe not).

  #73  
Old March 10th 07, 02:33 AM posted to rec.photo.digital,alt.photography
Roger N. Clark (change username to rnclark)

acl wrote:

Actually I read his posts as saying, not that photon shot noise is
less important than you say in absolute terms, but that he finds
banding more disturbing. It seems to be a perceptual judgement; he
doesn't appear to be claiming anything quantitatively different from
what you say, just that it bothers him.

For what it's worth, I personally also find patterned noise much more
disturbing than random noise (I really don't mind random noise all
that much unless it gets to very high levels; of course it complicates
my postprocessing but that is another story).


I too agree that pattern noise is more obvious than random noise.
Probably by at least a factor of ten. It is our eye+brain's
ability to pick out a pattern in the presence of a lot
of random noise that makes us able to detect many things
in everyday life. It probably developed as a necessary
thing for survival. But then it becomes a problem when we try
and make something artificial and we see the defects in it.
It gives the makers of camera gear quite a challenge.

Roger


  #74  
Old March 11th 07, 02:45 PM posted to rec.photo.digital,alt.photography
John Sheehy

"Roger N. Clark (change username to rnclark)" wrote
in :


But all this talk about banding is a little
misinformed in my opinion. John Sheehy seems to be saying that
the obvious banding at the low end is evidence for
non-photon noise sources. While true, one must look at the
level of the banding. For example, examine Figure 5 on the above web
page. Set 5 in Figure 5 shows banding at a similar level to the signal
in panels A and B (the left-most two squares). But look at the table:
the photon count per pixel is only 1.2 in panel A and 0.8 in panel B!
The read noise is 3.9 electrons, so the pattern noise is
about 1/4 of the read noise. The problem is that our eyes plus
brain are very good at picking out patterns, whether that pattern
is below random noise or embedded in other patterns.


Yes, that is a problem, and that is exactly why you can't evaluate noise by
standard deviation alone. It doesn't even take human perception to focus
on the banding; binning and downsampling math focus on it too. A
blackframe from a 20D with total noise 10x the horizontal banding
component will show *only* banding noise, and no visible 2D noise at all,
if binned down far enough. I think that this fact speaks volumes as to how
useless standard deviations and S/N ratios based on them can be when
comparing different *characteristics* of noises.
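
A quick synthetic example shows the effect (made-up numbers, not real 20D data:
the random 2D noise is ten times the banding amplitude, yet the banding is what
survives once you average along the rows, which is the extreme case of binning):

import numpy as np

rng = np.random.default_rng(0)
h, w = 1024, 1024

banding  = rng.normal(0.0, 1.0, size=(h, 1))    # one offset per row: horizontal banding
random2d = rng.normal(0.0, 10.0, size=(h, w))   # random 2D noise, 10x the banding amplitude
frame = random2d + banding

print("full frame std     :", frame.std())              # ~10, dominated by random noise
print("row-average std    :", frame.mean(axis=1).std()) # ~1.05, dominated by banding
print("column-average std :", frame.mean(axis=0).std()) # ~0.3, banding averaged away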


Thermal noise equals read noise after about 5 minutes.


Statistically, perhaps, but standard deviation does not tell the full
story. You can fairly compare the standard deviations of two noise
situations with the same characteristics that differ only in
amplitude, but noise comes in a variety of characteristics, and
standard deviations are not necessarily related to the visual strength of
the noise when the characteristics are different. Dark current noise is much
more visible than shot noise with the same standard deviation, because
most of its energy goes into a minority of pixels.
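
Here is a toy illustration of that last point (purely invented numbers; the two
noise fields have the same standard deviation, but one concentrates its energy
in a small fraction of pixels):

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

smooth = rng.normal(0.0, 5.0, size=n)        # every pixel wiggles a little

spiky = np.zeros(n)                          # 1% of pixels carry nearly all the energy
hot = rng.random(n) < 0.01
spiky[hot] = rng.normal(0.0, 50.0, size=hot.sum())

print(smooth.std(), spiky.std())             # both ~5
print((np.abs(smooth) > 20).mean())          # ~0.006% of pixels stick out
print((np.abs(spiky)  > 20).mean())          # ~0.7% of pixels stick out, 100x more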


--


John P Sheehy

  #75  
Old March 11th 07, 02:47 PM posted to rec.photo.digital,alt.photography
John Sheehy

"Roger N. Clark (change username to rnclark)" wrote in
:

I too agree that pattern noise is more obvious than random noise.
Probably by at least a factor of ten. It is our eye+brain's
ability to pick out a pattern in the presence of a lot
of random noise that makes us able to detect many things
in everyday life. It probably developed as a necessary
thing for survival. But then it becomes a problem when we try
and make something artificial and we see the defects in it.
It gives the makers of camera gear quite a challenge.


How does that co-exist with your conclusion that current cameras are
limited by shot noise?

Saying that current cameras are limited by shot noise means that all future
improvements lie purely in well depth, quantum efficiency, fill factor, and
sensor size (you'd probably add "large pixels", but I'd disagree). The
fact is, a 10:1 S:N on the 1DmkII at ISO 100 would be 1.5 stops further
below saturation, and 1:1 would be 4.3 stops further below it, if there were
no blackframe read noise

http://www.pbase.com/jps_photo/image/75392571

and that is only statistically, without consideration for the pattern noise
effects, which widen the visual gap even further.
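
The arithmetic behind those stop counts is easy to reproduce (a sketch; the
53,000 electron full well is the figure used in this thread, while the ISO 100
read noise here is an assumed round value, so the exact numbers shift with
whatever read noise you plug in):

import numpy as np

full_well  = 53_000    # electrons (figure used in this thread)
read_noise = 20.0      # electrons at ISO 100 -- an assumed value, not a measurement

def signal_for_snr(k, r):
    # smallest signal S (electrons) with S / sqrt(S + r**2) = k (shot + read noise)
    return (k**2 + k * np.sqrt(k**2 + 4 * r**2)) / 2

for k in (10.0, 1.0):
    with_read = signal_for_snr(k, read_noise)
    shot_only = k**2                      # shot-noise-limited case (no read noise)
    print(f"S/N {k:.0f}:1 is reached {np.log2(full_well / with_read):.1f} stops below "
          f"saturation with read noise, {np.log2(full_well / shot_only):.1f} stops without "
          f"({np.log2(with_read / shot_only):.1f} stops difference)")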

--


John P Sheehy

  #76  
Old March 11th 07, 11:11 PM posted to rec.photo.digital,alt.photography
Bart van der Wolf


"John Sheehy" wrote in message
...
"Roger N. Clark (change username to rnclark)"
wrote
in :


The problem is that our eyes plus brain are very good at
picking out patterns, whether that pattern is below random
noise, or embedded in other patterns.


What's worse, we see non-existent patterns (e.g. the triangle at the
following link) because we want to:
http://www.xs4all.nl/~bvdwolf/temp/Triangle-or-not.gif.

Yes, that is a problem, and that is exactly why you can't evaluate
noise by standard deviation alone.


That depends on what one wants to evaluate. Standard deviation (together
with the mean) only tells you something about pixel-to-pixel (or sensel-to-
sensel) performance. It doesn't allow one to make valid judgements about
anything larger. Banding could either be calibrated out of the larger
structure, or an analysis of the systematic noise should be done (taking
care not to mistake raw-converter effects for camera or sensor-array
effects).

--
Bart

  #77  
Old March 11th 07, 11:15 PM posted to rec.photo.digital,alt.photography
Bart van der Wolf


"John Sheehy" wrote in message
...
"Roger N. Clark (change username to rnclark)"
wrote in
:

I too agree that pattern noise is more obvious than random noise.
Probably by at least a factor of ten. It is our eye+brain's
ability to pick out a pattern in the presence of a lot
of random noise that makes us able to detect many things
in everyday life. It probably developed as a necessary
thing for survival. But then it becomes a problem when we try
and make something artificial and we see the defects in it.
It gives the makers of camera gear quite a challenge.


How does that co-exist with your conclusion that current cameras are
limited by shot noise?


Shot noise is a physical limitation, not a man-made one. The man-made
limitations can be improved upon.

--
Bart

  #78  
Old March 11th 07, 11:38 PM posted to rec.photo.digital,alt.photography
acl

On Mar 12, 2:11 am, "Bart van der Wolf" wrote:
"John Sheehy" wrote in message

...

"Roger N. Clark (change username to rnclark)"
wrote
:


The problem is that our eyes plus brain are very good at
picking out patterns, whether that pattern is below random
noise, or embedded in other patterns.


What's worse, we see non-existing patterns (e.g. a triangle in the
following link) because we want to:
http://www.xs4all.nl/~bvdwolf/temp/Triangle-or-not.gif.

Yes, that is a problem, and that is exactly why you can't evaluate
noise by standard deviation alone.


That depends on what one wants to evaluate. Standard deviation (together
with the mean) only tells you something about pixel-to-pixel (or sensel-to-
sensel) performance. It doesn't allow one to make valid judgements about
anything larger.


As a matter of fact, they don't tell you anything (literally) about
pixel-to-pixel behaviour. If I tell you that a signal has mean zero
and a given standard deviation, what else can you tell me about it? Nothing.
It could be anything from an otherwise random time series to a sine
wave to a series of square waves to anything else. It's like knowing
the first two terms of an infinite series (well, that's essentially what
it is: the first two of an infinite sequence of moments).

The reason people use the first two moments (mean and standard deviation) is
that the noises under consideration are often assumed to be Gaussian, in
which case these two quantities completely characterise the noise. This is
usually a good approximation when the noise is the sum of many independent
sources.
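
To put numbers on the sine-wave example (a trivial sketch, nothing camera-specific):

import numpy as np

rng = np.random.default_rng(2)
n = 65_536
t = np.arange(n)

white = rng.normal(0.0, 1.0, size=n)                   # random time series
sine  = np.sqrt(2.0) * np.sin(2 * np.pi * t / 64.0)    # pure tone, scaled to unit variance

print(white.mean(), white.std())   # ~0, ~1
print(sine.mean(),  sine.std())    # ~0, ~1 -- same first two moments, very different signal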

Banding could either be calibrated out of the larger
structure, or an analysis of the systematic noise should be done (taking
care not to mistake raw-converter effects for camera or sensor-array
effects).



  #79  
Old March 12th 07, 04:16 AM posted to rec.photo.digital,alt.photography
Roger N. Clark (change username to rnclark)

John Sheehy wrote:
"Roger N. Clark (change username to rnclark)" wrote in
:

I too agree that pattern noise is more obvious than random noise.
Probably by at least a factor of ten. It is our eye+brain's
ability to pick out a pattern in the presence of a lot
of random noise that makes us able to detect many things
in everyday life. It probably developed as a necessary
thing for survival. But then it becomes a problem when we try
and make something artificial and we see the defects in it.
It gives the makers of camera gear quite a challenge.


How does that co-exist with your conclusion that current cameras are
limited by shot noise?

Saying that current cameras are limited by shot noise means that all future
improvements lie purely in well depth, quantum efficiency, fill factor, and
sensor size (you'd probably add "large pixels", but I'd disagree). The
fact is, a 10:1 S:N on the 1DmkII at ISO 100 would be 1.5 stops further
below saturation, and 1:1 would be 4.3 stops further below it, if there were
no blackframe read noise

http://www.pbase.com/jps_photo/image/75392571

and that is only statistically, without consideration for the pattern noise
effects, which widen the visual gap even further.

Nice plot. If you look at my past posts, you would also see that
I've said for at least a couple of years that 14-bit or higher A/D converters
are needed too, because current DSLRs are limited by 12-bit converters.
Some attacked me in this NG with the argument that "if more than 12 bits were
really needed, why haven't camera manufacturers done it?"
Well, we now see they have, and I'm sure 14 or more bits will become the
new standard in future DSLRs.

Regarding fixed pattern noise versus photon (Poisson) noise, your plot
and some simple illustrations show what is dominant. First clue:
look at the thousands of images on the net. How many show fixed
pattern noise? It is very rare. You tend to see fixed pattern noise
only in the very lowest lows of an image. Second, if fixed pattern noise
really is a factor, you can calibrate most of it out with dark
frame subtraction. I think good examples of fixed pattern noise are
illustrated at:
http://www.clarkvision.com/photoinfo...ht.photography
Figure 1, for example, shows two merged low-light images; fixed pattern
noise is not apparent, nor is it the dominant noise source in the image.
Figure 2 shows the black sky above the Sydney Opera House in an ISO 100,
20 second exposure. Fixed pattern noise is a little over 1 bit out of 12
in the raw data. It simply is not a factor. But where the scene has
signal, e.g. the lit roof, noise is proportional to the square root
of the signal strength, with photon noise reaching about 18 levels out of
4095 in the 12-bit raw file. So, over most of the range, photon noise
dominates. The low end, the bottom few values or bottom couple of bits,
is a combination of photon noise, read noise, and fixed pattern noise.
That leaves about 10 of the 12 bits with photon noise as the dominant
noise source. Again, if you work at the low end, calibrate out
the majority of the fixed pattern noise with dark frames.
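
For anyone who wants to try it, the calibration itself is just frame arithmetic
(a minimal sketch assuming you can already get the raw channels into numpy
arrays; nothing here is specific to any particular raw tool):

import numpy as np

def master_dark(dark_frames):
    # average many dark exposures taken at the same exposure time, ISO and
    # temperature as the light frame; random noise in the master shrinks as
    # 1/sqrt(number of frames) while the repeatable fixed pattern remains
    return np.mean(np.stack(dark_frames), axis=0)

def subtract_dark(light, dark_frames):
    # removes the fixed pattern and the mean dark level; the random read and
    # photon noise of the light frame is of course untouched
    return light - master_dark(dark_frames)

# usage: calibrated = subtract_dark(raw_light, [dark1, dark2, ..., darkN])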


Let's work an example.
Let's assume fixed pattern noise is 10 times more objectionable
than random noise (this is a reasonable estimate
for me, and I find fixed pattern noise quite objectionable).
But then with processing, e.g. dark frame subtraction, it can
be reduced by about 10x, then filtered and reduced further, all with
minimal impact on resolution. Random photon noise in an image
can only be reduced by pixel averaging, which reduces spatial
resolution.

Let's use your full well depth, rounding off to 53,000 electrons.
Fixed pattern noise in DSLRs like the 20D and 1D Mark II is between 1 and
2 bits in the A/D at low ISOs. At low signal levels, line-to-line
pattern noise is on the order of 7 electrons in the 1D Mark II, with
a low-frequency offset of a few tens of electrons (at ISO 100 fixed pattern
noise appears at about the 1-bit level, which is ~13 electrons). The low-frequency
fixed pattern noise is entirely eliminated by dark frame subtraction,
and the line-to-line component (what you call 1D) is reduced by about 10x with
dark frame subtraction.

So there are multiple conditions. Here is one example:

ISO 100, 1D Mark II, 53,000 electron full signal:

Signal        Stops   Photon noise   Read + A/D      Fixed-pattern    Dominant noise
(electrons)           (electrons)    noise (e-)      noise (e-)       (photon, read, or pattern)

  53,000        0         230            17              ~13          Photon
  12,250       -2.1       110            17              ~13          Photon
   3,312       -4          57            17              ~13          Photon
     828       -6          29            17              ~13          Photon
     207       -8          14            17              ~13          All three similar
      51      -10           7            17              ~13          Read + pattern

The above table demonstrates that the sensor's noise is dominated by photon
statistics over most of its dynamic range. Each new generation
of cameras pushes the floor where other noise sources in the
electronics show up a little lower. It is likely we'll see the 1D Mark III
push those limits a stop or two lower still. But photon noise remains, and
it is the ultimate limit.
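
If you want to regenerate the table for other assumptions, it is a few lines
(a sketch using the figures above; note the last column here just picks
whichever single term is largest, so the borderline rows read slightly
differently than in the table):

import numpy as np

full_well     = 53_000   # electrons, ISO 100 (figure used above)
read_noise    = 17.0     # electrons, read + A/D
pattern_noise = 13.0     # electrons, residual fixed pattern (~1 bit at ISO 100)

print("signal(e-)  stops  photon(e-)  read(e-)  pattern(e-)  largest term")
for signal in (53_000, 12_250, 3_312, 828, 207, 51):
    stops  = np.log2(signal / full_well)
    photon = np.sqrt(signal)                     # Poisson (shot) noise
    noises = {"photon": photon, "read": read_noise, "pattern": pattern_noise}
    print(f"{signal:10d}  {stops:5.1f}  {photon:10.1f}  {read_noise:8.1f}  "
          f"{pattern_noise:11.1f}  {max(noises, key=noises.get)}")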

Here is another test series that illustrates the above conclusions:
Digital Camera Raw Converter Shadow Detail and Image Editor Limitations:
Factors in Getting Shadow Detail in Images
http://www.clarkvision.com/imagedeta....shadow.detail

Figure 6 shows areas from +2 to -7.6 stops. But if you look at the different
raw conversions, you'll see widely different results and wildly different
fixed pattern noise. Then look at Figure 16: the camera JPEG looks pretty
clean, with less pattern noise than some of the raw conversions.
So when you argue about photon noise versus fixed pattern noise,
understand the effects of the raw converters too.

Roger
  #80  
Old March 12th 07, 01:27 PM posted to rec.photo.digital,alt.photography
acl

On Mar 12, 2:53 pm, "Roger N. Clark (change username to rnclark)"
wrote:

And that is why people who evaluate sensors do more than simply
study the standard deviation of one image. To understand noise sources,


Never claimed otherwise! By the way, why don't people study the full
power spectrum of the noise (i.e. of a blackframe)? That would give
quite a lot of information (it should allow distinguishing between the
white part of the noise and things like banding). And it should not be
too hard (e.g. with IRIS, split the channels and FT them). And if you do
that to an average of many frames, you'll be studying the repeatable noise
only. Is there some particular reason this isn't done by anybody?
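
Something along those lines takes only a few lines with numpy (a sketch on
synthetic data; for a real test, load one raw channel of an actual blackframe
into the array instead):

import numpy as np

rng = np.random.default_rng(3)
h, w = 512, 512

# synthetic "blackframe": white read noise plus weak horizontal banding
# (one random offset per row, a quarter of the read-noise amplitude)
frame = rng.normal(0.0, 4.0, size=(h, w)) + rng.normal(0.0, 1.0, size=(h, 1))

spec = np.abs(np.fft.fft2(frame))**2

# structure that is constant along a row piles its power onto the kx = 0 column
# of the 2D spectrum; white noise spreads its power evenly over all bins
on_axis  = spec[1:h // 2, 0].mean()          # kx = 0, ky != 0: where banding lives
off_axis = spec[1:h // 2, 1:w // 2].mean()   # generic bins: the white-noise floor
print("banding axis / noise floor:", on_axis / off_axis)   # >> 1 when banding is present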

The Nikon D50 Digital Camera:
Sensor Noise, Dynamic Range, and Full Well Analysis
http://www.clarkvision.com/imagedeta...tion-nikon-d50


That's quite interesting; why don't you include dark frames from more
cameras? I'd think this would be quite useful for people
intending to do very low-light work.

http://www.clarkvision.com/imagedeta...mparisons/inde...

and more at: http://www.clarkvision.com/imagedeta...ensor_analysis

other: http://www.astrosurf.org/buil/20d/20dvs10d.htm

Roger


 



