dpreview pixel density metric



 
 
#21 - July 8th 08, 08:34 AM - Kennedy McEwen (posted to rec.photo.digital.slr-systems)

In article , nospam
writes
In article , Kennedy McEwen
wrote:

by slicing the pixel into three layers, each layer will have a lower
well capacity (particularly the top layer since they're not equivalent
thicknesses).

so instead of (say) 60k photons for the entire pixel, it would be 20k
each (for equivalent size slices) or for foveon, ~8k for the top layer
(it is about 1/8th the thickness of the total) which is quite low.


Provide evidence of this naive claim, since it certainly isn't
consistent with the available Foveon data. Indeed some manufacturers
using the Foveon chip concede that the SNR is actually limited by their
selection of 12-bit ADC precision, which certainly would not be the case
for a carrier saturation limit of only 20,000e.

Similar comparative test images on Dpreview for Sigma cameras using the
Foveon chips show no more noise than comparable Bayer sensors from other
manufacturers, and actually a lot less noise than many.

Finally, the lower pixel capacity you claim besets the technology would
also yield a higher intrinsic ISO, and there is no evidence at all of
Foveon or anyone else claiming a higher base ISO than anyone else. On
the contrary, they offer a lower base ISO than most Bayer type
Nikon/Sony sensors.

looking at foveon images, the blue channel (mostly from the top layer)
is quite noisy.


Looking at the noise levels for most Bayer sensors (ie. the actual
sensor data sheets) you will find that their noise is much worse in the
blue channel as well! Even monochrome sensors show reduced sensitivity
and hence higher noise in the blue region. This has been an issue with
silicon sensors since they first appeared, and nothing intrinsic in the
Foveon design makes it any worse.

The question is about what pixel count is relevant to image resolution.
The Bayer camp are certainly guilty of exaggerating that as much as
the Foveon camp.


they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples - usually taken from three individually filtered CRTs. It only
became one of those colour samples well into the Bayer era, at the start
of the megapixel war of digital camera marketing.

What do you think the discussions about hard and soft AA filters are all
about?


the aa filter limits detail that can't be resolved.


Either it is resolved, in which case the AA filter is irrelevant and
pixel density is what matters for resolution, or it isn't resolved and
what should be used in terms of resolution is much less than the pixel
density. You are being "Bayersially" ambiguous in a biased attempt to
knock an alternative technology.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#22 - July 8th 08, 10:42 AM - nospam

In article , Kennedy McEwen
wrote:

by slicing the pixel into three layers, each layer will have a lower
well capacity (particularly the top layer since they're not equivalent
thicknesses).

so instead of (say) 60k photons for the entire pixel, it would be 20k
each (for equivalent size slices) or for foveon, ~8k for the top layer
(it is about 1/8th the thickness of the total) which is quite low.


Provide evidence of this naive claim, since it certainly isn't
consistent with the available Foveon data.


it's *from* foveon themselves. refer to page 3, figure 5:

http://www.foveon.com/files/CIC10_Lyon_Hubel_FINAL.pdf

the depths are also described in the patent. another way to visualize
it is from this illustration:

http://upload.wikimedia.org/wikipedi...ption-X3.png/500px-Absorption-X3.png

Indeed some manufacturers
using the Foveon chip concede that the SNR is actually limited by their
selection of 12-bit ADC precision, which certainly would not be the case
for a carrier saturation limit of only 20,000e.


this person measured the dynamic range at 62db, or 10.3 bits
http://forums.dpreview.com/forums/read.asp?forum=1027&message=26937648
http://forums.dpreview.com/forums/read.asp?forum=1027&message=25231317

and that's consistent with data from alt-vision, who list it as 61db,
as well as foveon, who lists the dynamic range 'in excess of 62 db':
http://www.foveon.com/files/F13_image_sensor_Product_Flier_RevD.pdf
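
for reference, one bit of dynamic range spans 20*log10(2), about 6.02 db:

    import math

    # one bit of dynamic range spans 20*log10(2) ~ 6.02 db
    for db in (61.0, 62.0):
        print(f"{db} db ~ {db / (20 * math.log10(2)):.1f} bits")
    # 61.0 db ~ 10.1 bits
    # 62.0 db ~ 10.3 bits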

Similar comparative test images on Dpreview for Sigma cameras using the
Foveon chips show no more noise than comparable Bayer sensors from other
manufacturers, and actually a lot less noise than many.


i see noise and green/magenta splotches, even on images produced by
sigma themselves (the dp1 brochure, for instance).

here's a picture from dick lyon, foveon's chief scientist, who should
know how to obtain the best quality from the sensor. it's *really*
noisy, and it's only iso 200!

http://www.pbase.com/rfl/image/76937712/original

and dpreview found the dp1 to be noisy:

http://www.dpreview.com/reviews/Sigmadp1/page21.asp

"Some chroma noise even at base ISO, lots of it at higher sensitivities"

Finally, the lower pixel capacity you claim besets the technology would
also yield a higher intrinsic ISO, and there is no evidence at all of
Foveon or anyone else claiming a higher base ISO than anyone else. On
the contrary, they offer a lower base ISO than most Bayer type
Nikon/Sony sensors.


the base iso of the foveon sensor is 100, just like most bayer sensors,
other than the 6mp sensor that started at 200. however, it gets quite
noisy very rapidly as iso increases. on the sd14, iso 1600 is only
available in 'extended mode.' the sd9 didn't even *have* an iso 1600.

looking at foveon images, the blue channel (mostly from the top layer)
is quite noisy.


Looking at the noise levels for most Bayer sensors (ie. the actual
sensor data sheets) you will find that their noise is much worse in the
blue channel as well!


any links? i've looked at roger clark's data, and the noise is quite
low, particularly with canon's sensors. also, bayer images lack the
green/magenta splotches common to foveon images. in foveon, the blue
channel is highly posterized. it's a mess.

Even monochrome sensors show reduced sensitivity
and hence higher noise in the blue region. This has been an issue with
silicon sensors since they first appeared, and nothing intrinsic in the
Foveon design makes it any worse.


see above.

The question is about what pixel count is relevant to image resolution.
The Bayer camp are certainly guilty of exaggerating that as much as
the Foveon camp.


they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples - usually taken from three individually filtered CRTs.


it still does. it's a spatial element of an image. it also doesn't
have to be three samples per pixel (e.g., cmyk or hexachrome).

It only
became one of those colour samples well into the Bayer era, at the start
of the megapixel war of digital camera marketing.


what's on the sensor is really a sensor element (sensel), not a pixel,
but the counts are the same and people use the terms
interchangeably. a bayer sensor measures one component per pixel and
calculates the other two. there's still three components per pixel in
the output image. chroma resolution is lower, but so is that of the
human eye, and it's not noticeable, except in artificial test cases.
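
a toy numpy sketch of that, assuming an rggb layout (a sketch only, not
any particular camera's actual pipeline):

    import numpy as np

    # one measured component per spatial location; the other two must be
    # estimated from neighbours.
    rgb = np.random.rand(4, 4, 3)            # stand-in for the optical image

    mosaic = np.zeros((4, 4))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B sites

    # 16 sensels -> 16 output pixels, but only 4 measured red samples;
    # the other 12 red values have to be interpolated.
    print(mosaic.size, rgb[0::2, 0::2, 0].size)   # 16 4
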
#23 - July 8th 08, 05:04 PM - Ray Fischer

Kennedy McEwen wrote:
nospam


The question is about what pixel count is relevant to image resolution.
The Bayer camp are certainly guilty of exaggerating that as much as
the Foveon camp.


they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples


That has never been true. Even now there are sensors which are
monochrome and yet capture monochrome pixels.

--
Ray Fischer


#24 - July 8th 08, 09:43 PM - Kennedy McEwen

In article , nospam
writes
In article , Kennedy McEwen
wrote:

by slicing the pixel into three layers, each layer will have a lower
well capacity (particularly the top layer since they're not equivalent
thicknesses).

so instead of (say) 60k photons for the entire pixel, it would be 20k
each (for equivalent size slices) or for foveon, ~8k for the top layer
(it is about 1/8th the thickness of the total) which is quite low.


Provide evidence of this naive claim, since it certainly isn't
consistent with the available Foveon data.


it's *from* foveon themselves. refer to page 3, figure 5:

http://www.foveon.com/files/CIC10_Lyon_Hubel_FINAL.pdf

the depths are also described in the patent. another way to visualize
it is from this illustration:

http://upload.wikimedia.org/wikipedi...ption-X3.png/500px-Absorption-X3.png

The above links only show the three layer structure, which is not in
dispute since it is the now well known Foveon principle. However *none*
of the above links show that the structure is commensurate with a
reduced storage capacitance or higher noise per pixel and none of these
references even discuss your claim. Indeed, the second paragraph of
Page 5 appears to directly dispute your claim in stating that the use of
non-RGB Bayer filters causes an SNR reduction relative to other
approaches, including Foveon's own.

Indeed some manufacturers
using the Foveon chip concede that the SNR is actually limited by their
selection of 12-bit ADC precision, which certainly would not be the case
for a carrier saturation limit of only 20,000e.


this person measured the dynamic range at 62db, or 10.3 bits

...using data *output* by the camera, ie. *after* deconvolving the
colour matrix. You can't draw any conclusions about the actual pixel
storage capacity from that!

and that's consistent with data from alt-vision, who list it as 61db,
as well as foveon, who lists the dynamic range 'in excess of 62 db':
http://www.foveon.com/files/F13_image_sensor_Product_Flier_RevD.pdf

Again, you use different parameters to draw wrong conclusions - the
dynamic range is not solely related to the storage capacitance. Where
is your evidence that the storage capacitance is actually 1/3 of a
conventional pixel?

the base iso of the foveon sensor is 100, just like most bayer sensors,
other than the 6mp sensor that started at 200. however, it gets quite
noisy very rapidly as iso increases. on the sd14, iso 1600 is only
available in 'extended mode.' the sd9 didn't even *have* an iso 1600.

Again, that is to be expected due to the reduced colour separation and
consequential matrix deconvolution required to create conventional RGB
samples. It is *not* evidence of your claimed reduced storage capacity
per colour sample compared to conventional devices.

The question is about what pixel count is relevant to image resolution.
The Bayer camp are certainly guilty of exaggerating that as much as
the Foveon camp.

they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples - usually taken from three individually filtered CRTs.


it still does. it's a spatial element of an image. it also doesn't
have to be three samples per pixel (e.g., cmyk or hexachrome).

It needs to be *at least* three samples per pixel for colour. You, like
most Bayer-fanbois, count each and every monochrome sample, which is
quite wrong.
It only
became one of those colour samples well into the Bayer era, at the start
of the megapixel war of digital camera marketing.


what's on the sensor is really a sensor element (sensel), not a pixel,
but the counts are the same and people use the terms
interchangeably. a bayer sensor measures one component per pixel and
calculates the other two.


The Bayer sensor does not calculate the other two - it is up to the
processing electronics to *estimate* the other two by interpolation from
adjacent monochrome information in alternate colours by one of many
algorithms. It requires AA filters to ensure that these interpolation
algorithms are not fooled into creating completely erroneous data and
hence cross-colour aliasing.

there's still three components per pixel in
the output image. chroma resolution is lower, but so is that of the
human eye, and it's not noticeable, except in artificial test cases.


Whether it is noticeable in the human eye is irrelevant since the image
can be viewed at any size, but it is actually a completely false
argument. Without the AA filter blurring the optical image across
multiple Bayer sensors, the cross-colour aliasing certainly would be
(and often is, where manufacturers have attempted to gain resolution
by using an inadequate AA filter) very noticeable to the human eye.

Since the Bayer filter requires the AA filter to blur the image over
several different monochrome pixels to obtain full colour pixel data, it
is quite wrong to use the total monochrome pixel count directly in
estimation of resolution. Which is back to where we came in... both
sides exaggerate the relevant pixel count in image resolution estimates.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#25 - July 8th 08, 09:43 PM - Kennedy McEwen

In article , Ray Fischer
writes
Kennedy McEwen wrote:
nospam


The question is about what pixel count is relevant to image resolution.
The Bayer camp are certainly guilty of exaggerating that as much as
the Foveon camp.

they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples


That has never been true. Even now there are sensors which are
monochrome and yet capture monochrome pixels.

That is why I emphasised *colour* in the text above. A monochrome pixel
can, of course, be a single image sample, but that isn't what was being
discussed. It is, however, precisely the metric that Bayer users use
for their *colour* sensors, whilst objecting to Foveon using the same
definition. I argue that *both* are wrong.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#26 - July 8th 08, 11:22 PM - nospam

In article , Kennedy McEwen
wrote:

The above links only show the three layer structure, which is not in
dispute since it is the now well known Foveon principle. However *none*
of the above links show that the structure is commensurate with a
reduced storage capacitance or higher noise per pixel and none of these
references even discuss your claim. Indeed, the second paragraph of
Page 5 appears to directly dispute your claim in stating that the use of
non-RGB Bayer filters causes an SNR reduction relative to other
approaches, including Foveon's own.


please explain how a sensor with a well capacity of 60k can maintain
60k photons per layer, which suggests that the total would then be 180k
photons. that makes no sense at all and i don't see any evidence to
support it.

if the full well is 60k and you slice it into three equal parts (which
foveon doesn't do), then each part can capture 1/3 of the total, or in
this case, 20k photons. combine the three layers and you have the
original 60k capacity and no colour differentiation.

according to foveon, the top layer is about 1/8th the thickness of the
entire well, thus its capacity is about 1/8th of the total, or 7500
photons.
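
in python, the arithmetic (which assumes per-layer capacity scales with
the layer's share of a single well -- exactly the point in dispute):

    # assumes per-layer capacity scales with each layer's share of one
    # 60k-electron well -- the assumption being argued about here.
    full_well = 60_000
    equal_third = full_well / 3     # 20,000 for three equal slices
    top_eighth = full_well / 8      # ~7,500 if the top layer is ~1/8th
    print(equal_third, top_eighth)  # 20000.0 7500.0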

i'd love to see a foveon sensor analysis along the lines of what roger
clark has done and get a more accurate count of the entire pixel versus
each individual layer.

absent that, all of the evidence that i've seen - including noisy and
posterized blue channels in images, noise being worse in tungsten
light, problems at moderate iso, the need to get the exposure exactly
correct (very little latitude), etc. - is consistent with this.

they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples - usually taken from three individually filtered CRTs.


it still does. it's a spatial element of an image. it also doesn't
have to be three samples per pixel (e.g., cmyk or hexachrome).

It needs to be *at least* three samples per pixel for colour. You, like
most Bayer-fanbois, count each and every monochrome sample, which is
quite wrong.


the correct way to count pixels is to count spatial elements, since
that's exactly what they are.

the fact that bayer measures one component per pixel and calculates the
other two is an implementation detail that makes the sensor a little
less accurate than it otherwise could be, but the error is actually
very small.

foveon measures three samples per pixel and then goes through a lookup
table and a transform to get the final three component rgb pixel, so in
reality, it's interpolating all three components, versus bayer
interpolating only two.

The Bayer sensor does not calculate the other two - it is up to the
processing electronics to *estimate* the other two by interpolation from
adjacent monochrome information in alternate colours by one of many
algorithms.


technically true, but that's a nitpick. i don't know of anyone who
uses a bayer sensor without the associated electronics and raw
processing to produce images. thus, referring to a bayer sensor
implies using the entire system and not just the sensor alone.

there's still three components per pixel in
the output image. chroma resolution is lower, but so is that of the
human eye, and it's not noticeable, except in artificial test cases.


Whether it is noticeable in the human eye is irrelevant


it's very relevant. why bother capturing what the eye can't see?

sure it sounds great to have full colour fidelity, but humans aren't
going to see a difference. why bother?

since the image
can be viewed at any size, but it is actually a completely false
argument. Without the AA filter blurring the optical image across
multiple Bayer sensors, the cross-colour aliasing certainly would be
(and often is, where manufacturers have attempted to gain resolution
by using an inadequate AA filter) very noticeable to the human eye.

Since the Bayer filter requires the AA filter to blur the image over
several different monochrome pixels to obtain full colour pixel data,


all sampling systems require anti-alias filtering, or they risk alias
artifacts with signals near and certainly past nyquist. on a bayer
sensor, the artifacts produce ugly colour patterns. on foveon, the
artifacts are not as readily noticeable, but they are definitely there.


here's an old example photo:
http://www.wfu.edu/~matthews/misc/DigPhotog/alias/artifact.jpg

it
is quite wrong to use the total monochrome pixel count directly in
estimation of resolution. Which is back to where we came in... both
sides exaggerate the relevant pixel count in image resolution estimates.


pixel counts are just that -- counts of spatial elements.

the sensor in the nikon d300 has 12 million spatial locations, or 12
megapixels. the sensor in the sigma sd14 has 4.7 million spatial
locations, or 4.7 megapixels. each foveon pixel has 3 receptors, for a
total of 14 million receptor sites, but it does not have 14 million
pixels.
#27 - July 9th 08, 07:47 PM - Kennedy McEwen

In article , nospam
writes
In article , Kennedy McEwen
wrote:

The above links only show the three layer structure, which is not in
dispute since it is the now well known Foveon principle. However *none*
of the above links show that the structure is commensurate with a
reduced storage capacitance or higher noise per pixel and none of these
references even discuss your claim. Indeed, the second paragraph of
Page 5 appears to directly dispute your claim in stating that the use of
non-RGB Bayer filters causes an SNR reduction relative to other
approaches, including Foveon's own.


please explain how a sensor with a well capacity of 60k can maintain
60k photons per layer, which suggests that the total would then be 180k
photons. that makes no sense at all and i don't see any evidence to
support it.

if the full well is 60k and you slice it into three equal parts (which
foveon doesn't do), then each part can capture 1/3 of the total, or in
this case, 20k photons. combine the three layers and you have the
original 60k capacity and no colour differentiation.

Your entire assessment assumes that the total storage capacity is simply
the same as a single layer device. It isn't. Foveon could only design
their detector once NatSemi had developed their production facility that
provided the additional layers of metalisation and polySi deposition
necessary for multilayer structures. They made that clear in their
initial press release, and it is also clear in the paper you yourself cited
earlier at http://www.foveon.com/files/CIC10_Lyon_Hubel_FINAL.pdf just
above Figure 3:
"Three separate PN junctions are buried at different depths inside the
silicon surface and used to separate the electron-hole pairs that are
formed by this naturally occurring property of silicon."

The 3 photodiodes are created in a stack and, just as in monochrome and
Bayer sensors, the storage capacitance is actually the diode capacitance
itself. I see no reason to believe that the breakdown voltage of the
diodes should be any lower than the CMOS rail voltage just because they
are thinner than standard panchromatic photodiodes, since such small
featured diodes occur throughout CMOS chip designs and these devices are
manufactured in foundries which are nowhere near current minimum
geometry limits. Since the photodiodes can operate at the same voltage,
the storage capacity is exactly the same, or very similar.
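
A rough Python sketch of that relationship: the full-well count follows
from the junction capacitance and the bias swing, N = CV/q. The
capacitance and voltage below are illustrative assumptions on my part,
not published Foveon figures.

    # Full-well capacity from junction capacitance and bias swing, N = C*V/q.
    # The values of C and V are illustrative guesses, not Foveon data.
    q = 1.602e-19   # electron charge, coulombs
    C = 3e-15       # assumed junction capacitance: 3 fF
    V = 3.0         # assumed reverse-bias swing, volts
    print(f"full well ~ {C * V / q:,.0f} e-")   # ~56,000 e- per photodiode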

In short, each photodiode in the layers has the same capacity as in a
single layer structure. And before you ask the obvious question of why
no Bayer chip designers stack diodes or capacitors to achieve higher
capacity, they certainly could; however, it would result in reduced ISO
since the photodiode QE would remain the same but the storage
capacitance would increase, whilst most commercial pull is towards
higher ISO.

The key to this is the NatSemi multilevel metal and polysilicon process.
Some manufacturers in other fields use something similar to create
higher storage capacitance in a fixed area, such as TI's old trench
capacitor design.

Foveon themselves state that the noise on the detector itself is similar
to that of conventional devices and that "most of the
luminance and chrominance noise that was noticeable in the final
image was due to the color transformation matrix magnifying the
noise..." from http://www.foveon.com/files/CIC13_Hubel_Final.pdf

they're not guilty at all. bayer uses the term 'pixel' correctly, as
it has been used long before there *was* a bayer or foveon.

Wrong! Pre-bayer a *colour* pixel comprised three individual colour
samples - usually taken from three individually filtered CRTs.

it still does. it's a spatial element of an image. it also doesn't
have to be three samples per pixel (e.g., cmyk or hexachrome).

It needs to be *at least* three samples per pixel for colour. You, like
most Bayer-fanbois, count each and every monochrome sample, which is
quite wrong.


the correct way to count pixels is to count spatial elements, since
that's exactly what they are.

Yes, but colour spatial elements are not just the data samples.

the fact that bayer measures one component per pixel and calculates the
other two is an implementation detail that makes the sensor a little
less accurate than it otherwise could be, but the error is actually
very small.

I disagree. They get away with it because there is no immediate visual
reference. However, if you take a typical Bayer (or any other sensor)
12Mp image from today's sensors and downsample that by a factor of 3
(ie. 1.73 linear downscaling) you can create a synthetic reference and
it is significantly better than an equivalent 4Mp Bayer image from
yesterday's sensors.
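
With PIL in Python the synthetic reference is a one-liner; the file
names here are hypothetical:

    from PIL import Image

    # Pool roughly three interpolated sensels into each output pixel:
    # a factor of 3 in area is sqrt(3) ~ 1.73 linearly.
    img = Image.open("bayer_12mp.jpg")          # hypothetical input file
    w, h = img.size
    f = 3 ** 0.5
    img.resize((round(w / f), round(h / f)), Image.LANCZOS).save("ref_4mp.jpg")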

foveon measures three samples per pixel and then goes through a lookup
table and a transform to get the final three component rgb pixel, so in
reality, it's interpolating all three components, versus bayer
interpolating only two.

You clearly misunderstand the difference between matrix colour
processing and spatial interpolation. There is no spatial interpolation
in the former, it is linear arithmetic at each point in the image. No
output pixel on a Foveon image contains any information from adjacent
pixels. Every output pixel on a Bayer image not only contains
information from adjacent pixels, but the image is first smeared by an
AA filter to ensure that the spatial interpolation does not create
significant colour errors.
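
A minimal numpy sketch of the distinction; the matrix values are
invented for illustration, but the operation itself is purely per-pixel:

    import numpy as np

    # Per-pixel colour-matrix transform: each output pixel is a linear
    # combination of that *same* pixel's three layer signals.
    M = np.array([[ 2.1, -0.8, -0.1],     # illustrative values only
                  [-0.5,  1.9, -0.4],
                  [ 0.1, -0.6,  1.7]])
    layers = np.random.rand(8, 8, 3)      # top/middle/bottom signal per pixel
    rgb = layers @ M.T                    # no neighbouring pixel is consulted
    print(rgb.shape)                      # (8, 8, 3)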

The Bayer sensor does not calculate the other two - it is up to the
processing electronics to *estimate* the other two by interpolation from
adjacent monochrome information in alternate colours by one of many
algorithms.


technically true, but that's a nitpick. i don't know of anyone who
uses a bayer sensor without the associated electronics and raw
processing to produce images. thus, referring to a bayer sensor
implies using the entire system and not just the sensor alone.

Quite - so why do you insist on ignoring the effect of more than half of
the Bayer system in estimating its pixel density?

there's still three components per pixel in
the output image. chroma resolution is lower, but so is that of the
human eye, and it's not noticable, except in artificial test cases.


Whether it is noticeable in the human eye is irrelevant


it's very relevant. why bother capturing what the eye can't see?

Accepted - bad wording on my part. I should have said that it is not
relevant that the human eye has a lower chroma resolution, since the
image can be viewed at any scale - even when the chroma limitations of
the image are well resolved by the eye.

sure it sounds great to have full colour fidelity, but humans aren't
going to see a difference. why bother?

They are, they just need to look closer.

all sampling systems require anti-alias filtering, or they risk alias
artifacts with signals near and certainly past nyquist.


Great theory, but you only quote (and possibly understand) part of it.
Alias spatial frequencies caused by undersampling are *always* lower
than the input stimuli that cause them. However, the aliased spatial
frequency has the same spatial extent as the higher spatial frequency
that causes it. These two facts mean that unless the input frequency
has a spatial extent greater than a complete cycle of the alias
frequency, ie. the input must be a repeated regular pattern, then the
result is simply a moderation of the output sample amplitude, not the
creation of classical aliasing all. Repeated regular patterns don't
happen much in nature, so aliasing in imaging is much rarer than is
generally believed. That isn't to say it doesn't exist, it certainly
does, but it also exists in the real world too, without any image sensor
- the term moiré existed long before digital sensors for good reason.
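
A one-dimensional numpy illustration of that folding:

    import numpy as np

    # Sampling at fs = 10, a sinusoid at f = 9 (above Nyquist = 5) is
    # indistinguishable at the sample instants from one at |fs - f| = 1,
    # a *lower* frequency than the stimulus that caused it.
    fs, f = 10.0, 9.0
    t = np.arange(0, 4, 1 / fs)
    print(np.allclose(np.sin(2 * np.pi * f * t),
                      -np.sin(2 * np.pi * (fs - f) * t)))   # True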

on a bayer
sensor, the artifacts produce ugly colour patterns. on foveon, the
artifacts are not as readily noticeable, but they are definitely there.

There are two significant differences:

First, without an AA filter the Bayer sensor actually has two Nyquist
sampling limits - one for chroma and one for luma - and therefore two
alias reflection points. The chroma limit is half the luma limit. The
Foveon design has a single Nyquist limit irrespective of chroma or luma
scene content, ensuring that when aliasing does occur, chroma and luma
remain in phase however severe it is.
The Nyquist limit of the Foveon sensor is twice as high as the chroma
limit of a Bayer sensor relative to the spatial sampling frequency.

Second, chroma aliasing is much more visually objectionable than luma
aliasing and requires a much more aggressive AA filter to eliminate it.
The filter must cut off at a lower spatial frequency because the Nyquist
point is lower, and it must attenuate more strongly above that cut
because chroma aliasing is so much more objectionable.
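
In cycles per sensel pitch the limits in question are simply:

    # Full-grid sampling (monochrome, 3-chip, Foveon) versus half-rate R/B
    # sampling in a Bayer mosaic, in cycles per sensel pitch.
    luma_nyquist = 0.5
    bayer_chroma_nyquist = 0.25
    print(luma_nyquist / bayer_chroma_nyquist)   # 2.0, the factor cited above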

here's an old example photo:
http://www.wfu.edu/~matthews/misc/DigPhotog/alias/artifact.jpg

Nice example, showing pure luma aliasing, and that a 3.4Mp SD-9 has
comparable spatial resolution to the 6.3Mp D60, if not more, as shown
in the random fur, the stitching of the liner and the texture of the
foreground part.

it
is quite wrong to use the total monochrome pixel count directly in
estimation of resolution. Which is back to where we came in... both
sides exaggerate the relevant pixel count in image resolution estimates.


pixel counts are just that -- counts of spatial elements.

Only if, as you are consistently doing, you ignore the AA filter which
reduces the number of *independent* spatial samples and hence the number
of effective pixels. Upsampling a 3Mp image to 12 million samples using
pixel replication doesn't make an image of 12Mpixel resolution. That is
effectively the reverse operation of the AA filter - 12 million spatial
samples, but much less than 12Mp resolution.
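
For example:

    import numpy as np

    # Pixel replication: 1 million samples become 4 million, but no new
    # independent information appears - the reverse of what an AA filter does.
    img = np.random.rand(1000, 1000)                      # real samples
    big = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    print(big.size, "samples,", img.size, "independent")  # 4000000 vs 1000000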

the sensor in the nikon d300 has 12 million spatial locations, or 12
megapixels.


No, it has 12 million *monochromatic* spatial samples, of which only
around 3 million are truly independent, and hence a similar number of
*trichromatic* spatial samples - the reduction caused by the AA filter
and pixel interpolation, as it must be to avoid cross-colour aliasing
artefacts.

Using the number of spatial samples as a resolution metric for Bayer
sensors is only relevant when comparing to other Bayer sensors which
suffer from the same handicap. As soon as you bring in something that
does not have that particular handicap, such as a 3-chip sensor or a
Foveon sensor, then the metric fails. Similarly it is wrong to use 3x
the number of spatial samples for such devices as a comparison metric
with Bayer sensors.

There is no doubt that the actual number of spatial samples in the
Foveon chip is the correct colour resolution metric since it is
comparable with other technologies such as 3 chip colour cameras,
scanning cameras and, indeed, conventional image scanners.

There is also no doubt that the number of spatial samples on a Bayer
chip is an exaggerated colour resolution metric, since the images are
noticeably inferior in resolution to the other imaging technologies
mentioned above at the same sample count.

The question that remains is how much the Bayer count is exaggerated -
the factor is certainly more than 1 but less than 4, and probably even
less than 3, though it varies from implementation to implementation
depending on each company's AA offering at the time. Between 1 and 3 is
quite a wide range, much more than the MP count between the leading
examples of each vendor in recent times. Yet there is no established
means of reviewing it.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#28 - July 18th 08, 05:02 AM - nospam

In article , Kennedy McEwen
wrote:

please explain how a sensor with a well capacity of 60k can maintain
60k photons per layer, which suggests that the total would then be 180k
photons. that makes no sense at all and i don't see any evidence to
support it.

if the full well is 60k and you slice it into three equal parts (which
foveon doesn't do), then each part can capture 1/3 of the total, or in
this case, 20k photons. combine the three layers and you have the
original 60k capacity and no colour differentiation.

Your entire assessment assumes that the total storage capacity is simply
the same as a single layer device. It isn't.


it's fairly close. it's certainly not triple as you're implying.

In short, each photodiode in the layers has the same capacity as in a
single layer structure.


so explain how that works.

the foveon images i've seen don't bear that out, nor have i seen any
evidence that supports it. quite the opposite in fact.

You clearly misunderstand the difference between matrix colour
processing and spatial interpolation. There is no spatial interpolation
in the former, it is linear arithmetic at each point in the image. No
output pixel on a Foveon image contains any information from adjacent
pixels. Every output pixel on a Bayer image not only contains
information from adjacent pixels, but the image is first smeared by an
AA filter to ensure that the spatial interpolation does not create
significant colour errors.


the fact you call anti-aliasing 'smear' shows where your bias is. the
anti-alias filter attenuates what the sensor can't resolve properly.
pick your poison, artifacts or 'smear.'

pixel counts are just that -- counts of spatial elements.

Only if, as you are consistently doing, you ignore the AA filter


of course. pixel counts are counting *what's on the sensor*. it has
nothing to do with an aa filter or the capabilities of the lens that's
used.

Using the number of spatial samples as a resolution metric for Bayer
sensors is only relevant when comparing to other Bayer sensors which
suffer from the same handicap. As soon as you bring in something that
does not have that particular handicap, such as a 3-chip sensor or a
Foveon sensor, then the metric fails.


i'm not using the number of samples as a resolution metric. it's
merely a physical property of the sensor.

Similarly it is wrong to use 3x
the number of spatial samples for such devices as a comparison metric
with Bayer sensors.


agreed that 3x is wrong.
#29 - July 18th 08, 07:31 PM - Kennedy McEwen

In article , nospam
writes
In article , Kennedy McEwen
wrote:

please explain how a sensor with a well capacity of 60k can maintain
60k photons per layer, which suggests that the total would then be 180k
photons. that makes no sense at all and i don't see any evidence to
support it.

if the full well is 60k and you slice it into three equal parts (which
foveon doesn't do), then each part can capture 1/3 of the total, or in
this case, 20k photons. combine the three layers and you have the
original 60k capacity and no colour differentiation.

Your entire assessment assumes that the total storage capacity is simply
the same as a single layer device. It isn't.


it's fairly close.


It is far from close.

it's certainly not triple as you're implying.


I am not implying it is triple. In fact, as shown in the original
Foveon patent (No. 5965875), it is just over double.

In short, each photodiode in the layers has the same capacity as in a
single layer structure.


so explain how that works.

I thought I already had. These are 3 independent photodiodes, each
capable of being biased at the full rail voltage of the chip without
breakdown. That gives *each* photodiode exactly the same storage
capacity per unit area as conventional single layer photodiodes, or 3x
as much as any individual Bayer diode. In practice, as shown in the
Foveon patent, the area of the green and blue diodes is less than that
of the red, hence the total storage capacity is somewhat less than 3x.

the foveon images i've seen don't bear that out, nor have i seen any
evidence that supports it. quite the opposite in fact.

You clearly misunderstand the difference between matrix colour
processing and spatial interpolation. There is no spatial interpolation
in the former, it is linear arithmetic at each point in the image. No
output pixel on a Foveon image contains any information from adjacent
pixels. Every output pixel on a Bayer image not only contains
information from adjacent pixels, but the image is first smeared by an
AA filter to ensure that the spatial interpolation does not create
significant colour errors.


the fact you call anti-aliasing 'smear' shows where your bias is.


The only bias I have is towards fair representation, on both sides. For
the record, I don't and never have owned a dSLR with a Foveon sensor in
it, although I have evaluated the chip itself. I own, and regularly
use, the Bayer sensor based Canon 5D. Bias clear enough for you?

the
anti-alias filter attenuates what the sensor can't resolve properly.
pick your poison, artifacts or 'smear.'

Smear is precisely what the AA filter does - one AA filter smears the
image in the horizontal axis and the other AA filter smears the image in
the vertical axis, in each case by a significant part of the pixel
pitch, depending on how strong the AA filter is.
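
A toy numpy model of those two plates, using a full-pitch split for
simplicity (real filters split by a fraction of the pitch):

    import numpy as np

    # Two-plate birefringent AA model: one pass averages horizontally shifted
    # copies of the image, the next vertically (edges wrap here for brevity).
    img = np.random.rand(6, 6)
    h = 0.5 * (img + np.roll(img, -1, axis=1))   # horizontal plate
    aa = 0.5 * (h + np.roll(h, -1, axis=0))      # vertical plate
    print(aa.round(2))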

pixel counts are just that -- counts of spatial elements.

Only if, as you are consistently doing, you ignore the AA filter


of course. pixel counts are counting *what's on the sensor*. it has
nothing to do with an aa filter or the capabilities of the lens that's
used.


Now who is biased? Of course the AA filter matters, just as the lens
does. There simply is no point in claiming 12.7Mp resolution for an FF
sensor (ie. 8.4um pixels) and then using it with a lens which can only
resolve features down to 25um. The result is only marginally better
than a true 3.3Mp system. Just the same as claiming 12.7Mp for a sensor
which requires an AA filter to smear pairs of Bayer sensels in both
axes. The result is much less than is claimed - Bayer resolution
exaggeration.

It's only two posts back in the thread that you were claiming, correctly
as I noted, that the sensor couldn't be treated independently - it was
part of the whole system. Well, sunshine, that AA filter is part of
that whole system, and it smears the image to ensure that the actual
resolution is considerably lower than the number of data samples.


Using the number of spatial samples as a resolution metric for Bayer
sensors is only relevant when comparing to other Bayer sensors which
suffer from the same handicap. As soon as you bring in something that
does not have that particular handicap, such as a 3-chip sensor or a
Foveon sensor, then the metric fails.


i'm not using the number of samples as a resolution metric. it's
merely a physical property of the sensor.

You are using it as a resolution metric as soon as you refer to pixels as
spatial samples. They are not independent spatial samples in the Bayer
sensor, and so the metric you are using for Bayer sensors is completely
invalid in comparisons with sensors which are not similarly handicapped.

Similarly it is wrong to use 3x
the number of spatial samples for such devices as a comparison metric
with Bayer sensors.


agreed that 3x is wrong.

They are BOTH wrong, and whilst you seem keen to falsely accuse others
of bias, you appear to be unable to recognise your own.
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's ****ed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
#30 - July 21st 08, 06:27 AM - nospam

In article , Kennedy McEwen
wrote:

it's certainly not triple as you're implying.


I am not implying it is triple. In fact, as shown in the original
Foveon patent (No. 5965875), it is just over double.


where in the patent does it discuss double photon counts?

In short, each photodiode in the layers has the same capacity as in a
single layer structure.


so explain how that works.

I thought I already had. These are 3 independent photodiodes, each
capable of being biased at the full rail voltage of the chip without
breakdown. That gives *each* photodiode exactly the same storage
capacity per unit area as conventional single layer photodiodes, or 3x
as much as any individual Bayer diode.


first you say it's not triple, it's double, now you say it really is
triple. which is it?

In practice, as shown in the
Foveon patent, the area of the green and blue diodes is less than that
of the red, hence the total storage capacity is somewhat less than 3x.


yet another contribution to the poor performance in the blue channel.
also, the patent does show concentric rings, but there's some debate as
to whether the existing sensor is built that way.

Smear is precisely what the AA filter does - one AA filter smears the
image in the horizontal axis and the other AA filter smears the image in
the vertical axis, in each case by a significant part of the pixel
pitch, depending on how strong the AA filter is.


it's only 'smearing' the detail that the sensor can't resolve without
aliasing, so it's really a question of accuracy versus artifacts, and
that's subjective.

pixel counts are just that -- counts of spatial elements.

Only if, as you are consistently doing, you ignore the AA filter


of course. pixel counts are counting *what's on the sensor*. it has
nothing to do with an aa filter or the capabilities of the lens that's
used.


Now who is biased? Of course the AA filter matters, just as the lens
does.


when evaluating the resolution of the entire system, the anti-alias
filter and the lens definitely matter. however, when *counting* pixels
on a particular sensor, the anti-alias filter is not relevant at all.

for example, take a 6mp cellphone camera and a 6mp dslr. they both
have 6 million pixels (because it's just a physical count), but they
have drastically different resolution. further, the cellphone lacks an
anti-alias filter, relying on diffraction and a crappy lens to limit
the spatial detail.
 



