'Magic' kernel for image zoom resampling?



 
 
#1  John Costella, June 6th 06, 01:38 PM, posted to comp.graphics.algorithms, rec.photo.digital

Hi folks,

I've recently generated a solution to an image zooming problem, and
can't find anything quite like it on our good old web or usenet. I'm
trying to figure out if I've stumbled on something that might actually
be useful to other people.

My need was to downsample an image by a factor of 8 in each direction
(i.e. 1 downsampled pixel for every 64 original image pixels), and then
upsample the result back to the original resolution. I understand all
the information-theoretic aspects of this process; I was trying to
figure out if there was a kernel that did both tasks well (i.e., the
final result was smooth and without stellations or other strange
artifacts), and also did them fast.

After some truly terrible attempts (even though I thought I had sinc
functions well and truly under my belt 20 years ago), I found that the
following recursive algorithm works amazingly well:

1. To downsample or upsample by a factor of 2, use a kernel

1 3 3 1
3 9 9 3
3 9 9 3
1 3 3 1

placed over every second pixel in each direction. (The total
normalization of 64 applies to downsampling. For upsampling, divide by
only 16, because every upsampled pixel gets a contribution from four
downsampled pixels' kernels.)

2. To downsample or upsample by a factor of 2^k, perform the above
resampling recursively, k times. (A minimal code sketch of both steps
follows below.)
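
In code, a minimal sketch of the factor-of-2 step looks like the
following (Python/NumPy; the function names, the clamped edges on
downsampling, and the renormalized borders on upsampling are my
illustrative choices only, since I haven't pinned down boundary
handling above):

import numpy as np

# The 1D "magic" kernel; the 4x4 kernel above is its outer product with itself.
K1D = np.array([1.0, 3.0, 3.0, 1.0])

def downsample2_1d(x):
    # Each output sample i covers input samples 2i-1 .. 2i+2 with weights
    # 1,3,3,1 and normalization 8 (8 x 8 = 64 in 2D).  Edges are clamped,
    # which is an assumption, not part of the description above.
    n = len(x)
    out = np.empty(n // 2)
    for i in range(n // 2):
        idx = np.clip([2 * i - 1, 2 * i, 2 * i + 1, 2 * i + 2], 0, n - 1)
        out[i] = np.dot(K1D, x[idx]) / 8.0
    return out

def upsample2_1d(x):
    # Each coarse sample splats weights 1,3,3,1 onto four fine samples, so
    # every interior fine sample ends up with total weight 4 (4 x 4 = 16 in 2D).
    n = len(x)
    out = np.zeros(2 * n)
    wsum = np.zeros(2 * n)
    for j in range(n):
        for tap, w in zip(range(2 * j - 1, 2 * j + 3), K1D):
            if 0 <= tap < 2 * n:
                out[tap] += w * x[j]
                wsum[tap] += w
    return out / wsum

def resample2_2d(img, direction):
    # The kernel is separable, so apply the 1D step along each axis in turn.
    f = downsample2_1d if direction == "down" else upsample2_1d
    return np.apply_along_axis(f, 1, np.apply_along_axis(f, 0, img))

def resample_2k(img, k, direction):
    # Factor of 2**k: just repeat the factor-of-2 step k times.
    for _ in range(k):
        img = resample2_2d(img, direction)
    return img

The factor-of-8 round trip I described at the top is then
small = resample_2k(img, 3, "down") followed by
resample_2k(small, 3, "up").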

The results for upsampling are astoundingly good, really astoundingly
good, and better than anything I could get Photoshop to do. (Some years
ago I did my fair share of 'extreme zooming' in some work on the
photographic evidence of the JFK assassination, so I'm very familiar
with all the strange artifacts that can pop out, and the disappointment
that accompanies them.)

I can upload a few images to my website if need be, to demonstrate what
I am talking about (or, equivalently, point me to some test images and
I'll show you what comes out).

For my particular purpose (8:8 downsampling and upsampling), applying
this 'magic' kernel three times yields a kernel that is only 22 x 22 in
size (if you want to precompute the kernel and apply it once, as I
ultimately want to, rather than actually performing the doubling process
three times). (Every time you apply it, double the kernel width or
height and add 2, so 2 x 4 + 2 = 10, and 2 x 10 + 2 = 22.) That's
pretty good, considering that, for 800% zooming, it's already 16 pixels
from nearest pixel to nearest pixel, in each dimension.
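
(If you do want that precomputed kernel explicitly, one illustrative
way, using the sketch above, is to push a unit impulse through k
doubling steps and read off the nonzero footprint; the helper below is
mine, not part of the method itself:)

import numpy as np

def effective_upsample_kernel(k, pad=16):
    # Feed a unit impulse through k doubling steps (via the resample_2k
    # sketch above) and crop to the nonzero footprint; with pad large
    # enough, the border handling never reaches the centre.
    field = np.zeros((2 * pad + 1, 2 * pad + 1))
    field[pad, pad] = 1.0
    out = resample_2k(field, k, "up")
    rows, cols = np.nonzero(out)
    return out[rows.min():rows.max() + 1, cols.min():cols.max() + 1]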

Of course, if you wanted to resample by something that is not a power
of 2, then you'd need to use the 'magic' kernel for the nearest power
of 2, and then use something more traditional for the final small
adjustment in resolution. That's no major problem, because getting to
the final result from the nearest power of 2 is never worse than
between a 71% and 141% zoom, and just about any resampling method does
a decent job in this range.
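
As a sketch of that composition (using the resample_2k helper above for
the power-of-2 part and, purely as an example, scipy.ndimage.zoom for
the small residual adjustment; any conventional resampler would do):

import numpy as np
from scipy import ndimage  # only for the small residual resize

def resample_general(img, factor):
    # Cover the nearest power of 2 with the 'magic' kernel steps, then
    # make the remaining adjustment (never worse than roughly 71%-141%)
    # with a conventional resampler.
    k = int(np.rint(np.log2(factor)))             # nearest power of 2
    img = resample_2k(img, abs(k), "up" if k > 0 else "down")
    residual = factor / 2.0 ** k                  # roughly in [0.71, 1.41]
    return ndimage.zoom(img, residual, order=3) if abs(residual - 1) > 1e-6 else img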

My question is: Is this 'magic kernel' something new, or is this trick
known?

The closest I could find on the net is the 'stair interpolation' trick,
which uses Photoshop's bicubic for successive 110% increases, which is
sort of, kind of, the same idea, but not quite. The other resampling
kernels I could find on the net look much more like what I was trying
to do in the first place, but nothing like what I ended up with.

The 'magic' kernel sort of reminds me of the Fast Fourier Transform,
which also gets all sorts of amazing efficiencies with powers of 2, and
then needs a bit of a (non-critical) fudge if you don't quite have a
power of 2.

Oh, and by the way, I do know how I arrived at the 'magic' kernel above
(it came from another aspect of the project that needed just 2:2
downsampling and upsampling), and it has some nice mathematical
properties. It is separable, in the sense that it is the product of an
'x' 1-3-3-1 and a 'y' 1-3-3-1, and its normalization is always a power
of 2, which means everything can be done with integer look-ups and
bit-shifts, which makes me extremely happy. But I have no mathematical
proof at all of why it works so damn well when applied to itself
recursively.
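
(As a two-line illustration of that separability and normalization, in
the same NumPy notation as the sketch above:)

import numpy as np

k1 = np.array([1, 3, 3, 1])
k2d = np.outer(k1, k1)   # reproduces the 4x4 kernel quoted at the top
assert k2d.sum() == 64   # 64 = 2**6, so the downsampling division is a >> 6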

Anyhow, thanks in advance for any words of advice. And apologies in
advance if I've rediscovered someone's magic kernel, as is bound to be
the case.

John Costella

_______________________________________________
Dr. John P. Costella
BE(Elec)(Hons) BSc(Hons) PhD(Physics) GradDipEd
Melbourne, Victoria, Australia


assassinationscience.com/johncostella

#2  Bart van der Wolf, June 7th 06, 03:06 AM, posted to comp.graphics.algorithms, rec.photo.digital


John Costella wrote:
SNIP
Anyhow, thanks in advance for any words of advice. And apologies
in advance if I've rediscovered someone's magic kernel, as is bound
to be the case.


John, you might also want to post your question to
news://sci.image.processing; there are some knowledgeable folks there,
and the tone of voice is reasonable. I've yet to test your kernel on
e.g. a zone-plate type of target, like the one I used for my
informal/empirical verdict on proper down-sampling:
http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm
Objectively quantifying the resulting quality is probably beyond my
capabilities, unless you have a suggestion for further research.

Obviously, that page was intended only to point out down-sampling
pitfalls, since down-sampling is becoming a more common operation as
the average sensor array increases in size/sampling density.

As an aside, last night I saw a TV feature following a new,
approximately three-hour documentary on the JFK assassination, which
argues that the shooter was James Files rather than Lee Harvey Oswald
(who is assumed in the documentary to have been an undercover CIA
operative). It is based on the extensive research of fellow Dutchman
Wim Dankbaar, a researcher fascinated by the conspiracy, murder, and
cover-up of this factual coup d'etat. Lots of details that were new to
me were uncovered, including an interview in which James Files reveals
many verifiable specifics about how, as a backup Mob hitman, he fired
the lethal head shot with a mercury-filled bullet from a "Remington
Arms Fireball XP-100" from behind the fence at the grassy knoll
(http://www.jfkmurdersolved.com/filestruth.htm).

Bart

#3  Kevin McMurtrie, June 7th 06, 05:02 AM, posted to comp.graphics.algorithms, rec.photo.digital

Factor of two resampling is trivial. It's the other ratios that tend to
produce beats where pixels go in and out of phase with the original.
Various algorithms attempt to reduce the beats without blurring the
image too much. One trick is to find image edges and nudge them into
alignment with the destination pixels, but that only creates a different
kind of distortion.
#4  John Costella, June 7th 06, 01:00 PM, posted to comp.graphics.algorithms, rec.photo.digital

Bart van der Wolf wrote:
[...]
I've yet to test your kernel on e.g. a
zone-plate type of target, like I used for my informal/empirical
verdict on proper down-sampling:
http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm.


Bart,

That's an interesting test you have applied, and you're right that it
highlights obvious pitfalls. However, distinguishing the 'best' ones
is tricky. I note that your original GIF image has faint Moiré rings
away from the center itself, as can be seen if you zoom it and look
carefully. I guess what you're really testing is 'relative badness'.
(And I guess the way that you generated your 'antialiased' original
image presupposes that you know how to do this in the first place!)

It's also tricky looking at your site, because the results are
presented at actual size, and my LCD screen is producing Moiré rings
on some of the ones you claim to be the 'best'.

But in any case, I used the 'magic' kernel on the original image to
downsample it to 25%. I've posted the result to

http://www.assassinationscience.com/...ings1_on_4.gif

To me, it looks comparable to the best results you obtained. Given that
it's remarkably simple to implement (see my additional posting today,
above, outlining a very simple algorithm for generating an arbitrary
2^k kernel), I'm happy with the result. However, again, I still have
some reservations about your original image, and also about whether a
zone plate is necessarily the best test of downsampling, given the
difficulties in generating it in the first place.
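
(In terms of the sketch in my first post, whose helper names are
illustrative only, that 25% reduction is simply two factor-of-2 steps:
quarter = resample_2k(original, 2, "down").)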

John

#5  John Costella, June 7th 06, 01:09 PM, posted to comp.graphics.algorithms, rec.photo.digital


Kevin McMurtrie wrote:
Factor of two resampling is trivial. It's the other ratios that tend to
produce beats where pixels go in and out of phase with the original.
Various algorithms attempt to reduce the beats without blurring the
image too much. One trick is to find image edges and nudge them into
alignment with the destination pixels, but that only creates a different
kind of distortion.


Kevin,

You may be right about a factor of 2, but what about a factor of 2^k? I
know that when I ask Photoshop to zoom something 800%, it doesn't do a
very good job. I am really looking at the case of high zoom (like 800%
or 1600% or 25600%). If you are telling me that 2^k is trivial, then I
do not agree.

I'm not really touching on the issue of resampling by a factor between
70% and 141%. You're right that frequencies in the original image will
beat in a resampling, but there is nothing much you can do about that
(unless, as you say, you distort the original). I'm taking it for
granted that resampling in this range has its own issues, but basically
gives acceptable results except in singular cases. It's the poor
performance for large factors that I'm targeting.

John

 



