Thirsty Moth

#131
In article , Eric Stevens wrote:

In a different newsgroup altogether, I have just read: "In The Republic, Plato worried that democracy meant the rule of the ignorant over the wise." Fortunately, democracy is not the decider of whether or not computing is involved.

I read your quote of a Platonic concern and your computing corollary. Surely you could have pulled a little something that is less than 100 years old, perhaps a little something from Mencken, or this example from de Saint-Exupery: "The machine does not isolate man from the great problems of nature, but plunges him more deeply into them."

I was giving you a reason why you should not rely on 'most people' to decide what is and what is not a computer.

actually you *should* go by what most people think. that's how language works. otherwise, there will be mass confusion.

a device that contains an embedded microprocessor is *not* what people think of when they hear 'computer'. a usb keyboard has a microprocessor and is used *with* a computer. nobody thinks of a usb keyboard by itself as a computer. the same goes for bluetooth headsets, lithium ion batteries, radios & tvs and much more.
#132
On Fri, 24 Jul 2015 17:30:27 -0700, Savageduck wrote:

On 2015-07-25 03:35:47 +0000, Eric Stevens said:

On Thu, 23 Jul 2015 14:14:32 -0700, Savageduck wrote:

On 2015-07-24 01:07:53 +0000, Eric Stevens said:

On Thu, 23 Jul 2015 09:55:06 -0700, Savageduck wrote:

On 2015-07-23 15:42:48 +0000, Whisky-dave said:

Probably why it's called RAW and, like meat, it needs converting so humans can eat it, but other animals can.

There is always PICTBridge, with no computer involved; sort of RAW tartare.

There has to be computing involved: probably in both the camera and the printer.

In this context most folks think of a "computer" as a desktop or laptop computer, not a processor chip with PICTBridge capability in a camera or printer.

In a different newsgroup altogether, I have just read: "In The Republic, Plato worried that democracy meant the rule of the ignorant over the wise." Fortunately, democracy is not the decider of whether or not computing is involved.

I read your quote of a Platonic concern and your computing corollary. Surely you could have pulled a little something that is less than 100 years old, perhaps a little something from Mencken, or this example from de Saint-Exupery: "The machine does not isolate man from the great problems of nature, but plunges him more deeply into them."

Yet there are those who are determined not to do more than get their toes wet.

--
Regards, Eric Stevens
#133
On Fri, 24 Jul 2015 17:30:27 -0700, Savageduck wrote:

On 2015-07-25 03:35:47 +0000, Eric Stevens said:

On Thu, 23 Jul 2015 14:14:32 -0700, Savageduck wrote:

On 2015-07-24 01:07:53 +0000, Eric Stevens said:

On Thu, 23 Jul 2015 09:55:06 -0700, Savageduck wrote:

On 2015-07-23 15:42:48 +0000, Whisky-dave said:

Probably why it's called RAW and, like meat, it needs converting so humans can eat it, but other animals can.

There is always PICTBridge, with no computer involved; sort of RAW tartare.

There has to be computing involved: probably in both the camera and the printer.

In this context most folks think of a "computer" as a desktop or laptop computer, not a processor chip with PICTBridge capability in a camera or printer.

In a different newsgroup altogether, I have just read: "In The Republic, Plato worried that democracy meant the rule of the ignorant over the wise." Fortunately, democracy is not the decider of whether or not computing is involved.

I read your quote of a Platonic concern and your computing corollary. Surely you could have pulled a little something that is less than 100 years old, perhaps a little something from Mencken, or this example from de Saint-Exupery: "The machine does not isolate man from the great problems of nature, but plunges him more deeply into them."

I was giving you a reason why you should not rely on 'most people' to decide what is and what is not a computer.

--
Regards, Eric Stevens
#134
On Fri, 24 Jul 2015 23:52:57 -0400, nospam wrote:

In article , Eric Stevens wrote:

In a different newsgroup altogether, I have just read: "In The Republic, Plato worried that democracy meant the rule of the ignorant over the wise." Fortunately, democracy is not the decider of whether or not computing is involved.

I read your quote of a Platonic concern and your computing corollary. Surely you could have pulled a little something that is less than 100 years old, perhaps a little something from Mencken, or this example from de Saint-Exupery: "The machine does not isolate man from the great problems of nature, but plunges him more deeply into them."

I was giving you a reason why you should not rely on 'most people' to decide what is and what is not a computer.

actually you *should* go by what most people think. that's how language works. otherwise, there will be mass confusion.

a device that contains an embedded microprocessor is *not* what people think of when they hear 'computer'. a usb keyboard has a microprocessor and is used *with* a computer. nobody thinks of a usb keyboard by itself as a computer. the same goes for bluetooth headsets, lithium ion batteries, radios & tvs and much more.

All of that is true. Nevertheless, if you look up the definition of computer you will be hard put to find one that requires a computer to have a keyboard, screen, printer, etc. See for example:

http://www.oxforddictionaries.com/de...glish/computer
http://searchwindowsserver.techtarge...ition/computer
http://www.thefreedictionary.com/computer

--
Regards, Eric Stevens
#135
| Wow. Stuck in the last C. and proud of it!

Did you have a point there?

Nospam seems to be right in saying that OSX provides an "image class", which is not surprising, and that class has extensive functionality. Again, not surprising: https://developer.apple.com/library/.../ci_intro.html

But the whole thing is still about pixels. A grid of pixels is what you see onscreen. A grid of pixels is what comes out of your printer. A grid of pixels is what gets sharpened, brightened, etc. Those are all operations that operate on pixels.

In the Windows API all image operations are done to DIBs, device-independent bitmaps, which are grids of pixels. The file type is the method of storage for that grid. It's not essentially different in Win95 or in Win10. What is different is that there are now more wrappers/objects/classes/libraries available that encapsulate graphic operations for convenience. If you look at the Apple developer pages you won't see any mention of bitmaps because Apple wraps it all for their developers.

A "class" is simply a wrapper; a package that encapsulates functionality and makes it available to the outside. The programmer calls a function like BrightenMyPic 5 and the class returns an image brightened by 5%. The caller doesn't need to know how it was done.

A class is a programming term, analogous to the cook's window in a diner. The waiter/waitress calls in and says, "Two over easy, wheat toast and hash." Their order shows up in the cook's window a few minutes later. The waiter doesn't need to know how the dish was created. For their job it's enough to know how to ask for it. Calling into a class -- Apple or otherwise -- is exactly the same thing.

It's important to understand that eggs over easy are not somehow magically generated just because you don't see the cook or the stove. And that a more modern diner doesn't somehow create eggs differently. ...Or rather, it's important to understand if you want to use the "diner class" successfully. You have to know what to ask for. If you call in "3 martinis, extra dry" you'll just get an error message back.

It's also important to understand if you want to go beyond the class. For instance, say Adobe comes out with a new, superior interpolation method for enlarging. If they make their code public (highly unlikely, I realize) then you can use it, but only if you know how to apply the formula to a bitmap. If Apple doesn't add that method to their class then Apple developers won't be able to feature Adobe Super-Duper Interpolation in Apple software any more than you can get a dry martini at a diner.

So for nospam to say that Apple uses a class and that bitmaps are out of date is directly analogous to saying that stoves and cooks are out of date and no longer used in modern restaurants, which have replaced all that with a cook's window. If you think that cooks and stoves are "so last century" then try returning your dish a couple of times, saying that it's "not quite hot enough". My guess is that the cook will eventually get annoyed and spit on your meal. The fact that it happens inside the opaque wrapper of the "cook's window class" doesn't make it any less real.
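To make the wrapper idea concrete, here is a minimal sketch of the pattern. This is illustrative C++ with made-up names (SimpleImage, Brighten); it is not Apple's or Microsoft's API, just the shape of the thing: the caller asks for a result in one line, and the class does the pixel work out of sight, like the cook behind the window.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical wrapper class: the caller asks to "brighten by N percent"
// and never touches the pixel grid -- but a grid of pixels is still what
// gets modified underneath.
class SimpleImage {
public:
    SimpleImage(int width, int height)
        : width_(width), height_(height),
          pixels_(static_cast<size_t>(width) * height * 4, 0) {}

    // Raise every RGB channel by 'percent', clamping at 255.
    // The alpha channel (every 4th byte) is left alone.
    void Brighten(int percent) {
        for (size_t i = 0; i < pixels_.size(); ++i) {
            if (i % 4 == 3) continue;  // skip alpha
            const int v = pixels_[i] + pixels_[i] * percent / 100;
            pixels_[i] = static_cast<uint8_t>(std::min(v, 255));
        }
    }

private:
    int width_;
    int height_;
    std::vector<uint8_t> pixels_;  // RGBA, row-major: the grid of pixels
};

int main() {
    SimpleImage img(640, 480);
    img.Brighten(5);  // the "BrightenMyPic 5" call from the analogy
    return 0;
}
```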
#136
In article , Mayayana wrote:

| Wow. Stuck in the last C. and proud of it!

Did you have a point there?

he did, which is that you're stuck in the past.

Nospam seems to be right in saying that OSX provides an "image class", which is not surprising, and that class has extensive functionality. Again, not surprising: https://developer.apple.com/library/...aging/Conceptual/CoreImaging/ci_intro/ci_intro.html

core image is not what i was talking about. start here for core graphics, which is the foundation for the more advanced stuff: https://developer.apple.com/library/...ntation/CoreGraphics/Reference/CoreGraphics_Framework/index.html

and then move on to nsimage/uiimage, for mac and ios, respectively:

https://developer.apple.com/library/...oa/Reference/ApplicationKit/Classes/NSImage_Class/
https://developer.apple.com/library/...it/Reference/UIImage_Class/index.html

But the whole thing is still about pixels. A grid of pixels is what you see onscreen. A grid of pixels is what comes out of your printer. A grid of pixels is what gets sharpened, brightened, etc. Those are all operations that operate on pixels.

so what? nobody said otherwise. the point is all that stuff has already been done in a way that takes full advantage of whatever graphics hardware might be present. why reinvent the wheel, especially since you won't be able to do as good of a job. will your sharpening algorithm offload to gpus, handle various pixel types, file formats etc.? not without a *lot* of effort that could be better spent elsewhere. however, you still can if you really want.

In the Windows API all image operations are done to DIBs, device independent bitmaps, which are grids of pixels. The file type is the method of storage for that grid. It's not essentially different in Win95 or in Win10. What is different is that there are now more wrappers/objects/classes/libraries available that encapsulate graphic operations for convenience.

good to see that windows is catching up. by the way, windows vista was the first version of windows to have a compositing window manager (dwm), something osx had years earlier.

If you look at the Apple developer pages you won't see any mention of bitmaps because Apple wraps it all for their developers.

wrong. there's definitely mention of bitmaps, and developers can create and modify bitmaps if they need to. what you're missing is that 99% of the time, you don't need to work at a low level.

A "class" is simply a wrapper; a package that encapsulates functionality and makes it available to the outside. The programmer calls a function like BrightenMyPic 5 and the class returns an image brightened by 5%. The caller doesn't need to know how it was done.

why would they need to know how it's done if all they want to do is make it brighter? nevertheless, anyone can still write their own brightenmypic filter (or any other filter) should the built-in ones not suffice. however, the built-in ones are almost always going to be faster and less buggy than anything you could write.

A class is a programming term, analogous to the cook's window in a diner. The waiter/waitress calls in and says, "Two over easy, wheat toast and hash." Their order shows up in the cook's window a few minutes later. The waiter doesn't need to know how the dish was created. For their job it's enough to know how to ask for it. Calling into a class -- Apple or otherwise -- is exactly the same thing.

odd analogy, but you're correct. a waiter does not need to know how to cook, and a cook does not need to know how to wait tables, serve food and interact with people. each one does their own job.

It's important to understand that eggs over easy are not somehow magically generated just because you don't see the cook or the stove. And that a more modern diner doesn't somehow create eggs differently. ...Or rather, it's important to understand if you want to use the "diner class" successfully. You have to know what to ask for. If you call in "3 martinis, extra dry" you'll just get an error message back.

did you have a point? what do you expect to happen if you ask for a martini in an actual diner?

It's also important to understand if you want to go beyond the class. For instance, say Adobe comes out with a new, superior interpolation method for enlarging. If they make their code public (highly unlikely, I realize) then you can use it, but only if you know how to apply the formula to a bitmap. If Apple doesn't add that method to their class then Apple developers won't be able to feature Adobe Super-Duper Interpolation in Apple software any more than you can get a dry martini at a diner.

wrong. nothing stops someone from writing their own super-duper algorithm, whether it's from adobe (who does have an open source code library, btw), from another developer, or something someone dreamed up on their own. they can write it as a core image filter so that other apps can benefit, or they can write it specific to their own app.

most of the time, direct pixel access is not needed, but it's there in the event that it is. that lets the developer work on more important things.
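to make "write your own filter" concrete, here's a rough sketch of a hand-rolled sharpen: illustrative c++ over a plain 8-bit grayscale buffer, *not* a core image filter, and the function name and kernel choice are made up for the example. notice everything it ignores -- color formats, gpu offload, anything smarter than a copied border at the edges. that's the effort that's usually better spent elsewhere.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hand-rolled sharpen: a 3x3 convolution over an 8-bit grayscale buffer.
// Border pixels are left as-is (dst starts as a copy of src).
std::vector<uint8_t> Sharpen(const std::vector<uint8_t>& src,
                             int width, int height) {
    // Classic sharpen kernel: weight the center up, pull neighbors down.
    const int kernel[3][3] = {{ 0, -1,  0},
                              {-1,  5, -1},
                              { 0, -1,  0}};
    std::vector<uint8_t> dst(src);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            int sum = 0;
            for (int ky = -1; ky <= 1; ++ky)
                for (int kx = -1; kx <= 1; ++kx)
                    sum += kernel[ky + 1][kx + 1] *
                           src[(y + ky) * width + (x + kx)];
            dst[y * width + x] =
                static_cast<uint8_t>(std::clamp(sum, 0, 255));
        }
    }
    return dst;
}
```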
#137
In article , Eric Stevens wrote:

In a different newsgroup altogether, I have just read: "In The Republic, Plato worried that democracy meant the rule of the ignorant over the wise." Fortunately, democracy is not the decider of whether or not computing is involved.

I read your quote of a Platonic concern and your computing corollary. Surely you could have pulled a little something that is less than 100 years old, perhaps a little something from Mencken, or this example from de Saint-Exupery: "The machine does not isolate man from the great problems of nature, but plunges him more deeply into them."

I was giving you a reason why you should not rely on 'most people' to decide what is and what is not a computer.

actually you *should* go by what most people think. that's how language works. otherwise, there will be mass confusion.

a device that contains an embedded microprocessor is *not* what people think of when they hear 'computer'. a usb keyboard has a microprocessor and is used *with* a computer. nobody thinks of a usb keyboard by itself as a computer. the same goes for bluetooth headsets, lithium ion batteries, radios & tvs and much more.

All of that is true. Nevertheless, if you look up the definition of computer you will be hard put to find one that requires a computer to have a keyboard, screen, printer, etc. See for example:

http://www.oxforddictionaries.com/de...glish/computer
http://searchwindowsserver.techtarge...ition/computer
http://www.thefreedictionary.com/computer

did you have a point with all this, or are you just arguing again?
#138
On 7/24/2015 11:52 PM, nospam wrote:
snip

actually you *should* go by what most people think. that's how language works. otherwise, there will be mass confusion.

Uniformity, that's the key. Line up, fowaaaard march. Screw innovation, to hell with change. Most people haven't done it.

--
PeterN
#139
On 7/25/2015 12:54 PM, Tony Cooper wrote:
On Sat, 25 Jul 2015 12:40:11 -0400, PeterN wrote:

On 7/24/2015 11:52 PM, nospam wrote:

snip

actually you *should* go by what most people think. that's how language works. otherwise, there will be mass confusion.

Uniformity, that's the key. Line up, fowaaaard march. Screw innovation, to hell with change. Most people haven't done it.

His hypocrisy knows no bounds. Written language works with proper capitalization, and proper capitalization is what most people think is right, but it confuses him that this is expected.

He is entirely consistent. He uses whatever statement he thinks benefits his argument.

It is difficult not to notice that in another thread, about an ideal camera, he made negative comments about the statements of others, but had nothing positive to say about his own concept of an ideal camera. Now we all know that nospam never argues just to argue. So it must mean what?

--
PeterN
#140
| why reinvent the wheel, especially since you won't be able to do as
| good of a job. will your sharpening algorithm offload to gpus, handle
| various pixel types, file formats etc.? not without a *lot* of effort
| that could be better spent elsewhere.

You missed the basic point, twice now. There are no "pixel types" or file formats in a sharpening routine. It's a bitmap.

A related topic actually came up last week in a Windows programming discussion: the fastest and most efficient way to get a thumbnail from a typical JPG of 5-10+ MB. The Windows API is not best. Windows GDI+ (a more recently developed wrapper) was quite good, but not the fastest. It turned out the best combination of quality and speed was achieved by using the OSS library libjpeg-turbo to reduce the image to 1/8 size and then using a straight-code version of bilinear interpolation to take it down the rest of the way. The advantage is that libjpeg-turbo resizes as it decompresses the data and does the job quicker and better than any other option. Straight bilinear interpolation then optimizes the speed by carrying out simple math calculations on an array of pixel values. It doesn't get faster than straight math.

So the best solution was a combination of a 3rd-party wrapper and "hand coded" bilinear interpolation. There's nothing wrong with high-level wrappers, but they're not always the best option. And they're still just wrappers, which is the point I've been trying to clarify. It's become so abstracted that you think you've moved beyond bitmaps, but that's not the case, as I was trying to make clear with the diner analogy.

| most of the time, direct pixel access is not needed, but it's there in
| the event that it is.
| that lets the developer work on more important things.

More important things than getting the best sharpening or resizing routine? You mean like picking the right pastel shade for their jelly buttons? You sound like you might be the sort of person who orders the rye toast at the diner because you're "gluten free". Right application, but wrong concept.
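For the curious, here is roughly what that "straight math" final step looks like. This is an illustrative C++ sketch of plain bilinear interpolation over an 8-bit grayscale array -- not the code from that discussion, and the function name is made up; a real thumbnailer would also handle RGB channels and row stride. The point stands: it is nothing but arithmetic on an array of pixel values.

```cpp
#include <cstdint>
#include <vector>

// Plain bilinear downscale of an 8-bit grayscale buffer. For each output
// pixel, find the four surrounding source pixels and blend them by the
// fractional position -- simple math on an array of pixel values.
std::vector<uint8_t> BilinearResize(const std::vector<uint8_t>& src,
                                    int sw, int sh, int dw, int dh) {
    std::vector<uint8_t> dst(static_cast<size_t>(dw) * dh);
    const double xr = static_cast<double>(sw - 1) / (dw > 1 ? dw - 1 : 1);
    const double yr = static_cast<double>(sh - 1) / (dh > 1 ? dh - 1 : 1);
    for (int y = 0; y < dh; ++y) {
        const double fy = y * yr;          // source row, with fraction
        const int y0 = static_cast<int>(fy);
        const int y1 = (y0 + 1 < sh) ? y0 + 1 : y0;
        const double wy = fy - y0;
        for (int x = 0; x < dw; ++x) {
            const double fx = x * xr;      // source column, with fraction
            const int x0 = static_cast<int>(fx);
            const int x1 = (x0 + 1 < sw) ? x0 + 1 : x0;
            const double wx = fx - x0;
            // Blend horizontally within the two rows, then vertically.
            const double top = src[y0 * sw + x0] * (1 - wx) +
                               src[y0 * sw + x1] * wx;
            const double bot = src[y1 * sw + x0] * (1 - wx) +
                               src[y1 * sw + x1] * wx;
            dst[static_cast<size_t>(y) * dw + x] =
                static_cast<uint8_t>(top * (1 - wy) + bot * wy + 0.5);
        }
    }
    return dst;
}
```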