Where I keep my spare cats.



 
 
#191 - July 16th 17, 09:23 PM, posted to rec.photo.digital - nospam

In article
S3DQmpBqD, Diesel
wrote:

a chromebook would actually be a very good choice for school
papers, particularly when schools want the papers submitted via
google. it's also cheap enough that it doesn't matter a whole lot
if it's damaged, lost or stolen.


Almost anything can make use of google docs. So I see no real
comparison here...


not very many that cost under $200 or as secure.

Granted, I'm no wizard with photoshop, but I know what a
mining rig's typical hardware consists of. I've built several of
them. Some were, in fact, using liquid cooling and I don't mean
recirculating water.


big deal. liquid cooling isn't going to make photos look any
better or make the user more productive.


That depends on how much time the user is forced to wait for photoshop
to perform various functions. If you can reduce the wait time, you get
a more productive user.


nope.

what makes someone productive is the entire workflow, best in class
apps and overall user experience as well as optimizing for the actual
bottlenecks (not what you think they are).

that's why users are *far* more productive using photoshop versus the
gimp on exactly the same hardware, even your liquid cooled monstrosity.
it's also why users are more productive using mac/windows versus linux.

it's the *software* that matters, not the hardware specs.

put simply: someone would be more productive on a 'slower' system (mac
or windows) than your behemoth.


sure, you 'can' run photoshop on a mining system, but it won't
work anywhere near as well as running it on a system designed for
photo editing, and quite possibly with so many compromises that
it's not usable.


Again, you demonstrate gross ignorance on what a mining rig is, what
hardware is present, and how it works. Photoshop is child's play to a
mining rig. A waste of good hardware, in fact.


i know what a mining rig is and what hardware it involves and it's the
*wrong* hardware for photoshop.

you haven't used photoshop much (or at all) nor do you understand
what its bottlenecks are. that much is clear.



the others are open
to debate. Especially the lower cost of ownership and higher
resale value.


none are open to debate.


The context is missing because you removed what I wrote.


nope. i backed up my claims. *you* removed them because they show you
to be wrong.

In 1999, for instance, Gistics released a landmark report
analyzing Macs and PCs in terms of return on investment (ROI).
Gistics' study was limited strictly to the publishing, graphics
and new media fields. Among many other findings, the authors
concluded that Mac creative professionals were producing $26,000
more each in annual revenues for their employers than their
Windows counterparts.


1999 was a long time ago.


it doesn't matter when and nothing has changed since then anyway. only
*one* instance is all that's needed to prove you wrong and there are
many.

the ibm link, from just nine months ago, is another one that proves you
wrong, which is why you snipped it.

here it is again:
https://www.jamf.com/blog/debate-ove...acs-are-535-less-expensive-than-pcs/
But isn't it expensive, and doesn't it overload IT? No. IBM found
that not only do PCs drive twice the amount of support calls, they're
also three times more expensive. That's right, depending on the
model, IBM is saving anywhere from $273 - $543 per Mac compared to a
PC, over a four-year lifespan. "And this reflects the best pricing
we've ever gotten from Microsoft," Previn said. Multiply that number
by the 100,000+ Macs IBM expects to have deployed by the end of the
year, and we're talking some serious savings.



one of the major attractions of a mac is that it can run mainstream
apps that don't exist on unix (and never will) *and* has unix
under the hood for those who want to tinker.


You're contradicting yourself and demonstrating (again) that you really
don't know how the machine in front of you actually works 'under the
hood'. You seem to think a varient of UNIX (which is what mac runs) is
completely isolated from the cute GUI mac has. That's simply not the
case.


it's not a 'varient' of unix nor is there any contradiction.

mac os is certified unix and it's *much* more than a 'cute gui' on top
of unix.

unlike you, i know *very* well what goes on under the hood of a mac.

The 'mainstream apps' designed to run on a mac are, in fact,
running on that uber 'cute' varient of unix your mac has.


you continue to demonstrate that you know *nothing* about what goes on
under the hood on a mac.

mac apps are not unix apps. they don't use unix apis.

the mere fact you even think that shows just how little you know.

A closed source, proprietary varient, I might add.


it's less closed and less proprietary than windows.

*only* the stuff apple added *on* *top* *of* unix is what's closed. the
unix part is open source.

not that it matters, since an open source os doesn't improve user
productivity. users want to get work done and want the best apps to do
it, not modify the os.

a generic unix box is stuck with ****ty unix apps, and a vm
doesn't count. no graphic artist would ever run photoshop or linux
in a vm on top of unix.


Those lines are nothing more than your own, tainted, personal opinion.


it's verifiable fact.

photoshop is mac/windows only, not unix.

running photoshop in a vm or using wine are not viable options.

that means unix users are stuck with crap like the gimp, which is so
far behind photoshop it's actually sad how bad it is.

not only is the gimp *much* slower on the same hardware, but it *still*
lacks many key features photoshop has had for 20+ years, and based on
the gimp's roadmap, it won't ever get them.



macs have thunderbolt which must be included in any comparison.

you don't get to ignore specs that a mac has that other systems do
not.


I didn't ignore it, I stated that it's just not as popular as you seem
to think on the PC platform. Which is why Intel changed their policy
concerning royalties. They'd like to make it more common by having more
manufacturers of PC components adopt it. It seems to be going very
slowly, considering how long it's been available. Intel obviously
shares the same opinion; why else would they forgo royalties in an
effort to increase its adoption?


popularity does not matter.

the fact is that thunderbolt exists, it's widely used in the industry
(not just apple), macs have had it standard for years and it *cannot*
be ignored when making a comparison.





windows is closed source, while much of macos is open source.

Er, no, not much of Macos is open source. Some apps created by
Apple are open source, but, MacOS itself most certainly is not
open source. Neither is the hardware Apple creates to run it.


far more of apple's software is open source than microsoft, some
of which is used by apple's own competitors, including android.


You stated that much of MacOS was open source, and, that's not the
case, it's never been the case. The only way to have a good look around
is to break copyright/patent laws as you do so. I wasn't comparing
Apple to microsoft, I was correcting your erroneous statement
concerning what is/what isn't 'open source'


as usual, absolutely wrong.

https://opensource.apple.com


Windows is one operating system an individual can choose to run
on his/her PC. PC gives people options, including the OS you want
to run on it. This is because the PC hardware architecture is
open source. Not closed, not proprietary like Apple. Apple has
always liked doing their own thing in their own way. And charging
insane (imo) amounts for the shiny case.


nonsense.


The last two lines are my own personal opinion, otherwise, the rest is
factual and not simply my own opinion.


all of it is nonsense. every single word.

apple is more standards compliant than microsoft and macs can run a
number of operating systems (not that anyone cares, it's either mac os
or windows).


prices are competitive and macs are the *only* platform that can
run mac, windows *and* unix.


http://emulators.com/


emulation means the host system *can't* run it, it has to emulate it.
you just proved my point.




if it wasn't for mac os being free, there never would have been
a free upgrade to windows 10 for the first year.

MacOS isn't completely free.


yes it is. mac os is completely free.


ROFL, only if you meet the requirements. Hence, conditions. Otherwise,
it's NOT free.


it's free. period.

the only requirement is having a mac that can run it, the very same
requirement for *any* software, something which should be obvious, but
apparently not.




You really think Microsoft pushed Win**** 10 on people for 'free'
because Apple released a 'free' upgrade with limitations? Not
hardly.


yes hardly.


ROFL. Umm, you're wrong. MS wants everybody using the SAME version of
windows. It's *easier* for them from a support pov if that happened.
One code base, instead of several others with differences. Much less
headache for them.


every company wants their user base to be using the same version of
their products, not just microsoft.

You clearly aren't quite the coder you claimed to
be if you don't understand that.


insults means you have nothing.

I think if we took a head
count of all the PC engineers, it would grossly exceed the
amount Apple has.

you're confusing quantity with quality.

That's nothing more than a personal opinion. One of which I don't
share or have any real interest in debating with you.


it's not an opinion.


You really should consult with a dictionary.


says the person who writes 'varient' and 'paultry'.

apple's chip design team is one of the best in the business.


Another opinion.


nope. it's fact. what apple has done, particularly in the past 5 years
or so, is nothing short of impressive. everyone else is trying to
catch-up.

in less than a decade, apple's own processors are matching intel
in benchmarks, and in some cases, exceeding it.


Some specific processors intended for very specific roles. We're not
talking about desktop cpus here, though. Different design purpose n
all.


nope. what's in ios devices are desktop class processors, which match
or exceed intel's cpus.

from *three* *years* *ago*, about the several generation old a7 chip
(a11 is about to be released):
https://www.extremetech.com/computin...clone-cpu-detailed-a-desktop-class-chip-that-has-more-in-common-with-haswell-than-krait
Some six months after Apple shocked the world with its 64-bit A7 SoC,
which appeared in the iPhone 5S and then the iPad Air, we finally
have some hard details on the Cyclone CPU's architecture. It seems
almost every tech writer was wrong about the A7: The CPU is not just
a gradual evolution of its Swift predecessor -- it's an entirely
different beast that's actually more akin to a "big core" Intel or
AMD CPU than a conventional "small core" CPU.

PC technologies are so good, Apple is going with Intel
processors, in lieu of their own.

wrong on that too.

Nope. You admitted it yourself, they're using Intel's iX series
CPUs, instead of their own.


for now they do, just like other pc makers, but that is going to
change real soon now, and across the industry too.


Apple doesn't make PCs. They make Apple products. Although the term
actually stands for Personal Computer, when an individual hears the
word PC, they aren't thinking about Apple. According to you, they're
thinking about Microsoft. Technically, the coco series, the commodores,
the amigas, original Apples, etc, are all 'PC's, but, nobody thinks of
them that way these days. Micro computers really, but, why split hairs
at this point...


steve ballmer considers an *ipad* and other tablets a pc, and that was
when he was still running microsoft, before he got fired for doing a
****ty job.

http://kensegall.com/2011/01/apples-final-humiliation-of-microsoft/

Mossberg: ...this is semantics maybe, but, you're using the term PC -- I
thought I just heard you use the term PC -- to kind of envelop the
things that I think a lot of average people don't think of as PCs,
like the iPad, or other tablets that might be coming. Is that kind of
thing a PC?

Ballmer: Sure, of course it is.

Mossberg: It is.

Ballmer: Of course it is. It's a different form factor of PC.

An ios device is comparable to a PC now?


absolutely, especially since it can do things a pc cannot.


Likewise, a real desktop/tower can do things the iOS devices aren't
able to do...What's your specific point here?


pick the best tool for the job.

do try to keep up.

Or, is this another weak
attempt by you to move the goalposts? If Apples processors are so
much better than Intels, why are they using Intels?


i'm not moving anything.

there will be macs with apple-designed processors in the not so
distant future.


That doesn't answer my question...


yes it does. you obviously don't understand the issues.

there will also be windows systems with arm cpus, which have
already been demoed and expected by year's end.


Will be? Try, already exists and have for several years now. MS Surface
RT is a fine example of that, actually. But, it's not the only one...


windows rt is dead and completely different than what's coming, which is
something which has not been done before.

you haven't any clue what's going on in the industry.


apple's processors are already matching intel in benchmarks and
in some cases, exceeding.

Same question as above. Are we still discussing what's found in
Desktop/Tower PCs and Apple All in Ones, or the ARM processors
found in mobile devices? You keep trying to move the goalposts,
it's hard to tell.


i'm not moving a thing.


Yes, you are. We've gone from x86/amd64 to Arm cpus in this discussion.
What else would you call it? You seem to think ARM chips are 'new' as
well. They are RISC processors, which isn't 'new'...Unless you think
the 1980s is just around the corner.


intel chips aren't 'new' either. they're actually *older* than arm.

and if you think arm of the 1980s is the same as apple's current 64 bit
arm chips, you're even more ignorant than i thought.

you're also unaware that arm was originally co-founded by apple:
https://en.wikipedia.org/wiki/ARM_Holdings#Founding
The company was founded in November 1990 as Advanced RISC Machines
Ltd and structured as a joint venture between Acorn Computers, Apple
Computer (now Apple Inc.) and VLSI Technology. The new company
intended to further the development of the Acorn RISC Machine
processor, which was originally used in the Acorn Archimedes and had
been selected by Apple for their Newton project

but why let facts get in the way of your babble.

For end users and app developers, there's effectively no
difference between an Intel-based machine and one with a
Snapdragon processor under the hood. As PC Magazine notes, the
ARM build of Windows 10 works its magic using a built-in
emulator that translates instructions in realtime.


Did you skim the article? The ARM processor is using emulation. It's
*not* a native instruction set to that CPU. And contrary to claims,
emulation does slow down the process. Additional steps are required to
do it.


no additional steps for the user and the speed impact won't be noticed
in most cases. for common tasks such as email, the computer is waiting
on the user.

apple has changed cpus twice before without issue. the technology to do
it is well understood. except by you.
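
To make "translates instructions in realtime" a little more concrete, here is a deliberately toy sketch of dynamic binary translation in Python. The two-opcode "guest" ISA is invented for illustration; it is not real x86 or ARM and nothing here comes from the article, but it shows the basic idea: decode a guest instruction once, map it to a host operation, and cache the translation so repeated code only pays the decode cost the first time.

# Toy dynamic binary translator: a made-up guest ISA, not real x86/ARM.
GUEST_PROGRAM = [
    ("LOAD", "r0", 5),    # r0 = 5
    ("ADD",  "r0", 3),    # r0 += 3
    ("ADD",  "r0", 10),   # r0 += 10
]

def translate(instr):
    """Turn one guest instruction into a host-native callable (done once)."""
    op, reg, value = instr
    if op == "LOAD":
        return lambda regs: regs.__setitem__(reg, value)
    if op == "ADD":
        return lambda regs: regs.__setitem__(reg, regs[reg] + value)
    raise ValueError(f"unknown guest opcode: {op}")

def run(program):
    regs = {"r0": 0}
    cache = {}  # translation cache: decode each instruction only once
    for index, instr in enumerate(program):
        if index not in cache:
            cache[index] = translate(instr)
        cache[index](regs)
    return regs

print(run(GUEST_PROGRAM))   # {'r0': 18}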

I'm not
talking about mobile devices, either. I'm still talking about PCs


there is no longer a distinction (and never really was either).


Actually, there is. For end users, evidently like yourself, it doesn't
matter. But, we're not all end users.


actually, there isn't a distinction, for anyone.

tablets and phones are just another type of computer, optimized for
different purposes than a desktop/laptop.

pick the best tool for the task.

for many people, a mobile device is their only computer, and they
do more with it than you do on your liquid cooled system.


LOL. I doubt that.


doubt it all you want, but it's true.

they're not tiny nor are they a gadget, the battery life is
comparable to most laptops (if not better) and their lifespan is
no different than any other computer.


They are disposable devices.. for a reason.


nonsense. they're no more disposable than any other tech device.

nothing lasts forever.

many people are doing real work on mobile devices, some of which
is not possible on a desktop/laptop.


You aren't running a full blown copy of Autocad 2018 on a 'mobile'
device. I'd say that is one example of 'real work'


autocad is not the only example of real work.


what article about an iphone? i listed things a mac could do that
a pc can't. there's a ****load more that i didn't list.


You listed Apple specific applications.


nope. i listed numerous features a mac has that a pc does not.

none of them are apple specific applications.

you really have *no* clue whatsoever.

since you snipped the list, here it is again:
easy migration, target disk mode, target display mode on
select models, handoff & continuity, airdrop, quicklook, universal
clipboard, touchid, applepay, touchbar, secure element, unix under
the hood, cocoa, metal, multitouch gestures, forcetouch trackpad,
wide gamut display, messages/calls with any device, versioning,
local facial & scene recognition, differential privacy, machine
learning, time machine, snapshots, higher user productivity, lower
cost of ownership and higher resale value.


those are *features*, not applications.

And with emulation, a PC can
run many of them.. so...


given that none were apps, no it can't, and many of those features
require hardware not available on a pc *and* software support.



sometimes it's a mac, sometimes it's a pc, sometimes it's a
smartphone and sometimes it's a tablet. sometimes it's a
combination. sometimes it's none of those.


Again, I don't disagree with that.


apparently you do, because you continue to make up ****.


So you don't have any way for anyone else to verify your claim
then?


if you actually worked in the industry you'd learn what *really*
goes on, not what you read about in a google search.


So you don't have any way for anyone else to verify your claim then?


work in the industry and you'll get all the verification you want and a
whole lot more.



so far, everything you've said about apple has been *completely*
wrong and you keep arguing when shown to be wrong.


Nope.


deny it all you want, but you *still* get stuff wrong, and you continue
to argue when the facts are pointed out to you.

you don't want to learn anything. you're ignorant as can be.



although anecdotal, i've had far more problems with enterprise
class routers/switches than i have with consumer grade stuff.


I find myself wondering how much of that might be attributed to your
own mistakes/lack of understanding of the product vs an actual
failure/design flaw with the product. Granted, I wasn't there when the
issues took place and don't know what you did to try to resolve them,
but going by your posts, and only your posts in the limited time I've
known of your existence, it does make me wonder if the failure was with
you, not the product.


insults means you have nothing.

the problems were *hardware* failures (more than one). my consumer
stuff is rock solid.

once the faulty hardware was replaced, everything works great (and it's
far from a simple network).


you can't get more out of it when its specs are worse and there's
no reason why it would fail. displays are very reliable.


I don't know of any commercial grade products whose specs are actually
worse than those of the consumer grade by the same manufacturer. That
doesn't even make sense.


moving the goalposts again. now you're trying to claim that commercial
grade is specific to a manufacturer??

the hp display *you* mentioned, which *you* claim is 'commercial
grade', has *worse* specs than the display in the retina 5k imac, which
*you* claim is not commercial grade, but by your own definition, it is.


that means the hp can't do what the imac can do.

simple as that.

I last looked at Apple.com in May, when I wrote the post.
Whatever new products they've added since then wouldn't
obviously, be included in my comparison.


apple didn't add any new products. all they did was bump up the
specs of the existing products.


Okay, so, as I said, I visited the page in May, whatever they've done
since then I wouldn't have first hand knowledge of. Duh. So, you can
stop accusing me of lying anytime, then. As, I wasn't.


what other conclusion is there? the specs are clear as can be.

either you lied about what you claim to have seen or you're just
incredibly stupid, and since you insist you're not lying, there's only
one other conclusion.

that's false and always has been false, as i explained already.


You're contradicting your previous statement. How would I know what
they've done since May as far as bumping specs up? At the time I
visited, it was i5. Not i7.


i'm not contradicting anything.

you were wrong then, you're wrong now and you refuse to admit being
wrong despite it being explained to you numerous times.

once again:
imacs have had an i7 configuration since 2009, shortly after the i7
chip came out.

the retina 5k imac, released in 2014, has *always* had an i7
configuration, from nearly *three* *years* before you claimed to have
looked at apple.com.

there was an i7 configuration in may. just admit you're wrong.

from october, 2014:
http://www.telegraph.co.uk/technolog...-with-5K-Retina-display-specifications-and-features.html
Apple's new Mac desktop computer will have 5K display to make for one
of the highest definition screens available built into a desktop
computer.
....
Processor: up to 4Ghz i7 Quad-Core Processor

you are full of ****.



an i7 retina imac (which does exist, despite your claims otherwise),
is an additional $300, bringing the price to $2099.


I didn't see any i7 imac in May on Apple.com.


then you didn't look very hard. or at all.

as i said, the first i7 imac shipped in 2009. that's eight years ago.

the first i7 retina 5k imac shipped in 2014. that's three years ago.

there was an i7 retina 5k last may.

either you're lying or you're stupid, although both cannot be ruled out.

And by your own words in
this very post, Apple 'bumped' up the specs, AFTER I'd already written
the post. Which would explain why it took you until July to respond to
my post.


bump the specs does *not* mean add a new processor configuration.

it means bump up the clock speed, increase the capacity/speed of the
ssd, etc. of *existing* configurations.

once again, there has been an i7 configuration for the retina imac
since 2014 and the standard imac since 2009.

you're full of ****.


in other words, the mac is very competitive with other offerings.


The mac isn't easily upgraded. You most likely can bump up the ram and
single internal hd, but, otherwise, you won't be adding additional
cards to its mainboard for more features. The PC, oth, will happily
accept new cards, more ram, bigger HD (multiple HDs internally, in fact),
etc.


few people care.

laptops greatly outsell desktops and have for years, so clearly
upgrading is not a concern.

the microsoft windows laptop can't even be opened without destroying
it. everything is soldered, glued or otherwise locked down.

and as for upgrading, try adding a 10 gig-e nic to a windows laptop.

for a mac (laptop or desktop, doesn't matter), connect it with a cable.
no need to even open the computer.
#192 - July 16th 17, 11:52 PM, posted to rec.photo.digital - nospam

In article , Tony Cooper
wrote:

it's the *software* that matters, not the hardware specs.


In which case, users would be better off using Photoshop on a basic
Windows machine instead of an overpriced Mac?


macs aren't overpriced and your attempt at trolling has failed.
#193 - July 17th 17, 01:18 AM, posted to rec.photo.digital - nospam

In article , Tony Cooper
wrote:

it's the *software* that matters, not the hardware specs.

In which case, users would be better off using Photoshop on a basic
Windows machine instead of an overpriced Mac?


macs aren't overpriced and your attempt at trolling has failed.


What was that? It's hard to understand you with that hook in your
mouth.


troll. and idiot.
#194 - July 18th 17, 01:07 AM, posted to rec.photo.digital - Eric Stevens

On Sun, 16 Jul 2017 18:52:48 -0400, nospam
wrote:

In article , Tony Cooper
wrote:

it's the *software* that matters, not the hardware specs.


In which case, users would be better off using Photoshop on a basic
Windows machine instead of an overpriced Mac?


macs aren't overpriced and your attempt at trolling has failed.


You mean that you are ignoring the obvious deduction from your
statement "it's the *software* that matters, not the hardware specs"
that you would get no advantage from higher spec'd and presumably more
expensive hardware.
--

Regards,

Eric Stevens
#195 - July 20th 17, 05:43 AM, posted to rec.photo.digital - Diesel

nospam wrote on Sun, 16 Jul 2017 20:23:49 GMT in rec.photo.digital:

In article
S3DQmpBqD,
Diesel wrote:

for macs and all things apple, you know basically nothing.

HAHAHAHA. Again, not true.

don't laugh too hard because everything you've said about apple
has been completely wrong.


Nope. I shared urls yesterday concerning Apple's DRM and its
lockdown limitations. Although they don't do that silly ****
anymore with their compressed music tracks, they used to.


nope. apple never did what you claimed they did.


Apple used DRM and was restrictive in doing so. A claim that's
supported by facts.


You couldn't just copy drm tracks to all the devices you
wanted at one time.


you could copy to unlimited ipods and burn unlimited audio cds, so
yes.


No decryption key, no playback. itunes wouldn't give the key to an
unlimited number of ipods. Would have defeated drm had it done so.
Or, not bothered using public/private crypto in the first place.

that's why it was the least restrictive of any drm at the time.


No, it wasn't. public/private crypto used to enforce drm vs setting a
specific byte as a toggle was hardly least restrictive. Forcing you
to authorize/deauthorize devices, hardly least restrictive. A
deauthorized device no longer had the ability to decrypt tracks
stored on it. It couldn't play them, and it couldn't copy them to
something else and have it play them, without the required, MATED
decryption key.
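
For readers who want the mechanics rather than the argument, here is a toy model of that kind of key-gated playback, invented purely for illustration (standard library only, not Apple's FairPlay or anyone's real DRM): the track is stored encrypted, each authorized device holds the content key, and deauthorizing a device withdraws the key so the bytes it still has become unplayable.

import hashlib
import os

def keystream(key, length):
    """Derive a repeatable keystream from a key (toy construction, not real crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(data, key):
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR with the same keystream reverses it

class ToyStore:
    def __init__(self):
        self.content_key = os.urandom(32)
        self.authorized = {}              # device_id -> content key handed out

    def buy_track(self, plaintext_audio):
        return encrypt(plaintext_audio, self.content_key)  # what lands on disk

    def authorize(self, device_id):
        self.authorized[device_id] = self.content_key

    def deauthorize(self, device_id):
        self.authorized.pop(device_id, None)

    def play(self, device_id, protected_track):
        key = self.authorized.get(device_id)
        if key is None:
            return None                   # no key, no playback
        return decrypt(protected_track, key)

store = ToyStore()
track = store.buy_track(b"some audio bytes")
store.authorize("ipod-1")
print(store.play("ipod-1", track))        # b'some audio bytes'
store.deauthorize("ipod-1")
print(store.play("ipod-1", track))        # None: the local copy is now unplayable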

So your attempted redefinition of what itw malware is is a moot
point then. Fact is, Malware for mac does exist, ITW. End of
story as far as that's concerned.


i never said it didn't exist. i said that unlike windows, it
requires user participation and isn't a significant risk unless
the user ****s up, which is a user exploit, not a mac exploit.


You don't seem to have a firm understanding of how malware works with
windows either. Windows itself is an OS. You can't rightfully (well,
I can see how you would) blame Windows itself for something a 3rd
party app lets happen when it shouldn't.

it's so well established that google prohibits using windows
internally unless it's absolutely required.


I could care less what google's personal business operation rules are
in their own IT departments. It's a moot point, anyway.

Google staff will instead be asked to use Apple's OS X operating
system, or an open-source Linux platform, as the search giant
tries to close the security loopholes that made it possible for
Chinese hackers to gain access to email accounts. Security
experts believe the hackers exploited a loophole in Microsoft's
Internet Explorer browser to hack in to the Gmail accounts of
human rights activists and Chinese dissidents.


Internet Exploder is NOT Windows OS. It's a web browser, a vulnerable
web browser. Many of its issues can be mitigated by not surfing the
web from an admin-level account and/or granting IE admin-level run
rights. Google should have properly trained its employees, but they
obviously did not do so. They might also want to review their own IT
dept network configuration protocols if compartmentalized damage
control was not in play and a workstation allowed a network wide
breach of any kind. It doesn't bode well for their competence in that
regard.
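
The underlying mitigation is plain least privilege. As a small illustrative sketch only (not anything the thread prescribes), a script can refuse to do risky, web-facing work when it notices it is running elevated:

import ctypes
import os
import sys

def running_elevated():
    """Return True if the current process has admin/root rights."""
    if os.name == "nt":
        # Real Windows shell32 call; nonzero means the process is elevated.
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    return os.geteuid() == 0

if running_elevated():
    sys.exit("Refusing to run a web-facing tool with admin rights; "
             "use a normal user account instead.")
print("OK: running unprivileged.")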

in addition, *millions* of windows systems have been pwned by
wannacry and petya in recent months, shutting down entire
companies.


Again, that's due to the fault of incompetent users and IT staff for
having an improperly set up LAN that allows a single workstation or
multiple workstations the ability to harm the network as a whole
and/or cause damage to a file server beyond that of the logged-in
user's personal files/shares.


Again, you haven't got the foggiest idea how this works from a
low level aspect. You've likely never written low level code
yourself on any modern machine (mac, or pc) and certainly nothing
intended to be propagating, either via user interaction and/or on
its own.


nonsense and it's *you* who hasn't the slightest idea how macs
work.


What specifically that I wrote is nonsense? Your reply (which seems
to be quite typical for you) is ambiguous, at best.

you know *nothing* about macs, other than they're made by apple.


Not true.

everything you've said about macs is anywhere from wrong to flat
out absurd.


Again, not true.

That's only part of the reason, with exceptions. It's not about
ease per se, it's about value. Macs have no value for the
intended purpose. If one day, the user base grows to where a mac
is in charge of something worth taking datawise and/or control
over, things will change. Until then, malware authors (myself
included at one point) go for the big fish. And, that's not mac.
As a hobby though, some malware authors do like to **** around
with mac users, just to remind them that they aren't as immune as
they'd like to think.


more accurately, they're too incompetent to know how to do an
effective job writing mac malware (or anything else for that
matter), so they go for the *easy* fish and the low hanging fruit,
which is windows and android.


You clearly have no idea how malware works or the difficulty/ease
(depending on what you're writing) that goes into it.
Executable-based infectors aren't the easiest things in the world to
write. Especially when they must be able to infect files that you've
never seen, and most likely never will see or get a sample of for
analysis, to ensure compatibility issues don't crop up. You are making
modifications to already existing code, after all. Code that is still
expected to function normally, without alerting the user that
something has been added; and that includes evading so-called
sanity/self checks that may also be present in the original code
where you now call home.

Do continue writing from your arsehole though, it does amuse me.

they might experiment with macs, but they don't get very far.


Not true.

Again though, it's more to do with the userbase and value of
target than it is anything else. You're in the minority. For the
time being. Make a greater effort to plant macs in more important
roles, and you'll become a valuable target and you'll learn the
hard way you were never safe from malware in the first place.


it has more to do macs being more secure from the start.


No, it doesn't. It has everything to do with the value of the target.
Your mac is of no value in the real world. Your mac isn't running
electrical power grids, has no control over plc based switch gear,
etc. You won't be rerouting power to the wrong transformer or
substation by ****ing with a mac. So, you won't get the satisfaction
of doing serious damage to a substation/nearby power station by
****ing with the mac user. OTH, if you're serious about the harm you
want to cause, you target machines known to be in control of such
things. And, those machines aren't doing Apple.


malware authors target windows and now android because it's
easy and the return on investment is huge.


Please don't pretend to tell me what malware authors do. You've
never been one.


i'm not pretending. that's exactly what they do.


You're writing from your arsehole on a subject you have no personal,
first-hand knowledge of. That's called 'pretending.' You're attempting
to claim knowledge that you do not have. And attempting to bull****
others into thinking you're some kind of authority on the subject,
with not a single line of malicious code to your name. Quite a feat
you've set out for yourself. You have a better chance of winning a
mega lottery, each time you play, ten times in a row.

writing mac malware is not worth the greater effort, while writing
windows malware is easy and profitable. simple as that.


As I said.. You know nothing about it.

Obviously you do if you think mac is immune. And if you actually
knew wtf you were writing about, you wouldn't have made the
statement concerning what a malware author would love/not love to
have, either. AV has never really gotten in the way of a serious
author. It was always retroactive with my stuff and that of my
peers.


i never said macs were immune. stop lying. nothing is immune, it's
that the risk is so low that it isn't an issue.


I'm not lying, and, do try to remain on topic. I don't think it's too
much to ask. You made statements that are blatantly NOT true
concerning AV and the malware author's mindset. You've *never* walked
in those shoes, you have NO KNOWLEDGE of the subject. None, nothing,
zip, zilch, nada.

the *user* is the weak point, not the mac.


The USER is the weak link in the majority of security breaches. One
article YOU cited even called it out: Internet Explorer (which is NOT
Windows) was found guilty of exploitable vulnerabilities.

and if something does go wrong, regardless of reason, simply
restore from backup. a savvy user can be up and running in as
little as 10-15 seconds. no big deal at all.


Again, you're writing from your arsehole. There is this nifty malware
(bad guy) technique called slow data diddling. It's intended to nail
the people who think backups alone are king, and system images are
the cure-all. You *slowly* swap sectors around. Sometimes, you could
do this for months and months before the user thinks anything might
not be right. During those months of your fun time, the user has been
imaging/backing up your wanted (but, probably not what they want)
changes. Depending on how far back their backups/images go, they may
not be able to reverse your data diddles fully, or at all. And, the
best part, they have no way to confirm whether or not they were
successful. After all, the typical user does well to remember to
back up the machine once the first time, let alone stick with a
reliable schedule. And, well-written, uber-malicious malware isn't
going to kill/upset the host right away. That's counterproductive.
Early discovery by the user spells early doom for your creation.
Your creation wants to exist for as long as possible, and, hopefully,
be included on those cure-all backups/images you think are the quick
and easy fix.
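
The defensive counterpart implied here, sketched loosely and with placeholder paths, is to keep an independent hash manifest of your files, so silent byte-level changes show up at the next check instead of riding quietly into every backup:

import hashlib
import json
import os

def hash_file(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root, manifest_path):
    """Record a digest for every file under root."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            manifest[full] = hash_file(full)
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

def verify_manifest(manifest_path):
    """Report files whose current digest no longer matches the manifest."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    for path, expected in manifest.items():
        if not os.path.exists(path):
            print(f"MISSING  {path}")
        elif hash_file(path) != expected:
            print(f"CHANGED  {path}")

# Example (paths are placeholders):
# build_manifest("/home/user/photos", "photos.manifest.json")
# ... later, before trusting a backup run ...
# verify_manifest("photos.manifest.json")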

anti-malware utilities are utter **** and cause far more
problems than they attempt to solve.


Personal opinion which isn't really backed up by much evidence.
What evidence that does exist is mostly speculation and can be
attributed to user error along the way.


absolutely wrong. there's *extensive* evidence.


Actually, there isn't.

the only 'user error' is that of the inept 'coders' who wrote the
****ty anti-malware apps and the lack of testing it.


You obviously know as much about whats involved in antimalware
development as you do the creation of Malware. Which is to say,
nothing.

my personal favourite is when a mac anti-malware utility
quarantined the virtual memory swap files. needless to say, that
didn't end well. the level of stupidity for that to even happen,
nevermind get past testing, is mind boggling.


I can see how that might have happened, but, that's because I
understand what's involved in the creation and detection of malware,
from first hand experience...

worse, some anti-malware companies have actually written their
own malware and released it, then bragged that they were first
to 'detect' it.


Heh, that's actually a common myth. I'm surprised someone of your
supposed stature actually bought it. even for a second. Well, not
really, but...


actually, it's a fact, not a myth.


It's a myth. The fact you actually believe it, does speak volumes.

i have direct first-hand personal knowledge of one company that
did exactly that, creating malware, releasing it and then bragging
that they were first to detect it. i know several people involved
and what transpired and that's all i'm going to say about it.


Sure you do. That's like the story that cousin bob knows someone
named jimmy who knows someone named john that swears the 2.5 litre
chevy engine is the best thing since sliced bread, because some
person named jennifer who knows the person jimmy knows designed the
intake.

*yawn*

You're full of ****, sorry, but you are. You probably don't know
this, so I'm going to enjoy telling you. Vxers/malware authors of any
value were on first name basis with MANY reputable, established
antivirus companies, right down to their coders. We'd give them 0day
samples of our work in exchange for the commercial versions of their
software. I had a copy of TBAV registered to my real name, long
before ANYBODY outside a very small circle even knew what my real
name was. It cost me a single 0day binary sample of a new virus
(which was to become a family) for it.

The reason such and such company could brag about being the first to
detect such and such work was because we had a deal with them and we
sent them the work, DIRECTLY. They didn't author it, we did, we
traded our stuff for theirs. If we hated mcafee/symantec, we wouldn't
do a trade with them. They'd have to WAIT until a participating AV
member forked them a sample of our newest creation; and, that wasn't
instant. It gave the ones we liked/respected a small competitive edge
window.

Antivirus companies did not release malware into the wild to claim
credit for being the first to detect it in an effort to boost their
own sales. WE (the virus author/ the malware author) had trade deals
worked out with some of the antivirus companies and we would give
them brand new samples, before we released them into the world, in
many cases. That's how they got to be the first to detect such and
such bug. Not because they wrote it.

Your claim is NOTHING MORE than a common myth, spread ad nauseam by
non-malware authors who didn't have the first clue how such and such
company knew about a new bug before anyone else did.

Now, you know. Which amuses me that much more. As, it's COMMON
knowledge to any malware author who was worth something. And by
worth, I mean an antivirus company representative would reach out and
try to establish a trade deal with you; you didn't have to beg them
for attention. They wanted what you had. They wanted first dibs. And,
if the trade was a good one, they got what they wanted. And so did
we.

Regged copies of their finest ****, bragging rights to have av such
and such in your real name, a spiffy writeup of your work on their
website, discussing its technical abilities and low-level system
access; bragging rights for you. You could show off your work and
others could read about it.

And them keeping their mouths shut, knowing who you were irl; knowing
what you were working on, and, not snitching you out. They were happy
enough to be able to offer detection for your new bug, the moment it
got 'out there'; because they were already provided a copy,
sometimes, days/weeks ahead of release schedule and their competitors
didn't get any advance copies and had to work for it, or, wait on the
internationally recognized trade table to get it to them, instead.

You could even get specific in the terms of the trade. When you find
my bug going itw, you give me 48 hours BEFORE you add detection
signatures, or I won't give you any more hot-off-the-assembler/compiler
samples. And, the majority of the time, they
honored ALL trade terms. What choice did they really have, anyway?
Established malware authors do have some pull, my friend.

If your product nails my bug thats not even out yet on the next
definitions update, I'll make a couple of quick changes so that the
sample I sent you is null and void and you don't detect me anymore.
So that ****s you from being the first. And, I'll let my other
malware authoring friends know you ****ed me over, they won't trade
with you anymore either and they'll make the necessary mods to null
and void the samples you already got ahead of release schedule.

So you and the customers you protect ALL lose. That's bad for
business, as at that point, you can't offer them any better
protection than your competition can.

Av/am companies trade samples securely. They don't pass them around
to every tom, dick, and harry who isn't responsible. You have to know
WTF you're doing to get a sample, and you have to have demonstrated to
someone inside that closed circle that you can be trusted with the
sample without posing a danger to yourself or others when you examine
the sample for whichever company you work for.

I've been retired, officially, for seventeen years myself, and still
get emails from av/am companies asking if I'd be willing to trade
malware samples with them for copies of their goods. Because they
know who I am, they know what connections I still have, and
they know I have massive zoos of malware archived. And, they want
access.

there's another company that offered a free 'clean up' utility
that supposedly deleted files that weren't needed to 'speed up'
the computer (itself a bogus claim), but it actually installed
obnoxious adware to harass the user into buying their paid apps.


Bull****.

there's also this:
https://thenextweb.com/insider/2015/...antivirus-accused-of-creating-malware-for-over-10-years/
Here's a crazy report: Kaspersky Lab, makers of a popular
antivirus service, might have created fake malware for over ten
years to harm its competitors. The software was benign, but
Kaspersky fooled other antivirus software into marking it as
infected.


Actually, the samples kaspersky released, as the article clearly
stated, were harmless, not actually malware. They were intended to
expose competitors who were stealing definition data. Malwarebytes did
the same thing by releasing known bogus definitions, and iobit happily
stole them without checking the results. It made exposing iobit's
intellectual property theft a simple case, btw. We could prove they
stole our work at the definition level, because they detected samples
that didn't actually exist, thanks to bogus definition data we
inserted. It's an old 'trick of the trade' to keep an eye on your
competition. Av/am is a cutthroat business. And, people will steal
your work without batting an eye if they think they can get away with
it.
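
As a rough sketch of how that kind of canary works in principle (all names invented, not Kaspersky's or Malwarebytes' actual method): you seed your database with a signature for a sample that has never existed, and any rival engine that "detects" the phantom has demonstrably copied your definitions rather than analyzed real malware.

import secrets

def make_canary_definition():
    """Create a signature for a sample that has never existed in the wild."""
    phantom_name = f"Trojan.Phantom.{secrets.token_hex(4)}"
    # Random bytes that match no real file; only someone copying the
    # definition itself could ever "detect" this.
    phantom_signature = secrets.token_bytes(32)
    return phantom_name, phantom_signature

def looks_like_definition_theft(rival_detections, planted_names):
    """A rival engine reporting a planted phantom name is strong evidence
    its definitions were copied, since no real sample matches them."""
    return sorted(set(rival_detections) & set(planted_names))

# Usage sketch (all names are hypothetical):
planted = [make_canary_definition()[0] for _ in range(3)]
rival_report = ["Worm.RealThing.A", planted[1]]   # pretend output from a rival scanner
print(looks_like_definition_theft(rival_report, planted))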

Two ex-employees told Reuters that the clandestine attack was
originally meant to punish smaller rivals that Kaspersky felt
were 'stealing' its technology.


Yep. Keyword, theft of intellectual property and technologies. No
actual malware released by Kaspersky, no risk to end users, either.


They do nothing of the sort. You have no idea how any of this
works.


they *have*, and i do, much more than you do. see above.


No, they haven't, and, you have no ****ing idea how any of this
works. The article you cited even backs me up. They didn't release
any malicious code to the public. They released bogus definitions
that some less-than-honorable competitors took without permission and
tried to use. Definitions for NON-EXISTENT malware.

I don't know where the **** you get off thinking you know a damn
thing about how this works behind the scenes when you've never been a
part of it, from either side of the fence. I have, you haven't. I
know a hell of a lot more about this, from both sides than you ever
will. I wasn't asked to come work for Malwarebytes because of my
charming personality. Kim Neely didn't write that article about me in
Rolling stone magazine because I'm a sweetheart. BOTH events took
place due to the technical skill and knowledge I have in my head.

Dumb****, ****ing asshat. You're in way over your head with me on
this.

almost none? What do you base such a silly claim on?


a solid understanding of mac os and windows.


I don't think so. You tried to compare an Internet Explorer incident
to being the direct fault of Windows itself. One is an OPERATING
SYSTEM, the other is a WEB BROWSER. An important, if not subtle
difference.

wannacry and petya can't happen on a mac.


They were not designed to run on a mac. Mac was not a target. Hell,
even the nsa thinks the mac is a ****ing joke. They have no dedicated
mac hacking/malware teams. Macs aren't used in places that are
considered high value targets. If/when they are, more interest will
be taken in 0wning them, too. And, thanks to the stupid users (like
you) who automatically assume you're safe unless you do something
stupid, it'll be like taking candy from a mute baby.


If all the terrorists go serious mac and mac only, then the three
letter agencies will take a keen interest in them. Until then, enjoy
the false sense of security you think you have. Just remember, it's
not security you're enjoying, it's low user numbers and the fact that
macs aren't used in positions of serious interest. Nobody wants your
photo collections. Only some pirates care about the work-in-progress
post-production efforts you might be doing with a mac, and there are
so many easier ways to get it if they really want it, why even bother
with the mac user?

Make yourselves a viable target and you'll learn what your security
blanket isn't made of. Until then, continue to suck on your thumb and
think all is well in the world of mac.

--
https://tekrider.net/pages/david-brooks-stalker.php

I don't have OCD - I'm just defragmenting the universe.
#196 - July 20th 17, 12:12 PM, posted to rec.photo.digital - -hh

On Thursday, July 20, 2017 at 12:47:42 AM UTC-4, Diesel wrote:
nospam wrote on Sun, 16 Jul 2017 20:23:49 GMT in rec.photo.digital:

In article
S3DQmpBqD,
Diesel wrote:
...
You couldn't just copy drm tracks to all the devices you
wanted at one time.


you could copy to unlimited ipods and burn unlimited audio cds, so
yes.


No decryption key, no playback. itunes wouldn't give the key to an
unlimited number of ipods. Would have defeated drm had it done so.
Or, not bothered using public/private crypto in the first place.


Looks like someone failed to RTFM at the time: nospam did mention
what the DRM circumvention procedure was, which was a quite well
known "hole" - it apparently was left wide open on purpose because
of what Apple had negotiated with the labels and ultimately was probably
a big part of why Apple was able to subsequently negotiate with the
labels to do away with DRM entirely.

that's why it was the least restrictive of any drm at the time.


No, it wasn't. public/private crypto used to enforce drm vs setting a
specific byte as a toggle was hardly least restrictive. Forcing you
to authorize/deauthorize devices, hardly least restrictive. A
deauthorized device no longer had the ability to decrypt tracks
stored on it. It couldn't play them, and it couldn't copy them to
something else and have it play them, without the required, MATED
decryption key.


The "weeds" isn't what's at issue, but the net policy facing outwards
to the customer. To illustrate, kindly show via citation just whose DRM
systems during that period actually allowed a customer to install a
DRM-protected music file on *more* devices/systems than Apple's
system did (IIRC, Apple's allowed up to 5 mobile devices?) - - and note
that this is while the DRM system is remaining intact, not circumvented.


in addition, *millions* of windows systems have been pwned by
wannacry and petya in recent months, shutting down entire
companies.


Again, that's due to the fault of incompetent users and it staff for
having an improperly setup LAN that allows a single workstation or
multiple workstations the ability to harm the network as a whole
and/or cause damage to a file server beyond that of the logged in
users personal files/shares.


Blame the Server for this, specifically where all of these systems are
essentially copies of Unix, where the policies were very simple:
it is the likes of "drwxrwxrwx" that allows a 'bot to run wild
through a Server (and historically, that is precisely what happened in
workplace office servers with IIRC 'Melissa', causing infections to
spread from a single node through a believed-trusted source).
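
To put the "drwxrwxrwx" point in concrete terms, here is a small sketch (the path is just a placeholder) that walks a tree and flags anything world-writable, which is exactly the permission bit that lets a bot scribble everywhere:

import os
import stat

def find_world_writable(root):
    """List files and directories under root that any user can write to."""
    hits = []
    for dirpath, dirs, files in os.walk(root):
        for name in dirs + files:
            full = os.path.join(dirpath, name)
            try:
                mode = os.lstat(full).st_mode
            except OSError:
                continue  # vanished or unreadable; skip it
            if mode & stat.S_IWOTH:  # the final 'w' in drwxrwxrwx
                hits.append(full)
    return hits

# Example (path is a placeholder):
for path in find_world_writable("/srv/share"):
    print("world-writable:", path)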


it has more to do macs being more secure from the start.


No, it doesn't. It has everything to do with the value of the target.
Your mac is of no value in the real world. Your mac isn't running
electrical power grids, has no control over plc based switch gear,
etc. You won't be rerouting power to the wrong transformer or
substation by ****ing with a mac. So, you won't get the satisfaction
of doing serious damage to a substation/nearby power station by
****ing with the mac user.


But Aunt Ethyl's Windows PC isn't running the local hydro dam either,
and yet it is still quite extensively targeted. As such, your claimed
motivations need adjustment so that they reflect ground truth.


writing mac malware is not worth the greater effort, while writing
windows malware is easy and profitable. simple as that.


As I said.. You know nothing about it.


That statement fails to dispute the point being made. As has been
noted before, there's a hell of a lot of gold stored in Ft Knox. And yet,
it's the corner bodega which gets robbed for a relative pittance. But
hitting such a vulnerable site is easier, and repeated, it adds up the
same way that consistently hitting a bunch of 'easy' singles in a baseball
game invariably racks up the score.


-hh
#197 - July 23rd 17, 08:05 PM, posted to rec.photo.digital - nospam

In article
5X1o52P7L0d2ZyLf8D,
Diesel wrote:


Nope. I shared urls yesterday concerning Apples DRM and it's
lockdown limitations. Although they don't do that silly ****
anymore with their compressed music tracks, they used to.


nope. apple never did what you claimed they did.


Apple used DRM and was restrictive in doing so. A claim that's
supported by facts.


nope.

the *actual* fact, which you repeatedly ignore, is that apple's drm
was the least restrictive of any drm at the time, so much so that it
was invisible to the end user unless they were up to no good.

apple even used an industry standard format, as opposed to what
microsoft did (they too had drm), with their protected wmv.

You couldn't just copy drm tracks to all the devices you
wanted at one time.


you could copy to unlimited ipods and burn unlimited audio cds, so
yes.


No decryption key, no playback. itunes wouldn't give the key to an
unlimited number of ipods. Would have defeated drm had it done so.
Or, not bothered using public/private crypto in the first place.


except that's *exactly* how it worked.

you are wrong again.

http://www.trivergence.com/market.asp?MarketID=4050
• April 2003 -- Apple launches the iTunes Music Store, which enables
users to download more than 200,000 songs for $.99 each. The service
provides music with a range of rights, including listening on an
unlimited number of iPod music players. Jimmy Iovine of Universal
Music characterizes the service as "the first offensive move in what
has been a primarily defensive game."

https://www.apple.com/asia/support/i...l/aciTunesPc_t16.html
You can also download your purchased music to an unlimited number of
iPod devices to take your songs on the road. You can add the music
you purchased from the iTunes Music Store to any of your playlists.
And you can even burn your purchased music to a CD.



it's so well established that google prohibits using windows
internally unless it's absolutely required.


I could care less what googles personal business operation rules are
in their own IT departments. It's a moot point, anyway.


it's not moot at all and the only reason you don't care is because it
shows you to be wrong.

and it ain't just google:

https://www.theinquirer.net/inquirer...warns-against-using-windows-8-due-to-security-risks
THE GERMAN GOVERNMENT reportedly has warned against using Microsoft's
Windows 8 operating system, claiming that technology in PCs running
it makes them more vulnerable to cyber attacks.





Again, you haven't got the foggiest idea how this works from a
low level aspect. You've likely never written low level code
yourself on any modern machine (mac, or pc) and certainly nothing
intended to be propagating, either via user interaction and/or on
it's own.


nonsense and it's *you* who hasn't the slightest idea how macs
work.


What specifically that I wrote is nonsense? Your reply (which seems
to be quite typical for you) is ambiguous, at best.


i've already explained *numerous* things you've gotten wrong.

everything you've written about apple is anywhere from wrong to flat
out absurd, and what's worse is you refuse to learn.

you know *nothing* about macs, other than they're made by apple.


Not true.

everything you've said about macs is anywhere from wrong to flat
out absurd.


Again, not true.


yet you keep getting everything wrong.



malware authors target windows and now android because it's
easy and the return on investment is huge.

Please don't pretend to tell me what malware authors do. You've
never been one.


i'm not pretending. that's exactly what they do.


You're writing from your arsehole on a subject you have no personal,
first hand knowledge of. That's called 'pretending' You're attempting
to claim knowledge that you do not have. And attempting to bull****
others into thinking you're some kind of authority on the subject,
with not a single line of malicious code to your name. Quite a feat
you've set out for yourself. You have a better chance of winning a
mega lottery, each time you play, ten times in a row.


insults means you have nothing.

you have *no* knowledge of what i've done or haven't done.

i've been in the industry for *years*, having written a *lot* of mac
and ios software, including for some rather well known companies, along
with writing some windows, unix, mainframe and mini software. i am
*very* familiar with the internals of mac os and ios.

if anyone is pretending, it's *you*, who repeatedly insists you know
more about apple and their products than apple does.

you're nothing more than a poseur, and a ****ty one at that.


my personal favourite is when a mac anti-malware utility
quarantined the virtual memory swap files. needless to say, that
didn't end well. the level of stupidity for that to even happen,
nevermind get past testing, is mind boggling.


I can see how that might have happened, but, that's because I
understand what's involved in the creation and detection of malware,
from first hand experience...


the only reason it happened is because it was written by incompetent
morons and not tested before release.

worse, some anti-malware companies have actually written their
own malware and released it, then bragged that they were first
to 'detect' it.

Heh, that's actually a common myth. I'm surprised someone of your
supposed stature actually bought it. even for a second. Well, not
really, but...


actually, it's a fact, not a myth.


It's a myth. The fact you actually believe it, does speak volumes.


it's not a myth. once again, i personally *know* several people who
were involved.

just because you haven't heard about it doesn't mean it never happened.

there's a lot you don't know.
#198 - July 23rd 17, 08:05 PM, posted to rec.photo.digital - nospam

In article , -hh
wrote:

that's why it was the least restrictive of any drm at the time.


No, it wasn't. public/private crypto used to enforce drm vs setting a
specific byte as a toggle was hardly least restrictive. Forcing you
to authorize/deauthorize devices, hardly least restrictive. A
deauthorized device no longer had the ability to decrypt tracks
stored on it. It couldn't play them, and it couldn't copy them to
something else and have it play them, without the required, MATED
decryption key.


The "weeds" isn't what's at issue, but the net policy facing outwards
to the customer. To illustrate, kindly show via citation just whose DRM
systems during that period actually allowed a customer to install a
DRM-protected music file on *more* devices/systems than Apple's
system did (IIRC, Apple's allowed up to 5 mobile devices?) - - and note
that this is while the DRM system is remaining intact, not circumvented.


once again, it was an unlimited number of ipods.

the only limit was for *computers*.

the reality is that the majority of users only needs to authorize one
computer, the one which hosts their music library and from which they
sync their ipods.

five is actually rather generous. most people don't even own five
computers.
#199 - July 24th 17, 01:21 AM, posted to rec.photo.digital - PeterN[_6_]

On 7/23/2017 3:05 PM, nospam wrote:

snip


the only limit was for *computers*.

the reality is that the majority of users only needs to authorize one
computer, the one which hosts their music library and from which they
sync their ipods.


Realistically, the restrictions create a problem. Several years ago my
daughter gave me an iPod, about half filled with some of my favorite
music. The source for the music was my music collection, consisting of
some irreplaceable recordings, including original Louis Armstrong,
Toscanini, Caruso, Lanza, three different versions of Wagner's first,
etc. The original computer that she used is long gone. I would like to
transfer the music to my iPhone, and add new music. Apple has told me
that it can't be done. I'm sure there is a way. The suggestions on some
of the Internet forums have not been very helpful. Realistically, the
reality is that you don't know what the hell you are talking about.
Don't give me that bull about "most." There are many others who have
needs similar to mine.




five is actually rather generous. most people don't even own five
computers.

Irrelevant.

--
PeterN
#200 - July 24th 17, 01:38 AM, posted to rec.photo.digital - nospam

In article , PeterN
wrote:


the reality is that the majority of users only needs to authorize one
computer, the one which hosts their music library and from which they
sync their ipods.


Realistically, the restrictions creates a problem. Several years ago my
daughter gave me an iPod, about half filled with some of my favorite
music. The source for the music was my music collection, consisting of
some irreplaceable recordings, including original Louis Armstrong,
Toscanini, Caruso, Lanza, three different versions of Wagner's first,
etc. The original computer that she used is long gone. I would like to
transfer the music to my iPhone, and add new music. Apple has told me
that it can't be done. I'm sure there is a way.


of course it can be done.

just copy the music off one of the backups you made before getting rid
of the computer. very easy.

since the music is supposedly irreplaceable, you would certainly have
made *multiple* backups. after all, it's irreplaceable. so you say.

The suggestions on some
of the Internet forums has not been very helpful.


then you didn't look very hard. it's very easy to copy music off an
ipod.

Realistically, the
reality is that you don't know what the hell you are talking about.


realistically, i know *far* more about it than you do.

Don't give me that bull about "most." There are many others who have
needs similar to mine.


not many, but those that do are only in that situation because of their
own mistakes, that being not having backups.

you ****ed up. simple as that.

try to learn from your mistakes rather than argue.
 



