#254, August 12th 17, 08:38 PM, posted to rec.photo.digital
From: nospam
Subject: Where I keep my spare cats.

In article XnsA7CBEE17927AAHT1@dr50n7Lg2Q3UT128P02Mrd6V9H9D2cxu7y3AkI70C48M5kGFxO.4m50YcRx, Diesel wrote:

it's a very good comparison and exactly why unix apps are second
rate. there's no incentive to fix any of the problems.


Horse ****.


it's true.

state of the art software is almost entirely mac/windows and more
recently, ios/android, because there's no financial incentive to bother
with anything else.

there's the occasional exception, but it's very rare. the best software
engineers do *not* work for free.

only for mindless sheep does popularity matter.


That would be the majority of the general public.

that's why apple bashers keep citing the 90% market share as if
it's an indication of quality. it's not.


I'm not an Apple basher, either.


bull****.

any time anyone says anything anti-apple, you cheer it, even when it's
completely bogus.

any time anyone says anything positive about apple, you immediately
dismiss it as false.

as for beta/vhs, tv stations used betacam (the pro version of
betamax), not vhs. they chose that because it was the only system
that did what they needed, not because it was popular, so your
example doesn't even prove what you thought it did.


Talk about moving goal posts.


which is why i pointed out that you did exactly that by mentioning
beta/vhs.

For the end user, betamax lost out to
vhs, and, it wasn't because vhs was better technology or video
quality wise. TV stations are not the typical end user, and, they
usually have much nicer gear; specialized gear for the work they do.


in other words, tv stations bought what fits their needs, not what was
popular.

did you have a point? nope.
did you just contradict yourself again? yep.


prices are competitive and macs are the *only* platform
that can run mac, windows *and* unix.

http://emulators.com/

emulation means the host system *can't* run it, it has to
emulate it. you just proved my point.

Actually, I discredited what you wrote.

no you haven't. you don't even *understand* what i wrote.

Yes I do. You claimed that macs were the only ones that could run
mac, windows and unix, but, that's not true.


it is true.


No, it isn't. As the url I provided shows.


wrong again.

macs can run mac/windows/unix apps *natively*. no emulation required.
no other system can do that.

not only that, but emulation won't work for a lot of apps, not just
mac, even if the performance hit was not an issue.

running mac os on a hackintosh is a violation of the eula, plus a
hackintosh doesn't do most of the things a genuine mac can do,
even if violating the eula wasn't a concern.


what exactly is a hackintosh?


it's a non-mac that runs mac os, except that it requires jumping
through a lot of hoops to get it to work and more hoops to keep it
working in the event updates undo previous hoops. also, quite a bit of
mac os won't work at all because the hardware it requires is not
present.

You keep spouting that mac does things
a PC cannot do, aside from running some very specific apple software
or including some very specific Apple hardware; what specifically can
a mac do that a PC cannot? This is your chance to try and sell me on
a mac, btw. Or, perhaps another individual reading our 'discussion',
if you want to stretch the word's meaning that far. I'll give you some
leeway on that.


i already listed a bunch of things.

you didn't understand any of them then and i haven't seen any
indication that you will now.

you claimed a pc could emulate most of them (which is amusing in
itself). i challenged you to cite specific examples, but you (wisely)
chose to not embarrass yourself any further.

i'm not interested in selling you or anyone else on a mac. i don't give
a flying **** what you or anyone else uses.

what matters is that people make an *informed* *decision* based on
*accurate* information, not myths and propaganda, so that they can
choose a product that best fits their needs, no matter who makes it.

no single product is best at everything.

The iphone battery typically lasts less than two years. And,
that's if you take good care of it.


complete utter nonsense. not even close to reality.

http://www.cio.com/article/2419524/c...gy/how-to-know-if-your-iphone-battery-is-on-death-row.html


bogus article.


I seriously doubt that.


doubt it all you want, but it's completely bogus. it's bull****. you
can't see past your hate to realize it's bull****.

batteries do *not* fail in two years, whether it's apple, samsung or
some other company.

android devices, which use the same battery technology as apple does
(lithium ion or lithium polymer), often from the very same battery
manufacturer, last much longer than 2 years.

http://www.macworld.com/article/1058.../iphonebattery.html


Oh, that wouldn't have any bias towards it. None at all. /sarcasm.


you only believe something if it's anti-apple.

jason is one of the most respected journalists in the industry and that
article is *accurate*.

he's also quoting apple, who are legally bound to *not* lie about their
products.

the reality is that an iphone battery lasts many *years* with
minimal degradation.


That isn't reality.


yes it is reality, and not just apple either.

in general, lithium polymer and lithium ion batteries are rated at 5
years with minimal degradation (80% capacity).

only a *defective* battery will fail in 2 years.

nothing is perfect and a tiny percentage of batteries will be defective
and may prematurely fail, but that's the *exception*, not the rule.

defective batteries will be replaced *for* *free* under warranty and in
many cases, outside of warranty. apple is well known for the latter,
other companies not so much.

nearly half of iphones currently in use (as of this past april) are at
least 3 years old:
http://zdnet3.cbsistatic.com/hub/i/r...015-46e7-b23b-b1db6c59eb52/resize/770xauto/fbef1701d6f7cfaab3fff552d60fd517/2017-05-0112-41-05.jpg

so much for your bull**** claim of it'll be dead in two years.

even at 80%, it's still very usable. an iphone 7+ is rated for 21 hours
talk time and 15 hours wifi use, so after 5 years at 80% capacity, it
would have roughly 17 hours talk time and 12 hours internet use. that's
still *very* good.
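the arithmetic is trivial to check. a quick sketch in python; the 21 h / 15 h ratings and the 80% end-of-life figure are the numbers quoted above, not measurements:

```python
# Back-of-envelope check of the talk-time claim (assumed inputs:
# the iphone 7+ ratings and the ~80% capacity-after-5-years figure
# cited in this thread).

RATED_TALK_HOURS = 21   # rated talk time, per the post
RATED_WIFI_HOURS = 15   # rated wifi use, per the post
EOL_CAPACITY = 0.80     # typical li-ion/li-poly rating after ~5 years

talk_after_5y = RATED_TALK_HOURS * EOL_CAPACITY   # 16.8 -> roughly 17 hours
wifi_after_5y = RATED_WIFI_HOURS * EOL_CAPACITY   # 12.0 hours

print(f"after 5 years: ~{talk_after_5y:.1f} h talk, ~{wifi_after_5y:.1f} h wifi")
```

which matches the roughly 17 and 12 hour figures above.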

in fact, that's better than many android phones when new, some of which
can't make it through a single day (mostly on standby) without being
recharged.

complete non-issue, fabricated by a hater.



it means bump up the clock speed, increase the
capacity/speed of the ssd, etc. of *existing*
configurations.

Are you going to overclock the existing cpu to bring up the
clock speed, or, changeout the cpu for one that runs at a
higher speed, natively? I'm not a big fan of overclocking
myself. Or, did you throttle the cpu clock speed back and
decide to bring it to the rate it was originally designed to
run and increase cost to consumer for that 'higher' clock
speed? The latter seems shady to me. Almost dishonest.

nobody said anything about overclocking.

yet another thing about which you know nothing.

Okay. I'll play along with this. Explain how you bumped up the
clock speed if you didn't do anything I described?


you haven't a clue.


Oh, yes, I most certainly do. You've entered a discussion with me
where you have no chance. You might as well have tried to bluff your
knowledge concerning assembler and/or machine code at this point. The
result for you would be the same: a thorough arse kicking by me.


you're talking out your ass again. you have *no* clue what the term
speed bump means, a term that's commonly used in the industry. you
guessed wrong and are now trying to save face.

you're full of **** and you ain't fooling *anyone*.

nowhere did anyone say the end user bumped the specs.


I said nothing about the end user bumping anything.


yes you did:
Are you going to overclock the existing cpu to bring up the
clock speed, or, changeout the cpu for one that runs at a
higher speed, natively? I'm not a big fan of overclocking



You claimed the
vendor (apple) bumped the clock spec. If they didn't do any of the
aforementioned things, what did they do to achieve it?


what apple did was release a *new* product with better specs than the
previous model, the same as many other companies do.

you're not involved in the industry enough to know its commonly used
terms, and you completely fail at pretending to be.




a single desktop/laptop can *easily* saturate a gigabit link
without much effort, making gigabit the bottleneck.


If you're moving some rather large files around using multiple
computers, sure. A single machine isn't that likely to saturate the
entire 1 gigabit switchgear, though. A NAS system, on the other hand,
is another beastie outright, and that's not exactly a fair comparison,
either.


a single mac can *easily* saturate a gigabit link without any
effort whatsoever, *without* a nas.

nases are also *very* common, particularly for mac users. nothing
unfair about a nas, although not required in this example.

And most home users are not going to move enough data from machine to
machine at one time to saturate a gigabit based LAN. Many small
businesses, depending on the type of business, won't even do that.
Digital movie production, serious photographic work, maybe. And the
latter is pushing it. I've set up audio recording studios that
wouldn't tax a 1 gigabit based LAN. And that's working with RAW audio
files.


your knowledge and experience are incredibly limited.

those studios should hire someone who knows what they're doing.

removing that bottleneck is generally a wise investment, one which
will pay for itself fairly quickly in increased productivity. it
might not matter that much for a home user, but it would for smb.


Of course it won't for a home user. A home user isn't going to have
multiple computers taxing a 1 gigabit link.


yes they will, and have for many years.

usb 3.0 is 5 gb/s.
usb 3.1 gen 2 is 10 gb/s.

that's already an issue.

recent macs can do 2-3 giga*bytes*/sec from ssd, which can saturate a
10 gig link, never mind gigabit, which it does routinely.

https://apple.insidercdn.com/gallery/18862-18409-Extended-l.jpg
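the numbers speak for themselves. a quick sketch of the arithmetic; link rates are nominal signalling rates (protocol overhead ignored), and the 2.5 GB/s ssd figure is taken from the range quoted above:

```python
# Why gigabit (and even 10 gig) is the bottleneck for a single machine.
# Assumed inputs: nominal link rates and the 2-3 GB/s ssd throughput
# cited above (2.5 GB/s used here as a middle value).

GIGABIT_LINK = 1.0        # Gb/s
TEN_GIG_LINK = 10.0       # Gb/s
USB3_0 = 5.0              # Gb/s (usb 3.0 / 3.1 gen 1)
USB3_1_GEN2 = 10.0        # Gb/s
SSD_GBYTES_PER_SEC = 2.5  # recent mac ssd throughput, per the post

ssd_gbits = SSD_GBYTES_PER_SEC * 8  # bytes -> bits: 20 Gb/s

# a single usb 3.0 drive already outruns a gigabit link
assert USB3_0 > GIGABIT_LINK
# the internal ssd alone outruns even a 10 gig link
assert ssd_gbits > TEN_GIG_LINK

print(f"ssd: {ssd_gbits:.0f} Gb/s vs gigabit ({GIGABIT_LINK:.0f}) "
      f"and 10 gig ({TEN_GIG_LINK:.0f})")
```

so even one machine copying from its own ssd has more than enough throughput to fill either link.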


It may matter for an smb,
depending on what the network is actually being used for and how many
users are actually using it at one time, as well as file
server/workstation configurations. However, to claim automatically
that it would for smb is foolish and isn't supported by facts.


it's well supported, even for a solo user, so it absolutely would for
smb.

the fact is that existing macs can easily support 10 gig, whereas
most pcs cannot. pc users will need a new computer. mac users
won't.


Actually, some pcs have had 10 gigabit cards built into the mainboard
for several years now.


not at the consumer level and certainly not on laptops, due to cost,
space and thermal constraints.

macbooks under $1000 can use 10 gig network adapters at full bandwidth.

Users that have serious
LANs that actually do have a reason/need for it, that is. LAN setups
with one or more dedicated and serious storage capable NAS devices,
primarily. Not your typical home users.


wrong.

there's no need for a 'serious lan' or 'serious storage capable nas
devices' to benefit from 10 gig.

anything with usb 3.0 can saturate gigabit.

recent macs push 10 gig to its limits.

just two computers on the network transferring files, and gigabit is a
bottleneck.

furthermore, few people actually upgrade anyway.


That's nothing more than your own personal opinion.


it's a well established industry statistic.

One of which I do
not share, because the experience I have in the field doesn't support
your opinion.


your experience, as with everything else you've done, is extremely
limited, and what's worse is you blindly discard what others have done
and refuse to learn.

Apple users likely don't do much in the way of hardware
upgrades because they are rather limited in what upgrades they can
actually do.


nonsense.

macs have a longer useful life than pcs and can be upgraded when pcs
cannot. that's one of many reasons why the resale value of a mac is
much higher than a pc.

even a lowly $499 mac mini from 5 years ago can have a 10g nic plugged
into it and can utilize all of its bandwidth, thereby extending its
useful life several more years.

a similar pc from that era would need to be replaced.

PC users are a different animal.


nope.

pc users, just like mac users, want to get work done.

users don't give a **** about what parts are inside and when it comes
time to upgrade, they get a new computer because *everything* in it is
outdated, not just one component.

upgrading would entail replacing *all* of the parts piecemeal, which
would cost *more* than buying a new system.