August 9th 17, 11:00 PM, posted to rec.photo.digital
-hh
Where I keep my spare cats.

On Tuesday, August 8, 2017 at 11:15:54 PM UTC-4, Diesel wrote:
nospam wrote on Tue, 08 Aug 2017 20:33:00 GMT in rec.photo.digital:

it's a very good comparison and exactly why unix apps are second
rate. there's no incentive to fix any of the problems.


Horse ****.


Unfortunately, the Linux (+Unix) desktop products fail to
demonstrate otherwise...but I digress.



only for mindless sheep does popularity matter.


That would be the majority of the general public.


Except that both points are misguided...but I digress.


as for beta/vhs, tv stations used betacam (the pro version of
betamax), not vhs. they chose that because it was the only system
that did what they needed, not because it was popular, so your
example doesn't even prove what you thought it did.


Talk about moving goal posts. For the end user, betamax lost out to
vhs, and, it wasn't because vhs was better technology or video
quality wise. TV stations are not the typical end user, and, they
usually have much nicer gear; specialized gear for the work they do.


Except that Beta did die, which illustrates that VHS was determined
by the competitive marketplace to have been 'better', even if you're
not personally able to ascertain the reason why...YA digression.


prices are competitive and macs are the *only* platform
that can run mac, windows *and* unix.

http://emulators.com/

emulation means the host system *can't* run it, it has to
emulate it. you just proved my point.

Actually, I discredited what you wrote.

no you haven't. you don't even *understand* what i wrote.

Yes I do. You claimed that macs were the only ones that could run
mac, windows and unix, but, that's not true.


it is true.


No, it isn't. As the url I provided shows.


Just because one *can* do something doesn't mean that it is in
full compliance with all legal obligations. Gosh, another digression.


The iphone battery typically lasts less than two years. And,
that's if you take good care of it.


complete utter nonsense. not even close to reality.

http://www.cio.com/article/2419524/c...gy/how-to-know-if-your-iphone-battery-is-on-death-row.html


bogus article.


I seriously doubt that.

http://www.macworld.com/article/1058.../iphonebattery.html


Oh, that wouldn't have any bias towards it. None at all. /sarcasm.


From a design engineering standpoint, lifespan boils down to where
the discharge limit parameters are set: it is very tempting to cheat
and dip too far down, which kills a Li-Ion in relatively few cycles
but lets you claim a higher battery rating for marketing and sales
purposes. Thus, if your experience is that the stuff barely lasts
two years, either you're really beating on it with multiple full
cycles daily, or you've bought cheap crap that was deliberately
under-engineered in order to sell better.

Meantime, Apple's design-space choices for the iOS power specs
appear to be pretty conservative: they make a two-year life the
expectation at roughly the 95th percentile of use, and practical
battery-capacity EOLs start to show up around the 4-5 year mark
for the more typical customer's duty cycle (50th percentile; phones
tend to be cycled more per unit time than tablets). YA digression.
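
To put rough numbers on that, here's a back-of-envelope sketch in
Python; the 500-cycles-to-80%-capacity figure is a typical Li-Ion
rating and the daily-cycle rates are assumptions, not measurements
from anyone's actual phone:

# rough years until a Li-Ion pack fades to ~80% of original capacity,
# given how many full charge/discharge cycles the user burns per day
def years_to_80_percent(full_cycles_per_day, rated_full_cycles=500):
    return rated_full_cycles / (full_cycles_per_day * 365.0)

print(years_to_80_percent(1.0))   # heavy user, ~1 full cycle/day -> ~1.4 years
print(years_to_80_percent(0.3))   # lighter duty cycle            -> ~4.6 years

Which is roughly why the heavy-use crowd reports "about two years"
while the more typical customer sees 4-5.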


the reality is that an iphone battery lasts many *years* with
minimal degradation.


That isn't reality.


A PSA "FYI" digression:
Next time you see an iOS device with the old-style (pre-Lightning)
plug, be aware that those devices are all now more than 5 years old.



I'm not like the IT people you've known over the years. Nor are the
majority of my peers.


That's because your day job is still as an Electrician's helper,
pulling wire...right? But we digress even further downhill.


they need people to feel threatened by malware so that their sales
continue.


That's why almost every player has free versions of their products
that use the same detection/removal engine and shared definitions as
the commercial counterparts, right? Do you even believe the bull****
you're trying to peddle here yourself?


How many IT certifications are there in Marketing? /S




a single desktop/laptop can *easily* saturate a gigabit link
without much effort, making gigabit the bottleneck.


If you're moving some rather large files around using multiple
computers, sure.


Even the latency will clobber you.

A single machine isn't that likely to saturate the
entire 1gigabit switchgear, though.


Gigabit Ethernet = 125 MB/sec
SATA-1 = 150 MB/sec

Oops!
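
For anyone who wants to check the arithmetic, a trivial Python
sketch (the 8b/10b line coding is standard SATA behavior, not
something from this thread):

# theoretical max payload rates, in MB/sec
gbe_MBps   = 1e9 / 8 / 1e6               # 1 Gbit/s over 8 bits/byte -> 125.0
sata1_MBps = 1.5e9 * (8 / 10) / 8 / 1e6  # 1.5 Gbit/s line, 8b/10b   -> 150.0
print(gbe_MBps, sata1_MBps)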


And most home users are not going to move enough data from machine
to machine at one time to saturate a gigabit based LAN.


Where "most home users" is being defined as individuals who
employ DAS instead of NAS because a Gigabit Ethernet connected
NAS is more expensive and slower...right?

Many small businesses depending on the type of business won't
even do that.


Of course not, because the local data storage has been regularly
running on SATA-3 (600 MB/sec) for years now, which means that
Gigabit Ethernet has been a huge bottleneck for just as long.


Digital movie production, serious photographic work, maybe. And the
latter is pushing it. I've setup audio recording studios that
wouldn't tax 1gigabit based LAN. And that's working with RAW audio
files.


Just because you've anecdotally not had bandwidth problems with GbE
doesn't mean you get to look down your nose at others. For example,
a circa-2010 "600x" CF card for still photography was spec'ed at a
90 MB/sec read, which effectively pushed Gigabit Ethernet to its
practical limit...and the new stuff today (such as CFast) is spec'ed
at 510 MB/s reads, which is 4x the theoretical max for GbE (and
already ~40% of the capacity of 10GbE).
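
Same arithmetic, worked in Python with the figures above (the card
numbers are the vendor read ratings already cited; the wire speeds
are theoretical maxima):

# card read specs vs. wire speed, all in MB/sec
gbe, ten_gbe   = 125, 1250
cf_600x, cfast = 90, 510
print(cf_600x / gbe)     # ~0.72 -> a 2010-era card already fills ~3/4 of GbE
print(cfast / gbe)       # ~4.1  -> one CFast card outruns GbE roughly 4x over
print(cfast / ten_gbe)   # ~0.41 -> ~40% of a 10GbE link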


removing that bottleneck is generally a wise investment, one which
will pay for itself fairly quickly in increased productivity. it
might not matter that much for a home user, but it would for smb.


Of course it won't for a home user. A home user isn't going to have
multiple computers taxing a 1gigabit link.


Depends on the home user and their use case ... a single higher-end
photo hobbyist can saturate a GbE link with just one machine.
Contemplate making a 200GB transfer over GbE ... even if it were
able to run at the theoretical max, it's still a quick half-hour
time suck.
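
The arithmetic behind that half hour (Python; assumes GbE's 125 MB/s
theoretical ceiling with zero protocol overhead, so real-world is
worse):

# time to push 200 GB across GbE at its theoretical best
transfer_MB = 200 * 1000         # 200 GB expressed in MB
seconds     = transfer_MB / 125  # 125 MB/s wire-speed ceiling
print(seconds / 60)              # ~26.7 minutes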


furthermore, few people actually upgrade anyway.


That's nothing more than your own personal opinion. One of which I do
not share, because the experience I have in the field doesn't support
your opinion. Apple users likely don't do much in the way of hardware
upgrades because they are rather limited in what upgrades they can
actually do. PC users are a different animal.


There have been industry studies which do indeed show that hardware
upgrades have become the exception, not the rule ... this isn't
the 1990s anymore.

And sure, IT-centric hobbyists tend to do more DIY upgrades,
although much of that is driven by economics: they can't write
off their expenses the way a business does, even when they have
the free cash to spend, so there are different motivations behind
their behaviors. The tendency is strongest amongst the Linux "l33t"
stereotypes, who also tend to be the ones with the tightest
discretionary budgets, so it becomes a self-fulfilling positive
feedback loop.


-hh