Re:Several THOUSAND (Score 1)
Exactly how many suppliers does it take to supply an indicator bulb???
That's a trick question.
Answer: None. In 2025, Everything's Computer.
Omitting the camera saves the consumer the minor cost of having to use up a square inch of electrical tape.
Show me how you can create a pricing system where the totals of all possible combinations of inventory selections come out to only 3 or 4 mod 5.
So back then, prices moved in increments worth more than today's quarter.
People need to consider: rounding to a nickel can never be more than 2 cents less accurate than rounding to pennies. Let's say you live in a backwater state and still make only $7.25 per hour. Each transaction could then cost you at most about 10 seconds' worth of extra wages. However, transactions randomly round up and down, so the cumulative error only grows with the square root of the number of transactions, and the average error per purchase shrinks accordingly. Statistically speaking, you'll gain or lose only a couple of seconds of your time per purchase. Probably less time than it took to fumble for all those pennies.
But it sucks to be poor. Without pennies, someone who makes $50k per year will gain or lose only milliseconds' worth of salary per transaction on average.
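If anyone wants to check that back-of-the-envelope math, here's a quick sketch in Rust. It uses a toy xorshift PRNG and assumes every transaction total's final cents digit is equally likely, which real-world pricing won't exactly match.

// Toy xorshift64 PRNG; good enough for a forum-post simulation.
fn xorshift(state: &mut u64) -> u64 {
    *state ^= *state << 13;
    *state ^= *state >> 7;
    *state ^= *state << 17;
    *state
}

fn main() {
    let mut state: u64 = 0x9E37_79B9_7F4A_7C15; // arbitrary nonzero seed

    for &n in &[100u64, 10_000, 1_000_000] {
        let mut net_error_cents: i64 = 0;
        for _ in 0..n {
            // Assume the final cents digit of each total is uniformly random.
            let last_digit = (xorshift(&mut state) % 10) as i64;
            // Nearest-nickel rounding: 1 and 2 round down, 3 and 4 round up, 0 and 5 are exact.
            let rem = last_digit % 5;
            net_error_cents += if rem <= 2 { -rem } else { 5 - rem };
        }
        println!(
            "{:>9} purchases: net error {:>6} cents ({:+.4} cents per purchase)",
            n,
            net_error_cents,
            net_error_cents as f64 / n as f64
        );
    }
}

The net error should wander around zero, and the per-purchase average should keep shrinking as the purchase count grows.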
"But the stores will set prices so that it always rounds up!!!!1!" -- That only works for one item at most. Savvy shoppers would strategically buy combinations of items that always round down.
In a few years, all of these GPUs will be available on eBay for a few bucks each.
Then I'll finally be able to snag a whole bunch of them and build a Beowulf cluster to run SETI@home faster than anybody else.
Because someone still has to take time to read the slop. Over and over.
That work sounds like a great candidate to offload onto AI!
No, it's an absolutely terrible idea. It may be great for the businesses, but it's absolutely fucking terrible for the consumer.
This is absolute fucking insanity. Imagine having to carry 6 different cards and wondering which one a particular store is going to take.
The merchants need to consider that if their competitor down the street still accepts rewards cards, customers might simply switch, and then they've lost the whole sale. All this over a 1% extra cost to the merchant.
In the meantime, they think nothing of offering things like buy-one-get-one-free deals to lure in a few more customers.
That summary at the top of this story is just way too long. I'll have a chatbot break it down and give me the gist.
There is no new Firefox for OS/2; I will not be supporting Kit.
If you're locked inside an ATM, have you tried banging on the case to alert passers-by?
Strangely, no one connects the many claims that garbage-collected languages "eliminate a whole class of programming errors" (a good thing) with the aforementioned claim that typed languages "eliminate a whole class of programming errors" (also a good thing).
Almost nobody uses "untyped languages". Few of those even exist, with Forth and various assembly languages being the main examples. (C, with its type system that is as airtight as a sieve, gets an honorary mention.)
You're probably harping on dynamically typed languages. In such languages, the runtime still knows *exactly* what type every item of data has. These are not weakly typed. But what you obviously prefer are "statically typed" languages.
Static typing might statistically reduce some errors, but it certainly can't "eliminate whole classes". Consider "set_warhead_target(float latitude, float longitude)". Did the type system give you any protection from accidentally swapping the two parameters? That's really the problem that you're so worried about: accidentally using the wrong data value in the wrong place.
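To make that concrete, here's a minimal sketch in Rust, with set_warhead_target as a dummy stand-in for the example above. Both calls type-check, because both parameters are just plain floats.

// Dummy stand-in for the signature in the comment above.
fn set_warhead_target(latitude: f64, longitude: f64) {
    println!("targeting lat {latitude}, lon {longitude}");
}

fn main() {
    let latitude = 40.7;
    let longitude = -74.0;

    set_warhead_target(latitude, longitude); // the intended call
    set_warhead_target(longitude, latitude); // swapped, yet it compiles and runs just the same
}

Wrapper types (a Latitude struct and a Longitude struct around f64) would turn the swap into a compile error, but nothing about the bare signature forces anyone to bother.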
However, very few statically typed languages (with Rust being a notable exception) have eliminated the biggest source of type errors in computing: Null, which is a bogus placeholder that matches any pointer type (or reference type, depending on the language's nomenclature). So in many cases you have no less risk with static typing than you do with accidentally feeding a string into a Python sqrt() function. And in the case of C or C++, you can be much worse off, as in segfaults and remote exploits.
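And for the Rust exception mentioned above, a minimal sketch of how Option stands in for null; find_user is a hypothetical lookup, not any real API.

// Hypothetical lookup: returns None when there is no such user.
fn find_user(id: u32) -> Option<String> {
    if id == 42 {
        Some(String::from("alice"))
    } else {
        None
    }
}

fn main() {
    let user = find_user(7);

    // There is no way to treat user as a String here: Option<String> has none of
    // String's methods, so the compiler forces the None case to be handled first.
    match user {
        Some(name) => println!("found {name}"),
        None => println!("no such user"),
    }
}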
You can technically do your taxes for free by manually filling out the forms yourself.
I can't think of any business, or any other government function, that still makes me fill out paper forms. At one recent employer I didn't fill out a single paper or PDF-style form, HR or otherwise, in the entire time from the day I applied until the day I resigned.
Nobody uses paper forms any more. Everything is online. Taxes should be no different, and there should be no 3rd party middlemen collecting tolls for the "privilege" of doing something online the way everything else is done.
SAS has been dead for 15 years; it started with R, and then Python absolutely destroyed it. No one teaches SAS in universities any longer; why would they? It's terribly expensive and absolutely fucking dead.
We migrated away from SAS back in 2017 and never looked back. The only verticals still using it are heavily regulated and running long-standing legacy code that they're slowly migrating to Python.
I remember absolutely dying when they tried to renegotiate our contract UP back in 2015. I flat out told them they were dead and we were moving away from them, and they told me, "Good luck managing your data without us!"
Two companies and 10 years later, we're doing just fine and they are not.