Comment Poor James (Score 4, Insightful) 106

James Strawn, who was laid off from Adobe over the summer after 25 years as a senior software quality-assurance engineer.

I can only assume that for the past decade, James has been ignored, or terrible at his job. Every Adobe product has gotten progressively worse to use, forums are filled with bug reports that get ignored release after release, and the increase in system requirements does not reflect improvements in functionality.

Whether because Adobe didn't like what he had to say, or they decided not to listen to him, it's completely unsurprising that he lost his job.

The folks offering $500K/year for AI experts aren't going to take just anyone who makes the claim on a resume; they're almost guaranteed to be looking to poach someone at OpenAI or Google. Practically speaking, they're looking to benefit from the experience that those companies paid for...and James doesn't have it.

On the upside, odds are pretty good that James will have a job in short order, helping to deal with the fallout from 'vibe coders' who don't know how to do real-world testing. He'll probably run into some combination of age discrimination and salary discrimination (no way he's working for $60K with 25 years at Adobe), but once the messes get too big to ignore, I'm pretty sure he'll be able to become a project manager who helps direct fixes for deployed code that never got actual QA. The need is most definitely there; it'll just take a bit more time to prove to the brass that he's more valuable to the company than the MBAs scanning their now-spherical product for more corners to cut.

Comment Re:What an amusing coincidence! (Score 3, Insightful) 35

We thought we would save money by going a-la-carte. Now it's more expensive.

Well, I think we did 'simple math' when we thought that, rather than 'real-world math'. If my cable bill is $150/month for 100 channels, that's $1.50/channel. Since I only watch maybe 20 of them at most, 20 × $1.50 = $30/month for the 20 channels I watch. Who wouldn't want that?

The problem is that channel costs don't divide evenly. ESPN is very expensive ($8 or more of one's cable bill goes to just this channel), while Home Shopping Network and QVC have historically paid the cable companies for inclusion in the lineup. Public Access stations are legal requirements in many jurisdictions, and their content is paid for by whoever submits it for broadcast...and again, roughly nobody would include them in their custom lineup.

Finally, there *is* a baseline cost for the last-mile distribution. Whether it carries 1 channel or 1,000 channels, the infrastructure needs to exist. I'm not making any excuses for Comcast here, but someone needs to pay the townships for right-of-way on the wire runs, and the backend equipment, the staff to service it, and the staff answering the phones for CSR requests all cost something. Streaming services generally get away with chat-only support and don't have any wires to run.

So...while I'm not calling cable a good deal by any stretch, I *am* at least acknowledging that a-la-carte math would be something closer to $30 infrastructure + $10/month ESPN + probably-something-closer-to-$3/channel for the ones with actual-content, making that beautiful $30/month bill something closer to $80/month in practice.
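To make the comparison concrete, here's a quick back-of-the-envelope sketch in Python. All figures are the rough estimates from the comment above, not actual carrier rates:

```python
# Rough a-la-carte math, using the comment's own estimates
# (not actual carrier rates).

bundle_price = 150.0      # $/month for the 100-channel bundle
bundle_channels = 100
channels_watched = 20

# The "simple math" version: divide the bundle evenly per channel.
naive = bundle_price / bundle_channels * channels_watched

# The "real-world math" version: fixed infrastructure cost plus
# uneven per-channel rates (one premium sports channel, the rest
# at roughly $3/channel for the ones with actual content).
infrastructure = 30.0
espn = 10.0
adjusted = infrastructure + espn + (channels_watched - 1) * 3.0

print(f"naive: ${naive:.2f}/month")       # naive: $30.00/month
print(f"adjusted: ${adjusted:.2f}/month") # adjusted: $97.00/month
```

Depending on how many of the 20 picks count as "actual-content" channels, the adjusted bill lands somewhere in the $80-$100 range — roughly triple the naive figure either way.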

Personally, I think that there *does* need to be some sort of court case that uses United States v. Paramount Pictures, Inc. (1948) as precedent to decouple distribution from production, which would probably do more to solve the cost issues than anything else.

Comment Re:Used cars... (Score 1) 62

an 8 year old car.

"certified for windows Vista" .

So, please forgive the pedantry - actually, it might be more to your point - but the math ain't mathing here.

General availability for Windows Vista was in January of 2007, 18 years ago. Its EOL date was in April 2017, 8 years ago.

If the car is really eight years old, even if it was the very first one off the line for its model year, it would STILL be newer than Vista's EOL date. Eight years ago, Windows 10 had already been out for two years.

Half of me wants to believe the story that the car did, in fact, roll off the assembly line with a splash screen reflecting an already-EOL operating system. The other half is wondering if perhaps the car is older than eight years old.
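The arithmetic is easy to check. Here's a minimal Python sketch, assuming the comment was written in mid-2025 (the release and EOL dates are public record):

```python
from datetime import date

vista_ga  = date(2007, 1, 30)   # Windows Vista general availability
vista_eol = date(2017, 4, 11)   # end of Vista extended support
today     = date(2025, 7, 1)    # assumed "now" for this comment

# The oldest an 8-year-old car could plausibly be:
car_built = today.replace(year=today.year - 8)

# The car postdates Vista's EOL by a couple of months at minimum.
print(car_built > vista_eol)            # True
print((today - vista_eol).days // 365)  # 8 (years since EOL)
```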

Comment Re: "It might be tempting to blame technology... (Score 5, Insightful) 109

Yeah, there was never an irresponsible young person before this generation.

Way to completely miss the point...

It's not that this never happened, or even that it was some extremely exceptional case in the millennial/GenX/boomer eras...but it was extremely different.

For starters, I think that there are two sides that have both valid points and problematic extremes. In the days of yore, companies generally saw employees as an investment. It wasn't uncommon for people to work at the same company for 25 years or more. Certainly not every company, but the majority of them accepted the tradeoff that new employees would require a good amount of training before the company broke even on their paycheck, and in return, the employee who stayed would earn the company a tidy profit over time, receiving enough money to support a family in return. Today, it seems that everyone wants to 'skip to the end' - companies want to hire savants with 20 years' experience in Debian Trixie, make them always-on-call, and in return want to pay them less than half of a living wage, and keep a fear of layoffs at the front of everyone's mind.

The counterbalance is that Gen-Z knows this is the case, and treats employers with the same sense of expendability. If a company sees an employee as someone not worth investing in, with little ability to get promoted, and with both promotions and layoffs divorced from job performance or dedication, it's not unreasonable to adopt the mindset of "if they're doing what's best for them, then I'll do what's best for me"...and it's only a few more steps from there to "I'll work when I feel like it", which shows up as a willingness to no-call/no-show, doomscrolling at work, or needing to be 'helicopter managed'.

This leads us to the downward spiral - with both sides leaving themselves vulnerable to exploitation if loyalty and dedication are expressed, it is only natural to reduce that vulnerability moving forward. The result, however, is both sides getting worse...it's a Prisoner's Dilemma, and neither side has incentive to break the cycle.

Comment Re:What's the difference between tablet and phone? (Score 3, Interesting) 122

Back when the iPhone was introduced I was convinced that within 10 years computing would be mostly done this way; connecting your portable computer (smart phone) to a dock that turned it into your home computer. I'm surprised that this idea never gained traction.

I think there have been a few reasons for this.

I think the biggest one is that nobody could meaningfully agree on a form factor. Now, *I* always thought that a great option would be to have a 'zombie laptop' that had a keyboard, trackpad, webcam, and a battery, with a slot to slide your phone into. The phone would connect to the peripherals and give a 12" screen and a keyboard, while charging the phone in the process.

The devil, of course, was in the details. Even if Apple made such a device and molded it to the iPhone, the problem then became that a user couldn't put their phone in a case, or it wouldn't fit in the clamshell's phone slot. There would also need to be adapters to fit the different sized phones, or different SKUs entirely with per-device slots, which then also pigeonholes Apple into a particular physical form factor. That begets the "use a C-to-C cable" option, which is better, but makes it ergonomically annoying to use if one isn't sitting at a desk. A wireless option solves both of these problems, but kills both batteries in the process. Finally, there's the price point: the cost for the end user would need to be low enough that it doesn't just make sense to have two devices, AND the first-gen owners would likely feel some kind of way if they were stuck with their old phone because it meant buying a new clamshell. It works well on paper, but pretty much any real-world testing would show the shortcomings pretty quickly.

Suppose that was solved somehow...while the Samsung Fold phones help justify the time spent adding a multi-window interface to Android, try installing Android-x86 in a VM for a bit and watch what happens. It's been a while since I tried, but the experience was pretty bad - the inability to open e-mails in new windows was particularly infuriating, many apps take exception to having multiple concurrent instances for side-by-side usage, and window focus gets pretty tricky to navigate. It *can* be done, but it ultimately felt like all-compromise, no-improvement.

Finally, there *is* such a thing, at least to an extent. Many, MANY apps are just frontends on a website. iCloud is like this, the whole Google ecosystem is like this, Salesforce is like this...for a solid number of apps, there is a browser-based frontend that works just as well, if not better in at least some cases. Data is commonly synced with Google or iCloud or Dropbox. The number of apps that are worth running on a phone, without a desktop or browser analogue, that would justify a user getting a clamshell to run that app in a larger window...is small enough that it is seldom worth dealing with all of the *other* compromises involved.

Comment Wow... (Score 5, Informative) 89

they are constraining what you can do using the software they provide with said hardware

It has been a VERY long time since I've seen such a textbook definition of the phrase "a distinction without a difference".

On an Intel x86 PC, even the most locked-down iterations of Windows give users a means of running whatever code they want. If the user doesn't want to run Windows at all, they can download an ISO of Ubuntu or Fedora or Proxmox or VMware or GhostBSD or Haiku, make a menu change in the BIOS, and install that OS instead. Done and done. Windows can be replaced in 30 minutes or less if a user wants to, with nothing but GUI tools and YouTube tutorials that are nearly universally accurate (the main variation being where to disable Secure Boot in the BIOS).

On an Android phone, one must unlock the bootloader (which some phones prevent through artificial constraints), then hope that some Good Samaritan has made a different OS for it...and then go through 101 steps involving CLIs, recovery environments, and ADB interfaces...AND those steps and software downloads vary with each model of phone, AND Google gives app developers a means of telling users "sorry, I won't run on a phone you have control over", AND that assumes that a replacement OS is available in the first place...otherwise, the user needs to replace the phone, or go all the way to doing their own compiling of AOSP, which is its own rabbit hole.

So yeah, the argument rings incredibly hollow: "we're not constraining what the hardware can do...but we ARE constraining what the software can do AND constraining your ability to replace that software if you so choose." If the argument is that the constraints are purely software-related, then Google needs to put way more effort into streamlining users' ability to deprecate whatever software those constraints are implemented to protect. If they aren't going to do that, then they are being disingenuous.

If, in a court of law, they cannot produce documentation of a means by which the hardware can be used to run unapproved code, then I would deem this statement perjury, given the current climate.

Comment Re:Don't get it (Score 1) 155

You know why I don't drink alcohol? In part, because of the high cost. Why the fuck would I pay $5-$10 for a small glass of liquid?

$5-$10?! What a bargain! I'd probably order a mixed drink with my meal if it was still only $10.

Most of the restaurants I've been to in the past year have bumped the cost of cocktails to $15-$20 for standard stuff. I might accept it for drinks that have several different ingredients, but even a rum-and-coke goes for $15 around me - and I'm not in NYC or LA. One brunch spot I went to had a mimosa flight that was champagne and four different fruit juices, each in MAYBE 6 oz. glasses, and they wanted $40 for it.

Not to be outdone, the last time I *did* visit NYC, I went to a restaurant that made mocktails...and while my blueberry mint lemonade was indeed delicious, it was NOT worth the $10 my friend paid for it.

So yeah, I definitely share your sentiment - a nontrivial portion of the reason people aren't drinking alcohol at restaurants is the significant cost of doing so...and I'm pretty sure those higher-priced drinks *also* have less alcohol in them than they did ten years ago. Given this, it makes way more sense to get a 1L bottle for $20-$50 that I can use to make my own drinks for a month...and as much as restaurants have always used bar drinks as a source of high-margin revenue, it's not really justifiable anymore to spend that kind of money on a single drink.

Comment Re:It's a weird Puritan Christian thing (Score 1) 175

It's more likely just based on a really piss-poor understanding of STIs in biblical times. People understood that if you did a bunch of sleeping around, you'd likely fall ill. They didn't understand what caused it, so it was just "punishment from God".

I'm not sure I entirely buy this. Even as recently as the 1960s, the incidence of STIs was somewhere around 1 in 35, and that figure inherently includes all of the STIs that made the jump from animals to humans in the centuries after the time of Christ (or Paul, who was more outspoken about it, or Moses, if you're going that far back). Yes, STIs undoubtedly happened, but with a less-than-3% chance of getting one, a person had to be either "well traveled" or extremely unfortunate to contract an STI in that era. We also have to discount the asymptomatic STIs; this line of reasoning might hold water for STIs that leave visible scarring in genital regions, but not all STIs fit that description. The clearest symptom of chlamydia is difficulty in conception and birthing, but there were many, many virgins-on-their-wedding-night who had trouble giving birth throughout history.

I think there were other, more practical reasons for this system. For starters, a faithful, single-partner wife ensured paternity in a time before DNA testing. In addition, a virgin woman was deemed more desirable by the men and was able to attract more desirable suitors. Promiscuity after marriage was a poor social reflection on the husband as well.

Even though we now understand what actually causes sexually transmitted diseases and ways to reduce the likelihood of their spread, some people still cling to the whole "God doesn't like it" thing for the usual reasons that people still believe in aspects of religion which don't stand up to logical scrutiny.

Well, the Mosaic law managed to outlast the Assyrians, the Babylonians, the Persians, the ancient Egyptians, the ancient Greeks, the Romans, and the dozens of "ites" listed in the Old Testament narrative. I'm not saying that presenting the Pentateuch to Congress to put it into law in 2025 to be enforced with police and military is something that would benefit modern society, but I *am* saying that its staying power reflects some sort of a societal benefit. I submit for your consideration that even if one disagrees with the mandates present in the Mosaic law, it might be overly reductive to assume that the rules came into existence through a concern for relatively-rare STIs.

Comment Re:If it were like it was back in the good old day (Score 1) 66

By contrast, a $70 AAA title is the equivalent of spending $35 or less when most of us were kids and AAA games were, bare minimum, $50 (many SNES games were $60-$80 for bigger titles).

Video game prices are not gouging anyone right now.

And if games were an $80, one-time purchase, nothing-more-to-buy, multiplayer-over-TCP/IP-forever investment, yes, you're right. I have no problem paying even $100 for such a game.

Except most of them are not. There are a handful of exceptions (Elden Ring, Baldur's Gate 3, and so on), but the majority of games are $60-$80 for the standard edition and $100 or more for the deluxe edition, and then there are the season passes, battle passes, in-game purchases (they're not 'micro' anymore...), lootboxes, multiple in-game currencies, and the Fallout 76 mechanic of "pay for your purchased items so they don't lose the stats you bought them with"...oh, and all of this only lasts for as long as the company keeps the servers up - even though one can technically play FIFA 21 in offline mode, one is stuck with their current roster and cannot unlock additional players through gameplay.

So no, $70 isn't a problem for a complete video game, like what was being sold in the SNES era. FIFA 08 was the last release of the game to include all of the players in the box. The estimate to unlock every player in FIFA 25 is 100 million in-game coins, which cost about $50,000.
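Taking those two numbers at face value (they're this comment's estimates, not official EA pricing), the implied real-money price per coin works out like this:

```python
# Implied price per in-game coin, from the estimates quoted above
# (the comment's own figures, not official EA pricing).
total_coins = 100_000_000   # estimated coins to unlock every player
total_cost_usd = 50_000     # estimated real-money cost of those coins

usd_per_coin = total_cost_usd / total_coins
print(f"${usd_per_coin:.4f} per coin")                        # $0.0005 per coin
print(f"${usd_per_coin * 1_000_000:,.0f} per million coins")  # $500 per million coins
```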

So yes, video game prices ARE gouging players right now...we've just somehow accepted that $80 for an incomplete game is the same as $60 for a complete game, as it was before video games became casinos on the internet.

Comment Re:As an IT expert I am .... (Score 1) 132

... and always have been completely bedazzled on why MS Word even has a business case. How this piece of software could gain the market let alone survive to this very day is a mystery to me.

Because Corel doesn't aggressively market the fact that WordPerfect 1.) still exists, 2.) is less expensive, 3.) is much faster and more stable, because 4.) it's not sold as SaaS, and 5.) it can open and save Word documents natively.

Unfortunately, even if they did, too few people perceive WordPerfect as a "big name" anymore; nobody wants to be the first to shift away from Office or Google Docs and become the office that everyone hates sharing documents with, so it's a classic case of "everyone uses it because everyone uses it".

Comment And the PowerPDF Migration Continues... (Score 1) 69

Been moving PLENTY of my clients over to PowerPDF from Acrobat. $179 one-time, no AI garbage, no half-dozen services sending notification nags, and really the only function that's keeping anyone on Acrobat is the send-and-track functionality, which is admittedly a bit more polished than PowerPDF's analogue.

Seriously, Adobe as a whole is coasting on inertia at this point; nearly everything in their portfolio has viable replacements in one form or another.

Comment Re:I'm surprised (Score 1) 41

I'd probably go for a well-built Dell instead. Looks like their competition must be doing even worse if they're still selling.

The most recent crop of Latitude laptops has gone to hell. They used to be solid, boring laptops that were "everything you need, nothing you don't"...but they're doing all the MacBook crap now - soldered storage, nonreplaceable batteries, going for the svelte look that prevents decent cooling so the CPUs are clocked down, and keyboards with no travel anymore, so they're not all that great to type on. My company has been a Dell reseller for nearly 20 years, but we're getting clients E-Series and T-Series ThinkPads now because the Latitudes are picking up all the worst elements of the XPS line we avoided.

Comment Simpler Reason - Covid PCs Are Aging Out... (Score 3, Interesting) 41

People who bought laptops and PCs during the 2020 lockdowns - because they realized that trying to use their iPads for everything wasn't all it was cracked up to be - are realizing that those machines are reaching the end of their life cycle. With Copilot+ PCs from Lenovo being less expensive than their "non-AI" counterparts, combined with Windows 10 reaching its EOL in a few months, it's completely unsurprising that people are buying new computers and that they're picking inexpensive ones.

Comment Re:Digital Camera (Score 3, Interesting) 109

Kodak invented the digital camera, but its leadership feared it would cannibalize its film business so it killed it. The company would be in a different place if it had accepted the innovation, refined the digital camera and produced a product.

That's a bit oversimplified, because I don't think digital cameras killed Kodak - Instagram did.

Before the cell phone converged everything, we had cell phones, PDAs, MP3 players, and...digital still cameras. Canon, Nikon, Sony, Fuji, and Kodak were all on display at your local Circuit City and Fry's. While Canon had a bit of a vertical with their inkjet printers, Kodak actually WAS pretty innovative with their entire EasyShare platform - one could download photos to the PC and use the online EasyShare Gallery (an early Flickr with limited free space and paid tiers), or use the dock printer and pop out printed photos. It was a digital camera system so easy that grandparents the world over embraced it. While Canon and Nikon used their consumer cameras to on-ramp hobbyists to their SLR cameras, Kodak used their cameras to sell EasyShare memberships and the dye-sub printer paper cartridges.

Instagram catapulted camera phones from being "the inferior camera that is used when my phone is the only thing I have on me", to "the default camera". Phones ended up getting similar sensors to the dedicated devices, and the ability to share photos via Facebook and Instagram and MMS meant that there was no need for ANYTHING akin to the EasyShare system. The cameras weren't necessary anymore because the phone was built-in, the EasyShare Gallery wasn't necessary because Instagram was free, and fewer and fewer photos were getting printed at all, because sharing was possible both immediately and irrespective of location.

I think Kodak could have pivoted more to being a chemical company like BASF and survived, but Instagram and services like it were the ultimate evolution that Kodak simply couldn't compete with any more than Polaroid could.
