Comment Correlation does not equal Causation (Score 3, Interesting) 37

A friend of mine is extremely fortunate to have a bit more of an 'old school' environment. The family has a TV, but she doesn't let her kids use her phone. She's able to be a stay-at-home mom, supplementing the household income with baked goods, Etsy projects, and eggs from her chickens. She pays attention to her kids - not as a helicopter parent, but as a genuinely involved one - going on walks, taking them to the library, teaching them how to interact safely with the chickens, having them cook with her, teaching them arithmetic and reading, playing with them, giving them simple chores...really making it a point to focus on early childhood education. This, in turn, is evident in her kids' longer attention spans and their ability to hold discussions at levels well beyond their peers.

Something tells me that they will do far better than their peers on standardized tests...not because they had less screen time and spent their formative years staring at the wall instead, but because she's been an active parent and made it a point to make the most of the pre-kindergarten years.

She's an exception, sure...but the point generally stands - parents who just hand their kid an iPad and leave them alone are going to end up with kids focused on entertainment rather than on exploring their world and gaining understanding, and that will likely be reflected in standardized test scores to some extent.

I would also submit that one of the contributors to this problem is how basically every video game has devolved into a Skinner box and dopamine dispenser. Puzzle games exist, but it's an exhausting process to load an iPad exclusively with games that are pay-once with no IAPs. It would be interesting to see if such a thing *could* be used as part of an experimental group, where kids who only played games with traditional progression mechanics were compared to kids whose games were colorful slot machines.

Comment Re:Will they be making modern DVRs? (Score 1) 67

I can understand exiting legacy DVRs because we are no longer using analog video. Will they be making new DVRs that work with HDMI input and are able to record digital broadcasts and the output of other set-top boxes and streaming devices?

I'd be absolutely shocked if they did.

DRM has been a problem for TiVo pretty much since digital cable became a thing. TiVo managed to squeak out a longer shelf life as a DVR because they kissed the ring, but they had to jump through all kinds of hoops to do it. Among the reasons the GPLv3 exists is what Stallman called "tivoization": TiVo released the GPL'd source as required, but the hardware would only run TiVo-signed binaries, so users couldn't actually run modified versions - and those signature checks were exactly what allowed the TiVo to keep working despite the DRM.

In practice, I will concede that HDCP has been mostly seamless, most of the time, for most users...but I don't think TiVo is going to play games with HDCP licensing and enforcement just to enable timeshifting. There are issues on both sides of doing so. For one, what you're suggesting is the ability to timeshift content that is inherently on-demand. TiVo's popularity was entirely due to making it possible to timeshift linear broadcasts in a way that was far more functional than VHS, so a TiVo that accepted an HDMI signal from a Fire Stick would be mostly pointless...unless the user intended to archive a video due to concerns about takedowns, and TiVo isn't going to roll the dice on a device that would be a lightning rod for litigation. If TiVo's HDCP license got revoked, they would either have to neuter everyone's hardware, or they'd be signing up for a court case they have no chance of winning.

The alternative exists anyway: HDMI capture cards are plentiful, and there is plenty of software that can record from them. Screen-recording software is also abundant, and devices that manage to play fast and loose with DRM are readily available on the market.

So, TiVo would be caught in the middle, and the liability would be well above whatever they would make in revenue...it's certainly not worth it to them.

Comment Sad, because they had the perfect inroad (Score 1) 67

At its peak, TiVo was a household name...and in hindsight, it probably would have been possible to keep it going with a relatively small amount of effort.

For starters, they weren't all that great to their base - the folks with the lifetime subscriptions. Those were the early adopters and the enthusiasts, but they ran into issues once the lifetime subscriptions were tied to analog boxes and the cable companies stopped offering analog service...at that point, those customers had to choose between a paid-for box that could no longer record anything and buying in all over again, and at least some of them were unlikely to choose TiVo twice. It probably didn't help that some of those boxes required dial-up connections to get guide data, and they weren't exactly upgradeable to Wi-Fi.

From there, cable companies competed with first-party DVR boxes...which everyone hated because their UIs were slow and awful, but they came bundled with cable service and the cable companies actively supported and marketed them, so lots of people used them instead. Had TiVo offered an onboarding service, where TiVo would get a CableCARD on the user's behalf and send it to them preconfigured, that might have helped justify the purchase.

TiVo was already in at least some living rooms; they could have gone toe-to-toe with Roku and Amazon Fire Sticks...and I believe the later models *did* allow Netflix streaming and such, but it was too little, too late, and TiVo didn't have retail presence the way Roku did. Sure, Roku would have won on price, but TiVo likely could have leveraged its brand recognition.

Finally, the nail in the coffin for TiVo was that they didn't lobby to retain the legal mandate for CableCARDs. I had a SiliconDust HDHomeRun Prime, and I loved it for the time I could use it, but my cable company wouldn't give me a CableCARD when I changed my service, because they no longer had to.

It's completely unsurprising that the pioneer in the space lost their spot, but they absolutely could have owned the market with a bit better marketing and availability.

Comment Re:I forsee... (Score 1) 81

Lots and lots of cheap VPSes offering cloud servers and storage from all of these soon-to-be-useless AI datacenters.

That may be true to a certain extent, but I submit two counterbalances to this:

1. VPS workloads aren't usually helped by GPU acceleration; even if DigitalOcean or Hetzner got these datacenters in a fire sale, they'd still be sitting on a pile of GPUs they would either have to leverage or sell.

2. While VPS companies will undoubtedly continue to see their workloads increase over time, if the AI tech bubble bursts, they are likely to end up on the receiving end of that at some level, too. I'm sure they aren't going anywhere, but even if revenue is only down 10%, it's going to be a tough sell to get leadership to consider an investment of that size when it's unclear whether next year will *also* bring a 10% loss of revenue.

Comment Poor James (Score 4, Insightful) 106

James Strawn, who was laid off from Adobe over the summer after 25 years as a senior software quality-assurance engineer.

I can only assume that for the past decade, James has been either ignored or terrible at his job. Every Adobe product has gotten progressively worse to use, the forums are filled with bug reports that get ignored release after release, and the increases in system requirements don't reflect improvements in functionality.

Whether because Adobe didn't like what he had to say, or they decided not to listen to him, it's completely unsurprising that he lost his job.

The folks offering $500K/year for AI experts aren't going to take anyone who simply makes the claim on a resume; they're almost guaranteed to be looking to poach someone from OpenAI or Google. Practically speaking, they're looking to benefit from the experience those companies paid for...and James doesn't have it.

On the upside, odds are pretty good that James will have a job in short order, helping to deal with the fallout of 'vibe coders' who don't know how to do real-world testing. He's probably going to run into some combination of age discrimination and salary discrimination (no way he's working for $60K with 25 years at Adobe), but once the messes start being too big to ignore, I'm pretty sure he'll be able to become a project manager who helps direct fixes for deployed code that never got actual QA. The need is most definitely there; it'll just take a bit more time to prove to the brass that he's more valuable to the company than the MBAs who are looking at their now-spherical product for more corners to cut.

Comment Re:What an amusing coincidence! (Score 3, Insightful) 35

We thought we would save money by going a-la-carte. Now it's more expensive.

Well, I think we did 'simple math' when we thought that, rather than 'real-world math'. If my cable bill is $150/month for 100 channels, that's $1.50/channel. Since I only watch maybe 20 of them at most, 20 x $1.50 = $30/month for the 20 channels I watch. Who wouldn't want that?

The problem is that channel costs don't divide evenly. ESPN is very expensive ($8 or more of one's cable bill goes to just that channel), while Home Shopping Network and QVC have historically paid the cable companies for inclusion in the lineup. Public Access stations are legally mandated in many jurisdictions, and their content is paid for by whoever submits it for broadcast...and again, roughly nobody would include them in a custom lineup.

Finally, there *is* a baseline cost for last-mile distribution. Whether it carries 1 channel or 1,000 channels, the infrastructure needs to exist. I'm not making any excuses for Comcast here, but someone needs to pay the townships for right-of-way on the wire runs, the backend equipment costs money, and the staff to service it and the staff to answer the phones for CSR requests all cost something. Streaming services generally get away with chat-only support and don't have any wires to run.

So...while I'm not calling cable a good deal by any stretch, I *am* at least acknowledging that a-la-carte math would be something closer to $30 infrastructure + $10/month ESPN + probably-something-closer-to-$3/channel for the ones with actual-content, making that beautiful $30/month bill something closer to $80/month in practice.
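For the curious, here's the back-of-the-envelope version of both calculations; every dollar figure is a rough guess from above, not an actual carriage fee:

```python
# A-la-carte arithmetic; all figures are illustrative guesses.
bill, total_channels, watched = 150, 100, 20

# "Simple math": divide the bill evenly across every channel.
naive = watched * (bill / total_channels)
print(f"naive estimate: ${naive:.2f}/month")  # $30.00

# "Real-world math": fixed infrastructure, one premium sports channel,
# and ~$3/channel for the ones with actual content.
infrastructure, espn, per_channel = 30, 10, 3
for full_fee_channels in (13, watched - 1):
    realistic = infrastructure + espn + full_fee_channels * per_channel
    print(f"realistic, {full_fee_channels} full-fee channels: ${realistic}/month")
# prints $79 and $97 -- i.e. the "closer to $80" ballpark, depending on
# how many of the 20 channels actually charge full freight.
```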

Personally, I think that there *does* need to be some sort of court case that uses United States v. Paramount Pictures, Inc. (1948) as precedent to decouple distribution from production, which would probably do more to solve the cost issues than anything else.

Comment Re:Used cars... (Score 1) 62

an 8 year old car.

"certified for windows Vista" .

So, please forgive the pedantry - though it might actually be more to your point - but the math ain't mathing here.

General Availability for Windows Vista was in January of 2007, which was 18 years ago. Its EOL date was April 2017, 8 years ago.

If the car is really eight years old, even if it was the very first one off the line for its model year, it would STILL be newer than Vista's EOL date. Eight years ago, Windows 10 had already been out for two years.
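A quick sanity check of the timeline (the "today" below is my assumption about when this was written; the GA and EOL dates are the published ones):

```python
from datetime import date

today = date(2025, 6, 1)        # assumption: roughly "now"
vista_ga = date(2007, 1, 30)    # Windows Vista general availability
vista_eol = date(2017, 4, 11)   # end of Vista extended support
win10_ga = date(2015, 7, 29)    # Windows 10 general availability

years_ago = lambda d: (today - d).days / 365.25
print(f"Vista GA:  {years_ago(vista_ga):.1f} years ago")   # ~18.3
print(f"Vista EOL: {years_ago(vista_eol):.1f} years ago")  # ~8.1

car_built = today.year - 8  # an 8-year-old car -> built around 2017
print(f"Built ~{car_built}, when Windows 10 had been out "
      f"{car_built - win10_ga.year} years.")               # 2 years
```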

Half of me wants to believe the story that the car did, in fact, roll off the assembly line with a splash screen reflecting an already-EOL operating system. The other half is wondering if perhaps the car is older than eight years old.

Comment Re: "It might be tempting to blame technology... (Score 5, Insightful) 109

Yeah, there was never an irresponsible young person before this generation.

Way to completely miss the point...

It's not that this never happened, or even that it was some extremely exceptional case in the millennial/GenX/boomer eras...but the circumstances were extremely different.

For starters, I think that there are two sides that both have valid points and problematic extremes. In the days of yore, companies generally saw employees as an investment. It wasn't uncommon for people to work at the same company for 25 years or more. Certainly not every company, but the majority of them accepted the tradeoff that new employees would require a good amount of training before the company broke even on their paycheck, and the employee who stayed would earn the company a tidy profit over time while receiving enough money to support a family. Today, it seems that everyone wants to 'skip to the end' - companies want to hire savants with 20 years of experience in Debian Trixie, make them always on-call, pay them less than half of a living wage, and keep the fear of layoffs at the front of everyone's mind.

The counterbalance is that Gen-Z knows this is the case, and treats employers with the same sense of expendability. If a company sees an employee as someone not worth investing in, with little ability to get promoted, and with both promotions and layoffs disconnected from job performance or dedication, it's not unreasonable to adopt the mindset of "if they're doing what's best for them, then I'll do what's best for me"...and it's only a few more steps from there to "I'll work when I feel like it", which shows up as the willingness to no-call/no-show, the doomscrolling at work, and the need to be 'helicopter managed'.

This leads us to the downward spiral - with both sides leaving themselves vulnerable to exploitation if loyalty and dedication are expressed, it is only natural to reduce that vulnerability moving forward. The result, however, is both sides getting worse...it's a Prisoner's Dilemma, and neither side has incentive to break the cycle.
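To make the Prisoner's Dilemma framing concrete, here's a toy payoff matrix - the numbers are invented purely to illustrate the structure, not measured from anything:

```python
# Hypothetical payoffs for the employer/employee standoff described above.
# (employer move, employee move) -> (employer payoff, employee payoff)
payoffs = {
    ("invest", "commit"): (3, 3),   # training + tenure: both profit
    ("invest", "coast"):  (0, 5),   # company trains, worker walks
    ("squeeze", "commit"): (5, 0),  # loyal worker, exploited
    ("squeeze", "coast"):  (1, 1),  # the current equilibrium
}

for (employer, employee), (e, w) in payoffs.items():
    print(f"{employer:7}/{employee:6} -> employer {e}, employee {w}")

# Whatever the employee does, the employer scores higher by squeezing
# (5 > 3 and 1 > 0), and vice versa for the employee -- so both defect
# and land on (1, 1), even though (3, 3) was available. That's the
# downward spiral in game-theory terms.
```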

Comment Re:What's the difference between tablet and phone? (Score 3, Interesting) 122

Back when the iPhone was introduced I was convinced that within 10 years computing would be mostly done this way; connecting your portable computer (smart phone) to a dock that turned it into your home computer. I'm surprised that this idea never gained traction.

I think there have been a few reasons for this.

I think the biggest one is that nobody could meaningfully agree on a form factor. Now, *I* always thought that a great option would be a 'zombie laptop' with a keyboard, trackpad, webcam, and battery, plus a slot to slide your phone into. The phone would connect to the peripherals, giving you a 12" screen and a keyboard while charging in the process.

The devil, of course, was in the details. Even if Apple made such a device and molded it to the iPhone, the problem becomes that a user couldn't put their phone in a case, or it wouldn't fit in the clamshell's phone slot. There would also need to be adapters to fit the different-sized phones, or entirely different SKUs with per-device slots, which then pigeonholes Apple into a particular physical form factor. That begets the "use a C-to-C cable" option, which is better, but ergonomically annoying if one isn't sitting at a desk. A wireless option solves both of these problems, but kills both batteries in the process. Finally, there's the price point: the cost for the end user would need to be low enough that it doesn't just make more sense to have two devices, AND the first-gen owners would likely feel some kind of way if they were stuck with their old phone because upgrading meant buying a new clamshell. It works well on paper, but pretty much any real-world testing would show the shortcomings quickly.

Suppose that were solved somehow...while the Samsung Fold phones are helping justify the time spent adding a multi-window interface to Android, try installing Android-x86 in a VM for a bit and watch what happens. It's been a while since I tried, but the experience was pretty bad - the inability to open e-mails in new windows was particularly infuriating; many apps take exception to having multiple concurrent instances for side-by-side usage, and window focus gets pretty tricky to navigate. It *can* be done, but it ultimately felt like all compromise, no improvement.

Finally, there *is* such a thing, at least to an extent. Many, MANY apps are just frontends for a website. iCloud is like this, the whole Google ecosystem is like this, Salesforce is like this...for a solid number of apps, there is a browser-based frontend that works just as well, if not better in at least some cases, and data is commonly synced through Google or iCloud or Dropbox. The number of apps that are worth running on a phone, that have no desktop or browser analogue, and that would justify a user buying a clamshell just to run them in a larger window...is small enough that it is seldom worth dealing with all of the *other* compromises involved.

Comment Wow... (Score 5, Informative) 89

they are constraining what you can do using the software they provide with said hardware

It has been a VERY long time since I've seen such a textbook definition of the phrase "a distinction without a difference".

On an Intel x86 PC, even the most locked-down iterations of Windows give users a means of running whatever code they want. If users don't want to run Windows at all, they can download an ISO of Ubuntu or Fedora or Proxmox or VMware or GhostBSD or Haiku, make a menu change in the BIOS, and install that OS instead. Done and done. Windows can be replaced in 30 minutes or less, with nothing but GUI tools and YouTube tutorials that are nearly universally accurate (admittedly with slight variations on where to disable Secure Boot in the BIOS).

On an Android phone, one must unlock the bootloader (which some phones prevent through artificial constraints), then hope that some Good Samaritan has built a different OS for that exact model...and then go through 101 steps involving CLIs, recovery environments, and ADB interfaces. AND those steps and software downloads vary with each model of phone, AND Google gives app developers a means of telling users "sorry, I won't run on a phone you have control over", AND all of that assumes a replacement OS is available in the first place...otherwise, the user needs to replace the phone, or go all the way to compiling AOSP themselves, which is its own rabbit hole.
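For contrast with the 30-minute PC flow above, here's a rough sketch of the generic unlock-and-flash dance, wrapped in Python purely for illustration - the exact fastboot subcommands vary by vendor, some devices need an OEM unlock token first, and the image name is hypothetical:

```python
import subprocess

def run(*cmd):
    """Echo and execute one step of the flashing dance."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

run("adb", "reboot", "bootloader")                   # drop into the bootloader
run("fastboot", "flashing", "unlock")                # varies by vendor; wipes the phone
run("fastboot", "flash", "boot", "custom-boot.img")  # hypothetical image name
run("fastboot", "reboot")
# ...and that's before the recovery-environment steps most ROMs require.
```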

So yeah, the argument rings incredibly hollow: "we're not constraining what the hardware can do...but we ARE constraining what the software can do AND constraining your ability to replace that software if you so choose." If the argument is that the constraints are purely related to software, then Google needs to put way more effort into streamlining users' ability to replace whatever software those constraints are implemented to protect. If they aren't going to do that, then they are being disingenuous.

If, in a court of law, they cannot produce documentation of the means by which the hardware can be used to run unapproved code, then in the current climate I would deem this statement perjury.

Comment Re:Don't get it (Score 1) 155

You know why I don't drink alcohol? In part, because of the high cost. Why the fuck would I pay $5-$10 for a small glass of liquid?

$5-$10?! What a bargain! I'd probably order a mixed drink with my meal if it was still only $10.

Most of the restaurants I've been to in the past year have bumped the cost of cocktails to $15-$20 for standard stuff. I might accept it for drinks with several different ingredients, but even a rum-and-coke goes for $15 around me - and I'm not in NYC or LA. One brunch spot I went to had a mimosa flight that was champagne and four different fruit juices, each in MAYBE 6 oz glasses, and they wanted $40 for it.

Not to be outdone, the last time I *did* visit NYC, I went to a restaurant that made mocktails...and while my blueberry mint lemonade was indeed delicious, it was NOT worth the $10 my friend paid for it.

So yeah, I definitely share your sentiment - a nontrivial portion of the reason people aren't drinking alcohol at restaurants is the pretty significant cost of doing so...and I'm pretty sure those higher-priced drinks *also* have less alcohol in them than they did ten years ago. Given this, it makes way more sense to get a 1L bottle for $20-$50 that I can use to make my own drinks for a month...and as much as restaurants have always used bar drinks as a source of high-margin revenue, it's just not justifiable anymore to spend that kind of money on a single drink.

Comment Re:It's a weird Puritan Christian thing (Score 1) 175

It's more likely just based on a really piss-poor understanding of STIs in biblical times. People understood that if you did a bunch of sleeping around, you'd likely fall ill. They didn't understand what caused it, so it was just "punishment from God".

I'm not sure I entirely buy this. Even as recently as the 1960s, the incidence of STIs was somewhere around 1:35, and that figure includes all of the STIs that made the jump from animals to humans in the centuries after the time of Christ (or Paul, who was more outspoken about it, or Moses, if you're going that far back). Yes, STIs undoubtedly happened, but with a less-than-3% chance of getting one, a person had to be either "well traveled" or extremely unfortunate to contract an STI in that era. We also have to discount the asymptomatic STIs; this line of reasoning might hold water for STIs that cause visible scarring in genital regions, but not all STIs fit that description. The clearest symptom of chlamydia is difficulty in conception and birthing, yet throughout history there were many, many virgins-on-their-wedding-night who had trouble giving birth.

I think there were other, more practical reasons for this system. For starters, a faithful, single-partner wife ensured paternity in a time prior to DNA testing. In addition, a virgin woman was deemed more desirable by men and was able to attract more desirable suitors. Promiscuity after marriage was a poor social reflection on the husband as well.

Even though we now understand what actually causes sexually transmitted diseases and ways to reduce the likelihood of their spread, some people still cling to the whole "God doesn't like it" thing for the usual reasons that people still believe in aspects of religion which don't stand up to logical scrutiny.

Well, the Mosaic law managed to outlast the Assyrians, the Babylonians, the Persians, the ancient Egyptians, the ancient Greeks, the Romans, and the dozens of "-ites" listed in the Old Testament narrative. I'm not saying that presenting the Pentateuch to Congress to put it into law in 2025, enforced with police and military, would benefit modern society, but I *am* saying that its staying power reflects some sort of societal benefit. I submit for your consideration that even if one disagrees with the mandates present in the Mosaic law, it might be overly reductive to assume the rules came into existence out of concern for relatively rare STIs.
