Comment Subscriptions != Forever (Score 1) 177

Never let anyone tell you subscriptions mean forever. Not five questions earlier in the interview, she mentions they're leaving the smart home market and will support the existing products for 'some time'. They will support the products only as long as it makes financial sense - long enough to clear stock and head off a viable lawsuit. They do not care if the hardware becomes a brick, especially if they can sell you a new one. 'I'm sorry sir, we only offer updates for the Logitech Cloud Mouse v3 and higher. Also, your subscription only covers it on Windows 2026, not Chrome OS or Windows 2027, which is what we've detected it on, so you'll be billed retroactively.'

Comment Re:Slightly insulting (Score 1) 133

Except... that's an oversimplified view of it.
Look at a full list of 802.11 standards and amendments.
Yes, your average consumer knows their alphabet, and can probably figure out that 802.11ac is better than 802.11n. But it isn't clear or concise, and the other IEEE 802.11 standards could get in the way. 'Oh, I heard about 802.11ad, WiGig - isn't that faster and newer?'
If these versions functionally act as the yearly rollup meta-standards as well (for example, IEEE 802.11-2016 rolls up ae, aa, ad, ac, and af), then this makes a lot of sense.
Also, throw in 802.11bb - light-based wireless data communication, aka LiFi - and it breaks the 'later letters are better and backwards-compatible' scheme entirely.

Comment Re:Do people actually want (Score 1) 29

In my house, yes. We're actually up to a handful of Google Homes at this point.
  • Direct control of Philips Hue lights and Wemo/TPLink Kasa outlets is surprisingly handy.
  • Nest thermostat control is nice.
  • Streaming music from Google Play Music on demand is pretty nice.
  • The alarm/timer functions are nice when cooking.
  • My wife (a SW tester) asks it the weather in the morning so she knows whether or not to take a jacket.
  • Adding things to the shopping list is easy.

I'm also running Home Assistant (https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.home-assistant.io%2F) to try to beef it up. Some of the projects I'd like to tackle:

  • When the washer or dryer finishes, announce it in rooms that have lights on (a rough sketch of this one is below).
  • Announce package deliveries via the email notifications (HA has UPS and FedEx hooks that work, but the USPS one is only web-scraping, and apparently they DO ban...)
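
For that first one, here's a minimal sketch of how I'd do it with AppDaemon (Home Assistant's Python app layer). Everything specific - the entity IDs, the power threshold, the light-to-speaker mapping, and the Google Translate TTS service - is a placeholder for whatever your own setup actually exposes:

```python
# Hypothetical AppDaemon app - entity IDs, threshold, and TTS service are placeholders.
import appdaemon.plugins.hass.hassapi as hass

WASHER_POWER = "sensor.washer_power"              # assumed smart-plug power sensor (watts)
LIGHT_TO_SPEAKER = {                              # assumed room light -> speaker mapping
    "light.kitchen": "media_player.kitchen_home",
    "light.living_room": "media_player.living_room_home",
}
IDLE_WATTS = 5.0                                  # below this, assume the cycle has ended


class LaundryAnnouncer(hass.Hass):
    def initialize(self):
        # Run the callback every time the washer's power draw changes.
        self.listen_state(self.power_changed, WASHER_POWER)

    def power_changed(self, entity, attribute, old, new, kwargs):
        try:
            old_w, new_w = float(old), float(new)
        except (TypeError, ValueError):
            return  # sensor was unavailable/unknown
        # Crude "cycle finished" heuristic: power falls from active to idle.
        if old_w > IDLE_WATTS and new_w <= IDLE_WATTS:
            for light, speaker in LIGHT_TO_SPEAKER.items():
                # Only announce in rooms where a light is currently on.
                if self.get_state(light) == "on":
                    self.call_service(
                        "tts/google_translate_say",
                        entity_id=speaker,
                        message="The washer is finished.",
                    )
```

A smarter version would debounce the power reading (washers pause mid-cycle), but the listen_state/call_service shape is the whole trick.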

Comment Re:Stopped reading (Score 5, Interesting) 350

I think part of the problem is that this is a complex issue that's really hard to boil down to bullet points. Let's see:
  • USB-C connectors are a huge upgrade w/r/t size/sturdiness/reversibility (nice)
  • The USB-C connector was so good that it was rushed to market (bad)
  • USB-C connectors don't tell you anything else about the device or cables (bad)
  • USB-C makes no modern minimum-speed guarantee - ie, there are USB-C devices with only USB 2.0 signaling, like the Nexus 6P (bad)
  • USB 3.1+ 'Alternate Modes' are confusing because you can't tell from the port which, if any, are supported at all (bad)
  • USB charging standards are mediocre. At least you can theoretically charge a phone with a charger you haven't researched. (bad)
  • Quick Charge was a great interim step by one large manufacturer but needs to die because it's not open at all (bad)
  • USB PD seems to be hard for manufacturers to get right, and as such there's risk - see the Nathan K. / Google stuff (bad)
  • Part of the problem is that the 3.1+ chipsets are still immature, and we were only just getting USB 3.0 right (bad)
  • Losing the audio jack to USB-C may be more bad PR than a frequent actual inconvenience, but either way it's (bad)

So, except for that first bullet point, we are in the worst USB timeline. Still, even as bullet points, it's describing a mess.

Comment Re:Microsoft's Actual Logic (Score 1) 419

I absolutely agree with this. Yes, it's updates in this case, not drivers. But the number of edge cases to test and the cost to do so is substantial, and the number of affected users is relatively small. Meanwhile, this means they don't have to maintain oddly forked backports of patches just to cover the new processors.

Comment Part of the problem will self-correct... (Score 1) 184

Right now, I'd say a substantial part of the problem is insurance protection against cyber attacks.

If a company can go to a bog-standard insurance company like Travelers or AIG and spend a small fraction of both the real breach cost and the cost of actually securing things, they will - the profit motive demands it.

What the profit motive DOESN'T demand is that the insurance company turn a blind eye to its own costs. Right now, I'm sure a large number of those policies go untriggered, so in aggregate they are still profitable. But when premiums rise to be comparable to the cost of actually securing things, and a company also factors in the lost productivity and PR issues (both of which are hard to quantify), companies will actually secure things - partly to save money on, or qualify for, their cyber insurance.

That's part of why news coverage of breaches and forced-disclosure laws are so important - right now, to both businesses and insurers, the productivity and PR costs are too easy to ignore, and the insurer has little motive to force compliance. (In fact, it's theoretically more profitable for the insurer to 'prove' to its customers that attacks happen and that no amount of tightening will prevent them all - both of which are absolutely true no matter what happens.)

Comment Re:Goodbye Windows. (Score 1) 585

This! Mind you, it's a bit worse than that - ie, Intel won't make a signed driver package that would allow Kaby Lake to work on Windows 7/8/8.1, because Microsoft will not sign new drivers for those OSes.

But let's play devil's advocate for a second - is this just Microsoft pushing Windows 10 for the sake of Windows 10? Or is it because the driver model has changed since Win7/8, so supporting all of those platforms is a higher cost for driver makers (who, by definition, would have to spend more or split quality/features to cover them all)? Is it because Microsoft won't sign a driver unless it goes through the whole WHQL certification process (to ensure it's a clean build, it's stable, it's malware-free, etc.), and can't financially justify keeping the WHQL pipeline running for Win7/8 given their current popularity and general downward trend? (Since they no longer sell or support Win7/8, Win10 doesn't necessarily have to grow, but Win7/8's installed base will drop through natural attrition.)

But there's nothing here stopping manufacturers from making Linux or Mac drivers, nothing here preventing third-party, open-source drivers (albeit requiring users to allow unsigned drivers and the inherent security risks), and nothing here about Microsoft artificially pushing Win10 for the sole sake of pushing Win10.

Comment Re:Latency (Score 5, Informative) 159

This!

Even with a stable framerate, this technique intentionally delays the next frame to add compensation frames.

As an example, let's take a magic VR helmet running at 120Hz with instant processing (ie, 0ms GTG time, which doesn't exist) and a video card capped at a perfectly stable 30 FPS (aka 30Hz).

We will split a second into ticks - we'll use the VR helmet's frequency of 120 Hz, so we have 120 ticks, numbered 1 to 120. (Just to annoy my fellow programmers!)

We therefore get a new actual frame every 4th tick - 1st, 5th, 9th, etc.

Without motion compensation, we would display a new frame every 4th tick - 1st, 5th, 9th, etc.
With ideal (instant) motion compensation, we can't compute a transition frame until we have the new frame. So we could, theoretically, go real frame #1 on 1st tick, computed frame based on #1 and #2 on 5th tick, real frame #2 on 6th tick, computed frame based on #2 and #3 at 9th tick, etc.

This would also be jerky - two frames of motion then three at rest? We could instead push the real frames back by one source-frame interval and fill it with three compensation frames, but then we increase the delay, which in any real system is already higher than in this example (and it compounds with every other delay in the pipeline). So we'd have frame #1 at the 5th tick, computed frames at the 6th/7th/8th, frame #2 at the 9th tick, etc. You've now introduced a minimum 4-tick delay, which at 120Hz is 1/30 of a second, or 33ms - in an otherwise impossibly-perfect system!
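
To make that arithmetic concrete, here's a toy Python calculation using the same idealized numbers (120Hz headset, perfectly stable 30 FPS source, three interpolated frames, zero processing time) - nothing about it reflects a real pipeline:

```python
# Toy model of the idealized example above.
HEADSET_HZ = 120
SOURCE_FPS = 30
TICKS_PER_FRAME = HEADSET_HZ // SOURCE_FPS      # 4 display ticks between real frames

def display_tick_no_comp(n):
    # Without compensation, real frame n is shown the tick it arrives: 1, 5, 9, ...
    return 1 + (n - 1) * TICKS_PER_FRAME

def display_tick_with_comp(n):
    # With smooth compensation, each real frame is held back one source interval
    # so three interpolated frames can be shown ahead of it.
    return display_tick_no_comp(n) + TICKS_PER_FRAME

added_ticks = display_tick_with_comp(1) - display_tick_no_comp(1)
added_ms = added_ticks / HEADSET_HZ * 1000
print(f"Added delay: {added_ticks} ticks = {added_ms:.1f} ms")  # 4 ticks = 33.3 ms
```

Same 33ms the comment arrives at by hand, before adding GTG time, the cost of computing the in-between frames, or game-side input latency on top.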

What about using historical frames instead, to PREDICT the next frame? Well, then, when something in-game (or, really, on screen) changes velocity, there would be mis-compensation: overcompensating if the object slows, undercompensating if it speeds up, and compensating in the wrong direction if it changes direction.

There are more problems, too:
- This doesn't help when the video card frameskips/dips.
- Instant GTG and instant motion frame computation do not exist. At best, they're sub-tick, but you'd still operate on the tick.
- Input delay already exists for game processing, etc.
- The perceived severity of added input delay grows far faster than the delay itself. For example, 1-2ms between keypress and onscreen action? Hardly noticeable. 50ms just to start registering a motion on screen and course-correct? Maybe OK, maybe annoying. 150-200ms? Brutal.
