Makes me think of Professor Frink (Score 2)
I predict that within one hundred years computers will be twice as powerful, ten thousand times larger, and so expensive that only the five richest kings of Europe will own them.
I was figuring I'd upgrade my W10 box after more W11 bugs were shaken out and MS retreated on some of the user-unfriendly decisions. But from everything I've heard, the 24H2 release was a disaster and is still broken for many users today. The number and severity of known issues with 24H2, and how many of them persist half a year later, seems unprecedented for a feature update in the W10-11 era.
On top of that, MS seems to be piling on more user-unfriendly decisions (such as going to considerably greater lengths to force a Microsoft account login).
I've taken steps to keep 24H2 from installing on my W11 machine and I am just hoping the picture improves before the W10 EOL date this fall.
Percussive sounds have always been a strong point for Opus relative to other codecs, avoiding problems with pre-echo etc, while sparse pure tones, as in e.g. glockenspiel solos, were something Xiph had to work at doing better with.
Part of that is simply due to the nature of the short-time Fourier transform. Since it was designed first as a VOIP codec, prioritizing low latency, Opus uses short transform windows, while most other music-capable codecs use long ones. This results in Opus having naturally better temporal resolution, while other codecs have naturally better frequency resolution. That's a Gabor limit/ Uncertainty Principle type of deal. Opus includes extra tricks to improve its performance on tonal content, and some of that includes boosting VBR bitrate; other codecs take corresponding measures to try to improve their performance on transients.
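To put rough numbers on that tradeoff, here's a sketch of the basic STFT math. The window sizes below are illustrative round numbers, not the actual Opus/CELT frame sizes:

```python
# Illustrative time/frequency resolution tradeoff for transform windows.
# This is generic short-time-Fourier-transform arithmetic; the window
# lengths are assumed examples, not real codec internals.

SAMPLE_RATE = 48_000  # Hz

def stft_resolution(window_samples: int) -> tuple[float, float]:
    """Return (time resolution in ms, frequency bin spacing in Hz)."""
    time_ms = 1000 * window_samples / SAMPLE_RATE
    freq_hz = SAMPLE_RATE / window_samples
    return time_ms, freq_hz

# A short "low-latency VOIP" window vs. a long "music codec" window:
short = stft_resolution(240)    # 5 ms window
long_ = stft_resolution(2048)   # ~42.7 ms window

print(short)  # (5.0, 200.0)  -> localizes transients, coarse tone bins
print(long_)  # roughly (42.7, 23.4) -> fine tone bins, smears transients
```

Shrinking one resolution necessarily grows the other, since their product is fixed by the window length. That's the Gabor limit in action.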
So while it may be worth encoding whatever cymbal-heavy tracks you have in mind and doing a blind listening test, I think it's likely the cymbals have been encoded pretty well even by pre-1.0 versions of Opus, which are now over twelve years old.
You have got to be kidding me. Which is it: are you an ignoramus or a shill?
Though there were certainly exceptions, during much of Jiang Zemin's and Hu Jintao's administrations, China seemed to be moving towards more personal freedom and less oppression of groups not closely aligned with the Party and Han supremacy. These leaders and their premiers also seemed less interested in consolidating autocratic power for themselves. In foreign policy it seemed quite possible that China's development would be a global net boon.
Xi Jinping's administration has gone far, far backwards. Xi is the most autocratic paramount leader since Mao, and has gone to extreme lengths to promote a cult of personality and to consolidate power. He has cracked down on personal freedoms in countless ways. Hong Kong has been brutally repressed and all the promises of "one country, two systems" have been blown to smithereens. His administration has perpetrated cultural genocide against the Uighurs. There has been a vast increase in China's sabre-rattling and threats against Taiwan, in harassment of Chinese dissidents living in other countries, in the Ministry of Public Security operating clandestine "police stations" in other countries, and in all kinds of other underhanded international behavior.
Sometimes it seems that the only hope now of avoiding a world war is to hope that Xi, who is 70, passes away before he can get around to starting one.
The Economist is not "group think echo chamber tripe," it is one of the most credible journalistic organizations in the world, and their take on China is a million times more well-reasoned than your ridiculous claims.
Suppose there's a serious disaster capable of destroying tons of infrastructure - like Katrina or the 1906 SF earthquake. Afterwards, you need to get crucial disaster relief information to a million survivors.
To get that information to everyone by radio, you need a power source and a well-placed broadcast transmitter.
To get that information to everyone by cellular internet, you need...
OK, I don't know cell networks well enough to be sure what's required. But the infrastructure we normally use in a large city involves thousands of access points, thousands of miles of fiber backhaul, and data centers. Wireless backhaul is real and can help with some kinds of emergencies, but how would that kind of thing scale to metro area size?
Broadcasting information rather than unicast is just vastly better for emergencies.
(And ultimately, for any time when significant numbers of people should receive the same data, but we've grown so accustomed to bandwidth being nearly free that we just ignore that, until we can't because the infrastructure we've built our expectations around is no longer in place.)
It seems to me that, for emergency preparedness reasons, regulators should be looking skeptically at the tendency to shave a couple cents here and there by getting rid of FM radio capability in cell phones and by reducing the ability of cars to receive radio. (I also think we should plan for a long term digital radio transition aimed at increasing reliability.)
Amen.
Bunch of smartasses here critiquing the survey questions because they can avoid email tracking by using Mutt and avoid smart TV tracking by putting it in a Faraday cage.
I exaggerate - I myself configure my primary webmail account not to load external images - but it's the default/encouraged behavior that people need to be aware of. If you didn't know that smart TVs can track you to serve ads, you wouldn't have the chance to make an informed decision about the privacy tradeoff of connecting one to your network. The fact that these choices and tradeoffs exist isn't a refutation of the survey questions; it's the point of the survey.
I got 15/17 myself. I've heard enough concerns about Ring that I made the pessimistic assumption about their privacy policy. And while I knew states would certainly require notification on breaches of some personally identifying information (SSN, credit card, etc), I didn't believe it'd be required for breaches of all personally identifying information (IP? User agent???).
In both cases I was more pessimistic about the privacy situation than the 'correct' answer. Many of us here at Slashdot probably lean the same way.
IRL the only people I know who understand the privacy situation are programmers. I do know one other person who's actually too paranoid about Google (thinks it's listening to their smartphone mic to decide what ads to serve - no, it's just that your browsing habits make you predictable enough) but who isn't paranoid enough about Facebook (if you think storing a document on Google Docs is an unacceptable privacy risk, you probably shouldn't have a Facebook account at all).
I'm fully in agreement with the study's conclusions: "informed consent at scale is a myth," and policymakers should be doing more to protect privacy by restricting some ways companies collect and use data and, where trading away privacy should still be an option, make the nature of tradeoffs clear to the public.
One other area where fuels have their place: portable heat.
Yes, turning combustion into work is an inherently inefficient process. When you say 80% of the energy in the fuel is wasted, well, it's getting thrown away as waste heat. But when heat is what you want, fuel can be efficient.
Electricity is great for doing useful work, but resistive heating is wasteful -- you're throwing away wonderful, usable, zero-entropy electricity by just turning it into max-entropy heat. In moderately cold weather, electric cars can do more than 4 times better (coefficient of performance >4) by using an air source heat pump. But in seriously cold weather that doesn't work so well (COP <2), and it takes a huge amount of energy to keep the car components and the cabin at workable temperatures.
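A rough comparison shows how much the COP matters for range. The heat load and trip length are assumed round numbers, not measurements from any particular car:

```python
# Battery energy needed to keep a cabin warm for a one-hour drive,
# under resistive heating vs. a heat pump at various COPs.
# The 3 kW heat demand is an illustrative assumption.

HEAT_DEMAND_KW = 3.0  # assumed steady cabin heat load
HOURS = 1.0

def battery_kwh(cop: float) -> float:
    """Battery energy drawn to deliver the heat at a given COP."""
    return HEAT_DEMAND_KW * HOURS / cop

print(battery_kwh(1.0))  # resistive heating:        3.0 kWh
print(battery_kwh(4.0))  # mild-weather heat pump:   0.75 kWh
print(battery_kwh(1.5))  # deep-cold heat pump:      2.0 kWh
```

In deep cold the heat pump's advantage mostly evaporates, and that battery draw comes straight out of driving range.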
We've largely ignored this up to now, and electric car adoption has been largely driven by people in warm areas. If we want to reduce fossil fuel use in seriously frigid parts of the world, we should address this.
Here's the 8-year-old story of someone who installed a little diesel heater in his electric car. The heater nearly doubled his winter driving range, and uses so little diesel that he gets 380 MPG.
We ought to encourage electric car manufacturers to make some kind of fuel-burning option available for extreme cold. Long term, maybe that's hydrogen or some kind of synthetic fuel, but allowing fossil fuel heaters right now could actually dramatically reduce total fossil fuel use, by driving electric car adoption in cold areas.
Perhaps it might also make sense to say "well, if we're burning fuel in freezing temperatures anyway, we might as well turn that burner into a generator providing both heat and electricity, making the car a bit of a series plug-in hybrid when it gets seriously cold."
No, the MS patent is not the real issue here.
(The link goes to Reddit rather than directly to the MS statement because Duda, the inventor of ANS, responds there directly to your concern and gives his own interpretation of the MS response, which carries a little more weight than just the Polish article plus machine translation.)
There is no evidence for learning styles. There is LOTS of research on this point and it is well confirmed that this is just a myth.
Do we need more approaches to teaching programming? Sure.
One approach I wish we'd find ways to pursue more: in the past, people devised 'soft introductions' via applications that pushed their users to gradually pick up programming concepts - command line shells, spreadsheets, RPN/RPL on HP calculators, word processor macros, HyperCard, etc. There's been rather little innovation in that space in the last twenty-five years, and an ever-widening gulf between what interfaces invite regular users to do and anything that resembles programming.
There are plenty of other approaches. But starting with "learning styles" as a primary target is clearly barking up the wrong tree.
we are living in a world with more new content than ever before
Sure, but a higher proportion of it is miserable schlock than ever before. Decent writing seems rarer and rarer.
Meanwhile, if you watch classics, you can choose to watch only those things that were good enough to stand the test of time.
I signed up for Netflix back in 2012 or so, but now that they've settled into a pattern of hiking the price every year, I'm going to unsubscribe within a month or two.
$8 a month for access to much of the best of what's been produced in the last 90 years was an interesting value. $16 a month for worthless "new scripted" tripe and Bollywood stuff is hard to justify.
The extreme spread of Covid to rural and suburban areas shows that density has very little to do with this.
This is simply wrong. Yes, the timing was due to international connectivity rather than density, and yes, the rural and suburban USA have been hit hard in this pandemic. But normal social interaction patterns in a dense city are certainly more naturally conducive to the spread of infection.
People have long pointed to urban centers as primary drivers of pandemics, from early civilization through the Middle Ages especially. And Europe's dense cities, versus Native Americans' comparatively rural and isolated settlement patterns, are often cited as a primary reason the Columbian Exchange of diseases was so unbalanced.
The reason that the pandemic has had largely equal effects in urban and rural areas in the USA isn't that density has no impact. Rather, it's that urban areas had better compliance with basic public health and safety measures (masks and basic distancing) and added additional measures (going as far as strict lockdowns).
Part of why they were able to get adequate social support and compliance is risk compensation: people in urban areas knew disease could spread there and were more careful, while people in rural areas didn't believe it could happen to them until it was already too late. The other part is political: highly Democratic areas have been vastly more willing to take COVID-19 seriously. (I say this as a conservative who's been disgusted with the course of the Republican party the last 6+ years.)
You are confused about the purpose of Flash. Yes, in its waning years it was largely used to shoehorn video codecs into browsers that weren't otherwise supporting them. But the real strength of Flash was its original purpose: vector animations. AtomFilms and Homestarrunner were popular in 2001, providing animated video at low bitrates. Lots of illustrations and interactive graphics were done as Flash. And unfortunately many sites were using Flash rather than JS &c for basic navigational elements like menus. Half the 2000-era Web was broken without Flash.
This is the theory that Jack built. This is the flaw that lay in the theory that Jack built. This is the palpable verbal haze that hid the flaw that lay in...