I have just used it on my own post where I wrote HTML entities using the classic desktop interface, and it was absolutely the same.
I do not care about any other interface because the other ones are both shit.
He has a point though.
If you read the whole thread, he doesn't. My objection was in fact to his characterization of the system's purpose. It was clearly sold to be used for exactly what it was used for here. My objection to the existence of the system is not relevant to his hypocrisy.
Systems which have active users are not mission critical systems.
You can be affected by failure or downtime of a mission critical system as a user even if you are not logged into it. Do better.
You are correct
I know, except that I forgot to mention that they are all cowards.
Complete nonsense.
Go fuck your straw man on your own time, clown.
I have literally no reason to run FreeBSD, the also-ran of Desktop Unix.
I had history with SunOS and Xenix when I tried to install FreeBSD the first time around. At the time, there was no meaningful install documentation, and my attempts to get assistance from the community were met with noob mockery.
I installed Slackware with zero assistance and zero problems and have never felt a need to look back, and I doubt I ever will.
Yeah, those are all the things people say to excuse the complexity, but the reality is there's no real benefit of the complexity.
I wasn't saying there's a benefit so much as that it was inevitable at the time. Today RAM is cheap and fast enough that the instruction compression is basically worthless. But on the other hand, the complexity also really isn't a problem any more because of advances in gcc, register renaming, and the fact that all modern CPUs have umpty-million gates so the decoder is not a big part of the processor.
Gaggle is a tool to monitor activity on SCHOOL DISTRICT provided social media applications. She used a school device to access a school social media app, and the comment she made was picked up by Gaggle.
I think schools offer their own social apps to 'protect' the children from predators, and the district monitors their own app for abuses.
The reaction was over the top, but the oversight was appropriate.
Thanks for the clarification. I understood the part that said:
The 13-year-old girl arrested in August 2023 had been texting with friends on a chat function tied to her school email at Fairview Middle School, which uses Gaggle to monitor students’ accounts.
just to mean she signed up using a school email and the school had access to what was otherwise a private account, not a service run on behalf of the school. TFA was somewhat vague about the exact circumstances. But yeah, they overreacted.
AOL was the ONLY option for Mac users at that time, around 1990. It did have its pluses back then. My wife still misses it at times.... I was happy with it for a while until I realized you truly couldn't get to the WWW internet. My older brother still has AOL email.
Apple had a product called AppleLink which they opened to end users in the late '80s. The original one was an in-house/partner/developer system that ran on GEIS machinery, but they partnered with Quantum Computer Services and Steve Case, who was running a service for Commodore 64s. Apple beta tested AppleLink for a while; I still have my T-shirt and mug. AppleLink eventually morphed into AOL and the rest is history. In 1991, STS-43 sent the first email from space using a Mac Portable and AppleLink.
So in its own way, Apple was responsible for Eternal September...
I select the text I'm interested in, right-click, and then click "view selection source".
On an inferior browser it might be some other sequence of clicks.
the variable-length instructions take up a horrific amount of silicon to decode in parallel.
That was true back when the dominant processor was the 486, but the x86 decoder is minuscule compared to the rest of a modern processor.
Many systems running Windows are used in the same situations. Windows is perfectly fine when put in a system with controlled updates, controlled choice of drivers
It isn't, though. For example all kinds of things which you can do on Unix[likes] without disturbing users require a reboot on Windows. This creates real and measurable impact to users.
The Windows kernel is actually quite a tight piece of code and that is reflected in the fact that most people haven't seen a bluescreen in a decade, it's just the userland above it sucks balls.
Yes, but that stuff matters! Also, there was all that time when pretty much the whole graphics driver ran in the kernel because Microsoft couldn't get acceptable performance any other way, which is where most of the blue screens came from. They were a LOT scarcer in NT 3.51 than in any later version of Windows until, ironically, Vista.
There's a lot of folks around here who really hate what I have to say about the evils of unlimited capitalism, and I presume that, on some kind of bullshit principle, they typically don't mod me up even when they agree with me.
Of course, that explains why they could get away with complexity, but that doesn't explain why it's trash.
It's not so much getting away with complexity as that it would have cost more to have less of it in the instruction set. But if you want to speak to that specifically, part of the reason the instruction set is trash is that the architecture is trash — by modern standards, anyway. For example, from a certain point of view, x86 has zero general-purpose registers because some of its instructions (a ton of them really) require that operands and offsets go into specific registers. But this also made the processor simpler, because it didn't have to be able to use other registers. It also made the code smaller, because some instructions are shorter than others.
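To make the fixed-register thing concrete, here's a rough sketch (GCC/Clang inline asm, x86 only; the function name mul32 is just made up for illustration): the one-operand MUL always takes one factor from EAX and dumps the 64-bit product into EDX:EAX, so the constraints have to force everything into those exact registers. The instruction encoding never says which registers to use, which is exactly why it's so short.

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative only: the one-operand MUL reads EAX implicitly and
     * writes the 64-bit product into EDX:EAX -- the registers are baked
     * into the opcode, not encoded in the instruction. */
    static uint64_t mul32(uint32_t a, uint32_t b)
    {
        uint32_t lo, hi;
        __asm__ ("mull %[b]"
                 : "=a" (lo), "=d" (hi)   /* results must come out of EAX/EDX */
                 : [b] "rm" (b), "a" (a)  /* one factor must go into EAX      */
                 : "cc");
        return ((uint64_t)hi << 32) | lo;
    }

    int main(void)
    {
        printf("%llu\n", (unsigned long long)mul32(123456789u, 987654321u));
        return 0;
    }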
This was a problem for performance until register renaming was implemented, which IIRC was also a Cyrix development (when it comes to x86, anyway). With renaming, though, the penalty of shuffling values between registers to set up successive operations was reduced, and on superscalar processors more or less eliminated, since those "moves" can be processed alongside other operations and complete in a single cycle.
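Hand-wavy illustration of what renaming buys you (toy C, not any real microarchitecture; the register counts are made up): every write to an architectural register gets pointed at a fresh physical register, so back-to-back writes to the "same" register don't fight over the same storage and can be in flight at the same time.

    /* Toy rename table: arch regs r0..r3 backed by 16 physical regs.
     * Real hardware uses a free list and reclaims registers at retire. */
    #include <stdio.h>

    #define NUM_ARCH 4
    #define NUM_PHYS 16

    static int rename_table[NUM_ARCH];   /* arch reg -> current phys reg */
    static int next_phys = 1;

    static int rename_dest(int arch)     /* instruction writes 'arch' */
    {
        int phys = next_phys++ % NUM_PHYS;
        rename_table[arch] = phys;
        return phys;
    }

    static int rename_src(int arch)      /* instruction reads 'arch'  */
    {
        return rename_table[arch];
    }

    int main(void)
    {
        int w1 = rename_dest(0);         /* first write to r0  -> p1  */
        int r1 = rename_src(0);          /* a read of r0 sees p1      */
        int w2 = rename_dest(0);         /* second write to r0 -> p2, no WAW/WAR stall */
        printf("writes to r0 went to p%d then p%d; the read saw p%d\n", w1, w2, r1);
        return 0;
    }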
Anyway, it ultimately comes down to Intel keeping the processor as simple as possible through the 486, back when an ISA really was an ISA: the instruction set mapped directly onto the hardware, unlike now, where the decoder translates one multi-cycle instruction into multiple single-cycle operations in basically every design except genuine RISC. Intel made up most of the performance drawback with its compiler, which for many years produced the most efficient x86 code because it had optimizations to work around the deficiencies of the design. These days, gcc produces more efficient code than icc for every processor.
x86 was wonky because a big, complicated decoder would have taken up a lot of silicon at the time, relative to the rest of the CPU. Instructions weren't decomposed into RISCy micro-ops like they all are today. The first x86 processor to do that was NexGen's Nx586, I think? AMD's first was the K5, and for Intel it was the Pentium Pro. Today an x86 decoder (which is a relatively complex beast compared to decoders for other instruction sets) is a very small piece of the CPU.
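For anyone who hasn't seen "cracked into micro-ops" spelled out, here's the idea in toy C (not real decoder output, just the shape of it; the arrays and names are invented): one architectural read-modify-write instruction becomes three simple, single-purpose operations the core can schedule independently.

    #include <stdint.h>
    #include <stdio.h>

    static uint32_t mem[16];             /* pretend data memory   */
    static uint32_t reg[8];              /* pretend register file */

    /* One architectural instruction: ADD [addr], rN (read-modify-write).
     * A modern decoder would crack it into three RISC-ish micro-ops. */
    static void add_mem_reg(int addr, int r)
    {
        uint32_t tmp = mem[addr];        /* uop 1: load    */
        tmp += reg[r];                   /* uop 2: ALU add */
        mem[addr] = tmp;                 /* uop 3: store   */
    }

    int main(void)
    {
        mem[0] = 40;
        reg[3] = 2;
        add_mem_reg(0, 3);
        printf("%u\n", mem[0]);          /* prints 42 */
        return 0;
    }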
How many Bavarian Illuminati does it take to screw in a lightbulb? Three: one to screw it in, and one to confuse the issue.