Comment Re:What about... (Score 1) 48
> An OLED display ensures that you'll be buying a replacement every two years as your colours turn to crap.
Depends on the OLED: a WOLED shouldn't have this problem.
> RDP protocol support was merged into Wayland over a year ago.
That's incorrect: RDP support was merged into **Weston**. If you use another Wayland compositor, it may or may not have RDP support.
I agree with your point about XWayland: as long as the toolkits don't remove support for X, Wayland's remote display capabilities can only be superior or equal to those of X.
Now, the tough question is: how long will the toolkits keep the X support?
The developers of desktop tech have been called CADT, among other things.
> Absolutely nothing over any of the well supported and understood open source MIPS implementations.
Ah! Read this ( http://jonahprobell.com/lexra.... ) and be cautious when re-implementing the MIPS ISA.
I'm glad you posted this, but note that the title of the article is wrong: the RDP backend was merged into the *Weston* compositor, not into the Wayland protocol.
Which means that if you're using E19's own Wayland compositor, then of course you **don't** have access to this RDP backend, unless there is a way to stack compositors?
On one hand OpenBSD is focused on security; on the other hand it uses a lot of 'unsafe' programming languages (for example C) where security is only achieved thanks to expert programmers, but even experts have bad days and make mistakes.
Wouldn't it make sense to push the usage of programming languages which provide more security by default?
For example, encouraging developers to use Ada instead of C.
I agree with you, but not for the same reason: my reason is a variant on 'cannot get the basics right': Chrome is not pleasant to use at all on a 4GB RAM PC; it uses too much memory, which makes the PC swap.
I used to prefer Chrome to Firefox due to its snappiness and good separation between tabs, but when it frequently used so much memory as to make the PC swap, I switched back to Firefox.
> You do realize that quantum mechanics were met with similar derision? Heck, Einstein never really accepted the notion, and that's as great a scientist as we've ever had
And you could add, "and he had a better understanding of quantum mechanics than most of the scientists": cue the EPR paradox. How many scientists who accepted QM understood that QM is non-local?
> I never used Wave, and it was shut down long before I joined Google, so I have no idea what you're talking about, much less who made that decision. It doesn't sound like the sort of decision made by a CEO, however.
Wave had a synchronous mode (like IRC), but other users saw every character as you typed it, instead of seeing whole lines or having a 'Send' button. This is a really stupid design decision for two reasons: 1) it uses lots of bandwidth; 2) imagine discussing something with your boss: you'd *hate* this feature.
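As a rough illustration of the bandwidth point (a sketch with made-up numbers, not Wave's actual protocol), compare how many update messages one chat line costs when every keystroke is pushed to the server versus one message per completed line:

```python
# Rough illustration: update messages sent for one chat line,
# assuming one server update per keystroke vs. one per finished line.
# Numbers are illustrative; Wave's real protocol differed.

def updates_letter_by_letter(line: str) -> int:
    # every keystroke triggers an update to the server
    return len(line)

def updates_line_by_line(line: str) -> int:
    # only the finished line is sent
    return 1

line = "the quarterly numbers look bad"
print(updates_letter_by_letter(line))  # 30 updates
print(updates_line_by_line(line))      # 1 update
```

Thirty round-trips instead of one, for a single short line, and that is before counting per-message protocol overhead.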
As for whether this kind of decision is made by a CEO or not, I'd answer: it depends. If the CEO is Steve Jobs, he would have made this decision (and perhaps fired whoever chose this design); if the CEO is the typical CEO, then yes, you're right.
> And don't think that picking winners and losers is easy. Well, it's easy to *do*, but very hard to do *right*. And, FWIW, I think Larry is doing a great job.
"great job"? Do you remember Google Wave?
A *very poor* job here.
Who made the stupid decision to use letter-by-letter in the synchronous mode instead of line-by-line?
Huh, sorry, but I think that your "analysis" was worthless: there is a big difference between communist theory and communism as applied in the real world, and the same difference exists between capitalism in theory and capitalism in the real world.
I remember watching Stargate (the movie): they send a robot which sends a beacon signal through a warp gate, and then they immediately receive a signal saying that the robot is 10 light-years away, instead of having to wait 10 years.
Even though I didn't care about 'scientific realism' in the movie, my brain told me 'this is wrong' and it took me out of the movie. So scientific realism isn't a big issue, unless it kills your enjoyment of the movie.
>[cut] with a minimum use of Xrender to stitch them together.
Which is a very good way to use XRender: use the glyph cache in the server for efficient text rendering; for the background, render locally and push the pixmap (which could easily be compressed in case of WAN access) to the server and stitch everything together.
And you can't do this with Wayland (no drawing API).
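A back-of-the-envelope sketch of why the server-side glyph cache matters over a network (illustrative glyph sizes, not measured X11 traffic): once a glyph's bitmap has been uploaded, later uses of it only need a small reference, so repeated text costs a fraction of the bytes.

```python
# Back-of-the-envelope: bytes on the wire to draw a line of text remotely.
# Illustrative numbers only, not measured X11 traffic.

GLYPH_BITMAP_BYTES = 16 * 16      # one 16x16 8-bit alpha glyph, uncompressed
GLYPH_ID_BYTES = 4                # a reference to a cached glyph (e.g. 32-bit ID)

def bytes_without_cache(n_chars: int) -> int:
    # every glyph's pixels cross the network each time it is drawn
    return n_chars * GLYPH_BITMAP_BYTES

def bytes_with_cache(n_chars: int, new_glyphs: int) -> int:
    # only previously unseen glyphs are uploaded; the rest are just IDs
    return new_glyphs * GLYPH_BITMAP_BYTES + n_chars * GLYPH_ID_BYTES

# An 80-character line where every glyph is already in the server's cache:
print(bytes_without_cache(80))   # 20480 bytes
print(bytes_with_cache(80, 0))   # 320 bytes
```

Under these assumptions the cached case is 64x cheaper; a buffer-only protocol that pushes rendered pixels pays the uncached cost (or worse, a full surface) every time.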
>Even by Qt4.5, they found out that their pure software backend (Raster) was fast than the XRender one (Native).
The benchmark you've linked is a local benchmark, irrelevant for network rendering.
> - XRender allows you to do a lot of things "efficiently" but we can do them more efficiently with direct or client side rendering and just push a shared memory buffer to the server.
More efficiently? Only locally; remotely it depends on a lot of things (bandwidth, latency).
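A crude model makes the remote case concrete (assumed WAN figures and payload sizes, not measurements): with transfer time roughly latency plus size over bandwidth, a small stream of drawing commands and an uncompressed full-frame buffer behave very differently.

```python
# Crude remote-display cost model: time = latency + size / bandwidth.
# The WAN figures and payload sizes below are assumptions for illustration.

LATENCY_S = 0.050                  # 50 ms one-way WAN latency (assumed)
BANDWIDTH_BPS = 2_000_000 / 8      # 2 Mbit/s link, in bytes per second (assumed)

def transfer_time(payload_bytes: int) -> float:
    return LATENCY_S + payload_bytes / BANDWIDTH_BPS

draw_commands = 2_000              # a small batch of render requests (assumed size)
full_buffer = 1920 * 1080 * 4      # one uncompressed 32-bit 1080p frame

print(round(transfer_time(draw_commands), 3))  # 0.058 s
print(round(transfer_time(full_buffer), 3))    # 33.228 s
```

On a fast LAN the buffer push is fine, which is why the local benchmarks favour it; on this assumed WAN link it is unusable, which is the poster's point.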
> Qt5 will not use XRender.
I'm not so sure: this webpage lists XRender: http://qt-project.org/doc/qt-5.0/qtdoc/requirements-x11.html
> unless you're satisfied with simply calling the Wayland protocol "X12" and be done with it.
You cannot call Wayland X12: X is a drawing protocol, Wayland isn't; it only provides buffers.
And I disagree that an X12 protocol 'would look awfully like Wayland':
- with X you know where your pixels are going on the screen; with Wayland you don't!
- with X (XRender), to draw text efficiently you can have a glyph cache managed by the X server; with Wayland you cannot have this.
On X, you can have a Gnome application running on KDE and vice versa; will this still be the case when the desktops use Wayland?
Or do you have to use XWayland to ensure that this interoperability still works?
He who is content with his lot probably has a lot.