Comment Reviewers can't spot all fabrications (Score 2) 39

There are some misconceptions here about reviewers being able to pick up scientific fraud. Say what you will about Dias, but he has done valid work and he knows the field. There was probably nothing wrong with the paper itself; if he chose to fabricate observations, then the only way to catch that is by replicating the experiment, or by failing to replicate it, if you prefer.

This is precisely what was done. People failed to replicate the experiment. This is exactly how it is supposed to work.

That said, in my experience there's not much in the way of processes that assume scientific fraud. It is quite hard to figure out when someone has wilfully manipulated results. I'd also note the reproducibility crisis: there is a great deal of science that no one is really attempting to reproduce, because it's more prestigious to run your own novel experiment than to reproduce someone else's. Of course, a track record of unreproducible science is a great deal of smoke, and if it's something groundbreaking like room-temperature superconductors then you'd damn well better believe lots of people are going to try it. So the great mystery to me is why he thought he'd get away with it.

Comment Re:Same old hubris (Score 1) 169

This. For decades I have been fascinated by the peculiar trait of ICT grads being generally intolerant and inordinately affected by the Dunning-Kruger effect.

Slashdot is a continual cesspit of it. An endless conga line of programmers telling you that their experience-free evaluation of your skill set and life experience is valid, while you, of course, have it all wrong because you've spent your life not thinking about anything quite as _well_ as they have.

Fortunately it's not all like that. I'm lucky enough to work with people who blend technology and the humanities to help make the world a better place.

Comment Re:amiga = custom chips with little flexibility (Score 2) 221

The Amiga wasn't 'hacks'. The essential hardware design was purpose-built for gaming and only later acquired to form the basis of the Amiga. It would have 'scaled' just fine, in the sense that any other VLSI design does; witness the two never-released successors to AGA. It's better to conceptualise the hardware design as a videogame console than as a computer. The Doom era was the brief period when CPUs became powerful, framebuffers became reasonably fast, and videogaming transitioned from the scrolling, sprite-based setups that had featured in everything up to that point to the software-based 3D era.

If we're playing a game of what-if and imagining what would have happened if the Amiga had stuck around, then it's fairly reasonable to assume it would have gone in the direction videogame consoles did, hardware-wise, starting with the PSX. Would you say that didn't scale?

I think people get too hung up on the hardware anyway. What made the Amiga good, and why it kept being good even when the hardware was essentially obsolete, was the software and the enthusiast computing scene around it, something that never transferred to the commercial-only, appliance-like Wintel era until decades later. (You might say Linux inherited some of that culture, but the Amiga was a consumer-friendly product.)

It can be hard to get across what was lost with the death of the Amiga. Rather than trite examples of it being bad at Doom, consider that it was the *norm* for popular shareware and commercial software alike to expose their functionality through ARexx ports, so you could script and chain apps together. It really was a different mindset. That's what I think of when I ponder the possibilities had the Amiga not died.
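
To make the ARexx-port idea concrete for anyone who never used it, here is a rough modern analogue as a TypeScript sketch. It is not ARexx, and none of these names are real APIs; it just shows the shape of "every app exposes a command port that any external script can drive":

    interface ScriptPort {
      send(command: string, ...args: string[]): Promise<string>;
    }

    // Stand-in "apps" that just echo what they were asked to do.
    const makePort = (app: string): ScriptPort => ({
      send: async (command, ...args) => `${app}:${command}(${args.join(",")})`,
    });

    const paintApp = makePort("paint");
    const dtpApp = makePort("dtp");

    // Drive one app, feed its result to another, the way an ARexx script
    // could glue shareware and commercial software together.
    async function buildPage(): Promise<void> {
      const scaled = await paintApp.send("SCALE", "page.iff", "25%");
      await dtpApp.send("PLACE", scaled, "page=1");
    }

    buildPage();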

The realist in me, though, thinks a lot of what was good about the Amiga came precisely from it being lightly commercialised and predominantly enthusiast-driven. It's hard to see how anything like the Amiga could exist today without looking a lot like all the other locked-down, appliance-like mainstream computing nonsense we see from desktop to mobile platforms.

Comment Memory usage is overrated (Score 1) 158

This whining about browser memory usage is as persistent as it is overrated.

Memory is of exactly zero use to you unless it's being used for something. Sure, the browser could unload all of your tabs (there are plenty of extensions that let you do this), but you don't want that, do you? You want to be able to switch between them instantly. So you leave dozens upon dozens open, and then *whine* about the amount of memory being used, even when it is (and it usually is) less than half of the RAM in your box.

Chrome's memory usage is largely a question of tuning trade-offs, and the same is likely true of other browsers. You can see this in how they behave on mobile devices: guess what, they use less RAM, but they run slower.
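
Here is the trade-off being tuned, as a purely illustrative TypeScript sketch (this is not how Chrome or any other browser is actually implemented): keep the most recently used tabs resident and unload the rest, paying with a reload delay instead of RAM.

    interface Tab { id: number; resident: boolean; lastUsedMs: number; }

    // Keep only the `maxResident` most recently used tabs in memory.
    function enforceBudget(tabs: Tab[], maxResident: number): void {
      const resident = tabs
        .filter(t => t.resident)
        .sort((a, b) => b.lastUsedMs - a.lastUsedMs);
      // Everything past the budget gets unloaded; switching back to one of
      // these tabs later means reloading it, i.e. trading latency for memory.
      for (const tab of resident.slice(maxResident)) {
        tab.resident = false;
      }
    }

    enforceBudget(
      [{ id: 1, resident: true, lastUsedMs: 100 },
       { id: 2, resident: true, lastUsedMs: 900 }],
      1,
    );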

So what is it that you want? Memory sitting free for the sake of being free? You don't have enough RAM to keep 100 tabs open and run all your power-user applications at the same time, yet closing some of those 100 tabs is impossible, and so is using a tab-unloading extension?

Honestly, digital entitlement at its finest!

Comment There's a nugget of a point here (Score 4, Insightful) 307

OP has a valid complaint, but about a situation from a long time ago.

It's true that consumer computing was once an enthusiast's pastime. Virtually all 8-bit computers booted into a command prompt. Even in the 16-bit era, the Amiga came with ARexx, and being able to glue your software together via ARexx ports had a powerful effect. In the enthusiast computing era, every computer magazine was rammed with tutorials on how to program or how to use complex emerging software.

When the Amiga died and gave way to the gloomy years of the PC, one of the things that irritated me most was that there was no culture of providing software for free, as had been common on the Amiga. Everything was commercial, or paid shareware, and so on. It wasn't easy to make your own stuff. The joy and the tinkering had been sucked out of everything.

But... that was a long time ago. The world is a very different place today, and we're more empowered to code and build stuff than ever before. It's trivial to install free scripting languages, compiled languages and, vitally, good code editors and IDEs. You want a programming console on every PC? Just hit F12 in the browser. What's different from the 16-bit era is that everyone uses computers now. Most people just want to use apps; some people want to code, for whatever reason, and it's easier than it has ever been.
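
If the F12 point sounds glib, try it: the dev-tools console runs code like this directly, no installation required (the numbers here are made up for the example).

    const prices = [19.99, 4.5, 7.25];
    const total = prices.reduce((sum, p) => sum + p, 0);
    console.log(`total: ${total.toFixed(2)}`);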

As for wanting a CLI on Android, that's just daft. It has one anyway; you can just adb into a shell. But honestly, software development has moved a bit beyond 10 PRINT "HELLO WORLD".

Comment They're not interested in niche products (Score 4, Insightful) 192

Do you remember how the press endlessly banged on about how Google+ was small and how it was 'beaten' by Facebook? People are still doing it here. Well, I used it a lot. It was awesome. However, Google killed it off long before this actual shutdown.

Before G+ was a thing, there was Buzz. Buzz was a locally centred discussion platform, and it ended up being an interesting way to establish a social graph. I met some really interesting people in my city and we had many deep conversations. Buzz moved to G+, and everything continued there. Then one day they decided that photos were where it was at and destroyed the platform as a discussion board. Design changes meant that images were kept while text was shortened to two lines in a feed. It quickly became just another image-based platform of narcissists. I don't know how it did numbers-wise after that, but it totally killed the platform for most of the people who were there from the start. My G+ archive shows that I crafted a load of posts that took a lot of time. I was using it in place of a blog, and I benefited from people actually reading and interacting with my posts. Not any more.

It seems to me that this theme shaped the way Google approached all of their products, and G+ itself is just the latest casualty. They hated the negative press. If they were going to be seen as second to someone doing billions, better not to do it at all. That shift, right there, is what made me, and anecdotally a lot of my friends, change our view of Google. I don't suppose there was ever a time Google cared what we thought; it's just that now it was clear.

Fuck you Google.

Comment No impediment to this but why?! (Score 1) 67

They are charging the cell at a little over 1C, which is really pretty trivial for modern lithium cells. The problem for mobile devices stems from two things: power cables that were ostensibly designed for data, and the need to put the power circuitry in an already compact box, right next to the battery and a computer.

A solution to the first problem is to up the voltage on the cable, which is what most of these fast-charge standards do. However, you *then* need a buck regulator to bring that voltage back down to what the cell needs, roughly 3 V to 4.2 V DC. With a high switching frequency that physically isn't very big, but it's still going to dissipate some heat, and realistically that is your practical limit. It can be helped (and probably is, in the product demoed by TFA) by using exotic semiconductors like GaN for the switch. Not normally worth the $1 or so the part would cost, but for a high-end phone where thermal issues are key, that $1 buys you a feature the other guys don't have. A feature you didn't need, but never mind that.
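
Back-of-envelope numbers, with assumed values (a 4000 mAh cell, a 9 V cable, roughly 92% buck efficiency; none of these figures are from TFA), to show where the heat comes from:

    const capacityAh = 4.0;        // assumed 4000 mAh cell
    const chargeRateC = 1.1;       // "a little over 1C"
    const cellVoltage = 4.0;       // mid-charge cell voltage
    const cableVoltage = 9.0;      // typical raised fast-charge cable voltage
    const buckEfficiency = 0.92;   // plausible for a small high-frequency buck

    const chargeCurrentA = capacityAh * chargeRateC;          // ~4.4 A into the cell
    const powerIntoCellW = chargeCurrentA * cellVoltage;      // ~17.6 W
    const powerFromCableW = powerIntoCellW / buckEfficiency;  // ~19.1 W drawn
    const cableCurrentA = powerFromCableW / cableVoltage;     // ~2.1 A on the cable
    const heatInPhoneW = powerFromCableW - powerIntoCellW;    // ~1.5 W dissipated next to the battery

    console.log({ chargeCurrentA, cableCurrentA, heatInPhoneW });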

So does it shorten your battery's life? Not the charge rate itself, really; it's nothing to write home about. But the whole package is going to get hot. I have a OnePlus with its proprietary fast charging and it gets pretty toasty here in the tropics, so much so that I just use a regular charger. Lithium cells do degrade at elevated temperatures.

Frankly, with USB-C, I think wireless charging, super-fast charging and the rest are all a bit daft. It's dead easy to plug in, and a couple of amps is already catered for in the spec. What is the goddamn hurry? Like wireless charging, this is a solution in search of a problem.

Comment Nonsense (Score 3, Interesting) 148

Everything you said is stuff that international corporations have been dealing with since long before the cloud existed.

The prevailing model has always been to hold local corps accountable, regardless of who they are owned by. They often have to modify their offering to comply with local laws. What, exactly, is so special about cloud computing services that makes this less true?

The dominant cloud model is already built on local points of presence. Much of the rest of what you're talking about is a random spray of complaints that some providers don't want to comply with local requirements. Well, sure, they don't, but at the end of the day there are billions of dollars at stake, so they will.

Clearly the greatest advantage for cloud providers is the technical capability to spin up infrastructure, not the physical hosting of it in the United States. For most of the planet, the US is unacceptably distant from a latency perspective anyway.
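
As a rough illustration of that latency point (the distance and the two-thirds-of-c figure for fibre are assumptions for the sketch, not measurements):

    const distanceKm = 12000;              // roughly LA to Sydney, great circle
    const lightSpeedKmPerMs = 300;         // c is about 300,000 km/s
    const fibreFactor = 2 / 3;             // light in fibre travels at roughly 2/3 c

    const oneWayMs = distanceKm / (lightSpeedKmPerMs * fibreFactor);  // ~60 ms
    const roundTripMs = 2 * oneWayMs;      // ~120 ms before any routing or server time

    console.log({ oneWayMs, roundTripMs });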

Comment Oh man, I loved Psions (Score 1) 82

There's something cool about a physical keyboard in a portable computer. I used a Psion to write a book during a train commute over a period of a year. So I feel nostalgic about this but...

When I think about it, though, what is the reason we need a pocketable device with a full keyboard? It seems to me that a touch-screen keyboard is a perfectly adequate compromise for typing on a pocketable device, but if you're actually going to type a lot... you can't really go past a mini laptop like an XPS 13 or whatever.

And when you talk about running apps on something like this: it has a screen that is mobile-like in size, but an input mechanism from desktop devices. Are you going to run a mobile browser or a desktop browser? Probably mobile. Which means you're touching the screen, and then you realise that mobile web sites are usually laid out for portrait views...

So it's kind of cool, I'm sure there are niche use cases, and hell, it's not breaking the bank. I don't really see much of a use for it, though. Well, actually, I quite like the idea of a Linux device where you're interacting with the shell rather than a desktop... but really useful? Hmm.

Comment Re:Sounds nice... (Score 3, Insightful) 124

Okay, I'll bite. Because we already have it: it's the goddamn web. You can build desktop and mobile apps out of it; it just needs some support from Apple for the fancier bits of the standards behind PWAs, which Apple won't provide... because the web doesn't force you to buy their goddamn desktop computers just to make things for their mobile phones.

Comment Re:Australia is a small market... (Score 2) 168

Wow, a post that's so off it's not even wrong.

A quick recap. Australia isn't asking for anything special; the USA is a lot further along. 'Every' device manufacturer? Of the world's top ten phone makers, one is American and the majority are Chinese. Do you really think Australia will ask them to do something China isn't already asking? Finally, the 'small' market of Australia is loosely equivalent to Canada, or all of the Scandinavian countries combined: a market of tens of millions of relatively high-end devices. Not a lot of scope for a principled stand by a phone maker.
