
Comment Re:Terrible, wretched, no good science (Score 1) 637

I suspect the issue here is that you're looking at IQ as a distinct trait under direct balancing selection, whereas Cochran (and Crabtree, for that matter) look at it as a complex emergent property that is highly (primarily?) dependent on genetic load-- and that genetic load, rather than IQ itself (or even the quantitative traits we'd normally associate with IQ), is really what a lot of this selection is about.

In other words, the hypothesis some geneticists are now discussing is that there aren't really "IQ genes"; rather, a lot of the variance in IQ tracks genetic load directly, so someone with a high IQ will have far fewer broken genes (loss-of-function, or LOF, variants) than someone with a low IQ.

I think Cochran et al.'s lens is better than yours in this context. There's plenty more background material at the blog I linked.
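
For intuition, here's a toy simulation of that hypothesis (my own illustrative sketch, not anything from Cochran or Crabtree): give each simulated person a Poisson-distributed count of LOF variants, let each variant shave a bit off an otherwise-normal polygenic trait, and the trait correlates negatively with load without any dedicated "IQ genes" existing at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative numbers only: each person carries a Poisson-distributed count
# of loss-of-function (LOF) variants, and each variant shaves a small fixed
# amount off an otherwise-normal polygenic trait.
load = rng.poisson(lam=100, size=n)               # LOF variants per person
trait = rng.normal(100, 10, size=n) - 0.5 * load

# Expect a correlation around -0.45: load alone generates a big chunk of
# the trait variance, with no dedicated "trait genes" in the model at all.
print(np.corrcoef(load, trait)[0, 1])
```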

Comment Terrible, wretched, no good science (Score 5, Interesting) 637

Greg Cochran over at West Hunter has a pretty damning critique of this paper.

Cochran's review:
In two recent papers, Gerald Crabtree says two correct things. He says that the brain is complex, depends on the correct functioning of many genes, and is thus particularly vulnerable to genetic load. Although he doesn’t use the phrase “genetic load”, probably because he’s never heard it. He goes on to say that this is not his area of expertise: truer words were never spoken!

His general argument is that selection for intelligence relaxed with the development of agriculture, and that brain function, easier to mess up than anything else, has probably been deteriorating for thousands of years. We are dumber than our ancestors, who were dumber than theirs, etc.

The first bit, about the relaxation of selection for intelligence in the Neolithic? Sure. As we all know, just as soon as people domesticated emmer wheat, social workers fanned out, kept people from cheating or killing their neighbors, and made sure that fuckups wouldn’t starve to death. Riiight: it’s all in the Epic of Gilgamesh. In the online supplement.

Why do people project a caricature of modernity back thousands of years before it came into existence? Man, he doesn’t know much about history.

Nor does he know much about biology. If he did, he’d understand that truncation selection is what makes such complex adaptations possible. If only the top 85% (in terms of genetic load) reproduce, the average loser has something like 1 std more load, so each one takes lots of deleterious mutations with him. But then, he’s probably never heard of truncation selection. I’m sure they never taught him that in school, but that’s no excuse – they never taught me, either.

If his thesis were correct, you’d expect hunter-gatherers to be smarter than people from more sophisticated civilizations, which is the crap that Jared Diamond peddles about PNG. But Crabtree says that everyone’s the same – stepping on the dick of his own argument. Of course, in reality, hunter-gatherers score low, often abysmally low, and have terrible trouble trying to fit into more complex civilizations. They do a perfect imitation of being not-smart, amply documented in the psychometric literature. Of course, he doesn’t know anything about those psychometric results.

Which reminds me of secret clearances: it used to be that having a clearance meant that you were entrusted with information that most people didn’t have. Now, it means that you can’t read Wikileaks, even though everyone else does. In much the same way, you may have the silly impression that having a Ph.D. means knowing more than regular people – but in the human sciences, the most important prerequisite is not knowing certain facts. Some kind soul should post the Index, so newbies won’t get themselves in trouble.

He doesn’t even know things that would almost support his case. Average brain size has indeed decreased over the Neolithic – but in every population, not just in farmers. He might talk about paternal age effects, and how average paternal age varies – but he doesn’t know anything about it. He ought to be thinking about the big population increase associated with agriculture, and the ensuing Fisherian acceleration – but he’s never heard of it.

He even gets the peripheral issues wrong. He talks about language as new, 50,000 years old or so – much more recent than the split between Bushmen/Pygmies and the rest of the human race. Yet they talk. He says that the X chromosome isn’t enriched for cognition and behavioral genes – but it is (by at least a factor of two), and the reference he quotes confirms it.

Selection pressures and mutation rates can vary in space and time. Intelligence could decrease – it’s not impossible. But we know that the pattern he suggests does not exist. Or, to be exact, it exists only in that neighboring world that’s full of Melanesian super-hackers, gay men whose main concern is avuncular investment, and butt-kicking pixies.
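
Stepping out of the quote for a moment: as a sanity check on the truncation arithmetic (my own quick calculation, not Cochran's), if load is roughly normal and the worst 15% fail to reproduce, the culled tail averages about +1.55 SD of load against the population mean, in the same ballpark as his "something like 1 std more load":

```python
from scipy.stats import norm

p_culled = 0.15                        # bottom 15% by fitness fail to reproduce
z = norm.ppf(1 - p_culled)             # load threshold for being culled
tail_mean = norm.pdf(z) / p_culled     # mean load (SD units) of the culled tail
survivor_mean = -norm.pdf(z) / (1 - p_culled)

print(f"culled tail mean: {tail_mean:+.2f} SD")       # ~ +1.55
print(f"survivor mean:    {survivor_mean:+.2f} SD")   # ~ -0.27
```

So each culled individual really does carry a disproportionate share of the deleterious mutations, which is what lets truncation-like selection maintain complex adaptations relatively cheaply.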

Comment Re:ahh, the "singularity"... (Score 5, Insightful) 830

PZ Myers wasn't there; he based his whole critique on Gizmodo's writeup.

Speaking as someone who was there and heard Kurzweil's full speech, I can confidently say that PZ Myers does not understand Ray Kurzweil.

First off, a significant factual mistake: Kurzweil -clearly- never said we'd reverse engineer the brain by 2020. He argued against exactly that (his prediction was late 2020s, shading into 2030-- perhaps also unbelievable, but if you're going to critique someone, why not get the facts right?). Sure, Gizmodo's writeup was entitled "Reverse-Engineering of Human Brain Likely by 2020". It'd be an understandable attribution mistake for, say, an undergraduate.

Second, Myers is critiquing Kurzweil's ontological position based on a throwaway writeup dashed off by gizmodo. (Really, Myers? And you wonder why you're a magnet for shitstorms...)

Third, Myers' criticism is essentially that the brain is an emergent system, and we'll have to understand all the protein-protein interactions, functional attributes of proteins, etc. in order to actually model the brain.

This third assumption is arguable, but Kurzweil wasn't actually arguing against it. All Kurzweil meant with his comment about bytes and the genome was that there's an interesting information-theoretic view of how much initial data gives rise to the wonderful complexity of the brain.
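
For the curious, the arithmetic behind that information-theoretic point (my own back-of-envelope with commonly cited order-of-magnitude figures, not the exact numbers from Kurzweil's talk):

```python
# Rough back-of-envelope for the genome-vs-brain information gap.
base_pairs = 3.2e9                  # human genome length
bits_per_base = 2                   # four letters: A/C/G/T
genome_bytes = base_pairs * bits_per_base / 8

synapses = 1e14                     # adult brain, order of magnitude
print(f"genome, uncompressed: ~{genome_bytes / 1e6:.0f} MB")
print(f"synapses per byte of genome: ~{synapses / genome_bytes:,.0f}")
```

Whatever you think of his timeline, ~800 MB of blueprint for ~10^14 synapses is a striking compression ratio, and that gap is all Kurzweil's point requires.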

I had a lot more respect for Myers before I read this rant.

Comment Impropriety (Score 3, Insightful) 464

One has to wonder: if Blizzard goes that far above and beyond the requests of law enforcement, handing over mountains of data in response to polite requests-- not even subpoenas-- how seriously do they take the privacy of *your* personal information?

I'm glad the bad guy got caught, etc., but handing the keys to the kingdom over to law enforcement without a subpoena suggests, to my mind, that respect for users' privacy simply isn't something Blizzard considers as it goes about its business. Or rather, that such information is their property, not yours.

Comment You're playing their game (Score 5, Interesting) 375

Given the assumption that cryogenic revival will be possible, this may work in principle-- but the insurance industry doesn't exactly run on immutable, code-like rules that can be hacked for fun and profit.

It's much more a game-- and moreover, the game is owned by the insurance industry. You're just playing it. And if you figure out a particularly good trick to beat the house, they'll either rationalize why certain technicalities mean they don't need to pay you (and thus 'easy money' becomes 'try to drag deep-pocketed defendants into court'), or they'll simply change the rules before you're revived-- and you won't have been able to do anything about it, because you were dead.

From a what-do-you-have-to-lose perspective, sure, it's worth a shot. But this simply can't be a dependable part of estate planning.

Comment Re:This is important (Score 1) 536

If modern humans and Neanderthals were so different, how likely is it that fertile offspring could have been born?

We don't currently know enough to say much about the fertility of human-neanderthal hybrids, but see ligers for an example of fertile cross-species hybrids (and lions and tigers are separated by several times as much time-since-divergence as humans and neanderthals, off the top of my head).

If it is not likely, could horizontal gene transfer have been a factor?

In short, no-- very probably not a significant factor. HGT happens quite often between, say, bacteria; bacteria and viruses occasionally leave nonfunctional copies of themselves in host genomes (which can provide entropic fuel for evolution); very seldom, some other sorts of microorganism-host HGT can happen (e.g., the endosymbiosis that gave plants chloroplasts). But from theory and genomic evidence, we can say pretty confidently that HGT just doesn't happen directly between, say, two mammals.

Simply put, there just isn't a viable vector (bacteria, viruses, loose DNA, etc.) that could move a gene from one organism into the germline of another. Something like cannibalism could -very arguably- allow some gene transfer, but it wouldn't get passed down through the germline.

Comment This is important (Score 5, Interesting) 536

The issue of introgression (gene flow from neanderthals to modern humans) is hugely important. It's a lot more important than the curiosity or oddity the Times article makes it out to be.

All the published studies looking for this introgression have been based on neanderthal mtDNA. Since mtDNA is maternally inherited and doesn't undergo recombination, it's effectively a single locus and a poor marker for admixture; the negative results so far are predictable and do not preclude gene flow. It'll be interesting to see Paabo's results. He's been working on getting nuclear DNA (nDNA) data from neanderthal remains for a while now, and perhaps this is a hint that he's found some introgression.

Why it's important:

The small picture of why it's important: it would substantially redraw our family tree and let us refine our primate phylogeny.

The bigger, hazier, and potentially earthshaking picture is that it doesn't take many viable pairings to move genes from one gene pool to another, and those genes could have been very important to our development. Modern humans and neanderthals were under many of the same environmental stresses but likely developed different adaptations to them, including in behavior and cognition genes. As Stringer points out in the article, "in the last 10,000-15,000 years before they died out, around 30,000 years ago, Neanderthals were giving their dead complex burials and making tools and jewellery, such as pierced beads, like modern humans." Proto-modern humans were smart. But neanderthals were also smart, potentially in different and complementary ways. And perhaps it took a combination of proto-modern human and neanderthal genes to truly make the modern human mind. Our brains could be an example of 'hybrid vigor' on a grand scale.

So the big question is whether, assuming we can establish gene flow, this hypothetical combination of proto-modern human and neanderthal cognitive adaptations could have led to the cultural explosion of ~30-50 thousand years ago. The biology is plausible and the timing's right. The data's still out, but it's coming in. Odder hypotheses have come true.

Comment Neat, but... (Score 4, Interesting) 43

Here are some questions I have about the chip:

- These chips/systems already exist. What's new about this MIT effort? The Computerworld article was very sparse.

- There's a great deal of bidirectional communication in normal eyes-- information flows not only from eye to brain, but from brain to eye as well. As far as I know, this tech just discards those brain-to-eye signals. Is that important?

- Last I heard, this sort of technology was approaching 1000 effective pixels of visual information (assuming ideal electrode placement). Has this MIT effort pushed that boundary? How does '1000 effective pixels' compare to the eye's effective resolution? Can we put normal vision in terms of pixel resolution at all? (See the rough back-of-envelope after this list.)

- I've read about shunting a digital video camera's output over to tactile senses (for instance, the nerves on a person's tongue); I believe the military has done a fair bit of research into this. Could this sort of approach be viable for helping the blind function as well? Could it become the preferred approach, since it seems less invasive than ocular- and neuro-surgery?
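
On the resolution question above, a crude back-of-envelope (my own commonly cited figures, not anything from the article):

```python
# How much of the eye's channel does a ~1000-electrode implant sample?
# Order-of-magnitude figures; exact counts vary by source.
optic_nerve_fibers = 1_000_000     # retinal ganglion cell axons
photoreceptors = 120_000_000       # rods + cones
implant_electrodes = 1_000         # the "1000 effective pixels" above

print(f"fraction of optic-nerve channels: {implant_electrodes / optic_nerve_fibers:.2%}")
print(f"fraction of photoreceptors:       {implant_electrodes / photoreceptors:.5%}")
```

Even granting that ganglion cells, not photoreceptors, set the real bandwidth, these implants sample on the order of 0.1% of the channel-- impressive engineering, but a long way from normal vision.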

Comment No more than a tech demo (Score 5, Interesting) 221

Despite the name, Sidewiki is not a wiki in the sense that people can edit, prune, and synthesize information, nor is it moderated in any way. It's just a comment system, with no way to amplify the signal over the noise. It's also unclear how people are supposed to use it-- e.g., what to post-- which is a significant failing, IMO. Interesting as an approach to layering user comments onto webpages, but not useful yet. Ars Technica pretty much nailed it with the following:

This new offering from Google is intriguing in some ways and it shows that the company is thinking creatively about how to build dialog and additional value around existing content. The scope and utility of the service seems a bit narrow. The random nature of the existing annotations suggests that the quality and depth of the user-contributed content will be roughly equivalent with the comments that people post about pages at aggregation sites like Digg and Reddit.
What makes Wikipedia content useful is the ability of editors to delete the crap and restructure the existing material to provide something of value. Without the ability to do that with Sidewiki, it's really little more than a glorified comment system and probably should have been built as such. As it stands, I think that most users will just be confused about what kind of annotations they should post.

Comment Are there any plans to revamp Parental Controls? (Score 4, Interesting) 520

When I play WoW, I probably play too much. I'd like some built-in functionality to gently limit my playtime and remind me how much I've played in a week. At first I had high hopes that the Parental Controls feature could help me.

Unfortunately, though the rest of WoW's interface is great, its parental controls are not only a crime against all that is beautiful and elegant, but pretty useless in the real world. There's no way to set "able to play X hours per week" or "able to play Y hours per weekday, Z hours per weekend". You must set a hard-coded block schedule, click OK, then hope you've predicted your exact needs. And there's no in-game warning when you're coming up against a limit-- you're simply disconnected when it hits.

Please, please, please tell me there are plans afoot to fix this tool and perhaps remake it into a more general method for account owners to manage playtime better? Extra kudos if it could include a Netflix-style option to put your account on vacation for a variable length of time...
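
To make the feature request concrete, here's a minimal sketch of the quota logic I have in mind (entirely hypothetical-- not Blizzard's API or anything they've announced): hour budgets per week and per day type, plus a warning threshold, instead of a fixed block schedule.

```python
from dataclasses import dataclass

@dataclass
class PlaytimeQuota:
    """Hypothetical playtime limiter: hour budgets instead of block schedules."""
    weekly_hours: float = 10.0
    weekday_hours: float = 1.5
    weekend_hours: float = 3.0
    warn_at_fraction: float = 0.9      # fire an in-game warning at 90% of a budget

    def check(self, played_week: float, played_today: float, is_weekend: bool) -> str:
        daily_cap = self.weekend_hours if is_weekend else self.weekday_hours
        remaining = min(self.weekly_hours - played_week, daily_cap - played_today)
        if remaining <= 0:
            return "disconnect"
        if played_today >= self.warn_at_fraction * daily_cap:
            return "warn"              # warn *before* the hard cutoff, unlike today
        return "ok"

quota = PlaytimeQuota()
print(quota.check(played_week=8.0, played_today=1.4, is_weekend=False))  # "warn"
```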

Comment Symantec is saying this? (Score 5, Insightful) 459

If there were any high-quality for-pay alternatives, I'd say he might have a point.

Unfortunately, most antivirus software sucks, with Symantec more or less epitomizing how good ideas on paper can turn into terrible, buggy, bloated security software that actually increases your exposure by adding another node that malicious code can attack. Symantec's argument-from-assertion notwithstanding, there doesn't seem to be any correlation between antivirus software being for-pay and being higher quality.

In my experience, there's really bad antivirus software (such as Norton, which I have zero confidence in and would never let touch my machine), and slightly less bad antivirus software. What went wrong? Why does this industry suck so badly? Anyone have any insight?
