Comment Re:Canada? (Score 1) 208

An AC pointed this out in a reply, correcting some factual errors in my post. It was an Australian company (BHP) doing the bidding, and that bid was scuttled. The executives, however, were not happy with the BHP bid (which was hostile), and were trying to arrange a more lucrative deal with a Chinese company.

The federal action scuttled both potential deals. Anyway, the point is that China buys a LOT of potash from Canada, and has strategic interests in that resource.

-Laxitive

Comment Re:Canada? (Score 1) 208

You are right. I didn't have my facts straight... thanks for the correction. So yeah, BHP was bidding, as was a Chinese company. The BHP bid was scuttled, and it seems the Chinese offer went down with it. As for why the execs liked the deal, it's because they would have been greatly enriched by the sale.

Quoth the CBC, in an editorial (http://www.cbc.ca/canada/story/2010/10/01/f-vp-newman.html):

The executives at Potash Corp., who will benefit from a huge payout if the company is sold, are reportedly trying to organize a rival bid involving a Chinese government-owned company to drive up the sale price.

But ultimate ownership by a company from China, which is one of the biggest buyers of Saskatchewan potash, would have even greater implications for the value of the product than a sale to BHP Billiton.

So, regardless of who makes the stronger bid, the answer from both Ottawa and Saskatchewan should be the same: "Sorry. No sale."

Comment Re:Canada? (Score 3, Informative) 208

God no. We keep that shit in a bunker underneath the Canadian shield, disconnected from the internet. You don't leave national secrets like that just lying around.

On a serious note, China's main interest is in Canada's natural resources. As it grows and industrializes, it needs to import massive amounts of raw resources to fuel its economy and people.

For example, Saskatchewan has basically the largest natural deposits of potash in the world. The whole province is basically potash: dig anywhere and you'll hit it. Potash is what fertilizer is made from. Not too long ago, a Chinese firm wanted to acquire Potash Corp., Saskatchewan's potash producer. A big ruckus was raised about it internally, and eventually the sale was stopped by the federal government after the extremely popular provincial premier went on the warpath about Saskatchewan's natural resources being sold to foreign interests.

I don't disagree with that move (it'd be idiotic to sell off the rights to your own land's bounty), but China really doesn't like not being able to get what it wants. While it's not proven that the Chinese government was behind these attacks, my suspicion is that it was (Occam's razor). There's a well-known effort by China to influence the Canadian government and people, and it was brought up in the national media not too long ago.

-Laxitive

Comment What's the control group? (Score 1) 810

I'm assuming you're earnest about this... so on that premise, here's what'll happen:

You'll put together a set of measurements from this place. Then you'll try to interpret them with no reference point. You have no baseline measurements. Have you tested 20, or even a handful, of regular, non-haunted houses to establish a control you can compare against? Chances are you'll pick up SOME noise in SOME measurements that may or may not be construed as paranormal. Who knows.
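
To make the point concrete, here's a toy sketch (all numbers invented) of what a baseline actually buys you: without the control sample, the "haunted" reading below is just a number; with it, you can at least ask how unusual it is.

```python
import statistics

# Toy illustration, all numbers invented: EMF "spikes per night" measured
# in ordinary, non-haunted houses, versus one reading from the site.
control = [3, 5, 2, 4, 6, 3, 5, 4, 2, 4]
haunted_reading = 5

mean = statistics.mean(control)      # 3.8
stdev = statistics.stdev(control)
z = (haunted_reading - mean) / stdev

print(f"baseline {mean:.1f} +/- {stdev:.2f}, z-score {z:.2f}")
# A z-score this close to 0 means the reading is ordinary noise.
```

With no control sample there is no `mean` or `stdev` to compute, and any reading can be spun as anomalous.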

What are your predictions? Is there a set of particular things you'll be looking for? Can it be summed up as more-or-less "anything that seems weird in the measurements"?

I'm not trying to dissuade you from doing it. Just don't call it scientific and then do bad science. It could be a very cool movie project, and it could be a lot of fun, so it may well be worth your time. If it seems cool, then go for it, but please don't slap a "scientific" label onto it frivolously.

Submission + - Mathematics: The Most Misunderstood Subject (fordham.edu) 1

Lilith's Heart-shape writes: Dr. Robert H. Lewis, professor of mathematics at Fordham University in New York, offers in this essay a defense of mathematics as a liberal arts discipline, and not merely part of a STEM (science, technology, engineering, mathematics) curriculum. In the process, he discusses what's wrong with the manner in which mathematics is currently taught in K-12 schooling.

Comment Re:LiveSQL (Score 1) 78

I should have thought things out a bit better with the stddev example - and realized that it does indeed have a reasonable closed form. Good catch.
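
For the curious, a minimal sketch of that closed form (the class name and data here are just for illustration): by maintaining the count, sum, and sum of squares, a point mutation updates the stddev in O(1) instead of forcing a rescan.

```python
import math

class RunningStddev:
    """Incrementally maintained stddev via count, sum, and sum of squares.
    A point mutation (old -> new) updates in O(1) instead of a full rescan."""

    def __init__(self, values):
        self.n = len(values)
        self.s = sum(values)
        self.ss = sum(v * v for v in values)

    def replace(self, old, new):
        # only the contributions of the mutated value change
        self.s += new - old
        self.ss += new * new - old * old

    def stddev(self):
        # population stddev: sqrt(E[x^2] - E[x]^2)
        return math.sqrt(self.ss / self.n - (self.s / self.n) ** 2)

rs = RunningStddev([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(rs.stddev())        # 2.0 for this classic example
rs.replace(9.0, 5.0)      # mutate one entry without rescanning
print(rs.stddev())        # ~1.32
```

One caveat: this sum-of-squares form can lose precision when values cluster far from zero; Welford's algorithm is the numerically stabler variant.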

Complex data mining is hard everywhere, that's true. The problem is that even straightforward data mining is hard once dataset sizes reach into the hundreds of millions, billions, or trillions of records (implying absolute dataset sizes of terabytes or more). For Google it's webpages; for biology labs it's sequences.

The big killer is the cost of transferring data, which is what traditional data systems are built around: a remote host has some software set up, you send it some data, and it processes the data and returns the result to you. The distinction with Hadoop is that you keep the data on distributed hosts and send the code (which is typically a lot smaller).

The point stands that incremental update of queries on mutation is not a generally solvable problem: it'll still require the addition of new constructs and the limitation of existing constructs in SQL (e.g. ordering). Hadoop approaches the issue from the other end of the spectrum: focusing on a framework that models distributable algorithms directly using a small set of primitive operators (specifically, "map" and "reduce").
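
As a rough local sketch of those two primitives (nothing here is Hadoop's actual API, just the shape of the computation): map turns each record into key/value pairs, a shuffle groups them by key, and reduce folds each group.

```python
from collections import defaultdict

def map_phase(record):
    # one record in, many (key, value) pairs out
    for word in record.split():
        yield (word, 1)

def reduce_phase(key, values):
    # fold all values for one key into a single result
    return (key, sum(values))

records = ["the quick brown fox", "the lazy dog", "the fox"]

# shuffle: group intermediate pairs by key
groups = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        groups[key].append(value)

counts = dict(reduce_phase(k, vs) for k, vs in groups.items())
print(counts["the"], counts["fox"])  # 3 2
```

In the real system the map tasks run where the data lives, and the shuffle moves only the (small) intermediate pairs, not the raw input.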

-Laxitive

Comment Re:Am I the only one who finds Hadoop unusable? (Score 1) 78

In situations where you are using Hadoop, your "primary" data store should BE the HDFS store you are using to analyze it. That's a big part of the actual efficiency proposition of Hadoop.

The big trick with the "big data" approaches is to recognize that you keep _everything_ distributed, _all the time_. Your input dataset is not "copied into the system" for some particular analysis task, it _exists in the system_ from the time you acquire it, and the analysis results from it are kept distributed. It's only at specific points in time (exporting data to send to someone external, importing data into your infrastructure) that you should be messing around with copying stuff in and out of HDFS.

-Laxitive

Comment Re:LiveSQL (Score 4, Informative) 78

There are some serious technical challenges to overcome when you think about actually implementing something like this.

Take something like "select stddev(column) from table" - there's no way to get an incremental update on that expression given the original data state and a point mutation to one of the entries for the column. Any change cascades globally, and the aggregate is hard to recompute on the fly without scanning all the values again.

This issue is also present in queries using ordered results (as changes to a single value participating in the ordering would affect the global ordering of results for that query).

The issue that "Big Data" presents is really the need to run -global- data analysis on extremely large datasets, utilizing data parallelism to extract performance from a cluster of machines.

What you're suggesting (basically a functional reactive framework for querying volatile persistent data) would still involve a number of limitations over the SQL model: basically disallowing the use of any truly global algorithm across large datasets. Tools like Hadoop get around these limitations by taking the focus away from the data model (which is what SQL excels at) and putting it on providing an expressive framework for describing distributable computations (which SQL is not so great at).

-Laxitive

Comment Re:Over commit is great (Score 2, Informative) 4

Well, not really. It's the same as operating systems 'overcommitting' memory by giving each process a full virtual address space and filling it in on the go. Operating systems solve this problem by... well... using paging.

The paging approach works well for systems where you expect the in-memory working set to be tight. Mainly you'll see a graceful degradation in performance as you actually start hitting real memory limits and paging comes into effect.

Eventually, I think that can be resolved by taking a hybrid approach: wait until memory pressure builds and paging hits performance more than you'd like, then auto-migrate machines off the host as necessary. You get the best of both worlds: oversubscription when resource usage is low and performance is not affected, and on-demand resource allocation when resources are known to be needed.
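
A hypothetical sketch of that hybrid policy (nothing here is a real API: `paging_rate` and `migrate_one_vm_off` stand in for monitoring and migration hooks, and the 500 pages/sec threshold is invented):

```python
def balance_step(host, paging_rate, migrate_one_vm_off, limit=500.0):
    """One pass of the policy: oversubscribe freely, but migrate a VM off
    the host once paging pressure actually starts hurting performance."""
    if paging_rate(host) > limit:
        migrate_one_vm_off(host)   # relieve pressure by moving a VM away
        return True                # took action
    return False                   # pressure low: oversubscription is free

moved = []
balance_step("host-a", paging_rate=lambda h: 900.0,
             migrate_one_vm_off=moved.append)
print(moved)  # ['host-a']
```

The point of the design is that the expensive operation (migration) is only triggered by observed pressure, not by static reservations.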

-Laxitive

Submission + - Extreme Memory Oversubscription for VMs (gridcentriclabs.com) 4

Laxitive writes: Virtualization systems currently have a pretty easy time oversubscribing CPUs (running lots of VMs on a few CPUs), but have had a very hard time oversubscribing memory. GridCentric, a virtualization startup, just posted on their blog a video demoing the creation of 16 one-gigabyte desktop VMs (running X) on a computer with just 5 GB of RAM. The blog post includes a good explanation of how this is accomplished, along with a description of how it differs from the major approaches in use today (memory ballooning, VMware's page sharing, etc.). Their method is based on a combination of lightweight VM cloning (sort of like fork() for VMs) and on-demand paging. Seems like the 'other half' of resource oversubscription for VMs might finally be here.

Comment Re:new? (Score 2, Insightful) 278

It's not that 3d user interfaces have been fully explored, but that simulated 3d interfaces on 2d desktops have some fundamental limitations. We already have some amount of simulated pseudo-depth: windows can lie on top of other windows, etc.

The problem is that by the time you get around to interacting with something, you're interacting with a 2d euclidean plane which presents a projection of some 3d model. That doesn't make the plane 3d. You can't reach around and touch the "middle" of a 3d object projected onto a 2d plane. That's a problem. This might be somewhat ameliorated by true 3d interfaces (where the display itself is 3d), but that tech has yet to mature.
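
A tiny sketch of why the projection loses that "middle": under a pinhole-style perspective divide (a standard formula, not any particular toolkit's API), distinct 3d points collapse onto the same 2d coordinates, so depth is unreachable from the plane.

```python
def project(x, y, z, d=1.0):
    """Project a 3d point through the origin onto the plane z = d."""
    return (d * x / z, d * y / z)

near = project(1.0, 1.0, 2.0)   # (0.5, 0.5)
far = project(2.0, 2.0, 4.0)    # (0.5, 0.5) -- same pixel, different depth
print(near == far)              # True: the plane can't tell them apart
```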

If you think about it, even the way we work on our typical desk is mostly 2d, from a topological perspective. I have a pile of papers and some random crap lying around my desk. When I go to grab a document to work on, I don't just reach into the middle of a stack and pull out the right one. I don't have that capability. I need to go and start flipping pages, basically morphing my 2d topology to reveal some object hidden in 3d, and only then interact with it.

That's not to say that all 3d effects are useless. Simulated 3d is a great way of providing the visual cues we have been training ourselves on since we opened our eyes. That can be a very important aspect of an intuitive interface, but fundamentally it acts as a visual highlight. The goodness or badness of any particular 3d interface depends entirely on how effective the _2d_ projection is.

Finally, "true" 3d is actually too limiting. We are forced to live in a 3d world, but our computers give us access to many more, and weirder, dimensions than that. We can provide 2d projections of abstract non-fixed-dimensional objects, like n-ary trees (e.g. filesystems). An example of a projection of such an abstract object onto a 2d interface is Spotlight. It provides a 2d textbox which behaves in strange and weird ways: a 2d textbox that projects 2d manipulations (typing some characters) into an arbitrary traversal of the tree. Compare the utility of that to the utility of a "true" 3d-rendered filesystem. What value would that add? Sure, it would look neat, but what extra thing would you gain from it?
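
That textbox-as-tree-traversal idea can be sketched as a prefix search over a toy n-ary tree (the tree and function here are invented for illustration): each typed character narrows a traversal through a structure with no fixed dimensionality.

```python
tree = {
    "home": {"alice": {"notes.txt": {}, "news.txt": {}},
             "bob": {"music": {}}},
    "etc": {"hosts": {}},
}

def search(node, query, path=""):
    """Yield every path whose final component starts with the typed query."""
    for name, child in node.items():
        full = f"{path}/{name}"
        if name.startswith(query):
            yield full
        yield from search(child, query, full)

print(sorted(search(tree, "n")))
# ['/home/alice/news.txt', '/home/alice/notes.txt']
```

The widget stays resolutely 2d; the dimensionality lives in the structure being traversed, not in the rendering.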

There's nothing magic about 3d. Computers operate above and beyond the limitations of 3 dimensions, and are currently constrained to expose their behaviour through primarily 2d interfaces. Simulating 3d on top of 2d user interfaces, aside from the "visual cue" aspect, is kind of an arbitrary choice, and not necessarily the best one.

-Laxitive
