Mod parent up
Where's the mod points of a few days ago when I need 'em? Instead, I had 'em for 'Fools' Day!
> It's not
But "./" is a *ix, and therefore geeky, way of saying "right here, in this current directory", which by extension means "right here, on this current site."
That current site being "/.", using "./" to refer to it seems only fitting.
To be fair, the fact that I run privoxy between my browser and the web... and have since I switched to Linux the week eXPrivacy came out... and ran The Proxomitron in the same role on the MS side for several years before that... means I've been using proxies, at least for the www, for nearly as long as I've been browsing the www.
Add to that the fact that, while I don't tend to run Inet radio proxies or *casting servers personally, they're relatively well documented as options in the streaming clients I've run over the years... plus multiplexed ssh connections and the like, altho that's multiplexing for a rather different reason... and it's probably clear why proxy options are rather close to the top of the heap of solutions that spring to mind when I read about sharing streaming bandwidth.
As for being implemented, if the company's already rationing bandwidth, as it sounded like, all it takes is one person having the idea and running the icecast/shoutcast server, and they were already half-way there with the idea to share bandwidth.
But the illustration remained effective: that they immediately thought about sharing bandwidth, but didn't equally immediately realize they'd need, effectively, a proxy (or some special multicasting setup) to do it, nicely demonstrates how even rather highly intelligent people can fail to grasp even these relative basics outside their own field.
Says the geek who once failed to realize that stapling a message to a football wasn't a particularly good idea! Talk about not grasping the basics! =:^( To be fair, I was "impaired" at the time, by perhaps a half hour of sleep in 24, 3 in 48, and maybe 8 in 72. I like to think I'd have caught the folly in my reasoning had I been rested and thinking straight.
The implication of your post, at least as I read it, is that the suggested solution of all listening to the same station would NOT reduce bandwidth usage.
Well, if you're doing it right, and depending on where the bandwidth bottleneck is and the topology of the network, they were right: listening to the same internet radio station can indeed reduce bandwidth usage.
Assuming the congestion is on the internet link(s), setting up a single (or even two or three) icecast/shoutcast/whatever station proxies would work. You'd then have only that single (double/triple) Internet stream, with it in turn re-streaming to as many local machines as its LAN connection allowed. If a dozen or so folks end up connected to that local stream, that's an order of magnitude's worth of Internet link bandwidth saved....
If the bottleneck's on the LAN, you either have really big LAN bandwidth needs, or need to upgrade your LAN to something modern: switches for hubs, gigabit Ethernet for 10 megabit, 10-gig LAN trunks if the data needs are big enough, etc. Really large companies could replicate the above proxy solution locally, for each office, floor, etc.
Of course, multicast tech would help here as well, and I'd guess that's what the engineers had in mind, but just because multicast is relatively rare doesn't mean bandwidth savings from the idea are impossible; one just has to do the streaming proxy thing instead.
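To make the proxy idea concrete, here's a minimal sketch of what the relay stanza in an icecast2 icecast.xml might look like; the upstream host and mount names are hypothetical, and the rest of the config (ports, auth, limits) is assumed to already be in place:

    <relay>
        <!-- the single upstream Internet stream -->
        <server>radio.example.com</server>
        <port>8000</port>
        <mount>/stream.mp3</mount>
        <!-- what local LAN clients connect to -->
        <local-mount>/office.mp3</local-mount>
        <!-- only pull the upstream stream while someone's listening -->
        <on-demand>1</on-demand>
    </relay>

With on-demand set, the box pulls exactly one copy of the stream over the Internet link, and only while at least one LAN client is connected, no matter how many locals tune in.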
Something I discovered quite by accident, while troubleshooting extension issues here (still on 3.6), is that if I start in safe mode just long enough to pop up the safe-mode dialog, then cancel out of it (or kill it), so the browser proper never even starts, things "magically" work the next session. But the session after that, it's back to crashing.
So I rigged up a bash script that starts firefox in safe mode, sleeps a couple seconds, kills it, then starts firefox normally with the URL I intended to go to. I put the script first in my path so it gets run before the normal firefox binary, and everything works fine, beyond a couple seconds' additional delay and a short blip of the safe-mode dialog before it gets killed. (That would be seriously irritating if FF were my main browser, but I use kde and thus konqueror for that. Mainly, I end up starting FF on sites not yet set up properly for scripting in konqueror, since konqueror has site-level scripting permissions similar to noscript, but no way to see what domains a page is trying to pull scripts from without diving directly into the page source! =:^( So konqueror gets used on my normal sites.)
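For the curious, the script would look something like this; a reconstruction from the description above, assuming the real binary lives at /usr/bin/firefox and the script shadows it earlier in the PATH (the -safe-mode switch was the right one in the 3.6 era):

    #!/bin/bash
    # Pop up the safe-mode dialog just long enough to reset whatever
    # state causes the crashes, then kill it before the browser starts.
    # The real binary is invoked by full path, so this script doesn't
    # recursively call itself when it shadows "firefox" in PATH.
    /usr/bin/firefox -safe-mode &
    SAFE_PID=$!
    sleep 2
    kill $SAFE_PID 2>/dev/null
    # Now start the browser normally, passing along any URL given.
    exec /usr/bin/firefox "$@"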
Unfortunately, Download Helper wasn't freedomware the last time I looked. It was off my system within days of my realizing that; if I wanted to run servantware, I'd not have dumped a decade of MS experience to switch to Linux, now nearing a decade ago (when it became apparent what MS was doing with eXPrivacy, I spent some time preparing, then started my final switch to Linux the weekend eXPrivacy was released).
If I want to bother, one of the other available tools often does the job. If not, well, watching that video wasn't that important after all.
Well, there are the proprietary drivers, which AMD/ATI does what it wants with, dropping support for old chips, etc, and there are the native xorg/kernel/mesa/drm and now KMS drivers, which are open. The open drivers support chips at least as far back as the Mach64 and ATI Rage, and while I never used those specific drivers, after I realized what a bad idea the servantware drivers were, based on the nVidia card I had when I first switched to Linux, I've stuck with the native Radeon drivers. In fact, I was still using a Radeon 9200 (r2xx chip series) until about 14 months ago, when I upgraded to a Radeon hd4650 (r7xx chip series), so I've seen first-hand how long the open drivers keep old hardware going.
And why would the free/libre and open source (FLOSS) folks build a shim for the servantware driver? The kernel specifically does NOT maintain a stable internal ABI (the external/userland interface is a different story; they go to great lengths to keep that stable), and if anyone's building proprietary drivers on it, it's up to them to maintain their shim between the open and closed stuff as necessary. Rather, the FLOSS folks maintain their native FLOSS drivers.
And while for the leading edge it's arguable that the servantware drivers perform better, and for some months they may in fact be the only choice, by the time ATI's dropping driver support, the freedomware drivers tend to be quite stable and mature (altho there was a gap in the r3xx-r5xx time frame after ATI quit cooperating, before AMD bought them and started cooperating with the FLOSS folks again, part of the reason I stuck with the r2xx series so long, but those series are well covered now).
So this
> The only thing NAT gives you that a default policy of REJECT or DROP doesn't is extra latency and higher CPU load on the firewall.
Not exactly. Someone above argued, quite persuasively I might add, that had IPv6 been the norm before we got broadband, the differences in firewalls and how they are configured would have had support people simply saying "shut off the firewall", and people would, and it'd work, and they'd never turn it back on. With NAPT OTOH, once there was more than one computer behind it, shutting off the NAPT really wasn't an option at the consumer level, so application writers and their support people had to learn to deal... and that's just what they did! Meanwhile, NAPT got smarter as well, with various auto-configuration protocols like UPnP, etc. None of that would have happened if it was as simple as telling the user to shut off their firewall, and magically, everything worked.
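For reference, the "default policy of REJECT or DROP" the parent mentions amounts to just a few rules on a Linux-based IPv6 router. A minimal sketch, assuming a hypothetical LAN-side interface named lan0:

    # Drop unsolicited inbound forwarded traffic by default,
    # which is the same protection people get from NAPT.
    ip6tables -P FORWARD DROP
    # Allow replies to connections the inside machines initiated.
    ip6tables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
    # Allow anything originating on the LAN side to go out.
    ip6tables -A FORWARD -i lan0 -j ACCEPT

The difference is that, unlike NAPT, this really can simply be switched off, which is exactly the "just shut off the firewall" support culture the argument above is about.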
See above for more. That post is highly scored, and he said it better than I could.
> NAT also makes it harder to figure out who the bad guy is if one of the internal machines attacks a remote machine (for example, because it got a virus, or some employee is running something they shouldn't be).
Actually, that's a good part of the point. Behind the NAPT is a private network, for those that run it to manage. It's nobody's business out front what the network behind the NAPT looks like, nor is it their business to trace the bad guy beyond the NAPT. They trace it as far as the NAPT's public address; from there, it's up to whoever runs the private network behind it.
Good points. That's beyond what I (think I) know, but it's certainly interesting. I'd love to see Stallman's take on it, or [the name fails me; he was the FSF's lead lawyer for a while, tho I think he's stepped down from that; I see his picture in my mind, but I'm about ready to go to bed and don't want to go looking for the name...].
As in, put the sources on the same disk as the binaries? =:^)
To the best of my knowledge, no, no "force" is necessary. However, the sources need to be not just available at the same time, but accompanied by a reasonably prominent notice saying they are. IOW, if the LiveCDs are out for the taking, there needs to be either another stack of source CDs, or at least a sign beside the LiveCDs saying ask for a source disk and we'll burn one. It can't be simply up to the passer-by to think to ask, without some reasonably sized sign saying to ask if you want a sources disk too, or it falls back to the notice on the binaries disk (which you'd better be sure you included), thus triggering the 3-year clock.
Similarly, if there's a nice, prominently placed link on a web site to download the LiveCD, the link to the sources can't be hidden somewhere in the terms-and-conditions links at the bottom, or it falls back to the provide-on-request-for-at-least-three-years clause. Neither does a link to the sources repository in general cover it, AFAIK. It has to be a link to the specific sources used to create the binaries on the distribution media, and it needs to be reasonably visible and logically attached to the link to the binaries. The links to the sources should be placed logically similarly to the links to the LiveCDs: say, the next item in the menu, or once you click the LiveCD link, it takes you to a landing page that lists the sources as one choice among all the supported archs, in either a list of links or a drop-down menu.
Of course, when browsing the FTP or HTTP file server itself, none of that is usually a problem (as long as the sources are available at all), because at that level it's all a bare listing of directories and files anyway, and a logical directory tree makes it easy enough to find the sources. But besides being a mark of good community relations, this clause is behind the big distributions' policies of making srpms, etc, as available as the rpms that far more people use. And the whole policy, while it might seem niggling in its requirements at times, is one of the big reasons the Linux community is as active and vitally healthy as it is. Take away that access to the sources, or even simply make them harder to get (by encouraging the mail-a-request model, now discouraged by that three-year clause), and you quickly choke the vitality that is the ever-living, growing, changing Linux community.
> So if I sell someone a box with a Linux distribution installed on it, do I need to print out all of that distribution's source code and ship it with the computer as well?
You don't need to print it out. In fact, that would be discouraged, and may not meet the requirement of being in a customary format (too lazy to go look up the specific GPL wording ATM, but today, electronic format would be encouraged and dead-tree format discouraged, as it has to be converted back to electronic format for use). You do, however, need to make the sources available -- and no, pointing at upstream doesn't suffice, except "in the trivial case". (Again, I'm not going to go look up the details, but the idea is that individual users can share, say, Ubuntu and point to Ubuntu for sources, but commercial distributors and the like must make them available themselves.)
With GPLv2 (which the Linux kernel uses), you have two choices. If you provide sources at the time of purchase, say by throwing in a CD/DVD with the sources on it, you're covered, and don't have to worry about it when you quit distributing the binaries. Similarly with a download: if you provide the binaries and an archive with the sources at the same time, you're fine. Likewise if you distribute disks at a FLOSS convention or the like: have a stack of disks with the binaries and another with the sources (or a computer with a burner set up to burn a disk of the sources on demand, assuming there'll be less demand for those than for the binaries).
Alternative two is to include a written offer, valid for at least three years, to provide the sources to anyone who asks.
It is the three-year requirement in this choice that sometimes ensnares distributions that are otherwise playing by the rules, as many don't realize that if they only include the offer for sources, it must remain valid for three years after they've stopped distributing the binaries. Take a distribution that may be distributing historical versions of its LiveCD, for instance, from say 2003. As long as they are still distributing that LiveCD, AND FOR THREE YEARS AFTER, in order to comply with the GPLv2, they must make the sources used to compile the binaries on that CD available as well. So if they're still offering that historical 2003 LiveCD in August 2009, they'd better still have the sources used to create it available through August 2012, or they're in violation of the GPLv2 that at least the Linux kernel shipped on that LiveCD is licensed under. It's easy enough to forget that part and thus be in violation, when they can't provide sources for binaries that were current way back in 2003, but that they may well still be making available "for historical interest" on that 2003 LiveCD.
Once they become aware of this catch, many distributions make it policy to ensure the actual sources are available at the time of distribution along with the binaries, instead of simply providing the notice that people can request them later, so they don't have to worry about that 3-year thing.
Of course, if an organization is on top of things, and sets it up so they're tarballing all the sources for a particular shipped version and indexing them by product release and/or model number (and perhaps serial number as well, if they do in-line firmware updates), then the cost and hassle of archiving those sources for compliance on a product originally shipped in 2003 (still available in 2009, so sources must remain available in 2012 under the notice provision) will be fairly trivial. But they have to have thought that far ahead, and be actually archiving things in a manner that lets them retrieve the sources as used for a specific shipped product, not merely the latest updated version thereof.
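Such a release hook needn't be fancy. A minimal sketch, with hypothetical paths and naming, run once per shipped release:

    #!/bin/bash
    # Archive the exact source tree a shipped release was built from,
    # indexed by release name, so it can be produced for GPL
    # compliance years later.
    RELEASE="$1"      # e.g. "model1234-fw-2.0.1"
    SRCDIR="$2"       # the tree the release was actually built from
    ARCHIVE="/srv/gpl-archive/${RELEASE}-src.tar.gz"
    tar -czf "$ARCHIVE" -C "$SRCDIR" .
    # Record a checksum so the archive can be verified later.
    sha256sum "$ARCHIVE" >> /srv/gpl-archive/MANIFEST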
> If I make software that runs on a Linux distribution, and set Linux to run that software at boot-up, does that mean I'm really altering Linux itself?
It depends. If all you did was alter an initscript, then ordinarily no, tho if you're shipping even unmodified GPL binaries (including the Linux kernel), you likely need to ship sources for them anyway. If you modified the Linux kernel sources directly, or if your software is a Linux kernel module, therefore running in kernel space not userspace, then that's kernel modification and you'd be covered under kernel modification rules.
Do note that the Linux kernel is a special case in a number of ways, however. First, the Linux kernel license specifically disclaims any claim on derivation for anything running exclusively in userspace -- using only kernel/userspace interfaces. The userspace/kernelspace barrier is declared a legally valid derivation barrier, so if you stick to userspace, you are defined as not being derived from the kernel, and the kernel's GPL doesn't apply. Second, the global kernel license is specifically GPLv2 only, so GPLv3 doesn't even enter the equation as far as it's concerned.
Of course, if your software runs on Linux in userspace, it's very likely linked against some very common libraries, including glibc. Glibc and most if not all of the other common core libraries are LGPL, not GPL, so unless you modify their code, you're probably safe there. But that may or may not apply to other libraries you've linked to.
All that said, assuming your software isn't linking to anything it shouldn't if it doesn't want to be GPL-encumbered, simply setting up an initscript or the like to invoke it on boot, or even setting it up (using a kernel command-line parameter) to load in place of init, isn't by itself likely to trigger the modification or derivation clauses for anything GPL that might be shipped together with it.
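For the in-place-of-init case, that kernel command-line parameter is the standard init= option; what the bootloader passes would look something like this (paths hypothetical):

    # Appended to the kernel command line by the bootloader:
    root=/dev/sda1 ro init=/opt/myapp/bin/myapp

The app then runs as PID 1, still entirely in userspace, so the kernel's userspace derivation disclaimer discussed above applies just the same.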
But, for commercial Linux-based hardware or software products, that still doesn't get one away from having to make sources available for any GPLed components distributed. The one exception to that, that I know of, is if you burn it into ROM, such that neither you nor the customer can upgrade it without physically replacing that chip; THEN you can ship GPL-based hardware without that obligation.
That was
What has surprised me is that nobody is congratulating these guys on being great capitalists! They've found a very clever way to separate people from their money, while providing at least something of value in return.
Here's what I see happening, and what it seems most have missed here. They're doing a several-day summer camp -- only they're selling it as MORE than that -- and, if their advice isn't taken to extremes, it won't do any harm (the kids are having a good time at camp, according to TFA), and it may even do some good.
Think of it this way. Yes, there's not much positive predictive value in those genetic tests... yet. However, what they're really selling is several days of close observation of each kid.
But more or less any camp in a one-child, thus very competitive, society can do that, if they aren't already, with a bit more training and observation. If one camp's able to charge extra for it, pretty soon they'll all be doing it, and the one won't have its formerly unique hook any longer.
So, they throw in the genetic testing as well. Probably they screen for a few simple things, Down syndrome, etc, even tho if something was really wrong it would likely be obvious from the behavior. But this way, if something shows up, they can throw that in too. Otherwise, not so much. But they're really trading on the observation.
So anyway, at the end of the five-ish days, they have a fair idea what the kid is interested in, and probably emphasize that. Then they throw the genetic stuff in for good measure, but very possibly fill in a lot more than is actually there, much like a crystal ball or palm reader, by simply being observant and basing their "genetics report" on that.
So the kids have a great time at camp. The camp staff observe them and tell the parents what the kids enjoy. The parents are happy to pay for the camp and service and the kids are happy with it too, and the camp has its hook to set it apart from all the other camps out there and make some extra money in the process!
Then, as you said, the Pygmalion effect kicks in, and since the parents were told the kids are good at what they enjoy in the first place, the kids are encouraged to do what they already enjoy, and whaddaya know, a few years down the road, the camp has a name for itself for predicting so well! =:^)
Of course, at 3 years old there's probably a limit to how precise they can get in truth, but they may be able to give some guidance, and I'm sure they can convince parents who already want to believe that there's a good reason to bring the youngest back in a few years, perhaps getting them three times in the nine years (ages 3-12) that are covered, or even every year, for fine-tuning as they grow!
So yes, it may well end up "working", in a self-fulfilling-prophecy sort of way.
So at least given what's in the summary and in the article, I see no reason why this has to be bad at all -- in fact, it could have pretty good results. Of course, there's no way we have enough detail to know for sure either way, but it's certainly not necessarily the doom and gloom that so many folks here seem to think it is, at least not based on the facts as presented in either the summary or TFA itself.
> http://www.copyright.gov/title17/92chap1.html#117
Thanks. You're right. It's pretty straightforward (well, for legalese), now.
I wasn't aware of that, but it's good to read! =:^)
You wouldn't happen to have a reputable link for it around, would you? I'm sure wikipedia does, but if you happen to have one handy...
Good point, and I agree it was a bit confusing, but I had in mind that use requires copying -- and note that at least one court in the US has held that even the copy from storage to memory in order to run is covered by copyright, so running directly from a purchased CD without any copying to the hard drive or other media is not necessarily a way out either (well, unless you're running some sort of XiP, execute-in-place, technology that doesn't even copy it to memory, as happens with some flash-based embedded systems, for instance).
Now, at least in the US, there's the right of first sale, aka the exhaustion rule, but even there, the original user must have been granted permission to possess and use a copy before they can transfer that permission to someone else. Thus, what I was really thinking when I said "use" was that until purchase or some other grant of permission, copyright forbids use, especially since that use must involve the procurement of a copy in some way or another.
Once there is legal permission to possess a copy -- purchased, downloaded, obtained by mass mailing as with AOL disks, whatever -- you are correct, actual usage of that copy isn't restricted save for making another copy (but see above, where some judge ruled that simply copying it into computer memory in order to run or play it creates a copy; OTOH, that would be an implied permission, provided you've legally obtained the work in the first place... and in the US, the implied permission applies to copies made for backup too, I believe). But the author's/owner's permission must be obtained in order to legally get the copy in the first place, and that involves copyright -- that was my point.
But I really should have been more careful in stating that point. Regardless of what I had in mind, your calling me on what I actually wrote was certainly valid given that it didn't actually express what I had in mind. So thanks for bringing up the point. I'm glad someone's on the lookout for such omissions and doesn't hesitate to point them out! =:^)