
Comment Re:Wrong Problem (Score 1) 39

It was not coding, but if an AI cannot understand a simple request to extract one piece of information from a text document, then I can guarantee it is going to have problems understanding the far more complex instructions needed to construct the code that does the analysis to extract that information in the first place.

It may be that those limits have expanded in the past few months, although that example is only a month old. However, I'd be surprised if my original point does not still hold, i.e. that the problem with AI coding is still going to be putting together instructions that are precise enough, and that the AI can understand, to get it to produce the code that you need. This will be especially true for code doing something less common - like scientific analysis - for which it has either very little training data or training data of extremely poor quality (looking at my own and fellow physicists' code!).

Comment Give it a Choice: Telecom or Media (Score 2) 74

YouTube needs to be regulated as a telecom provider.

It should be given the choice. Either it gets regulated like a telecom provider, in which case it cannot discriminate against content unless that content is illegal and, in return, cannot be prosecuted for material on its servers; OR it gets regulated like regular media, in which case it has editorial control over the content it serves but is then legally liable for that content.

Comment Re: Would anyone have noticed? (Score 0) 59

I own a tiny indie studio in Chicagoland and my peers own some of the huge studios in Chicagoland.

Cinespace is dead right now. It has ONE show active. The other studios are so dead that they're secretly hosting bar mitzvahs and pickleball tournaments for $1500 a day just to pay property taxes.

My studio is surprisingly busy but I'm cheap and cater to non-union folks with otherwise full time jobs.

Comment Re:Wrong Problem (Score 1) 39

You have to get a good grasp of what the AI is appropriately capable of doing and prompt it in incremental steps.

It depends on what you are doing and how specific you need your code to be. The more precision you need, the harder it is, in many cases, to get the AI to do what you are asking. For example, when I asked ChatGPT what the statistical significance of a specific scientific result was, it said "very significant". When I told it to be more precise, it responded that the result had "high statistical significance". I then asked it to specify the number of sigma, and it responded "more than 5 sigma". So I asked it for the exact number of sigma to one decimal place (the paper gave it as 5.9 sigma). The AI's response? "More than 5.0 sigma". I never got it to tell me the correct answer of 5.9 sigma.
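For context on what those sigma values mean: by the usual convention, an N-sigma result corresponds to a one-tailed Gaussian tail probability. A minimal sketch of that conversion (the 5.9 sigma figure is the one from the paper discussed above; the function name is mine):

```python
import math

def sigma_to_p(n_sigma: float) -> float:
    """One-tailed Gaussian p-value for an n-sigma result."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# 5 sigma (the conventional discovery threshold) vs the paper's 5.9 sigma:
print(f"5.0 sigma -> p ~ {sigma_to_p(5.0):.1e}")
print(f"5.9 sigma -> p ~ {sigma_to_p(5.9):.1e}")
```

The gap matters: 5.0 sigma corresponds to a chance probability of roughly 3 in 10 million, while 5.9 sigma is over a hundred times smaller, which is exactly the distinction "more than 5.0 sigma" fails to convey.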

If I have to fight with a coder-AI like that to get it to do what I want, forget it - it will be vastly less frustrating just to write the code myself, and traditional coding is fun even if it may be slower overall. Who wants to fight an AI with excellent training but the personality of a pedantic 2-year-old to get the job done?

Comment Wrong Problem (Score 2) 39

Paying for the compute time and licensing the code are not going to be the problems that stop this; it will be programming the AI and then debugging the output. Generating the correct prompt for an AI is going to be much, much harder than people think. Sure, it is easy to form a picture in your head of what you want an app to do but, at least in my experience, the most time-consuming part of programming is often deciding _exactly_ what you want your code to do, i.e. converting the vague, floppy idea in your head into a concrete set of instructions where all ambiguity is removed and everything is decided.

Having an AI involved might mean that a developer does not need to know the programming details (although I suspect they will, since AI code is far from bug-free), but they will still need to describe the program with complete precision, e.g. what input parameters are needed, what the controls for each should look like, where they should be placed, etc. If you do not, then not only will the AI make a complete mess of the interface but, every time you tweak the prompt to fix an issue, the interface will likely change completely unless it is clearly specified.

Now multiply that by all the small decisions that developers make when coding an app and, while AI may help reduce the burden, I doubt it is going to massively reduce it. You also have to contend with the fact that, unlike with a traditional programming language, one slight change to the prompt can radically shift how the AI codes things, meaning that a minor change in one area could radically change another area unless the prompt has everything tied down precisely. So I think that developers are absolutely going to be needed but, if this works well, it may make them more productive as well as change their skillset somewhat.

Comment That's quite a blink! (Score 2) 50

300 million years is pretty damn near there to the singularity. In cosmological terms, that's not even a blink of the eye.

300 million years is over 2% of the lifespan of the universe. If we take the average human lifespan to be about 80 years, that would be a "blink of the eye" lasting about 1.7 years, which is quite a long blink.
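The scaling above is easy to check; a quick sketch, taking the age of the universe to be roughly 13.8 billion years (an assumption, since the comment doesn't state the figure it used):

```python
# Scale 300 million years against the age of the universe,
# then map that same fraction onto an 80-year human lifespan.
AGE_OF_UNIVERSE_YEARS = 13.8e9   # assumed value, close to current estimates
SPAN_YEARS = 300e6
HUMAN_LIFESPAN_YEARS = 80

fraction = SPAN_YEARS / AGE_OF_UNIVERSE_YEARS    # just over 2%
blink_years = fraction * HUMAN_LIFESPAN_YEARS    # the "blink", scaled to a lifetime

print(f"{fraction:.1%} of the universe's lifespan")   # → 2.2%
print(f"equivalent 'blink': {blink_years:.1f} years")  # → 1.7 years
```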

Comment Supernovae Already Happened (Score 1) 50

Not if there has already been time for supernovae to create heavier elements, like lead, and disperse them, which the results indicate is the case here. Stars large enough to undergo supernovae have relatively short lives and, as the summary notes, the presence of heavier elements suggests that this galaxy was around for long enough for supernovae to have already happened, meaning that there are even younger galaxies yet to be found.

Comment Worse than Bananas, Not the Same Animal (Score 2) 72

Take bananas: ~99% of global exports are Cavendish, a single strain. No genetic variation...

Strains do have some, albeit very limited, genetic variation. Clones are worse because they have absolutely no genetic variation - assuming no errors in the cloning process - making them even more susceptible to single diseases.

Worse, I see absolutely zero point in cloning pets. If we cloned our dog, the new clone would lack any of the memories and knowledge of our current dog and, since personality and behaviour are a combination of both nature and nurture, the clone might end up being similar but would never be the same as our current pet, so in no way can any pet live forever. Identical twins may have similar, but different, personalities and we do not regard them as two copies of the same person!

Comment HUGE FOR HUMAN DEV EMPLOYMENT (Score 1) 84

This is important because it shows that hiring a human software development engineer is going to be required for the work to be considered a business asset, or to gain some kind of protection from liability, because a non-human won't have the same rights and therefore cannot sign off on transferring those rights to a company.

It also means that code written by AI carries liability that code written by humans does not, because code written by humans is free speech but code written by AI is not.

Clearly separating human freedom of speech from AI software output also lets us differentiate between the human who can legally be hired to do the work and the robot the company bought to maintain profits by not hiring a human, with the robot being considered a corporate liability; and it shows that hiring humans to code gives your company rights you don't otherwise get from an AI, because an AI is not creating free speech.

It also carves out a dedicated place for professional tech people in a world where you can buy a version of C-3PO or JARVIS to code for you.

Comment Re:Yes, but no.. (Score 4, Insightful) 115

It's not clear to me that the people making the hiring/firing decisions, and deciding how many programmers can be replaced by AI, know the difference between the copy-paste coders you're talking about and the people who are doing the harder things.

And, given the way they think, and given the fact that all of us are subject to a whole host of cognitive biases, some places at least are likely to want to keep the cheap copy-paste types rather than the more expensive senior programmers.

Short term, things will look good. Quarterly reports will be up. It will take longer for companies to realize that they've made a mistake and everything is going to shit, but because of the emphasis on quarterly returns, plus the fact that all of these companies are caught up in the groupthink bandwagon of the AI evangelists, a lot of them as institutions may not be able to properly diagnose why things went to shit. (Even if individuals within the institutions do.)

I'm in science (astronomy) myself, and the push here is not quite as overwhelming as it is in the private sector. Still, I've seen people who should know better say "an AI can just do that more efficiently".
