Comment Re: I beg to differ (Score 2) 67

Exactly. But I find it's often faster to review the code the AI generates, tell it what to fix, and check again than to write everything myself. I also pick up a fair number of new styles and techniques as I review the code, while wondering things like 'Why the hell is it doing that?', spending the time to research whether it's safe, and finding out that it is because of the latest language standards.
I've really only used it for writing test code. I don't like the idea of the same developer writing both the main code and the test code (often both would be me), so I like having the AI write the tests: it's not as good as a fresh pair of eyes, but when you're working solo it's better than nothing. And I don't work solo by choice; the company is too cheap to hire enough people.
I have tried to use AI to write code for personal projects, like plotting the positions of the moons in the sky of a fictional planet, and found that it completely FUBARed the code; I had to throw it out and write it myself from the ground up. And let me tell you, celestial geometry can be a bitch if you haven't dealt with it in a long time.

Comment Re:I beg to differ (Score 4, Interesting) 67

Not only that. The C-level execs think AI will help them save on costs and have been holding off on hiring more developers because AI is just around the corner. If that's not affecting the industry, I don't know what is. It's holding wages down and drying up the job market. With market consolidation and the monopolistic companies out there (Google, Facebook, Microsoft, Amazon, etc.), they aren't innovating as much as they could. And they have a good habit of buying up any innovation from smaller companies and then botching its rollout and deployment.

And you're right about the junior and mid-level developers being let go. While they can use the same AIs as senior developers, they don't have the experience to know when the AI is wrong. They also have a hard time decomposing the problem space down to something they can then hand to the AI. I've found myself more productive with AI, while other people say it's shit. But when I tell them how I use it and they try it, they realize that AI is not as bad as they thought it was.

With the current AI, you can't tell it 'build me an application to do X' and expect a good result. But if you tell it 'build me a function that does A, B, and C and takes inputs X, Y, and Z', you can get fairly good results, enough that you can build on it and do in a couple of hours a job that would otherwise take a week.
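
For example (a hypothetical prompt and function of my own, not something any particular AI actually produced for me): "write me a C function that clamps a raw ADC reading to a given range and scales it to 0.0-1.0" is the kind of narrowly scoped request that comes back small enough to review line by line:

    #include <stdio.h>

    /* Hypothetical example of the kind of narrowly scoped function you can
     * ask an AI for: clamp a raw ADC reading into [lo, hi] and scale it to
     * 0.0 .. 1.0.  Assumes hi > lo. */
    static double clamp_and_scale(int raw, int lo, int hi)
    {
        if (raw < lo)
            raw = lo;
        if (raw > hi)
            raw = hi;
        return (double)(raw - lo) / (double)(hi - lo);
    }

    int main(void)
    {
        printf("%.3f\n", clamp_and_scale(512, 0, 1023));  /* ~0.500 */
        printf("%.3f\n", clamp_and_scale(2000, 0, 1023)); /* clamped to 1.000 */
        return 0;
    }

You review it, tell it what to fix, and move on to the next piece.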

A junior developer doesn't have the experience to know how to do this, because they haven't had to deal with junior developers themselves. Senior developers who've had to work with junior developers and help them grow know how to work with AIs, because it's almost the same thing, except the AIs don't argue with you and produce code much faster. Of course, not having someone argue with you and tell you that your shit does smell is a problem in itself. I am worried about the industry as a whole when I retire and there are no junior developers left in the experience pipeline to take my place.
 

Comment Re:To test your reading comprehension - you failed (Score 5, Informative) 99

It is gravity. If you look at the entire energy chain:

- The Earth stored potential energy by moving the lime up to the top of the mountain, most likely through tectonic activity. The potential energy stored is E = mgh, g being the acceleration due to gravity.
- The dump truck converts that potential energy into kinetic energy by going downhill.
- The kinetic energy is converted into electric energy (and heat; there's always heat and other waste energy) by the regenerative brakes.
- On the way back up, the electric energy is converted back into kinetic energy and some potential energy.

I wouldn't say that the dump truck generated energy; it just harvested the potential energy that the Earth stored in that block of lime. The exciting thing is that it's a new way to convert potential energy into electric energy. What would be even better is if they could dump the excess into the electrical grid at the top of the mountain.
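
As a rough back-of-the-envelope sketch of that chain (the mass, drop, and efficiency below are my own assumptions, not numbers from the article):

    #include <stdio.h>

    int main(void)
    {
        /* Assumed round numbers, not figures from the article. */
        const double mass_kg    = 65000.0;  /* payload of lime                    */
        const double g          = 9.81;     /* acceleration due to gravity, m/s^2 */
        const double drop_m     = 800.0;    /* elevation drop                     */
        const double efficiency = 0.70;     /* brakes-to-battery losses           */

        double potential_j   = mass_kg * g * drop_m;    /* E = mgh          */
        double harvested_j   = potential_j * efficiency;
        double harvested_kwh = harvested_j / 3.6e6;     /* 1 kWh = 3.6e6 J  */

        printf("Potential energy: %.0f MJ\n", potential_j / 1e6);
        printf("Harvested:        about %.0f kWh per trip\n", harvested_kwh);
        return 0;
    }

Every joule of that came out of the lime's height, not out of the truck.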

Comment I have to take issue with the summary (Score 2) 24

>least a quintillion floating point computations ("flops") per second, where a flop equals two 15-digit numbers multiplied together

1st: FLOPS stands for FLoating point Operations Per Second, not FLoating point cOmPutationS.

2nd: A flop is not two 15-digit numbers multiplied together. If that were true, my 32-bit MCU without an FPU would have a better FLOPS rating than it does: it can run 123456789012345 x 123456789012345 in 64-bit integer software on the 32-bit ALU much faster than it can run 1.23456789012345 x 123.456789012345 in floating-point software.
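
For reference, FLOPS is measured by tallying floating point operations (adds and multiplies alike) and dividing by elapsed time; a dot product of length N is 2N flops regardless of how many digits the operands have. A crude sketch of that kind of measurement (the vector length and the timing method are just for illustration):

    #include <stdio.h>
    #include <time.h>

    #define N 1000000

    static double a[N], b[N];

    int main(void)
    {
        for (int i = 0; i < N; i++) {       /* arbitrary test data */
            a[i] = 1.0 + i * 1e-6;
            b[i] = 2.0 - i * 1e-6;
        }

        clock_t start = clock();
        double sum = 0.0;
        for (int i = 0; i < N; i++)
            sum += a[i] * b[i];             /* 1 multiply + 1 add = 2 flops */
        clock_t end = clock();

        double seconds = (double)(end - start) / CLOCKS_PER_SEC;
        if (seconds <= 0.0)                 /* crude timer; guard the divide */
            seconds = 1.0 / CLOCKS_PER_SEC;

        printf("sum = %f, roughly %.3g FLOPS\n", sum, (2.0 * N) / seconds);
        return 0;
    }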

Comment Re:Which? (Score 4, Insightful) 93

That isn't really a detailed test methodology.

How close was the phone to the base station when they tested?
Was there a direct line of sight to the base station antenna? Were there buildings in the way? What was the multipath environment like?
How many other phones were operating in the area? What was the noise floor like?
They say 'carrier' network. Which carrier network? Was it tested over the air, or in the carrier's test lab? Qualcomm has its own carrier network; does Apple? What were the settings on the carrier network?
Was the phone stationary?

There is a lot of information missing there for anyone to be able to reproduce the test, and Which? could have been in a vastly different environment, one that isn't as forgiving to battery life as the one Apple has detailed.

Comment Test Setup Will Definitely Affect Things (Score 4, Interesting) 93

A modern phone's transmit power is controlled by the reception at the base station: if the base station tells the phone to increase power it will, and if it tells it to decrease power it will.

Apple phones don't have an external antenna connector, so Which? probably had to hook their test base station up to an antenna, or they tested against a real network.

Apple probably tested their phone in an RF chamber with minimal noise, and they might have hooked the base station directly up to the antenna port on the board. They might not have had any attenuators between the test BTS and the phone either, so the phone was probably transmitting at minimum power. That biases the test in favor of long battery times, much like how car companies bias their MPG tests for higher numbers.

This test is about talk time, but they can also bias the test for standby time. I'm not familiar with the latest phone protocols, but in the past you configured the base station to tell the phone how often to turn on its radio to look for a paging message. Normally this is about 1.2 seconds so that a call can be connected quickly, but it can be set much, much longer, allowing the phone to stay in sleep mode for longer stretches and giving a better battery life measurement.
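
A crude duty-cycle sketch of why that matters (every current figure below is an invented round number, not a measurement of any real phone):

    #include <stdio.h>

    /* Average current for a phone that sleeps between paging occasions.
     * The currents and wake time are invented round numbers, just to show
     * how the paging interval dominates the standby figure. */
    static double avg_current_ma(double paging_interval_s)
    {
        const double sleep_ma = 2.0;    /* assumed deep-sleep current      */
        const double rx_ma    = 80.0;   /* assumed current with radio on   */
        const double wake_s   = 0.05;   /* assumed time awake per occasion */

        double duty = wake_s / paging_interval_s;
        return sleep_ma * (1.0 - duty) + rx_ma * duty;
    }

    int main(void)
    {
        const double battery_mah = 3000.0;   /* assumed battery capacity */

        double fast = avg_current_ma(1.28);  /* typical paging cycle     */
        double slow = avg_current_ma(10.24); /* stretched for the test   */

        printf("1.28 s cycle:  %.2f mA avg, ~%.0f h standby\n", fast, battery_mah / fast);
        printf("10.24 s cycle: %.2f mA avg, ~%.0f h standby\n", slow, battery_mah / slow);
        return 0;
    }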
   

Comment Re:Alternative explanation (Score 1) 149

Could it also be that the data is from Nvidia, gathered anonymously through their GeForce Experience software, and is completely false?

It's strange that a company that makes money selling high-end video cards gets a result saying 'buy more high-end video cards if you want to win'.

I can't trust data provided by such a vested party when its conclusion carries a high monetary reward for them.

Comment Re:Bloomberg (Score 1) 356

And the way they were supposedly compromised is a very stupid one.

A low-pin-count chip connected at the Ethernet port, or wherever the PHY is. By that point the data should already be encrypted and secured; in a secure facility, even communications inside a rack are usually encrypted. Besides, if they wanted to get unsecured data off the network it would be better to just compromise the switch. That way they get what they need from multiple sources, and they compromise the very thing that would be used to detect the information drain.

If they wanted to get at data that isn't secured, they'd have to tap something on the data bus. I think data buses are around 256 bits in most servers. Add in 40-64 bits for the address lines and you have over 300 pins on the chip, and then you need power, grounds, and the pins to send the data out, which means talking to the PHY. I suppose the chip could send Ethernet out directly, requiring only 4 pins, but then it would have to be tolerant of the line voltages, which means a larger silicon process and more gap between the pins. Most likely it would talk to the PHY through SMII (or whatever the gigabit interface is; I'm more familiar with 100 Mbit interfaces at the hardware level), which is another 20 pins or so. It would probably also need an external oscillator... So I don't think you're going to find a chip that can monitor data in a server with fewer than 400 pins.
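
Tallying that up with the same ballpark figures (my estimates from above, not a real design):

    #include <stdio.h>

    int main(void)
    {
        /* Rough pin budget for a hypothetical bus-snooping implant,
         * using the ballpark figures above -- estimates, not a design. */
        int data_bus  = 256;   /* data lines                          */
        int address   = 64;    /* address lines (upper estimate)      */
        int phy_iface = 20;    /* interface to the PHY                */
        int power_gnd = 60;    /* supplies and grounds, ballpark      */
        int misc      = 10;    /* oscillator, reset, strap pins       */

        printf("Estimated pin count: %d\n",
               data_bus + address + phy_iface + power_gnd + misc);  /* ~410 */
        return 0;
    }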

Even with a BGA package, this is not a 'small' chip. And then it has to have internal RAM/ROM and the processing power to figure out what information it has found and send it. There's no way it's going to send all of it; that would take too much time and make it too detectable.

I'm not saying it can't be done, or that the Supermicro servers can't be compromised. I just don't believe they can be compromised in the way Bloomberg claims they were. Hollywood magic doesn't work; you can't just add in a 'chip' and compromise stuff. You have to add it where it can be effective.

Comment Re:It was nice knowing you (Score 1) 356

I was looking at picking up a new Mac when they do the refresh, for video editing, digital artwork, and animation; Mac is supposed to be better for that type of work. But if they're switching to ARM, I think I'll pass. I'll just look at something like an i9 and programs that can use multiple cores for rendering the final output of videos and animations.

While the Intel chips are crufty with all the stuff built up over the years, ARM is not going to be able to replace them for the work I do and plan on doing. I may pick up a Mac mini in the future to cross-platform test my games, but it's not going to be for any of the major work I do.

I need a powerhouse for what I do, not a phone with a keyboard.

Comment Re:Only? (Score 2) 205

This is the problem my company faces. I also wish they didn't see $30k in India as luxury wages. There are plenty of competent people in India, but they're competent enough to know that they can get much more than $30k, or the peanuts my company pays.
The cheapness of my company is also the reason my job isn't at risk of automation: they look to save money for this quarter and will never make the up-front investment to automate my job. Ship it off to India if they can, but automate it? No.
I also write the build and test automation software as one of my many hats, and I do my best to automate my job, so they keep giving me more jobs to automate. Fortunately, that has positioned us to get into industries that require a lot of certifications on code quality and process, which means much more work, so in a way I'm creating employment by automating the stuff I do.

Comment Re:I witnessed this (Score 1) 479

This happens a lot.

We had a paid intern at work (I don't know exactly how much he made, but it sounded like north of $70k per year). He never showed up and never did any work. It turned out he was the son of a VP in another department, and HR rules said he couldn't be in the VP's department, so he was paid out of our budget. Fortunately we don't have this paid intern anymore, and we can use the money for people who actually do work.

Comment Re:Don't be lazy programmers (Score 1) 509

I blame Moore's Law and smaller transistors. And that might end sooner rather than later.

Most coders get to deal with fast processors and fast RAM, and it's always getting faster. So sloppy programming can be covered by the overhead provided in more recent languages. Java and C# don't really expose memory allocation and cleanup; that's all handled by the overhead. Python, Perl, and other similar languages don't even need to worry about what types are passing through the system; it's all handled by the overhead.

I'm from an old school of programming. I can barely tolerate Java and C#; I've been poisoned by C/C++ and assembly. I always have to know when memory is allocated and when it is deallocated, when something goes from memory to cache and into registers, how much memory a chunk of data takes up, and how it flows through the processor. I start to twitch when I try to use a typeless language, because I was trained to know where everything is in memory and to always clean up after myself. I can't trust the compiler or runtime to do it right. Why? Because I can't see the bits in memory, or toss a logic analyzer onto the memory bus and see what is going on.

Recent machines give us the power to wrap recent programmers in bubble wrap. C/C++ gives programmers enough rope to hang themselves and will jerk on the rope if they aren't careful, but with C/C++ you can tune things down to sub-microsecond levels in your drivers (which I've done). I don't think you can expect the same determinism from more recent languages, not with runtimes that can garbage collect or do other overhead work at unpredictable times.

I deal with embedded parts, microcontrollers with fixed RAM, flash, and cycles. I can't rely on the overhead to cover my ass; I just don't have the resources for it. I think that when processor efficiency improvements start to slow down, coding will slowly turn back toward less overhead, as the software will have to make up the speed differences that the hardware can no longer provide. I saw that when I was programming for game consoles: when a new generation of consoles arrived the code was sloppy, but as the console cycle went on the code needed to get more and more optimized, and the sloppiness had to go, because customers always demand more.
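
A minimal sketch of what I mean by not leaning on the overhead: on a small part you carve everything out of fixed, statically allocated pools, so RAM usage is known at link time and there is nothing to garbage collect and nothing to fragment. (A generic illustration, not code from any product I've worked on.)

    #include <stddef.h>
    #include <stdio.h>

    /* Fixed-size block pool: all memory exists at link time, allocation is
     * a short bounded scan, and there is no heap to fragment. */
    #define BLOCK_SIZE  32
    #define BLOCK_COUNT 16

    static unsigned char pool[BLOCK_COUNT][BLOCK_SIZE];
    static unsigned char in_use[BLOCK_COUNT];

    static void *pool_alloc(void)
    {
        for (int i = 0; i < BLOCK_COUNT; i++) {
            if (!in_use[i]) {
                in_use[i] = 1;
                return pool[i];
            }
        }
        return NULL;            /* out of blocks -- caller must handle it */
    }

    static void pool_free(void *p)
    {
        for (int i = 0; i < BLOCK_COUNT; i++) {
            if (p == pool[i]) {
                in_use[i] = 0;
                return;
            }
        }
    }

    int main(void)
    {
        void *msg = pool_alloc();
        printf("got a %d-byte block at %p\n", BLOCK_SIZE, msg);
        pool_free(msg);
        return 0;
    }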

I'm waiting for the new optimization wars to start. I'll be happily sitting back in my rocking chair with my lemonade and popcorn as a new generation of coders has to fight for microseconds like I did. I'll probably be retired before physics really starts to impact performance efficiency improvements.

Comment Re:Security needs to be necessary (Score 1) 90

You forgot: designing will slow them down; it's much faster to just code.

I'm, unfortunately, in charge of my company's security drive for software, and let me tell you, I hear everything you said, apart from the bit about version control, from my boss.

It's very hard to convince anyone that just drawing a state machine on a whiteboard and having a few people 'throw darts' at it is invaluable for saving time and improving security. A few minutes thinking about the abstract design and how people can break it saves a boatload of time. But no, that's design, and it slows people down.

The only other person in the group who cares about design and I have two expressions right now at work:
'We're too busy working in the dark to turn the light on'
And 'We're too busy doing it over to do it right'

I'm not advocating a waterfall model or a humongous design document, since that prevents flexibility and really does slow you down. But something, anything, just to switch gears from coding and get you thinking about how people can screw with your system and how to mitigate the risk.

And even just trying to get simple code changes made with security in mind is hard to push onto some people whose minds are closed. I look at simple code changes that frustrate (not block) the common attacks, like not having the insecure/developer/debug mode be 0, and it just blows the minds of some developers. Developer mode must be zero, since it's the first mode!!!
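
A concrete example of the kind of change I mean (the names and values are made up for illustration): give developer/debug mode a value that a zeroed struct or an erased flash cell can't accidentally produce, and treat anything unrecognized as locked down.

    #include <stdio.h>
    #include <stdint.h>

    /* Made-up values: both modes are non-trivial bit patterns, so 0x0000
     * (zeroed RAM) and 0xFFFF (erased flash) fall through to the safe default. */
    typedef enum {
        MODE_SECURE = 0xA5C3,
        MODE_DEBUG  = 0x3CA5
    } boot_mode_t;

    static int debug_enabled(uint16_t stored_mode)
    {
        switch (stored_mode) {
        case MODE_DEBUG:
            return 1;
        case MODE_SECURE:
        default:            /* anything unexpected => locked down */
            return 0;
        }
    }

    int main(void)
    {
        printf("0x0000 -> debug? %d\n", debug_enabled(0x0000)); /* 0 */
        printf("0xFFFF -> debug? %d\n", debug_enabled(0xFFFF)); /* 0 */
        printf("0x3CA5 -> debug? %d\n", debug_enabled(0x3CA5)); /* 1 */
        return 0;
    }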

You're definitely right that nothing slows you down as much as trying to pull your trousers out of the fire after being caught with your pants down on security. You're not going to want to wear them, and everyone will think you have a strange smell afterwards.
