This is a young industry, and it's changing all the time. What you need to know changes all the time.
As someone who got into IT from (natural) languages, I agree with most of your comments, except that one. From what I can determine, based on reading a lot of books about software development as an activity (not about specific languages, or platforms, or tools, or whatever), very little has changed in the last 30 years. A lot of what people really need to know in IT is softer skills like time estimation, requirements management, change management, customer communication, effective documentation, issue resolution and so on. As much as some people would love to believe it, cranking out code for a solid 8 hours a day rarely happens, and when it does, the results often aren't pretty.
Realistically, standards in IT are terrible, precisely because we focus on the things that change all the time and deliberately disregard the lessons of the past. We tell ourselves that the IT world is so different from just a few years ago that we can't learn anything useful from what's gone before. And of course that's all part of the 'romance' of IT; every coder wants to feel that he's breaking new ground and doing something totally new. In reality, most people are writing code for fairly mundane purposes and doing it rather badly: just look at the Daily WTF, Coding Horror, or ask a 'senior' developer for a few stories about interview candidates - or worse, colleagues - who couldn't write even a basic function.
Computer Science is exactly that: science. But in most fields the world needs a lot more engineers who can build working solutions out of what the scientists invent, not more scientists. Out of every 1000 CS graduates, how many end up writing compilers, hacking kernels, or doing other 'deep magic'? And how many more end up writing web-based data-processing applications with some simple business logic behind them that still somehow never quite work correctly? Yes, there will always be a Google pushing the boundaries, and they will always need PhD types to do it, but an awful lot more people just need developers who understand their needs and can build simple, reliable business applications.
My personal opinion is that IT has a higher opinion of itself than it deserves. In the end, we're still a young profession (as you said), yet we flatter ourselves with job titles like 'engineer' when any real engineer (mechanical, electrical, whatever) would be horrified at the amount of guesswork and imprecision we seem happy to work with every day.
If we really want to get to the next level as an industry, then we have to stop fixating on the details of languages and technologies and look at the processes and practices. Unfortunately, that's precisely what many techies least want to do, because it's knocking on the door of PHB territory. A professional association would have some problems, because the whole IT industry is so diverse, but it could do a couple of useful things. First, persuade universities to cut back on CS and ramp up "Computer Engineering"; think of CS as "Materials Science" and CE as "Construction Engineering" to see the difference. Make sure the CE course covers effective source control, issue tracking and change management, basic economics and project management, cost calculations, oral and written communication etc., all of which are skills that CS graduates just don't seem to have, but which are clearly needed in the real world.
Second, persuade insurance companies to underwrite large IT projects, just as they do for large construction projects, and use that as a more or less neutral/independent means of raising the industry's performance. They could also offer professional liability insurance for individuals and companies. If large projects could be underwritten against failure, companies would jump on it as a risk mitigation measure: if the project fails, at least they get some money back. In turn, the insurance companies would push developers to improve standards, because if their standards weren't high enough, they simply wouldn't get insured. This is how many other 'real world' professions already operate; good software vendors would be insured, bad ones wouldn't.
There's a host of obvious problems with that approach, but the insurance industry is very good at assessing risk, and it currently seems to be the only market-driven, technology-independent means of improving standards. Schneier has preached something similar for a while: let insurance companies assess the risk associated with poor software and demand higher premiums from clients who don't implement appropriate security measures.