I'm a Vive owner and I must say it's absolutely incredible. I've owned it for months and still play it almost every day.
I would say that for me, the room-scale integration and motion tracking are key to enjoyment. Having the headset is a nice novelty, but being able to really "be" there, standing in a virtual space, is what's incredible.
I play a lot of Minecraft (there's a free Vive VR plugin for it), and during the nighttime (in-game) I'll climb up on top of my little virtual house, sit down on the "roof" (i.e. my office floor), and simply enjoy the view while I have a little snack, both in game and in real life. Not to mention actually looking around and mining/placing blocks that are as big as you are, swinging your sword, and shooting your bow at mobs. I've never had a comparable game experience in my life. And I can play in the same game as people using non-VR Minecraft, so there are no restrictions.
Onward, which is sort of a VR Counter-Strike, is another game that is incredible (but I suck really, really badly). You can peek around corners, duck behind things, crouch, go prone, and "really" reload your gun and throw grenades.
Content is a bit of an issue, but there are plenty more great room-scale games besides Minecraft and Onward... Legend of Luca is one of my favorites; it's like VR meets the classic NES Legend of Zelda. Holoball is VR pong and great fun, and you can really get a workout playing it. 5089 and Vanishing Realms are both excellent, immersive RPGs. Arizona Sunshine is the best zombie shooter I've ever played. Anyland is a unique building and community game that is a really different kind of experience. Out of Ammo is a fun FPS with a lot of RTS elements, and there are a few others. Perhaps it's just because Steam's catalog of Vive games lines up with my preferences, but I'm not disappointed in the selection.
Besides content, there are other cons of the Vive that are, unfortunately, pretty bad. The huge cable and the bulkiness are the biggest. I do know there is a wireless kit available that has been getting good reviews, but that's another $200 for first-gen hardware. The cable is strange because although it never really gets in your way, you *feel* like it's going to get in your way, which is almost as bad. The headset being wireless would solve a lot of problems; however, it's still very bulky and not easy to wear for long sessions (more than 45 minutes, say). It's sweaty, and it feels, for lack of a better word, claustrophobic. Having two screens about half an inch from your eyes is not comfortable either, and after long play sessions I feel... unusual. Not nauseous or sick or fatigued, exactly, but unusual. And keep in mind you need a really beefy PC to be able to run this stuff, which drives up the price tag even more. Also, local multiplayer is basically non-existent because each person needs their own computer and their own 5 x 6 meter space to play in, although the games that have internet play work as well as you'd want them to.
It bears mentioning that I got a touch of motion sickness with some of the games that don't use teleportation, but you get used to it after about a half hour (and forevermore after that). And that's saying a lot, because I get motion sickness in vehicles very easily. But, your mileage may vary.
With all that being said, do I think it's worth the roughly $900 (if you already have a nice PC)? Absolutely! And I'd buy another one if mine broke. The cons are easy to put into words, but the pros are not: it's an experience like no other. VR is not gimmicky like I thought it was going to be. However, I probably wouldn't be saying this if I didn't have the Vive with its motion tracking. Simply having the headset alone would be a bit of a novelty that would wear off fast.
I'd recommend that if you've already got the money and the desire to buy something like a new TV or a surround-sound system, spend it on the Vive instead. Way more bang for your buck. It's a professional product and doesn't feel like a prototype, although if you want to wait for the second generation, I can only imagine how much better that will be.
Okay, sure, you can watch a coder in real time, but most of the time people don't need a coder; they need a developer. A developer has project management and other "soft" skills. Coding is the hammer and the nails... you have to know what you're creating before you start to build it. Most of the time you would see the developer typing up emails, creating diagrams and flow charts, writing executive summaries, managing their agile tracker, consulting on a conference call, researching documentation, etc.
I'm so tired of people thinking that software engineering is about coding. It isn't about coding; it's about developing real-world technical solutions, of which coding is a relatively small part.
I've had dozens of managers (I'm a software developer), and the only ones worth a damn were the ones who held a real technical job before moving into management. I can deal with their outdated technological knowledge and their sometimes dogged insistence on old methodologies, because at its core the job hasn't changed, and they realize that. My technical managers kept the rest of the business off our backs and helped give us the space we needed.
My non-technical managers never quite understood the level of detail that we are immersed in on a daily basis. They were impossible to deal with because they were always focusing on vague strategies like 'better communication' or 'migrating to best-of-breed solutions' or some such marketing nonsense.
It all comes down to this: How can a person be a good manager if they don't understand what exactly it is that you do on a daily basis?
Teaching people to code by first teaching them a programming language is like teaching them about hammers before explaining that we're trying to build a house. Your programming languages are your toolbox, nothing more.
Perhaps the 'gee-whiz' factor of seeing the code first breeds more interest in children than the engineering process does, but to my mind we need to be teaching kids from the top down if we're interested in creating a generation of good programmers. When kids learn HTML, CSS, and JavaScript and then take on their first website project for a client (e.g. modifying the school website), they're shocked to learn that they're not going to be using cutting-edge libraries and that the vast majority of the work is more boring frustration than actual magic. Young programmers, in my limited experience, do not like finding out that they don't get to use whatever tools they want to play with at the moment.
You can teach almost anyone to program, but developing software solutions is something entirely different.
There are a lot of weird opinions in the comments I've read so far (wait a minute, am I on Slashdot?).
First: The poster wants to telecommute exclusively to do "hardware and network" stuff. That's why he can't find any work. Simple as that. Be willing to get your old ass to the office and you'll find a job.
Second: People argue until they're blue in the face about "old workers" vs. "young workers". The fact is that the "team" is what matters, believe it or not. At my job we needed to add a new programmer to our small team, and my boss made sure that I was involved in the interview process. We interviewed three candidates: one was a Harvard graduate, one was a very talented middle-aged programmer, and the last was a decently talented 30-something. We caught the Harvard graduate in a lie, so he was out. The middle-aged programmer was absolutely amazing; he would have brought a ton of experience and raw talent to the team. However, he was "so much better" than the rest of us that it probably would have created problems working together. We ended up going with the 30-something, and he's working out just great because he's on the same level as the rest of us.
Every team is unique, and being better than the rest is not always a good thing when you're concerned about getting work done.
First of all, having a degree will help you get those first couple of jobs until you gain more experience.
Beyond that, your attitude highlights the problem with the majority of "web developers" - they don't see themselves as computer scientists. This leads to inefficient, cobbled-together solutions. Web developers often want to "just make web sites" but never learn anything about the real skills of software development: requirements gathering, architecture and engineering, testing, deployments, etc. You end up being a web "programmer" but not a web "developer".
I'm not saying that a college degree will give you all of this, but what I am saying is that you shouldn't think of yourself as separate from the rest of computer science.