Comment Re:GPU (Score 1) 44
Don't forget, Tesla approved a pay package for Elon Musk that is worth ALL THE PROFITS they've ever made.
I don't know why you think this. What is the path to profitability? OpenAI is chucking money out the window SO FAST right now. They lose money even on their $200-a-MONTH subscription. They lost $5 billion on $3.7 billion in revenue. Where is the path to profitability, ESPECIALLY if energy prices increase?
Either they have to cut costs or raise prices, but if they can't make money on their $200/month tier, any price increase is a drop in the bucket, because there are a lot of people using it for free.
I suspect that they WILL come through this (or be acquired) but there are lots of AI startups right now and zero companies are a lock. As of right now, you cannot rely on any of these companies to be there. You cannot rely on any code that's been written for you if you haven't audited it yourself (and there is plenty of evidence that unaudited code is making it into production all over the place).
This is not a prediction that the industry is going to collapse, it's that you still have to be aware of the risks. Google and Facebook weren't funded by VC in any way close to the same degree as OpenAI is right now, and just because a bunch of companies survived and prospered doesn't undo the fact that HUGE numbers of companies failed. I don't know the future and neither do you, so be careful; it is almost certain that the next several years will see a lot of upheaval.
A reminder that LLM-hawking companies are deeply, comically unprofitable right now. They're giant pits where money goes to be redistributed to Nvidia, to power companies, and to individual programmers in the form of huge signing bonuses.
Do not base your workflow or any part of your organization on anything that assumes that LLMs will be there, even on a free tier. If you can't build or maintain it yourself, it's a liability. At these rates, SOMEONE will have to pull back at some point, there's just no choice.
Programmers were moving to DirectX and off of 3dfx's Glide API.
By that point (the early 2000s), Glide itself wasn't all that relevant anymore: most game engines relied on high-level APIs (Direct3D, as you mention, and also OpenGL: Quake III had already been out for a year, and "mini GL" drivers, which served as an adaptation layer between high-level OpenGL and low-level APIs such as Glide, were all the rage).
Very few engines had Glide-specific optimizations.
API exclusivity wasn't playing a role anymore.
But failing to have distinguishing features that attract users did play a role.
The VSA-100 in the Voodoo 4/5/6 had a few interesting new features (*a vastly improved version of the Voodoo 1/2/3's ability to do "pseudo 22-bit": more than 65k colors in 16-bit modes, avoiding error and dither amplification; new rotated-grid FSAA that both gives better edge anti-aliasing and circumvents the need for anisotropic filtering; motion blur in supporting games; better texture compression; etc.), but these failed to attract users' interest (I suppose most users didn't even understand those features), whereas Nvidia managed to win users over with better numbers in 24/32-bit benchmarks and some marketing around T&L (despite T&L not actually being used much in games of that era -- better CPUs with SIMD achieved similar scene processing in practice).
The next generation would have been more interesting: 3dfx planned to add programmable pipelines with Rampage and Sage (the "Spectre" graphics cards) instead of fixed pipelines (i.e., to add geometry and pixel shaders, in the parlance of high-level APIs like D3D and OpenGL), but it never reached market; only prototypes existed when 3dfx folded.
Though some of that development eventually helped the GeForce FX.
Of all the graphics card what-ifs I've wondered about: what if the PowerVR card had had working drivers?
Well, look at Apple's iPhones...
It was a card based on the same architecture and techniques as the Sega Dreamcast, and for about $120 it could perform like a $300 GeForce card.
Speaking of the Sega Dreamcast, the whole snafu of "Katana" (the actual NEC / SuperH + PowerVR design) vs. "Black Belt" (a competing prototype with a 3dfx graphics card) also cost 3dfx quite a lot of money and accelerated their demise.
When it worked.
The main problem is that PowerVR works in a way that is completely alien compared to everything else (outputting fully-rendered tiles, while everyone else draws polygons one-by-one into a pair of frame and depth buffers).
And tile-based rendering's performance boost is less significant when there are more transparency layers in a scene, which is where most of the industry was heading. (So the advantage the Kyro II had would have melted away in successor cards.)
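To make the "alien" part concrete, here's a minimal sketch in Python (purely illustrative: the names, the fragment representation, and the tiny buffer sizes are my own simplifications, not any real driver or hardware) contrasting the two approaches:

```python
# Toy contrast of immediate-mode vs. tile-based rendering, operating on
# pre-rasterized "fragments" (x, y, depth, color) so the example stays
# self-contained. Real hardware does all of this in silicon.

W, H, TILE = 8, 8, 4  # deliberately tiny so the sketch runs instantly

def render_immediate(fragments):
    # Classic approach (Voodoo, TNT, ...): full-screen color + depth buffers
    # in external memory; every fragment is depth-tested and written as it
    # arrives, touching scattered addresses all over the framebuffer.
    color = [[None] * W for _ in range(H)]
    depth = [[float("inf")] * W for _ in range(H)]
    for x, y, z, c in fragments:
        if z < depth[y][x]:
            depth[y][x] = z
            color[y][x] = c
    return color

def render_tiled(fragments):
    # PowerVR-style: pass 1 bins fragments by tile; pass 2 resolves each
    # tile completely in small on-chip buffers and emits only finished
    # pixels, so hidden surfaces never cost external memory bandwidth.
    bins = {}
    for x, y, z, c in fragments:
        bins.setdefault((x // TILE, y // TILE), []).append((x, y, z, c))
    color = [[None] * W for _ in range(H)]
    for (tx, ty), frags in bins.items():
        tdepth = [[float("inf")] * TILE for _ in range(TILE)]
        tcolor = [[None] * TILE for _ in range(TILE)]
        for x, y, z, c in frags:
            lx, ly = x % TILE, y % TILE
            if z < tdepth[ly][lx]:
                tdepth[ly][lx] = z
                tcolor[ly][lx] = c
        for ly in range(TILE):  # one burst write per finished tile
            for lx in range(TILE):
                color[ty * TILE + ly][tx * TILE + lx] = tcolor[ly][lx]
    return color

# Same fragments, same image either way; the difference is memory traffic.
frags = [(1, 1, 0.5, "red"), (1, 1, 0.2, "blue"), (6, 6, 0.9, "green")]
assert render_immediate(frags) == render_tiled(frags)
```

Note that it's the depth test that lets the tiled path discard hidden surfaces on-chip; translucent layers can't be rejected that way, which is exactly why the advantage shrinks as scenes pile on transparency.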
Speaking of TBR, 3dfx had acquired GigaPixel to get patents on such technologies, toyed with software-based hidden surface removal (HSR) that worked in some OpenGL engines (everything built on id Tech 3, the Quake III engine), and was hoping to add hardware HSR with TBR in the next iteration of Rampage/Sage ("Spectre 2" / Mojo), but never got there before folding.
(And again, more transparency layers in scenes would eventually have made the performance gains less significant in future games.)
---
*: Even the venerable Voodoo 1's "pseudo 22-bit" 16-bit modes were different from everyone else's on the market.
All the competition used a pure 16-bit math pipeline. When combining multiple pixels (multi-texturing, transparency, effects, etc.), rounding errors accumulate. It's even more visible when Bayer dithering is used (with an Intel integrated graphics core back then, this was visible on, e.g., Quake III's logo, which had multiple translucent flame effects: the errors accumulated with each layer and the logo ended up looking like a checkerboard).
In contrast, when in 16-bit mode, all 3dfx chips run at "pseudo 22 bits" internally and on their video output; by combining the values of 4 source pixels (hence the 2 extra bits per channel), less error accumulates on translucent layers and less dithering is visible on the output.
On Voodoo 1 and 2, this is simply done by using 4 horizontally adjacent source pixels, giving the characteristic "slightly horizontally blurred" look that is typical of early 3dfx cards.
On Voodoo 3, the 4 sources can also be arranged in a square, and there's more logic in how they're combined (I suspect something like conditional blur, but I've never read a full description).
On Voodoo 4/5/6, the chip introduces multiple buffers (2 per chip, up to 4 on the most common dual-chip Voodoo 5), equivalent to OpenGL's accumulation buffers. This enables tons of cool effects (introduce an offset between the buffers and you get both anti-aliasing on edges and the same result as anisotropic filtering using plain trilinear on texture surfaces; render each buffer at a different time increment and you get motion blur; etc.), and it also allows even less blur when picking the 4 source pixels for the "pseudo 22-bit" output (just pick the 4 pixels at the same coordinate in each of the 4 buffers).
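Since the trick is easier to see with numbers, here's a toy sketch in Python of the general idea (the dither pattern, bit layout, and reconstruction filter here are simplified assumptions of mine; 3dfx's actual filter was never fully documented as far as I know):

```python
# Toy sketch of "pseudo 22-bit": averaging four dithered 16-bit (RGB 5-6-5)
# pixels recovers roughly two extra bits per channel, because the ordered
# dither spreads the sub-quantization remainder across neighboring pixels.

def quantize_565_dithered(r8, g8, b8, dither):
    # Reduce 8-bit channels to 5/6/5 bits, with an ordered-dither offset
    # added before truncation (0..7 for the 5-bit channels, halved for
    # the 6-bit green channel).
    r5 = min((r8 + dither) >> 3, 31)
    g6 = min((g8 + dither // 2) >> 2, 63)
    b5 = min((b8 + dither) >> 3, 31)
    return r5, g6, b5

def pseudo22_output(four_pixels):
    # Combine 4 source pixels (horizontally adjacent on Voodoo 1/2; a
    # square or one per buffer on later chips) back to near-8-bit output.
    n = len(four_pixels)
    r = sum(p[0] << 3 for p in four_pixels) // n  # expand 5 -> 8 bits, average
    g = sum(p[1] << 2 for p in four_pixels) // n
    b = sum(p[2] << 3 for p in four_pixels) // n
    return r, g, b

# A flat mid-grey (100, 100, 100) quantized with a 4-entry ordered dither:
src = [quantize_565_dithered(100, 100, 100, d) for d in (0, 2, 4, 6)]
print(src)                   # four slightly different 16-bit pixels
print(pseudo22_output(src))  # averaged output: (100, 100, 100) recovered
```

In plain 16-bit output you'd see the four distinct dithered values (the checkerboard); the averaging filter collapses them back toward the original color, which is where the "extra" bits come from (log2(4) = 2 bits per channel).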
And I remember back in the day when 3dfx went tits up while their cards were flying off the shelves.
3dfx also did a couple of pretty stupid things: at a time when most 3D companies were just making chips and collaborating with third-party graphics card manufacturers, 3dfx decided to "cut out the middleman" and produce their own first-party cards instead. Cue all the third-party graphics card manufacturers switching to Nvidia's chips (and some trying this new ATI that was starting to enter the 3D chip market).
> you now have to upgrade$$ your membership to 'gold star' or 'executive'
you've got the history backwards.
the business membership is the base membership. they added the less expensive membership, but held back some hours for the business folks to partially placate them.
the business members' purchases are what keep the place going; the cheap membership is just a bit of gravy.
I have two theories
1. Other languages are falling faster than Perl, making Perl rise in relative ranking.
2. Perl is what smart people use instead of bash. For all the sysadmin tasks, ad-hoc system scripts, and glue code, you'd be crazy to use bash when you can use Perl.
It's not anything to do with political correctness, though I agree with everything else.
Movies throughout the 70s, 80s and 90s had the messages "be nice to people, don't be racist and sexist" all the time.
The problem is that they're recycling our childhoods back to us and they're doing a worse job of it. A New Hope didn't actually need an update to the special effects, and it definitely doesn't need any more modifications.
Honestly, the problem might be the LACK of a real political message. Remember Indiana Jones? Nazis bad, kill Nazis? We need more movies like that. Stop dressing the fascists up as aliens and just kill the fascists.
>if you ignore that number of animals they have that want to kill you.
so you're saying that the solution is to import hostile wildlife from Australia?
hawks
Oh, it's really simple: the majority of the stock market is owned by rich people. Moreover, the top 10 companies in the S&P 500 are worth about 40% of the index. So when Nvidia, Microsoft, Meta, Amazon, Apple, Alphabet and a few others go up, the whole index goes up. Who's Nvidia selling GPUs to? All the other companies at the top of the index. The AI hype is pumping billions into the top end of the market.
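A back-of-the-envelope illustration in Python (made-up numbers chosen only to show the mechanism, not real index data):

```python
# How a cap-weighted index can rally while most of its members go nowhere.
# The 40% figure is the rough top-10 weight mentioned above; the returns
# are invented for illustration.

top10_weight = 0.40   # assumed share of the index in its 10 biggest names
top10_return = 0.25   # suppose the AI-adjacent giants rally 25%...
rest_return = 0.00    # ...while the other ~490 stocks are flat

index_return = top10_weight * top10_return + (1 - top10_weight) * rest_return
print(f"Index 'up' {index_return:.0%} while 98% of constituents went nowhere")
```

Ten stocks rallying can drag the whole index up double digits, which is why the headline number says so little about the rest of the economy.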
It's worth remembering that the stock market is not the economy, and certainly the S&P 500 is not the economy, and the top 10 companies of the S&P 500 are not the economy. So inflation and joblessness are rising because those are measures of the ACTUAL economy, where nobody has any fucking money.
The fundamentals aren't really in line with any of this. The price-to-earnings ratio of these companies is insanely high, even accounting for how much money they're making. And people all over the world see Nvidia is doing well, so they dump money into the stock. And then people see the stock is going up, so they dump more money into the stock! And that's actually gone very well, but who knows how long it'll last.
tl;dr the stock market is a weird collective fever dream that doesn't necessarily (and never has) reflect reality.
That doesn't stop me from pining for them, though.
does 2x shift the frequency?
We've long been accustomed to France surrendering to Germans, spear-throwing tribesmen, and the occasional Boy Scout troop.
But, really, come on now. Surrendering to jellyfish is a new low . . .
hawk
I don't think that's it, honestly. 75% of truck owners never tow anything.
The cybertruck is bad at literally everything. It failed because it's (probably unintentionally) designed to fail. They made it out of the wrong parts, with the wrong specs, with the wrong intent. It's trash.
This is good for neither the nation NOR the individual. Or rather, it's not good for individuals in general, just for the few at the top of AI companies.
I'm all for a more communitarian society, but these AI companies give back NOTHING to the community. They're not co-owned, they're not nationalized. They manage to make a mockery of all possible institutions simultaneously.
Ontogeny recapitulates phylogeny.