The headline doesn't represent what the survey actually says AT ALL, which is that current AI can't simply be scaled up to AGI. Anybody with half a brain could have told you that more research is still needed; even Sam Altman has said as much.
Calling current AI a "dead end" completely ignores all of the demonstrably useful stuff it can do right now.
I understand how it sucks for artists to see their work used in ways that they didn't intend, but I don't think there is much we should be doing about it. If the West starts imposing fines or fees for AI training, all that means is the best AI products will start coming out of China. They don't give a rip about IP laws, and would love to see their competitors hamstrung.
There is only one version of DeepSeek R1. It has 671 billion parameters, and common 4-bit quantization requires about a half terabyte of fast memory to load it. If you squeeze it down to 1.5bpw, you can run it in just 200 GB of VRAM (with some loss of quality).
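If you want to sanity-check those numbers, the weight-storage math fits in a few lines (weights only; KV cache and runtime overhead come on top, which is why the real-world figures land higher):

```python
# Back-of-the-envelope weight storage for a 671B-parameter model at
# various quantization levels. Weights only: KV cache, activations,
# and runtime overhead are extra.

PARAMS = 671e9  # DeepSeek R1 total parameter count

def weight_gb(bits_per_weight: float) -> float:
    """GB (10^9 bytes) needed to hold the weights alone."""
    return PARAMS * bits_per_weight / 8 / 1e9

for bpw in (16, 8, 4, 1.5):
    print(f"{bpw:>4} bpw -> ~{weight_gb(bpw):,.0f} GB of weights")
```

A 4-bit quant is roughly 340 GB of weights before you add any context, which is how you end up in half-terabyte territory in practice.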
The different "versions" you might be referring to are just ye olde Llama and Qwen models, which have been fine-tuned with R1 data. They're completely different architectures from DeepSeek.
DogFoodBuss writes: SpaceX deployed its 7000th Starlink satellite this week, making the vast majority of active satellites around Earth part of a single megaconstellation. The Starlink communications system is now orders of magnitude larger than its nearest competitors, offering unprecedented access to low-latency broadband from anywhere on the planet.
Why would I want to sacrifice performance by restricting how much memory my browser is using? If I have RAM to spare, go ahead and use it. If another application needs the memory, it's the kernel's job to evict caches and page out intelligently. Users shouldn't have to care about micromanaging memory these days.
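For what it's worth, the kernel already reports that distinction: on Linux, MemAvailable in /proc/meminfo estimates how much memory can be handed out without swapping, counting page cache and other reclaimable memory. A quick Linux-only sketch, Python just for illustration:

```python
# Linux-only sketch: MemFree can look scary-low on a busy system, but
# MemAvailable counts page cache and other reclaimable memory that the
# kernel will evict on demand -- which is the point made above.

def meminfo_kb() -> dict[str, int]:
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])  # values are reported in kB
    return info

m = meminfo_kb()
print(f"MemFree:      {m['MemFree'] / 1048576:6.1f} GiB")
print(f"MemAvailable: {m['MemAvailable'] / 1048576:6.1f} GiB")
```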
Subjectivity is all that matters. These systems are pattern generators, and whatever generates the kinds of patterns people like best is the winner.
The wrong way to use an LLM is to have it generate facts or do math, because it will produce an answer whether it's real or not. Use an LLM as an interface to Wikipedia or Wolfram Alpha if you must, but it will still come down to what the output *looks* like.
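That pattern is easy enough to wire up: fetch the facts from somewhere that actually stores facts, and let the model only do the phrasing. A rough sketch against Wikipedia's public REST summary endpoint; the model call itself is left out, since that part depends on whatever you're running:

```python
# Sketch of the "LLM as an interface, facts from elsewhere" idea:
# Wikipedia supplies the factual text, the model is only asked to
# phrase an answer from it. Requires the `requests` package.
import requests

def wikipedia_summary(title: str) -> str:
    """Fetch the plain-text summary of an article from Wikipedia's REST API."""
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    resp = requests.get(url, headers={"User-Agent": "fact-lookup-demo"}, timeout=10)
    resp.raise_for_status()
    return resp.json()["extract"]

def grounded_prompt(facts: str, question: str) -> str:
    """Build a prompt that pins the model to the supplied text; send it to
    whatever model you actually use (that call is omitted here)."""
    return (
        "Answer the question using ONLY the facts below. "
        "If they don't cover it, say so.\n\n"
        f"Facts:\n{facts}\n\nQuestion: {question}"
    )

facts = wikipedia_summary("Starlink")
print(grounded_prompt(facts, "Roughly how many satellites does Starlink operate?"))
```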
No - AirTags are cheap, low-power devices that run on a little button cell battery. They have no GPS or cellular capability; they just broadcast over Bluetooth and let nearby Apple devices in the Find My network relay the sighting, which is why you can still track them down almost anywhere on earth.
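The trick is the crowd: the tag broadcasts a rotating identifier over Bluetooth, any iPhone that hears it uploads its own location (encrypted so only the tag's owner can read it), and the owner pulls those reports later. The sketch below is a deliberately simplified, non-cryptographic illustration of that relay idea, not Apple's actual protocol.

```python
# Deliberately simplified sketch of the crowdsourced-relay idea behind
# Find My style tracking. NOT Apple's real protocol: the encryption of
# the finder's location is omitted to keep the data flow runnable.
import hashlib
from dataclasses import dataclass

@dataclass
class LocationReport:
    key_id: bytes    # hash of the key the tag was broadcasting at the time
    payload: tuple   # the FINDER's location (encrypted to the tag's key in reality)

def finder_hears(tag_public_key: bytes, finder_location: tuple) -> LocationReport:
    # A nearby phone that hears the tag's Bluetooth advertisement reports
    # its own position, indexed by a hash of the broadcast key. The phone
    # never learns whose tag it saw.
    return LocationReport(hashlib.sha256(tag_public_key).digest(), finder_location)

def owner_lookup(tag_public_key: bytes, reports: list) -> list:
    # The owner queries for reports matching their tag's key and, in the
    # real system, decrypts them locally with the matching private key.
    key_id = hashlib.sha256(tag_public_key).digest()
    return [r.payload for r in reports if r.key_id == key_id]

# Toy run: two passers-by report sightings of the same tag.
tag_key = b"example-rotating-public-key"
sightings = [finder_hears(tag_key, (37.77, -122.42)),
             finder_hears(tag_key, (37.78, -122.41))]
print(owner_lookup(tag_key, sightings))
```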