

But spending a lot of processing power to get smaller sizes mostly matters when you want to store things long term. And in that case you probably wouldn’t want to have to keep the exact same LLM with the exact same weights around.
How the hell are you confused when EA does something shitty?
Yeah, but that would limit the use cases to very few. Most of the time you compress data either to transfer it to a different system or to store it for a while; in both cases you wouldn’t want to be tied to the exact same LLM. Which leaves almost no use case.
I mean… cool research… kinda… but pretty useless.
Ok so the article is very vague about what’s actually done. But as I understand it, the “understood content” is transmitted and the original data is reconstructed from that.
If that’s the case, I’m highly skeptical about the “losslessness”, i.e. that the output is exactly the input.
But there are more things to consider, like de-/compression speed and compatibility. I would guess it’s pretty hard to reconstruct data with a different LLM, or even with a newer version of the same one, so you have to make sure that years from now you can still decompress your data with a compatible LLM.
And when it comes to speed, I doubt it’s anywhere near as fast as using zlib (which is neither the fastest nor the best-compressing option…).
And all that for a high risk of bricked data.
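For comparison, this is roughly what a plain zlib round trip looks like in C++ (just a sketch, assuming zlib is installed; the repetitive 1 MiB test payload is made up for illustration). The decompression step needs nothing but the zlib library itself, and “lossless” is checked by comparing the output byte-for-byte with the input:

```cpp
// Minimal zlib round trip (build with: g++ zlib_roundtrip.cpp -lz)
#include <zlib.h>

#include <cassert>
#include <iostream>
#include <string>
#include <vector>

int main() {
    // Made-up test payload: ~1 MiB of repetitive text.
    std::string input;
    while (input.size() < (1u << 20))
        input += "some highly repetitive example data ";

    // Compress into a buffer sized with compressBound().
    uLongf compressedLen = compressBound(input.size());
    std::vector<Bytef> compressed(compressedLen);
    int rc = compress2(compressed.data(), &compressedLen,
                       reinterpret_cast<const Bytef*>(input.data()),
                       input.size(), Z_BEST_SPEED);
    assert(rc == Z_OK);
    compressed.resize(compressedLen);

    // Decompress; nothing but the zlib library is needed for this step.
    std::string output(input.size(), '\0');
    uLongf outputLen = output.size();
    rc = uncompress(reinterpret_cast<Bytef*>(&output[0]), &outputLen,
                    compressed.data(), compressed.size());
    assert(rc == Z_OK);
    output.resize(outputLen);

    // Lossless means byte-identical, not just "looks the same".
    std::cout << (output == input ? "round trip OK" : "MISMATCH") << ", "
              << input.size() << " -> " << compressedLen << " bytes\n";
}
```

With an LLM in the loop you’d additionally have to ship (or pin) the exact model and run it deterministically just to get your bytes back.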
That’s exactly what I meant
Shoot this fucker in the face already
To be fair, on an American spectrum it’s kinda centrist.
“No we didn’t want to do the only sensible thing to do! We are a shitty company!”
Right, 150 miles is enough most of the time for most people, but it wouldn’t make much sense to buy a pickup truck if you can’t use it that one day every few weeks or months when you actually have to transport something a little farther away.
And most people wouldn’t buy a pickup with two seats, so considering the use case of a two-seat pickup truck, 150 miles of range isn’t that great.
I would appreciate stripping citizenship for racism- or fascism-related crimes.
Pro:
Cons:
Meh
I’m quite anti-violence but at this point please just go for some other approach…
Well yes, but also no. There are quite a few distros that are “minimal effort”; they just work for the average person without any more knowledge than you’d need on Windows or Mac. The last part that’s still not so “minimal effort” is gaming: most things just work out of the box, some things don’t. Btw, Android is Linux.
So I don’t think the problem is that Linux needs a little more knowledge or effort, because it mostly doesn’t, but the fact that most people who would switch see a billion different distros and don’t know what to do. Having so much choice here actually keeps people from coming to Linux. That doesn’t mean it would be better with fewer choices; it’s just one of several reasons why we don’t see mass adoption.
Another reason is the outdated thinking that Linux is complicated to use (and this blog fuels exactly that).
Why would I need AI for that? We should really stop trying to slap AI on everything. Also no, I’m not that big of a fan of wasting energy on web crawlers.
Yes I agree on that. A lot of people write “C with classes” and then complain…
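As a made-up illustration of the difference (hypothetical types, just a sketch): “C with classes” tends to mean raw owning pointers and manual delete, while idiomatic C++ expresses the same thing with RAII containers and gets cleanup, copies, and moves right by default.

```cpp
#include <memory>
#include <vector>

// "C with classes": raw owning pointer, manual delete, easy to leak
// or double-free once early returns and exceptions enter the picture.
struct LegacyWidget {
    int* values;
    explicit LegacyWidget(int n) : values(new int[n]) {}
    ~LegacyWidget() { delete[] values; }
    // copy ctor/assignment forgotten -> double delete when copied
};

// Idiomatic C++: ownership expressed in types, no manual cleanup.
struct Widget {
    std::vector<int> values;
    explicit Widget(int n) : values(n) {}
};

int main() {
    Widget w(16);                           // freed automatically at scope exit
    auto p = std::make_unique<Widget>(32);  // same for heap allocations
    (void)w; (void)p;
}
```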
“americans” is a bad name but it’s more specific than “united statesians”. But I would fully support dissolving that country and founding a new one (or multiple) with a better name.
I’m a full-time C++ developer, mostly doing high-performance data processing and some visualization and TUI tools, and as someone who loves C++, it’s not as simple as you frame it. In sufficiently complex code you still have to deal with these problems. Rust has some good mechanisms in place to avoid them, and there are things on the way for C++26 though.
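One concrete (made-up) example of what I mean: the snippet below compiles as C++ (some compilers warn, some don’t), while the equivalent borrow in Rust is rejected at compile time because the returned view outlives the string it points into.

```cpp
#include <iostream>
#include <string>
#include <string_view>

// Returns a view into a std::string that dies when the function returns.
std::string_view greeting(const std::string& name) {
    std::string msg = "hello " + name;
    return msg;  // dangling: msg's buffer is freed right after this line
}

int main() {
    std::string_view v = greeting("world");
    // Undefined behaviour: v points into memory that has been released.
    // In sufficiently complex code this hides behind several call layers.
    std::cout << v << '\n';
}
```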
Nintendo fans will vote just like Apple fans.
Just when you thought Nvidia couldn’t get worse, they praise Trump.