im new to lemmy and i wanna know your perspective on ai
Naw, he a homie. He a real clanka.
AI is riding the surface of a monster bubble, and anyone gleefully waiting for the pop has no idea what that’s going to do to the US economy, and then everyone else’s.
All but 1% of US economic growth last year was AI development and speculation. Combine that with the US passing, for the first time, 200%+ on the Buffett Index, and we are screwed.
For reference, the Buffett Index is total stock market valuation vs. GDP. There are more than twice as many dollars in the stock market as we produce in a year. The index was around 130% in 1929 and 2008.
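The ratio itself is trivial to compute; here’s a minimal sketch using made-up round numbers, not real market data:

```python
# Buffett Index: total stock market valuation as a percentage of annual GDP.
# The figures below are illustrative placeholders, not actual market values.
def buffett_index(total_market_cap: float, gdp: float) -> float:
    """Return market cap as a percentage of GDP."""
    return total_market_cap / gdp * 100

# e.g. a hypothetical $60T market on a $29T GDP:
print(round(buffett_index(60e12, 29e12)))  # 207 (percent), i.e. past the 200% mark
```

Anything over 100% means the market is valued at more than a full year of economic output.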
I’m biased because of the work my kid does in the field. It’s paying his mortgage so… 😉
PERSONALLY, not a fan, I think it’s a dangerous abrogation of personal responsibility… BUT…
I do think I found a legitimate creative use for it.
There’s an AI-powered app for a specific brand of guitar amplifier. If you want your guitar to sound like a particular artist or a particular song, you tell it via natural language input and it makes all the adjustments for you.
You STILL have to have the personal talent to, you know, PLAY the guitar, but it saves you hours of fiddling with dials and figuring out what effects and pedals to apply to get the sound you’re looking for.
Video, same player, same guitar, same amp, multiple sounds:
this is cool tbh. i think thats what ai should be used for. not some ai slop
That’s what I thought, it allows creatives to be creative.
Kind of like if you had an art program you could ask, “Give me a paint palette with the colors from Starry Night.”
You still have to have the artistic talent to make use of them, it’s not going to help you there, but it saves you hours of research and mixing.
they shouldnt allow people to create slop
Definitely agreed, but if AI can, I dunno, generate a list of Pantone colors in a work, that leaves the actual art up to the artist.
Only talentless losers make A.I. “art.”
AI is a tool. A Glock 9mm is a tool. A paintbrush is a tool.
Tools are only as good, or as harmful, as their users. If AI is being used to flood the internet with slop, that is a human decision. The fact that AI was used to generate the slop does not taint the AI itself in any meaningful way. On this platform, however, it is fashionable to hate AI.
The people who hate it here are either bad actors or have no real understanding of what AI is, what it does, or what it is for.
Why, or how, is a gun a tool?
A tool is any human-designed object that extends our ability to perform a task. That is the functional definition. Moral weight does not enter into the classification. By that definition, a gun is unambiguously a tool. A gun is engineered to apply controlled force at a distance. Humans use it for specific purposes: hunting animals for food, self-defense, deterrence, sport shooting, and as an instrument of the state via military and law enforcement. In every case, intent, judgment, and responsibility reside entirely with the human operator. The object itself has no agency, will, or decision-making capability.

People often confuse what a tool can be used for with what a tool is. That confusion leads to emotional objections rather than logical ones. A scalpel cuts flesh. A chainsaw destroys wood. A nail gun can kill a person. None of those facts remove them from the category of “tool.” Harm potential does not negate tool status; it simply increases the responsibility of the user.

This is directly analogous to AI. AI applies computation. A paintbrush applies pigment. A gun applies force. Different domains, same underlying principle: they are instruments that amplify human capability. If someone uses AI to generate spam, or a gun to commit violence, the fault lies with the human actor, not the instrument.

If we redefine “tool” to exclude objects we are uncomfortable with, the definition collapses into incoherence. The correct framework is not to anthropomorphize tools, but to hold users accountable for how they are employed. That is why a gun is a tool, not philosophically, but functionally and unambiguously.
I see that you are okay with that as long as it fits the definition, regardless of the harm that some tools can cause. Thank god the U.S. abolished slavery. Cheers.
Yes… I am ok if the definition defines the object… As anyone should be? Thank god Britain stopped trying to take over the world. Bye.
It’s a tool for killing people.
That’s not a good or reasonable purpose for a tool, sure, but it doesn’t make it less of a tool, it just makes it morally abhorrent.
This is pretty close to how I think about it. AI (notably not generative) that was developed to sort bread and is now reprogrammed to detect cancer cells: Fantastic. Generative AI to search and cite multiple interacting manuals with thousands of pages each: probably ok with significant guardrails. Generative AI that drives up local electricity costs so someone can see their underage neighbor with 4 boobs: terrible. To extend your metaphor, it’s like handing a Glock to a toddler.
I completely agree with your statements. The data centers are definitely a massive issue; however, to be fair, they are a response to the growing use of AI. These are just for-profit corporations looking for a way to make more profit. The cost of constructing the data centers is significantly less than the amount of money they expect to make in the long run.
I love AI. Who made it by the way?
From a practical standpoint, it’s useful for laying down a base on projects, but it sucks at further progress. I’ve used it in coding and 3D motion, and in both my experience was like that.
From an environmental standpoint, it’s an overengineered mess. Local models are satisfactory for most use cases, and we don’t really need huge computing clusters dedicated to AI.
He would be true AI. I would shower him with love.
Just because some cock sucking finance bros call an LLM an AI, doesn’t make it an AI.
You’re not saying “cock sucking” pejoratively, are you ?
I think the people who do that to me are pretty cool tbh
I don’t use the term cocksucker myself, but I think the fact that it’s a vulgarity already gives it a negative connotation. Like, I didn’t pat my wife on the head last night and call her my cute little cocksucker. I can imagine that could be someone else’s pillow talk, but that would leave me touch starved for a while.
I don’t THINK calling a gay man a pussyfucker would have the same weight, but I don’t have deep enough conversations with gay men to really know. I have heard that some men pride themselves on never having been with a woman, so maybe it would still hurt.
On the flip side, just calling someone a fucker can be enough to start a fight.
I’m not going to pretend that the poster meant to use the word like asshole, because cocksucker definitely hits different for male pride. I don’t think I would use the word to hurt someone I was angry with, but who knows what might come out when emotions are high. I don’t plan on using the word for fighting, but an insult could be enough to provoke someone into a reckless attack. If you don’t practice what you say, then you might just repeat something you will regret.
To summarize, I hope the poster isn’t a bigot, but when given the chance they appear to have doubled down. Guess you got your answer.
Edit: Not wasting my time with discussion starved trolls.
Why still specify they’re sucking cock, then ? They could just be evil money-hoarding pieces of shit, couldn’t they ?
I was joking but I didn’t see your user name. I will not interact with you and waste anyone’s time anymore. Goodbye.
Hey! Don’t insult shit like that.
Atari chess opponent is AI
Age of Empires 2 AI is a cheatin’ bitch.
If every cheating bitch is an AI, then so is the Dancer of the Boreal Valley.
The problem isn’t that “everything is AI” - it’s that people think AI means way more than it actually does.
That superintelligent sci-fi assistant you’re picturing? That’s called Artificial General Intelligence (AGI) or Artificial Superintelligence (ASI). Both are subcategories of AI, but they’re worlds apart from Large Language Models (LLMs). LLMs are intelligent in a narrow sense: they’re good at one thing - churning out natural-sounding language - but they’re not generally intelligent.
Every AGI is AI, but not every AI is AGI.
first ever clanker?
Newell, Shaw, and Simon’s Logic Theorist (1956) is broadly considered the first ever AI system.
It’s a tool, similar to spell check or search engines. It’s currently not worth the environmental impact imo, but time has shown this is a temporary issue, should we actually address it. Spoilers: we probably won’t.
I think it’ll follow the dot-com model: we’ll see a crash before a decent surge and plateau.
If you examine closely, you’ll see there is no AI, but Vin Diesel reading a script (written by humans).
Depends on the use case.
- AI that parses manuals and documentation and dumbs them down for me? Yes please.
- AI that generates images? Eh, kinda undecided. I have seen really impressive AI videos, which I couldn’t tell apart from real videography.
- AI as a personal assistant? No, too much energy wasted on minor issues that could have been solved more efficiently. E.g.: “I have X at home, what could I make for dinner?”
- AI (not LLMs) in medical/scientific fields? Very intriguing. Yes. Good shit.
- AI in children’s toys? Eww. Burn it! Fecken burn it!!!
I think it can be a great tool, but it is overvalued atm, and there are AI images everywhere, which is really frustrating. When I go to Pinterest, I want to see human input. When I check my LinkedIn, everyone and their cat is using AI graphics - I get it, it’s quick and easy, but such a waste of energy.
If we want to keep using AI, we need to reduce how much we use it and its resource consumption.
He is what he chooses to be.
Wasn’t he an alien?
One word sums up all that is wrong right now, and it is greed.
AI has become synonymous with the worst of human nature; hence it has become a loaded term.
Another way to look at this: it is not AI that is the problem, we are, or more specifically the people who will use AI to control us.
This technology is like the atomic bomb. We are fast racing towards a future where a few people will be able to dictate what everyone will be able to do. The person who controls AI and the computing power associated with it will control the world. This is intoxicating, and it has drawn out the worst human beings who want to misuse this technology.
And it has already happened to some degree. Massive data centers, surveillance technology, and AI are being used to profile people and target them for death. In the future, AI teachers will become the dominant form of teaching. AI will make our decisions and we will be subject to a system without recourse or redressability.
Soon we will have a generation of people who only know what AI has told them. This is the kind of scenario that we have been warned against, and the reason that those who dislike propaganda and misinformation are so upset with where things are heading.
LLMs are fundamentally incapable of caring about what they produce and therefore incapable of making anything interesting. In the early days of LLMs’ mainstream use, that issue was somewhat compensated for by randomness and jank, but subsequent advancements in the technology have mainly made their outputs as generic as possible. None of this has to do with the Iron Giant, as he is a fictional character.