If they give AI up and get a team of great humans at the helm to develop, I’d consider buying from them. Until then, they can get fucked, and I will spend my money elsewhere.
Do you feel you stand a chance in that fight?
Once the technology is there, and the competition embraces it, very few companies will actually have the capability of resisting.
In about 5 years I feel you’re literally just not going to be able to buy new games, or you’re going to abandon the idea of such boycotts.
But I’m curious. Are you concerned about the loss of human jobs from a game-quality perspective or a human well-being perspective?
Corporations are already trying and failing to do anything useful with gassed up LLMs, and I highly doubt they will become useful in the next 5 to 10 years. However, what I do feel will happen is that the hope posting and gaslighting will continue: the shills, whose success rides on massive adoption, will do their best to convince everyone to use the shit. Who knows how well that will actually turn out; it’s an open question at the moment. Personally, I hope all their efforts lead to failure.
If gassed up LLMs do seep into the gaming industry to the point I can’t avoid them, then I’ll just stop buying games. Spending money on slop isn’t ideal in my opinion. I have a backlog of games that haven’t been tainted by baby’s first lying software, so it isn’t a total loss. I’ve got years of games to play before choosing another hobby altogether, and I have adhered to boycotts for a long time in other areas. Having the will to research products and spend money only on those that don’t offend my sensibilities comes naturally to me.
I already won’t use Windows because Microslop chose to go hard in the LLM direction and look how well that is going for them. On top of it, I make use of open source software and support projects so that they can continue to thrive, avoiding anything that is stained with enshittification.
My concerns with the gassed up LLMs that techbros call “AI” are numerous: the ecological effect, the quality of media (games, art, movies, etc.), and how it all affects the people who provide us with software and entertainment in general. Gassed up LLMs are being used as an excuse to lay off so many people in tech and gaming spaces. Corporations irresponsibly hired during a boom period, then realized they’d have to keep paying people in so-called lean times that aren’t actually lean. The myth of infinite growth has scrambled the brains of the C-Suite and distracted toddler investors, so the profits don’t look so good these days. Since they are unwilling to first cut off the CEOs who leech off a lot of corporate profits, naturally the very people who make the stuff that enriches companies, execs, and shareholders are laid off first. They’re thrown into a rabid job market that is falling the fuck apart, because rich fuckwits are creating a situation that serves to unilaterally fuck over those who don’t have a financial cushion to fall back on.
This will have a profound impact on the quality and selection of media. Humans are naturally creative; we’ve painted in caves and told stories for most of our history. Storytelling and creativity are an intrinsic part of humanity, and stories feel better when written by someone with a strong ability to tell a compelling tale! Gassed up LLMs mass produce sterile, meaningless slop built from stolen training data, which doesn’t classify as art to me. Humans do too, but not on an industrial scale like an LLM can. Already, there are sites and instances, even on the fediverse, dedicated to sharing slop, and it’s so bland. I can’t imagine staying interested in buying stories and art produced by slop-generating LLMs. I’ll just write my own stuff and learn how to draw the things that exist in my head; that would be a better use of time and money. Creating feels so good; only a talentless techbro or hollow CEO would want to take that away from people to make a quick dollar.
LLMs in their current form use so much water and power; it’s honestly scary how they’ll have to rely on dirty forms of energy production just to keep up with escalating demand. That will impact the local environment and marginalized communities first, and if enough of these irresponsible data centers are built, the scale of the environmental and community impact is going to escalate.
We only have one home planet, imagine it being so sullied that human life can’t exist here anymore. To be honest, due to our current antics, I don’t even know if we’ll beat “The Great Filter” and become a sufficiently advanced civilization. We might destroy ourselves first, but at least nature would probably recover after we’re gone and our works eventually stop interfering with the ecosystem.
I don’t feel humanity should be concerned with AI, as there is so much we don’t know about ourselves yet. We lack sufficient understanding of neurology, of why consciousness manifests, and of how to create machinery that can actually mimic our brain structure. It would take many generations of humanity, untold amounts of funding, and multi-discipline research to produce a true AI. Techbros wanna shill now though, so we are stuck with gassed up LLMs that will probably cause society to collapse, or the rich to finally get put to a French Revolution style end (if disenfranchised people are feeling spicy enough).
This is a really important comment. I greatly appreciate you adding links to your points, and I love learning more about the great scheme that is this bullshit misnomer called “AI” when it is very much just an LLM.
Now I can bolster my arguments against it with things like water and energy impacting the environment.
You’re welcome; we have to fight the good fight! Technological advancement built on the backs of human exploitation and destabilization of the environment isn’t it; I’d prefer we take another direction, researching and optimizing solar, wind, and low-emission power generation. I think humanity can advance without harming one another; we just have to find a better path forward. The way LLMs are being utilized at the moment is a harmful scam, and we need to call it out more to reduce their ability to fool the public (which so far isn’t convinced by what the techbros are saying).
Yep, I’m totally on board with what you’re saying. I have no issue with AI as long as it’s being developed slowly, properly, and with way more oversight than what’s happening right now.
I think we need to focus on our natural resources and the world way more too. Let’s fully explore our oceans and, like you said, develop more environmentally sustainable approaches to energy generation and whatnot.
I personally don’t think there is an ethical way of transforming LLMs into AI, at least not yet, as there are unfortunately too many complications with how it’s being peddled, and the great slop impact is hurting open source projects. I do think that companies engaging in machine learning and LLM development need to be heavily restricted and forced to comply with laws that protect human jobs, human-made content, websites, and software projects from their activities. Data-stealing crawler bots especially need to be regulated, preferably out of existence, as they essentially DDoS the websites they crawl while stealing data from the creatives who host websites and blogs.
Given that humans can reshape their environment in drastic ways, I think humanity needs to dial our collective focus toward being more in harmony with nature and less toward fighting against it. We can do that by better understanding the world: by studying the oceans and fully mapping them in the least intrusive way possible. We have to carefully consider the impact that human activities have on our only home world when new ideas are being weighed. An ethical approach would ensure that technological progress has a positive impact on the natural world while still improving the human condition.
Hey everybody, new LostWanderer book just dropped.
No but really interesting take, rather dystopian, but interesting.
Given the nature of most executives, rich fucks, I can’t see anything but dystopia coming from these chuckleheads. Read their unfiltered and uncensored thoughts (often between the lines of their flowery words), how they see others, and notice their detachment from the everyday person; it’s pretty grim. Makes me wonder what you see.
No, I don’t see dystopia at all; dystopia is what leads to destabilization and revolution, and neither of those is good for profits.
Certainly it’s possible, but I’m not going to pretend I can accurately predict what the future holds.
What I’m not going to do is make up my mind that this trend is incapable of producing new, fun, well-made games in the future. It just baffles me when I hear people say that they’ll never touch a game that used AI; like, a developer can generate placeholder art and that kills the game for them? What? I just don’t understand this moral roadblock some people have.
Certainly, right now the human mind and our creativity are miles beyond AI, but I don’t think it’s impossible for AI to reach an equal level; and you better believe that if it does, it’s going to rocket past us.
I would love to live in a world where I can just click a button and AI generates a new Skyrim or a Witcher, although that’s probably very far out, if it happens at all.
And I don’t see job loss from AI as an existential threat. I think the current layoffs in MOST fields are a mistake and those people will be rehired. Like everyone is already saying, AI is currently just another tool that can accelerate the game development process; and anything that accelerates that process decreases the cost of production, which in turn increases competition. And I think that’s always a good thing.
The entire energy and water issue you’re raising is absolutely a non-issue for me. The Model T got 13 miles to the gallon when it was released; it had no catalytic converter and basically zero safety features. Yet it was still a net benefit to humanity and just a stepping stone toward humanity’s growth.
Energy use is the hallmark of a modern society; there is absolutely no path toward advancing humanity without forever-increasing energy use. AI can use energy from renewable sources. Data centers need exactly zero water input if designed that way; and if their current water use is actually an issue, then I’m fine with the government regulating its consumption.
That’s my book, I guess. Please don’t quote every line I wrote and respond to each individually; it’s such an unnatural way to have a conversation, and I hate the internet for it.
Mmm, I figured your opinion would line up with the line of thinking techbros tend to share. It was nice talking to you, have a good day.
Love how you called out the OC for “writing a book” but then go off like this.
I think you’re self-centred and need to read more about how big corporations take advantage of poorer communities by leveraging their resources, be it water, trees, or land, and ravaging them.
The food industry, for example, has many such cases; palm oil plantations, deforestation, overfishing, etc. are all really strong examples of this.
I love how you loved the joke, and didn’t get it at the same time.
Fuck generative shit. There will always be real, passionate people making games, artwork, and music, and I will buy from them, not the trash spat out by the automated vending machine.
Interesting, but there’s a lot of presumptions in your post.