• brucethemoose@lemmy.world
    6 hours ago

    because anyone who knows even a scrap of how LLM/GANs work knows that the data needs to train a model would be far beyond the reach of a company of Larian’s scale

    If it’s like an image/video model, they could start with existing open weights and fine-tune it. There are tons to pick from, and libraries that make them easy to plug in.

    If it’s not, and it’s something really niche that doesn’t already exist to their satisfaction, it probably doesn’t need to be that big a model. A lot of weird stuff, like sketch-to-3D-model generators, gets trained on university-student project budgets of time and money (though plenty of those already exist).
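
    To make the “start from open weights and fine-tune” point concrete, here’s a minimal sketch using Hugging Face diffusers and peft. The checkpoint name, LoRA rank, and target modules are illustrative assumptions on my part, not anything Larian has said they use.

    ```python
    # Minimal sketch (not Larian's confirmed pipeline): start from an existing
    # open-weights diffusion model and attach LoRA adapters, so only a tiny
    # fraction of the parameters needs training. The checkpoint, rank, and
    # target module names below are illustrative assumptions.
    import torch
    from diffusers import StableDiffusionPipeline
    from peft import LoraConfig, get_peft_model

    # Load publicly released open weights instead of training from scratch.
    pipe = StableDiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-2-1",  # hypothetical base checkpoint
        torch_dtype=torch.float16,
    )

    # Add low-rank adapters to the UNet's attention projections; the base
    # weights stay frozen, which is what keeps the compute budget small.
    lora_config = LoraConfig(
        r=8,          # adapter rank: a deliberately modest, small-budget choice
        lora_alpha=16,
        target_modules=["to_q", "to_k", "to_v", "to_out.0"],
    )
    pipe.unet = get_peft_model(pipe.unet, lora_config)

    # Only the adapter weights are trainable, a tiny slice of the full model.
    pipe.unet.print_trainable_parameters()
    ```

    From there, a standard LoRA training loop (as in the diffusers text-to-image examples) only updates the small adapter weights, which is why the data and compute needed are a far smaller ask when you fine-tune instead of training from scratch.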

    We don’t need defenders coming in here trying to pretend that the CEO hasn’t just clarified that they are using AI for preproduction, we know this and it’s not up for debate now.

    No. We don’t know.

    And frankly, why do I need to play their game when I could just AI generate my own slop and save the 70 bucks

    I dunno what you’re on about; that has nothing to do with tools used in preproduction. How do you know they’ll even use text models, much less that a single model would ever be shipped in the final game? And how are you equating LLM slop to a Larian RPG?

    hit, it seems like they’ve forgotten about the community that got them to where they are today in favor of some AAA gaming nonsense.

    Except that literally every word out of their interviews is about care for their developers and their community, which they continue to support.

    Frankly, there are plenty of games that people judge from the outset. There’s a reason why we have the saying “First impressions matter”. They’ve left a bad taste in anyone who dares question the ethics of AI use, but thankfully there might be an audience of people out there who like slop more than I dislike it so they could be ok. No skin off my nose.

    Read that again; pretend it’s not about AI.

    It sounds like the language gamergate followers use as an excuse to hate something they’ve never even played, after reading some headline they don’t like.


    …Look, if Divinity comes out and it has any slop in it, it can burn in hell. If it comes out that they partnered with OpenAI or whomever extensively, it deserves to get shunned and raked over the coals.

    But I do not like this zealous, uncompromising hate for something that hasn’t even come out, that we know little about, from a studio we have every reason to give the benefit of the doubt. It reminds me of the most toxic “gamer” parts of Reddit and other cesspools of the internet, and I don’t want it to spread here.

    • MoogleMaestro@lemmy.zip
      1 hour ago

      I won’t bother engaging with the “gamergate” false equivalence. I think it’s disingenuous to try to tie anything I’ve said so far to some fearmonger-induced culture war or bigoted nonsense when we’re talking about a much broader wealth-extraction mechanism and a misanthropic tech movement. I think you’re saying this from a well-meaning place, but I actually don’t think what I’ve said is overzealous at all. The CEO is saying he’s using AI, and if you’re opposed to the social and financial repercussions of that, it’s fair game to boycott a product over it.

      To pick a real-world example, some people won’t eat meat that isn’t free-range. It isn’t really about the quality of the meat; it’s about the inhumane treatment of animals. Not everyone subscribes to this, and sometimes I don’t buy free-range meat either, but it’s not “wrong” for people to choose not to buy meat that isn’t free-range. The same can and should be true of the media we consume, whether it’s games or films.

      If it’s like an image/video model, they could start with existing open weights and fine-tune it. There are tons to pick from, and libraries that make them easy to plug in.

      If it’s not, and it’s something really niche that doesn’t already exist to their satisfaction, it probably doesn’t need to be that big a model. A lot of weird stuff, like sketch-to-3D-model generators, gets trained on university-student project budgets of time and money (though plenty of those already exist).

      …Look, if Divinity comes out and it has any slop in it, it can burn in hell. If it comes out that they partnered with OpenAI or whomever extensively, it deserves to get shunned and raked over the coals.

      I won’t get into this too much, but “open weights” is not “open source”, and even “open source” often isn’t truly open source when it comes to AI. What you should really be talking about is a model trained on an open dataset, and there are very few of those in reality. The issue isn’t the weights; the issue is the data that was used to produce those weights in the first place.

      It’s not impossible that they’re using some bespoke model derived from an open-dataset model, but considering the full transcript is now out and he name-dropped ChatGPT in particular, I don’t have much confidence that there’s some kind of ethical silver lining. Since he was the one who mentioned using AI in previs development, it’s on him to clarify which models they’re using and whether they’re ethically sourced. I don’t have to prove anything beyond the fact that they’re using AI and that AI isn’t to my personal palate. That’s fine; everyone has their own tastes. I was excited about the new Divinity until this news dropped, and now the hype has simply deflated because it goes against my morals. That’s on them, not on me.

      If he wants to push for open datasets as an AI-industry counterplay, then fine: fair play, and good riddance to the closed-source (closed-data) AI industry players. But until that happens, it’s just a fantasy, not reality. I’ll stick to what has been said and not extrapolate about what could be.