• PorkSoda@lemmy.world

    Back when I was on reddit, I subscribed to about 120 subreddits. Starting a couple of years ago, though, I noticed that my front page only showed content from 15-20 subreddits at a time, heavily weighted toward recent visits and interactions.

    For example, if I hadn’t visited r/3DPrinting in a couple of weeks, it slowly faded from my front page until it disappeared altogether. It got so bad that I ended up writing a browser automation script to visit all 120 of my subreddits at night and click the top link. This gave me a more balanced front page that mixed in all of my subreddits and interests.
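
    The script doesn’t need to be fancy, either. Here’s a minimal sketch of the idea in Python with Playwright (not my exact script; the subreddit list and old.reddit.com selectors are placeholders, and a real run would need your logged-in session, omitted here, for the visits to count toward your feed):

    ```python
    # Minimal sketch: visit each subreddit's "top" page and click the first
    # post so Reddit registers an interaction with that community.
    from playwright.sync_api import sync_playwright

    SUBREDDITS = ["3DPrinting", "python", "woodworking"]  # ...all ~120 names

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()  # a real run would load logged-in cookies here
        for sub in SUBREDDITS:
            page.goto(f"https://old.reddit.com/r/{sub}/top/")
            page.locator("a.title").first.click()  # open the top post
            page.go_back()
        browser.close()
    ```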

    My point is these algorithms are fucking toxic. They’re focused 100% on increasing time on page and interaction with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we’re being manipulated.

    • Fedizen@lemmy.world

      I used the Google News phone widget years ago and clicked on a giant-asteroid article, and for whatever reason my entire feed became asteroid/meteor articles. It’s also just such a dumb way to populate feeds.

      • Corhen@lemmy.world

        That’s why I always use YouTube by Subscriptions first, and only delve into the regular front page if there’s nothing interesting in my subscriptions.

  • Krudler@lemmy.world

    I would just like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me after I let slip in a comment one day that I was sober. I had never made such a comment before, because my sobriety journey was personal and I never wanted to define or pigeonhole myself as a “recovering person”.

    I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.

    I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.

    Yes, Reddit and similar sites are definitely responsible for a lot of suffering and pain at the expense of humans in the pursuit of profit. After it blew up and front-paged, “magically” my home page didn’t have booze-related ads/subs/recs any more! What a total mystery how that happened /s

    The post in question, and a perfect “outing” of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.

    Edit: Oh, and the hilarious part that many people won’t let go of (when shown this) is that it says it’s based on my activity in the Drunk subreddit, which I had never once visited, commented in, posted in, or even been aware of. So that just makes it worse.

    • mlg@lemmy.world

      It’s not Reddit if posts don’t get nuked or shadowbanned by literal sitewide admins.

      • Krudler@lemmy.world

        Yes, I was advised in the removal notice that it had been removed by the Reddit administrators so that they could keep Reddit “safe”.

        I guess their idea of “safe” isn’t 4+ million users going into their privacy panel and turning off exploitative sub recommendations.

        Idk though I’m just a humble bird lawyer.

  • Fedizen@lemmy.world

    media: Video games cause violence

    media: Weird music causes violence.

    media: Social media could never cause violence this is censorship (also we don’t want to pay moderators)

    • Eximius@lemmy.world

      Since “media” (which you define by the tropes of unsubstantiated news outlets) couldn’t sensibly refer to a forum like Reddit or even Facebook, this makes no sense.

  • The_Tired_Horizon@lemmy.world

    I gave up reporting abuse on major sites. Stuff that, if you said it in public with witnesses around, would get you investigated. Twitter was also bad for responding to reports with “this doesn’t break our rules” when a) it clearly did and b) it probably broke a few laws too.

    • Alien Nathan Edward@lemm.ee

      I gave up after I was told that people DMing me photographs of people committing suicide was not harassment, but that me referencing Yo La Tengo’s album “I Am Not Afraid Of You And I Will Beat Your Ass” was worthy of a 30-day ban.

  • TropicalDingdong@lemmy.world

    I don’t understand how a social media company can face liability in this circumstance but a weapons manufacturer doesn’t.

    • gum_dragon@lemm.ee

      Or individuals who repeatedly spread the specific hateful ideologies that radicalize people and encourage them to act on them.

    • realbadat@programming.dev

      Same as the sort settings for Subscribed.

      Active, Top for the past X hours/days/months/all, New, etc. You pick; Lemmy doesn’t.
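
      And none of it is hidden: the same sorts are exposed directly in Lemmy’s HTTP API. A rough sketch (endpoint and field names as per the published Lemmy API docs, so check against your instance’s version; Subscribed listings also need an auth token, omitted here):

      ```python
      # Sketch: fetch your Subscribed feed with an explicit, user-chosen sort.
      import requests

      resp = requests.get(
          "https://lemmy.world/api/v3/post/list",
          params={
              "type_": "Subscribed",  # only communities you subscribe to
              "sort": "TopDay",       # or Active, Hot, New, TopWeek, TopMonth...
              "limit": 20,
          },
      )
      for item in resp.json()["posts"]:
          print(item["post"]["name"])  # post titles, ranked the way *you* chose
      ```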

  • Kalysta@lemmy.world

    Love Reddit’s lies about taking down hateful content when they’re 100% behind Israel’s genocide of the Palestinians and will ban you if you say anything remotely negative about Israel’s government. And the amount of transphobia on the site is disgusting. Let alone the misogyny.

    • captainlezbian@lemmy.world

      Lol, yeah, I moderated major trans subreddits for years. It was entirely hit-and-miss whether we’d get support from the admins.

  • atrielienz@lemmy.world

    So, I can see a lot of problems with this. Specifically, the same problems that the public and regulating bodies face when deciding whether to keep or overturn Section 230. Free speech isn’t necessarily what I’m worried about here, mostly because it is already agreed that free speech is a construct that only the government is actually beholden to. Message boards have and will continue to censor content as they see fit.

    Section 230 basically stipulates that companies that provide online forums (Meta, Alphabet, 4chan, etc.) are not liable for the content their users post. Part of the reason it works is that these companies adhere to strict guidelines with regard to content and, most importantly, moderation.

    Section 230(c)(2) further provides “Good Samaritan” protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

    Reddit, Facebook, 4chan, et al. do have rules and regulations they require their users to follow in order to post. And for the most part, the communities on these platforms are self-policing. There just aren’t enough paid moderators to make it work otherwise.

    That being said, the real problem is that this indirectly challenges Section 230. It barely skirts the question of whether the relevant platforms can themselves be considered publishers, or are at all responsible for the content users post, and instead squarely attacks how users are presented with content to keep them engaged via algorithms (which is directly how these companies make their money).

    Even if the lawsuits fail, this will still be problematic. It could lead to draconian moderation of what can be posted and by whom. So now all race-related topics, regardless of whether they include hate speech, could be censored, for example. Politics? Censored. The discussion of potential new laws? Censored.

    But I think it will be worse than that. The algorithm is what makes the ad space these companies sell so valuable, and this is a direct attack on that. We lack the consumer privacy protections to shield the public from what follows: if the ad space isn’t valuable, the data will be, and there’s nothing stopping these companies from selling user data; some of them already do. What these apps do in the background is already pretty invasive, and this could lead to even more invasive scraping of data. I don’t like that.

    That being said, there is a point I agree with: these companies literally do make their algorithms addictive, and those algorithms absolutely will push content at users. If that content is of an objectionable nature, so long as it isn’t outright illegal, these companies do not care, because they gain from it monetarily.

    What we actually need are data privacy protections. Holding these companies accountable for their algorithms is a good idea, but I don’t agree that this is the way to do it constructively. It would be better to flesh out Section 230 as a living document that can change with the times, because when it was written, the Internet landscape was just different.

    What I would like to see is platforms moderating content that is posted and represented as fact. We don’t see that nearly enough on places like Reddit. Users can post anything as fact, and the echo chambers will rally around it if they believe it. It’s not incredibly difficult to radicalise a person. But the platforms aren’t doing that on purpose; the other users are, and the algorithms are helping them.

    • yamanii@lemmy.world

      Moderation is already draconian; interact with any Gen Z user and you’ll learn what “goon”, “corn”, “unalive”, and “(crime) in Minecraft” actually mean.

      These aren’t slang; it’s like a second language developed to evade censorship on those platforms. Things will only get worse.

      • atrielienz@lemmy.world

        It’s always been that way, though. Back in the day on Myspace or in MSN chatrooms, there were whole lists of words that were auto-censored and could result in a ban (temporary or permanent). We literally had whole lists of alternates to use. You couldn’t say “sex” or “kill” back then either. The difference is the algorithm.

        I acknowledge in my comment that these platforms already censor things they find objectionable. Part of that is to keep Section 230 as it is. A perhaps more relevant part is to keep advertisers happy so they continue to buy ad space. A small portion may even be to keep the majority of the user base happy, because users who don’t agree with the perceived ideology of a platform will leave it, and that’s fewer eyeballs on ads.

  • ItsMeSpez@lemmy.world

    As much as I believe it is a breeding ground for right-wing extremism, it’s a little strange that 4chan is being lumped in with these other sites in a suit like this. As far as I know, 4chan just surfaces threads based on how many people are posting in them and otherwise doesn’t employ a recommendation algorithm at all. It’s kind of a different beast from the others, which have active algorithms trying to drive engagement at any cost.
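
    For contrast, bump order is about the simplest feed “algorithm” there is. A toy sketch of that assumed behavior (not 4chan’s actual code):

    ```python
    # Bump order: newest activity first, identical for every visitor,
    # with no per-user profile or engagement prediction involved.
    from datetime import datetime

    threads = [
        {"title": "thread A", "last_reply": datetime(2024, 5, 1, 12, 0)},
        {"title": "thread B", "last_reply": datetime(2024, 5, 1, 14, 30)},
    ]

    board = sorted(threads, key=lambda t: t["last_reply"], reverse=True)
    print([t["title"] for t in board])  # ['thread B', 'thread A']
    ```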

    • tbs9000@lemmy.world

      I agree with you in spirit. The most common sentiment I see in the comments is that the problem is not what people can share but how actively platforms move people down rabbit holes. If the platforms take no action to correct for this, they risk regulation, which in turn puts freedom of speech at risk.

  • Minotaur@lemm.ee

    I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

    It just feels very, very dangerous and ‘going to be bad’ to set this precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

    Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school, here is how” and you hear it, yeah, that’s on you to report it. But it feels like we’re quickly slipping toward a point where you have to report vast numbers of people to the police en masse if they say anything even vaguely questionable, simply to avoid the potential fallout of being associated with someone committing a crime.

    It makes me really worried. I really think the internet has made it easy to “justifiably” accuse almost anyone or any business of a crime if a person with enough power, or the state, needs them put away for a time.

    • Zak@lemmy.world

      I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn’t responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.
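
      As a toy illustration of that feedback loop (not any platform’s actual code): if the only ranking signal is predicted engagement, a couple of early clicks are enough to start narrowing the feed.

      ```python
      # Toy engagement-only ranker: topics the user has reacted to before float
      # to the top, and nothing penalizes the feed collapsing to one topic.
      from collections import Counter

      def rank_feed(items, engagement_history):
          """items: (item_id, topic) pairs; engagement_history: topics clicked."""
          affinity = Counter(engagement_history)  # more clicks, higher weight
          return sorted(items, key=lambda it: affinity[it[1]], reverse=True)

      feed = rank_feed(
          [("a", "sports"), ("b", "conspiracy"), ("c", "cooking")],
          engagement_history=["conspiracy", "conspiracy", "sports"],
      )
      print(feed)  # [('b', 'conspiracy'), ('a', 'sports'), ('c', 'cooking')]
      ```

      Each new click feeds back into the history, so the top of the feed only gets more concentrated over time.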

      With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don’t like this case. I especially don’t like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.

    • Ð Greıt Þu̇mpkin@lemm.ee

      I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.

      Firearms should be titled and insured, and the owner should have an imposed duty to secure them. The owner ought to face criminal penalties if the firearm titled to them is used by someone else to commit a crime. Either they handed a killer a loaded gun or they inadequately secured a firearm that was then stolen; either way, they failed their responsibility to society as a firearm owner and must face the consequences.

      • Minotaur@lemm.ee

        If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio, and he hangs himself with it, are you ready to be accused of being culpable for your brother’s death?

        • jkrtn@lemmy.ml

          Oh, it turns out an extension cord has a side use that isn’t related to its primary purpose. What’s the analogous innocuous use of a semiautomatic handgun?

          • Minotaur@lemm.ee

            Self defense? You don’t have to be a 2A diehard to understand that it’s still a legal object. What’s the “innocuous use” of a VPN? Or a torrenting client? Should we imprison everyone who ever sends a link about one of these to someone who seems interested in their use?

            • jkrtn@lemmy.ml

              You’re deliberately ignoring the point that the primary use of a semiautomatic pistol is killing people, whether self-defense or mass murder.

              Should you be culpable for giving your brother an extension cord if he lies that it is for the porch? Not really.

              Should you be culpable for giving your brother a gun if he lies that he needs it for self defense? IDK the answer, but it’s absolutely not equivalent.

              It is a higher level of responsibility: you know lives are in danger if you give them a tool for killing. I don’t think it’s unreasonable for there to be a higher standard for loaning it out or leaving it unsecured.

              • Minotaur@lemm.ee

                “Sorry bro, I’d love to go target shooting with you, but you started taking Vyvanse 6 months ago and I’m worried that if you blow your brains out, the state will throw me in prison for 15 years.”

                Besides, you’re ignoring the point. This article isn’t about a gun; it’s basically about “this person saw content we didn’t make on our website”. You think that won’t be extended to content sent from one person to another? That if you send some pro-Palestine articles to your buddy, and a year or two later your buddy gets busted at an anti-Zionist rally, you’re now a felon because you enabled that? Boy, that would be an easy way for some hypothetical future administration to control speech!

                You might live in a very nice bubble, but not everyone will.

                • jkrtn@lemmy.ml

                  So you need a strawman argument that shifts from loaning a weapon, unsupervised, to someone we know is depressed, to merely target shooting with them, distancing the loan aspect and adding a presumption of using the item together.

                  This is a side discussion. You are the one who decided to write strawman arguments relating guns to extension cords, so I thought it was reasonable to respond to that. It seems like you’re upset that your argument doesn’t make sense under closer inspection and you want to pull the ejection lever to escape. Okay, it’s done.

                  The article is about a civil lawsuit; nobody is going to jail. And nobody is going to be able to take this precedent and sue me, an individual, over sharing articles with friends and family, because the algorithm is a key part of the argument.

        • Ð Greıt Þu̇mpkin@lemm.ee

          Did he also use it as improvised ammunition to shoot up the local elementary school, to warrant the cord being considered a firearm?

          I’m more confused about where I got such a lengthy extension cord! Am I an event manager? Do I have generators I’m running cable from? Do I get to meet famous people on the job? Do I specialize in fairground festivals?

          • Minotaur@lemm.ee

            …. Aside from everything else, are you under the impression that a 10-15 ft extension cord is an odd thing to own…?

    • morrowind@lemmy.ml

      Do you not think that if someone encouraged a murderer, they should be held accountable? It’s not everyone they interacted with; there has to be reasonable suspicion that they contributed.

      Also, I’m pretty sure this is nothing new.

      • Minotaur@lemm.ee

        I didn’t say that at all, and I think you know I didn’t unless you really didn’t actually read my comment.

        I am not talking about encouraging someone to murder. I specifically said that in overt cases there is some common-sense civil responsibility. I am talking about the potential for the police to break down your door because you Facebook-messaged a guy you’re friends with about your favorite local gun store, and that guy also happens to listen to death metal and take antidepressants, and the state has deemed him a risk factor level 3.

        • morrowind@lemmy.ml

          I must have misunderstood you then, but this still seems like a pretty clear case where the platforms (not even individual people yet) did encourage him. I don’t think there’s any new precedent being set here.

          • Minotaur@lemm.ee

            Rulings often start at the level of corporations and other large entities and work their way down to the individual. Think piracy laws: at first, only giant, clear bootlegging operations were prosecuted, then people torrenting content for profit, then people torrenting large amounts of content for free. Now we exist in an environment where you can torrent a movie or whatever and probably be fine, but if the criminal justice system wants to, it can (and has) hit anyone who does with charges worth tens of thousands of dollars or years of jail time.

            Will it happen to the vast majority of people who torrent media casually? No. But we currently exist in an environment where, if you get unlucky enough or someone wants to punish you badly enough, you can have a massive sentence handed down to you almost “at random”.