Algorithms give you what you want. If you’re getting rage, you’re looking at and upvoting rage. On Reddit, 99% of what I get are posts about Hitman, Blender, Boomershooters and other things I’m interested in. The same with YT except I also get obscure movies from the 30s and 40s. The more of that I consume, the more the algorithms push it to me. The same with news. If there’s a big news event I might read and watch a bunch of news and then all the news stuff gets pushed to me.
Algorithms boil down to “Did the user look at or upvote something with these keywords? If yes, send more things with those keywords their way.” These magical mystery algorithms are probably five lines of code at most.
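To put the caricature in concrete terms, here’s what that “five lines of code” logic might look like as a toy Python sketch. This is purely illustrative (the function names and data are invented, and no real platform works this simply):

```python
import re

def keywords(text):
    # Crude tokenizer: lowercase alphabetic words only.
    return set(re.findall(r"[a-z]+", text.lower()))

def recommend(history, candidates):
    # Pool keywords from everything the user looked at or upvoted...
    seen = set().union(*map(keywords, history)) if history else set()
    # ...and surface any candidate sharing at least one keyword.
    return [c for c in candidates if keywords(c) & seen]

# An upvote on a Blender question yields more Blender content:
recommend(["How do I subdivide a mesh in Blender?"],
          ["Blender subdivision tutorial", "Stock market crash panic"])
# -> ["Blender subdivision tutorial"]
```

Note that nothing here understands topics or intent; it’s pure keyword overlap, which is exactly the kind of dumbness described further down in this thread.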
Not on YouTube. I only watch instructional videos and how-to fix videos.
But that doesn’t stop YouTube from oddly recommending right-wing grifter or influencer shit with that stupid shocked-face thumbnail.
You looked at one of those videos at some point. I get them too because I occasionally get curious. Some of it is tied to watch history. You might have one buried in the pile. If you clean it out, you might not see those anymore.
Nope. Not even a bit.
I despise YouTube. But sometimes, instructions are only in video form.
Yes, it gives you what you respond to; it steers toward engagement. So it serves content that provokes a strong emotional response, because strong emotions drive us to engage, and anger and resentment are very strong drivers here. I don’t doubt what you’re saying about the things you engage with, and how you steer the algorithm or the algorithm steers you. But that doesn’t disprove the claim that algorithms are generally making our debates angrier; it would just suggest that you are less vulnerable to this.
What would disprove it is if people quit upvoting, reading, and watching things that make them angrier and angrier. Algorithms don’t steer anyone; people steer themselves with the algorithms. They’re essentially mirrors of one’s personality. Just be aware that if you upvote a post that says “Tech billionaire kills baby seals with his bare hands, has sex with the corpses, and laughs about it on TikTok”, you’re going to get more stuff with all those keywords in it. If you upvote “How do I subdivide a mesh in Blender?” instead, you’ll get a bunch of stuff about Blender, subdividing 3D models, and related topics.
Yes, it’s a mirror, and what it shows us is that people are very emotional and prone to react to emotional triggers. The algorithm has figured this out and uses it to maximize engagement. People will not stop reacting to these triggers. Sure, it’s commendable to avoid the rage bait and pay attention to other things, and some will be better at this than others. But to think that people en masse can choose not to be caught and manipulated by these algorithms seems really naive. I know tons of people who spend more time on TikTok than they wish they did; it’s hard to resist something that’s so cleverly optimized toward you. As Yuval Noah Harari puts it: it’s hacking our brain.
The algorithms aren’t that smart. They’re extraordinarily dumb.
For example, on YT I was recently recommended an ancient video from 2011 in which an ex-con gives advice on how to pick up hookers. I looked at the comments, and tons of them were recent and equally confused about why they had been recommended this video. After sifting through my own watch history, I realized I had recently watched the Austrian version of Eyes Wide Shut, in which the main character meets a hooker. So the algorithm evidently saw that the synopsis contained the word “hooker,” found another video where the word “hooker” is used, and sent it my way.
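That failure mode is easy to reproduce with nothing fancier than counting shared synopsis words. A hypothetical Python sketch (the titles and synopses are invented for illustration; this is a guess at the mechanism, not YouTube’s actual system):

```python
import re

def words(text):
    # Lowercase alphabetic words from a synopsis.
    return set(re.findall(r"[a-z]+", text.lower()))

def most_similar(watched_synopsis, library):
    # Rank the library purely by how many synopsis words each title
    # shares with what the user just watched -- no notion of topic,
    # genre, or intent, just raw word overlap.
    return max(library,
               key=lambda title: len(words(library[title]) & words(watched_synopsis)))

library = {
    # Hypothetical catalog entries.
    "Ex-con street advice (2011)": "an ex-con explains how to pick up a hooker",
    "Blender modeling basics": "subdivide a mesh and model in Blender",
}

# A synopsis that happens to mention a hooker matches the 2011 video
# on that single word, which is the whole "recommendation".
most_similar("the main character meets a hooker at a masked ball", library)
```

Two shared words (“a” and “hooker”) beat one, and the ex-con video wins, exactly the kind of shallow match described above.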
The algorithm is not even as smart as any free chess app some college kid wrote over a weekend as an exercise. If the algorithm is manipulating people and hacking their brains, that’s a problem with the wetware, not the software. The remote control isn’t hacking people’s brains when they press 5 and the TV displays channel 5.