• Grimtuck@lemmy.world · 49 points · 3 days ago

    Then it’s not engagement. We have to stop this nonsense.

    Also, we need laws making bots that pretend to be real users illegal. We’re heading for some black mirror shit otherwise.

    • thehatfox@lemmy.world · 25 points · 3 days ago

      The platform owners don’t consider engagement to be participation in meaningful discourse. To them, engagement just means staying on the platform while seeing ads.

      If bots keep people doing that those platforms will keep letting them in.

      • Cocodapuf@lemmy.world · 3 points · 2 days ago

        Which is a good argument for regulation. Regulation prevents industry from doing things that may be in an individual company’s best interest but are against the public interest, or against the interests of the industry as a whole.

  • 001Guy001@lemm.ee · 11 points · 3 days ago

    I fear that this will be used as a pretense to have a Real ID tied to all of your online accounts, and that it will be publicly facing.

    And then if people come to your house after you share a wrongthink, or if part of who you are clashes with “the good ole days”, well then that’s just an unintended but necessary evil, so sit down and shut up.

    • huppakee@lemm.ee · 3 points · 3 days ago

      That might happen, who knows, but that is not what this article is about at all.

  • Narri N.@lemmy.world · 9 points · 3 days ago

    No fucking shit, Sherlock. In other news: “water is wet” and “fire is hot”, and later “capitalism does nothing for non-capitalists, and should be overthrown”

  • huppakee@lemm.ee · 6 points · 3 days ago

    Thanks for sharing, this could actually be very helpful research for the development of Lemmy and other fediverse platforms. Here is some text from the article that explains what apparently happens when bots are used:

    […] not all bots are the same in the bustling world of Reddit.

    Some bots are simple, […]. Take WikiTextBot, for example. […] Using Reddit’s API, it scans every post and follows its hard-coded rule: “If there’s a Wikipedia link, post a summary.” These bots, [are] known as “reflexive bots,” […].

    Then there are […] the “supervisory bots” tasked with moderating discussions. […]

    [… ] it’s important to understand how the presence of these bots affects human-to-human interactions in these online communities. […]

    They observed that reflexive bots, which generate and share content, increased user connections by providing novel content and encouraging engagement. However, this came at a cost: human interactions became shallower, with fewer meaningful back-and-forth discussions. Instead, bots often replied to posts, limiting deeper conversations between human users.

    On the other hand, supervisory bots, designed to enforce community rules, reduced the need for human moderators. Previously, key community members would collaborate to set and uphold norms, strengthening their roles within the community. With automated moderation, this coordination became less necessary, leading to a diminished role for human moderators in fostering community engagement and culture.

    The story of bots on social media is still unfolding, with platforms and their creators tasked with finding the right balance between innovation and authenticity. As firms weigh the impact of bots, they face an essential truth: how these digital entities are managed will shape the future of online human connection.

    So the last part is why this matters, but I wanted to include lines from the first part because they explain the basis of the research. I took the liberty of putting the last line in bold, because that is why I felt the need to write this response. Also worth mentioning is the size of this research:

    Between 2005 and 2019, Lalor and his team studied Reddit communities (almost 70 million posts) experiencing a rise in bot activity.

  • Yggstyle@lemmy.world · 3 points · 3 days ago

    I think the phrase they are seeking is signal-to-noise. The bots make noise that buries the signal, making it harder to actually find and communicate with others who are genuinely engaging. Weird how that works.

  • zephorah@lemm.ee · 4 points · 3 days ago

    If “engagement” is just a tap-in, which this article is not clear on, then it’s a piss-poor measurement of engagement.

    I think bots are “good” for making an item appear viral through artificial amplification of visibility. Of note, the headlines about Gaza and Israel before the 2024 election were a deluge, multiple every day. Post-election, that deluge dropped abruptly, to 0-1 a day.

    There also needs to be a distinction between passive and active engagement.