Text for readability:

So far, Americans using RedNote have said they don’t care if China has access to their data. Viral videos on TikTok in recent days have shown Americans jokingly saying they will miss their personal “Chinese spy,” while others say they are purposefully giving RedNote access to their data in a show of protest against the wishes of the U.S. government.

“This also highlights the fact that people are thirsty for platforms that aren’t controlled by the same few oligarchs,” Quintin said. “People will happily jump to another platform even if it presents new, unknown risks.”

  • Korne127@lemmy.world · 2 days ago

    I don’t see why the concept should be unethical.

    In practice, of course, it is insanely unethical: the algorithms are designed to maximize view time, which leads to algorithmic radicalization and lets hate spread more quickly. But the concept itself, of an algorithm knowing and learning what you like and selecting content for you, isn't unethical.
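The distinction being drawn here can be sketched in a few lines. This is a purely hypothetical illustration (invented item data and scoring functions, not any platform's actual algorithm): the same items ranked by how much the user likes them versus by predicted watch time.

```python
# Toy illustration (hypothetical data, not any real platform's code):
# two recommenders over the same items, one optimizing for stated
# preference, one optimizing for predicted watch time.

items = [
    {"title": "calm tutorial", "liked": 0.9, "watch_time": 0.3},
    {"title": "outrage clip",  "liked": 0.2, "watch_time": 0.95},
    {"title": "hobby video",   "liked": 0.8, "watch_time": 0.5},
]

def rank_by_preference(items):
    # "Knowing what you like": rank by how much the user likes each item.
    return sorted(items, key=lambda i: i["liked"], reverse=True)

def rank_by_engagement(items):
    # "Maximize view time": rank by predicted watch time, which tends
    # to surface provocative content regardless of stated preference.
    return sorted(items, key=lambda i: i["watch_time"], reverse=True)

print([i["title"] for i in rank_by_preference(items)])
# ['calm tutorial', 'hobby video', 'outrage clip']
print([i["title"] for i in rank_by_engagement(items)])
# ['outrage clip', 'hobby video', 'calm tutorial']
```

The objective function, not the existence of a recommender, is what produces the two different orderings.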

    • Amon@lemmy.world · 1 day ago

      concept of an algorithm knowing and learning what you like and selecting for you itself isn’t unethical.

      Unless you host it yourself, you have practically given away your soul to an instance operator.

    • Septimaeus@infosec.pub · 1 day ago

      I don’t see why the concept should be unethical

      It’s like engineering drugs that specifically target the brain’s reward systems, the ones associated with human emotional development and socialization.

      Edit: more explicit

        • Septimaeus@infosec.pub · 1 day ago

          Besides the fact that it’s quite difficult to do this non-invasively, giving anyone instant access to any amount of exactly what they want most is likely irresponsible and potentially dangerous, like designing escapist drugs: there’s a fine line between helping and hurting, and you must consider both.

          I definitely find the lack of care on the part of fellow computer scientists irresponsible. I’ve rejected grant follow-ups for thinly veiled weapons research for the same reason, i.e., potential for misuse.

          • MonkeMischief@lemmy.today · 14 hours ago

            giving anyone instant access to any amount of exactly what they want most is dangerous (Edit: likely irresponsible, potentially dangerous, like designing escapist drugs

            Oh wow, you so perfectly and succinctly described all the empty promises of AI hype in one elegant line. 😬

    • brucethemoose@lemmy.world · 2 days ago

      It’s expensive for video though.

      In other words, I have a hard time seeing Pixelfed with a high-quality “benign” TikTok-style algorithm. It’s already possible for music, but video data and analysis are just so voluminous that, without the profitable exploitation backing it, I don’t see how they’d pay for it.

      • MonkeMischief@lemmy.today · 14 hours ago

        We also have to consider moderation. If suddenly everyone jumped to the fediverse all at once… hoo boi, let’s just say I bet the FBI would have quite a field day.

        But then again, there are PeerTube instances that seem to be doing pretty well, so… I dunno?