I have seen some critical views on Nostr in discussions of decentralized networks, but most seem to be focused on culture, not function.

What are the functional / protocol differences that make you prefer ActivityPub over Nostr?

  • Ada@piefed.blahaj.zone · 1 day ago

    I moved away from centralised social media because platforms owned by multinational corporations benefit from bigotry and rage, and so allow them to fester and grow. They do this by under-moderating, or by moderating with a bias against the people being harassed and attacked.

    So the last thing I would choose to do is go to a platform/network that prides itself on its lack of moderation, and that requires vulnerable, targeted folk to play whack-a-mole, with each person having to reactively block individual bigots, one by one, after they’ve appeared and dumped their payload of hate.

      • squirrel@piefed.kobel.fyi · 15 hours ago

        I’ve read somewhere that the fediverse has the most moderators per user of any social network because of its decentralized nature. Can’t find the source right now though.

      • Ada@piefed.blahaj.zone · 20 hours ago

        In theory yes, in practice, no.

        Nostr uses relays. In some ways, a relay is like an instance on the fediverse. Where they differ, though, is that a) relays don’t talk to each other and b) users can sign up to many different relays and pull/push content to all of them.

        So in practice, in order to see a wide range of content, you end up connecting to multiple relays. And even though a relay does have some moderation capability to block content, unless every relay you use blocks the content from the bigoted account, you’ll see it.
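        To make those mechanics concrete, here is a minimal sketch of a client pulling one feed from several relays over the NIP-01 wire protocol (TypeScript; the relay URLs are placeholders):

        ```typescript
        // Subscribe to the same feed on several relays; dedupe by event id,
        // since the same event typically arrives from more than one relay.
        const relays = ["wss://relay.example.com", "wss://other.example.net"];
        const seen = new Set<string>();

        for (const url of relays) {
          const ws = new WebSocket(url);
          ws.onopen = () => {
            // ["REQ", <subscription id>, <filter>]: ask for recent kind-1 notes.
            ws.send(JSON.stringify(["REQ", "feed", { kinds: [1], limit: 20 }]));
          };
          ws.onmessage = (msg) => {
            const [type, , event] = JSON.parse(msg.data);
            if (type !== "EVENT" || seen.has(event.id)) return;
            seen.add(event.id);
            // Nothing upstream filters this for you: unless every relay you
            // query rejects an author, their events still reach your client.
            console.log(event.pubkey, event.content);
          };
        }
        ```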

        If you signed up to only a single relay, and that relay had good moderation, then in theory your Nostr experience wouldn’t be terrible, but a single niche relay like that means you’ll see basically no content. And as soon as you connect to a larger public relay to get more content, you lose all of the moderation advantages offered by your first relay, which means that in practice there is no incentive to run a well-moderated relay.

        And so all of the moderation ends up on the end user, who has to manually block accounts only after they appear and dump their load of hate (at which point, the bigot will just spin up another account). Some people prefer that experience, but when you’re the regular target of hate, that approach just doesn’t work for many folk.

      • lmmarsano@lemmynsfw.com · 15 hours ago

        So there are no moderation tools / whitelist/blacklists?

        This is a good thing: bitchasses need to learn words are harmless & they can ignore them like humanity has done for millennia. Moderation isn’t built into the servers; client-side tooling would handle it, so it’s entirely at the discretion of the user, which seems better to me.

        • Skavau@piefed.social · 19 hours ago

          It’s not just about that, even if you reduce it to purely that. Without moderation, every platform becomes infested with spammers, trolls, and astroturfers. Topical communities lose focus and become little more than hashtags.

          • lmmarsano@lemmynsfw.com · 19 hours ago

            Yup, better. Moderation should be opt-in & is better handled at the client: the user could opt in to a “moderation community” that publishes tags their client would follow. Such curation, for anyone who wants it, is a better idea. Far better than moderators we don’t get to choose.

            • onlinepersona@programming.dev · 9 hours ago

              So in your world, I could spam the network with CSAM, gore, rape, and everything else and it would be up to a small group of people to filter that out for the rest so they can subscribe to what that small group thinks is appropriate?

            • Skavau@piefed.social · 19 hours ago

              > Yup, better.

              Why is that better?

              > Moderation should be opt-in & is better handled at the client: the user could opt-in to a “moderation community” that publishes tags their client would follow.

              Debatable, given that any community anywhere online still needs to remove CSAM and gore and other things.

              And what do you mean by “tags” here? Just hashtags, or something else? Because a hashtag under a no-moderation concept could still be hijacked.

              > Far better than moderators we don’t get to choose.

              Well I suggest you go there then, because the fediverse will never be what you want it to be.

              • lmmarsano@lemmynsfw.com · 17 hours ago

                > Why is that better?

                User control & flexibility > illegitimate authority. Also, I remember an earlier, untamed, unrulier, more subversive internet than this corporate-friendly crap: it was funner.

                > any community anywhere online still needs to remove CSAM and gore and other things.

                Legal compliance is different from legally unnecessary moderation.

                > Because a hashtag under a no-moderation concept could still be hijacked.

                Not really: Nostr content is cryptographically signed. The user’s client subscribes to some content curators, who post their tags for other events as signed events of their own. The client then processes these tagging events to filter the feed according to the user’s preferences (see the sketch at the end of this comment).

                Some proposals already exist:

                > the fediverse will never be what you want it to be

                Not the topic of discussion, which is function & protocol.
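                On the protocol point, a minimal sketch of that curator model on the client, assuming NIP-56-style report events (kind 1984) as the tagging mechanism; signature checks are omitted, but a real client would verify each event’s sig before trusting its pubkey:

                ```typescript
                // Drop events flagged by curators the user has chosen to trust.
                type NostrEvent = { id: string; pubkey: string; kind: number; tags: string[][] };

                const trustedCurators = new Set(["<curator pubkey hex>"]); // the user's choice
                const flagged = new Set<string>();

                function ingest(ev: NostrEvent) {
                  // A kind-1984 report tags the offending event: ["e", <event id>, <type>].
                  if (ev.kind === 1984 && trustedCurators.has(ev.pubkey)) {
                    for (const [name, value] of ev.tags) {
                      if (name === "e") flagged.add(value);
                    }
                  }
                }

                // Entirely the user's policy: swap curators, combine several, or use none.
                const visible = (ev: NostrEvent) => !flagged.has(ev.id);
                ```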

                • Skavau@piefed.social · 19 hours ago

                  > Legal compliance is different from legally unnecessary moderation.

                  Right, so there would still need to be moderators. And if they can remove that, they can remove anything.

                  > Not the topic of discussion, which is function & protocol.

                  Right, but you’re just gonna have an unpleasant time here if you loathe all moderation. It’s that simple.

                  • lmmarsano@lemmynsfw.com · 19 hours ago

                    > And if they can remove that, they can remove anything.

                    That’s why there’s no limit on the number of relays an event can be published to (events are usually published to several) or on the number of relays a client can subscribe to (clients usually subscribe to several).
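                    A minimal sketch of that redundancy (NIP-01 publishing; which relays to use is entirely the publisher’s choice):

                    ```typescript
                    // Publish one signed event to several relays, so no single
                    // relay's removal decision can make it disappear.
                    function publish(signedEvent: object, relayUrls: string[]) {
                      for (const url of relayUrls) {
                        const ws = new WebSocket(url);
                        // ["EVENT", <event>] is the NIP-01 publish message.
                        ws.onopen = () => ws.send(JSON.stringify(["EVENT", signedEvent]));
                      }
                    }
                    ```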

                    > unpleasant time here if you loathe all moderation

                    Still off topic, and the moderators here aren’t reddit moderator scum so far. The modlog offers decent transparency.

        • Cooper8@feddit.online (OP) · 16 hours ago

          Client-side curation effectively sounds like whitelisting: if you follow only the curated feed, and the curated feed re-signs all events posted by selected keys, that’s a whitelist. That seems like a decent solution for casual users, so long as they can find trusted curators, and clients that let those curators be easily discovered and subscribed to. What client is best for this currently?
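          For what it’s worth, a minimal sketch of that whitelist reading (the curator key and list shape here are illustrative rather than a fixed standard, though NIP-51 list events carry “p” tags in roughly this form):

          ```typescript
          // Show an event only if its author appears in a trusted curator's list.
          const CURATOR_PUBKEY = "<hex pubkey>"; // hypothetical trusted curator

          const allowedAuthors = new Set<string>();

          function onListEvent(ev: { pubkey: string; tags: string[][] }) {
            if (ev.pubkey !== CURATOR_PUBKEY) return;
            for (const [name, value] of ev.tags) {
              if (name === "p") allowedAuthors.add(value); // ["p", <author pubkey>]
            }
          }

          // The whitelist: anything not from an allowed author is dropped.
          const showEvent = (ev: { pubkey: string }) => allowedAuthors.has(ev.pubkey);
          ```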

          On the flip side, if those “curators” were able to export and import lists of keys to automatically exclude from feeds, that would be very useful for the curators, who have to sort events and new users, manually or automatically, to build their feeds. Is that feature currently available? Eliminating known bot accounts from feeds seems like the minimum viable feature set for new curators in the current state of play.
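          The export/import half seems to reduce to the same primitive, assuming NIP-51-style mute lists (kind 10000): the exclusion set is itself a signed event whose “p” tags name the keys to drop, so sharing it is just publishing and fetching that event:

          ```typescript
          // A shareable blocklist as a list event: export = publish, import = fetch.
          const muteListEvent = {
            kind: 10000, // NIP-51 mute list (assumed shape; check the NIP for details)
            tags: [
              ["p", "<bot pubkey 1>"],
              ["p", "<bot pubkey 2>"],
            ],
            content: "",
            // created_at, pubkey, id, and sig are added when the curator signs it.
          };

          const muted = new Set(
            muteListEvent.tags.filter(([name]) => name === "p").map(([, pk]) => pk),
          );
          const keep = (ev: { pubkey: string }) => !muted.has(ev.pubkey);
          ```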