all 9 comments

[–]zyxzevn 3 insightful - 1 fun - (3 children)

Reputation can be manipulated

You can give yourself reputation with hundreds of alternative accounts, by infiltrating a site with a group, or by buying or even stealing old accounts.

Some forums seem to solve it by adding a paywall, but that does not mean the information is any good. And it does not stop companies or intelligence agencies from manipulating forums.

People with lots of reputation (or karma on Reddit) tend to tell people what they want to hear. It is not based on quality or truth.

Trust

I think it could work in some trust-network. Maybe the Mana system works that way.

There will be different networks that trust each other internally, but not other networks.

And trust depends on the topic. People tend to be good at one topic, but not so much at others. You may trust your partner, but a good car mechanic will know more about your car.

[–]icebong[S] 1 insightful - 1 fun - (2 children)

You can give yourself reputation with 100s of alternative accounts.

This is known as a Sybil attack, which is the main problem that IOTA addresses, as I linked in another comment in this thread. The way they deal with it is by assigning each node (user) a weight (reputation), so that users with more weight are more trusted. Building that weight takes a lot of time and posting for good actors, and a lot of money (buying up accounts) for bad actors. So the CIA can buy a trusted account for thousands of dollars, but the moment it starts to spew out propaganda and users report it, it will lose that reputation fast.
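A minimal sketch of that asymmetry (my own toy model, not IOTA's actual mechanism): reputation is earned slowly through posting, but each upheld report costs a large multiplicative chunk, so a bought account collapses soon after it starts spamming.

```python
# Toy model: slow additive gains, fast multiplicative losses.

class Account:
    def __init__(self):
        self.reputation = 0.0

    def on_quality_post(self):
        # Earning trust is slow: small gain per well-received post.
        self.reputation += 1.0

    def on_verified_report(self):
        # Losing trust is fast: each upheld report halves the score,
        # so a propaganda account's standing evaporates quickly.
        self.reputation *= 0.5

acct = Account()
for _ in range(100):   # a long history of good posts
    acct.on_quality_post()
for _ in range(5):     # five upheld reports after the account is sold
    acct.on_verified_report()
print(acct.reputation)  # 100 * 0.5**5 = 3.125
```

Years of accumulated reputation can be wiped out by a handful of reports, which is exactly the deterrent the comment describes.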

So IOTA says: more mana = more messages read. Within a community there is a natural distribution of users; say the top 10% of users contribute most of the posts, something like that (I can't remember exactly). With such a system, the more mana you have, the higher you appear on the 'front page'.
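In its simplest form, "more mana = more messages read" is just a ranking rule: sort posts by the author's mana. A sketch (field names are made up for illustration):

```python
# Front page as a plain mana sort: highest-mana authors surface first.

posts = [
    {"author": "alice",   "mana": 120, "title": "Post A"},
    {"author": "bob",     "mana": 15,  "title": "Post B"},
    {"author": "carol",   "mana": 400, "title": "Post C"},
]

front_page = sorted(posts, key=lambda p: p["mana"], reverse=True)
for p in front_page:
    print(p["title"], p["mana"])   # Post C, Post A, Post B
```

A real system would blend mana with recency and votes, but the principle is the same: visibility is a function of earned weight.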

But yeah, trust is very important in today's systems. I think users need to 'earn' trust within a community, but new users should still have a say. So with low mana/trust, their commenting and posting would be limited, and they would carry a visible indicator (we know we can't trust accounts that are less than a week old on Reddit). But instead of hiding it in their profile, it should be displayed next to their username... something like that.
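The two pieces described above, a trust-gated posting allowance and a badge shown next to the username, could look something like this (thresholds and names are my own assumptions):

```python
# Low-trust users can still post, just at a limited rate, and new or
# low-mana accounts carry a visible badge instead of hiding it in the
# profile.

def post_limit_per_day(mana: int) -> int:
    # New users get a small allowance; it grows with earned mana.
    if mana < 10:
        return 2
    if mana < 100:
        return 10
    return 50

def badge(account_age_days: int, mana: int) -> str:
    # Rendered next to the username in the comment header.
    if account_age_days < 7:
        return "[new account]"
    if mana < 10:
        return "[low trust]"
    return ""

print(post_limit_per_day(3), badge(2, 3))        # 2 [new account]
print(post_limit_per_day(250), badge(400, 250))  # 50, no badge
```

This keeps the door open for newcomers while making it cheap for readers to discount accounts that haven't earned trust yet.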

[–]zyxzevn 1 insightful - 1 fun - (1 child)

I agree that the mana system could work ok.
I am even thinking about having multiple layers of trust networks, but do not know exactly how.

[–]icebong[S] 1 insightful - 1 fun - (0 children)

So...

First level: USER SIDE. Users will be given tools to help filter through posts; they decide what kinds of posts they want to see, e.g. "show posts only by users over level 3".

Second level: MOD TEAMS. Mods can and will become corrupt over time. We all start off with the best intentions, but nothing is ever guaranteed. Communities should default to the mod team that created the community; however, from the sidebar, anyone can start their own mod team for that particular community. That way, a user can check what the information looks like under another mod team, and even change their settings to view the community under a non-default mod team.

Third level: MANA reputation. We talked about this: a beefed-up reputation system.

This way, it's harder for a central entity to control the narrative, since the filtering of information is much more 'decentralized'.
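The three levels above compose naturally into a client-side filter: the user picks a minimum author level and a mod team, and the client decides visibility locally, so no single entity controls what is shown. A sketch under those assumptions (all names hypothetical):

```python
# Client-side visibility check combining the user filter (level 1) and
# the selected mod team (level 2); author_level would come from the
# mana/reputation system (level 3).

posts = [
    {"author_level": 5, "removed_by": set(),       "title": "A"},
    {"author_level": 1, "removed_by": set(),       "title": "B"},
    {"author_level": 4, "removed_by": {"default"}, "title": "C"},
]

def visible(posts, min_level, mod_team):
    return [p for p in posts
            if p["author_level"] >= min_level      # user-side filter
            and mod_team not in p["removed_by"]]   # chosen mod team

# Under the default mod team, only "A" passes ("B" is below level 3,
# "C" was removed by the default team).
print([p["title"] for p in visible(posts, 3, "default")])
# Switching to an alternative mod team reveals "C" as well.
print([p["title"] for p in visible(posts, 3, "alt")])
```

Because removal is recorded per mod team rather than applied destructively, switching teams is just a different filter over the same data.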

[–]icebong[S] 2 insightful - 1 fun - (0 children)

One form of reputation system: Mana, used by IOTA.

https://blog.iota.org/explaining-mana-in-iota-6f636690b916/

[–]fschmidt 1 insightful - 1 fun - (1 child)

If moderators truly control their forums then they will have enough incentive to deal with this issue manually, and that is good enough.

[–]BadSprocket 1 insightful - 1 fun - (1 child)

Shills and bots are a problem, but you can't reward or penalize them enough to make them go away. Just use your own filters, like you do with MSM or anything else. If mods get their hands too dirty, moderation becomes a full-time job, and then you get Reddit.

[–]icebong[S] 1 insightful - 1 fun - (0 children)

I completely agree; we also need filtering on the user side (front end).

[–]Pononimus 1 insightful - 1 fun - (0 children)

You know what the best first solution is for getting away from the bots and shills? Find alternatives to being online, or find alternative online options (like, but not limited to, things like IRC) where we wouldn't have to put up with that crap.