As a community-based app, Jodel progresses only with the help and direct involvement of its users. From the beginning we have given our local communities the power to curate their feed by downvoting and reporting abusive content. In our mission to involve the community even further, we decided that our users should also take part in the moderation process.
How does the moderation selection process work?
Moderating a community is a privilege reserved for our most trusted and positive users. There is a minimum karma requirement to enter the pool of potential moderators; however, karma is not the deciding factor in the selection process.
How do we look for potential moderators?
When looking for potential moderators, our algorithms look for users who:
post positive and supportive content
flag content that breaks our Guidelines
help other users by answering their questions
tolerate the opinions of others and support diversity in our community
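To make the two-step selection concrete, here is a minimal sketch in Python. The signal names, thresholds, and scoring rule are all illustrative assumptions, not Jodel's actual model; the only point taken from the text is that karma merely gates entry to the pool, while positive behaviour decides.

```python
from dataclasses import dataclass

# Hypothetical threshold; the real minimum karma requirement is not public.
KARMA_THRESHOLD = 1000

@dataclass
class UserStats:
    karma: int
    upvote_ratio: float      # share of the user's posts that are upvoted (assumed signal)
    valid_flag_ratio: float  # share of the user's flags later confirmed (assumed signal)
    helpful_replies: int     # replies that answered other users' questions (assumed signal)

def is_moderator_candidate(stats: UserStats) -> bool:
    """Karma only gates entry to the pool; positive signals decide."""
    if stats.karma < KARMA_THRESHOLD:
        return False  # not even in the pool of potential moderators
    return (stats.upvote_ratio >= 0.8
            and stats.valid_flag_ratio >= 0.7
            and stats.helpful_replies >= 10)
```

A user with very high karma but a poor flagging record would still be rejected here, which mirrors the statement that karma is not the deciding factor.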
How do we avoid potential abuse?
We require input from more than one moderator before content is blocked. Once enough moderators have decided on a flagged post, our algorithms resolve it. Moderators who reported a piece of content themselves are not allowed to moderate it.
When moderators do not follow our guidelines, their decisions can be muted. Muted moderators can still moderate, but we do not take their decisions into account. If their decisions improve, they are unmuted. If muted moderators do not improve over time, we may permanently remove their moderation rights.
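The muting lifecycle can be sketched as a small state machine. The agreement thresholds, the removal deadline, and the state fields are all assumptions; the text only establishes the three states (active, muted, removed) and the transitions between them.

```python
# Assumed thresholds for illustration only.
MUTE_BELOW = 0.5    # agreement rate below this mutes a moderator
UNMUTE_ABOVE = 0.7  # a muted moderator at or above this is unmuted
REMOVE_AFTER = 5    # muted rounds without improvement before removal

def update_moderator(state: dict, agreement_rate: float) -> dict:
    """Advance one review round for a moderator.

    state: {"muted": bool, "muted_rounds": int, "removed": bool}.
    While muted, a moderator's decisions are ignored for resolving
    flags but are still compared against the final outcomes, which is
    what agreement_rate measures here.
    """
    if state["removed"]:
        return state  # moderation rights permanently removed
    if not state["muted"]:
        if agreement_rate < MUTE_BELOW:
            state["muted"] = True
            state["muted_rounds"] = 0
    else:
        if agreement_rate >= UNMUTE_ABOVE:
            state["muted"] = False  # decisions improved: unmute
            state["muted_rounds"] = 0
        else:
            state["muted_rounds"] += 1
            if state["muted_rounds"] >= REMOVE_AFTER:
                state["removed"] = True  # no improvement over time
    return state
```

The key design point from the text is that muting is reversible while removal is not: a muted moderator keeps acting, which is exactly what lets the system measure whether their decisions have improved.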