As a community-based app, Jodel thrives on the active participation of its users, who help build a safe, respectful, and fun place to share, laugh, and connect with your local community.
To keep Jodel a positive and safe space, we run a dedicated moderation system. Combined with the community’s participation, it strengthens democratic self-regulation and allows problematic content to be addressed quickly and directly from within the community itself.
The system is anchored in our Community Guidelines and Terms of Service: posts that violate them are blocked, and users who violate them can be banned.
This system combines:
- Smart automated tools that quickly spot potential problems
- Our dedicated team of user-moderators from within communities
- The wider Jodel community through user reports, upvoting, and downvoting

Smart automated tools that quickly spot potential problems
These systems form the front line of our moderation efforts: they are designed to detect problematic content at an early stage and help enforce our guidelines.
Our automated systems are developed and continuously improved by specialized teams. They focus on specific signals – for example, identifying nudity in images or linguistic patterns that may indicate hate speech or violence – and will automatically remove content if it violates our guidelines.
Overall, these systems help us identify potential violations of our Terms of Service and Community Guidelines at an early stage, thereby contributing to a safe and respectful environment.
For content that cannot be clearly classified – for example, when posts are ambiguous or highly context-dependent – our systems refer the case to human moderators and our support team for review.
We regularly monitor the performance of these systems and make adjustments as needed to ensure that decisions remain as fair and reliable as possible.
Our dedicated team of user-moderators from within communities
Users are directly involved in the moderation process. Reports are reviewed by a network of volunteer moderators who make decisions based on a majority principle.
The wider Jodel community through user reports, upvoting, and downvoting
Together, we make sure posts, comments, and images follow our Community Guidelines. Moderation isn’t about limiting what you can say – it’s about protecting everyone from harmful or illegal content and keeping Jodel an open and safe space.
How do we handle wrong flag reasons?
Moderation isn’t always straightforward; sometimes a post is flagged for the wrong reason.
Example: If a harassing post is flagged as “sexually explicit,” it still violates our Guidelines because harassment isn’t allowed. In this case, moderation will still block the post to keep our community safe.