The moderation package is intended to allow for more user participation in the journalistic aspects of a bitweaver website, ranging from simple user feedback on the quality of a news article to collaborative journalism.

The core functionality of the moderation package consists of the ability for users to give feedback on the quality of content on a bitweaver website. This is the fundamental function upon which all other functions of the moderation package are based. There are three decisions we will have to make regarding the implementation of this functionality:

1. The types of content we allow to be rated. Primary candidates are articles, comments, wikis, blogs and media files (e.g. images or sound files), but I imagine other types of content might qualify as well (e.g. users). This decision also largely depends on the technical feasibility.
2. The available methods of rating content. There currently appear to be three primary methods of rating content. The most common method is that of simply giving content a rating on a scale of 0-5, often represented by stars. Another method is that of the simple up/down vote, as employed by [http://digg.com|digg]. Finally, a method particularly popular for rating comments is the [http://www.slashdot.org|Slashdot] rating system.
The different methods have their strong and weak points, and websites often employ different methods for different types of content (e.g. articles vs comments). We will have to decide which method(s) of rating content we provide and whether we allow administrators to choose between different rating methods for different types of content.
3. The final point of concern that affects the basic functionality of rating content is which factors we allow to influence the rating of content and how they influence this rating. In the case of rating a media file, e.g. a photograph, I can imagine that merely taking the average of all user ratings will suffice. Other types of content, however, might benefit from having other factors influence the rating as well. For instance, in the case of user-submitted news articles (e.g. [http://www.digg.com|digg]), factors like time, frequency of votes, number of votes etc. would have to be considered as well.
I therefore propose we do not hard-code the formula that determines the rating of content. A better approach, in my opinion, would be to list the possible factors that can be perceived to influence the rating of content and then allow the administrator to weigh these factors based on their unique situation. We could provide default settings per content type as a starting point.
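To make point 2 concrete, the per-content-type choice of rating method could be a simple configuration mapping. This is only an illustrative sketch, not bitweaver code; the `RatingMethod` enum and the `rating_method` mapping are hypothetical names:

```python
# Illustrative sketch (not actual bitweaver code): letting the
# administrator pick a rating method per content type.
from enum import Enum

class RatingMethod(Enum):
    FIVE_STAR = "five_star"  # 0-5 scale, typically shown as stars
    UP_DOWN = "up_down"      # digg-style up/down vote
    SLASHDOT = "slashdot"    # Slashdot-style moderated comment scores

# Hypothetical per-site configuration, editable by the administrator:
rating_method = {
    "article": RatingMethod.FIVE_STAR,
    "comment": RatingMethod.SLASHDOT,
    "image":   RatingMethod.FIVE_STAR,
}

print(rating_method["comment"].value)  # slashdot
```

Storing the method per content type (rather than globally) is what allows the articles-vs-comments split described above.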
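The proposal in point 3, not hard-coding the formula but letting the administrator weigh the contributing factors, could be sketched roughly as follows. All names and the default weights here are hypothetical, purely to show the shape of the idea:

```python
# Hypothetical sketch of a configurable rating formula: factor weights
# are stored per content type instead of being hard-coded, with
# sensible defaults per content type as a starting point.

DEFAULT_WEIGHTS = {
    # content_type: {factor_name: weight}
    "article": {"mean_score": 0.6, "vote_count": 0.3, "recency": 0.1},
    "image":   {"mean_score": 1.0},  # a plain average suffices for media
}

def combined_rating(content_type, factors, weights=DEFAULT_WEIGHTS):
    """Weighted sum of normalised factor values (each in 0..1),
    scaled to the familiar 0-5 star range."""
    w = weights[content_type]
    total = sum(w.values())
    score = sum(w[name] * factors.get(name, 0.0) for name in w) / total
    return round(5 * score, 2)

# e.g. an article with a mean user score of 4/5 (0.8 normalised),
# many votes and posted recently:
print(combined_rating("article",
                      {"mean_score": 0.8, "vote_count": 0.9, "recency": 1.0}))
# -> 4.25
```

Because the weights live in data rather than code, an administrator could tune them per content type without touching the package itself, which is exactly the flexibility argued for above.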

….

Work in progress!