Expanding the scope of appeals to include members

Service: Narrative

Currently, the appeal system in Narrative is limited to content, comments, and niches. We're working on some improvements, however, to support reporting a member for chronic plagiarism, profile AUP violations, and potentially more in the future. 

Member-focused appeals will be reviewed and decided by the Tribunal.

About Plagiarism Appeals

We want to make sure that claims of plagiarism are not frivolous and that they are based on provable, chronic behavior.

Anyone who submits an appeal against a member for plagiarism will have to provide evidence to support the case, with at least 5 examples of plagiarized content, and links to the source material that the person stole from.  Only members who are Medium Reputation or higher will be able to file such appeals.

If the Tribunal determines that the member did chronically plagiarize, then the offending member will suffer a direct reputation hit and all posts made by the person prior to the Tribunal determination will no longer be eligible for Content Creator rewards. The second time a member receives a plagiarism judgment, that member shall have a permanent ban on Content Creator rewards.  

(When a plagiarism appeal is active, the appellee may not edit or delete any of their content or comments. This is to allow the Tribunal to properly review the appeal.)

We will have more details about the plagiarism appeal process when we actually support it on the network. No ETA is available at this time.

Original Post

Activity Stream

Thanks @Ted. Definitely a welcome and needed move. In my opinion, though, asking a user to provide 5 instances of plagiarism in one report would be a tedious job, requiring users to act as police and keep records, by which time the abuser might see which posts are getting marked as low quality and remove them.

Any chance this could instead work so that if a user is reported for plagiarism 5 times, they automatically get investigated? That would make things a lot more efficient.

Bashar Abdullah posted:

Thanks @Ted. Definitely a welcome and needed move. In my opinion, though, asking a user to provide 5 instances of plagiarism in one report would be a tedious job, requiring users to act as police and keep records, by which time the abuser might see which posts are getting marked as low quality and remove them.

Any chance this could instead work so that if a user is reported for plagiarism 5 times, they automatically get investigated? That would make things a lot more efficient.

I agree that 5 seems too high.  After all, most members have better things to do than check other people's posts for plagiarism.

I do think this is a big step in the right direction, and something that is needed.  But I'd suggest that the number required to make a claim of plagiarism should be more like 2 or 3.  If a person has been caught plagiarizing 2 or 3 times, it's pretty clear that they know what they are doing.  And there are probably a lot more instances that simply haven't been caught.  

Five examples might seem a bit high, but if someone's a chronic plagiarizer, it shouldn't be hard to find 5 examples in a row. Perhaps the team could add a system where, if it looks like a case where someone just forgot to properly credit a picture or something, the person who notices it could just drop a PM to the post owner so the post owner can check it out. Good on the team for being willing to combat fake reports of plagiarism.

Heidi Hecht posted:

Five examples might seem a bit high, but if someone's a chronic plagiarizer, it shouldn't be hard to find 5 examples in a row. Perhaps the team could add a system where, if it looks like a case where someone just forgot to properly credit a picture or something, the person who notices it could just drop a PM to the post owner so the post owner can check it out. Good on the team for being willing to combat fake reports of plagiarism.

It's not hard, but it is time consuming. Today I see a guy stealing content. So I put him on my radar. After a few days I check. He has 4 stolen posts. I open a note to keep track, and wait for the fifth strike. Meanwhile I have seen 10 other people plagiarising.

The process of reporting abusive members is a lot harder than it is for members to steal content. This manual work needs to be cut to encourage more reporting. That, or give a big incentive for reporting such cases, as it's a tedious task.

Heidi Hecht posted:

Five examples might seem a bit high, but if someone's a chronic plagiarizer, it shouldn't be hard to find 5 examples in a row. Perhaps the team could add a system where, if it looks like a case where someone just forgot to properly credit a picture or something, the person who notices it could just drop a PM to the post owner so the post owner can check it out. Good on the team for being willing to combat fake reports of plagiarism.

We had an instance of a person who was posting artwork from other people as his own.  As I understand it, he was warned not to do this.  At that point, he started cropping small portions of other people's paintings, and posting these small, low-res images.  It made it VERY hard to find the originals that he was stealing.  Even so, a member tracked down one of the originals.

My point is that there was an individual who was knowingly stealing artwork from other people, but he was also covering his tracks.  It would be very hard for anyone to find and verify five instances (beyond what we can reasonably expect a member to do).

Yet this is exactly the kind of bad actor we want caught.

That's why I suggest a lower threshold.

Robert Nicholson posted:
Heidi Hecht posted:

Five examples might seem a bit high, but if someone's a chronic plagiarizer, it shouldn't be hard to find 5 examples in a row. Perhaps the team could add a system where, if it looks like a case where someone just forgot to properly credit a picture or something, the person who notices it could just drop a PM to the post owner so the post owner can check it out. Good on the team for being willing to combat fake reports of plagiarism.

We had an instance of a person who was posting artwork from other people as his own.  As I understand it, he was warned not to do this.  At that point, he started cropping small portions of other people's paintings, and posting these small, low-res images.  It made it VERY hard to find the originals that he was stealing.  Even so, a member tracked down one of the originals.

My point is that there was an individual who was knowingly stealing artwork from other people, but he was also covering his tracks.  It would be very hard for anyone to find and verify five instances (beyond what we can reasonably expect a member to do).

Yet this is exactly the kind of bad actor we want caught.

That's why I suggest a lower threshold.

OK, I can see how that would be a problem. Normally I'd suggest doing a reverse image search, but if this member is modifying the pictures in any way, that could become danged near impossible unless you already have a good idea of where to start looking. My chief concern was that, if the standards for plagiarism were too low, then we would have a lot of problems with people making false reports, or with members' accounts being barred from receiving rewards when they did nothing more than copy content over from their own blog and forgot to link back to it, or forgot to credit a few pictures in a post that has a lot of media in it. So I guess there's a bit of a balance to consider.

As others have mentioned - great that we're doing *something*.

I also agree the thresholds are way too lenient.

Instead of 5 provable instances of plagiarism (wow...) - please consider the following system.

 

1) A post is downvoted for plagiarism (this will require a specific downvote reason, rather than it being lumped in with the AUP violation downvote reason), and the poster receives an automated email from Narrative, alerting them to the downvote and explaining what constitutes plagiarism and the best practices for providing attribution for images and quotes.  This should be a no-brainer: inform the poster, at the earliest sign of trouble, what the rules are!

If you want to make sure they don't receive this message for frivolous downvotes, make it so there is a threshold of downvotes, weighted by rep, before the email is triggered.

The email should allow two possible responses from the poster.

a) "I think there has been a mistake": give them a chance to explain in less than 500 characters, why it isn't plagiarism and tell them to provide evidence.  

b) "I've fixed it". 

If the poster does not respond within a given delay (perhaps 3 days), suspend rewards for the post.  Want to introduce a nice safety precaution?  Check that the user has logged in within that period, and only activate the rewards lockout for the post if the person did log in but ignored the plagiarism issue.

If the poster does respond, or has failed to respond within the 3-day period to 2 or more flagged posts, email (or notify on the platform) the downvoters who have a high enough rep to appeal, letting them know the result.

2) If the downvoters find the issue has not been fixed, or the reason provided by the poster is bogus, or if the result is that the poster never responded to 2 or more flagged posts: they can trigger an appeal right there and then.  I frankly don't understand why there is a need for a higher threshold than this.  Increasingly steep rep penalties for appeals deemed to be frivolous would keep excess appeals to a minimum, and would be effective since low rep Narrators can't file them, and higher rep individuals care about their rep.

3) If the appeal is found to be justified, the poster takes a rep hit.  10 points.  Yeah.  Don't mess around.  By now the person has wasted the time of Narrators and the Tribunal, after being reminded of the rules and given a chance to rectify the problem.  If they can't understand what they did wrong, or don't care, the result is the same: they are going to continue to post plagiarised content, so leniency makes no sense.

4) If the same issue arises again, and results in an appeal that is found to be justified, cancel all content rewards for posts prior to that date, and ding the rep 10 points, permanently, leaving a permanent note on their profile.

5) If the issue arises for a third time, allow the Tribunal to either permanently cancel all content rewards for the account, or to ban the account.

--------------------------------------

I've outlined this process as a counterpoint to help discussion.

The main takeaway being - it should only take one provable instance of plagiarism, not corrected after giving the user a chance to correct it, to trigger a consequence.

Secondary takeaway - notice how this process removes the burden on the Community of commenting to the plagiariser that there is a problem, and thereby exposing themselves to abusive downvoting?  Instead, the platform sends an email.  Nobody gets accused of throwing their weight around, and the problem is taken more seriously by the plagiarist.

If the poster is found to have plagiarised, we could even consider discounting any downvotes the plagiariser makes against the appealer(s), to prevent abusive downvotes.
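For the sake of discussion, here is a minimal sketch of how the flow above might hang together. All of the names, thresholds, and penalty values below are made up for illustration; this is just a rough outline of the proposal, not anything Narrative has committed to or implemented.

```python
# Rough sketch of the proposed escalation flow. Hypothetical names and
# thresholds only -- not Narrative's actual implementation.

from dataclasses import dataclass, field
from typing import List

EMAIL_TRIGGER_WEIGHT = 3.0   # rep-weighted downvote score that triggers the notice
REP_PENALTY = 10             # rep hit per upheld appeal ("don't mess around")

@dataclass
class Post:
    title: str
    rewards_suspended: bool = False
    rewards_cancelled: bool = False

@dataclass
class Poster:
    name: str
    rep: float = 0.0
    strikes: int = 0
    rewards_banned: bool = False
    posts: List[Post] = field(default_factory=list)

def send_plagiarism_notice(poster: Poster, post: Post) -> None:
    # Placeholder for the automated platform email in step 1.
    print(f"Notice sent to {poster.name} about '{post.title}'")

def on_plagiarism_downvotes(poster: Poster, post: Post, weighted_score: float) -> None:
    """Step 1: notify the poster once rep-weighted plagiarism downvotes pass a threshold."""
    if weighted_score >= EMAIL_TRIGGER_WEIGHT:
        send_plagiarism_notice(poster, post)

def on_response_window_expired(post: Post, logged_in_during_window: bool) -> None:
    """Suspend the post's rewards only if the poster logged in but ignored the notice."""
    if logged_in_during_window:
        post.rewards_suspended = True

def on_upheld_appeal(poster: Poster) -> None:
    """Steps 3-5: escalating consequences each time the Tribunal upholds an appeal."""
    poster.strikes += 1
    poster.rep -= REP_PENALTY
    if poster.strikes == 2:
        for p in poster.posts:           # cancel rewards on all prior posts
            p.rewards_cancelled = True
    elif poster.strikes >= 3:
        poster.rewards_banned = True     # or escalate to a Tribunal-decided account ban
```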

What @Malkazoid said, because of the labor expected from users as described by @Bashar Abdullah, and the example given by @Robert Nicholson.

My rewards went down SIGNIFICANTLY in June because I was spending/wasting so much time investigating suspected plagiarism, commenting with a link to the plagiarized material, and then getting downvoted like crazy for doing so. I'm already taking a financial hit for the amount of work I'm currently doing to help monitor the plagiarism situation, because it's taking away from time that could be spent publishing new content, and now @Narrative wants me to do MORE work while the plagiarists continue to multiply.

Oh, and they're getting sneakier now, because they appear to be pasting foreign language articles into Google Translate and copying the resulting English nonsense into posts on here. All I can do is vote them low quality at this point because I can't reverse-engineer to discern the original post's language and locate the originals.

The full extent of what users should be expected to do to report plagiarism is to select Copyright Infringement from the downvote options and paste the URL of the original content that has been plagiarized for Narrative to use as evidence.

Here's a thought.  I think Ted's original proposal places too much burden on an individual to make a case, and I frankly don't see many people doing that.

On the other hand, suppose the Downvote dialog had an option for plagiarism, which required the down-voter to fill in the URL of the original source.  That's not a great burden on the person reporting.

The system could keep track of the number of different posts from a user that had been flagged for plagiarism.  Once a threshold is reached (three, or five, or whatever) the reports would automatically be referred to the Tribunal.  The reports could come from different people, so you wouldn't be asking or expecting one user to take on the heavy burden of building a case.
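Something like this, to sketch the idea (the names and the threshold value are just placeholders, not how Narrative actually works):

```python
# Quick sketch of the flag-counting idea with hypothetical names and a
# placeholder threshold.

from collections import defaultdict

REFERRAL_THRESHOLD = 3          # three, five, or whatever

# member id -> set of their post ids flagged for plagiarism
flagged_posts = defaultdict(set)

def report_plagiarism(poster_id: str, post_id: str, source_url: str) -> bool:
    """Record a plagiarism downvote (original-source URL required); return True
    when the member should be referred to the Tribunal."""
    if not source_url:
        raise ValueError("A link to the original source is required")
    flagged_posts[poster_id].add(post_id)
    return len(flagged_posts[poster_id]) >= REFERRAL_THRESHOLD
```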

Robert Nicholson posted:

Here's a thought.  I think Ted's original proposal places too much burden on an individual to make a case, and I frankly don't see many people doing that.

On the other hand, suppose the Downvote dialog had an option for plagiarism, which required the down-voter to fill in the URL of the original source.  That's not a great burden on the person reporting.

The system could keep track of the number of different posts from a user that had been flagged for plagiarism.  Once a threshold is reached (three, or five, or whatever) the reports would automatically be referred to the Tribunal.  The reports could come from different people, so you wouldn't be asking or expecting one user to take on the heavy burden of building a case.

I think this is also an improvement on the current plan.

What it lacks compared to my proposal is an opportunity to automatically alert the poster, neutrally via a platform email, that there seems to be a plagiarism issue with their posting, giving them a chance to rectify it and walk the straight and narrow from then on, or to defend themselves.

My proposal also allows the community to be the first assessors of the defense, possibly avoiding the need for the Tribunal to be involved at all.  If the poster says 'hey! that was my post on another platform, so it isn't plagiarism - check out the shoutout to Narrative.org that I added at the end of the post to prove I am in control of that account...', problem solved, appeal averted.

I do think it is a great idea to require the link to the offsite content suspected to have been copied.  I can see only upside there!

 

@Malkazoid That's a good plan as well. The user should get notified. The one thing I don't think we should be doing is giving the user a chance to fix a stolen article, though. He stole an article, someone spends time finding it out, the user gets an email, and he alters the text to cover it up.

I can understand if it's just one photo in a long article, but giving a chance to remedy stolen articles means we'll be dealing with these plagiarizers forever.

 

Thanks, @Ted!  I do think this is a step in the right direction.  I agree with many of the commenters here that 5 instances is way too high.  I prefer @Malkazoid's proposal combined with @Robert Nicholson's approach.  The reporter of the violation would have a specific downvote option and be required to provide a link to the material in question.  If the user fails to defend themselves, or is found to be at fault...they get a strike, along with the removal of rewards for that post.

That being said, I also understand that it takes far less development effort to simply create a way to report a copyright claim to the tribunal.

I think initially to keep the dev effort low, 3 instances should be enough to create a valid plagiarism claim.

My fear is that someone doing this intentionally is just going to create more anonymous accounts.  To help combat this, I think maybe the solution is you can't redeem rewards unless you have a high reputation.  On top of that, if a post is found to be plagiarized, I think the earned rewards should be forfeited back to the rewards pool.  This should significantly reduce / deter this kind of behavior in my estimation as it wouldn't be profitable since the scammers couldn't actually redeem their rewards. 

This is really welcome news, @Ted. @Christina Gleason makes good points. No more than 3 offenses, automated contact, and a special downvote with a link to the plagiarized content would be great additions. I also agree with locking rewards when a plagiarism vote or report is made, until it's resolved, and with forfeiting rewards on plagiarized posts.

Ted posted:

Currently, the appeal system in Narrative is limited to content, comments, and niches. We're working on some improvements, however, to support reporting a member for chronic plagiarism, profile AUP violations, and potentially more in the future. 

Member-focused appeals will be reviewed and decided by the Tribunal.

About Plagiarism Appeals

We want to make sure that claims of plagiarism are not frivolous and that they are based on provable, chronic behavior.

Anyone who submits an appeal against a member for plagiarism will have to provide evidence to support the case, with at least 5 examples of plagiarized content, and links to the source material that the person stole from.  Only members who are Medium Reputation or higher will be able to file such appeals.

If the Tribunal determines that the member did chronically plagiarize, then the offending member will suffer a direct reputation hit and all posts made by the person prior to the Tribunal determination will no longer be eligible for Content Creator rewards. The second time a member receives a plagiarism judgment, that member shall have a permanent ban on Content Creator rewards.  

(When a plagiarism appeal is active, the appellee may not edit or delete any of their content or comments. This is to allow the Tribunal to properly review the appeal.)

We will have more details about the plagiarism appeal process when we actually support it on the network. No ETA is available at this time.

It seems Staff believes our time isn't as valuable as theirs. 5? Really!?

How about two? Plagiarism is topped only by threats of violence; one warning will suffice, and the second infraction should mute the account.

You also need to limit the number of posts per day for everyone, but reps under 50 should be limited to one.

There are users posting 10 a day. Do you really think people will remain interested in policing the platform, when the theft outnumbers the original content?

Until we are fully decentralized, which won't be for years, Staff is the frontline of defense, like it or not. You need to step up or start giving us more tools, rather than making our jobs harder.

This proposal makes it more difficult and is NOT a step in the right direction.

Malkazoid posted:
Robert Nicholson posted:

Here's a thought.  I think Ted's original proposal places too much burden on an individual to make a case, and I frankly don't see many people doing that.

On the other hand, suppose the Downvote dialog had an option for plagiarism, which required the down-voter to fill in the URL of the original source.  That's not a great burden on the person reporting.

The system could keep track of the number of different posts from a user that had been flagged for plagiarism.  Once a threshold is reached (three, or five, or whatever) the reports would automatically be referred to the Tribunal.  The reports could come from different people, so you wouldn't be asking or expecting one user to take on the heavy burden of building a case.

I think this is also an improvement on the current plan.

What it lacks compared to my proposal is an opportunity to automatically alert the poster, neutrally via a platform email, that there seems to be a plagiarism issue with their posting, giving them a chance to rectify it and walk the straight and narrow from then on, or to defend themselves.

My proposal also allows the community to be the first assessors of the defense, possibly avoiding the need for the Tribunal to be involved at all.  If the poster says 'hey! that was my post on another platform, so it isn't plagiarism - check out the shoutout to Narrative.org that I added at the end of the post to prove I am in control of that account...', problem solved, appeal averted.

I do think it is a great idea to require the link to the offsite content suspected to have been copied.  Only upside there!

 

I didn't mean to ignore or downplay your proposal.  My own idea just sort of popped into my head.  As an engineer myself, the reason it appealed to me is that it would be trivial to implement.

It would also be very easy to automatically generate a message to the poster any time a post was downvoted for plagiarism.

 

Robert Nicholson posted:
Malkazoid posted:
Robert Nicholson posted:

Here's a thought.  I think Ted's original proposal places too much burden on an individual to make a case, and I frankly don't see many people doing that.

One the other hand suppose the Downvote dialog had an option for plagiarism, which required the down-voter to fill in the URL of the original source.  That's not a great burden on the person reporting.

The system could keep track of the number of different posts from a user that had been flagged for plagiarism.  Once a threshold is reached (three, or five, or whatever) the reports would automatically be referred to the Tribunal.  The reports could come from different people, so you wouldn't be asking or expecting one user to take on the heavy burden of building a case.

I think this is also an improvement on the current plan.

What it lacks compared to my proposal is an opportunity to automatically alert the poster, neutrally via a platform email, that there seems to be a plagiarism issue with their posting, giving them a chance to rectify it and walk the straight and narrow from then on, or to defend themselves.

My proposal also allows the community to be the first assessors of the defense, possibly avoiding the need for the Tribunal to be involved at all.  If the poster says 'hey! that was my post on another platform, so it isn't plagiarism - check out the shoutout to Narrative.org that I added at the end of the post to prove I am in control of that account...', problem solved, appeal averted.

I do think it is a great idea to require the link to the offsite content suspected to have been copied.  Only upside there!

 

I didn't mean to ignore or downplay your proposal.  My own idea just sort of popped into my head.  As an engineer myself, the reason it appealed to me is that it would be trivial to implement.

It would also be very easy to automatically generate a message to the poster any time a post was downvoted for plagiarism.

 

I think the problem with the automated approach is that you would have to set up this whole negotiation system for the user to say...oops...didn't mean to...let me change that...or sorry, won't happen again...in either case it should count as a first warning or strike regardless of whether they change it, in my opinion.  I think the only way an automated solution would work is if you had a dedicated role, like a moderator pool, but composed of voted-on, high-quality members, that would handle each of these reported requests.  The appeals process would still need to be handled by the tribunal, but maybe not until the user got 3 strikes??  The whole point is, we can't be sending every plagiarism case to the tribunal; they would get absolutely buried and not be able to do anything else...they should really only have to deal with appeals where they are deciding on the final, very serious action being taken against the reported user.  I also like the new role/pool idea because it sources this effort from the community and compensates them (I think it should come out of the creator rewards piece of the pie to fund this).  The pool can grow or shrink based on the average number of requests each member needs to review.

 

I think the problem with the automated approach is that you would have to set up this whole negotiation system for the user to say...oops...didn't mean to...let me change that...or sorry, won't happen again...in either case it should count as a first warning or strike regardless of whether they change it, in my opinion.  I think the only way an automated solution would work is if you had a dedicated role, like a moderator pool, but composed of voted-on, high-quality members, that would handle each of these reported requests.  The appeals process would still need to be handled by the tribunal, but maybe not until the user got 3 strikes??  The whole point is, we can't be sending every plagiarism case to the tribunal; they would get absolutely buried and not be able to do anything else...they should really only have to deal with appeals where they are deciding on the final, very serious action being taken against the reported user.  I also like the new role/pool idea because it sources this effort from the community and compensates them (I think it should come out of the creator rewards piece of the pie to fund this).  The pool can grow or shrink based on the average number of requests each member needs to review.

Re-read my post.  The flags for plagiarism would not be referred to the tribunal until a threshold was reached...  a certain number (3? 5?) of posts flagged for plagiarism.  Prior to that, warning messages could be sent to the user, but the tribunal would only be involved when a number of posts had been flagged.

I think this would be very simple to implement, and it would not require a lot of effort on the part of members to report plagiarism.

Robert Nicholson posted:
 

I think the problem with the automated approach is that you would have to set up this whole negotiation system for the user to say...oops...didn't mean to...let me change that...or sorry, won't happen again...in either case it should count as a first warning or strike regardless of whether they change it, in my opinion.  I think the only way an automated solution would work is if you had a dedicated role, like a moderator pool, but composed of voted-on, high-quality members, that would handle each of these reported requests.  The appeals process would still need to be handled by the tribunal, but maybe not until the user got 3 strikes??  The whole point is, we can't be sending every plagiarism case to the tribunal; they would get absolutely buried and not be able to do anything else...they should really only have to deal with appeals where they are deciding on the final, very serious action being taken against the reported user.  I also like the new role/pool idea because it sources this effort from the community and compensates them (I think it should come out of the creator rewards piece of the pie to fund this).  The pool can grow or shrink based on the average number of requests each member needs to review.

Re-read my post.  The flags for plagiarism would not be referred to the tribunal until a threshold was reached...  a certain number (3? 5?) of posts flagged for plagiarism.  Prior to that, warning messages could be sent to the user, but the tribunal would only be involved when a number of posts had been flagged.

I think this would be very simple to implement, and it would not require a lot of effort on the part of members to report plagiarism.

The problem is, as soon as you report it to the user, they will have the ability to change the post, and as far as I know, Narrative isn't storing post history...so now you have a piece of content that has been flagged but may have been changed so it no longer violates.  Do they just never get punished because they always change the posts that get caught??

Banter posted:
Robert Nicholson posted:
 

I think the problem with the automated approach is that you would have to set up this whole negotiation system for the user to say...oops...didn't mean to...let me change that...or sorry, won't happen again...in either case it should count as a first warning or strike regardless of whether they change it, in my opinion.  I think the only way an automated solution would work is if you had a dedicated role, like a moderator pool, but composed of voted-on, high-quality members, that would handle each of these reported requests.  The appeals process would still need to be handled by the tribunal, but maybe not until the user got 3 strikes??  The whole point is, we can't be sending every plagiarism case to the tribunal; they would get absolutely buried and not be able to do anything else...they should really only have to deal with appeals where they are deciding on the final, very serious action being taken against the reported user.  I also like the new role/pool idea because it sources this effort from the community and compensates them (I think it should come out of the creator rewards piece of the pie to fund this).  The pool can grow or shrink based on the average number of requests each member needs to review.

Re-read my post.  The flags for plagiarism would not be referred to the tribunal until a threshold was reached...  a certain number (3? 5?) of posts flagged for plagiarism.  Prior to that, warning messages could be sent to the user, but the tribunal would only be involved when a number of posts had been flagged.

I think this would be very simple to implement, and it would not require a lot of effort on the part of members to report plagiarism.

The problem is, as soon as you report it to the user, they will have the ability to change the post, and as far as I know, Narrative isn't storing post history...so now you have a piece of content that has been flagged but may have been changed so it no longer violates.  Do they just never get punished because they always change the posts that get caught??

How about we lock the post as soon as the threshold of copyright violation votes has been reached?  Then if the poster chooses to fix the post, they can work on a copy of it generated by the system, but cannot modify the original.
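In other words, something along these lines. Just a sketch with made-up field names and an arbitrary threshold, to show the lock-and-copy idea, not an actual Narrative feature:

```python
# Hypothetical sketch of the lock-and-copy idea; the post is modeled as a
# plain dict with "id" and "locked" keys, and the threshold is a placeholder.

import copy

def lock_and_fork(post: dict, violation_votes: int, lock_threshold: int = 3):
    """Lock the original once copyright-violation votes hit the threshold and
    hand the poster an editable copy, so the evidence can't be altered."""
    if violation_votes >= lock_threshold:
        post["locked"] = True                 # original is frozen for review
        draft = copy.deepcopy(post)           # poster edits this copy instead
        draft["locked"] = False
        draft["is_draft_of"] = post["id"]
        return draft
    return None
```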

 

Persistent plagiarizing is the target in HQ's appeal method, and since removing or editing posts isn't allowed, is relying on the profile being flagged for plagiarism good enough, or is a lockdown needed?

We own our posts. Is the company or the community tribunal allowed to lock a post so that the owner can't remove their property?  Or, since there's a question of whether the poster actually is the owner, is locking it down permitted until it's decided either way?

An automated message - one that points out the consequences for persisting - that results in the poster changing or removing the post to eliminate copied material, is a good thing, IMO. I think for most, it would be an effective deterrent. 

If locking is allowed, the system can automatically lock any offending posts after the first warning, going through with the already declared consequences. My 2 cents anyway.

Colleen Ryer posted:

 

Persistent plagiarizing is the target in HQ's appeal method, and since removing or editing posts isn't allowed, is relying on the profile being flagged for plagiarism good enough, or is a lockdown needed?

We own our posts. Is the company or the community tribunal allowed to lock a post so that the owner can't remove their property?  Or, since there's a question of whether the poster actually is the owner, is locking it down permitted until it's decided either way?

An automated message - one that points out the consequences for persisting - that results in the poster changing or removing the post to eliminate copied material, is a good thing, IMO. I think for most, it would be an effective deterrent. 

If locking is allowed, the system can automatically lock any offending posts after the first warning, going through with the already declared consequences. My 2 cents anyway.

I don't think it would necessarily be a deterrent for people who are plagiarising on purpose... They'd just take it as a sign to open a new account and keep on keeping on.

There is no ethical problem with locking a post for the duration of a plagiarism process.  If it turns out there was no problem, then the post can be unlocked.  If it turns out there was a problem and the user opts to fix it by editing the copy, then the copy can replace the original.

Malkazoid posted:
Colleen Ryer posted:

 

Persistent plagiarizing is the target in HQ's appeal method, and since removing or editing posts isn't allowed, is relying on the profile being flagged for plagiarism good enough, or is a lockdown needed?

We own our posts. Is the company or the community tribunal allowed to lock a post so that the owner can't remove their property?  Or, since there's a question of whether the poster actually is the owner, is locking it down permitted until it's decided either way?

An automated message - one that points out the consequences for persisting - that results in the poster changing or removing the post to eliminate copied material, is a good thing, IMO. I think for most, it would be an effective deterrent. 

If locking is allowed, the system can automatically lock any offending posts after the first warning, going through with the already declared consequences. My 2 cents anyway.

I don't think it would necessarily be a deterrent for people who are plagiarising on purpose... They'd just take it as a sign to open a new account and keep on keeping on.

There is no ethical problem with locking a post for the duration of a plagiarism process.  If it turns out there was no problem, then the post can be unlocked.  If it turns out there was a problem and the user opts to fix it by editing the copy, then the copy can replace the original.

No, it wouldn't be a deterrent for the ones intent on this. Just a way to give people who either don't understand the rules or thought they could get away with something a chance to learn - and to reduce the burden on the tribunal. And the reporting community. And sure, if posts can be locked, then I agree, they should be. Unfortunately, the serial account opener is a problem no matter what the approach.

Colleen Ryer posted:

No, it wouldn't be a deterrent for the ones intent on this. Just a way to give people who either don't understand the rules or thought they could get away with something a chance to learn - and to reduce the burden on the tribunal. And sure, if posts can be locked, then I agree, they should be. Unfortunately, the serial account opener is a problem no matter what the approach.

True!  The serial account opener is something we don't know how to combat.

On the face of things, the only way to combat it is with universal certification, which itself is very problematic, especially because of the cost.
