Is there a rate limit other than the 1500 ms for users? I'm trying to use MassEdit on a huge Talk namespace, but the program eventually freezes in the same spot no matter how much I increase the delay/interval. When I try using the huge generated list, the program won't even start.
It shouldn't freeze up; I've tested it on some large data sets without issue. Provide me with the output of the MassEdit status log and the relevant contents of your browser console and I'll look into the issue when I have a moment.
The script will still function in test mode, but it can't edit what doesn't exist. If it's not finding anything matching the target input in the text content of the pages you provided, it won't edit anything. It will simply display the error message "Error: No instances of $1 found in $2", where $1 is the target input and $2 is the name of the current page.
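For context, those $1/$2 placeholders are filled in MediaWiki-message style. A rough illustration of the idea (an assumed sketch for clarity, not MassEdit's actual implementation):

```javascript
// Illustrative sketch only: MassEdit's real implementation may differ.
// Replaces MediaWiki-style numbered placeholders ($1, $2, ...) with the
// corresponding entries of a parameter array, leaving unmatched
// placeholders untouched.
function substitute(message, params) {
  return message.replace(/\$(\d+)/g, function (match, index) {
    var value = params[Number(index) - 1];
    return value !== undefined ? value : match;
  });
}

// e.g. target input "spam text" on the page "Talk:Example"
substitute("No instances of $1 found in $2", ["spam text", "Talk:Example"]);
// → "No instances of spam text found in Talk:Example"
```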
Without reviewing any errors or status messages, I can't diagnose your issue further. You will need to provide me with all relevant MassEdit status log entries and browser console error messages if I am to address the issues you've encountered. Additionally, a link to the wiki on which you're using the script would be helpful.
OK, here's the link to the Common.js of the wiki I'm trying to use it on. I've also attached an image of the settings I'm using. The problem I've been encountering is that the status log eventually freezes before checking all the members of the namespace, and it doesn't continue even when I try pausing and resuming.
It hasn't frozen yet in that screenshot. It usually freezes around the talk pages of articles that start with "C". On a side note, I'm trying to find a certain copypasta spam message. Is using this tool the right approach?
If you're not getting any sort of status message indicating that you've been rate-limited, such as "Error: Editing delayed $1 seconds due to ratelimit", I'd guess an uncaught error is being thrown somewhere. I need to see the contents of your browser console, which you can open by pressing Ctrl+Shift+J on Windows or Cmd+Opt+J on Mac.
As for finding certain messages that could appear anywhere, I'd advise you to use a more specialized tool. You could in theory use MassEdit to comb every page on the wiki for a specific message, but that would be an intensely time-consuming task. You may want to investigate dedicated automatic editing software like Pywikibot or AutoWikiBrowser and invest in a bot account.
EDIT: It'd also be helpful if you could provide me with the complete listing of page titles you were attempting to edit so I can attempt to replicate the error myself in my non-edit debugging version of MassEdit.
EDIT: I'm searching through all the pages in the Talk namespace, so you can just change the type to "Namespaces" and enter "1" under "Page Entries." If you want to see the actual list of page titles, use the "Generate page listing" operation.
Alright, thanks. I'm going to run a few tests on your wiki with my non-edit debug copy of the script and see if I can replicate the strange process freezing issue you described.
That said, the article comment namespace on your wiki has over 37,000 entries in total. MassEdit was never designed to handle such intensive tasks; it was developed as "bot software lite" and is not a substitute for dedicated task automation software like the aforementioned Pywikibot and AutoWikiBrowser. I'd encourage you to check out one of those if you intend to comb through this many pages on a regular basis.
Some 3,313 entries later, in the comment titled "Talk:Celebiemail@example.com," I believe I've discovered the problem. For some reason, the API returns certain "phantom comments" in its collated query results that lack both text content and revision history. MassEdit expects every extant page to have at least one revision in its history (since the act of creating a page is a revision in itself), so it never accounted for cases in which a page might exist in the database without any history or content whatsoever, and it crashes unceremoniously as a result. I chalk this bizarre edge-case behavior up to the janky mess that is Wikia's ArticleComments extension.
Anyway, I've added a shim that checks each page entry for revision history and text content and skips the page if neither is found. Once that's been approved by Staff, you should be able to continue combing through the remaining 34,000 comments without issue. Be sure to let me know if you run into any further problems.
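For the curious, the shim boils down to a guard like the one below. This is an illustrative sketch rather than the script's actual code, and it assumes the page entry's shape follows a typical action=query&prop=revisions&rvprop=content response, where revision text lives under the "*" key:

```javascript
// Illustrative sketch: decide whether a page entry returned by the API
// should be skipped as a "phantom" page. Assumes the entry's shape follows
// a typical action=query&prop=revisions&rvprop=content response.
function isPhantomPage(page) {
  var revisions = page && page.revisions;
  // No revision history at all: the page shouldn't really exist, so skip it.
  if (!Array.isArray(revisions) || revisions.length === 0) {
    return true;
  }
  // A revision entry exists but carries no text content: also skip.
  var text = revisions[0]["*"];
  return typeof text !== "string" || text.length === 0;
}
```

With a guard like this in place, the main editing loop can simply continue past phantom entries instead of assuming every page has a readable latest revision.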
No idea. This particular example may have been the result of a failed deletion attempt that left some vestigial page data behind server-side, or it may just be the latest example of buggy behavior in Wikia's custom in-house extensions.
Either way, once the fix goes live, MassEdit should be able to handle such edge cases gracefully without crashing. I'll likely run a few more tests on your comment namespace myself, but be sure to let me know if you run into any similar problems in the course of your own editing. Cheers!