I never did get around to stress-testing how many wikis could be loaded, sorry about that. I checked my code real quick, and it shouldn't be hard to implement your suggestion (definitely easier than having it load in series instead). So my thought is something like
data-loaddelay=, unless you think it would be better to have the 500ms timeout just be a built-in "precaution" for everyone? (Or even just good "etiquette" towards Wikia?)
- Alright, give it a try now (V1.1.5). The attribute is "data-loaddelay", in milliseconds (10ms by default). As a heads up, the way I currently have it working, if one wiki has to reload its information (loading timeout, etc.) it will retry right away instead of retrying at the "end". I don't believe that should be a problem, but if it does end up being one, let me know (fixing it shouldn't be too hard, but I'm lazy and any rewrite increases the chance of bugs). Let me know if you have any other issues / suggestions / concerns, and thanks for using the script,
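In case it helps to see what I mean, here's a rough sketch of the staggered-start idea (function and variable names are made up for illustration, not the script's actual internals; the defaults match what's described above):

```javascript
// Parse the data-loaddelay attribute (milliseconds); fall back to the 10ms default.
function getLoadDelay(attrValue) {
    var n = parseInt(attrValue, 10);
    return (isNaN(n) || n < 0) ? 10 : n;
}

// Compute when each wiki's request should start, staggered by the configured delay.
// (In the script itself each start time would be handed to setTimeout; a failed
// wiki retries right away rather than waiting for the end of the queue.)
function staggerStarts(wikiCount, delayMs) {
    var starts = [];
    for (var i = 0; i < wikiCount; i++) {
        starts.push(i * delayMs);
    }
    return starts;
}
```

So with data-loaddelay="500" and 3 wikis, the requests would kick off at 0ms, 500ms, and 1000ms.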
Also, since you're VSTF you may be using "onlyshowusers", so this is a "note" and a question. Currently, the script loads all possible entries (up to 500, depending on constraints), then parses out everything except the entries that belong to the defined user(s). The reason for this is that the RecentChanges API ("rcuser: Only list changes made by this user") only allows 1 user to be specified for this behavior (which wouldn't allow multiple users). So my question is: is this fine, or should I also set it up so that if only 1 user is defined, it uses the API feature, so you can get up to 500 entries for that one user on a wiki (versus 500 minus everyone else's edits)? I currently don't use the API way since I wanted the behavior to remain consistent (not hard to change, though), but I'm not sure exactly how VSTF might use the script, so I'm asking.
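To make the two approaches concrete, here's a minimal sketch of both: the client-side filtering the script does now, and what a single-user rcuser query would look like instead (function names and the entry shape are illustrative, not the script's actual code; rcuser/rclimit are real RecentChanges API parameters):

```javascript
// Current behavior: fetch everything, then keep only entries by the listed users.
function filterByUsers(entries, onlyShowUsers) {
    if (!onlyShowUsers || onlyShowUsers.length === 0) { return entries; }
    return entries.filter(function (entry) {
        return onlyShowUsers.indexOf(entry.user) !== -1;
    });
}

// Possible alternative: when exactly one user is defined, let the API filter
// server-side via rcuser, so all 500 entries belong to that user.
function buildQuery(onlyShowUsers) {
    var params = { action: "query", list: "recentchanges", rclimit: 500 };
    if (onlyShowUsers && onlyShowUsers.length === 1) {
        params.rcuser = onlyShowUsers[0];
    }
    return params;
}
```

With multiple users defined, buildQuery leaves rcuser out and the script would keep filtering client-side, which is why the behavior stays consistent either way.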