[08:32:34] hii
[08:32:54] !admin
[08:33:38] i have something to tell you
[08:38:04] Hm, too late.
[09:14:34] Holy fuck. https://www.wikidata.org/w/index.php?title=Q4509&action=history
[09:22:23] tell me about it :)
[09:22:44] I remember seeing this before...
[09:23:10] https://www.wikidata.org/w/index.php?title=Q44437&action=history
[09:23:25] I have this item on my watchlist on trwiki too
[09:24:07] Always Spain or Mexico.
[09:28:36] they start precisely at 7pm EST
[09:30:45] Lydia_WMDE: Vandalism is getting worse every day.
[09:32:01] First, let's mass-patrol that IPv6 again...
[09:34:33] sjoerddebruin: :(
[09:34:42] sjoerddebruin: anything i can do to help?
[09:35:32] I really don't know, attract more volunteers to patrol...
[09:36:12] Or we need to build better abuse filters, but we still don't have a good relationship with that extension.
[09:39:02] *nod*
[09:51:04] What tools that make patrolling easier would you recommend?
[09:51:35] is there a way to filter just the "changed claim" edits on recent changes?
[09:51:38] that'd help a lot
[09:59:36] HakanIST-wrk: as far as I know this is not possible
[10:00:00] and it would be technically complicated, since the entity has only one edit history
[15:16:45] * aude waves
[19:34:36] hey aude :) all good?
[20:05:11] Lydia_WMDE: back :)
[20:05:31] :)
[20:06:34] anything important + urgent to work on?
[20:06:54] (otherwise, i continue with search + other projects sidebar)
[20:07:21] no, those are great
[20:07:52] ok
[20:09:16] maybe we can do https://phabricator.wikimedia.org/T111173 in the next sprint?
[20:09:35] sounds good
[20:09:42] will add to suggested
[20:09:50] ok
[20:09:57] Hm, anyone know where bene is?
[20:10:19] * aude is telling yet another person not to commit to Wikidata.git and instead to the build resources, which are on github
[20:10:46] sjoerddebruin: i have a call with him tomorrow morning
[20:11:18] Would be great if he could take a look at the duplicate references gadget again.
[20:11:22] k
[20:15:01] Almost at 50% with translations on Wikidata. <3
[20:26:51] Oh, wait. :/ https://www.wikidata.org/w/index.php?title=Wikidata%3AFlemish_art_collections%2C_Wikidata_and_Linked_Open_Data%2FWhitepaper&type=revision&diff=292846184&oldid=291706488
[21:12:11] spotted an issue when attempting to post a payload for a cleared entity: i got a wikibase-validator-sitelink-conflict for a newwiki link used by another entity, despite the link given in the error response NOT being in the payload for my entity
[21:20:23] Alphos: oooohhh
[21:20:37] more details? which entity? what json?
[21:20:41] what sitelink etc?
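For anyone following along, the call at issue looks roughly like this - a minimal sketch, not RollBot's actual code (that is linked further down). action=wbeditentity and its id, data, clear and token parameters are the real API module's; the helper name, the cookie-jar handling, and the absence of error handling are this sketch's own simplifications. It assumes an authenticated bot session and a valid CSRF token.

```php
<?php
// Minimal sketch. Assumes $token is a valid CSRF token (obtained via
// action=query&meta=tokens) and that login cookies live in cookies.txt.
// With clear=1 the entity is emptied before $json is applied - the kind
// of edit that produced the wikibase-validator-sitelink-conflict above.
function wbEditEntityClear(string $id, string $json, string $token): array {
    $ch = curl_init('https://www.wikidata.org/w/api.php');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_COOKIEFILE     => 'cookies.txt',
        CURLOPT_COOKIEJAR      => 'cookies.txt',
        CURLOPT_POSTFIELDS     => http_build_query([
            'action' => 'wbeditentity',
            'id'     => $id,    // e.g. 'Q23559'
            'clear'  => 1,      // wipe the entity, then apply the payload
            'data'   => $json,  // full entity JSON of the revision to restore
            'token'  => $token,
            'format' => 'json',
        ]),
    ]);
    $result = json_decode(curl_exec($ch), true);
    curl_close($ch);
    return $result; // a sitelink conflict comes back under $result['error']
}
```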
[21:20:43] sure :)
[21:21:19] was just making sure i wasn't wrong, before ftping to my server ^^
[21:21:30] :)
[21:23:19] there we go: http://alphos.fr/wikidata/rollboterrorlog
[21:23:32] posted payload is on top, error response at the bottom
[21:24:52] attempting to repost Q23559 to a former state, the api complains about [[newwiki:\u0924\u0903\u0939\u0924\u093e\u0903 \u0968]] already being used in [[Q362]], despite that sitelink not being in the payload i have for Q23559
[21:25:02] [1] https://www.wikidata.org/wiki/newwiki:%5Cu0924%5Cu0903%5Cu0939%5Cu0924%5Cu093e%5Cu0903_%5Cu0968 =>
[21:25:05] [2] https://www.wikidata.org/wiki/Q362
[21:26:05] i triple checked my code to make sure there wasn't some sort of crosspost in the error log, but i assure you there isn't https://github.com/alaefin/RollBot/blob/master/index.php#L375
[21:26:19] addshore --^ :-)
[21:27:01] *looks*
[21:27:19] i suspect this occurred for two other entities while attempting the same task, but silly me didn't think of appending to his error log, and overwrote instead XD
[21:27:43] (but overwriting means writing relevant data ONLY for the current attempted edit, that's what i triple checked)
[21:29:35] hmmm
[21:30:01] so the newwiki article name as far as I can tell is the main page for newwiki....
[21:30:09] it's https://new.wikipedia.org/wiki/%E0%A4%AE%E0%A5%82_%E0%A4%AA%E0%A5%8C :O
[21:30:22] fwiw, the other entities i attempted to revert to their former state were https://www.wikidata.org/wiki/Q2022368 and https://www.wikidata.org/wiki/Q12189183 - sadly i can't know why they were rejected
[21:30:34] addshore : seems legit :D
[21:31:54] so which revisions are you trying to restore them to?
[21:32:28] if i'm not mistaken, this one https://www.wikidata.org/w/index.php?title=Q23559&oldid=288866046
[21:33:15] but yeh, the edit won't work as the newwiki sitelink is used on another item - but then the newwiki sitelink seems wrong, as it looks like the item is about Benito Mussolini, not the main page...?
[21:33:17] namely, the revision immediately before the first edit by Mr.Ibrahembot after 2016-01-18T17:48:12Z
[21:33:41] so that revision is on which item, Q362? (I have all of them open now)
[21:34:04] no, that's Q23559. i never did anything with or to Q362
[21:34:31] "i" (my bot) never "knew" Q362 existed in the whole process
[21:34:47] except for the error message he got
[21:35:06] https://www.wikidata.org/w/index.php?title=Q23559&diff=293144380&oldid=288866046
[21:35:20] there is no change though between the revision currently there and the one you want!
[21:35:32] i know, so he made a null edit
[21:35:59] he attempted to revert edits by Mr.Ibrahembot, but they were already reverted by GZWDer
[21:36:11] (so, null edit)
[21:36:30] so where did you get the json you pasted in your request?
[21:36:41] specifically from Pyb's version
[21:37:03] https://www.wikidata.org/w/index.php?title=Q23559&oldid=288866046 that one
[21:37:30] okay, well all I can say is the json isn't quite for that version :P
[21:37:40] something somewhere has gone wrong! maybe with copying the JSON?
[21:38:00] argh
[21:38:45] hmmmm
[21:38:53] I think I'm also confusing myself a bit :D
[21:38:56] so your json has \\u092c\\u0947\\u0928\\u093f\\u091f\\u094b \\u092e\\u0941\\u0938\\u094b\\u0932\\u093f\\u0928\\u0940
[21:39:23] closing a crapton of browser tabs WILL HELP ! :D
[21:39:24] which is apparently on Q362 already
[21:39:51] Q362 has \u0924\u0903\u0939\u0924\u093e\u0903 \u0968
[21:40:14] which isn't the same as my json, is it ?
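A quick way to do by machine what is being eyeballed here: run both \uXXXX-escaped titles through json_decode() and compare the results. This is a throwaway snippet, not part of any tool in the log; the two literals are copied from the messages above.

```php
<?php
// The title the error blames on Q362, vs. the title actually in the posted
// payload, decoded from their \uXXXX escape forms (single quotes keep the
// backslashes literal, so json_decode() sees valid JSON string escapes).
$fromError   = json_decode('"\u0924\u0903\u0939\u0924\u093e\u0903 \u0968"');
$fromPayload = json_decode('"\u092c\u0947\u0928\u093f\u091f\u094b \u092e\u0941\u0938\u094b\u0932\u093f\u0928\u0940"');
var_dump($fromError === $fromPayload); // bool(false) - two different pages...
// ...which nonetheless conflict, because (as discovered just below) one of
// them redirects to the other on newwiki.
```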
[21:40:22] no :/
[21:40:27] phew
[21:40:27] could be a redirect though?
[21:40:31] ah
[21:40:35] *checks*
[21:41:42] THAT'S IT !
[21:41:46] oh wow
[21:42:14] but how did the constraint check not fail earlier when either link was posted ?
[21:42:30] could it be that the page was only made a redirect later on newwiki ?
[21:42:30] was that it? :D YAY!
[21:42:42] It might be that the sitelink was added at a time when the constraint check had a bug
[21:42:54] or yes, the redirect was made after the sitelinks were already there! :)
[21:43:30] https://new.wikipedia.org/w/index.php?title=%E0%A4%AC%E0%A5%87%E0%A4%A8%E0%A4%BF%E0%A4%9F%E0%A5%8B_%E0%A4%AE%E0%A5%81%E0%A4%B8%E0%A5%8B%E0%A4%B2%E0%A4%BF%E0%A4%A8%E0%A5%80&diff=819530&oldid=2003
[21:43:39] naughty naughty naughty !
[21:44:22] uh
[21:44:31] the first version of the redirect had a category
[21:44:52] the * of that category is, uh, odd ? https://new.wikipedia.org/wiki/%E0%A4%A6%E0%A4%AC%E0%A5%82:%E0%A4%9C%E0%A5%80%E0%A4%B5%E0%A4%A8%E0%A5%80
[21:45:19] "cutest penguin's" <== wat ? :D
[21:46:42] my clustereffs have clustereffs :D
[21:49:02] addshore out of curiosity, other than attempting to repost the other two entities that missed, is there a way to know which of their sitelinks is faulty (if that is the cause of the api error response) ?
[21:49:19] yeh, that's the cause of the error message
[21:50:07] oh and if you're writing php bots for wikidata you should totally look at https://github.com/addwiki/wikibase-api
[21:50:09] ;)
[21:50:15] and the other libraries in that org!
[21:50:51] the message for the error does say The link [https:\/\/new.wikipedia.org\/wiki\/%E0%A4%A4%E0%A4%83%E0%A4%B9%E0%A4%A4%E0%A4%BE%E0%A4%83_%E0%A5%A8 newwiki:\u0924\u0903\u0939\u0924\u093e\u0903 \u0968] is already used by item [[Q362|Q362]].
[21:50:52] [3] https://www.wikidata.org/wiki/Q362
[21:51:22] addshore i did, and discarded it ^^
[21:51:29] :D
[21:51:35] addshore i know for that entity
[21:51:47] but i don't know for the other two entities
[21:51:57] what error do you get for the other 2 entities
[21:51:57] ?
[21:52:00] https://www.wikidata.org/wiki/Q2022368 and https://www.wikidata.org/wiki/Q12189183
[21:52:08] sadly, i don't know XD
[21:52:35] my error logging was stupid, i overwrote the previous error, for some reason unbeknownst to me
[21:52:51] :D
[21:53:03] (that has been fixed :D )
[21:53:16] well, I guess if you get the current json and post it again with clear being true it might spit the error back out for you!
[21:53:26] true
[21:53:34] if not, it's a null edit anyway
[21:53:42] better get crackin' !
[21:54:09] :)
[21:55:03] also, better add that to rollbot's report :)
[21:55:15] ("that" being the error message)
[21:56:21] :D
[21:56:28] what does rollbot do then? :D
[21:56:51] rollback, but in a more surgical way
[21:57:19] rollback reverts all the latest edits by a given user
[21:57:26] so it may revert valid ones
[21:57:58] rollback won't revert if another user edited since, and that other user may not have reverted to the "good version"
[21:58:07] oooh, okay!
[21:58:35] so RollBot can revert to the "good version", even one by the "bad user" - the version that existed immediately before they started messing up on that page
[21:59:11] also important : it doesn't revert all pages to the same point in time ; it reverts each page to the version prior to the first bad edit by the "bad user"
[21:59:57] Indeed
[22:00:09] the mouseover of the rollback links in the UI says that ;)
[22:00:24] and it can aggressively revert (overwriting edits by other users) or just list the pages that were edited by other users as requiring a human check
[22:00:54] rollback is useful, but it's not entirely perfect :)
[22:01:36] and between quick statements and bots, reverting a user's edits is not really something that should be done by human hands ;)
[22:02:00] https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/RollBot :)
[22:02:18] * Alphos is fairly proud of his idea ^^
[22:03:00] * Alphos would like to thank harmonia_amanda and Ash_Crow for helping him fine-tune rollbot's way of handling things ^^
[22:03:14] :)
[22:03:40] * Alphos is in dire need of tea, and will repost on the two entities after brewing some
[22:04:09] =]
[22:04:14] mhhhm, what kind of tea Alphos ?
[22:04:30] http://www.mariagefreres.com/FR/2-chun-mee-zhejiang-the-vert-chine-T225.html that kind
[22:04:50] chinese green, grilled leaves
[22:05:02] (at least this time)
[22:11:53] yay, also in dire need of 5 minutes with the oxygen mask, i love my life -_-"
[22:12:01] addshore: I was reading the desysop stuff and I noticed https://github.com/miraheze/puppet/commit/8c14084c5a1b2e5a4014a503b765ea01a11aebc3
[22:12:43] interesting
[22:13:28] This reads like a nerdy soap opera
[22:17:58] addshore: So someone who is accused of getting one free wiki hosting farm compromised assigns himself all rights on another free wiki hosting farm? Wut?
[22:30:53] addshore : couldn't reproduce an error for https://www.wikidata.org/wiki/Q12189183
[22:30:59] aude: Around?
[22:31:10] i could, however, for https://www.wikidata.org/wiki/Q2022368
[22:31:20] What's the problem with duplicate returns?
[22:33:53] hold on, checking again, very weird discrepancy
[22:34:39] hoo: ?
[22:34:53] i'm not sure it's a problem, but maybe a matter of taste
[22:34:56] 'Duplicate "return true" in this method.'
[22:35:08] So you'd rather have several nested if blocks?
[22:35:19] but we want to cut things short in places
[22:35:31] so return true makes some sense as a way to do that
[22:35:32] imho
[22:35:47] yeah, I also like that style over deeply nested blocks
[22:36:11] We can also try to make functions smaller in these cases
[22:36:18] but I'll run out of names, eventually
[22:37:37] addshore : exactly the same issue for https://www.wikidata.org/wiki/Q2022368 , with :de:Teller-Ulam-Design redirecting to :de:Kernwaffentechnik#Teller-Ulam-Design , and the latter being used in https://www.wikidata.org/wiki/Q15221814
[22:41:41] aude: Will try to amend it in a way that makes Thiemo happy... I think we should get that done before branching next week
[22:42:41] can action=edit/wbeditentity error messages leak any kind of private information about the current account ?
[22:42:51] hoo: definitely
[22:50:10] Alphos: It shouldn't
[22:50:17] thanks :)
[22:50:25] Alphos: tbh it might make sense to create a script finding these things and listing them / removing one of the sitelinks!
[22:51:26] so finding which sitelinks are redirects to other sitelinks ?
[22:51:38] yeh, so find the conflicts that are and should not be!
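The script addshore suggests could be sketched along these lines, with heavy caveats: where the item ids come from is left open (a dump or the Labs DB replicas, as discussed right after this), siteApiUrl() is a hypothetical helper mapping a site id such as 'newwiki' to its api.php URL, and all error handling is omitted. The API features used are real: wbgetentities for sitelinks, and action=query with redirects=1 to resolve redirects on the client wiki.

```php
<?php
// Rough sketch: flag sitelinks that are redirects whose target page is
// sitelinked from a DIFFERENT item - "the conflicts that should not be".
function apiGet(string $endpoint, array $params): array {
    $params['format'] = 'json';
    return json_decode(file_get_contents($endpoint . '?' . http_build_query($params)), true);
}

function findRedirectConflicts(string $itemId): array {
    $entity = apiGet('https://www.wikidata.org/w/api.php', [
        'action' => 'wbgetentities', 'ids' => $itemId, 'props' => 'sitelinks',
    ])['entities'][$itemId];

    $conflicts = [];
    foreach ($entity['sitelinks'] ?? [] as $siteId => $link) {
        // siteApiUrl() is hypothetical: 'newwiki' -> 'https://new.wikipedia.org/w/api.php'
        $resolved = apiGet(siteApiUrl($siteId), [
            'action' => 'query', 'titles' => $link['title'], 'redirects' => 1,
        ]);
        foreach ($resolved['query']['redirects'] ?? [] as $redirect) {
            // The sitelink is a redirect; ask Wikidata which item holds its target.
            $holder = apiGet('https://www.wikidata.org/w/api.php', [
                'action' => 'wbgetentities', 'sites' => $siteId,
                'titles' => $redirect['to'], 'props' => '',
            ]);
            foreach (array_keys($holder['entities'] ?? []) as $otherId) {
                $otherId = (string) $otherId; // missing pages come back keyed "-1"
                if ($otherId !== $itemId && $otherId[0] === 'Q') {
                    $conflicts[] = [$siteId, $link['title'], $redirect['to'], $otherId];
                }
            }
        }
    }
    return $conflicts;
}
// e.g. findRedirectConflicts('Q2022368') should surface the
// de:Teller-Ulam-Design case mentioned above in the log.
```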
[22:52:20] how does wikidata check for those conflicts btw ?
[22:53:29] so it checks on edit, which is why you are running into issues
[22:53:31] (any kind of automation would rely on the same base process)
[22:53:31] aude: Try PS5
[22:53:41] addshore yes, but how does it do that ? ^^
[22:53:48] ahh *looks*
[22:54:12] hoo: ok
[22:54:23] Haven't tested it inside the build, yet
[22:54:33] only have it installed as a "normal" extension
[22:54:43] Alphos: https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/repo/includes/store/sql/SqlSiteLinkConflictLookup.php
[22:54:56] it does it in the db on a per-item basis
[22:55:12] addshore sadly i can't think of any other way of doing it
[22:55:21] the nosqlness of wikidata prevents joining ;)
[22:58:51] well, you'll need to do checks on the repo and the client
[22:58:56] might be easier to use the dumps!
[22:59:09] or you'd have to do something with the labs db replicas
[23:01:17] aude: hm... think you need to manually tamper with your composer.lock
[23:02:13] and/or installed.json
[23:02:40] After having done that, the build's class map looks fine to me
[23:05:42] hoo: i tried
[23:06:01] addshore : can i mention you in my bug review on the wiki (explaining what failed with the two entities) ?
[23:06:01] Did it work?
[23:06:03] no
[23:06:05] aude: hoo do RFCs still live on wiki?
[23:06:16] Not usually, but some still do
[23:06:17] Alphos: sure! (and give me a link)?
[23:06:21] (on the permission request, because of visibility during the perm request)
[23:06:27] aude: What error does it give you now?
[23:06:29] as "addshore" ?
[23:06:34] Alphos: ya
[23:06:47] addshore: in phabricator
[23:06:54] I tried it locally by using the master build, then checking out my patch, then changing the two files, then dump-autoload
[23:07:07] -o
[23:07:14] manual tinkering is not nice :/
[23:07:23] Indeed
[23:08:15] Probably composer is smart and caches stuff
[23:09:25] You would need to check out the patch via composer for things to work... but I don't think that the patch branches are visible to it
[23:10:48] addshore https://www.wikidata.org/w/index.php?title=Wikidata%3ARequests_for_permissions%2FBot%2FRollBot&type=revision&diff=293205934&oldid=293172761
[23:10:49] https://gerrit.wikimedia.org/r/#/c/264597/ works for me
[23:11:21] What exactly do you mean?
[23:11:42] now i get MWException from line 176 of /var/www/wiki/w/includes/Hooks.php: Invalid callback WikimediaBadges\BeforePageDisplayHookHandler::onBeforePageDisplay in hooks for BeforePageDisplay
[23:11:55] when i try to purge a page (on the repo, but also on the client)
[23:12:04] Override commons sidebar link with commons category OOOOOOH
[23:12:10] sounds like MediaWiki didn't pick it up
[23:12:13] that would solve the wbgetclaims mass callage thing
[23:12:20] thus, class unknown
[23:12:30] addshore: YES! :)
[23:12:35] WOO!
[23:12:53] addshore : as for a dump or labs to find hidden duped links through redirects, both are fine suggestions but i'd think labs would be better suited for the job as i'd imagine that should be done periodically
[23:14:06] hoo: that'll be awesome
[23:19:17] I have various things related to that in gerrit... reviews appreciated ;)
[23:21:45] :D
[23:21:51] I may be able to tomorrow!
[23:21:59] writing an rfc on watchlist stuff now
[23:31:28] i'll give a solid thought to that conflict report tool in the next few days ^^ i'd rather "finish" rollbot first, let it rest and work, THEN work on another tool :)
[23:56:55] Hi hoo! Hi all!
[23:57:19] At https://www.wikidata.org/wiki/Special:RecentChangesLinked/Wikidata:Database_reports/WMF_projects?uselang=en you can watch the translation progress of Wikidata titles and descriptions.
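A closing footnote on Alphos's earlier question about how the sitelink conflict check works: SqlSiteLinkConflictLookup (linked above) runs at edit time, in the repo's database, on a per-item basis. The sketch below paraphrases the idea rather than the extension's literal code; it assumes Wikibase's wb_items_per_site table (ips_item_id, ips_site_id, ips_site_page) and uses plain PDO instead of MediaWiki's database layer.

```php
<?php
// Paraphrased per-item check (assumption: the wb_items_per_site layout).
// For one sitelink of the edit being saved, look for any OTHER item already
// claiming the same (site, page) pair. Nothing here can join against a
// client wiki's redirect table - which is why a page that later became a
// redirect can quietly sit on two items until the next edit trips over it.
function findConflictingItem(PDO $db, int $numericItemId, string $siteId, string $pageTitle): ?int {
    $stmt = $db->prepare(
        'SELECT ips_item_id FROM wb_items_per_site
          WHERE ips_site_id = :site AND ips_site_page = :page
            AND ips_item_id != :item LIMIT 1'
    );
    $stmt->execute([':site' => $siteId, ':page' => $pageTitle, ':item' => $numericItemId]);
    $row = $stmt->fetchColumn();
    return $row === false ? null : (int) $row;
}
```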