[01:38:08] any admins alive? pls see https://phabricator.wikimedia.org/T110819 - wikidata Q404 has caused pywikibot automated tests to fail
[01:38:24] bringing development to a stop :P
[01:39:28] wtf
[01:39:31] why was it recreated?
[01:40:53] and why mathematics?
[01:41:18] I wouldn't be annoyed if the new joke was better than the last joke, but... I can't figure it out
[01:42:34] I think that was its old meaning before it got deleted
[01:42:49] 404 doesn't have magical mathematical properties
[07:02:31] Hello.
[07:02:46] I see Wikidata edits in my watchlist in the Hebrew Wikipedia,
[07:03:22] but not in the English Wikipedia, even though I have "Show Wikidata edits in your watchlist" enabled in the preferences.
[07:03:30] I don't see them in the Czech Wikipedia either.
[07:03:41] Does anybody have an idea why this could be?
[07:09:28] aude: Amir1 jayvdb jzerebecki Lydia_WMDE ^
[07:10:23] Using Wikibase - can one get an EntityObject for a different language, i.e. working in English but wanting to access the French object found in Wikidata... hope I worded this correctly - thank you
[08:07:31] aharoni: no idea, sorry
[08:11:13] I never see Wikidata edits on my watchlists; maybe another preference disables it
[10:07:34] sjoerddebruin: DuplicateReferences is a gadget now :D
[10:07:34] Good job benestar
[10:07:34] thx
[10:07:34] Some day we might have something like "subitems" for references. Then we'll have a fun bot job to merge all the duplicated references :P
[10:12:07] multichill: hmm, but sometimes the reference is the same, only a different page number
[12:54:02] sjoerddebruin: What is https://www.wikidata.org/wiki/User_talk:Sjoerddebruin#Freedom_of_expression all about?
[12:54:11] multichill: topic in project chat about guns
[12:54:11] benestar: great
[13:47:59] Does someone know what the expiration time is for a csrftoken (https://www.wikidata.org/w/api.php?action=query&meta=tokens)?
[14:09:49] I wonder if I could actually finish this undeletion task today......
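(On the csrftoken question above: MediaWiki CSRF tokens are tied to the login session rather than a fixed timer, so robust clients don't track an expiry time at all - they use the token and refetch it when the API answers with a `badtoken` error. A minimal sketch of that retry pattern; `post` and `get_token` are stand-in callables you would implement with your HTTP client against `action=query&meta=tokens` and the write action, not part of any library.)

```python
def post_with_token(post, get_token, params, retries=1):
    """Run an API write action, refetching the CSRF token on 'badtoken'.

    `post(params)` performs the request and returns the parsed JSON;
    `get_token()` fetches a fresh token (action=query&meta=tokens&type=csrf).
    Tokens are bound to the session, not a fixed TTL, so retrying on a
    'badtoken' error beats trying to predict when the token expires.
    """
    result = {}
    for _ in range(retries + 1):
        params = dict(params, token=get_token())
        result = post(params)
        if result.get("error", {}).get("code") != "badtoken":
            break
    return result
```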
[14:09:53] *bets on no.....*
[14:10:45] dispatch actually seems to be staying down nicely....
[14:12:56] We have 19,073,753 pages now. :)
[14:13:57] sjoerddebruin: https://www.wikidata.org/wiki/Special:Log/Addshore
[14:14:13] Yup, great.
[14:15:08] also generating a list of IDs that should be redirects but we can't restore them ;)
[14:15:08] which I wasn't doing before
[14:28:24] aude: is there a graph of dispatch lag anywhere?
[14:28:29] hidden on ganglia perhaps?
[14:44:04] dispatching seems to be running better than before so far....
[15:30:45] PROBLEM - wikidata.org dispatch lag is higher than 300s on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1424 bytes in 0.145 second response time
[15:30:53] :P
[15:31:43] yeah.... I'm stopping now :P
[15:32:08] been watching it... it doesn't seem to be keeping up as much any more, I'm going at a third of the speed...
[15:33:15] it's coming down now :)
[15:42:34] *twiddles thumbs*
[15:54:46] RECOVERY - wikidata.org dispatch lag is higher than 300s on wikidata is OK: HTTP OK: HTTP/1.1 200 OK - 1430 bytes in 0.221 second response time
[15:59:35] massive recovery :D *speeds up again*
[16:04:56] addshore: Maybe you can also purge all the items that have missing page_props. That's quite a few items still
[16:08:12] multichill: hey, importing the information and adding backlinks is done now :)
[16:08:23] Thanks Amir1!
[16:08:43] :)
[16:09:25] adding tons and tons of statements
[16:09:25] https://www.wikidata.org/wiki/Special:Contributions/Dexbot
[16:16:39] addshore: running an extra dispatcher :/
[16:16:53] *waves*
[16:16:58] can keep running it for a while
[16:17:14] that would be epic! :P I reckon I might be able to finish this today
[16:17:19] ok
[16:17:37] It was very odd, the lag was staying down for around an hour, then suddenly shot up....
[16:17:44] yeah
[16:17:53] I was going at the same rate however ;)
[16:18:00] something is broken
[16:18:18] the way the thing works is broken imho, though better with the backports we did last week
[16:19:29] * aude goes to look at why we have unformatted stuff like "wbsetclaim-update:2||1: Property:P213: Q850" in client recent changes + watchlist
[16:21:05] probably missing translations
[16:21:18] It always showed "item was updated" or something like that
[16:22:03] aude: on production?
[16:22:08] addshore: yeah :(
[16:22:55] that's odd...
[16:24:45] addshore: you know, you could check the lag every minute and stop if it is greater than 1 min, thus never triggering the alert...
[16:26:59] perhaps ;)
[16:27:24] that would require us writing a bit more code though ;) and doing more requests ;)
[16:56:48] multichill: https://www.wikidata.org/w/index.php?title=Q1748924&diff=prev&oldid=246585731
[16:57:23] Ah, you started Amir1, nice :-)
[16:57:59] Amir1: Decided I felt like doing a batch upload, see https://commons.wikimedia.org/wiki/File:Isles_of_Shoals_-_14.115_-_Minneapolis_Institute_of_Arts.jpg
[16:58:54] The last batch upload I did was Van Gogh, but this one is just too easy not to do
[16:59:06] this is so cool
[16:59:33] Anything related to Wikidata needed?
[16:59:58] No, it creates both links (Wikidata -> image, Commons -> Wikidata) on upload
[17:01:45] great
[17:05:56] I only do this for selected collections because it takes so long to upload all the images
[17:11:21] understandable
[17:15:38] afk for a bit, will be back shortly to check I'm not killing dispatch ;)
[17:23:50] back, and woah!
[17:30:53] aude: it's moments like this: the lag went up to 5 mins, I totally stopped editing, and now it has carried on heading up to 10 :P
[17:30:58] * jaydog slaps Amir1 around a bit with a large fishbot
[17:31:49] 11 mins now...
[17:32:21] well, stalest is 11 mins, avg is 6
[17:35:53] ahh, but Jura1 is making many fast label edits
[18:58:04] Lydia_WMDE: hey, still there?
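(The "check the lag every minute and stop if it is greater than 1 min" suggestion can be sketched as a small throttle loop. How you obtain the lag number is deployment-specific - e.g. scraping Special:DispatchStats or using the API's maxlag mechanism - so `get_lag` is an injected stand-in here, not a real API; a minimal sketch under that assumption.)

```python
import time

def wait_for_lag(get_lag, threshold=60.0, poll=60.0, sleep=time.sleep):
    """Block until the reported lag drops below `threshold` seconds.

    `get_lag()` returns the current lag in seconds. The loop polls
    once per `poll` seconds while the lag is at or above the
    threshold - i.e. the bot pauses before it can trip the alert,
    instead of editing flat-out until monitoring pages someone.
    """
    lag = get_lag()
    while lag >= threshold:
        sleep(poll)
        lag = get_lag()
    return lag
```

A bot would call `wait_for_lag(...)` between batches of edits, so a lag spike pauses the run instead of feeding it.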
[18:58:23] I wonder if the special page to query badges should be on wikidata.org or on the client wikis
[18:59:00] benestar: imho we need both
[18:59:08] but client is more important
[18:59:09] k
[18:59:13] makes sense
[18:59:24] I got quite far already :o
[18:59:43] \o/
[19:00:33] I was afk
[19:00:38] what's happening?
[19:00:59] why do people want to slap me :(
[19:05:28] Am I the cause of the dispatch lag?
[19:06:21] Lydia_WMDE: do you think we need to cache the badge lookup?
[19:06:48] * benestar wonders if we need that to display badges in the sidebar
[19:07:04] benestar: not sure tbh
[19:07:15] * Lydia_WMDE goes and prepares some dinner
[19:16:49] it's a hoo :)
[19:16:56] hey aude o/
[19:19:27] hoo: hi :)
[19:19:35] Lydia_WMDE: what should the special page on client be called?
[19:19:38] Special:Badges
[19:19:43] Special:QueryBadges
[19:19:53] Special:Pages with badges
[19:20:10] Let the bikeshedding begin!
[19:20:11] Working on badges, cool
[19:21:04] benestar: please just make it in a way that we can also query it with the API
[19:21:18] also, with Lua, we want site links with badges
[19:21:31] aude: yes, I think I will adjust SiteLinkLookup to include badges
[19:21:34] * hoo mumbles something about QueryPage
[19:21:38] yay
[19:22:19] gnah, I guess I have to use it, right?
[19:22:36] so I have to hardcode SQL queries in SpecialPage code :S
[19:24:45] benestar: It sucks... but I guess you'll end up using it
[19:24:56] or you can build your own interactor
[19:25:09] but that might end up as reinventing the wheel, and thus not be much nicer
[19:25:16] or not nicer at all
[20:21:48] Amir1: Do you still have that creator importer bot?
[20:22:08] I think so
[20:22:22] all of my code is in labs
[20:23:31] Amir1: Do you think you can import all the creator templates in https://commons.wikimedia.org/wiki/Category:Creator_templates_without_Wikidata_link that have "paint" (case insensitive) in the text?
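(On "site links with badges": in the Wikibase entity JSON returned by `action=wbgetentities&props=sitelinks`, each sitelink already carries a `badges` array of badge item IDs, so an API consumer can get them today without the special page. A minimal sketch of pulling them out; the sample data in the test is invented for illustration, only the JSON shape is from the API.)

```python
def sitelinks_with_badges(entity: dict) -> dict:
    """Map site id -> (title, badges) for sitelinks that carry badges.

    `entity` is the JSON for one entity from wbgetentities; each
    sitelink looks like {"site": ..., "title": ..., "badges": [...]},
    where badges are item ids such as "Q17437796".
    """
    return {
        site: (link["title"], link["badges"])
        for site, link in entity.get("sitelinks", {}).items()
        if link.get("badges")
    }
```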
[20:24:55] yeah
[20:25:26] I'm a little bit busy though
[20:25:37] maybe early September
[20:25:41] 1-7
[20:25:48] Someday when you have time. That's in 2 days so fine with me
[20:26:12] awesome
[20:29:14] Amir1: https://www.wikidata.org/wiki/Wikidata_talk:WikiProject_sum_of_all_paintings#Multichill.27s_tool
[20:30:23] :)
[20:30:30] sure thing
[20:31:29] (I skipped last night's sleep, I'm so grumpy now, if I stay awake more, I'll start biting people)
[20:31:41] !
[20:56:17] aude: is there any way in the MediaWiki database to get the full page title (including namespace prefix)?
[20:56:42] I want to join the badges table against it
[21:05:04] benestar: where, why?
[21:10:05] benestar: no
[21:11:25] aude: finally found a query, after 3 joins
[21:11:33] oooh
[21:11:34] I want to list all pages with a badge
[21:11:55] aude: http://pastebin.com/M97i2zX7
[21:12:04] any idea how we can improve that situation? :S
[21:12:41] I think I should just not use QueryPage...
[21:13:41] it would be nice if QueryPage had the option to order by something else
[21:13:46] (e.g. item ID)
[21:14:25] I think QueryPage has advantages (e.g. caching, API support) that are nice, but it would be nice to reuse just the parts we want
[21:16:11] aude: but if the query requires 3 joins instead of zero, I'm not sure if we should choose that for caching...
[21:17:39] I mean, if we don't do any joins, why should we cache anything?
[21:17:58] I don't think that is needed for a simple db query with one WHERE clause
[21:21:36] benestar: I'd have to see the patch (and the database table schema)...
[21:21:43] and not be so tired :)
[21:21:58] will submit my patches tomorrow so you can have a look
[21:22:01] ok
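(Background on aude's "no" above: MediaWiki's `page` table stores the namespace as a number (`page_namespace`) and the title without any prefix and with underscores (`page_title`); the namespace *names* live in site configuration, not in the database, so a prefixed title can't come straight out of SQL - it has to be assembled in application code. A minimal sketch of that assembly, with the namespace-name map supplied by the caller; the example mapping in the test assumes Wikidata's `Property` namespace is id 120.)

```python
def prefixed_title(ns_id: int, db_title: str, ns_names: dict) -> str:
    """Build a full, human-readable page title from page table fields.

    `ns_id` is page_namespace, `db_title` is page_title (underscored,
    unprefixed), and `ns_names` maps namespace id -> local namespace
    name, with "" for the main namespace.
    """
    name = ns_names[ns_id]
    title = db_title.replace("_", " ")
    return f"{name}:{title}" if name else title
```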