[06:50:48] Lydia_WMDE: Good morning! This is the post-mortem of my cleanup; I wrote it just for you: https://etherpad.wikimedia.org/p/wikidata_cleanup_post_mortem
[06:51:01] abartov
[06:54:38] HakanIST: yes?
[06:55:09] hey, sorry, my son pressed the tab button and you're at the top of the list :)
[06:58:06] HakanIST: ok :)
[07:03:10] :D
[09:09:29] Amir1: yay :) thank you!
[09:09:51] :)
[09:10:01] tell me if you need anything else
[11:20:29] https://gist.github.com/filbertkm/0254b6fc417ae12bc1b1
[11:20:34] DanielK_WMDE_:
[11:21:47] and if https://gerrit.wikimedia.org/r/#/c/259198/1/maintenance/exportSites.php is not too evil
[11:22:02] that is how I made this
[11:35:48] benestar: foreachwiki extensions/WikidataBuildResources/extensions/Wikibase/lib/maintenance/populateSitesTable.php --load-from http://en${mediawiki::multiwiki::base_domain}${::port_fragment}/w/api.php
[11:49:08] Amir1: are you already deleting things with P31 statements, or only if it's something like a category?
[11:49:40] nikki: Only categories, check my talk page :)
[11:49:53] ah :)
[11:50:17] * nikki sees quite a few red items on https://www.wikidata.org/wiki/User:Pasleim/Items_for_deletion/Page_deleted :D
[11:50:38] I've been working off the bottom
[11:50:59] the bot is done now
[11:51:34] hm... why didn't it delete https://www.wikidata.org/wiki/Q9739638?
[11:51:52] I only see one sitelink in the history and nothing links to it
[11:52:55] I should check
[11:53:52] Amir1: I've made a list of items with no claims/sitelinks and a history of 2 revisions (bot addition and sitelink deletion)
[11:55:02] wish we hit zero
[11:56:04] HakanIST: Can you send me a list of those?
[11:56:17] sure, which format?
[11:56:41] nikki: It seems it wasn't in the original list, because I added it and the bot just deleted it
[11:57:13] oh, didn't think of that :)
[11:57:21] HakanIST: [[Q1234]] [[Q1245]]
[11:57:22] [1] https://www.wikidata.org/wiki/Q1234 =>
[11:57:24] thanks
[11:57:25] [2] https://www.wikidata.org/wiki/Q1245
[11:58:20] no newlines
[11:58:42] doesn't matter, I put it that way for better readability :)
[12:00:34] anyway, I'm really happy to see people working on making bots do things like this :D There's just far too much to keep up with manually, and it feels like a waste of time doing something by hand that a bot could do more efficiently
[12:03:25] nikki: if there is anything you think is manual labour that a human shouldn't have to do, just tell me :)
[12:03:39] ok :)
[12:07:11] I think it would be OK to delete things which only have P31 template/disambiguation page/list article, as well as categories
[12:08:09] nikki: Can you give me the Q-ids of those?
[12:08:59] Q4167410 is the disambiguation page, Q11266439 is the template, Q13406463 is the list article
[12:09:22] awesome
[14:27:31] benestar: is there a bug on Phabricator for the problem we discussed this morning?
[14:28:53] nikki: it's deleting them
[14:29:04] HakanIST: Done :)
[14:29:15] great :)
[15:03:01] I created a tracking bug for issues with Wikidata and Vagrant: https://phabricator.wikimedia.org/T129223
[15:03:10] physikerwelt: I'm not aware of one
[15:03:13] thanks for creating it
[15:10:15] benestar: however, Daniel's tip to use |from=Qxx helped me find a workaround for the task that was blocking me. Now I can generate Content MathML, see http://wikidata-drmf2016.wmflabs.org/wiki/Talk:Q2
[15:20:20] physikerwelt: quite cool :)
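
A minimal sketch of the deletion criterion agreed on above at 12:07-12:09: an item qualifies if it has no sitelinks and its only statements are P31 values in the named classes. This is not Amir1's actual bot code; it assumes the pywikibot library, and Q4167836 ("Wikimedia category") is an assumption, since the log only names the other three Q-ids.

# Hedged sketch, not Amir1's bot. Requires a configured pywikibot install.
import pywikibot

DELETABLE_CLASSES = {
    "Q4167836",   # Wikimedia category -- assumed, not named in the log
    "Q4167410",   # Wikimedia disambiguation page
    "Q11266439",  # Wikimedia template
    "Q13406463",  # Wikimedia list article
}

def is_deletable(item: pywikibot.ItemPage) -> bool:
    """True if the item has no sitelinks and every statement is a P31
    whose value is one of the classes above."""
    item.get()
    if item.sitelinks:                    # still linked from some wiki
        return False
    if set(item.claims) - {"P31"}:        # carries statements besides P31
        return False
    targets = {claim.getTarget().getID()
               for claim in item.claims.get("P31", [])
               if claim.getTarget()}      # skip novalue/somevalue claims
    return bool(targets) and targets <= DELETABLE_CLASSES

repo = pywikibot.Site("wikidata", "wikidata").data_repository()
print(is_deletable(pywikibot.ItemPage(repo, "Q9739638")))
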
[16:43:20] Jonas_WMDE: the config.js thing doesn't work for me :(
[16:43:30] oh no, why?
[16:43:32] I'm using something other than localhost
[16:43:50] you could add yours
[16:44:11] I can... it would be nice to have a config.local.js or such
[16:44:58] I'll hack it for now, but I want to think about how to make the config more flexible
[16:45:54] and the map doesn't work, so maybe I have to use localhost
[17:57:37] SMalyshev: re https://www.mediawiki.org/w/index.php?title=Wikibase/Indexing/SPARQL_Query_Examples&diff=next&oldid=2070798: currently Category:Wikidata_Query_Service is marked as historical and Category:WDQS has all the current pages. We should do it consistently; either way is fine.
[18:00:45] benestar: Lydia_WMDE https://github.com/wikimedia/ifttt
[18:17:25] Lydia_WMDE: DanielK_WMDE_: https://phabricator.wikimedia.org/T56085
[18:19:47] jzerebecki: ok
[19:49:28] DanielK_WMDE_: around?
[19:50:00] SMalyshev: about to wrap up, but yeah
[19:50:44] DanielK_WMDE_: just wanted to ask about https://phabricator.wikimedia.org/T128667
[19:51:01] DanielK_WMDE_: whether you know what the story is with regard to caching and the various URL forms
[19:51:32] and how we handle such things in general. If it's a long story then I can wait :)
[19:52:29] SMalyshev: the quick answer is: I don't know, ask the Varnish folks :)
[19:53:29] DanielK_WMDE_: I see. So inside the wiki/Wikibase code base there are no provisions for that
[19:53:34] I assume?
[19:53:48] because that should be a story for regular URLs too
[19:54:16] SMalyshev: yes, no special handling inside Wikibase
[19:54:29] from the inside, all requests look like the "ugly" form
[19:54:41] DanielK_WMDE_: yeah, but not to Varnish...
[19:54:48] the pretty form never reaches PHP; it's hidden by rewrites in the web server
[19:54:49] ok, I'll keep digging then
[19:55:03] yes, indeed, not for Varnish.
[19:55:25] unless the rewrite is happening in Varnish, which would be nice, but I don't think it's the case
[19:56:04] I didn't find any code that would suggest that. So I wonder how purges work...
[19:56:19] I'll try to locate whoever knows that :)
[19:57:42] SMalyshev: my (naive) understanding is that we don't cache anything that has a "?" in the URL. But you have convinced me otherwise, so I just don't know :)
[19:58:37] DanielK_WMDE_: hmm... that'd be pretty expensive, since many requests can produce ?title= URLs...
[20:33:29] benestar: https://phabricator.wikimedia.org/T113034
[20:34:56] benestar: https://phabricator.wikimedia.org/T78639
[22:40:22] n/c
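
For reference, a hedged sketch of the filter HakanIST describes at 11:53 (items with no claims or sitelinks whose entire history is two revisions). It is not HakanIST's actual tooling; it makes the same pywikibot assumption as the sketch above, and CANDIDATES is a hypothetical placeholder for wherever the candidate item IDs come from.

# Sketch only; CANDIDATES is a hypothetical input, not HakanIST's real source.
import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()
CANDIDATES = ["Q9739638"]  # placeholder list of item IDs to check

for qid in CANDIDATES:
    item = pywikibot.ItemPage(repo, qid)
    item.get()
    if item.claims or item.sitelinks:
        continue                          # not empty, so not a candidate
    # Fetch at most 3 revisions: exactly 2 matches the pattern
    # "bot addition" followed by "sitelink deletion".
    revisions = list(item.revisions(total=3))
    if len(revisions) == 2:
        print(qid, "has no claims/sitelinks and exactly 2 revisions")
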