[00:00:17] they are obsessed with complex data viz
[00:00:24] https://vega.github.io/vega-editor/?mode=vega
[00:00:26] see for yourself
[00:00:46] on a completely different note,
[01:50:49] * AlvaroVALEBASURA (~murismo@139.59.169.235) has joined
* 21:12, 14 July 2016: Vituzzu (meta.wikimedia.org) globally blocked 139.59.0.0/16 (expires on 14 July 2021 at 19:12) (leaky colo + open proxy at 139.59.2.13)
[00:01:33] i fiddled a lot with vega when word got out about it, maybe a year or so ago?
[00:02:13] !ops AlvaroVALEBASURA
[00:03:07] AlvaroMolina sorry if i HLed you, didn't think you'd HL on "Alvaro" ;)
[00:04:00] JD|cloud he left
[00:04:06] Alphos: still useful to ban
[00:04:13] it's gbanned
[00:04:14] he's already +q on his ident which is a good thing :)
[00:04:41] (he's the one who just FLOODED the chan repeatedly a few days back)
[00:04:57] but that doesn't mean he can't be annoying
[01:53:30] -NickServ- AlvaroVALEBASURA!~murismo@139.59.169.235 failed to login to Alphos. There have been 4 failed login attempts since your last successful login.
[00:20:40] Alphos_, there is a lot more we can do with vega. I plan to deploy vega-lite, but more importantly, to add the ability to store tabular data on wiki
[00:21:04] there is a big tutorial on how to do complex stuff with it - https://www.mediawiki.org/wiki/Extension:Graph/Interactive_Graph_Tutorial
[00:24:36] <3
[10:47:52] order food? Thiemo_WMDE DanielK_WMDE ?
[10:48:16] Open for suggestions.
[10:48:27] https://www.lieferheld.de/lieferservices-berlin/restaurant-thai-huong-snack/98/
[10:48:29] A bit late maybe.
[10:49:17] yes, will arrive about 13:45
[10:51:17] Hi! I would like to know if I can set the limit of results of my wbsearchentities above 500 or set no limit at all.
[10:51:34] I am running my own private Wikibase instance (so I am actually not on Wikidata)
[10:51:50] I am referring to this: https://www.wikidata.org/w/api.php?action=help&modules=wbsearchentities
[10:52:36] my pywikibot code to perform a search by label looks like this: https://dpaste.de/Ac92
[10:57:05] Till___: the limits are hardcoded in ApiBase. Why would you want to change them? pywikibot supports paging, right?
[10:57:46] I am pretty new to programming, what is paging?
[10:57:52] Ah, not sure wbsearchentities supports paging. it's generally not easy (and not very useful) for ranked results. paging needs a unique key to sort by...
[10:58:30] Till___: many api requests will tell you how to get the next page (next 50, next 500) of results. pywikibot should handle this automatically, i think
[10:59:22] Till___: for example, https://www.wikidata.org/w/api.php?action=query&list=allpages&apfrom=B
[11:00:23] so if i can't change the limit of 500 results, is there a way to get all item pages with pywikibot?
[11:01:11] Till___: See the "continue" part? it tells you to set &apcontinue=Q10000004&continue=-|| to get the next "page"
[11:01:15] well I am using continuation in my code, but the limit of results is still 500. The code i use for searching items by label is this one:
[11:01:21] https://dpaste.de/Ac92
[11:01:54] Till___: 500 should be the limit per continuation page. the total should be unlimited.
[11:02:07] don't know how pywikibot does this, i haven't ever looked at the code
[11:02:22] listing all item pages via the api is actually not as easy as it should be :(
[11:02:47] wbsearchentities isn't a good choice though. wbgetentities would be better, but it doesn't support listing. you have to know the ids.
[11:03:23] that's the problem, I don't know the IDs. Is there a way to get all the IDs of my wikibase instance
[11:06:49] I mean, can pywikibot get all the IDs of all item pages on wikibase?
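[Editor's note] The continuation mechanism described above can be sketched as a small loop. The `apcontinue`/`continue` parameter names come straight from the log; the `fetch` callable is a stand-in you would back with a real HTTP request to `api.php`:

```python
def fetch_all_pages(fetch, params):
    """Follow MediaWiki API 'continue' blocks until the result set is
    exhausted. `fetch(params)` must return one parsed JSON response,
    e.g. for action=query&list=allpages."""
    results = []
    params = dict(params)  # don't mutate the caller's dict
    while True:
        data = fetch(params)
        results.extend(data["query"]["allpages"])
        if "continue" not in data:
            return results
        # e.g. {"apcontinue": "Q10000004", "continue": "-||"}
        params.update(data["continue"])
```

Each request still returns at most one page of rows (the hardcoded ApiBase limit), but the loop keeps requesting until the API stops sending a `continue` block, so the total is unbounded.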
[12:13:43] Jonas_WMDE: about primary sources, sometimes when you open an item it opens in edit mode. I will give you a screenshot if I see it again.
[12:18:04] sjoerddebruin open an item?
[12:18:15] Yeah, by clicking on a link for example.
[12:18:40] Happens more if you right-click into a new tab, apparently.
[12:22:04] Will take a screenshot if I see it again
[12:24:22] thx
[12:25:59] atm it's just like a squeaking door that doesn't squeak when you want to let other people hear it
[12:43:03] that's quite typical sjoerddebruin
[12:52:09] How can I indicate on a person item that they found him or her drowned?
[13:15:34] SMalyshev: the problems seem to be much worse, I have a SPARQL report of missing death dates and one item had one since feb this year.
[13:15:41] sjoerddebruin : P509 ?
[13:15:41] P509 (An Untitled Masterwork) - https://phabricator.wikimedia.org/P509
[13:16:02] Alphos: yeah, but I also want to add the date they found the body.
[13:17:09] P189 as a property ?
[13:17:25] uh, as a qualifier, obviously
[13:17:34] hm, that's for location
[13:17:55] uh, P575
[13:17:55] P575 Starting elasticsearch with log level debug - https://phabricator.wikimedia.org/P575
[13:18:03] (still as a qualifier)
[13:18:08] Hm, that would make sense to me
[13:18:47] https://www.wikidata.org/w/index.php?title=Q18822039&type=revision&diff=365490261&oldid=365482053
[13:19:00] so P509:Q506616 { P575: }
[13:19:00] P509 (An Untitled Masterwork) - https://phabricator.wikimedia.org/P509
[13:19:00] P575 Starting elasticsearch with log level debug - https://phabricator.wikimedia.org/P575
[13:19:12] yup :)
[13:19:25] Ugh, how useful is stashbot here Lydia_WMDE?
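[Editor's note] The modelling the channel settles on above (cause of death P509 → drowning, with a P575 time qualifier for the date the body was found) can be sketched as a statement builder following the Wikibase JSON datamodel. The property and item IDs are from the log; the helper itself is illustrative, and some optional time fields (timezone, before, after) are omitted for brevity:

```python
def drowned_statement(cause_qid="Q506616", year=2016, month=1, day=1):
    """Build statement JSON as discussed above: P509 (cause of death)
    qualified with P575 (time) for when the body was found."""
    return {
        "mainsnak": {
            "snaktype": "value",
            "property": "P509",
            "datavalue": {
                "type": "wikibase-entityid",
                "value": {"entity-type": "item",
                          "numeric-id": int(cause_qid[1:])},
            },
        },
        "type": "statement",
        "rank": "normal",
        "qualifiers": {
            "P575": [{
                "snaktype": "value",
                "property": "P575",
                "datavalue": {
                    "type": "time",
                    "value": {
                        "time": "+%04d-%02d-%02dT00:00:00Z" % (year, month, day),
                        "precision": 11,  # day precision
                        "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
                    },
                },
            }],
        },
    }
```

A tool like pywikibot wraps this shape for you (a `Claim` with an added qualifier); the raw JSON is shown here only to make the P509/P575 nesting explicit.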
[13:19:44] not, and i noticed it a few weeks ago ^^'
[13:19:53] hmmm yeah
[13:19:59] not sure who put it here and why
[13:20:55] Not sure if it auto-joins, could kick it if you want Lydia_WMDE
[13:21:50] don't mind either way really
[13:21:54] as you wish
[13:30:08] Is there a way with a bot to change a property without changing its values?
[13:31:35] Xaris333: how do you mean?
[13:32:11] I want to change {{P|710}} to {{P|1923}} in many articles.
[13:32:23] In articles of a certain category.
[13:32:58] You could just ask on https://www.wikidata.org/wiki/Wikidata:Bot_requests
[13:33:05] Oh, wait. :P
[13:33:26] Be a little bit patient. ;)
[13:33:43] Thanks!
[16:17:26] hey yurik, those pie charts are pretty cool!
[16:17:36] multichill, thx :)
[16:18:20] multichill, you will need to wrap it into a template with sparql url escaping
[16:19:07] this way you can write {{pie chart | title=blah blah | query=your sparql query }}
[16:19:29] and that sparql query could have new lines
[16:21:35] hmmmmm, pie...
[16:24:05] sjoerddebruin, https://www.wikidata.org/wiki/User_talk:Multichill#as_promised.2C_a_pie_chart_%3A.29
[16:24:12] Yeah, I saw them <3
[16:24:59] multichill, to answer your question - yes, you can use that data for any kind of graph (but not the {{graph:chart}} template - it doesn't understand external URLs)
[16:25:25] it should be fairly easy to create any kind of specific graph you need
[16:25:34] especially if you learn Vega :)
[16:25:53] (painful at first, but then you get the hang of it - use the special page - graph sandbox)
[16:26:32] yurik multichill soo cool :D
[16:26:42] do you think this could replace https://www.wikidata.org/wiki/Wikidata:Statistics/Wikipedia?
[16:26:49] regenerating those is a huge pain in the...
[16:29:25] sjoerddebruin, absolutely
[16:29:34] it seems that those graphs use sparql, right?
[16:29:44] I thought they were generated with WDQ.
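[Editor's note] The template wrapping discussed above needs the SPARQL query percent-encoded, so that pipes, equals signs, and newlines survive as a single wikitext template parameter and can later be embedded in a graph's data URL. A sketch of that escaping (the {{pie chart}} template itself is still hypothetical at this point in the log):

```python
from urllib.parse import quote

def escape_sparql_for_template(query):
    """Percent-encode a (possibly multi-line) SPARQL query for use as
    one wikitext template parameter. safe="" also encodes '/', '|',
    '=', '{' and newlines, all of which would otherwise break
    template parsing."""
    return quote(query, safe="")
```

For example, `escape_sparql_for_template("SELECT ?x\nWHERE { ?x ?p ?o }")` contains no `|`, `=`, `{`, or newline characters, only `%xx` escapes.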
[16:30:05] see https://www.wikidata.org/wiki/Module:Statistical_data/by_project/classes
[16:31:09] sjoerddebruin, seems like those are some Lua magic graphs, not the extension
[16:31:38] Yeah, so I was wondering if those could be replaced. Or is there no benefit?
[16:32:09] we could remake them with the Graph extension. The problem is with the legend -- it will stop being text, and will become part of the graph
[16:32:13] (image)
[16:32:33] but i see no problem with making them as a graph
[16:34:42] sjoerddebruin, it is actually fairly simple - copy the graph that i made to wikidata's Special:GraphSandbox, and change the "text" mark to be placed under the graph instead of around the graph
[16:35:36] you might need to do some work to figure out the "y" position
[16:41:11] I'm just too busy for it, 3876 people on nlwiki needing a birth date. ;)
[18:37:18] https://www.wikidata.org/w/index.php?title=Q23893984&action=history ...
[19:10:30] http://polestar.wmflabs.org/ uses google analytics… seriously? *grmbl*
[19:12:32] ugh
[19:15:48] hoo: it's a copy of the original polestar that does that
[19:16:19] I can remove it but given nobody should be going there I didn't bother. If it's a problem I'll remove it
[19:17:07] we just had other problems to solve before, like "how do I fix this cryptic npm dependency problem", so I didn't pay attention to that :)
[19:35:00] aaaah .... found a bug :(
[19:39:22] aude: Don't you eat those for breakfast? ;-)
[19:40:36] lol
[19:45:23] aude: Do you know who built the "list of headers (BETA)" thing?
[19:45:50] multichill: Jitrixis
[19:46:17] the user interaction is not that good. I removed all sorts of stuff and clicked close and nothing was saved
[19:46:22] And changes gone
[19:46:33] https://phabricator.wikimedia.org/T106677 ;)
[20:15:39] aude: hi! do you have time to talk about T142491 and https://gerrit.wikimedia.org/r/#/c/303838 ?
[20:18:45] SMalyshev: ok
[20:18:49] btw https://phabricator.wikimedia.org/T143251
[20:19:18] hmm interesting
[20:19:21] I'll check it out
[20:19:41] that happens with files
[20:19:51] https://github.com/wikimedia/mediawiki/blob/120e275384a65e9e255b1aebdf5c9d4bddd15d8e/includes/content/WikitextContentHandler.php#L157
[20:19:54] so on commons
[20:20:05] yeah I see
[20:20:23] not sure why just now (we deployed, yeah, but this code is not new this week)
[20:20:36] may be some special kind of file
[20:20:46] maybe, though it's quite a bit in the logs
[20:22:36] aude: so about this one: https://gerrit.wikimedia.org/r/#/c/303838/19/includes/search/Field/SearchIndexFieldFactory.php I'm not really sure what's the purpose of this class
[20:24:40] ContentHandler is big enough
[20:24:55] some of the code should be split
[20:25:32] and is more reusable, though we're now defining these in the base class
[20:26:15] aude: but the whole point is we don't need to reuse it
[20:26:29] there's only one instance of category in the map
[20:26:40] so there's no point in reusing it
[20:27:17] same for template field
[20:27:45] newKeywordField looks like a thin wrapper over an existing method, not adding much
[20:28:24] also more easily testable
[20:28:29] so I'm not sure I see how this class will be used in more than one place...
[20:28:44] aude: not sure I see how. What does it add in testability?
[20:29:00] each field definition can be tested
[20:30:26] aude: it can be tested without this class too, how does this class help exactly?
[20:44:43] SMalyshev: maybe if we split off more of the field definition code, then it could go in the factory
[20:45:04] for now, we could just put it in ContentHandler if that is what you prefer
[20:45:14] * aude not happy with that but ok
[20:45:31] aude: but the field definition code is just calling the engine's factory function...
there's no real code there
[20:45:36] at least right now
[20:45:51] so yes, I'd just put it in ContentHandler if that's where you think it should be
[20:46:04] ContentHandler is over 1000 lines
[20:46:27] aude: this would not add more than 2-3 lines - each field is 1 or 2 lines
[20:46:36] maybe at some point, ContentHandler can just depend on the factory and not the engine
[20:46:39] directly
[20:46:51] we could of course take *all* fields into a separate class, but I'm not sure it's warranted now
[20:47:04] when we add title, etc.
[20:47:15] there's more to add, but we can do this later
[20:47:31] you need to pass the engine one way or another. Of course you could wrap the engine with a factory, but right now the engine *is* the factory
[20:47:32] text, source_text ...
[20:47:41] yeah :(
[20:48:12] adding text would be tricky, as it probably needs its own class
[20:48:19] same for source_text
[20:48:21] maybe
[20:48:32] they have a ton of special Elastic hacks
[20:48:42] others may be simpler
[20:49:05] this is where a factory or something could help
[20:49:21] have a dumb implementation (for db-based search)
[20:49:30] and cirrus have an implementation for elastic
[20:49:58] well, right now the factory is this: https://github.com/wikimedia/mediawiki-extensions-CirrusSearch/blob/master/includes/CirrusSearch.php#L762
[20:50:14] I don't think it rises to the level of needing its own class yet
[20:50:24] but maybe if it gets more complex we can split it out
[20:50:59] DB search now ignores fields completely AFAIK
[20:51:22] * aude prefers smaller classes
[20:51:22] doesn't even use the mapping. but in the future if we want to we could
[20:51:31] so to split things, but whatever...
[20:52:47] so, $file->getHandler returns false because it can't find a media handler
[20:52:57] what are we supposed to do in this case? return ""?
[20:52:57] or it returns the wrong type
[20:53:33] ah, we can just return null
[20:55:41] could check instanceof File or something
[20:55:52] but I still think something else changed and went wrong
[20:56:00] and that's the wrong way to handle this
[20:56:49] check where?
[20:59:02] in https://github.com/wikimedia/mediawiki/blob/master/includes/content/WikitextContentHandler.php#L156
[20:59:36] but it must be trying to call $file->exists()
[20:59:43] so it's just not getting the handler
[21:00:41] "Registered MediaHandler for file's MIME type
[21:00:41] * or false if none found"
[21:00:51] makes more sense
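[Editor's note] The fix the discussion converges on (bail out when no MediaHandler is registered for the file's MIME type, since `File::getHandler` documents "or false if none found", instead of calling methods on that false value) looks roughly like this. The real code is PHP in WikitextContentHandler; this is a language-neutral Python sketch with stand-in method names:

```python
def get_file_text(file):
    """Guard discussed above: the handler lookup returns a false-y
    value when no MediaHandler matches the file's MIME type, so
    return None (null) rather than calling through it."""
    handler = file.get_handler() if file is not None else None
    if not handler:  # covers both "no file" and "no registered handler"
        return None
    return handler.get_entire_text(file)
```

The `instanceof File` check floated in the log would be an additional, stricter guard on the argument itself; the essential change is simply not trusting `get_handler()` to always return a handler.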