[00:00:19] ?place wdt:P84 ?architect for instance
[00:00:20] P84 andre queries - https://phabricator.wikimedia.org/P84
[00:00:25] grrr
[00:10:17] saranix for instance, http://tinyurl.com/hbtdkg9 with architectural style near Notre-Dame de Paris
[00:21:09] thanks!
[00:27:15] saranix you're very welcome
[05:11:15] HI
[05:11:35] please help in #wikipedia-es, unjustly banned
[05:11:40] please
[05:12:59] !admin
[05:13:06] please help
[05:14:14] PLEASE UNBAN #WIKIPEDIA-ES PLEASE !!!!!!
[05:14:16] !!!
[05:14:33] !ops marielacontador it is a troll
[05:15:20] !!!!!
[05:15:30] PLEASE UNBAN IN #WIKIPEDIA-ES
[05:15:35] PLEASE !!!!!!
[05:16:17] PLEASE
[05:16:20] !admin
[05:16:26] !admin PLEASE HELP
[05:16:32] !admin
[05:16:46] !admin
[05:17:06] !admin PLEASE UNBAN #WIKIPEDIA-ES
[05:32:48] too late
[05:44:09] SHIT.. FUCK IT IS WIKIPEDIA.. ROT, ALL OF YOU IN #WIKIPEDIA-ES, SONS OF BITCHES (same message flooded repeatedly until 05:44:57)
[06:30:42] why is that irc chan so flooded
[06:30:51] is there a webchat or something ?
[06:45:34] rom1504: this channel is not very active… we have been dealing with a troll flooding multiple channels.
[07:43:10] ok
[07:48:40] Strange. The IRC network blocked me from entering #wikidata. Now I registered my nick and it works.
[07:50:38] Thiemo_WMDE: the channel was briefly restricted to registered users because of a spammer who changes IP a lot.
[07:51:02] This explains it. Thanks!
[07:51:13] Have you seen my message yesterday though?
[07:51:39] I believe so, about the date gadget? Answer: Feel free to release it as a gadget.
[07:52:14] Okay, will take a look soon.
[07:52:24] Only seen a handful of people testing it though; hopefully that is enough.
[07:58:28] We are here to fix this fast when it breaks. ;-)
[09:42:33] Jonas_WMDE: Do you know who is around for a daily?
[09:43:43] Not sure where DanielK_WMDE is
[09:44:30] I would love to go for lunch now. Quick daily: fixed the font size issue on Monobook (up for review). Merged the Quantity stuff. I will do the release after lunch. Then I will continue with reviewing all the stuff, focusing on Adrian's.
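The architect/architectural-style query sketched at the top of the log (the tinyurl is opaque) might have looked roughly like the following. This is a hedged reconstruction, not the original query: the use of P84 (architect), P149 (architectural style), P625 (coordinate location), Q2981 (Notre-Dame de Paris), and the 1 km radius are all assumptions.

```sparql
# Hypothetical sketch: places near Notre-Dame de Paris (Q2981) that have
# both an architect (P84) and an architectural style (P149).
SELECT ?place ?placeLabel ?architectLabel ?styleLabel WHERE {
  wd:Q2981 wdt:P625 ?ndLoc .              # coordinates of Notre-Dame
  SERVICE wikibase:around {               # WDQS geo search service
    ?place wdt:P625 ?loc .
    bd:serviceParam wikibase:center ?ndLoc ;
                    wikibase:radius "1" . # radius in km (assumed)
  }
  ?place wdt:P84 ?architect ;
         wdt:P149 ?style .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
```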
[09:44:42] Hurrayyy, I got my student pack on GitHub; anyone in need of a student pack can apply, it's real :)
[09:45:00] https://education.github.com/pack
[10:32:31] DanielK_WMDE: Around?
[13:25:49] Hi
[13:25:56] hey :)
[13:26:07] Aliossandro: here's more info about the RDF dumps: https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format
[13:26:12] doesn't cover flavors, though...
[13:26:29] Thanks!
[13:28:25] Aliossandro: https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/repo/includes/LinkedData/EntityDataSerializationService.php#L313
[13:28:31] this is what the flavors mean.
[13:29:06] Aliossandro: try for instance
[13:29:08] https://www.wikidata.org/entity/Q72.ttl?flavor=simple
[13:29:56] thanks
[13:30:12] Aliossandro: feel free to add info about the flavor parameters to the wiki page :)
[13:30:12] that's actually much shorter than the previous file
[13:30:20] DanielK_WMDE: https://github.com/DataValues/Number/pull/64 maybe?
[13:30:24] Sure :)
[13:30:41] Thiemo_WMDE: i was just looking at that, yeah
[13:31:02] Thiemo_WMDE: do we have release notes that should be updated for the 0.8 release?
[13:31:28] I will update (without pull request). Only this nullable thing is missing.
[13:31:33] what else needs to be done for the release? I'm always confused about all the places where we need to bump the version number
[13:31:57] This is done. I will do the release right now. :-)
[13:32:16] You can see it here: https://github.com/DataValues/Number/pull/81/files
[13:32:21] so why do we not want to allow null there?
[13:32:29] i have no strong feelings about this, just wondering
[13:33:00] Because it's pointless and confusing. The same mistake was made in another class and already fixed by a volunteer.
[13:33:31] Conflicts with the option, that's the main reason. And it's unused anyway.
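For context on the flavor discussion above: the entity-data endpoint serves per-item RDF with a format extension and an optional `flavor` parameter. The URLs below generalize the Q72 example from the log; the description of what `simple` omits is an assumption (the log only says the result is much shorter), so check the linked EntityDataSerializationService source for the authoritative behavior.

```
# Full entity data for Q72 (Zurich) as Turtle:
https://www.wikidata.org/entity/Q72.ttl

# "simple" flavor — a reduced serialization (apparently direct
# statement values only, without the full statement structure):
https://www.wikidata.org/entity/Q72.ttl?flavor=simple
```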
[13:34:01] i guess it was a b/c measure originally
[13:34:08] pointless when working with proper releases
[13:34:21] i guess it's a habit from coding against master
[13:35:40] Thiemo_WMDE: before we start using this in Wikibase, we need to announce a breaking change to our JSON and RDF models...
[13:36:05] sure.
[13:36:22] Thiemo_WMDE: merged
[14:47:34] Jonas_WMDE: You might want to have a look, because this is the last thing in the sprint! ;-) https://phabricator.wikimedia.org/T115269#2511555
[16:12:50] is the correct way to get wikidata ids for a bunch of wikipedia titles using pywikibot to use a PagesFromTitlesGenerator and pass the result to a WikibaseItemGenerator?
[16:12:56] will that result in a single request?
[16:25:28] johtso: pywikibot has a support channel, #pywikibot
[16:25:36] you could ask them for best practices
[16:26:20] ah, cheers
[18:02:51] is there a way to only get the unqualified statements using sparql ?
[18:03:40] i don't much mind about references, pretty much all of them are "imported from nlwiki" in my case, but i need the statements which have no property qualifiers
[18:04:10] or at least get a list of qualifiers for each statement ^^'
[18:04:54] or a count of qualifiers :D
[18:06:28] tinyurl.com/j9unhay
[18:06:29] this could be it \o/
[18:06:38] i definitely need to check the examples first XD
[18:06:55] Ah yes, adding dates to awards?
[18:07:45] in my case, adding extant french départements to extant french communes
[18:08:09] i *almost* got it, but not quite. definitely getting closer though !
[18:09:50] an incredible amount of them (about half) are P131 into cantons, which in fact aren't a P131 for communes.
one of our administrative doozies :D
[18:09:51] P131 Failures when applying 178205 - https://phabricator.wikimedia.org/P131
[18:10:41] communes are P131 in départements, cantons are P131 in arrondissements, and arrondissements are P131 in départements; but communes are NOT P131 in cantons or arrondissements
[18:11:03] even our geography clusterfucks have clusterfucks :p
[18:11:03] Ah yes
[18:11:33] some cantons are smaller than communes, some are bigger, some have part of one and part or all of others...
[18:11:37] Still need to split some Dutch municipalities and places
[18:12:07] i even managed to find a canton which has TWO parts of a single commune, and these two parts are not contiguous with each other
[18:12:18] (the Reims-8 canton)
[18:57:29] now pounding WDQS with the full force of a massive query that should do it all :D
[18:57:52] 1 minute and counting, which is odd since i thought it would deadline after 30 seconds :/
[18:59:08] and by "massive", i mean 113 lines, 13 FILTER NOT EXISTS, 1 MINUS ^^°
[18:59:29] darn, "Error: could not contact server", i'm cursed XD
[19:05:03] http://tinyurl.com/z8vau6c Hashtag #massive
[19:09:04] is there a limit on the size of sparql queries, including comments ?
[19:09:17] the queries themselves, not the result sets
[19:13:44] i'll try without the unions
[19:17:48] still deadlining :-(
[19:49:48] https://www.wikidata.org/wiki/Q21175641 what on earth is that ? :D
[19:51:16] something for https://www.wikidata.org/wiki/Property:P2241
[19:51:17] P2241 gradle.properties - https://phabricator.wikimedia.org/P2241
[21:02:30] * dachary fishing for reviews of https://www.wikidata.org/wiki/Wikidata:Property_proposal/Creative_work#software_package
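The "unqualified statements" question from 18:02 can be answered with the statement-node vocabulary (p:/ps:/pq:), which exposes each statement as its own node. Below is a hedged sketch, not the query actually used in the log: it assumes the goal was commune (Q484170) P131 statements with no qualifiers at all, and the tinyurl queries above may have done this differently.

```sparql
# Sketch: French communes whose located-in (P131) statements carry no
# qualifiers. p:P131 binds the statement node; ps:P131 its main value;
# any qualifier would appear on the statement node via a pq: predicate.
SELECT ?commune ?communeLabel ?parent WHERE {
  ?commune wdt:P31 wd:Q484170 ;          # instance of: commune of France
           p:P131 ?stmt .
  ?stmt ps:P131 ?parent .
  FILTER NOT EXISTS {                    # drop statements with any qualifier
    ?stmt ?pq ?qualValue .
    ?qualProp wikibase:qualifier ?pq .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en,fr" . }
}
```

Counting qualifiers per statement (the 18:04 variant) would follow the same shape with `OPTIONAL` plus `COUNT(?pq)` and a `GROUP BY` on the statement node.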