[00:42:30] Hello
[00:50:37] hi Geoff_
[01:00:19] Hi Nikki, I want to ask about creating wikidata profiles for websites. Can you please help me?
[01:02:00] if you're wondering because of freebase, see https://www.wikidata.org/wiki/Help:FAQ/Freebase
[01:05:00] Yes and no. Google hints it wants to use wikidata the same way as freebase. And it makes sense to me that capturing structured organisational data makes wikidata appropriate. Do I have the correct interpretation?
[01:09:20] I have no idea what google really intends to do (all I know is what it says under the "By adding to Wikidata, I have a free ticket into Google's Knowledge Graph, right?" question :))
[01:10:34] as for capturing structured data, wikidata can do it, but there is also a notability policy and most websites don't meet the criteria on there
[01:11:42] Notability Policy - similar criteria to wikipedia's?
[01:12:30] wikidata's is at https://www.wikidata.org/wiki/Wikidata:Notability, it's not quite as strict as wikipedia's
[01:14:42] Thanks Nikki. I had a quick look. So next question, is there a guide to setting up a profile?
[01:18:09] not that I'm aware of
[01:20:36] Pity. I am struggling to find my way around. And the main introductory tour doesn't work.
[01:22:46] Thanks for your time Nikki. Bye.
[01:25:14] hm, it *should* work. maybe it broke again
[02:59:10] The regex that can be used to match a valid PropertyId, is it documented somewhere? I'm referring to the one mentioned here https://phabricator.wikimedia.org/P1067
[02:59:25] so that I can refer to it in my patch fix
[03:01:46] codezee: The current implementation uses '/^P[1-9]\d*$/i'
[03:03:08] https://github.com/wmde/WikibaseDataModel/blob/master/src/Entity/PropertyId.php (yes, I know that's not documentation)
[03:09:32] hoo|away: thanks anyway!
[07:29:48] 'lo everybody
[07:30:53] New to wikipedia, and trying to understand what kind of manual and automatic processes there actually are to import and export Wikidata data to and from Wikipedia.
[07:31:15] I've cruised the help a bit, but it seems their POV is more focused. Anyway, pointers appreciated.
[09:09:02] Original exception: [e4b03469] /w/index.php/Special:NewItem MWException from line 343 of gerrit\mediawiki\includes\resourceloader\ResourceLoader.php: ResourceLoader duplicate registration error. Another module has already been registered as jquery.i18n
[09:09:03] bah
[09:13:59] there can be only one!
[09:15:18] hah! I just don't understand where this magical duplicate has come from :P
[09:15:32] iirc that is from the language team
[09:16:09] hmmmmm
[09:16:25] UniversalLanguageSelector perhaps
[09:16:32] *disables*
[09:16:33] someone mentioned somewhere that we should probably pin globally the npm package we ship
[09:16:39] similar to mediawiki/vendor and the global composer
[09:16:54] or have jquery.i18n shipped with core :D
[09:17:34] yeh, must have been ULS conflicting with something
[09:18:14] one day we will caught that from CI :D
[09:18:20] catch
[09:18:28] :D
[10:05:26] hashar: can we not do things like "recheck gate-and-submit" on our jobs?
[10:06:18] addshore: yup
[10:06:33] what do I have to type? ;)
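A minimal sketch of the PropertyId format check quoted above at 03:01:46, wrapping the same '/^P[1-9]\d*$/i' pattern in a standalone helper. The function name is made up for illustration; the real validation lives in WikibaseDataModel's PropertyId class linked above, not in a free function like this.

```php
<?php
// Standalone check using the pattern quoted at 03:01:46; the helper name
// is illustrative only, not the actual Wikibase implementation.
function isValidPropertyIdFormat( $id ) {
	return is_string( $id ) && preg_match( '/^P[1-9]\d*$/i', $id ) === 1;
}

var_dump( isValidPropertyIdFormat( 'P31' ) );  // bool(true)
var_dump( isValidPropertyIdFormat( 'p569' ) ); // bool(true) - the pattern is case-insensitive
var_dump( isValidPropertyIdFormat( 'P01' ) );  // bool(false) - no leading zero allowed
var_dump( isValidPropertyIdFormat( 'Q42' ) );  // bool(false) - wrong entity type
```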
[10:06:44] addshore: the 'test' pipeline is triggered when a patch-set comment matches 'recheck'
[10:06:55] should be restricted to not run when the patch is CR+2
[10:07:08] then in gate-and-submit add a trigger for a patch-set comment having 'recheck' and CR+2
[10:07:24] ahhh
[10:07:40] I am not sure the approval restriction works properly on our zuul/gerrit setup though :-/
[10:07:53] sjoerddebruin: thanks :)
[10:08:08] another use case was that a CR+1 from a whitelisted user should trigger the test pipeline if only the check pipeline has run
[10:08:19] or to put it another way, a CR+1 from the whitelist should run test on V+1 changes
[10:08:40] was https://gerrit.wikimedia.org/r/#/c/184886/
[10:08:59] could someone invite me to the admin channel?
[10:15:09] nikki: no, you're banned ;)
[10:15:13] mwhaahhaaaaaaa
[10:15:25] aww mean! :P
[10:15:46] also nikki you should get a vhost! then I can add you to the invite exempt list
[10:16:07] or, well, I can do it without a vhost too (it really depends on how often your hostname will change)
[10:18:35] by vhost I guess you mean the hostname mask thing?
[10:18:37] vhost/irc cloak
[10:19:09] addshore: I don't think the IRC cloak is needed since his nick is registered with Nickserv
[10:19:33] mhhhm, if I recall though the invite exception is a channel mode, not a chanserv thing
[10:19:42] ah
[10:19:49] my baad :-}
[10:20:12] and yeh, I mean hostname cloak, not vhost :D https://freenode.net/faq.shtml#cloaks
[12:59:54] hashar: any idea about https://gerrit.wikimedia.org/r/#/c/227185/ ? O_o
[13:00:19] it seems something is broken with a Zend + MySQL combination O_o
[13:17:50] 00:00:27.037 ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
[13:17:51] addshore:
[13:18:27] addshore: seems mysql disappeared from the Precise slaves :(
[13:18:31] ;_;
[13:18:52] it's an addshore ! :)
[13:18:59] it's an aude !
[13:19:02] :)
[13:19:23] *wants to get that thing in before the branch! :P
[13:20:25] hello
[13:20:51] what thing?
[13:21:19] I bet this question has been asked a thousand times, but I failed to find it: why was it decided that data has to be separated from wikipedia instead of standardizing infoboxes etc?
[13:21:30] aude: https://gerrit.wikimedia.org/r/#/c/227185/ ;) but some of the CI slaves are slightly broken currently, hence the fails :D
[13:22:30] well d33tah I guess the separation was mainly needed as the same information is needed by hundreds of sites
[13:22:32] d33tah: so when things change (e.g. a new census), we can update the information in one place
[13:22:40] instead of 300x for each wikipedia
[13:22:46] ya :)
[13:22:57] and it's easier for reuse
[13:23:33] d33tah: you might also be interested in https://www.mediawiki.org/wiki/Extension:Capiunto which will help with the infoboxes
[13:24:00] aude: okay, so it's about not splitting them across languages. but still, such a system could probably be built between languages.
[13:24:01] addshore: it seems I have a cloak now
[13:24:19] for example by treating enwiki as an authoritative source
[13:24:38] a system was built between languages, wikidata :P now it just needs better integration
[13:24:44] hmmm
[13:25:00] is there any plan to allow wikipedia to transparently reference wikidata information?
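As an aside on the question just asked: besides the on-wiki parser function mentioned a little further below, the same statement data can be read straight from the wikidata.org API. This is a rough sketch only, assuming the public wbgetclaims module; Q42 and P569 are just convenient example IDs, and the response layout in the comments is an assumption, not a guaranteed interface.

```php
<?php
// Rough sketch: fetch one statement value from the wikidata.org API.
// Q42 / P569 are example IDs; the array structure walked below reflects
// the public wbgetclaims JSON output as understood here, and may differ.
$url = 'https://www.wikidata.org/w/api.php'
	. '?action=wbgetclaims&entity=Q42&property=P569&format=json';

$context = stream_context_create( array(
	'http' => array( 'header' => "User-Agent: wikidata-read-sketch/0.1\r\n" ),
) );

$response = json_decode( file_get_contents( $url, false, $context ), true );

// Each claim carries a mainsnak; for a time datatype the datavalue holds
// an ISO-like timestamp such as "+1952-03-11T00:00:00Z".
$claims = $response['claims']['P569'];
echo $claims[0]['mainsnak']['datavalue']['value']['time'] . "\n";
```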
[13:26:14] Something like 'Bill Gates was born on {{D Q5824."date of birth" D}}'
[13:28:17] it could work so that wikipedia would detect such a reference, let wikidata know about it, and when the underlying wikidata article gets changed, some bot updates all references
[13:28:28] it should already be possible to do something like {{#property:P569}} or {{#property:date of birth}} to use data from wikidata
[13:29:07] although from what I've heard, people on wikipedia are generally not very keen on using it in articles directly and tend to prefer using it via other templates (e.g. infoboxes)
[13:29:24] which sounds like a big problem imho
[13:31:57] addshore: somehow mysql disappeared from the slaves :-/ Should be good now.
[13:31:59] a big problem that people are reluctant to use it? it's a big change, it'll take time for people to become comfortable with it
[13:32:08] awesome hashar :D
[13:32:14] addshore: well not yet :)
[13:32:19] puppet still running
[13:32:25] :D
[13:32:39] *holds his finger over the +2 button*
[13:33:00] addshore: press !!!!!!!!
[13:33:09] hashar: did it totally disappear or just stop running?
[13:33:15] hashar: pressed it!
[13:33:17] stopped running / did not boot
[13:33:36] nice :)
[13:55:04] Woop, only roughly 11 usages of LibSerializerFactory to go... and 8 of those are in tests :P
[14:33:58] la la la la la, remove all of the things
[14:47:50] jzerebec1i: so github rate limits us ? :-}
[14:48:44] hashar: yup, we have run into the issue before, it's just an API call limit for unauthenticated API calls
[14:49:26] 60 requests per hour
[14:49:45] 5000 if you are authed ;)
[14:50:32] needs an oauth token
[14:50:49] afaik to avoid the rate limit
[14:51:02] yeh, unless we tackle the issue at a bigger level ;)
[14:51:39] don't we use composer install with --prefer-dist ?
[14:51:49] that should cache the tarball locally on the instances
[14:52:48] indeed it does cache, but I guess in the cases where we were hitting the issue it still had to download over 60 things in an hour
[14:53:00] per integration host
[14:54:13] :-/
[14:54:22] what do we hit on github?
[14:54:28] are they .tar.gz of the tagged version?
[14:54:42] Something might have changed with github or with composer, meaning it now hits the API whereas before it didn't (perhaps)
[14:54:54] 00:00:25.969 - Installing wikibase/data-model-serialization (1.6.0)
[14:54:54] 00:00:26.024 Downloading https://api.github.com/repos/wmde/WikibaseDataModelSerialization/zipball/1b6df155e1a0a6565789e2258ccee8557ec9a803
[14:57:10] hashar: there is another url path but apparently composer doesn't use it
[14:57:24] oh, that might explain why the tour wasn't working for someone earlier... someone had merged the item into Q2 >_<
[14:57:31] The download zip link on the site, for example for master, can have master replaced with a hash
[14:57:32] https://github.com/wmde/WikibaseDataModelSerialization/archive/1b6df155e1a0a6565789e2258ccee8557ec9a803.zip
[14:57:41] which in turn redirects to something like https://codeload.github.com/wmde/WikibaseDataModelSerialization/zip/1b6df155e1a0a6565789e2258ccee8557ec9a803
[14:59:48] hashar: see https://phabricator.wikimedia.org/T106519 for my comment on the /easiest/ way around it ;)
[15:09:31] I am really tempted to set up a shared proxy :D
[15:17:30] hashar: that definitely sounds like a plan! ;)
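To make the 60 vs. 5000 requests/hour figures from the rate-limit discussion above concrete, here is a rough sketch that asks GitHub's /rate_limit endpoint for the current core limit, with and without a token. The helper name and token handling are illustrative only; on CI this would normally be handled through composer's own github-oauth configuration or the shared proxy being discussed, not hand-rolled code like this.

```php
<?php
// Rough sketch: check GitHub's API rate limit. Illustrates the 60 (anonymous)
// vs. 5000 (authenticated) requests/hour figures mentioned above; the function
// name and token handling are made up for this example.
function githubCoreRateLimit( $token = null ) {
	$headers = "User-Agent: ci-rate-limit-check\r\n";
	if ( $token !== null ) {
		$headers .= "Authorization: token $token\r\n";
	}
	$context = stream_context_create( array(
		'http' => array( 'header' => $headers ),
	) );
	$json = file_get_contents( 'https://api.github.com/rate_limit', false, $context );
	$data = json_decode( $json, true );

	return $data['resources']['core']; // array with 'limit', 'remaining', 'reset'
}

print_r( githubCoreRateLimit() );              // 'limit' => 60 for anonymous calls
// print_r( githubCoreRateLimit( 'TOKEN' ) );  // 'limit' => 5000 with an oauth token
```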
[15:18:07] that would benefit the other languages' tools as well ( pip / gem / npm )
[15:18:17] I am wondering whether varnish could be used as a proxy
[15:45:37] addshore: https://github.com/wmde/WikibaseDataModel/pull/516
[15:47:07] JeroenDeDauw: looking
[15:47:11] aude: have we branched yet? :O
[15:49:53] JeroenDeDauw: I'm happy to merge yours, confused by 513 and 514, so I'll leave those for now and merge yours?
[15:50:22] we can always fork another branch off if we want a 3.1 release anyway
[15:50:28] addshore: sure. 513 and 514 are not related, apart from them being other stuff that is being worked on in the same component
[15:50:49] nothing special about this case either, 513 and 514 are not huge things
[15:50:56] merged
[15:51:11] \o/
[15:51:45] I think I'm going to put my serialization modifier thing in its own lib, might need to use it in 2 more places in Wikibase
[15:56:43] or maybe I'll move it... :p
[16:06:44] addshore: i don't think so
[17:12:24] addshore: do you mean its own component?
[17:12:40] Where is it needed besides Wikibase Repository?
[17:41:39] https://www.wikidata.org/wiki/User:Aude/uniqueness property label conflicts :/
[17:43:51] branch branch branch
[17:45:04] hoo: want to do it?
[17:50:39] :D
[17:52:20] I can later on
[17:52:27] on the phone atm
[18:18:55] JeroenDeDauw: thoughts on removing the code I added to ResultBuilder and adding it to some new classes? perhaps in Repo/Serialization
[18:19:52] Would have a handful of classes that specifically alter the serialization in the ways that we need, i.e. EntitySerializationModifier, SnakListModifier, SiteLinkListModifier, StatementModifier? Each with a small collection of methods?
[19:04:35] addshore: sounds plausible but I'd have to look at the code more to give real feedback
[19:05:00] okay, just throwing up a very rough draft with failing tests
[19:05:46] https://gerrit.wikimedia.org/r/#/c/227266/
[19:05:56] very rough draft ;)
[20:07:23] I'll create the branch later on... need food first
[23:56:11] hoo: around?
[23:56:49] sure
[23:57:21] I'll make the branch now
[23:57:24] if not already done
[23:57:30] ok, go ahead
[23:57:32] k
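A closing aside on the serialization-modifier idea discussed above around 18:18 and 19:05: a purely hypothetical sketch of what one of those small helper classes might look like, applying a callback at a key path inside an API-style serialization array. The class and method names are invented for illustration and are not the classes in the draft patch linked above.

```php
<?php
// Hypothetical sketch only: a helper that walks a nested serialization array
// and applies a callback at a given key path. Names are invented here and do
// not reflect the actual Wikibase code under review.
class SerializationModifierSketch {

	/**
	 * @param array $serialization nested array, e.g. an entity serialization
	 * @param string[] $path keys to walk, e.g. array( 'sitelinks', 'enwiki' )
	 * @param callable $callback applied to the value found at the path
	 * @return array the modified serialization
	 */
	public function modifyAtPath( array $serialization, array $path, $callback ) {
		if ( $path === array() ) {
			return call_user_func( $callback, $serialization );
		}
		$key = array_shift( $path );
		if ( isset( $serialization[$key] ) && is_array( $serialization[$key] ) ) {
			$serialization[$key] = $this->modifyAtPath( $serialization[$key], $path, $callback );
		}
		return $serialization;
	}
}

// Example use: re-index one nested list as a plain array in the output.
$modifier = new SerializationModifierSketch();
$entity = array( 'sitelinks' => array( 'enwiki' => array( 'badges' => array( 5 => 'Q17437798' ) ) ) );
$entity = $modifier->modifyAtPath(
	$entity,
	array( 'sitelinks', 'enwiki', 'badges' ),
	function ( $badges ) { return array_values( $badges ); }
);
```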