[06:55:53] hi [06:56:17] is there a plan to deploy wikidata at wv? [06:56:19] and when? [06:58:48] wikiversity? [06:59:26] yep legoktm [06:59:37] Juandev: I don't see a proposal created on https://www.wikidata.org/wiki/Wikidata:Sister_projects, but there's nothing stopping you from starting it :) [07:00:07] well, I thought the deployment was global. to all wmf projects [07:00:09] Once there's a proposal which has wikidatans + wikiversity-ians (not sure what the name is...) and the devs have said it's feasible, they'll put a timetable together [07:00:23] right, but its happening in stages [07:01:09] first was wikipedias, then wikivoyages, and currently wikisource deployment (phase 2 is tomorrow) [07:01:24] Juandev: https://www.wikidata.org/wiki/Wikidata:Development_plan#Access_for_remaining_sister_projects [07:01:30] and what legoktm said [07:01:39] ohai Lydia_WMDE o/ [07:01:43] ohai! [07:06:54] legoktm: and wikipedians also proposed?:-) [07:07:14] they're the ones who proposed the project in the first place :P [07:07:21] well, that's an understatement [07:07:50] Lydia_WMDE: an how long it will take? any estimate? [07:08:50] Juandev: sorry, no - but expect at least 1,5 months for each project [07:08:59] I see [07:09:23] so wikidata operates interwiki, what else? you were talking about infoboxes? [07:09:38] the properties and statements [07:09:40] what about wikt at all?:-) [07:09:45] aka phase2 aka infoboxes [07:10:24] Juandev: https://commons.wikimedia.org/wiki/File:Wikidata_statement.svg is basically what a statement is [07:10:54] so on the page about [[London]], you'd be able to use {{#property:population}} and it would give the population from wikidata [07:11:01] or a much cooler Lua interface [07:11:26] geee, col [07:11:28] cool [07:11:43] also https://www.wikidata.org/wiki/Help:Statements has more info [07:12:19] i'm not sure what the applications for wikiversity are, but the data is available for you to use :) [07:12:33] well, I think we should think how to use it in wv, we have there some databases already [07:14:04] right [07:14:15] first you probably want to figure out how to link pages to existing items [07:14:24] or if that should even be done? [07:14:33] (im not familiar with wikiversity at all) [07:16:34] I see [07:23:02] well, time to k-a and two bubble gums [08:39:04] (03CR) 10WikidataJenkins: "Build Failed" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115310 (owner: 10Thiemo Mättig (WMDE)) [08:40:17] (03PS3) 10Tobias Gritschacher: Browsertests for statements with item values [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/110966 [08:43:30] (03CR) 10WikidataJenkins: [V: 04-1] "Build Failed" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/110966 (owner: 10Tobias Gritschacher) [08:47:57] (03CR) 10WikidataJenkins: "Build Failed" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/110966 (owner: 10Tobias Gritschacher) [09:07:10] (03CR) 10WikidataJenkins: "Build Failed" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/110966 (owner: 10Tobias Gritschacher) [09:13:21] (03CR) 10WikidataJenkins: "Build Failed" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/110966 (owner: 10Tobias Gritschacher) [09:20:03] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/110966 (owner: 10Tobias Gritschacher) [09:20:34] (03PS3) 10Tobias Gritschacher: Wikibase parsing of diff=... 
parameter was different from core [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115310 (owner: 10Thiemo Mättig (WMDE)) [09:25:37] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115310 (owner: 10Thiemo Mättig (WMDE)) [09:29:51] (03PS1) 10Henning Snater: toolbareditgroup: Registering "cancel" event handler in getButton() [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115351 [09:35:05] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115351 (owner: 10Henning Snater) [09:39:15] (03CR) 10Tobias Gritschacher: [C: 032] toolbareditgroup: Registering "cancel" event handler in getButton() [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115351 (owner: 10Henning Snater) [09:39:36] (03Merged) 10jenkins-bot: toolbareditgroup: Registering "cancel" event handler in getButton() [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115351 (owner: 10Henning Snater) [09:45:37] hoi, the system is not stable ... there are spurious error messages that indicate things like loss of authorisation, loss of connection, database errors [09:51:12] (03PS1) 10Aude: fix reference to WikibaseRepo in client [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115353 [09:56:19] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115353 (owner: 10Aude) [10:00:26] (03PS1) 10: New Wikidata Build - 25/02/2014 10:00 [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [10:02:14] (03CR) 10WikidataJenkins: [V: 04-1] "Build Failed" [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [10:04:05] oh no [10:04:59] looks like core issue [10:08:05] (03CR) 10Tobias Gritschacher: [C: 031] "Looks fine to me." [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115310 (owner: 10Thiemo Mättig (WMDE)) [10:09:58] (03PS2) 10Aude: New Wikidata build - 25/02/2014 10:00 [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [10:10:47] (03CR) 10WikidataJenkins: [V: 04-1] "Build Failed" [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [10:19:20] (03CR) 10WikidataJenkins: [C: 032 V: 032] "Build Successful" [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [10:21:59] (03Merged) 10jenkins-bot: New Wikidata build - 25/02/2014 10:00 [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [10:44:15] (03PS1) 10Addshore: Fix composer require versions [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 [10:45:28] (03CR) 10WikidataJenkins: [V: 04-1] "Build Failed" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 (owner: 10Addshore) [11:36:17] (03PS2) 10Addshore: More control in compoer require versions [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 [11:36:22] Tobi_WMDE: ^^ [11:37:19] waiting for jenkins [11:37:31] Hi! please help as admin at https://www.wikidata.org/wiki/User_talk:Magnus_Manske#authority_control.js Thanks! 
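To make the statements discussion from earlier in the morning (around 07:10) a bit more concrete: the sketch below shows, in plain PHP, roughly what one statement looks like once serialized, and what a client-side lookup such as {{#property:population}} or the Lua interface ultimately reads out of it. The structure is modelled on the JSON serialization but heavily simplified; the property IDs (P1082 "population", P585 "point in time") and the population figure are illustrative assumptions, not taken from the log.

```php
<?php
// Heavily simplified sketch of one Wikidata statement as data, modelled on
// the JSON serialization. Property IDs and the value are assumptions used
// for illustration only.

$statement = [
    'mainsnak' => [
        'snaktype'  => 'value',
        'property'  => 'P1082',
        'datavalue' => [
            'type'  => 'quantity',
            'value' => [ 'amount' => '+8173941' ],
        ],
    ],
    'qualifiers' => [
        'P585' => [ /* "point in time" qualifier snaks would sit here */ ],
    ],
    'rank'       => 'normal',
    'references' => [],
];

// Conceptually, {{#property:population}} on a client wiki boils down to
// reading the main snak's value for the matching property:
echo $statement['mainsnak']['datavalue']['value']['amount'], "\n"; // +8173941
```

Qualifiers and ranks sit next to the main snak, which is also why the Lua question later in the evening (around 22:00) comes down to reading this same structure as a table.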
[11:41:19] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 (owner: 10Addshore) [11:43:05] (03CR) 10Tobias Gritschacher: [C: 032] More control in compoer require versions [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 (owner: 10Addshore) [11:43:07] (03CR) 10WikidataJenkins: "Browsertests for new build on beta were successful" [extensions/Wikidata] - 10https://gerrit.wikimedia.org/r/115355 [11:43:25] (03Merged) 10jenkins-bot: More control in compoer require versions [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 (owner: 10Addshore) [11:46:48] addshore: aude: Lydia_WMDE: wohoo!! http://wikidata-jenkins.wmflabs.org/ci/job/wikidata-build-browsertests-performance/performance/ [11:49:36] \o/ [11:50:29] https://gerrit.wikimedia.org/r/#/c/115366/ [11:53:15] Tobi_WMDE: holy shit... [12:15:12] (03CR) 10Nikerabbit: More control in compoer require versions (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115362 (owner: 10Addshore) [13:04:04] [13Common] 15thiemowmde comment on pull request #3 145f310a5: I think this must be aware of null: return $dataValue->getValue() !== null ? $dataValue->getValue() : null; 02http://git.io/XcPcFg [13:05:46] [13Common] 15thiemowmde comment on pull request #3 145f310a5: Please add:... 02http://git.io/rKosGg [14:14:49] (03CR) 10Addshore: [C: 04-1] Implement TimeParsers [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [14:39:10] aude: did we deploy the new version on test.wikidata.org already? [14:42:05] last night [14:53:13] yay [14:55:28] (03PS31) 10Addshore: Implement TimeParsers [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 [14:55:39] (03CR) 10jenkins-bot: [V: 04-1] Implement TimeParsers [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [14:57:00] (03PS32) 10Addshore: Implement TimeParsers [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 [14:57:10] (03CR) 10jenkins-bot: [V: 04-1] Implement TimeParsers [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [14:57:18] aude: addshore: ähhm.. https://test.wikidata.org/wiki/Q232 [14:57:18] (03CR) 10Addshore: [C: 04-1] "PS32 is a rebase" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [14:58:15] yikes [14:58:19] what did you do? [14:59:09] aude: nothing [14:59:17] this is just an item that does not exist [14:59:31] hmmm [14:59:37] you can reproduce that with every Q that does not exist [14:59:47] bit I can't reproduce locally [14:59:52] me neither [14:59:52] *but [15:01:24] aude: oh [15:01:33] bad news... i think there is an issue with logging exceptiosn not of MWException type [15:01:33] no you CANNOT reproduce it with every Q [15:01:40] oooo [15:01:52] https://test.wikidata.org/wiki/Q444 [15:01:57] works [15:02:03] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:03:01] oh, i know... [15:03:11] probably @include_once is suppressing it perhaps [15:03:30] Q232 is supposed to be italy [15:03:52] didn't we remove the @? [15:03:59] there was a patch by hoo [15:04:01] not in the branch [15:04:17] https://test.wikidata.org/wiki/Special:EntityData/Q232.json works [15:04:22] oh [15:04:31] https://test.wikidata.org/wiki/Q232 not [15:04:48] so your question was right: what did I do?? 
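The blank page on Q232 and the empty exception log tie back to the error-suppression operator flagged just above ("probably @include_once is suppressing it"). Below is a minimal standalone sketch of that behaviour — a made-up file name, not actual Wikibase code:

```php
<?php
// Standalone sketch of why the "@" operator makes failures like the blank
// Q232 page so hard to debug. The included file name is made up.
error_reporting( E_ALL );
ini_set( 'display_errors', '1' );

// With "@", a failing include is completely silent: nothing on screen,
// nothing in the error log, just a broken page further down the line.
@include_once 'WikibaseLib.does-not-exist.php';

// Without "@", PHP at least reports what went wrong:
// "Warning: include_once(WikibaseLib.does-not-exist.php): failed to open stream ..."
// (a failed include_once is not recorded, so this second call does run)
include_once 'WikibaseLib.does-not-exist.php';
```

Dropping the suppression — the "Don't use @include" change backported later in the afternoon — is what finally made useful stack traces show up.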
[15:04:53] ahhh [15:04:58] no clue [15:05:11] * aude wonders why this does not appear in exception log [15:05:29] makes it almost impossible to debug :( [15:05:38] also it does not appear in the recent changes [15:05:39] :) [15:06:08] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:06:09] oh daniel made [15:06:10] it [15:06:20] "From Q38" [15:06:57] ah [15:06:59] ok [15:07:11] so, there's probably something wrong with the data then.. [15:07:20] aude: can you remove q232? [15:07:21] maybe special:import [15:07:37] then I let the selenium test create a new Italy. :-) [15:07:49] first, try to reproduce [15:08:00] my user is not able to remove entities on test. [15:09:38] https://test.wikidata.org/w/index.php?title=Special:Import&action=submit [15:09:41] throws exception [15:12:34] don't have permissions for Special:Import [15:12:54] the import did work [15:12:59] i just got database transaction warnings [15:13:29] aude: perhaps the import was not working back then when daniel was doing it [15:13:36] that was already some time ago [15:13:49] but on test wikidata i get exception [15:14:12] aude: when looking at an imported emntity? [15:14:25] it does a transwiki import though on test wikidata [15:14:32] i don't have that on my dev wiki [15:14:43] Tobi_WMDE: when submitting the special page [15:14:52] ok [15:15:11] are you tobijat? [15:16:24] yes [15:16:29] you can now try https://test.wikidata.org/wiki/Special:Import [15:17:16] now i have stack traces! [15:17:30] "Malformed quantity: Q23" from quanitty parser [15:17:30] Unexpected non-MediaWiki exception encountered, of type "InvalidArgumentException" [15:17:44] aude: ok [15:17:51] http://dpaste.com/1658053/ [15:18:47] why does it want to parse "Q23" as quantity? [15:18:56] it should fail [15:18:56] ParseException( 'Malformed quantity: ' . $value ); [15:19:07] that seems fine [15:19:08] not InvalidArgumentException though maybe it gets rethrown [15:19:18] oh..... [15:19:31] probably a mismatch of property type in the json [15:19:39] vs the actual property on test wikidata [15:19:50] yes. that can it be [15:20:14] I guess that's it [15:20:27] that will not happen when the selenium test creats the entity [15:20:35] it will create all of the properties first [15:20:36] :) [15:21:05] well, we need to fix [15:21:11] i can't just delete it yet [15:21:35] Q23 might be from my imported item also [15:22:25] gah, https://www.wikidata.org/wiki/Special:EntityData/Q60.json [15:22:32] unavailable [15:23:09] (not new issue) [15:24:45] Tobi_WMDE: http://dpaste.com/1658076/ is for Q232 [15:25:21] aude: no? [15:25:21] so? [15:25:21] imports do not work and never did I guess [15:25:40] why is https://test.wikidata.org/w/api.php?action=wbparsevalue&format=json&parser=quantity&values=Q232&options=%7B%7D [15:25:44] being requested [15:27:05] aude: the 503 is https://bugzilla.wikimedia.org/show_bug.cgi?id=60003 right? [15:27:05] but it's also happening for small items, e.g. q2 [15:27:13] think so [15:28:01] so it is not a "huge item" problem as mentioned in the bug [15:28:01] I don't know waht special:import is doing [15:28:01] are we supporting imports? [15:28:01] is it supposed to work? [15:28:05] i guess not.. [15:28:23] why would somebody want to import an entity from one wikibase repo to another? [15:28:27] one had to create all of the properties first [15:28:38] and all of the referenced items [15:28:52] and their referenced items... 
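The "Malformed quantity: Q23" trace above comes from a data-type mismatch: the imported JSON was written against a property that held item values, while the property with the same ID on test.wikidata is a quantity, so the item ID string lands in the quantity parser. A self-contained sketch of that failure mode, using hypothetical names rather than the real DataValues/Wikibase classes:

```php
<?php
// Hypothetical, self-contained sketch of the mismatch above; these are not
// the real DataValues/Wikibase classes, just the shape of the failure.

class ParseException extends RuntimeException {
}

function parseQuantity( $value ) {
    // A quantity must be numeric; an item ID like "Q23" is not.
    if ( !preg_match( '/^[-+]?\d+(\.\d+)?$/', $value ) ) {
        throw new ParseException( 'Malformed quantity: ' . $value );
    }
    return (float)$value;
}

// Data type of the property as registered on the *importing* wiki:
$localPropertyType = [ 'P123' => 'quantity' ];

// Snak from the *imported* JSON, where P123 was an item-type property:
$importedSnak = [ 'property' => 'P123', 'value' => 'Q23' ];

try {
    if ( $localPropertyType[ $importedSnak['property'] ] === 'quantity' ) {
        parseQuantity( $importedSnak['value'] );
    }
} catch ( ParseException $ex ) {
    echo $ex->getMessage(), "\n"; // Malformed quantity: Q23
}
```

This is also why Special:Import between repos is a poor fit, as discussed above: every referenced property and item would have to exist locally first, with matching IDs and data types.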
[15:28:58] not exactly sure [15:29:06] and the referenced items of these.. aso [15:29:11] it's not supported to import wikitext to entity ns, for example [15:29:49] and it should also not be possible to import entities with special:import [15:30:02] it was never possible i guess [15:30:11] where would somebody import from? [15:30:50] think i will backport the change that eliminates the @ [15:31:00] aude: might be good [15:31:08] there might be more than one exception occurring [15:31:14] so, we can't get rid of 232? [15:31:22] then I have to create different testdata [15:31:25] important for debugging [15:31:37] try infectious disease :) [15:32:39] I'll create a second i`taly [15:32:44] ok [15:34:54] (03PS1) 10Aude: Don't use @include as that supresses useful error output [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115383 [15:38:47] Tobi_WMDE: i imported Q232 to my test wiki [15:38:52] http://dpaste.com/1658116/ [15:39:42] (03PS33) 10Addshore: Implement TimeParsers + poke TimeFormatters [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 [15:39:44] mismatching property for snak, vs what i have in wb_property_info etc for the actual property [15:39:51] (03CR) 10jenkins-bot: [V: 04-1] Implement TimeParsers + poke TimeFormatters [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:40:27] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115383 (owner: 10Aude) [15:41:06] (03PS34) 10Addshore: Implement TimeParsers + poke TimeFormatters [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 [15:41:15] (03CR) 10jenkins-bot: [V: 04-1] Implement TimeParsers + poke TimeFormatters [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:41:37] (03CR) 10Addshore: "WHYYYY!" 
[extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:42:45] (03PS35) 10Addshore: Implement TimeParsers + poke TimeFormatters [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 [15:47:04] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:48:58] (03CR) 10Hoo man: [C: 032] "Per master" [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115383 (owner: 10Aude) [15:49:13] (03Merged) 10jenkins-bot: Don't use @include as that supresses useful error output [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115383 (owner: 10Aude) [15:49:47] might as well backbort the composer.json patch too :O [15:50:00] keep the branch tidy ;p [15:51:26] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:53:21] addshore: sounds good [15:53:35] although i think removing @ might not be enough in this case [15:55:45] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/111464 (owner: 10Addshore) [15:57:24] (03PS1) 10Addshore: More control in composer require versions [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115389 [16:02:51] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115389 (owner: 10Addshore) [16:19:55] Tobi_WMDE: http://pastebin.com/A380DtYp [16:20:37] addshore: Interesting :p [16:30:46] (03PS1) 10Aude: backport Iafcc7c, remove @ from @include_once for WikibaseLib [extensions/Wikidata] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115392 [16:38:53] (03CR) 10WikidataJenkins: [C: 032 V: 032] "Build Successful" [extensions/Wikidata] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115392 (owner: 10Aude) [16:41:35] (03Merged) 10jenkins-bot: backport Iafcc7c, remove @ from @include_once for WikibaseLib [extensions/Wikidata] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115392 (owner: 10Aude) [16:42:42] amazing [16:51:49] (03PS1) 10Aude: Handle InvalidArgumentException in ClaimHtmlGenerator [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 [16:52:41] (03CR) 10jenkins-bot: [V: 04-1] Handle InvalidArgumentException in ClaimHtmlGenerator [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [16:52:58] rage [16:53:53] (03CR) 10WikidataJenkins: [V: 04-1] "Build Failed" [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [16:54:22] (03PS2) 10Aude: Handle InvalidArgumentException in ClaimHtmlGenerator [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 [16:55:19] (03CR) 10jenkins-bot: [V: 04-1] Handle InvalidArgumentException in ClaimHtmlGenerator [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [16:56:20] (03CR) 10WikidataJenkins: [V: 04-1] "Build Failed" [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [16:56:26] (03PS3) 10Aude: Handle InvalidArgumentException in ClaimHtmlGenerator [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 [17:01:52] (03CR) 10WikidataJenkins: [V: 032] "Build Successful" [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [17:37:57] DanielK_WMDE__: around? 
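The "Handle InvalidArgumentException in ClaimHtmlGenerator" backport above boils down to a defensive-rendering pattern: if one snak value cannot be formatted, emit an inline error rather than letting the whole entity page die. A rough sketch of that pattern — ValueFormatter and renderSnakValueHtml() are hypothetical stand-ins here, not the actual Wikibase classes or the actual patch:

```php
<?php
// Rough sketch of the pattern behind the backport above; the interface and
// function are hypothetical stand-ins, not the real Wikibase code.

interface ValueFormatter {
    /** @throws InvalidArgumentException for values it cannot handle */
    public function format( $value );
}

function renderSnakValueHtml( ValueFormatter $formatter, $value ) {
    try {
        return $formatter->format( $value );
    } catch ( InvalidArgumentException $ex ) {
        // Keep the rest of the page renderable and surface the problem
        // inline, so a page like Q232 stays viewable ("not pretty or fast,
        // but not a blank page either").
        return '<span class="error">' . htmlspecialchars( $ex->getMessage() ) . '</span>';
    }
}
```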
[17:39:28] well, forgot something :P [17:39:37] hoo: sorta [17:39:40] I'll be back in maybe 20 minutes to do the dump debug stuff [17:39:41] sorry [17:53:29] hoo|away: DanielK_WMDE__ willing to look at https://gerrit.wikimedia.org/r/115397 ? [17:53:48] it might help us debug / resolve problem in https://test.wikidata.org/wiki/Q232 [17:54:08] (and in case it's a problem on wikidata somewhere) [18:07:33] (03CR) 10Addshore: [C: 032] Handle InvalidArgumentException in ClaimHtmlGenerator [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [18:23:52] (03PS1) 10Aude: Update build, fixes uncaught InvalidArgumentException [extensions/Wikidata] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115415 [18:31:56] (03CR) 10WikidataJenkins: [C: 032 V: 032] "Build Successful" [extensions/Wikidata] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115415 (owner: 10Aude) [18:34:35] (03Merged) 10jenkins-bot: Update build, fixes uncaught InvalidArgumentException [extensions/Wikidata] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115415 (owner: 10Aude) [18:46:48] https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/extensions/Wikidata,n,z [18:46:51] Lydia_WMDE: [18:46:54] i am looking at each diff [18:47:10] can also look at https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/extensions/Wikibase,n,z [19:06:21] DanielK_WMDE__: it's ridiculous slow... will try to get some useful profiler output [19:08:02] i wonder if it's db bottleneck? [19:08:32] aude: looking.. the current version would take about two weeks to run on all items (very rough estimate, of course) [19:08:35] could the dump script instead work from an xml dump and make json dump [19:08:59] no db access needed in that case, use the pages-current (whatnot) .xml [19:09:32] or maybe have such option [19:09:55] hi aude, how are things going here? [19:09:56] aude: Lets first wait for the profiler [19:10:01] hoo: ok [19:10:08] working with pages current is pretty fast [19:10:17] multichill: good :) [19:12:09] aude: Is Reedy even around this week? Haven't seen him a single time... AFAIR [19:12:15] no idea [19:12:47] pages-meta-current.xml is what i use [19:12:48] mh [19:15:00] yikes... ram usage of the maint. script was over 1.2GiB at the end... [19:15:14] :( [19:15:15] only ran it for about 6k entities, though [19:15:17] I'm usually around [19:15:21] It's whether i'm paying any attention [19:15:23] Or sleeping [19:15:56] Reedy: yay [19:16:16] thought maybe it was holiday [19:16:31] More like exhaustion [19:16:39] * aude nods [19:16:40] DanielK_WMDE__: Ok, first of all: The script is leaking memory... it takes a few MiB more every few seconds... [19:17:10] before deployment, we need https://gerrit.wikimedia.org/r/#/c/115418/ [19:17:17] after deployment: https://gerrit.wikimedia.org/r/#/c/115393/ [19:17:29] anytime: https://gerrit.wikimedia.org/r/#/c/115366/ [19:20:10] aude: we should put the new Lua doc. live I guess https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/docs/lua.wiki [19:20:23] or only on Thursday then it's on the 'pedias? [19:22:10] Reedy: Ariel said you are the person to add people to the wmf-deployment group (in gerrit). At least Katie should have it now, but I could also use it [19:22:30] AFAIK i can't [19:22:39] I've no rights on gerrit [19:22:44] :P [19:22:50] Who's the one, then? 
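One idea raised above is to build the JSON dump from the pages-meta-current XML dump instead of pulling every entity through the repo's lookup layer — roughly the approach aude describes for her own dump parsing a bit further down. A rough sketch of that streaming approach with PHP's XMLReader; the file name and the "title starts with Q" item filter are simplifying assumptions, not the real maintenance script:

```php
<?php
// Rough sketch of streaming entity JSON out of a pages-meta-current XML dump.
// XMLReader reads the file incrementally, so memory stays flat no matter how
// large the dump is.

$reader = new XMLReader();
$reader->open( 'pages-meta-current.xml' );

$title = null;
while ( $reader->read() ) {
    if ( $reader->nodeType !== XMLReader::ELEMENT ) {
        continue;
    }
    if ( $reader->name === 'title' ) {
        $title = $reader->readString();
    } elseif ( $reader->name === 'text' && $title !== null && $title[0] === 'Q' ) {
        // For item pages the revision text is the entity's JSON blob.
        $entity = json_decode( $reader->readString(), true );
        // ... write $entity to the output dump here; nothing is retained,
        //     so there is no cache to leak.
        echo $title, "\n";
    }
}
$reader->close();
```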
[19:22:58] Almost as many as I do on bugzilla [19:23:11] Hmm, maybe I can [19:23:17] Looks like it's self replicating [19:23:44] DanielK_WMDE__: Ok, got the profiling stuff... but I guess you wont like it [19:23:50] hoo: aude isn't in it now... [19:24:08] Reedy: i can ask chad, rob, greg [19:24:35] DanielK_WMDE__: aude: http://fpaste.org/80323/39335618/ [19:24:37] Don't think greg can do it [19:24:47] Reedy: he can say ok [19:24:50] also i can ask ori [19:24:53] I've added you both [19:25:01] :) [19:25:11] :) [19:25:31] Once I've got my mortal, I might also use that as part of my volunteer work :) [19:25:55] i don't want to self merge, so happy to have hoo review [19:26:27] I still believe the bottleneck with dumpJson is memory usage... as that's PHP's general weak point [19:26:50] hmmm [19:27:16] i ran through entire dump with php and put stuff into my db [19:27:32] aude: using what script? [19:27:33] it wasn't bad at all [19:27:38] i'll have to dig it up [19:28:09] https://github.com/filbertkm/wikidata-dump-parser for handling the dump, putting into a db (postgres) [19:28:11] I guess one of the problems is that Wikibase itself statically caches entities [19:28:19] then i used php / data model to read that [19:28:25] the blobs, all of them [19:28:29] :> [19:28:35] then extract coordinates, globes, etc [19:28:41] which is a terrible bad idea if we iterate through all of them and have no logic to keep that cache small [19:28:41] and put them in my db [19:29:32] * aude gave up trying to parse json with java [19:29:43] when i found out doing it with php was good enough [19:30:00] aude: mh... did you also have these memory leaks? [19:30:19] I still think it's because we cache every entity and that ends up slowing down PHP a lot [19:30:28] don't think so, don't remember [19:30:34] hoo: seems possible [19:30:53] we don't even need wikibase to do this [19:31:02] just wikibase data model and load the blobs from the db [19:32:08] That sounds like a saner idea [19:32:31] oooh https://test.wikidata.org/wiki/Q232 is viewable [19:32:42] not pretty or fast, but not a blank page either [19:32:43] although that probably requires some ugly logic or whatever to get the data... [19:32:45] Tobi_WMDE: ^ [19:33:16] i used the dump to make a table with entity id + blob + rev id [19:33:31] 2 step process, but both were reasonably fast [19:33:39] aude: what happened? [19:33:56] Tobi_WMDE: we deployed the fix for the exception [19:34:19] oh, yes [19:34:35] there's no perf on terbium :( [19:35:00] is that where you are trying the script? [19:35:12] aude: I was running it there, yes [19:35:17] ok [19:35:29] why? I think that's the host I'm supposed to use for that [19:35:34] unless wikitech fooled me [19:35:37] i don't know which to use [19:35:42] just curious [19:36:07] Well, wikitech said that and it has a lot of spare resources :) [19:36:21] ok [19:36:35] https://bugzilla.wikimedia.org/show_bug.cgi?id=54369 [19:36:40] ariel agrees [19:37:19] hoo: re caching: you could be right that that is a problem. The patch refactoring the caching infrastructure is at https://gerrit.wikimedia.org/r/#/c/107391/ [19:37:37] it includes a way to get a non-caching entity lookup (if it's not in this patch, it's in a follow-up) [19:37:55] DanielK_WMDE__: would help [19:38:06] cache size should at very least be limited [19:38:07] hoo, aude: reviews of that patch would be very welcome, i'll try to follow up on this while i'm on "vacation" [19:38:53] hoo: limiting in terms of MB is very hard. 
In terms of entries it's easy; I never got around to implementing a LRU cache for MediaWiki though [19:40:10] aude, hoo: note how the run ariel tried only processed about 10% of all items. not sure why that is. might be that we need to chunk the query that provides the IDs. [19:40:36] reading the list of IDs from a file would be a way to work around this, for testing. there's already an option for that [19:40:44] DanielK_WMDE__: mysql might not have been able to get all results (client side) [19:41:02] hoo: shouldn't it fail if that happens? will it just silently ignore the rest? [19:41:28] also, we could turn pre-fetch off... but then we'd have to use a separate db connection [19:41:29] my bot (running on Labs) takes a lot of ram (~1GiB) only to dump ids belonging to enwiki... [19:41:33] all nasty. chunking is better [19:41:46] hoo: turn pre-fetching off :) [19:42:18] wait, pre-fetch isn't the correct term. grr [19:42:24] what was that called again?... [19:42:51] pre fetch is replication related [19:43:11] deployed! [19:43:21] yay :) [19:43:52] Reedy: https://gerrit.wikimedia.org/r/#/c/115393/ anytime [19:45:03] hoo: i'm talking about unbuffered queries. compare http://us1.php.net/manual/en/mysqlinfo.concepts.buffering.php [19:45:21] but there's a way to do it with the "old" mysql client lib too. [19:46:12] aude: Jenkins says no [19:46:17] gah [19:46:18] ok [19:46:18] actually, you can use bufferResults( false ) on MediaWiki's Database object [19:46:50] Or he's lying [19:46:57] i can rebase [19:47:13] mh [19:47:33] DanielK_WMDE__: Will comment our ideas in teh dumpJson bugs [19:47:48] my ideas are more drastic if nothing simpler works [19:48:41] Yes, but also more hacky [19:49:11] true [19:49:56] data transclusion works on wikisource [19:50:27] hoo: ready to send out messages except for the Lua documentation - i will just copy it now [19:50:38] you can later take care of the link [19:50:41] Reedy: last thing is https://gerrit.wikimedia.org/r/#/c/115366/ [19:50:51] not required for deployment but long overdue [19:52:24] that would also make the propagateChangesToRepo setting unused (at least for the moment) [19:52:28] Lydia_WMDE: Great, thanks :) [19:52:28] Does test2wiki have any needed extra tables? [19:52:36] Reedy: checking [19:52:41] hoo: still used on test.wikidata [19:52:44] err test.wikipedia [19:52:52] doh, right [19:53:08] we mgiht want to bump cache key for test2 [19:54:30] aude: Yeah [19:54:34] Reedy: one sec [19:54:35] Reedy: Tables should be there [19:54:59] not sure about the required sites [19:55:12] sites? [19:56:12] mysql:wikiadmin@db1038 [testwikidatawiki]> SELECT * FROM sites WHERE site_global_key = 'test2wiki'; [19:56:16] it's there [19:56:22] it still pretends to be enwiki [19:56:41] i guess we can change, but might be second step [19:56:56] uh, that's weird [19:57:20] i'm not sure about adding test2wiki to the site links [19:59:41] hoo: Reedy https://gerrit.wikimedia.org/r/#/c/115366/ [20:00:21] looking [20:00:34] i can't add test2 as a site link on test.wikidata [20:01:25] aude: Why do we have on cache prefix for all content wikis, but 2 for the test ones? 
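On the "chunk the query that provides the IDs" point above: the usual pattern is to page through the ID table by primary key so that neither PHP nor the MySQL client ever holds the full result set, with bufferResults( false ) as the alternative Daniel mentions. A sketch under assumed table and field names (wb_entity_per_page / epp_entity_id, the 2014-era schema); the real dumpJson script may differ:

```php
<?php
// Sketch of chunking the ID query for a dumpJson-style maintenance script.
// Table and field names are assumptions about the 2014 schema.

$dbr = wfGetDB( DB_SLAVE );
$batchSize = 1000;
$lastId = 0;

do {
    $res = $dbr->select(
        'wb_entity_per_page',
        array( 'epp_entity_id' ),
        array( 'epp_entity_type' => 'item', 'epp_entity_id > ' . (int)$lastId ),
        __METHOD__,
        array( 'ORDER BY' => 'epp_entity_id ASC', 'LIMIT' => $batchSize )
    );

    $count = 0;
    foreach ( $res as $row ) {
        $lastId = (int)$row->epp_entity_id;
        $count++;
        // ... load, serialize and write this one entity, then drop it so the
        //     working set never grows beyond one batch.
    }
} while ( $count === $batchSize );

// Alternative from the discussion: $dbr->bufferResults( false ) streams one
// big result set instead, but that connection then cannot run other queries
// until the result has been fully consumed.
```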
[20:01:31] * one [20:02:35] probably not necessary now [20:03:58] mh, I'd rather have it as near on production as possible [20:04:08] * to [20:04:11] https://gerrit.wikimedia.org/r/#/c/115366/ [20:05:04] Ok, that should be good :) [20:05:56] grrrrt seems missing [20:06:16] it went down earlier AFAIR [20:06:21] i'll try restarting [20:08:12] he's back [20:08:21] (03CR) 10Daniel Kinzler: "hm, too late, i guess :)" (033 comments) [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [20:08:49] DanielK_WMDE__: shoudl be done nice in master [20:09:13] more details to do nicer than that [20:10:52] The bot sends up missed events? Neat :) [20:14:39] [travis-ci] wikimedia/mediawiki-extensions-WikibaseQuery#105 (master - 8bc2015 : Translation updater bot): The build has errored. [20:14:39] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseQuery/compare/ee4a7b09a0a7...8bc20153e623 [20:14:39] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseQuery/builds/19604497 [20:15:03] (03CR) 10WikidataJenkins: "Build Successful" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/115505 (owner: 10L10n-bot) [20:15:56] (03CR) 10Aude: Handle InvalidArgumentException in ClaimHtmlGenerator (031 comment) [extensions/Wikibase] (mw1.23-wmf15) - 10https://gerrit.wikimedia.org/r/115397 (owner: 10Aude) [20:21:50] Reedy: hoo going home, back in 30 min or so [20:21:59] ok :) [20:22:03] https://gerrit.wikimedia.org/r/#/c/115366/ can wait or whatever [20:24:12] I'm totally ok with it [20:24:30] would approve it, but not sure Reedy is around to deploy [21:07:26] are item values not linked for anyone else on https://www.wikidata.org/wiki/Q72 ? [21:07:51] https://www.wikidata.org/wiki/Q159 sorry [21:13:15] hmmm after a purge it works [21:16:04] Lydia_WMDE: Is that a general issue? [21:16:38] hoo: bene also just noted it on the mailing list [21:16:44] hoo: Probably - just got reported on the mailinglist [21:16:48] i have seen it in at least 2 items [21:16:53] both times fixed by purging [21:17:02] ok [21:17:24] If I find one, I'll ahve a look [21:17:33] might be that we need to push the cache epoche again [21:17:38] possibly [21:17:57] hoo: https://www.wikidata.org/wiki/Q9876112 [21:18:01] not purged [21:18:13] back [21:18:25] ick [21:18:49] what the hell [21:18:58] oh, I'm not logged in [21:19:21] maybe these were formatted before (non-js) without links? [21:19:36] still in parser cache and now they are being scrapped and re-used in js [21:19:41] possible [21:20:09] aude: Should I bump the cache epoche? [21:20:11] again... [21:20:15] not l ike i get js errors [21:20:19] hoo: sure [21:20:29] i hope that's not a problem [21:20:51] we might ask ori or reedy [21:20:57] last two times it wans't [21:21:20] true [21:21:23] two? [21:21:51] i see [21:21:58] aude: Yep, I think we did it twice for some reason [21:22:05] I at least did it once and you did it once AFAIR [21:22:18] "Needed after Iee39ba8 was deployed" [21:23:45] oh, right [21:26:03] wait, wront timestamp [21:28:25] set it to 8pm today, that should be fine (as we deployed on 7:30p or so) [21:31:58] hmmm, scap scripts moved out of puppet [21:32:19] they're in another repo [21:32:39] mediawiki/tools/scap [21:32:48] yep [21:58:58] hi, I have a Lua question: with the new mw.wikidata.entity how do I get ranks and qualifiers? 
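For reference on the parser cache bump discussed above (around 21:20): invalidating the parser cache is done by moving the cache epoch forward, so any cached rendering older than that timestamp is treated as stale and re-rendered on the next view. In plain LocalSettings.php terms it is a single line — on the production cluster the equivalent change goes through wmf-config instead:

```php
<?php
// Any parser cache entry generated before this timestamp (TS_MW format, UTC)
// is considered stale. "Set it to 8pm today" from the discussion above:
$wgCacheEpoch = '20140225200000';
```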
[22:01:06] rotpunkt: That's not (yet) possible [22:02:03] mhm, then the old mw.wikibase.getEntity shouldn't be called Deprecated, imho [22:02:39] ranks and qualifiers are the only reason to access wikidata through lua [22:03:02] rotpunkt: The new one got all the information the old one had available [22:03:19] also ranks and qualifiers [22:03:21] ? [22:03:50] rotpunkt: I think so... [22:04:16] I don't see the object method... [22:04:18] sorry, I'm doing like a billion things (in parallel) atm, so might not be too responsive [22:04:32] rotpunkt: It's only live on wikivoyage and wikisource yet [22:05:23] sorry, I'll try to be more clear: the new API has mw.wikibase.entity:getProperties and mw.wikibase.entity:formatPropertyValues, so how do I read ranks and qualifiers? [22:06:48] You can only read that from the plain lua table, for now [22:07:02] ok, but it's called "Deprecated" [22:07:16] rotpunkt: Yes, you should use mw.wikibase.getEntityObject [22:07:34] which is almost exactly like mw.wikibase.getEntity, except that it returns non-legacy data [22:08:22] AFAIK mw.wikibase.getEntityObject doesn't contain legacy data, am I wrong? [22:08:41] rotpunkt: no, it doesn't; only mw.wikibase.getEntity does [22:08:52] I'm having trouble adding new statements for items; is the problem known? [22:09:05] ok, so for now mw.wikibase.getEntityObject is not a replacement for mw.wikibase.getEntity [22:09:13] rotpunkt: It is [22:09:44] (not on Wikipedia though, as it's not deployed there yet... wait till Thursday) [22:12:36] sorry hoo, but I really don't understand: if mw.wikibase.getEntityObject doesn't contain legacy data, how can I read ranks and qualifiers through it? That's the only thing I need to understand [22:14:44] rotpunkt: I don't know that exactly, you have to examine the raw lua data for that... that's not something we have really implemented yet [22:15:21] perfect, and how do I get the raw lua data? [22:15:39] rotpunkt: mw.wikibase.getEntityObject [22:15:45] it returns a lua table [22:15:56] ahhhh, mw.wikibase.getEntityObject also returns raw data.... [22:16:21] Lydia_WMDE: the formatting of links is fixed [22:16:26] aude: yay! [22:16:35] aude: cache version raised? [22:16:38] from the documentation it seems it returns only a lua object with methos [22:16:40] rotpunkt: Yes [22:16:41] *methods [22:16:44] no -.- [22:17:45] I can no longer save new statements or qualifiers by pressing the Enter key; is the problem known? [22:19:35] Lydia_WMDE: ^ [22:19:53] huh [22:20:21] odder: Can you link the item you were trying to change? [22:20:27] odder: works for me [22:21:01] https://www.wikidata.org/w/index.php?title=Q2439016&diff=112214804&oldid=112214764 [22:21:33] when I press enter, the save link turns grey for a split second as if it was about to save the edit, but then nothing happens [22:21:36] that was before we deployed the parser cache fix [22:21:38] it works if I actually click on it [22:22:24] Also, after I add new statements or qualifiers, they disappear for a split second, too [22:22:33] i can click enter and it saves [22:22:44] edited the coordinates [22:22:58] Works for me, too [22:23:25] https://www.wikidata.org/wiki/Q2439016 [22:23:28] could be we just fixed it [22:23:37] aude: try adding Q4520163 as an adjacent station [22:23:51] * aude trying [22:24:06] can do [22:24:14] has no en label [22:25:38] aude: https://www.wikidata.org/wiki/Q2439016 [22:25:57] trying adding qualifier = Q803003 [22:26:03] connecting line [22:27:28] that works? [22:27:47] aude: ?
[22:28:00] aude: I can't save the qualifier by pressing enter, only by clicking on the save button [22:28:04] it's already there [22:28:11] what browser? [22:28:44] aude: for the other station, I mean [22:33:41] ok [22:37:18] odder: might be some difference with browser, gadgets or other factor [22:37:59] maybe file a bug? others might have the same problem [22:41:06] @hoo, thanks a lot. I have done some tests on test2. So in the end getEntity and getEntityObject return the same table, but getEntityObject also has some additional methods (getLabel, getProperties, formatPropertyValues), am I correct? [22:42:04] rotpunkt: Mostly, yes... there are some minor differences which break backward compatibility (getEntity should contain some redundant legacy data) [22:42:44] perfect, I didn't understand this from the docs, I am so sorry. Thanks a lot again. [23:54:33] can someone explain to me how wikidata works? What I gather is that you have either user-defined data, or data added by bots, which is put on wikidata pages about certain things, and is then embedded on the corresponding page on the different wikimedia projects