[02:06:37] do we know how many assertions / triples are in Wikidata?
[02:10:51] a lot.
[02:10:56] there are statistics somewhere..
[02:11:29] https://www.wikidata.org/wiki/User:Byrial/Property_statistics
[02:11:31] no total row though
[02:12:33] and it is stale - August 2013 ;-)
[02:12:37] but thank you
[02:13:14] it only includes up to property 828, while we are up to P1050 or more
[02:15:36] hmm. P248 (stated in) and P143 (imported from) need to be removed - or classified as a different type of claim
[02:26:43] I have problems with opening an item
[02:26:48] https://www.wikidata.org/wiki/Q15124
[02:26:53] excluding only 'imported from', in August there were 19m triples (claims+qualifiers+sources) across 17m items.
[02:27:10] I want to add nds_nl Zuud-Tirool but I can't get it working
[02:27:15] anyone?
[02:36:29] Romaine: I received a warning
[02:36:50] a firefox script warning
[02:36:51] that the javascript is failing or something?
[02:36:53] yes
[02:37:29] and the link to add nds_nl is missing
[02:38:08] I would like to add the link to nds_nl but I can't somehow
[02:38:12] there was a problematic claim
[02:38:16] even while it is not in a list
[02:39:09] hmm. ignore that; I'm not sure what the problem is.
[02:40:17] definitely a bug somewhere in the JavaScript
[02:41:07] horrible
[02:45:27] oh yeah
[02:45:35] that is supposed to get fixed in Tuesday's deployment
[02:45:43] ok, thanks for the info
[07:31:23] Hi! I am looking for a way to query the list of enabled site prefixes. https://www.wikidata.org/w/api.php?action=sitematrix shows ALL Wikimedia sites, also those not enabled for Wikidata. AFAIK it's wikipedia, wikivoyage and commons, but how do I query only those, and how do I make the query also return other Wikimedia projects when these enable Wikidata?
[07:32:13] if you're looking for the easy way https://noc.wikimedia.org/conf/wikidataclient.dblist is what you want
[07:32:22] the *proper* way would look like...
[07:33:13] nichtich: https://www.wikidata.org/w/api.php?action=paraminfo&modules=wbgetentities&format=jsonfm
[07:33:33] under parameters, name: 'sites', look at the 'type' key
[07:33:39] a bit weird
[07:33:58] there probably should be a proper API module for this
[07:34:05] i'll file a bug
[07:34:10] yes, like many internals of Wikidata ;-)
[07:34:15] thanks!
[07:34:23] np
[07:36:00] https://bugzilla.wikimedia.org/show_bug.cgi?id=58200
[07:36:12] addshore: ^ :D
[07:43:21] nichtich: oh hi!
[07:43:30] Hi Daniel!
[07:44:04] I'm finally digging into making use of Wikidata: https://metacpan.org/pod/Catmandu::Wikidata
[07:44:46] (the "type" thing is actually an artifact of how the API verifies input: it's an enumeration type consisting of all possible values)
[07:46:01] nichtich: hm, we *really* should get the JSON dumps working soon. And RDF dumps next. So much to do...
[07:46:31] though JSON dumps should finally be around the corner.
[07:47:27] For me it would be early enough in March 2014 ;-) I'm organizing a student project to analyze Wikidata at the University of Hannover next semester.
[07:48:17] nice
[07:49:47] nichtich: are you focussing on the thesaurus aspect, the web-of-concepts aspect, or analyzing individual claims?
[07:52:09] I have <=10 bachelor students with limited or no programming skills, so I named my project "Normdaten und Wikidata" to not have too ambitious goals. The outcome should be a German manual on how to use Wikidata for authority control and maybe enrichment of Wikidata with authority data from other sources.
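(Picking up the sites question from earlier: the dblist linked above is the easy way; below is a minimal Python sketch of the "proper" paraminfo way, using the requests library. The exact layout of the paraminfo response is an assumption based on how that module generally behaves, so treat it as illustrative:)

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def wikidata_site_ids():
        """List the site IDs Wikidata accepts, read from the enumeration
        type of the wbgetentities 'sites' parameter."""
        r = requests.get(API, params={
            "action": "paraminfo",
            "modules": "wbgetentities",
            "format": "json",
        })
        for param in r.json()["paraminfo"]["modules"][0]["parameters"]:
            if param["name"] == "sites":
                return param["type"]  # the allowed values double as the site list
        return []

    print(wikidata_site_ids()[:5])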
[07:53:16] Wikidata lacks easy manuals and I doubt that such manuals will come from people that are already deeply involved in Wikidata.
[07:54:04] nichtich: sounds like you'll be using Wikidata as a (crowd-controlled) tagging vocabulary. should work quite well :)
[07:55:09] yea, you are right. Maybe talk to Ziko, he's trying to get funding for writing a decent manual.
[07:55:44] Yes, Wikidata as vocabulary. I first thought about importing bibliographic data into wikidata but this would be too difficult I guess.
[07:56:20] I think we don't need just one manual but multiple manuals for different use-cases and different languages
[07:56:28] indeed
[07:56:59] i'd love to see a nicely integrated importer for bibliographic data. but getting that "right" isn't easy
[07:57:48] for instance, i'd want it to work on different sources, based on an abstract description of each source (on datahub.io, perhaps?)
[07:58:58] Better to first start with entities for abstract works such as the Bible and Marx' Kapital. Links between such works (influenced-by, part-of, derived-from...) are interesting enough.
[08:02:25] interesting, yes, but not easy to analyze. I doubt you'll find enough "meat" to generate good data.
[08:02:38] btw... perl? really?...
[08:07:24] when did geo coordinates go live in wikidata?
[08:16:06] Does a claim *always* have *exactly* one mainsnak?
[08:17:51] nichtich: yes.
[08:18:40] nichtich: but that snak may not denote a specific value - it might be a "somevalue" or "novalue" snak.
[08:19:07] (more snak types, like range snaks, may be introduced in the future)
[08:19:25] There are no examples of "novalue" and "somevalue" at https://www.wikidata.org/wiki/Help:Wikidata_datamodel
[08:20:46] nichtich: better stick to the actual spec https://meta.wikimedia.org/wiki/Wikidata/Data_model or at least the primer https://meta.wikimedia.org/wiki/Wikidata/Notes/Data_model_primer
[08:21:07] Yeah, it's "in the Wiki" :P
[08:21:31] novalue/somevalue are rarely used at the moment
[08:21:48] and the page you linked to refers specifically to the Lua binding, it seems
[08:22:03] but there should be a link to the spec there
[08:22:21] oh, actually, there is :)
[08:36:52] Wikidata really plans ahead! Just found this in a JSON export: "time" : "+00000002013-10-28T00:00:00Z"
[08:40:17] nichtich: all the zeros are mostly useful for negative numbers - giving the time of the earth's formation or something
[08:40:57] https://www.wikidata.org/wiki/Q622063
[08:40:58] ("4pm, March 22, about 4 billion years ago")
[08:50:38] nichtich: https://www.wikidata.org/wiki/Q1
[08:50:45] "13800 million years ago"
[08:51:16] wow, that even has a source :)
[08:51:40] btw. what are valid datatypes for sources?
[08:51:55] do I have to create a wd item for a book if I want to use it as a source?
[08:52:10] lbenedix: no. a source is a list of snaks.
[08:52:31] any kind of snak, with any kind of value
[08:52:34] sorry, but I have no idea what a snak is...
[08:52:37] @Danielk_WMDE: good to have a precision on such time values
[08:53:06] lbenedix: in the case of a value snak, it's the association of a value with a property
[08:53:20] nichtich: precision is important!
[08:55:00] is it just me, or is wikidata very slow right now?
[08:58:35] page load time should improve with the current round of deployments
[08:58:47] though we still need to improve JS execution time on page load
[09:01:34] looks like datavalues cannot be properties, wikibase-entityid always refers to some Qxxx, never to some Pxxx, right?
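(To make the snak discussion above concrete, here are the three snak types written out as Python dicts in the shape the JSON export uses; the property and item numbers are illustrative, not canonical output:)

    # a "value" snak: associates a property (say P26, spouse) with an item value;
    # note the wikibase-entityid datavalue pointing at a Qxxx item
    value_snak = {
        "snaktype": "value",
        "property": "P26",
        "datavalue": {
            "type": "wikibase-entityid",
            "value": {"entity-type": "item", "numeric-id": 1001},
        },
    }

    # a "somevalue" snak: there is a spouse, but we don't know who
    somevalue_snak = {"snaktype": "somevalue", "property": "P26"}

    # a "novalue" snak: explicitly asserts there is no spouse
    novalue_snak = {"snaktype": "novalue", "property": "P26"}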
[09:03:54] took about 1min to load Q42: http://lbenedix.monoceres.uberspace.de/screenshots/e8yuj2xotq_2013-12-09_10.03.11.png
[09:04:40] lbenedix: try to load with JS disabled. also, compare FF vs. Chrome.
[09:07:05] nichtich: yes, though it is not guaranteed to stay that way. we are for instance considering support for claims on properties, which may well reference other properties
[09:07:27] these would use the same "value type", but a different "data type".
[09:07:36] Yes, you could add another datavalue type for this.
[09:07:52] (the "data type" is currently only visible in the definition of the property, not when looking at the use of a property. this will change soon)
[09:08:28] +1 for claims on properties
[09:08:43] nichtich: same (data) value type, different (property) data type. confusing distinction, i know. we should not be exposing the value type at all. instead, we are currently hiding the more relevant data type
[09:08:44] The "data type" is visible in the JSON as I get from the API - it's in field "type"
[09:08:57] nichtich: no, that's the value type, not the data type :)
[09:09:01] as i said - confusing.
[09:09:05] does this mean you could add restrictions like 'mother' has to be 'female'?
[09:09:21] nichtich: the three data types "string", "url" and "commonsMedia" all use the same value type: string.
[09:09:45] lbenedix: you could declare them. that doesn't mean the software would understand or enforce them
[09:09:56] but bots or gadgets could
[09:10:00] but having them is a good start
[09:10:04] @DanielK_WMDE: sorry, I don't understand unless I stumble upon an example that makes use of both datatype and valuetype with distinct values for both.
[09:10:06] currently, this is done with templates on talk pages - that sucks
[09:10:17] right now bots are cleaning wikidata and no one knows what they are doing
[09:10:32] s/cleaning/modifying/
[09:10:43] yes
[09:10:44] nichtich: look at any url value. it uses the data type "url" (but doesn't say so, unless you look at the definition of the respective property)
[09:11:34] nichtich: see https://bugzilla.wikimedia.org/show_bug.cgi?id=54320 and related bugs
[09:13:06] nichtich: essentially, a value type tells you how the information can be stored, compared, sorted, etc. the data type tells you how it can be used/interpreted, and what constraints apply.
[09:13:50] does the data type at least imply the value type?
[09:15:13] e.g. data type URL => value type String without any exception
[09:16:30] nichtich: generally, yes. though I could imagine a data type migrating to use a different value type (e.g. the iri type for url properties), while keeping support for other representations for backwards compatibility
[09:16:42] so, i would not rely on the "implied" value type when reading data
[09:17:15] nichtich: i'm not very happy about the split, and even less so about the confusing terminology :/
[09:17:30] i hope things will become less confusing by always including the data type in the json
[09:18:27] I don't mind the terminology at the moment; I am more interested in which parts are redundant or implied by other parts. The JSON output contains several redundant parts
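(Since, as said above, the data type currently only shows up in the property's definition, a client has to fetch it separately; a small sketch with requests, where P856 "official website" is used as an illustrative property and the props=datatype switch of wbgetentities is assumed to behave as documented:)

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def property_datatype(pid):
        """Fetch a property's data type from its definition; claims using
        the property only expose the underlying value type (e.g. "string")."""
        r = requests.get(API, params={
            "action": "wbgetentities",
            "ids": pid,
            "props": "datatype",
            "format": "json",
        })
        return r.json()["entities"][pid]["datatype"]

    print(property_datatype("P856"))  # expected: "url", stored as value type string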
[09:34:28] (CR) Daniel Kinzler: [C: -1] Remove unused params in EntityView::getHtmlForEditSection (3 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99545 (owner: Aude)
[09:35:48] (CR) Daniel Kinzler: [C: 2] Improve function documentation in LanguageFallbackChain (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99556 (owner: Aude)
[09:38:53] Tobi_WMDE: are we pairing today?
[09:40:20] (CR) Daniel Kinzler: [C: 2] Use Title::getPrefixedText in UpdateRepoOnMoveJob [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100217 (owner: Hoo man)
[09:43:49] (Merged) jenkins-bot: Use Title::getPrefixedText in UpdateRepoOnMoveJob [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100217 (owner: Hoo man)
[09:50:46] (PS3) Daniel Kinzler: (bug 45529) use composite indexes on wb_terms. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99660
[09:50:55] (CR) jenkins-bot: [V: -1] (bug 45529) use composite indexes on wb_terms. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99660 (owner: Daniel Kinzler)
[09:54:52] (PS2) Daniel Kinzler: (bug 47135) Make row IDs use BIGINT. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99637
[09:57:28] (PS3) Daniel Kinzler: (bug 47135) Make row IDs use BIGINT. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99637
[09:57:29] damn, gerrit is confusing me again
[09:57:29] (PS4) Daniel Kinzler: (bug 45529) use composite indexes on wb_terms. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99660
[10:54:05] does the formatting on https://www.wikidata.org/wiki/Wikidata:Books_task_force look funky to anyone else?
[10:56:02] edsu: Yep, looks pretty messed up
[11:03:08] (PS2) Tobias Gritschacher: Adding readme file to /selenium_cuc/ [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98784 (owner: Mayankmadan)
[11:05:28] (CR) Zfilipin: [C: 2] Adding readme file to /selenium_cuc/ [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98784 (owner: Mayankmadan)
[11:09:03] (Merged) jenkins-bot: Adding readme file to /selenium_cuc/ [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98784 (owner: Mayankmadan)
[11:18:16] (PS2) Zfilipin: Moved Selenium tests to /tests/browser folder [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98823
[11:19:43] is there an api for wikidata separate from the mediawiki api?
[11:20:08] (PS3) Zfilipin: Moved Selenium tests to /tests/browser folder [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98823
[11:20:24] edsu: No, we use the same entry point (api.php)
[11:20:30] DanielK_WMDE: got a few minutes?
[11:20:53] hoo: thanks
[11:21:38] (CR) Tobias Gritschacher: [C: 2] Moved Selenium tests to /tests/browser folder [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98823 (owner: Zfilipin)
[11:25:07] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1370 (master - 5b0e0aa : jenkins-bot): The build was broken.
[11:25:07] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/e7c995db312b...5b0e0aade80d
[11:25:07] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15162215
[11:25:25] Lydia_WMDE: Is Jens here? ;)
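(Back to the entry-point answer above: the Wikibase modules sit behind the ordinary api.php, so a lookup like the authority-control use case of the "Normdaten und Wikidata" project mentioned earlier needs nothing special. A sketch where P214/VIAF and P227/GND are the assumed identifier properties and Q5879 is just an illustrative item:)

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def authority_ids(qid, pids=("P214", "P227")):  # VIAF and GND, assumed IDs
        """Collect external identifier strings from an item's claims."""
        r = requests.get(API, params={
            "action": "wbgetentities",
            "ids": qid,
            "props": "claims",
            "format": "json",
        })
        claims = r.json()["entities"][qid].get("claims", {})
        found = {}
        for pid in pids:
            for claim in claims.get(pid, []):
                snak = claim["mainsnak"]
                if snak["snaktype"] == "value":  # skip somevalue/novalue snaks
                    found.setdefault(pid, []).append(snak["datavalue"]["value"])
        return found

    print(authority_ids("Q5879"))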
[11:25:43] hoo: just came in
[11:26:39] Ok, then let's rather do that this afternoon, I have to leave fairly soon anyway
[11:27:30] (Merged) jenkins-bot: Moved Selenium tests to /tests/browser folder [extensions/Wikibase] - https://gerrit.wikimedia.org/r/98823 (owner: Zfilipin)
[11:29:36] (CR) Aude: "see comments. if we want to improve the way messages are handled more broadly in entityview, then best done as a separate change." (3 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99545 (owner: Aude)
[11:31:31] Lydia_WMDE: i am sick and don't want to spread germs to folks in the office
[11:31:44] i might do some stuff at home or mostly rest today :)
[11:32:29] (Draft1) Aude: Fix invalid covers tags and cleanup [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100175
[11:40:59] aude: alright. get well soon :)
[11:42:42] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1371 (master - 4be67e9 : Zeljko Filipin): The build is still failing.
[11:42:43] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/5b0e0aade80d...4be67e99c516
[11:42:43] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15163021
[11:44:11] (PS2) Aude: Cleanup in action tests [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100178
[11:46:59] (CR) jenkins-bot: [V: -1] Cleanup in action tests [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100178 (owner: Aude)
[11:47:23] (Draft4) Aude: Cleanup WikibaseRepo tests and add covers tags [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100186
[11:49:38] Lydia_WMDE: http://cre.fm/cre205-wikidata
[11:50:49] johl: \o/
[11:50:51] (PS3) Aude: Cleanup in action tests [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100178
[11:51:19] (Draft2) Aude: Ensure TermPropertyLabelResolve memcached is per language [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100187
[11:51:39] * aude flooding the channel
[11:51:53] (Draft2) Aude: Require cache key and duration constructor params [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100188
[11:52:43] (Draft2) Aude: Improve TermPropertyLabelResolver and test, including more cache scenarios [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100192
[11:52:47] (Draft2) Aude: Cleanup code in TermPropertyLabelResolver, split up method [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100191
[11:52:52] (Draft2) Aude: Remove subclassing in TermPropertyLabelResolverTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100190
[11:53:02] (Draft2) Aude: Move MockTermIndex to separate file [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100189
[11:54:26] (CR) Aude: "please start review at bottom of the chain :)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100189 (owner: Aude)
[12:08:29] aude: Lydia_WMDE what's with the API issues with XML on test?
[12:08:36] someone looked into that?
[12:09:03] addshore?
[12:09:23] Tobi_WMDE: no-one yet
[12:09:26] let's ask him in the daily
[12:12:42] (PS1) Daniel Kinzler: Refactor SqlStore::doSchemaUpdates. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100361
[12:14:04] (CR) jenkins-bot: [V: -1] Refactor SqlStore::doSchemaUpdates. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100361 (owner: Daniel Kinzler)
[12:34:54] hoo|away: hey. back now.
[12:42:10] (CR) Daniel Kinzler: "@springle: that would be great, yes!" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99660 (owner: Daniel Kinzler)
[13:03:09] (PS1) Addshore: Revert "Use correct result path in getentities module." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100364
[13:03:18] (CR) jenkins-bot: [V: -1] Revert "Use correct result path in getentities module." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100364 (owner: Addshore)
[13:10:41] so yes, this was fixed and merged and then another patch later broke it :D
[13:10:43] again :d
[13:16:22] (PS2) Daniel Kinzler: Refactor SqlStore::doSchemaUpdates. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100361
[13:26:58] (CR) Daniel Kinzler: [C: -1] Refactor SqlStore::doSchemaUpdates. (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100361 (owner: Daniel Kinzler)
[13:30:50] (CR) Jeroen De Dauw: "huh what?" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100364 (owner: Addshore)
[13:31:18] addshore: jenkins says no it seems ;-)
[13:34:10] (Abandoned) Addshore: Revert "Use correct result path in getentities module." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100364 (owner: Addshore)
[13:34:22] Lydia_WMDE: indeed :P
[13:34:41] it's the same issue as before, it was fixed and then rebroken in https://gerrit.wikimedia.org/r/#/c/95374/15
[13:37:18] HAH
[13:37:20] bingo
[13:38:26] (PS1) Addshore: Fix index tag mode in XML getentities output [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100367
[13:38:38] Tobi_WMDE: Lydia_WMDE ^^
[13:38:54] addshore: thx!
[13:38:55] (CR) Jeroen De Dauw: [C: 1] (bug 47135) Make row IDs use BIGINT. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99637 (owner: Daniel Kinzler)
[13:39:21] will try and write some sort of test for it
[13:39:42] cool
[13:40:26] addshore: that would be great
[13:40:55] I can at least remember three times now where this broke.. (during the project)
[13:46:02] DanielK_WMDE: addshore you're going to join the daily via hangout in 15 mins?
[13:46:08] yup
[13:46:31] ok
[13:47:25] heh, there is no way to force output type in doApiRequest in the api testcase... :<
[13:53:35] (CR) Daniel Kinzler: (bug 45529) use composite indexes on wb_terms. (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/99660 (owner: Daniel Kinzler)
[13:56:41] addshore: the API test case gets the result back as an array, no serialization is performed, so no output format applies...
[13:56:50] :<
[13:56:56] addshore: but doesn't format=xml work anyway?
[13:57:13] well, yes, but that works even without the patch to fix the bug
[13:57:45] the only way to really test it that I can see is to actually make a webrequest
[14:03:36] (PS1) Addshore: Fix index tag mode in XML getentities output [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100370
[14:21:56] aude: around ?
[14:22:06] hi
[14:22:11] aude: Good day.
[14:22:19] aude: I noticed https://gerrit.wikimedia.org/r/#/c/99681/ which sets $wgWBClientSettings['hideParserFunctionErrors'] = true;
[14:22:27] aude: should I deploy it right now?
[14:22:28] don't need that right now
[14:22:34] * aude should -1 or abandon
[14:22:42] go ahead and abandon so :-]
[14:22:46] ok
[14:22:56] if you have such changes again feel free to ping me
[14:23:04] will be happy to be the marionette pushing the buttons
[14:23:13] marionette / puppet
[14:23:31] i do wonder, though, how what is on https://www.wikidata.org/wiki/Special:Version differs from the latest in the wmf5 branch
[14:23:48] * aude wonders about what is deployed
[14:24:12] we have live hacks
[14:24:19] oh, ok
[14:24:23] either security or debug / performance monitoring
[14:24:30] * aude aware
[14:24:57] hmm
[14:25:02] in this case it is a bit scarier :-]
[14:25:25] * aude getting shell access so i'll be able to know what's deployed :)
[14:25:30] without poking people
[14:25:56] so that sha1 version is extracted from the repository HEAD
[14:26:06] which is referenced somewhere under the .git directory
[14:26:06] yeah
[14:26:15] sometimes we do pull the wmf branch
[14:26:22] but only sync-dir the extension being updated
[14:26:29] (aka the .git repo is not synced)
[14:26:35] so the commit shown might be off :D
[14:26:36] ok
[14:26:53] that is the case currently. There are 5 more commits, each of them being extension updates
[14:27:04] I would sync the .git dir but I am not sure whether it is a good idea :-]
[14:27:12] yeah
[14:30:18] i did
[14:30:29] 1.23wmf5 (cb0ccd1)
[14:30:30] :D
[14:30:55] k
[14:37:05] Lydia_WMDE: https://bugzilla.wikimedia.org/show_bug.cgi?id=52801
[14:37:26] it's two months old, you think this is out of date? probably DanielK_WMDE knows more
[14:42:52] Tobi_WMDE: i'm really not sure, sorry
[14:43:40] Lydia_WMDE: at least it says in the serializers "Update json.wiki when you make changes here!!!" :)
[14:44:15] Tobi_WMDE see mediawiki\extensions\Wikibase\docs\json.wiki
[14:44:43] addshore: yes, that's what we're talking about. Lydia_WMDE thinks it is out of date.
[14:44:49] I'm not sure
[14:44:58] it wasn't out of date when it was merged
[14:45:04] hehe
[14:45:06] :)
[14:45:08] it might well be up-to-date
[14:45:10] I went through everything :P
[14:45:10] i hope so
[14:45:11] i just don't know ;-)
[14:45:26] I think some of the ordering stuff is not in there
[14:45:34] however, I've already forwarded it to johannes_wmde
[14:46:09] Tobi_WMDE: not sure if I will be able to speedily make these xml tests
[14:46:16] Tobi_WMDE: vandrew is asking about a code-in task in #wikimedia-dev
[14:46:36] currently digging through how the api output formatters actually work
[14:47:10] addshore: ok
[14:47:53] zeljkof: what did he ask? I just joined, so have no history..
[14:48:17] Tobi_WMDE: he is just asking general stuff, what to do (for now)
[14:48:34] ok
[15:03:15] addshore: Can you do me a favour and install the new version of the AuthorityControl gadget?
[15:03:34] Henning_WMDE: I can also do that
[15:03:48] Ah, okay, whoever wants to. :)
[15:03:58] https://www.wikidata.org/wiki/User:Henning_(WMDE)/AuthorityControl.js
[15:04:14] Should solve some problems (well, all, hopefully)...
[15:04:56] Henning_WMDE: I suppose you've tested it and/or it got code review?
[15:06:59] Well, I tested it manually. There are no unit tests for that, I guess. But you may want to do some code review.
[15:09:31] Henning_WMDE: meh :P Gadgets suck... this should really be in some kind of Wikidata-specific extension
[15:11:21] I agree, that would be nice. Such a big impact for so few lines. ;)
[15:20:16] * hoo got the deployment schedule wrong again
[15:20:45] https://bugzilla.wikimedia.org/show_bug.cgi?id=53220#c5
[15:22:29] johl: hey
[15:22:54] hoo: hey
[15:23:25] johl: I've experimented some more and am facing new problems :P
[15:23:37] hoo: tell me!
[15:24:06] I'm now using ScribuntoHooks::invokeHook in my new ParserFunction, and it works, too... sort of
[15:24:26] The problem is: that thing can of course only invoke local modules, not our mw.capiunto -.-
[15:24:31] hoo: what does it do? what problems does it solve?
[15:25:15] johl: That's Scribunto's ParserFunction... I wanted to use it to wrap {{#infobox:...}} onto {{#invoke:mw.capiunto:...}} as simply as possible
[15:25:28] (yes, I know, mw.capiunto doesn't work like that... that's exactly the problem)
[15:25:29] ah
[15:26:14] hoo: so a local module still has to be created.
[15:26:35] If we don't want to hack deeper into Scribunto...
[15:26:52] It can certainly be done, the question is how deep I'd have to dig into Scribunto for it
[15:27:09] hoo: just as a silly idea, could we automatically create a module during installation?
[15:27:10] $title = Title::makeTitleSafe( NS_MODULE, $moduleName );
[15:27:10] if ( !$title )
[15:27:16] that's presumably the main problem
[15:27:47] Difficult, because there isn't really an installer for extensions (at least not in the classical sense)
[15:27:53] only for DB tables
[15:32:30] johl: Any idea? We'd probably have to get at the code that kicks in internally when we do 'return mw.capiunto', or somehow sneak that one-liner past it
[15:33:00] hoo: I'll have to think about it :)
[15:33:04] ok ;)
[15:33:46] hoo: you talked to anomie and he had that great idea with the hook
[15:33:57] hoo: did he have an idea for this problem?
[15:34:05] Exactly, but he apparently didn't think far enough ahead that we don't want to go through a module
[15:34:17] I haven't asked him about this problem yet
[15:34:24] hoo: I'd talk to him anyway before digging through Scribunto's guts
[15:34:33] But I think I'd just get to hear again how weird our approach is :P
[15:34:41] so what
[15:34:49] our approach simply is weird
[15:35:25] johl: He's online in IRC right now, I'll ask him..
[15:35:34] hoo: coolio.
[15:41:20] I'll grab something to eat, let's see when he answers
[15:57:26] Abraham_WMDE: Lydia_WMDE: Tobi_WMDE: johl: omnomnom?
[15:57:37] JeroenDeDauw: yea
[15:58:33] JeroenDeDauw: did you know that "nom de guerre" isn't French for "food fight"?
[15:59:20] JeroenDeDauw: where are you going to order?
[15:59:52] johl: http://www.lieferando.de/lieferservice-world-of-pizza-berlin#!cart
[16:00:30] JeroenDeDauw: i can pay you in BTC, if that's okay.
[16:00:58] johl: you want to get rid of your BTC now? ;p
[16:01:15] JeroenDeDauw: no. but i'm short on cash
[16:03:06] johl: sure. You should still have my btc address
[16:03:27] johl: so what do you want?
[16:05:29] JeroenDeDauw: Pizza Gyros, 26cm
[16:05:40] evil!
[16:06:50] johl: pizza sauce or tzatziki?
[16:07:03] JeroenDeDauw: pizza sauce
[16:09:28] JeroenDeDauw: BTC sent
[16:12:38] johl: "Get $interpreter, then $result = $interpreter->callFunction( $interpreter->loadString( "Lua code", "label" ) ). See some of the unit tests where they do the same thing. Still, hacky."
[16:12:52] \o/
[16:13:09] I'll test that later
[16:13:21] hoo: sounds great.
but first I'll continue with / finish an accessibility patch for core
[20:05:13] (PS1) Hoo man: Focus the site field when opening the linkItem dialog [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100425
[21:42:42] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1372 (master - 42706df : Translation updater bot): The build is still failing.
[21:42:42] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/4be67e99c516...42706dfa8e27
[21:42:42] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15191836
[22:22:17] Hi addshore
[22:28:25] (CR) Aude: [C: 2] "verified and looks good" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100425 (owner: Hoo man)
[22:29:48] thanks, aude :)
[22:30:26] (CR) Aude: [C: 2] Fix index tag mode in XML getentities output [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100367 (owner: Addshore)
[22:30:46] (CR) Aude: [C: 2] Fix index tag mode in XML getentities output [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100370 (owner: Addshore)
[22:31:04] sure
[22:31:33] aude: I'm confused again about our schedule... when is the next code freeze?
[22:32:13] I would like to see "Use Title::getPrefixedText in UpdateRepoOnMoveJob" live to gather statistics
[22:32:51] (PS1) Aude: Ensure TermPropertyLabelResolve memcached is per language [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100487
[22:32:57] (Merged) jenkins-bot: Focus the site field when opening the linkItem dialog [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100425 (owner: Hoo man)
[22:33:34] you mean cherry-picked?
[22:33:39] i think we can do that
[22:33:45] That's fine also
[22:34:00] we deploy tomorrow and then not again until after holidays
[22:34:08] I just wondered about the next code freeze because nothing's on our google calendar yet, and I would like to see that change soon ;)
[22:34:12] oO
[22:34:15] january
[22:34:18] Ok, tomorrow please
[22:34:26] since this is a real bug
[22:34:35] it shouldn't wait, imho
[22:34:45] I need statistics soon, cause I'm writing about the development of that feature (and how much MediaWiki sucks)...
[22:34:50] sure
[22:35:02] :)
[22:35:17] (PS1) Aude: Use Title::getPrefixedText in UpdateRepoOnMoveJob [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100488
[22:35:48] (CR) Aude: [C: 2] "this shouldn't wait until after holidays" [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100488 (owner: Aude)
[22:36:16] (Merged) jenkins-bot: Fix index tag mode in XML getentities output [extensions/Wikibase] - https://gerrit.wikimedia.org/r/100367 (owner: Addshore)
[22:36:48] :) I recently gathered some statistics and I was like "we only have 150-400 page moves a day across all wikis... impossible" :P
[22:37:10] huh
[22:38:06] Then I figured out that this number is nowhere near the actual number... long story short, I did a bit of digging around until I figured out that the feature was broken :P
[22:38:24] That happens if you only ever test with space-less titles like "Berlin" :P
[22:38:26] should be better tomorrow
[22:38:41] looking forward... you're a big help :)
[22:38:49] :)
[22:39:22] heh
[22:39:59] (Merged) jenkins-bot: Fix index tag mode in XML getentities output [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100370 (owner: Addshore)
[22:40:11] [=
[22:46:15] hey
[22:46:20] what's up with https://bugzilla.wikimedia.org/show_bug.cgi?id=37601 ?
[22:46:36] saper is complaining :p
[22:46:49] (Merged) jenkins-bot: Use Title::getPrefixedText in UpdateRepoOnMoveJob [extensions/Wikibase] (mw1.23-wmf6) - https://gerrit.wikimedia.org/r/100488 (owner: Aude)
[22:48:45] ORM stuff doesn't work great with postgres
[22:48:55] there have been some issues
[22:50:13] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1373 (master - 00b6221 : jenkins-bot): The build is still failing.
[22:50:13] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/42706dfa8e27...00b62214af85
[22:50:13] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15194782
[22:57:02] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1374 (master - 466214a : jenkins-bot): The build is still failing.
[22:57:02] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/00b62214af85...466214ad4263
[22:57:03] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15194948
[23:05:13] I am not blogging about it but I am playing with the latest by Magnus :)
[23:10:45] * hoo waves at DanielK_WMDE
[23:11:28] DanielK_WMDE: Not nice... but it works (like a charm): https://gerrit.wikimedia.org/r/100411
[23:13:52] 'night
[23:17:40] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1376 (mw1.23-wmf6 - 00cd7f2 : jenkins-bot): The build was broken.
[23:17:40] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/b18e86b553d8...00cd7f2db2e4
[23:17:40] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15195461
[23:19:44] unit tests are so fast on travis, but so slow locally (in a VM) :'-(
[23:21:41] need a new computer?
[23:22:07] nooo, I love this machine :P
[23:22:22] Only the per-core performance sucks (badly)
[23:22:27] wasn't that fast
[23:23:06] aude: Over here they take up to 20 minutes
[23:23:12] eek
[23:23:18] although it does multiple jobs
[23:23:29] give your vm more memory?
[23:23:48] legoktm: yes, that's ridiculous ... although it's in a ram disk /dev/shm inside the VM
[23:23:56] and of course the VM is hardware accelerated
[23:43:25] How do I migrate interwiki links to Wikidata from sh wiki via python?
[23:48:20] "Time: 13.53 minutes, Memory: 283.25Mb"
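(For the interwiki-migration question above, a minimal pywikibot sketch; the method names reflect pywikibot-core as generally documented and may differ by version, and "Berlin" is just an illustrative page title:)

    import pywikibot

    site = pywikibot.Site("sh", "wikipedia")
    page = pywikibot.Page(site, "Berlin")  # illustrative page title

    # the item that holds (or should hold) this page's sitelink
    item = pywikibot.ItemPage.fromPage(page)

    # copy the page's interwiki (language) links onto the item as sitelinks
    for link in page.langlinks():
        target = pywikibot.Page(link)
        sitelink = {"site": target.site.dbName(), "title": target.title()}
        try:
            item.setSitelink(sitelink, summary="Migrating interwiki links from shwiki")
        except pywikibot.Error as err:  # e.g. the sitelink is taken by another item
            pywikibot.output("Skipped %s: %s" % (target.title(), err))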