[00:33:51] (03CR) 10Hoo man: [C: 04-1] "I strongly dislike moving ALL the code into a single messy file... let's not go back to 2007 MediaWiki-style please." (031 comment) [extensions/Capiunto] - 10https://gerrit.wikimedia.org/r/164560 (owner: 10Jackmcbarn) [00:35:14] jackmcbarn: ^ :S [00:36:02] See also https://gerrit.wikimedia.org/r/163063 [01:01:35] (03CR) 10Hoo man: [C: 04-1] "I don't see Title::quickUserCan being used here... also this makes the previous change quite pointless, right?" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/168833 (owner: 10Aude) [01:24:26] 3MediaWiki extensions / 3WikidataRepo: json dumps have duplicate items (one for the redirect, one for the target) - 10https://bugzilla.wikimedia.org/72678#c1 (10Marius Hoch) I wonder whether we want to include information about redirects in there or simply leave redirects out? Just leaving them out will be e... [01:29:56] 3MediaWiki extensions / 3WikidataRepo: Implement a label lookup based on entity info - 10https://bugzilla.wikimedia.org/72307#c1 (10Marius Hoch) entity info? Don't you mean the terms table? In that case this was fixed with https://gerrit.wikimedia.org/r/169330 [01:35:33] 7me f [01:35:36] * hoo facepalms [01:36:40] Tried sending a mail to wikidata-tech using my wikimedia.de mail and it got put on hold :P [01:37:22] heh [01:39:03] Evolution chooses the address to send from from the account you are looking at... lesson learned :D [01:41:58] 3MediaWiki extensions / 3WikidataRepo: Lua error: Failed to serialize data. - 10https://bugzilla.wikimedia.org/71918#c9 (10Marius Hoch) Still occurring after https://github.com/wmde/WikibaseDataModel/commit/05bd532e3ce6f5b9e31259b70acbca27f778e40c has been deployed? [01:48:53] * hoo rages [01:51:58] JeroenDeDauw: Around??? 
[01:53:23] * hoo will just keep using deprecated stuffs then, if there's no way forward [01:57:33] hoo: it's like that in Module:Infobox, and i don't really see a logical reason they should be separate [01:57:54] jackmcbarn: Because the one is about data logic, the other one about rendering [01:58:02] don't make a mess and put that all back together [02:03:46] hoo: i guess [02:04:02] but we should still conditionally load the main one rather than the _render one [02:04:17] both [02:05:09] hoo: yes, technically both, but the first one loading should then pull in the second one without further condition [02:05:12] right? [02:05:29] well, probably only if needed, but that's not that important [02:19:27] (03PS1) 10Hoo man: Make mw.wikibase.entity.formatPropertyValues work for non-items [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170504 (https://bugzilla.wikimedia.org/72124) [02:21:14] (03PS2) 10Hoo man: Make mw.wikibase.entity.formatPropertyValues work for non-items [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170504 (https://bugzilla.wikimedia.org/72124) [02:29:26] 3MediaWiki extensions / 3WikidataClient: Argument 1 passed to WikibaseLuaBindings::renumber() must be an array, integer given - 10https://bugzilla.wikimedia.org/72180#c1 (10Marius Hoch) 5NEW>3RESO/FIX https://github.com/wmde/WikibaseDataModel/commit/05bd532e3ce6f5b9e31259b70acbca27f778e40c should have fi...
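The conditional-loading scheme hoo and jackmcbarn settle on above (keep data logic and rendering in separate modules; load the main module conditionally and let it pull in the render module only on first use) can be sketched in JavaScript as a stand-in for Scribunto's require mechanism; the registry and module names here are hypothetical:

```javascript
// Hypothetical module registry: each entry is a factory, instantiated at most once.
const registry = {
  // Rendering logic stays in its own module...
  render: () => ({ wrap: (value) => `<td>${value}</td>` }),
  // ...and the data-logic module pulls it in lazily, on first use.
  main: () => ({
    format(value) {
      return load('render').wrap(value);
    },
  }),
};

const cache = {};
function load(name) {
  if (!(name in cache)) cache[name] = registry[name]();
  return cache[name];
}
```

Only `main` is loaded up front; `render` is instantiated the first time `format` runs, which preserves the data/rendering split without paying the second module's load cost unconditionally.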
[02:54:24] (03PS1) 10Hoo man: Show an error if ViewEntityAction is used with non-Entity content [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170506 (https://bugzilla.wikimedia.org/71546) [02:55:40] (03PS2) 10Hoo man: Show an error if ViewEntityAction is used with non-Entity content [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170506 (https://bugzilla.wikimedia.org/71546) [02:59:29] * hoo calls it a day [07:10:12] 3MediaWiki extensions / 3WikidataRepo: Create a Wikibase glossary - 10https://bugzilla.wikimedia.org/70763#c6 (10jeblad) There was a glossary at meta, and later on when the glossary at Wikidata was made I described some of the problems that would arise when when tha community built one glossary on their unde... [10:58:12] Does anyone know why the wikipedia link tool doesn't work? https://www.wikidata.org/wiki/Wikidata:Project_chat#Birthday_gift:_Missing_Wikipedia_language_links [10:58:30] Added a bunch of links but they do not show up [10:58:47] Does the tool cache the links, or? [12:35:19] Hi [12:35:36] (03CR) 10Ebrahim: "Okay this looks good to me. I just didn't know the only ".wikibase-sitelinkview-page" use is just with creating langlinks table which if i" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/167409 (owner: 10Ebrahim) [12:51:45] Argument 1 passed to Wikibase\Repo\Specials\SpecialEntitiesWithoutPageFactory::__construct() must be an instance of Wikibase\EntityPerPage, instance of Wikibase\Repo\Store\SQL\EntityPerPageTable given [12:51:55] Anyone encountered that before? [12:52:17] extensions/Wikibase/repo/includes/specials/SpecialEntitiesWithoutPageFactory.php on line 47 [12:53:32] extensions/Wikibase/repo/includes/specials/SpecialEntitiesWithoutPageFactory.php on line 26 [12:54:43] Apparently $wikibaseRepo->getStore()->newEntityPerPage() is returning the wrong type? [12:58:55] hmm, quite literally [13:38:25] [13WikibaseDataModel] 15thiemowmde comment on commit 14dd79ac5: Parent?
I don't see a parent in `class TermFallBack implements Comparable`. It's either me completely confusing things or this code doesn't compile. 02http://git.io/5nagJg [13:39:10] [13WikibaseDataModel] 15thiemowmde comment on commit 14dd79ac5: `parent::equals` can't do anything since there is no parent, see above. 02http://git.io/WZjLvg [13:41:32] [13WikibaseDataModel] 15thiemowmde comment on commit 14c637664: @filbertkm: Looks like you have the exact same problem as I described in #264. 02http://git.io/Md-oBA [13:56:56] 3MediaWiki extensions / 3WikidataRepo: Lua error: Failed to serialize data. - 10https://bugzilla.wikimedia.org/71918#c10 (10JulesWinnfield-hu) When was deployed? Not occuring for some time. [14:40:45] So, I'm starting to get a handle on how Wikibase and its dependencies are structured, and it seems that implementing a Wiktionary data model will require code changes. [14:40:57] I assumed that Entities were just something you could edit through the interface [14:41:02] but it seems they are hard-coded [14:41:25] which means that there should hopefully be extension points in the code for implementing a Wiktionary extension [14:41:50] (I presume the Commons project is going through similar changes) [14:41:57] (?) [14:51:42] JeroenDeDauw: That's probably most relevant to you ^ [14:55:21] GPHemsley: The code in Wikibase.git still has multiple violations of the rules outlined here https://lists.wikimedia.org/pipermail/wikidata-tech/2014-June/000489.html [14:55:36] Which need to be fixed before we can allow extensions to register new types of entities [14:56:45] GPHemsley: Also, I'd like to see good reason for adding a new type of entity. It's easy to say you could add a new type of entity for Wiktionary, but is that really the best approach, or are you using a golden hammer? [14:57:37] JeroenDeDauw: Have you seen this? https://www.wikidata.org/wiki/Wikidata:Wiktionary/Development/Proposals/2014-10 [15:02:46] GPHemsley: yeah - anything in particular? 
[15:03:25] JeroenDeDauw: Well, it outlines the different distinct entities. If you have an alternate solution, I'm open to ideas. [15:04:10] GPHemsley: it does not outline why entities should be used [15:04:28] JeroenDeDauw: Well, that's what I mean. What else could be used instead? [15:04:49] JeroenDeDauw: Q-Items are specific to topics that can be covered by a Wikipedia page; they are generally all nouns. [15:05:02] JeroenDeDauw: So it's not feasible to reuse Q-Items. [15:05:37] JeroenDeDauw: Each of the five proposed new entities encompasses a new domain. [15:07:13] GPHemsley: perhaps we are not talking about the exact same thing [15:07:32] To me entity means the Entity class (or EntityDocument interface) in our DataModel component [15:07:39] Which has nothing to do with wiki pages [15:08:00] JeroenDeDauw: AIUI, there are currently 3 Entities: Item, Property, and Query. [15:08:14] Just Item and Property [15:08:52] Where does Query fit in? [15:09:23] GPHemsley: the intersection between all these different potential kinds of "entities" is essentially that they have an ID. So why would you mash it all together? [15:09:42] Not having an entity also does not mean you cannot have a wiki page [15:10:24] GPHemsley: at the moment we cannot create a Query entity and have it fit in with the Wikibase.git code, due to the stuff I mentioned in that email [15:10:43] JeroenDeDauw: Is there a conflation in terminology? Because I'm getting my information from here: https://www.wikidata.org/wiki/Wikidata:Glossary#Entities.2C_items.2C_properties_and_queries [15:11:38] And here: https://www.mediawiki.org/wiki/Wikibase/DataModel#Values [15:13:21] And a third page which I can't seem to find right now [15:14:01] (or maybe that second one is it...)
[15:14:31] but anyway [15:14:58] that's where I'm getting my meaning of "Entity" and "Item"; if there is another set of definitions that apply to the code itself, I am unaware of them [15:17:40] GPHemsley: the description of Entity on the Glossary page is full of fail. You can look at properties and items on the Wikidata wiki like that at present, but that will not continue to hold true if we support Wiktionary, and it's definitely not describing what an entity means in our data model [15:17:55] Ah, ok [15:19:12] JeroenDeDauw: So is Q-Item on Wikidata a subset of Wikibase Item? [15:19:27] subset/subclass/whatever [15:20:00] A subclass cannot be a subset, so I'm not sure what you mean [15:21:28] GPHemsley: I've been wanting to write up-to-date docs on our data model for some time, but did not do so since we first wanted to fix some issues [15:21:39] Guess those are mostly done now though... [15:22:29] JeroenDeDauw: Let me rephrase: Is the set of Q-Items on Wikidata a strict subset of the set of Wikibase Items? [15:23:02] JeroenDeDauw: Or, put another way, can something be a Wikibase Item (on Wikidata) without being a Wikidata Q-Item? [15:24:12] [13WikibaseDataModel] 15JeroenDeDauw comment on commit 14dd79ac5: Huh wtf?? You are right. I figured this extended `Term`, but it does not. 02http://git.io/8LYcgQ [15:24:57] GPHemsley: no [15:25:42] At least not from the user's point of view [15:25:59] JeroenDeDauw: So then I still need the ability to create something that is not a Q-Item [15:26:15] GPHemsley: sure, that is quite clear [15:26:28] The Wiktionary stuff will not be items [15:26:38] OK, so then what are they if not Entities? [15:27:49] (or new Entity types) [15:27:55] GPHemsley: dedicated objects best suited for the task? What would you do if you wanted to represent this information from scratch? [15:29:05] JeroenDeDauw: I'd use the same data model as the proposal; I've designed it without regard for the backend.
[15:29:23] JeroenDeDauw: So are you saying that Entity is a false category? [15:30:16] GPHemsley: if you mean having classes that represent a lexeme, a sense, a form, etc, then sure. But why would these be entities? What do you win by making them entities? There certainly are costs, so what makes it worthwhile? [15:30:37] GPHemsley: I do not know what a false category is [15:30:56] GPHemsley: however our original notion of entities just does not hold [15:31:06] JeroenDeDauw: I have no idea. I only called them Entities because that seemed to be the superclass of all Wikidata/Wikibase 'things' [15:32:04] GPHemsley: originally they were specified as all having a fingerprint (labels, descriptions, aliases) and a set of claims. That's not true. They are different, so why regard them as the same thing? [15:33:36] This is the hierarchy in my head: { "Entity": { "Item": {}, "Property": {}, "Query": {}, "Lexeme": {}, "Sense": {}, "Form": {}, "Articulation": {}, "Representation": {} } } [15:33:53] (From a terminology/class standpoint.) [15:35:16] GPHemsley: I suspect it is better to talk about what exactly is required to be able to say something specific, rather than talk about general concepts with confusing terminology [15:35:56] JeroenDeDauw: Well, the proposal shows what I want: 5 independent 'things' that are not Q-Items [15:36:22] They are classes that must not overlap [15:36:43] GPHemsley: sure. Do you want a page for each? Or do you want one page that holds a lexeme which then contains the other ones? [15:37:01] JeroenDeDauw: I personally would want a page for each; others may disagree. [15:37:41] (I would also want reciprocal editing and display, but that's another can of worms.) [15:37:57] GPHemsley: so all these kinds of things would need to be identifiable with some id right? [15:38:11] And a lexeme would contain an id of a sense?
[15:38:18] right [15:38:28] that's the letter + number combination in the example [15:38:40] Would you be able to have a lexeme without it having a sense? [15:39:00] Sure [15:39:11] Rare, but possible [15:39:36] (At least, with my desired data model. Sense is one that's still up for discussion.) [15:39:44] Does the same go for the other relations? ie having a lexeme without a form or a form without an articulation? [15:40:01] Yes [15:40:38] (In order: nonsense word, unwritten word, written-only word without pronunciation) [15:41:00] (03CR) 10Hoo man: [C: 032] Remove extra dir=auto to fix directionallity problem [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/167409 (owner: 10Ebrahim) [15:42:34] I do not understand this order thing [15:42:48] JeroenDeDauw: Order of your asking about them [15:43:12] What should happen if a user tries to remove a sense that is used by a lexeme? [15:43:59] It should probably prompt if they want to do that, like if you delete a page/template that has links/dependencies [15:44:13] But if they went through with it, then it should remove the link from Lexeme [15:44:28] (Or whatever the best UI decision would be) [15:45:06] The prompt is a UI thing, the deletion of all references to something that is removed is not [15:45:38] AIUI, that is not currently done for existing 'things' [15:45:53] doesn't that currently leave dead links behind?
[15:45:56] (03Merged) 10jenkins-bot: Remove extra dir=auto to fix directionallity problem [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/167409 (owner: 10Ebrahim) [15:46:03] Either way, I think that's orthogonal to the data model [15:46:38] If you're trying to establish whether all of these are truly distinct, I remain at a firm 'yes' [15:47:35] Indeed, they definitely are distinct, which is why I'm asking the question if they should be entities [15:49:41] I'm getting the impression they should be, at least in the "entity document" sense [15:50:00] Which is all fine for the data model [15:50:15] I could easily implement that without a lot of work [15:50:25] But that leaves the problems in Wikibase.git [15:50:50] What's left? Do you need help? [15:50:58] We'd still need distinct handling from our current "entities", since as you agree they are quite distinct [15:51:15] We have not started implementation work on this [15:51:35] And what I'm saying is that I'm not sure how much it helps us to implement this in the data model at this point [15:51:52] Since we won't be able to use it in Wikibase Client and Wikibase Repository for quite some time [15:52:01] Ah, I see. [15:52:14] We actually did have the Query entity some time back, but I deleted it, since it was just bitrotting [15:52:34] So what's blocking the Wikibase change(s)? [15:52:47] Nothing in particular [15:53:11] I'd be happy to lend a hand in moving things along, if that's what you need. [15:54:02] What needs to be done is fix the incorrect assumptions about entities in various places. Each of those tends to require some refactoring, which in some cases is not trivial [15:54:14] The simplest occurrences have mostly been fixed [15:54:26] 3MediaWiki extensions / 3WikidataRepo: Lua error: Failed to serialize data. - 10https://bugzilla.wikimedia.org/71918#c11 (10Marius Hoch) (In reply to JulesWinnfield-hu from comment #10) > When was deployed? Not occuring for some time. Wednesday (29th) around 8pm (UTC).
[15:54:36] GPHemsley: if you could help with that, that'd be awesome [15:54:42] I don't mind refactoring :) [15:54:53] Which is harder, the conceptual refactoring, or the code refactoring? [15:55:36] (03PS1) 10Aude: Fix EntityPerPage namespace in SpecialEntitiesWithoutPageFactory [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170519 [15:56:08] It's a aude! :) [15:56:23] If you can point me to a place that lists all the issues (or draw something up, if nothing exists), I'll take a crack at fixing them. [15:56:29] (03CR) 10Hoo man: [C: 032] "Y we no have tests? :D" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170519 (owner: 10Aude) [15:56:44] GPHemsley: conceptually it's not very hard, and tends to be the same [15:57:40] :) [15:58:22] GPHemsley: you can search for " Entity " (inc the spaces) to get the type hints against Entity [15:58:31] aude: https://github.com/Wikidata-lib/PropertySuggester/pull/107 let its tests run on hhvm [15:58:35] Almost all of those will have bad assumptions [15:58:51] hoo: nice [15:59:14] JeroenDeDauw: Alright, I've gotta run now, but just send me a link or three to where I should be looking, or what I should be looking for, and I'll take a look. [15:59:35] Also, I guess I should subscribe to the mailing list? [15:59:48] aude: :) Did you see my mail about label lookup? [16:00:02] GPHemsley: yeah, the wikidata-tech one [16:00:05] I'd like to make Lua way faster [16:00:17] GPHemsley: bene asked me the same question some time back. Answer: https://lists.wikimedia.org/pipermail/wikidata-tech/2014-August/000546.html [16:00:22] and other components, but Lua for starters [16:01:04] JeroenDeDauw: OK, subscribed. I'll take a look later. Thanks for chatting. 
:) [16:04:17] hoo: replied [16:04:41] for lua, it would be nice to somehow cache the serialization in the final format [16:05:13] a lot of resources are used to transform it [16:05:24] (03Merged) 10jenkins-bot: Fix EntityPerPage namespace in SpecialEntitiesWithoutPageFactory [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170519 (owner: 10Aude) [16:05:42] aude: Yeah... we only cache that in memory (in Lua) [16:06:04] but memory usage in Lua actually has been a problem before (with Germany eg.) [16:06:23] so if we look up a lot of labels we can actually hit the 50MB memory limit [16:06:28] that happened on ruwiki [16:07:09] hoo: what do you need the label lookup caching for? Pulling a label from the table should not be that expensive? [16:07:38] hoo: yep [16:08:24] JeroenDeDauw: if not batched, it's less than ideal [16:08:38] yeah... might be good enough for Lua, though [16:08:41] if batched, would be ok although i don't know how feasible it is for lua [16:09:02] I don't see why we shouldn't be caching labels in memcached (with low expiry, ofc.)
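hoo's memcached idea above (cache labels per entity/language key, with a low expiry) can be sketched as a TTL cache in front of the expensive lookup. This is a hypothetical illustration, not the Wikibase code: `fetchLabel` stands in for the terms-table query and the `Map` stands in for memcached.

```javascript
// TTL cache in front of an expensive label lookup. One cached string per
// entity/language pair, with a low expiry, per the discussion above.
function makeLabelLookup(fetchLabel, ttlMs, now = Date.now) {
  const cache = new Map();
  return (entityId, languageCode) => {
    const key = `${entityId}/${languageCode}`;
    const hit = cache.get(key);
    if (hit && hit.expires > now()) return hit.label; // cache hit, skip the query
    const label = fetchLabel(entityId, languageCode);
    cache.set(key, { label, expires: now() + ttlMs }); // low expiry keeps it fresh
    return label;
  };
}
```

As JeroenDeDauw points out right after, this only pays off if the same labels are actually requested repeatedly within the expiry window; otherwise the cache adds overhead without saving queries.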
[16:09:07] it's perfect for such tasks [16:09:24] could be either one string per language or a json array of all labels of an entity [16:12:37] we can certainly try [16:13:30] I think it's important to be clear on which use case and which problem you want to solve before you can determine which caching strategy is best and how much it will actually help [16:13:35] maybe I'll prototype something later on [16:13:39] (03PS1) 10Ebrahim: Make sitelink editor match the appropriate language direction [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 [16:13:42] And I can see multiple use cases here [16:13:50] hoo: Have a look please [16:13:53] So this whole discussion is a bit fuzzy to me [16:14:10] My use case is: get label of entity in language (no fallback or stuff) [16:14:14] just like that [16:14:23] hoo: no, context needed [16:14:31] if you just do this once, there is no problem [16:14:38] Is this done a lot per request? [16:14:44] Is the same label requested multiple times? [16:14:46] Depends on what people do in Lua [16:14:56] Are distinct requests fetching the same info? [16:15:23] If you just throw some caching at it, you might well not improve anything [16:15:30] probably [16:16:52] JeroenDeDauw: So, what would you suggest? 
getting hard numbers here is going to be hard [16:17:04] and crunching that data would take a lot of time -> over engineering [16:18:01] ebraminio1: I'm not a fan of embedding more binding with ULS [16:18:10] this JS is messy enough already [16:18:46] hoo: it is needed however; it would be nice if a wrapper were written for the direction detection functionality [16:19:11] hoo: On http://tools.wmflabs.org/ebraminio-dev/w/index.php?title=Item:Q2&uselang=fa edit the link [16:20:23] Wikibase]$ ack --js -i -h -c uls [16:20:24] 40 [16:20:34] that's too much [16:20:37] way too much IMO [16:21:27] hoo: Well I am just responsible for 4 of them :) [16:21:36] current state is this [16:21:37] http://i.imgur.com/VDyHHRo.png [16:21:59] It is confusing because on view state it is okay but on edit it is not [16:23:52] hoo: be sure this would be the last ULS use from me :) [16:24:45] ebraminio1: mh... looks a little weird for me [16:25:08] the rtl input are left aligned and the ltr ones right [16:25:10] aligned [16:25:14] on your test wiki [16:26:06] hoo: I didn't install ULS of course but can I see a screenshot? [16:26:20] brb [16:26:22] door [16:32:27] ok, back [16:36:32] ebraminio1: https://people.wikimedia.org/~hoo/RTL-LTR-input.png [16:40:26] hoo: Weird, Chrome doesn't have the issue and Firefox still has it even after the revert of the patch [16:41:20] probably regressed before because now http://tools.wmflabs.org/ebraminio-dev/w/index.php?title=Item:Q2&uselang=fa I hard reset to master and the issue still persists just on Firefox [16:42:40] ebraminio1: On wikidata it looks fine [16:42:52] but fields have width 100% there AFAIS [16:43:27] hoo: probably regressed recently and still not landed on Wikidata? [16:43:37] possible [16:44:10] will look locally [16:44:42] hoo: Ok.
So I bring back the patch on my wiki for further tweaks [16:44:59] ebraminio1: Looks fine for me locally [16:45:16] oh wait, I was on some branch [16:45:20] * hoo switches to master [16:46:14] also on master [16:49:24] hoo: how about with the patch? [16:49:41] hoo: probably I have some local issue [16:49:43] let me check it out [16:51:13] uh [16:51:16] looks weird [16:51:43] maybe caching fooled me [16:52:34] * hoo tries in private mode [16:53:18] I have ULS and it's not working [16:53:28] rtl is totally scrambled AFAIS [16:58:16] hoo: hmm, this.options.value.language is empty [16:58:22] yep [16:58:24] commenting [17:00:07] (03CR) 10Hoo man: [C: 04-1] "Also this will probably act weird with new input fields, as the site id is not know there when constructing the input." (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 (owner: 10Ebrahim) [17:00:15] :S [17:07:56] hoo: I got a solution but not sure if it is cool... wb.sites.getSite( this.options.value._siteId )._siteDetails.languageCode [17:08:22] yeah, something like that [17:08:30] not using a private member would be nice [17:08:36] I think there's a getter for that [17:10:26] but you will probably need to also do that in a (each)change method to act if the site changes (that can happen for new inputs) [17:11:15] very important point [17:12:31] hoo: hmm, interestingly that works already actually [17:13:03] might be that we create a new input if the site changes [17:13:19] on https://www.wikidata.org/wiki/Q3418411?uselang=fa please try to add a lang link for arwiki [17:13:26] we do a lot of such things ... which is a performance nightmare... but clean code-wise [17:14:07] yep, that actually works [17:14:09] awesome [17:14:17] so no need to worry about that :) [17:15:50] (03CR) 10Hoo man: "Ok, forget my comment about changing sites, we recreate the sitelinkview in these cases. My inline comment still stands, though."
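hoo's advice about not reaching into `_siteDetails` can be illustrated with a minimal sketch; this `Site` class is a hypothetical stand-in, not the actual Wikibase JavaScript implementation:

```javascript
// Reaching into _siteDetails from outside couples callers to internals.
// A getter keeps the member private by convention and gives the class one
// place to change its internal representation later.
class Site {
  constructor(details) {
    this._siteDetails = details; // private by convention, callers should not touch it
  }
  getLanguageCode() {
    return this._siteDetails.languageCode;
  }
}
```

With this shape, the snippet above would read `wb.sites.getSite( siteId ).getLanguageCode()` instead of dereferencing two private members in a row.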
[extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 (owner: 10Ebrahim) [17:23:56] (03PS2) 10Ebrahim: Make sitelink editor match the appropriate language direction [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 [17:25:35] hoo: Thank you. An ULS-free patch. Have a look [17:25:46] already looking [17:26:35] (03CR) 10Hoo man: [C: 04-1] "Way better now... I have two picks, still" (032 comments) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 (owner: 10Ebrahim) [17:34:43] (03PS3) 10Ebrahim: Make sitelink editor match the appropriate language direction [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 [17:35:20] ebraminio1: Awesome! :) Let me just test it really quick [17:37:01] hoo: Np. I'm cool even if you -1 again and again :P [17:37:51] Not needed, works as expected :) [17:38:12] (03CR) 10Hoo man: [C: 032] "Works like a charm, thank you :)" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 (owner: 10Ebrahim) [17:43:55] (03Merged) 10jenkins-bot: Make sitelink editor match the appropriate language direction [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170522 (owner: 10Ebrahim) [17:44:39] hoo: Thank you so much :) Now the issue is date representation localization [17:44:40] http://tools.wmflabs.org/ebraminio-dev/w/index.php?title=Item:Q2&uselang=fa [17:44:49] 1 ژانویهٔ 1900 [17:44:59] it should become something like ۱ ژانویهٔ ۱۹۰۰ [17:46:19] ebraminio1: Could you please make sure your wiki only loads in https resources? [17:46:28] my firefox goes crazy on http stuff in there [17:48:31] ebraminio1: The formatter should use Language::formatNum to localize these...
but I'm not sure it's that trivial [17:48:36] actually I know it's not [17:48:36] hoo: labs is already very slow for me and if I make it force https I should go for a walk for page loads :) [17:48:40] https://gerrit.wikimedia.org/r/#/c/102910/2/formatters/BasicFormatter.php [17:49:20] hoo: Yes I traced the code but couldn't find the place for such a change [17:49:48] are there any plans to display in links on a wikidata item at any point? [17:50:17] private function getUnocalizedDate( $isoTimestamp, $precision ) { [17:50:18] It is not important though because it already doesn't support our calendar even [17:50:19] People wanting to create inverse properties is coming up a lot [17:50:19] * hoo lulz [17:50:38] wondering if I support or oppose said properties. [17:51:29] ebraminio1: Is there a bug for that already? [17:51:48] hoo: No. But please don't ask me to file one :) [17:52:03] ebraminio1: Then create one, please :D [17:52:17] which is a totally different concept, obviously [17:53:29] ebraminio1: the problem is to be able to parse the dates if they are localized [17:53:37] * aude agrees they should be localized [17:54:08] but the parser needs to understand them when someone clicks edit [17:54:09] aude: Are you parsing it with moment.js or can it be done? [17:54:29] no, our dates support billions of years and such crazy things [17:54:33] aude: bug 63732 ? [17:54:34] that i doubt moment.js handles [17:54:42] hoo: probably [17:54:45] because moment.js is fine with that (I wrote their Persian module actually) [17:54:46] i know i made a bug for this [17:54:57] nice [17:55:04] but the parsing is done in php, via the api for wikidata :/ [17:55:07] git checkout -b typoTypoTypo [17:55:07] fatal: A branch named 'typoTypoTypo' already exists. [17:55:09] hahaha [17:55:16] I'm repeating myself...
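The digit localization ebraminio1 asks about earlier (1 ژانویهٔ 1900 should render as ۱ ژانویهٔ ۱۹۰۰) is, taken on its own, a small transformation, roughly the digit-mapping part of what PHP's Language::formatNum does for fa. A sketch of just that mapping, leaving aside the round-trip parsing concerns that make the full fix non-trivial:

```javascript
// Map ASCII digits to Extended Arabic-Indic (Persian) digits, as the fa
// locale's number formatting would; all other characters are left alone.
function toPersianDigits(text) {
  return text.replace(/[0-9]/g, (d) => '۰۱۲۳۴۵۶۷۸۹'[Number(d)]);
}
```

The hard part, as hoo and aude note, is not this direction but the reverse: once dates are displayed with localized digits and month names, the server-side parser must accept them again when someone clicks edit.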
:D [17:55:52] hoo: that's the bug [17:55:53] (03PS1) 10Hoo man: Fix private function name typo in MwTimeIsoFormatter [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170524 [17:55:55] no problem, that totally needs a deep rework [17:56:03] aude: ^ trivial fix :P [17:56:13] Annoying typo... [17:56:28] i think it is solvable to support the dates, especially if a volunteer can help [17:56:58] I like when people are: Oh date formatting and such is surely like a trivial thing [17:57:22] Like people in my university... and I'm like.... not quite... [17:57:25] (03CR) 10Aude: [C: 032] Fix private function name typo in MwTimeIsoFormatter [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170524 (owner: 10Hoo man) [17:58:09] we can't even just use the php DateTime stuff, as it doesn't handle things like "January 2014" well [17:58:21] or have to be very careful about using it [17:59:46] https://bugzilla.wikimedia.org/show_bug.cgi?id=64659 [18:01:29] I think I'm gonna take a stab at https://bugzilla.wikimedia.org/show_bug.cgi?id=49100 today [18:02:40] wow, old bug [18:02:44] yeah [18:03:25] I told Lydia I would do that months ago already :P [18:03:33] * ebraminio1 looks for another DIY bug to fix :) [18:04:11] usually filing bugs is not effective, that is why DIY is great :) [18:05:28] Yesterday I saw that people on ruwiki actually started removing data from wiki pages (infoboxes) as it's all in Wikidata now. Awesome. [18:05:46] (03Merged) 10jenkins-bot: Fix private function name typo in MwTimeIsoFormatter [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170524 (owner: 10Hoo man) [18:10:16] aude: namespace Wikibase\Client\UpdateRepo; does that make a good namespace? [18:13:58] hoo, aude: I was checking parsing of date property... isn't it possible to avoid server date parsing? [18:14:30] You mean not parsing things server side, but only in the client?
[18:14:32] https://github.com/moment/moment/blob/develop/test/locale/en.js [18:15:22] I mean parsing the date on the client side, then sending the timestamp or something similar [18:15:37] ebraminio1: how would that work for non-js (i suppose we want to support that better) ? [18:15:51] we did js parsing in the past [18:15:54] That sounds like making things more complicated [18:15:59] which frankly won't help [18:16:10] and didn't in the past [18:16:38] it was decided not to do that for various reasons [18:16:44] Of course, thank you [18:16:46] now* [18:17:03] hoo: seems ok as a namespace [18:17:28] in mediawiki core, i know there is a bit of date localisation code [18:17:40] we use that some [18:18:09] would be nice if it was more reusable and separated a bit from the Language objects themselves (which are big, slow...) [18:18:37] and still doesn't support all the things we need, but could potentially change that [18:20:18] (03PS1) 10Hoo man: Move UpdateRepo classes into Wikibase\Client\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 [18:21:19] having to support non-js is definitely a real issue, nowadays modern browsers (without any libraries) support great date formatting things. For example test this code on your browser: new Date().toLocaleString('fa') [18:22:31] Complete standard and localized formatted date, uses native digits and even local calendar [18:23:16] even moment.js is not this good [18:24:49] ebraminio1: But that doesn't have all the features we need, I guess [18:24:58] Like 3 billion years ago... [18:25:01] hoo: definitely, [18:27:11] even https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DateTimeFormat doesn't have all features needed by webdevs [18:27:44] probably doesn't work well with "January 2014" type dates [18:27:56] with the day missing [18:28:49] do that in php, and it will become "January 1, 2014" (1 being, because it's November 1 today) [18:29:05] it guesses that [18:29:35] what...
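The pitfall being described, PHP's DateTime filling missing fields from the current date, can be avoided by parsing month-precision values into an explicit structure with the day pinned. A hedged sketch with hypothetical names, English month names only, and no claim to being the actual Wikibase time parser:

```javascript
// Parse "Month YYYY" into an explicit value with month precision, instead of
// handing it to a general-purpose date parser that would default the missing
// day to "today", as described above for PHP's DateTime.
const MONTHS = ['January', 'February', 'March', 'April', 'May', 'June',
  'July', 'August', 'September', 'October', 'November', 'December'];

function parseMonthYear(text) {
  const m = /^([A-Za-z]+)\s+(\d{1,4})$/.exec(text.trim());
  if (!m) return null;
  const month = MONTHS.indexOf(m[1]) + 1;
  if (month === 0) return null; // unknown month name
  // Day is pinned to 1 and the precision is recorded, so a formatter
  // knows to omit the day when rendering the value back.
  return { year: Number(m[2]), month, day: 1, precision: 'month' };
}
```

Recording the precision alongside the fields is what lets "January 2014" survive a round trip without silently gaining a day of the month.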
so on Nov 13th it would guess January 13? [18:29:46] yes it would [18:29:50] with DateTime in php [18:30:01] that's beyond my imagination :D [18:30:28] yeah [18:31:51] Is there some good binding/wrapper for icu4c in PHP? Because it has all the needed localization things at least [18:32:38] not sure [18:32:48] maybe using it with php would be doable (of course I don't know your wikibase needs or what icu4c provides) [18:33:22] I think people are using icu4c bindings in some way [18:33:27] maybe poke tim? [18:37:42] (03PS2) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 [18:38:20] I figured having stuff in a dedicated namespace first makes sense [18:42:40] (03CR) 10jenkins-bot: [V: 04-1] Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man) [18:43:00] ugh [18:44:15] (03CR) 10Aude: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man) [18:44:17] Good thing I wrote tests for that [18:44:48] might we need the dreaded class alias for that? [18:45:00] aude: Nope :) Will reply [18:45:04] ok [18:45:55] haha, jenkins is in catalan for me [18:46:07] "Sortida de la consola" [18:46:09] (03CR) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man) [18:46:24] :D [18:46:53] (03PS3) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 [18:47:25] (03CR) 10Hoo man: "Forgot to add 2 use statements.
(Good thing we have tests)" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[18:48:46] (03CR) 10Aude: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[18:52:52] (03CR) 10jenkins-bot: [V: 04-1] Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[18:54:15] (03PS4) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525
[19:01:46] fooood
[19:14:46] (03CR) 10Aude: [C: 04-1] "please update the @covers tags. otherwise, looks good :)" (032 comments) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:28:09] (03PS5) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525
[19:28:12] (03CR) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo (032 comments) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:28:38] (03CR) 10Hoo man: "Addressed Aude's comments" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:30:25] (03CR) 10Aude: [C: 032] "looks good :)" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:30:45] aude: bah, forgot something
[19:31:00] (03CR) 10Hoo man: [C: 04-2] "forgot to move repo tests into own folder" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:31:01] what?
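Returning to the date-formatting discussion from earlier in the log: what ebraminio1 demonstrated with `toLocaleString('fa')`, and the `Intl.DateTimeFormat` API from the MDN link, can be sketched as below. This is an illustrative sketch only, not Wikibase code; the exact strings produced depend on the ICU locale data the JavaScript engine ships with.

```javascript
// The one-liner from the discussion: with full ICU data, the 'fa' locale
// formats with native (Persian) digits and the local (Solar Hijri) calendar.
const sample = new Date(Date.UTC(2014, 10, 1)).toLocaleString('fa', {
  timeZone: 'UTC',
});
console.log(sample);

// Intl.DateTimeFormat (see the MDN link above) gives more control over the
// fields, but -- as noted in the discussion -- it still has no way to
// represent partial dates like "January 2014" or values such as
// "3 billion years ago".
const formatter = new Intl.DateTimeFormat('fa', {
  year: 'numeric',
  month: 'long',
  day: 'numeric',
  timeZone: 'UTC',
});
console.log(formatter.format(new Date(Date.UTC(2014, 10, 1))));
```

In environments without the relevant ICU locale data (e.g. a small-icu Node build), both calls silently fall back to another locale, which is one more reason formatting support cannot simply be delegated to the client.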
[19:31:05] aaah
[19:31:14] (03CR) 10Aude: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:31:50] (03PS6) 10Hoo man: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525
[19:32:07] (03CR) 10Hoo man: "Fixed now" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:36:36] (03CR) 10Aude: [C: 032] Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:36:45] Thanks :)
[19:37:00] aude: When adding the new job... how should we do it? Two commits: first add the job, then deploy that
[19:37:09] and then make the client actually insert it (when all branches have it)
[19:37:23] ?
[19:37:31] Or backport?
[19:38:12] either is ok with me
[19:41:55] (03Merged) 10jenkins-bot: Move UpdateRepo* into Wikibase\(Client|Repo)\UpdateRepo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170525 (owner: 10Hoo man)
[19:50:54] (03PS1) 10Hoo man: Add UpdateRepoOnDeleteJob [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170528 (https://bugzilla.wikimedia.org/49100)
[19:52:57] (03CR) 10Hoo man: [C: 04-1] "Still a little WIP: Edit summary message missing" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170528 (https://bugzilla.wikimedia.org/49100) (owner: 10Hoo man)
[20:22:34] (cur | prev) 20:22, 1 November 2014‎ Admin (Talk | contribs)‎ . . (25,166 bytes) (-87)‎ . . (‎Page on [dewiki] deleted: Berlin) (undo)
[20:22:35] :)
[20:24:44] mh
[20:25:03] there's no hook to add text to the delete confirmation thing
[20:25:04] :S
[20:31:35] [13WikibaseDataModel] 15filbertkm created 06arraymerge (+1 new commit): 02http://git.io/69B5Yg
[20:31:35] 13WikibaseDataModel/06arraymerge 147238d81 15aude: Don't use array_merge in getAllSnaks methods...
[20:37:37] [13WikibaseDataModel] 15filbertkm 04force-pushed 06arraymerge from 147238d81 to 144bab1a2: 02http://git.io/W_Z0ZA
[20:37:37] 13WikibaseDataModel/06arraymerge 144bab1a2 15aude: Don't use array_merge in getAllSnaks methods...
[20:43:18] [13WikibaseDataModel] 15filbertkm 04force-pushed 06arraymerge from 144bab1a2 to 1460d9837: 02http://git.io/W_Z0ZA
[20:43:18] 13WikibaseDataModel/06arraymerge 1460d9837 15aude: Don't use array_merge in getAllSnaks methods...
[20:46:39] [13WikibaseDataModel] 15filbertkm opened pull request #266: Don't use array_merge in getAllSnaks methods (06master...06arraymerge) 02http://git.io/8wIj8g
[21:05:50] [13WikibaseDataModel] 15mariushoch comment on pull request #266 1460d9837: Does order matter here? If not, you could just do `$snaks = $this->getQualifiers()` and then `$snaks[] = $this->getMainSnak()`. If it does, what about `array_unshift`? I'm just wondering because looping and then adding data one by one sounds less than ideal to me. 02http://git.io/Maw0kQ
[21:06:27] aude: The rest of the change looks fine :)
[21:21:34] pretty sure order is not important
[21:22:48] :)
[21:28:35] we have to use iterator_to_array, though
[21:31:27] [13WikibaseDataModel] 15filbertkm comment on pull request #266 1460d9837: order is not important, but we would then have to use iterator_to_array which appears to be somewhat less efficient.... 02http://git.io/JS7FIQ
[21:32:57] [13WikibaseDataModel] 15mariushoch pushed 1 new commit to 06master: 02http://git.io/vrM2Fw
[21:32:57] 13WikibaseDataModel/06master 14e236b84 15Marius Hoch: Merge pull request #266 from wmde/arraymerge...
[21:33:03] yay
[21:33:07] [13WikibaseDataModel] 15mariushoch 04deleted 06arraymerge at 1460d9837: 02http://git.io/uzcxpQ
[21:35:05] // Nobody knows why we need to clone over here, but it's not working
[21:35:05] // without... PHP is fun!
[21:35:40] Daniel and I were debugging that for quite some time... :P
[21:36:52] fun indeed!
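Back on pull request #266 above: the appending pattern mariushoch suggested in place of array_merge (order being unimportant, per the thread) looks roughly like this. The snippet below is a JavaScript sketch with made-up data, not the actual PHP from WikibaseDataModel, purely to illustrate copy-then-append versus building and concatenating a second array:

```javascript
// Hypothetical stand-in for the PHP getAllSnaks() under discussion:
// copy the qualifiers, then append the main snak, instead of looping
// or doing an array_merge-style concatenation.
function getAllSnaks(statement) {
  const snaks = statement.qualifiers.slice(); // don't mutate the caller's array
  snaks.push(statement.mainSnak); // order is not significant here
  return snaks;
}

const statement = {
  mainSnak: { property: 'P31', value: 'Q5' },
  qualifiers: [{ property: 'P580', value: '2014' }],
};
console.log(getAllSnaks(statement).length); // 2
```

The iterator_to_array caveat from the thread is PHP-specific: when the qualifiers come back as a traversable rather than a plain array, they must first be materialised, which is what made plain appending less attractive there.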
[21:39:17] wow, it's still needed
[21:46:18] any news on numbers with units?
[21:50:18] (03PS2) 10Hoo man: Add UpdateRepoOnDeleteJob [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170528 (https://bugzilla.wikimedia.org/49100)
[21:50:20] (03PS1) 10Hoo man: Apply page deletions to the repo [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/170570 (https://bugzilla.wikimedia.org/49100)
[21:50:26] fale: I'm not aware of anything
[21:50:46] hoo: :( thanks. I hope good news will come soon
[21:50:59] Well, there's always good news coming
[21:53:22] hoo: yep, but usually on things that are not problematic for me :D
[21:55:57] 3MediaWiki extensions / 3WikidataClient: Add mw.wikibase.getEntityObject by site link (title) Lua function - 10https://bugzilla.wikimedia.org/72815 (10Marius Hoch) s:5normal>3minor
[21:57:53] fale: Yeah :D That's because there are too many things to do...
[21:58:40] hoo: yep, but I've seen that wikidata is growing fast, so I hope to see the things that I need the most done soon :)
[22:01:57] 3MediaWiki extensions / 3WikidataRepo: Unit tests for API modules should bypass API framework. - 10https://bugzilla.wikimedia.org/55516#c1 (10Marius Hoch) s:5major>3normal Less important after recent changes to core that made API tests way faster.
[22:03:27] 3MediaWiki extensions / 3WikidataRepo: fatal error in EditEntityAction - 10https://bugzilla.wikimedia.org/45671#c2 (10Marius Hoch) 5NEW>3RESO/FIX No longer an issue apparently.
[22:05:27] 3MediaWiki extensions / 3WikidataRepo: | (vertical bar) character is not accepted in aliases - 10https://bugzilla.wikimedia.org/43136#c6 (10Marius Hoch) 5NEW>3RESO/FIX This is no longer an issue.
[22:09:27] 3MediaWiki extensions / 3WikidataRepo: [[d:Q8497800]] is still in wb_items_per_site table even though it's been deleted and can't be undeleted properly - 10https://bugzilla.wikimedia.org/48112#c5 (10Marius Hoch) 5NEW>3RESO/FIX I've deleted these entries when I did manual fixing for bug 71914. See https:...
[22:11:56] 3MediaWiki extensions / 3WikidataRepo: ObjectComparer::dataEquals needs more tests - 10https://bugzilla.wikimedia.org/40975#c3 (10Marius Hoch) 5NEW>3RESO/FIX ObjectComparer no longer exists within DataModel.
[22:13:42] 3MediaWiki extensions / 3WikidataRepo: Maximum function nesting level of '100' reached - 10https://bugzilla.wikimedia.org/43900#c2 (10Marius Hoch) Still relevant? We have quite a lot of places where code exceeds the nesting level of 100 (especially in client rendering).
[22:14:41] 3MediaWiki extensions / 3WikidataRepo: Use "https://" for all URIs in the WikidataRepo extension - 10https://bugzilla.wikimedia.org/57644#c2 (10Marius Hoch) 5NEW>3RESO/FIX I'm pretty sure this has long been fixed.
[22:22:40] * Lydia_WMDE waves from Calgary
[22:22:43] it is snowing -.-
[22:22:55] but i have a window seat inside the airport terminal
[22:22:59] so that's pretty nice
[22:22:59] :D
[22:23:38] Lydia_WMDE: hi :)
[22:23:48] I've just closed a couple of really old bugs
[22:23:58] \o/
[22:24:01] sweet
[23:51:38] pitcher Did you mean: pintscher
[23:51:42] * hoo is amused