[09:34:18] Yeah addshore removes all the unused methods ;)
[09:34:26] :D
[09:50:46] the function getWithPropertyId() has changed to getByPropertyId(), however in code, getByPropertyId isn't recognized, in spite of the fact that my vagrant role for wikidata is up to date.
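A minimal sketch of the rename asked about above, assuming the method lives on Wikibase DataModel's StatementList; the wrapper function and the property ID are illustrative only. If getByPropertyId() isn't recognized, the installed wikibase/data-model package most likely just predates the rename.

    <?php
    // Minimal sketch only; assumes Wikibase DataModel's StatementList API.
    use Wikibase\DataModel\Entity\PropertyId;
    use Wikibase\DataModel\Statement\StatementList;

    function getStatementsAboutProperty( StatementList $statements ) {
        // Old name: $statements->getWithPropertyId( new PropertyId( 'P50' ) )
        // New name: the same call via getByPropertyId(); either way the result is
        // a StatementList filtered down to statements about the given property.
        return $statements->getByPropertyId( new PropertyId( 'P50' ) );
    }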
[10:37:07] should i use P50 (author) or P287 (designer) for a nuclear power plant design like the ESBWR (Q3100362)?
[10:38:41] i want to say it is designed by GE-Hitachi
[10:40:48] ysangkok: if the item is about a design, use "owner"
[10:40:55] that's what I'd do
[10:54:08] Nemo_bis: but that is something completely different, what if the reactor design gets sold?
[10:56:51] aaah maybe now i get your argument: a reactor cannot actually be designed by an abstract concept like a company, the people design the reactor, but the owner (legally) is the company
[11:00:01] ysangkok: yes
[11:00:37] with patents, usually the inventor is the individual; though in some companies even the "attribution" might be retained by the company
[11:18:10] there is a big mess regarding fast/thermal breeder/non-breeder reactors... many wikis only have articles for fast breeder reactors, and in wikidata that term ended up being equated sometimes with fast reactors and sometimes with breeder reactors
[11:18:29] see https://www.wikidata.org/w/index.php?title=Q552334&action=history
[11:19:23] i don't know how to handle this, since for example dewiki only has an article on "thermal reactors" but no article on "fast reactors", instead opting to explain what a fast reactor is in the article on thermal reactors...
[11:26:34] eternal issue of https://phabricator.wikimedia.org/T54564
[11:35:51] hmmm interesting... but depressing... no solution in sight
[11:37:05] Yup, because all solutions are uncontrollable.
[11:52:28] sjoerddebruin: what do you think about this: if there are no articles from other wikipedias on a given item, wikipedia could link hard-coded "related" wikidata entries, like the reverse "instance of" relation
[11:52:57] There is a proposal for article placeholders.
[11:54:07] ah https://www.wikidata.org/wiki/Wikidata:Article_placeholder_input ok thanks
[14:35:18] CFisch_WMDE: https://gerrit.wikimedia.org/r/#/c/223818/
[14:35:48] thx
[14:37:36] wohoo Jonas_WMDE !
[14:37:44] thx!
[14:37:56] that's why firefox is not failing anymore
[14:38:08] even though it resizes
[14:38:11] hadn't realized that it had already been merged for some time
[14:38:20] yay!
[14:38:39] we can delete the debug jobs, they were testing that, Tobi_WMDE_SW
[14:38:53] CFisch_WMDE: yes, just remove them
[14:38:57] kk
[14:40:18] done
[15:26:36] jzerebecki: Hola. What to grep for in https://phabricator.wikimedia.org/T106773 ? I have a local git.wm extensions checkout here anyway
[15:27:00] (might want to add that info for potential contributors anyway)
[15:27:03] That's a duplicate
[15:28:08] yeah but we can't find the real one :-/
[15:30:19] merged them
[15:31:01] Heh. hoo really knows his bugs. Thanks!
[15:36:05] :D Took me several minutes to find it :S
[15:37:27] I just saw the OS request on wikidata....
[15:37:37] but you all know that OS is hardly used there, right?
[15:39:16] OS request?
[15:39:42] someone is a new OS candidate, yes
[15:39:47] Oh
[15:39:54] I always found oversighters on wikidata useless...
[15:39:55] an access request
[15:39:58] yep
[15:43:34] I guess people don't care
[15:52:17] maybe I can come up with a regex that comes up with the regex :)
[15:52:21] andre__: https://phabricator.wikimedia.org/T103070#1479310 but coming up with an actual regex needs a bit more time
[16:09:42] heh, jzerebecki the deprecation case doesn't seem to fit any of the cases listed at https://www.mediawiki.org/wiki/API/Architecture_work/Planning#Deprecation_process
[16:11:53] addshore: ah yes, didn't read it before, but the question remains: do we need to follow some process? also, is it documented somewhere, do we need to document it?
[16:13:03] well, merge deprecations, announce, deploy deprecation, wait, merge & deploy removal
[16:13:13] addshore: How much is it being used?
[16:13:26] (Couldn't look at the graphs last time you posted them :P)
[16:13:46] https://logstash.wikimedia.org/#/dashboard/temp/AU63scYgs2oZVZyjfrS2
[16:14:41] used by 1 IP user for 50 calls since that logging was added (a few days ago)
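A rough sketch of the "merge deprecation, announce, deploy, wait, then remove" flow described at 16:13, as it could look inside a MediaWiki API module. This is an assumption for illustration only: the module, the legacyformat parameter and the log group are made up, and it is not the actual Wikibase patch being discussed.

    <?php
    // Hypothetical illustration only; not the real Wikibase change.
    class ApiExampleModule extends ApiBase {

        public function execute() {
            $params = $this->extractRequestParams();
            // ... build the normal result here ...
            if ( $params['legacyformat'] ) {
                // "Deploy deprecation": warn API clients that the old output is going away.
                $this->setWarning( 'The legacy serialization is deprecated and will be removed.' );
                // "Wait": log each use so remaining callers can be counted (e.g. in
                // logstash) before the removal itself is merged and deployed.
                wfDebugLog( 'deprecation', __METHOD__ . ': legacy serialization requested' );
            }
        }

        protected function getAllowedParams() {
            return array( 'legacyformat' => false );
        }
    }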
[16:25:21] * addshore is stuck in a mess with all this language fallback stuff
[16:33:29] * JeroenDeDauw sees phpstorm 9
[16:33:33] getgetgetgetgetget
[16:34:38] hah!
[16:34:58] the early access version I used had some bad memory issues JeroenDeDauw :/
[16:36:19] addshore: guess why I stopped using EAP some major versions back...
[16:36:37] Not had any real problems since
[16:44:08] ugh. why is my review queue so long >_> just gonna ignore it now and get some actual work done
[17:16:30] addshore: https://github.com/wmde/WikibaseDataModelServices/pull/6
[17:26:46] !nyan-review | addshore https://github.com/wmde/WikibaseDataModelServices/pull/7
[17:26:50] huh
[17:27:02] benestar|cloud: ^ what was the command name?
[17:33:45] yay, jenkins tests failing for all wikibase things :P
[17:33:48] for no reason :D
[17:38:15] addshore: weeee https://github.com/wmde/WikibaseDataModelServices/pull/8
[17:38:26] After this I just need to update all internal refs
[17:38:32] And then we're done moving WB DM stuff
[17:38:54] addshore: unless something is missing from this list: https://phabricator.wikimedia.org/T104187 can you have a look?
[17:39:12] *looks*
[17:41:07] hah, JeroenDeDauw I was going to say Entity / EntityIdParsingException and EntityIdParser but now I see the PR ;)
[17:41:17] apparently I totally missed those IRC lines...
[17:41:45] also JeroenDeDauw wheeeeee, I just finished removing the lib serialization from the API output! ;)
[17:44:27] addshore: \o/
[17:44:30] will have a look
[17:44:59] I think I will look at whatever EntityAccessor is next
[17:45:12] it might require us to implement something somewhere... heh, so specific
[17:45:41] addshore: for moving things to the new component?
[17:45:52] yeh
[17:46:25] ok, going to try to find the first commit in this chain
[17:46:28] brb 5 mins
[17:46:30] >_>
[17:46:38] hah xD
[17:47:16] oh dear
[17:49:08] wat
[17:49:17] the API deprecation process is deprecated?!
[17:50:45] addshore: are the people doing the deployment stuff aware of this change?
[17:51:01] And have you already notified the list, or will someone be doing this?
[17:51:22] I'll notify the list etc. on merging of the change
[17:51:58] i believe hoo, aude and janz have all already seen it
[17:52:09] addshore: NOOOOOOOOOOOOOOOOOOO
[17:52:14] you modified the thing
[17:52:20] now EVERYTHING needs rebasing
[17:52:20] what thing? :/
[17:52:33] bah, doesn't need a rebase!
[17:52:46] git push origin master -f
[17:52:50] it'll stop complaining once the first thing is merged
[17:53:05] addshore: https://github.com/wmde/WikibaseDataModelServices/pull/9
[17:53:24] lots of the changes are actually just improving tests to make sure that I didn't break the API output at all :/
[17:53:35] although I will plan a breaking change once all of this is merged.
[17:53:59] right
[17:55:15] For the DataModel Services thing, I'm thinking of making the 1.0 release, killing the stuff from DM and releasing 4.0, and then switching over users before moving more stuff into Services from WB.git or making other breaking changes in WB DM
[17:55:21] Concerns?
[17:55:59] sounds like a plan, I believe the services thing has a repo on gerrit now too
[17:56:10] though I have no idea how to force push its current state from github tot here
[17:56:13] *there
[18:00:38] I'm not seeing any on gerrit
[18:00:47] And I'm not sure why we'd want one there
[18:01:20] addshore: so what did you say about not needing a rebase once the thing was merged?
[18:01:23] :<0
[18:02:24] it shouldn't complain :O
[18:02:49] * JeroenDeDauw complains about it complaining anyway
[18:03:06] addshore: you gonna rebase all the things? ;p
[18:03:17] mhhhhhhmmm, yes I will in a sec
[18:03:25] let me just finish posting about the deprecation
[18:05:03] $aliasGroupList = $fingerprint->getAliasGroups()->getWithLanguages( array( $language ) );
[18:05:06] That can be
[18:05:24] $aliasGroupList = $fingerprint->getAliasGroup( $language );
[18:05:28] addshore: ^
[18:07:17] well, it needs an AliasGroupList JeroenDeDauw, rather than a single AliasGroup
[18:09:51] oh, that's a filter....
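A small sketch of the difference just discussed, assuming the Fingerprint and AliasGroupList methods exactly as quoted at 18:05; the wrapper function and variable names are illustrative only. getWithLanguages() is a filter that keeps the AliasGroupList type, while getAliasGroup() returns a single AliasGroup.

    <?php
    // Sketch of the two calls quoted at 18:05; assumes Wikibase DataModel's Fingerprint API.
    use Wikibase\DataModel\Term\Fingerprint;

    function getAliasesForLanguage( Fingerprint $fingerprint, $language ) {
        // Filter: returns an AliasGroupList containing only the groups for the given
        // languages, so it fits anywhere an AliasGroupList is expected.
        $aliasGroupList = $fingerprint->getAliasGroups()->getWithLanguages( array( $language ) );

        // Lookup: returns a single AliasGroup, which is not a drop-in replacement
        // where an AliasGroupList is needed.
        $aliasGroup = $fingerprint->getAliasGroup( $language );

        return array( $aliasGroupList, $aliasGroup );
    }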
[18:10:00] all rebased
[18:10:35] wow, some people are totally not getting drunk in the office here judging by the noise
[18:11:41] oh dear
[18:14:22] addshore: you are making this ResultBuilder thing quite big
[18:14:48] yes, I was thinking of splitting it up ;)
[18:15:45] addshore: https://github.com/wmde/WikibaseDataModelServices/pull/9
[18:15:48] well, it's currently 1100 lines
[18:16:03] but the plan is to make a few breaking API changes to reduce the size anyway
[18:16:22] will probably work on them at the start of next week
[18:17:01] hahahaha
[18:17:07] beat you to deleting the branch
[18:17:09] :D
[18:17:11] bah!
[18:24:42] addshore: DataModel is totally going to be 6.0 before WB.git is 0.6 ;p
[18:25:31] :/
[18:31:50] JeroenDeDauw: that is a good thing, nobody should use Wikibase as if it were a library, we probably need to make that much clearer (and fix defaults, remove experimental and not dynamically add things to the version string) and release 0.6
[18:36:24] jzerebecki: it is not good that it is taking so long for that to happen though
[18:36:44] addshore: derp. Apparently BestStatementsFinder is actually used in StatementList
[18:36:50] gonna move it back now ;p
[18:36:56] whut >.>
[18:37:02] In getBestStatementPerProperty
[18:37:16] We could also just kill that ofc...
[18:37:31] hmmmm
[18:37:41] kill kill kill ;)
[18:38:17] JeroenDeDauw: that is confusing. what do you think is not good?
[18:39:40] jzerebecki: nvm
[18:40:17] addshore: not entirely sure that is the best approach
[18:40:38] Will submit it with the thing killed for now, so it works
[18:40:46] Though I want bene to also have a look at it
[18:44:49] addshore: https://github.com/wmde/WikibaseDataModel/pull/516
[18:50:59] addshore: https://gerrit.wikimedia.org/r/#/c/226509/5/repo/maintenance/dumpJson.php ...
[18:51:35] $dumper = $this->wikibaseRepo->newJsonDumpGenerator( $output );
[20:13:50] addshore: you made commits on top of a WIP??? https://gerrit.wikimedia.org/r/#/c/226643/4
[20:15:30] JeroenDeDauw: that patch isn't a WIP right now, but shouldn't be merged yet!
[20:18:11] so that's the end of the chain for now JeroenDeDauw :D
[20:18:19] though I might rebase some of the stuff on the other side of it!
[20:20:45] jzerebecki: sorry for the late answer, but any browser can decode a standard UTF-8 URL (do tell me if you can give me a browser that can't), and, no, it is not decoded for viewing.
[20:21:01] if a copy-paste is done, the URL is decoded in the standard URL fashion
[20:21:31] thank you for the label (phabricator), I will look at it
[20:21:40] addshore: you have now beaten hoo in commit count in WB.git. Of course, it's not a race at all...
[20:24:52] Not very surprising, given how much time per week I have for actual coding on Wikibase
[20:26:14] hoo: clearly you need to make more time then eh? :)
[20:26:27] * JeroenDeDauw is feeling pleasantly evil now
[20:26:58] I like doing deployments and stuff... also there's that evil university thing taking my time
[21:05:03] hoo: evil university!
[21:52:02] How does one filter references out? https://www.wikidata.org/w/api.php?action=help&modules=wbgetclaims
[21:55:15] Nemo_bis: how do you mean?
[21:55:49] I don't want to load all that stuff
[21:56:05] Documentation for &props only says how to include, not how to remove
[21:56:14] Empty string doesn't work
[21:57:17] Nemo_bis: that's probably a bug then!
[21:58:13] yep, defo a bug, that parameter seems totally useless :D
[21:58:20] file it and CC me :)
[22:01:35] https://phabricator.wikimedia.org/T106899
[22:03:34] lovely!
[22:14:55] Lydia_WMDE: (nice little public note as well) - just made JD|cloud an op in here
[22:40:20] popolon: the standard says it is to be %-encoded in the HTML. see where https://url.spec.whatwg.org/#concept-url-path says ASCII. the fact that it gives the correct parser output doesn't mean it is valid.
[22:44:41] popolon: note that the HTML is sent compressed to normal browsers, so I don't think we would gain that much by using plain UTF-8
[22:55:20] Lydia_WMDE: Oh no! JohnFLewis is recruiting more evil people into his army of doom!
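Back to the wbgetclaims question at 21:52 (filed as https://phabricator.wikimedia.org/T106899): a minimal sketch of the request one would expect to suppress reference data via an empty props value. Q42 is only an example entity, and at the time of this log the empty value was exactly what did not work.

    <?php
    // Sketch of the request discussed at 21:52; Q42 is only an example entity ID.
    // An empty "props" is what one would expect to omit reference data, but at the
    // time of this log it had no effect, which is the bug filed as T106899.
    $url = 'https://www.wikidata.org/w/api.php?' . http_build_query( array(
        'action' => 'wbgetclaims',
        'entity' => 'Q42',
        'props'  => '',
        'format' => 'json',
    ) );
    $claims = json_decode( file_get_contents( $url ), true );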