[02:08:47] 10Wikidata, 10MobileFrontend-Feature-requests, 10Possible-Tech-Projects: Wikidata PageBanner extension - https://phabricator.wikimedia.org/T77925#1091036 (10Jdlrobson) @wrh2 any ideas from an editors perspective how you would like this to work without templates? Ideally I would imagine an edit button on the... [02:23:35] (03PS1) 10Hoo man: Add more debug logging to UpdateRepoJob [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194446 [03:23:21] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 7Need-volunteer: add a new datatype for geoshapes - https://phabricator.wikimedia.org/T57549#1091100 (10Rschen7754) Enforcing a conversion to a geodatabase format without any migration path is not a good idea, however, and will lead to the loss of data o... [04:59:08] 10MediaWiki-extensions-WikibaseClient, 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Avoid lost link for deleting pages which is later restored (e.g. history merge/spilt/cleanup) - https://phabricator.wikimedia.org/T75908#1091211 (10Bugreporter) Another usecase: In zhwiki all copyvio will tag as {{c... [05:25:33] WDQ seems to be working ... thanks to Yuvi and Magnus [07:26:47] [13DataValuesJavascript] 15snaterlicious pushed 1 new commit to 06master: 02http://git.io/x7M0 [07:26:47] 13DataValuesJavascript/06master 14f03f757 15snaterlicious: Merge pull request #69 from wmde/calUris... [07:27:50] well it was working ... no longer .. for whatever new reason [07:36:23] 10Wikidata, 6Labs: wdq.wmflabs.org does not update (data week old) - https://phabricator.wikimedia.org/T89583#1091329 (10QuestPC) Please fix the WDQ queries, I need to debug and re-run my Python script. I did not know that such read-only API requires write updates and is unreliable. [07:43:45] 10MediaWiki-extensions-WikibaseClient, 10Wikidata, 7Tracking: Allow accessing data from an item not connected to the current page - arbitrary access (tracking) - https://phabricator.wikimedia.org/T49930#1091343 (10Eloquence) Can we specify a month to shoot for and add it to the appropriate #roadmap column? [07:44:26] 10Wikidata, 10MobileFrontend-Feature-requests, 10Possible-Tech-Projects: Wikidata PageBanner extension - https://phabricator.wikimedia.org/T77925#1091344 (10Florian) > as a page property In page_props?? If yes, you need to save it elsewhere, too, page_props are purged with any re-parse :) [08:15:28] 10Wikibase-DataModel, 10Wikidata, 5Patch-For-Review, 3§ Wikidata-Sprint-2015-02-03, 3§ Wikidata-Sprint-2015-02-25: Remove Claim class - https://phabricator.wikimedia.org/T87388#1091365 (10Tobi_WMDE_SW) 5Open>3Resolved [08:34:49] 10Wikidata, 5Patch-For-Review, 3§ Wikidata-Sprint-2015-02-03, 3§ Wikidata-Sprint-2015-02-25: spec of how timevalue works and is supposed to work - https://phabricator.wikimedia.org/T88438#1091523 (10Tobi_WMDE_SW) 5Open>3Resolved [08:34:51] 10Wikidata, 7Tracking: Bugs related to time datatype - https://phabricator.wikimedia.org/T87764#1091524 (10Tobi_WMDE_SW) [09:23:20] [13WikidataBrowserTests] 15WMDE-Fisch created 06url_test_split (+1 new commit): 02http://git.io/x5Uo [09:23:20] 13WikidataBrowserTests/06url_test_split 14c280fdb 15WMDE-Fisch: split url test for phantomjs compatibility [09:32:57] 10Wikibase-DataModel, 10Wikidata: Improve Statement/Claim GUID handling - https://phabricator.wikimedia.org/T78301#1091639 (10thiemowmde) 5Open>3Resolved a:3thiemowmde addClaim() (the only method that checks if the GUID is set) is deprecated since 1.0. The now favored getStatements()->addStatement() neve... 
[09:39:06] 10Wikibase-DataModel, 10Wikidata: Claim vs. Statement getting worse, not better - https://phabricator.wikimedia.org/T78292#1091656 (10thiemowmde) 5Open>3Resolved Claim is gone in 3.x. The Statement constructor is reverted to as is was in 1.x. [09:46:10] [13WikidataBrowserTests] 15WMDE-Fisch pushed 1 new commit to 06time_T88542: 02http://git.io/x5td [09:46:10] 13WikidataBrowserTests/06time_T88542 140a9125e 15WMDE-Fisch: added login for entity modifying test [09:49:53] [13WikidataBrowserTests] 15WMDE-Fisch opened pull request #56: split url test for phantomjs compatibility (06master...06url_test_split) 02http://git.io/x5qS [09:54:11] [13WikidataBrowserTests] 15tobijat pushed 1 new commit to 06master: 02http://git.io/x5Yv [09:54:11] 13WikidataBrowserTests/06master 1425f37aa 15Tobi Gritschacher: Merge pull request #54 from wmde/time_T88542... [09:56:01] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 7Tracking: Browsertests for currently used datavalues (tracking) - https://phabricator.wikimedia.org/T55845#1091693 (10Tobi_WMDE_SW) [09:56:02] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 7Tracking: Migrate current selenium tests to cucumber - https://phabricator.wikimedia.org/T55849#1091694 (10Tobi_WMDE_SW) [09:56:03] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 5Patch-For-Review, 3§ Wikidata-Sprint-2015-02-25: Browsertests for time value input - https://phabricator.wikimedia.org/T88542#1091692 (10Tobi_WMDE_SW) 5Open>3Resolved [09:59:22] Lydia_WMDE: current sprint: http://phragile.herokuapp.com/projects/wikidata [10:00:12] [13WikidataBrowserTests] 15WMDE-Fisch 04force-pushed 06url_test_split from 14c280fdb to 145097b8b: 02http://git.io/x5OP [10:00:12] 13WikidataBrowserTests/06url_test_split 145097b8b 15WMDE-Fisch: split url test for phantomjs compatibility [10:09:53] [13WikidataBrowserTests] 15tobijat 04deleted 06url_test_split at 145097b8b: 02http://git.io/x5ZR [10:12:04] [13WikidataBrowserTests] 15tobijat pushed 1 new commit to 06master: 02http://git.io/x5nq [10:12:04] 13WikidataBrowserTests/06master 1468d0b4f 15Tobias Gritschacher: Update Gemfile.lock [10:12:04] [13Common] 15thiemowmde created 06privateByDef (+1 new commit): 02http://git.io/x5nt [10:12:04] 13Common/06privateByDef 14e63e524 15Thiemo Mättig: Private by default [10:12:44] [13Common] 15thiemowmde opened pull request #21: Private by default (06master...06privateByDef) 02http://git.io/x5n8 [10:13:38] [13Common] 15thiemowmde created 06030 (+1 new commit): 02http://git.io/x5nM [10:13:38] 13Common/06030 1456fea32 15Thiemo Mättig: Prepare 0.3.0 release [10:13:56] [13Common] 15thiemowmde opened pull request #22: Prepare 0.3.0 release (06master...06030) 02http://git.io/x5nQ [10:19:29] 10Wikidata, 6Labs: wdq.wmflabs.org does not update (data week old) - https://phabricator.wikimedia.org/T89583#1091720 (10GerardM) Hoi, Work has been done to make it better.. This morning it worked for some time for me. DO understand that it is not official software. It is exceptional that this software has be... 
[10:44:12] [13Common] 15thiemowmde created 06getParserClass (+1 new commit): 02http://git.io/x50P [10:44:12] 13Common/06getParserClass 14ebbca1f 15Thiemo Mättig: Deprecate getParserClass in favor of getInstance [10:46:37] [13Common] 15thiemowmde opened pull request #23: Deprecate getParserClass in favor of getInstance (06master...06getParserClass) 02http://git.io/x5E4 [11:10:57] [13Number] 15thiemowmde created 06getParserClass (+1 new commit): 02http://git.io/x5wM [11:10:57] 13Number/06getParserClass 14767717a 15Thiemo Mättig: Deprecate getParserClass in favor of getInstance [11:11:47] [13Number] 15thiemowmde opened pull request #26: Deprecate getParserClass in favor of getInstance (06master...06getParserClass) 02http://git.io/x5rk [11:15:25] [13Number] 15thiemowmde 04force-pushed 06getParserClass from 14767717a to 146e2c622: 02http://git.io/x5oI [11:15:25] 13Number/06getParserClass 146e2c622 15Thiemo Mättig: Deprecate getParserClass in favor of getInstance [11:16:45] [13Number] 15thiemowmde created 06getOptions (+1 new commit): 02http://git.io/x5oW [11:16:45] 13Number/06getOptions 1407e658c 15Thiemo Mättig: Avoid calling getOptions [11:17:31] [13Number] 15thiemowmde opened pull request #27: Avoid calling getOptions (06master...06getOptions) 02http://git.io/x5oV [11:23:23] [13Number] 15thiemowmde 04force-pushed 06unitOption1 from 141727023 to 14683480d: 02http://git.io/x56T [11:23:23] 13Number/06unitOption1 14683480d 15Thiemo Mättig: Allow non-conflicting unit option [11:25:43] [13Number] 15thiemowmde 04force-pushed 06unitOption1 from 14683480d to 14e3e8a98: 02http://git.io/x56T [11:25:43] 13Number/06unitOption1 14e3e8a98 15Thiemo Mättig: Allow non-conflicting unit option [11:34:01] [13Number] 15thiemowmde 04force-pushed 06PrecisionFromScientificNotation from 14cb0fb7c to 1494b76f5: 02http://git.io/xyHA [11:34:01] 13Number/06PrecisionFromScientificNotation 1403f4ee2 15daniel: Correctly detect precision for scientific notation.... [11:34:01] 13Number/06PrecisionFromScientificNotation 1494b76f5 15Thiemo Mättig: Add test cases [11:41:58] (03CR) 10JanZerebecki: [C: 032] Add more debug logging to UpdateRepoJob [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194446 (owner: 10Hoo man) [11:43:37] [13Geo] 15thiemowmde comment on pull request #43 14e43db7e: Good question. I double-checked. The constructor parameter in the `StringValueParser` base class is optional since 0.1: https://github.com/DataValues/Common/blob/0.1/src/ValueParsers/StringValueParser.php#L29 02http://git.io/x5Mg [11:45:20] (03Merged) 10jenkins-bot: Add more debug logging to UpdateRepoJob [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194446 (owner: 10Hoo man) [11:47:45] (03CR) 10Adrian Lang: "recheck" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194342 (owner: 10Adrian Lang) [11:49:31] wikimedia/mediawiki-extensions-Wikibase/master/3483300 : Marius Hoch The build has errored. 
http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/53177815 [11:53:46] 10Wikidata, 10MediaWiki-Vagrant, 3§ Wikidata-Sprint-2015-02-25: labs-vagrant git-update should run composer update in WikidataBuildResources - https://phabricator.wikimedia.org/T90565#1091850 (10aude) 5Open>3Resolved [11:57:11] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: add a new datatype for multilingual text - https://phabricator.wikimedia.org/T86517#1091866 (10Lydia_Pintscher) [13:02:49] (03PS1) 10Raimond Spekking: Consistency tweaks [extensions/WikidataQuality] - 10https://gerrit.wikimedia.org/r/194502 [13:03:07] Lucie_WMDE: http://saml.rilspace.com/content/backtracking-key-difference-between-sparql-and-prolog [13:06:16] (03CR) 10Jonaskeutel: [C: 032 V: 032] Consistency tweaks [extensions/WikidataQuality] - 10https://gerrit.wikimedia.org/r/194502 (owner: 10Raimond Spekking) [13:14:22] (03PS3) 10Aude: add i18n support for ApiClientInfo module [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/179868 [13:16:18] (03PS4) 10Aude: add i18n support for ApiClientInfo module [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/179868 [13:16:51] (03CR) 10Aude: "fixed all the issues. patchset 3 is a rebase." (033 comments) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/179868 (owner: 10Aude) [13:23:22] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 7Documentation, 3§ Wikidata-Sprint-2015-02-25: update documentation of high-level data model - https://phabricator.wikimedia.org/T75603#1091952 (10Tobi_WMDE_SW) please provide feedback: https://www.mediawiki.org/w/index.php?title=Wikibase%2FDataModel&d... [13:24:52] Hey. Is it possible to say that a birthdate is before a well known timepoint? [13:24:59] f.e. before 1587. [13:30:00] BenutzerConny: one way is to state "birthdate: somevalue" and qualify with "earliest date: 1587" [13:30:29] matej_suchanek: Thank you. [13:31:38] Data type time shows on special page also before and after - but I do not know to use it... [13:31:42] https://www.wikidata.org/wiki/Special:ListDatatypes?setlang=en [13:33:34] I do niether, AFAIK it is not implemented yet [13:40:03] Hey folks. There's a problem maybe you would know how to address. In OpenStreetMap, we're very careful about the copyright status of data that goes into our DB, but recently we've seen people dumping data from copyrighted sources into Wikidata, then claiming "Well it's CC0" [13:40:19] Is there anything you guys are planning in addressing this problem? [13:40:20] matej_suchanek: Did it :) https://www.wikidata.org/w/index.php?title=Q700622&diff=201223648&oldid=173606956 [13:44:40] [13ValueView] 15thiemowmde comment on pull request #162 14265c2f9: This calls toLowerCase on an object or, with the simplified CALENDARS structure, on a string URI. 02http://git.io/xdlm [13:46:37] [13ValueView] 15thiemowmde 04force-pushed 06datavalues070 from 14265c2f9 to 1489ebed9: 02http://git.io/xd8k [13:46:37] 13ValueView/06datavalues070 1489ebed9 15Thiemo Mättig: Fix the tests [13:48:49] (03PS1) 10Jonaskeutel: Cross-check and constraint report special pages should be working, tests uncomplete, but the existing ones should run [extensions/WikidataQuality] - 10https://gerrit.wikimedia.org/r/194509 [13:49:46] no one has any thoughts here? 
[13:51:03] [13ValueView] 15thiemowmde 04force-pushed 06datavalues070 from 1489ebed9 to 14f4aadba: 02http://git.io/xd8k [13:51:03] 13ValueView/06datavalues070 14f4aadba 15Thiemo Mättig: Fix the tests [13:51:17] emacsen: if you notice an actual copyright violation, please report it (on the project chat, I guess). However, facts as such are not copyrightable. [13:51:32] emacsen: do you have an example of what you mean? [13:53:05] [13ValueView] 15thiemowmde 04force-pushed 06datavalues070 from 14f4aadba to 1430687e0: 02http://git.io/xd8k [13:53:05] 13ValueView/06datavalues070 1430687e0 15Thiemo Mättig: Fix the tests [13:53:14] DanielK_WMDE, someone took a book of "translations" of place names, dumped them into Wikidata, then tried to import them into OSM [13:54:19] DanielK_WMDE, so the issue is really two things, for us at least. We're concerned that this is just "laundering", both from a copyright standpoint, but also a data quality standpoint [13:54:28] ie "I'll put this in Wikidata, then it's "Wikidata good" [13:56:04] emacsen: laundring is bad. place names are not copyrighted, unless someone made them up. [13:56:21] i'd suggest to explain your concerns, including examples, on the mailing list or project chat [13:57:21] [13Geo] 15thiemowmde created 06getParserClass (+1 new commit): 02http://git.io/xd0d [13:57:21] 13Geo/06getParserClass 14de5a501 15Thiemo Mättig: Deprecate getParserClass in favor of getInstance [13:57:29] DanielK_WMDE, I guess, is this something that the project has plans to address as a whole or if I complain about X, will I be told "Just modify it yourself"? [13:57:53] DanielK_WMDE, as for "making them up"- that's the point, if there's no signage, then yes, it's just made up [13:58:20] [13Geo] 15thiemowmde opened pull request #44: Deprecate getParserClass in favor of getInstance (06master...06getParserClass) 02http://git.io/xdEc [13:58:24] it's a key difference between the Wiki* projects and OSM is that OSM wants on the ground validation, rather than source citation [13:58:37] but Wikidata is considered a "high value source" [13:59:55] emacsen: sounds like a valid concern wrt quality assurance. bring it up on the list. [14:00:21] note however that lexical terms (labels, descriptions, and aliases) are editorial values, not statements that even *can* be sourced [14:00:25] [13Geo] 15thiemowmde 04force-pushed 06getParserClass from 14de5a501 to 14ec69245: 02http://git.io/xdun [14:00:25] 13Geo/06getParserClass 14ec69245 15Thiemo Mättig: Deprecate getParserClass in favor of getInstance [14:00:30] the software doesn't provide a mechanism for that [14:01:51] DanielK_WMDE, Is that something to be addressed? [14:02:20] [13Geo] 15thiemowmde created 06lessOptions2 (+1 new commit): 02http://git.io/xdzk [14:02:20] 13Geo/06lessOptions2 1452afa7b 15Thiemo Mättig: Avoid calling getOptions [14:02:52] emacsen: i would persoanlly say no, but there are arguments going the other way. and there are properties for "official name" and such, which can be used with sourced statements. [14:03:01] bring it up, it'sa good discussion point [14:04:45] DanielK_WMDE, okay. May not be today but I will [14:06:15] thanks! 
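For reference on BenutzerConny's question above: matej_suchanek's "birthdate: somevalue" plus "earliest date: 1587" suggestion looks roughly like the following in an item's statement serialization. This is only an illustrative sketch written as a PHP array; the property IDs (P569 "date of birth", P1319 "earliest date"), the precision value and the calendar URI follow the usual Wikidata conventions as far as I know and are not confirmed anywhere in this log.

    // Hypothetical statement: date of birth unknown ("somevalue"),
    // qualified with "earliest date" = 1587 at year precision.
    $statement = array(
        'mainsnak' => array(
            'snaktype' => 'somevalue',   // no concrete value, only a bound is known
            'property' => 'P569',        // assumed: date of birth
        ),
        'type' => 'statement',
        'rank' => 'normal',
        'qualifiers' => array(
            'P1319' => array(            // assumed: earliest date
                array(
                    'snaktype' => 'value',
                    'property' => 'P1319',
                    'datavalue' => array(
                        'type' => 'time',
                        'value' => array(
                            'time' => '+1587-00-00T00:00:00Z',
                            'timezone' => 0,
                            'before' => 0,
                            'after' => 0,
                            'precision' => 9,  // 9 = year
                            'calendarmodel' => 'http://www.wikidata.org/entity/Q1985727',
                        ),
                    ),
                ),
            ),
        ),
    );

The "before"/"after" fields listed on Special:ListDatatypes belong to the same time value, which matches matej_suchanek's note above that they are not usable in the UI yet.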
[14:07:05] [13Time] 15thiemowmde created 06getParserClass (+1 new commit): 02http://git.io/xdgy [14:07:05] 13Time/06getParserClass 148272111 15Thiemo Mättig: Deprecate getParserClass in favor of getInstance [14:07:47] [13Time] 15thiemowmde opened pull request #44: Deprecate getParserClass in favor of getInstance (06master...06getParserClass) 02http://git.io/xdgA [14:08:20] [13Time] 15thiemowmde created 06getOptions (+1 new commit): 02http://git.io/xd2m [14:08:20] 13Time/06getOptions 1493bdc5a 15Thiemo Mättig: Avoid calling getOptions [14:08:23] (03PS1) 10Daniel Kinzler: Mock services for SpecialWikibaseRepoPage [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194516 [14:08:30] [13Time] 15thiemowmde opened pull request #45: Avoid calling getOptions (06master...06getOptions) 02http://git.io/xd23 [14:24:50] [13Interfaces] 15thiemowmde created 06getFormatterClass (+1 new commit): 02http://git.io/xd6f [14:24:50] 13Interfaces/06getFormatterClass 14ba01ed8 15Thiemo Mättig: Deprecate getFormatterClass in favor of getInstance [14:25:55] [13Interfaces] 15thiemowmde opened pull request #9: Deprecate getFormatterClass in favor of getInstance (06master...06getFormatterClass) 02http://git.io/xd6W [14:27:09] [13Time] 15thiemowmde created 06getFormatterClass (+1 new commit): 02http://git.io/xd6p [14:27:09] 13Time/06getFormatterClass 14a95f0fd 15Thiemo Mättig: Deprecate getFormatterClass in favor of getInstance [14:27:32] [13Time] 15thiemowmde opened pull request #46: Deprecate getFormatterClass in favor of getInstance (06master...06getFormatterClass) 02http://git.io/xdim [14:29:47] [13Number] 15thiemowmde created 06getFormatterClass (+1 new commit): 02http://git.io/xdPe [14:29:47] 13Number/06getFormatterClass 14c955738 15Thiemo Mättig: Deprecate getFormatterClass in favor of getInstance [14:30:37] [13Number] 15thiemowmde opened pull request #28: Deprecate getFormatterClass in favor of getInstance (06master...06getFormatterClass) 02http://git.io/xdPn [14:32:47] [13Number] 15thiemowmde 04force-pushed 06getFormatterClass from 14c955738 to 14178ec69: 02http://git.io/xdPF [14:32:47] 13Number/06getFormatterClass 14178ec69 15Thiemo Mättig: Deprecate getFormatterClass in favor of getInstance [14:33:57] [13Geo] 15thiemowmde created 06getFormatterClass (+1 new commit): 02http://git.io/xdXV [14:33:57] 13Geo/06getFormatterClass 141dde360 15Thiemo Mättig: Deprecate getFormatterClass in favor of getInstance [14:34:05] [13Geo] 15thiemowmde opened pull request #46: Deprecate getFormatterClass in favor of getInstance (06master...06getFormatterClass) 02http://git.io/xdXi [14:35:36] [13Common] 15thiemowmde created 06getFormatterClass (+1 new commit): 02http://git.io/xd1Z [14:35:36] 13Common/06getFormatterClass 14f8e227a 15Thiemo Mättig: Deprecate getFormatterClass in favor of getInstance [14:35:46] [13Common] 15thiemowmde opened pull request #24: Deprecate getFormatterClass in favor of getInstance (06master...06getFormatterClass) 02http://git.io/xd1W [14:41:18] $this->assertRegExp( '/]*name="x"[^<>]*value="x"/', $html ); [14:42:09] I want jQuery to add browser check again :/ [14:43:36] Lydia_WMDE :D I got a mug and a t-shirt! I'm so hype! :D [14:48:23] Chandiqueer: :D [14:48:41] like it? [14:49:45] I do! :D I wore the shirt to school today :P Matches my Wikipedia -pin and WIkipedia-keychain. 
People comment that I'm a walking billbord for WIkimedia (after I told them what Wikimedia and Wikidata is) [14:50:43] [13Common] 15JeroenDeDauw 04deleted 06privateByDef at 14e63e524: 02http://git.io/xdHj [14:52:30] damn, i must have it too! [14:52:51] Thank you 4guys and gals3 all! [14:53:22] Chandiqueer, what if people don't self-identify as a part of the all? [14:54:05] Well, like that old saying 3it's all or nothing, so then they are worth nothing :P [15:02:33] matej_suchanek: do you already have any wikidata swag? [15:02:46] (03PS2) 10Thiemo Mättig (WMDE): Make options nullable/optional in all ValueFormatters [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194112 [15:02:59] [13WikibaseDataModel] 15JeroenDeDauw created 06travis (+1 new commit): 02http://git.io/xdFA [15:02:59] 13WikibaseDataModel/06travis 14033a0c1 15jeroendedauw: Allow failures on hhvm nightly [15:03:26] please don't use the word, I don't like it... [15:03:44] actually no [15:03:55] heh ok [15:04:13] well do email me your address and t-shirt size and we'll fix that :) [15:05:22] [13Geo] 15JeroenDeDauw 04deleted 06lessOptions2 at 1452afa7b: 02http://git.io/xdNf [15:05:49] I will ask myself if I really want and will surely do [15:06:32] [13Geo] 15JeroenDeDauw comment on pull request #44 14ec69245: What is the reason for deprecating this rather than removing it? Is something still using it? 02http://git.io/xdNr [15:06:43] (03PS1) 10Daniel Kinzler: Data provider for testing SpecialSetLabelDescritionAliases [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194524 [15:08:01] Lydia_WMDE: I found two bugs but I am not sure if they are already fixed, where is the place with the newest version of the software? [15:09:26] matej_suchanek: we've not deployed the latest stuff to the test system yet [15:09:31] hopefully in the next days [15:09:52] can you explain here what the issue is? [15:10:00] both in diffs [15:10:00] wmde/WikibaseDataModel/travis/033a0c1 : jeroendedauw The build passed. http://travis-ci.org/wmde/WikibaseDataModel/builds/53200841 [15:11:08] I know there is one patch around this, but am not able to find out its current state [15:30:09] (03PS1) 10Thiemo Mättig (WMDE): Make options nullable/optional in all ValueParsers [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194530 [15:31:15] (03PS1) 10Thiemo Mättig (WMDE): Deprecate getParserClass in favor of getInstance [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194531 [15:32:21] [13WikibaseDataModel] 15thiemowmde closed pull request #396: Allow failures on hhvm nightly (06master...06travis) 02http://git.io/xdbJ [15:33:12] (03PS1) 10Adrian Lang: Make tests independent of local group permissions [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194532 [15:33:33] [13Geo] 15thiemowmde comment on pull request #44 14ec69245: Just for an easier migration process. I'm fine with removing it right away. 
02http://git.io/xFIP [15:35:03] (03PS2) 10Adrian Lang: Completely rework SectionEditLinkGenerator [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194342 [15:35:05] (03PS1) 10Adrian Lang: Also load wikibase.ui.entityViewInit on read-only entities [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194533 [15:40:11] (03CR) 10Thiemo Mättig (WMDE): Make SpecialSetLabelDescriptionAliases a 2-step process (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194105 (https://phabricator.wikimedia.org/T91387) (owner: 10Thiemo Mättig (WMDE)) [15:59:38] 10MediaWiki-extensions-WikibaseClient, 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Avoid lost link for deleting pages which is later restored (e.g. history merge/spilt/cleanup) - https://phabricator.wikimedia.org/T75908#1092307 (10hoo) >>! In T75908#1091211, @Bugreporter wrote: > Another usecase:... [15:59:52] jzerebecki: https://anonscm.debian.org/cgit/pkg-rust/ [16:20:56] Just a thought...perhaps #wikipedia-bot-messages would be something, so this chanel could be devoted to "discussions" etc.? [16:21:03] channel* [16:23:41] [13WikibaseDataModel] 15JeroenDeDauw closed pull request #389: Favor float over alias double (06master...06double) 02http://git.io/AwHi [16:33:54] Chandiqueer: yeah i have that on my todo but didn't get to it yet :( [16:34:18] (wikidata* oops) [16:34:19] Chandiqueer: if you want please file a ticket so i can't push it off further [16:35:21] Lydia_WMDE: I would love to do it, if I didn't...4hate dislike/can't comprehend Phab. I like Bugzilla! [16:35:39] hehe [16:42:39] [13Time] 15JeroenDeDauw pushed 2 new commits to 06master: 02http://git.io/xFSx [16:42:39] 13Time/06master 1493bdc5a 15Thiemo Mättig: Avoid calling getOptions [16:42:39] 13Time/06master 14687f20f 15Jeroen De Dauw: Merge pull request #45 from DataValues/getOptions... [16:42:59] [13Number] 15JeroenDeDauw pushed 2 new commits to 06master: 02http://git.io/xF9O [16:42:59] 13Number/06master 1407e658c 15Thiemo Mättig: Avoid calling getOptions [16:42:59] 13Number/06master 14ca5ee56 15Jeroen De Dauw: Merge pull request #27 from DataValues/getOptions... [16:57:12] Lucie_WMDE: https://stackoverflow.com/questions/1966503/does-imdb-provide-an-api [16:58:16] * Nemo_bis doesn't even manage to reach forums nowadays [17:04:02] Hello. I'm trying to learn how to use autolist -->http://tools.wmflabs.org/autolist/autolist1.html <-- to generate a list of items having links to enwiki but not hiwiki. I've made a few attempts, trying to narrow down the search range using a small category from enwiki, but get stuck at Running query... [17:04:51] could someone help me [17:06:09] . [17:08:21] Sid-G: it is because the WikiDataQuery used by AutoList is not working well at the moment [17:10:13] matej_suchanek: ok, so is there an api which i can use to generate results in json directly? [17:10:27] using an http post request? [17:11:04] yes, there is but, as said, it isn't working well atm [17:11:54] oh, ok, thanks [18:27:10] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Run subset of Wikidata browsertests on every commit on Gerrit. 
- https://phabricator.wikimedia.org/T75364#1092887 (10greg) [18:28:21] 10Wikidata, 10Citoid, 6Editing, 10Possible-Tech-Projects, and 2 others: Create a system to store and query links to books - https://phabricator.wikimedia.org/T90852#1092905 (10Jdforrester-WMF) [18:43:25] JeroenDeDauw: I'm looking over https://gerrit.wikimedia.org/r/#/c/191393/ and wondering if you can summarize your objections to the patch moving forward. Are there substantial problems that must be addressed before a merge or generally just some things you'd like to see done as followup work? [18:43:49] Gerrit is not always the easiest place to track a conversation over time :/ [18:44:43] My reason for poking is that this patch chain is blocking progress on WDQv2 and we'd like to figure out how to unblock it [18:48:40] bd808: The things JeroenDeDauw pointed out are just to big function he wants to have split up [18:48:53] Nothing that's hard to fix or couldn't be done in a follow up [18:51:08] (03CR) 10Jeroen De Dauw: "Indeed, my comments from PS15. Last time I checked the dead code was still present." [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 (owner: 10Smalyshev) [18:52:42] JeroenDeDauw: I'm not sure - which dead code you're talking about? [18:52:54] could you explain a bit more? [18:53:46] hoo: I would prefer to move the patch forward if the only issue is refactoring the code and make the refactoring changes in subseqent patches if needed [18:54:08] I'm ok with that [18:54:21] though I really don't think we need to have separate function for each if - that just makes code so much harder to read [18:54:30] If it works, I'd support moving forward [18:54:51] (03CR) 10Jeroen De Dauw: "I think this should be split up into something that can be reviewed more easily. There are a great number of small issues and a number of " [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 (owner: 10Smalyshev) [18:55:17] JeroenDeDauw: split up how? [18:55:24] (03CR) 10Jeroen De Dauw: [C: 04-1] Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 (owner: 10Smalyshev) [18:56:18] JeroenDeDauw: also, could you describe the bigger issues in a bit more detail? [18:56:44] (03CR) 10Smalyshev: "Jeroen, split up how? Also, what are the bigger issues?" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 (owner: 10Smalyshev) [18:56:50] SMalyshev: I think I asked this before, but the old command line invocation for dumpJson.php keep working exactly as they did? [18:57:00] Because we use that in production to generate the dumps [18:57:02] SMalyshev: addStatementForString is one of the methods that appears to not be used [18:57:11] JeroenDeDauw: maybe we could just spent 15 minutes and walk trhough the code and I could explain it to you? [18:57:18] JeroenDeDauw: it is most definitely used. [18:57:38] JeroenDeDauw: that's what generates RDF values for string items [18:58:10] hoo: yes [18:58:16] Nice :) [18:58:26] SMalyshev: I'm checking the code out now to have a proper look at it in my IDE [18:59:00] I'll come back to you in a few years when gerrit is done giving me the change [19:00:03] JeroenDeDauw: see addStatementValue [19:03:39] SMalyshev: ah, I did not think you'd be constructing method names dynamically [19:04:00] JeroenDeDauw: I'd be :) [19:04:43] I'm not happy with that [19:05:00] And my IDE is also not happy with this code. 
Half of it is underlined for one reason or the other [19:05:16] Wrong docblocks, lots of coding style violations [19:05:30] JeroenDeDauw: I don't know about your IDE but I'm not sure what's wrong with using PHP features [19:06:02] It's not because you can that you should [19:06:15] JeroenDeDauw: in this case, I think I should [19:06:58] unless of course you can suggest a better way [19:07:09] SMalyshev: I have already suggested one [19:07:24] JeroenDeDauw: could you repeat that suggestion? I must have missed it [19:07:37] SMalyshev: look at what I wrote on gerrit please [19:08:35] https://gerrit.wikimedia.org/r/#/c/191393/15/repo/includes/rdf/RdfBuilder.php [19:08:51] JeroenDeDauw: you realize that just repeating it takes exactly the same time as writing "look at what I wrote on gerrit please" but doesn't force me to re-scan all your comments in the hope I find the one, right? Just pointing it out would be more efficient [19:09:13] SMalyshev: I think it'd take longer [19:09:17] Perhaps you write faster than me [19:09:47] SMalyshev: also, I am making the effort to give you feedback [19:10:05] If you do not read it and then ask me to simply repeat... I do not think this is very polite [19:10:41] JeroenDeDauw: from what I see you suggest using a full interface and a set of 7 classes to implement one simple function. I think this is overengineering and makes the code hard to read and understand [19:12:09] if we had a bigger chunk of code then ok, but we have 1 line for most of them and less than 10 lines for all of them [19:12:28] SMalyshev: yeah, it seems you think doing a single thing per method or class makes things hard to understand [19:12:31] A common argument [19:12:50] JeroenDeDauw: I do think so. [19:13:16] JeroenDeDauw: not a single thing per method but a method per line of code [19:13:41] And I do not. And I doubt you will listen to an explanation from my side over IRC by how you are talking about this [19:14:00] JeroenDeDauw: creating the RDF representation is a single thing. But if it includes branches it doesn't mean we have 7 classes for each branch [19:14:35] SMalyshev: can you list for me the problems with putting it all in one class? [19:14:36] JeroenDeDauw: I'd like to listen, surely [19:14:52] Even if you think this is the most readable, there most definitely are problems with it [19:15:02] JeroenDeDauw: sorry, I'm not sure I got that - putting what in one class? [19:15:28] JeroenDeDauw: sure, if there are problems please tell me, I'll fix them [19:20:08] SMalyshev: the OCP violation is a good place to start [19:20:20] JeroenDeDauw: what do you mean by OCP violation? [19:20:48] in which way is it violated? [19:23:11] SMalyshev: the class needs to be modified for each new data type [19:23:40] JeroenDeDauw: sure, you'd have to add a function for a new data type. What's the problem with that? [19:23:50] SMalyshev: you could also fix the numerous violations of our / MediaWiki's coding conventions [19:23:56] And fix the docblocks [19:24:13] SMalyshev: do you know what an OCP violation is? [19:24:33] JeroenDeDauw: you mean the whitespace? I'll fix it of course, but we're talking about more important things for now [19:25:13] JeroenDeDauw: as for docblocks, I may have missed some; if you point them out to me I'll fix them of course [19:25:40] JeroenDeDauw: I have an idea what the term means, yes, but I'm still not sure what the problem is in this code [19:26:20] JeroenDeDauw: i.e. if you have a new data type, wouldn't you expect to modify the RDF-generating code in order to support this type?
[19:27:01] JeroenDeDauw: right now it's extremely simple - you implement one function and you're done [19:27:15] MediaWiki-extensions-WikibaseClient, Wikidata, Tracking: Allow accessing data from an item not connected to the current page - arbitrary access (tracking) - https://phabricator.wikimedia.org/T49930#1093196 (Multichill) >>! In T49930#1072150, @aude wrote: > We still might be able to proceed with enabli... [19:27:27] (well, you'll need tests of course but that's different) [19:36:14] (CR) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 (owner: Smalyshev) [19:37:02] Still bikeshedding over !$object? [19:38:38] hoo: I think it's fine :) [19:38:43] (CR) Hoo man: "We do this in many places and there's no need to be overly explicit here." (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 (owner: Smalyshev) [19:38:53] So do I... commented [19:39:29] JeroenDeDauw: so, how can we move forward on this? [19:40:39] SMalyshev: hey. I'm really struggling to find time to review your patch. I have now blocked the better part of Tuesday for it. One of the big issues for me is understanding what the tests do. [19:40:47] SMalyshev: I've outlined several concerns, it's your job to decide what you do with them [19:41:23] JeroenDeDauw: except for the OCP thing which we seem to disagree about, what are the other things? i.e. what's the story with docblocks? [19:41:44] SMalyshev: we discussed converting some of the tests to read from json and compare the result to n3. could you do that by next week? I think that would make it a lot easier for me to understand what's going on, and to see what is covered by the tests. [19:42:00] (PS23) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 [19:42:03] DanielK_WMDE: this is definitely on the todo list [19:42:30] JeroenDeDauw: I just fixed one remaining whitespace problem, but if you see any others please tell me [19:42:58] DanielK_WMDE: I just want to get this thing OKed in principle before I go and spend 2 days on unit tests [19:43:24] I don't want to disrupt both code and tests at the same time, this is a recipe for a mess [19:43:25] SMalyshev: ok. so can i postpone reviewing the test code until that is done? I'll comment on the other code tomorrow then. [19:43:42] SMalyshev: you can run PHPCS against the code and see for yourself what the coding style violations are [19:43:48] You can use this config https://raw.githubusercontent.com/wmde/Diff/master/phpcs.xml [19:44:01] And you can do the same with PHPMD using https://raw.githubusercontent.com/wmde/Diff/master/phpmd.xml [19:44:12] PROBLEM - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1456 bytes in 0.204 second response time [19:44:17] JeroenDeDauw: have you compared the output of phpcs before and after the patch? [19:44:45] DanielK_WMDE: just for one file [19:45:00] DanielK_WMDE: sure. Even so, if you give it a quick scan to see if some part is missing (like I forgot to test some important branch) it may be useful [19:45:09] JeroenDeDauw: RdfGenerator is what we care about mostly. the rest is scaffolding [19:45:09] JeroenDeDauw: ok, I'll do that [19:45:23] RdfBuilder I think?
[19:45:38] That is the one I looked at [19:45:52] SMalyshev: last time i checked, the tests did not check the different "produce" settings separately. that would be good to have, but a lot easier to do with the expected output as n-triples files. [19:46:19] DanielK_WMDE: ok, noted, but that would be *much* easier to do with json files [19:46:44] DanielK_WMDE: b/c otherwise I will have to hand-create another ton of graphs [19:46:49] SMalyshev: yes, json input, n-triples output. [19:47:16] you can use the same json input for a lot of different cases with different output, though [19:49:15] SMalyshev: hm, we should avoid a deadlock here. if you wait for an ok for the code before improving the tests, and I wait for better tests so i can review the code more easily, we have a problem :D [19:50:15] i'll do a cursory review of the code, and you work on changing the tests to json-to-n-triples, ok? that way, we should be able to get this in by next week. [19:50:23] DanielK_WMDE: right. So I propose you review the code while not reviewing the tests. If you're OK with the code, then we can move the code forward and I'll commit to fixing the tests as soon as I have an ok on the code [19:50:42] (CR) Ricordisamoa: "recheck" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/194533 (owner: Adrian Lang) [19:50:47] with regards to the architecture, I would have preferred a refactoring before we introduce a ton of new code, but doing the refactoring afterwards is fine too, as long as we do it. [19:50:53] DanielK_WMDE: if you're concerned about the code without tests, I can get to the test thing as soon as I get an OK on the code [19:51:17] DanielK_WMDE: which refactoring do you mean? [19:51:19] SMalyshev: for complex code, I really like to run tests and step through [19:52:05] factoring rdf generation for data values into separate classes, maybe also using separate classes for the different things we produce [19:52:12] DanielK_WMDE: that may take a lot of time with this one (btw we have a performance problem with EasyRdf right now, so we will probably have to do something about it) [19:52:38] DanielK_WMDE: I don't think we need separate classes for that... It's one line for most of them [19:52:48] there are a lot of "mode" and "type" switches in the code that could be refactored to use dispatching of some sort [19:52:48] or very close to one line [19:53:22] DanielK_WMDE: there's only the shouldProduce() thing, the rest depends on it really [19:53:30] I do think we want that. but we don't have to argue about that now, I'm not going to block this patch on that question. [19:53:58] yes, in my mind, there would be a list of "producers", each doing its thing. [19:54:16] but again: not an argument we have to have for this patch to go in, if i can review it. [19:54:32] improved tests would help a lot with that [20:00:53] DanielK_WMDE: ok [20:01:19] DanielK_WMDE: I'll try to get to the tests by the end of the week [20:01:43] SMalyshev: excellent :) and i'll go over the code once more by that time. [20:01:45] thanks!
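A minimal sketch of the json-in, n-triples-out test style DanielK_WMDE and SMalyshev agree on above. The helper names, the way the fixture JSON becomes an entity, and the builder/EasyRdf calls are all assumptions made for illustration here, not the actual patch.

    // Hypothetical PHPUnit helpers: load a JSON entity fixture, build RDF,
    // then compare against a hand-written N-Triples file, sorted so that
    // statement order does not matter.
    private function loadJsonDump( $file ) {
        $data = json_decode( file_get_contents( __DIR__ . '/data/' . $file ), true );
        // assumed helper; however the real test turns fixture JSON into an entity
        return $this->deserializeEntity( $data );
    }

    private function assertNTriples( $file, $graph ) {
        // EasyRdf graphs can be serialised as N-Triples, one triple per line
        $actual = explode( "\n", trim( $graph->serialise( 'ntriples' ) ) );
        $expected = explode( "\n", trim( file_get_contents( __DIR__ . '/data/' . $file ) ) );
        sort( $actual );
        sort( $expected );
        $this->assertEquals( $expected, $actual );
    }

    public function testTruthyStatementsOnly() {
        $entity = $this->loadJsonDump( 'Q23.json' );
        // assumed constructor; the point is that each "produce" flag gets
        // its own small expected .nt file against the same JSON input
        $builder = $this->newRdfBuilder( array( 'produce' => 'truthy' ) );
        $builder->addEntity( $entity );
        $this->assertNTriples( 'Q23-truthy.nt', $builder->getGraph() );
    }

Sorting the lines before comparing is what makes hand-written expected files practical: the same Q23.json input can be reused for every "produce" setting, with only the .nt files differing.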
[20:04:09] (03PS24) 10Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 [20:11:49] (03PS25) 10Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 [20:14:21] RECOVERY - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is OK: HTTP OK: HTTP/1.1 200 OK - 1444 bytes in 0.187 second response time [20:54:09] (03PS3) 10Krinkle: Simplify SectionEditLinkGenerator [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194305 (owner: 10Adrian Lang) [20:54:14] (03PS3) 10Krinkle: Completely rework SectionEditLinkGenerator [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194342 (owner: 10Adrian Lang) [20:54:19] (03PS2) 10Krinkle: Also load wikibase.ui.entityViewInit on read-only entities [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194533 (owner: 10Adrian Lang) [20:56:47] I'm looking at JSON dump generator and it uses entity lookup instead of revision lookup and basically discards revision info. Is there any reason why revision info is not dumped too? [21:02:49] SMalyshev: because it's not part of the entity. [21:03:16] DanielK_WMDE: right, but it's part of wikidata [21:03:31] arguably [21:03:54] SMalyshev: i'd really like to keep the wikibase stuff as separate from mediawiki stuff as possible [21:04:20] our data model and serialization code has no dependencies on mediawiki. i'd like to push this division as far as possible [21:04:39] but we may compromize wrt including meta-info in the json dump [21:04:52] we also include it in the output of Special:EntityData [21:05:25] DanielK_WMDE: well, yeah, revision is mediawiki but there's nothing mediawiki specific in in. it would be useful if we need to consider dump not as a point in time but as a sequence of dumps [21:05:58] DanielK_WMDE: right, it doesn't have to be part of the data model for entity, but can be part of the dump as metadata [21:06:08] yea. iÄ'm kind of torn about this. revision meta info isn't part of the wikibase data model [21:06:14] it is however in the json serialization spec [21:06:21] *shrug* file a ticket :) [21:06:27] sure :) [21:06:37] the current dump format doesn't have a< good place for metadata [21:06:46] that's the problem [21:07:04] right, this may take a bit of thinking... getting the info is easy but where to put it... [21:07:30] I am happy ... WDQ seems to work :) [21:07:45] rdf is kind of easier since you can just claim stuff out of the blue [21:08:27] DanielK_WMDE I'm just trying to prepare json dumps for tests and I see Ican't actually make it handle revision since json dump has none... [21:11:21] just use the output of /entity/Q23.json [21:11:25] that should have revision info [21:11:45] but the unserializer will not pick it up [21:11:51] you'd have to do that on top [21:12:05] DanielK_WMDE: so, we take the input from JSON, but what you want to compare it to? [21:12:08] SMalyshev: i suggest you ignroe per-entity revision info for now. [21:12:32] DanielK_WMDE: because I see there's already RdfSerializerTest [21:12:37] you take the input from json, run test case X against it, and then generate n-trimples, and sort the lines. [21:12:50] you compare the lines to an n-triples file stored along with the test case [21:12:55] DanielK_WMDE: isn't that serializer test? [21:13:07] you can write it my hand, or just have the test itself generate it (comapre to something empty) and verify it's ok [21:13:15] i.e. 
what's the diff then between builder test and serializer test [21:13:22] yes, you will also be testing the serializer [21:13:32] the i/o round trip makes this not a true unit test [21:13:37] Hey all [21:13:48] https://www.wikidata.org/wiki/Special:Contributions/DunDunDunt has been confirmed at be a sockpuppet at two wikis [21:13:52] JD|cloud ^ [21:14:03] yeah? [21:14:06] SMalyshev: we should still have *some* tests that don't rely on the serializer. But the JSON to n-triples tests hel pus cover more cases more easily. [21:14:08] Bsadowski1: what do you want us to do about it? [21:14:12] Block [21:14:15] https://simple.wikipedia.org/wiki/Wikipedia:Requests_for_deletion/Requests/2015/Heaven_Sent_Gaming also [21:14:21] DanielK_WMDE: yeah that worries me a bit that builder test would be testing both but if you're ok with it then fine [21:14:25] Using multiple accounts to sway votes [21:14:36] (CU confirmed) [21:14:40] well, unless it's locally vandalism we can't do anything about it, see https://www.wikidata.org/wiki/WD:BLOCK [21:15:10] Also: https://en.wikipedia.org/wiki/Wikipedia:Sockpuppet_investigations/DunDunDunt [21:15:28] SMalyshev: if you want to do it the clean way, split the test case into RdfBuilderTest and RdfBuilderIntegrationTest [21:15:35] but i think mixing these for now is fine [21:15:40] Accounts are supposedly operated by Smile Lee [21:15:53] fyi Bsadowski1 you do have access to #wikidata-admin if you need (as a steward) [21:16:01] SMalyshev: we are also relying on a lot of EasyRDF code for the tests. for a real unit test, we'd have to mock all that. i don't think we want to do that, really... [21:16:16] DanielK_WMDE: no, not really :) [21:16:18] Bsadowski1: but the question is under what rationale I should block the socks [21:16:24] :P [21:16:27] Oh [21:16:53] Well, he's trying to get his promotional article on multiple wikis. But using multiple accounts to get it to keep [21:17:14] for now, I'd suggest posting at https://www.wikidata.org/wiki/WD:AN [21:17:27] this may or may not be considered disruptive enough for a block [21:17:43] although if there's ballot stuffing on WD then it's blockable [21:17:56] or you can go into #wikidata-admin and use !team [21:18:26] SMalyshev: you probably want helper functions loadJsonDump( $file ) and assertNTriples( $file, $graph ) [21:18:40] DanielK_WMDE: yeah surely [21:19:09] JD|cloud: Look at https://en.wikipedia.org/wiki/Wikipedia:Sockpuppet_investigations/DunDunDunt [21:19:14] DanielK_WMDE: though I'm thinking maybe dump loading needs to go to data provider and it should just spit out entities [21:19:58] yes Bsadowski1 I know he's a sock but... https://www.wikidata.org/wiki/Wikidata:Blocking_policy doesn't let me do anything about it [21:20:22] since we explicitly disallow blocks solely for global autoblocking [21:20:26] SMalyshev: yes [21:20:42] https://www.wikidata.org/wiki/User:Smile_Lee [21:20:44] pfft [21:21:36] Bsadowski1: I g2g, it's best to ask on-wiki or in the admin channel [21:21:37] He doesn't want to get a COI involved so he created multiple accounts [21:21:45] technically socking is blockable [21:21:57] but idk the extent of it on WD [21:21:58] I mean to be cliassified as a conflict of interest [21:22:05] classified* [21:23:09] Bsadowski1: Hm? 
[21:23:14] And when his article was nominated for deletion on Simple, he brought them over pretending they're different people [21:23:20] (Missed the first part of what was said I assume) [21:23:33] JohnFLewis: JD|cloud: Look at https://en.wikipedia.org/wiki/Wikipedia:Sockpuppet_investigations/DunDunDunt [21:23:36] etc. [21:24:01] Hey all [21:24:01] https://www.wikidata.org/wiki/Special:Contributions/DunDunDunt has been confirmed at be a sockpuppet at two wikis [21:24:01] JD|cloud ^ [21:24:13] * JD|cloud goes for real [21:24:57] Bsadowski1: so you want local socking blocks to be done? [21:25:35] (03CR) 10jenkins-bot: [V: 04-1] Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/191393 (owner: 10Smalyshev) [21:26:24] (03CR) 10jenkins-bot: [V: 04-1] Completely rework SectionEditLinkGenerator [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194342 (owner: 10Adrian Lang) [21:26:49] (03CR) 10jenkins-bot: [V: 04-1] Simplify SectionEditLinkGenerator [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194305 (owner: 10Adrian Lang) [21:35:08] DanielK_WMDE: my unit tests for some reason check wrong db: Error: 1146 Table 'wiki.wb_entity_per_page' doesn't exist - do you know if there's some setting/test setup that is missing? [21:37:34] SMalyshev: the unit tests should not be accessing any database, really... but if they do, it's improtant that the file has @group Database on top, so the actual db tables are shadowed by test tables [21:38:09] DanielK_WMDE: entity serializer needs the DB... I wanted to use it to generate test dumps [21:38:10] SMalyshev: other than that, it should Just Work [21:38:30] if you have a wiki family setup, you --wiki foo to pick the right one when runnign tests [21:38:48] DanielK_WMDE: yes I do --wiki but it still uses wiki db, not wikidatawiki one [21:38:51] SMalyshev: entity serializer needs the db? it shoudn't. what does it need it for? [21:39:22] DanielK_WMDE: I don't know it's not my code :) let me show you the backtrace [21:39:38] SMalyshev: no, rather show me how you construct the serializer [21:39:55] but, actually... i'm still confused why you need a serializer in the test case [21:39:58] DanielK_WMDE: http://pastebin.com/0FiExucW [21:40:12] DanielK_WMDE: to generate JSON from entity [21:40:20] *in* the test? [21:40:33] now i'm really confused [21:40:41] the JSON file should be part of the test [21:40:45] static, checked into git [21:40:48] DanielK_WMDE: I already have the code that produces the entities. so I want to generate JSON from them and then remove that code and use JSONs [21:41:00] ah, a one-off thing [21:41:04] yes [21:41:12] ic. for that it should be fine [21:41:26] seems like SnakSerializer needs a PropertyDataTypeLookup which would hit the db per default [21:41:32] yeah [21:41:40] the question is why it hits the wrong one [21:42:04] "wrong" as in "the wrong wiki" or something you don't even know exists, or doesn't? [21:42:20] and - are you sure it's the wrong one? [21:43:13] DanielK_WMDE: yeah wiki db doesn't have this table but wikidatawiki one does [21:43:38] because of course only the second one is wikidata, the first one is enwiki [21:43:55] do you serve both of them from the same LocalSettings? or do you have separate directories? [21:44:08] DanielK_WMDE: I have vagrant setup. I have no idea how it works [21:44:23] yay. me neither. [21:44:30] DanielK_WMDE: so far everything worked fine, dumps, pages, etc. 
but for unit tests I guess I need to do something special [21:44:31] try --wiki wikidatawiki [21:44:35] but I have no idea what [21:44:38] ...when invoking phpunit.php [21:44:40] DanielK_WMDE: already did [21:44:59] php tests/phpunit/phpunit.php --wiki wikidatawiki extensions/WikidataBuildResources/extensions/Wikibase/repo/tests/phpunit/includes/rdf/RdfBuilderTest.php [21:45:01] should use the exact same mechanism and settings as all other maintenance scripts [21:47:09] looks like it doesn't... the dump script works [21:47:28] mwscript extensions/WikidataBuildResources/extensions/Wikibase/repo/maintenance/dumpEntities.php --wiki wikidatawiki --format ttl --output f [21:47:47] that works fine, uses the right db... but not the unit tests [21:48:04] they seem to ignore all the settings [21:51:18] SMalyshev: works fine for me - couldn't work for half a day if it didn't. [21:51:49] where are you getting the serializer from? [21:53:00] DanielK_WMDE: weird... maybe my serializer code is wrong [21:53:01] http://pastebin.com/kumjBRrS [21:53:10] it's a copy of what the entity dumper does basically [21:53:34] with a couple of settings because it requires them for some reason... maybe that's related [21:57:48] I don't know how I can find where it decides which DB to use... there are like 10 levels of abstraction on it [21:58:08] SMalyshev: the getPropertyDataTypeLookup is hitting the wrong db?... very strange. very. [21:58:20] SMalyshev: but you could just mock it :) [21:58:30] just hardcode the data types for the properties. [21:58:37] DanielK_WMDE: that will probably produce broken JSON [21:58:41] it's a one-off hack anyway, right? [21:58:45] why? [21:58:54] you know the correct datatypes, don't you? [21:59:10] DanielK_WMDE: ah, well, I do [21:59:15] how many different properties with how many different datatypes are you using? [21:59:23] DanielK_WMDE: but I'm not sure how I can replace the lookup class [22:00:02] DanielK_WMDE: can I just put MockRepository there? [22:00:13] SMalyshev: $dummyDataTypeLookup = new InMemoryDataTypeLookup(); [22:00:37] $dummyDataTypeLookup->setDataTypeForProperty( new PropertyId( 'P15' ), 'string' ); [22:00:38] etc [22:00:49] DanielK_WMDE: ok, will try that [22:00:57] thanks [22:01:12] SMalyshev: in case there is no in-memory implementation, you'd use phpunit's macking mechanism [22:01:17] *mocking [22:02:40] $lookup = $this->getMock( 'Wikibase\DataModel\Entity\PropertyDataTypeLookup' ); [22:02:43] $lookup->expects( $this->any() )->method( 'getDataTypeIdForProperty' )->with( new PropertyId( 'P15' ) )->will( $this->returnValue( 'string' ) ); [22:02:54] $lookup->expects( $this->any() )->method( 'getDataTypeIdForProperty' )->with( new PropertyId( 'P23' ) )->will( $this->returnValue( 'CommonsMedia' ) ); [22:02:56] etc [22:03:20] but InMemoryDataTypeLookup is nicer, of course :) [22:04:04] yeah mocking out everything that way may be a bit annoying [22:04:41] DanielK_WMDE: can you rage to thiemo that he should get on IRC> [22:04:42] ? [22:06:05] SMalyshev: well, ...->will( $this->returnCallback( .... ) ) helps [22:06:13] (PS26) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 [22:06:51] Thiemo_WMDE: http://www.twitch.tv/entropygames [22:07:10] oh, so you are now screaming for your money? ;))) [22:07:28] Thiemo_WMDE: gimmeh all the bitcoinz [22:07:36] how much? [22:08:40] yeah that worked [22:08:55] :) [22:09:24] ok folks, off to bed.
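A cleaned-up version of the two stubbing approaches DanielK_WMDE pastes above (the IRC-pasted mock lines are missing closing parentheses); the property IDs and data type names are only examples, and whether the test class still needs @group Database depends on what else it touches, as noted earlier in the channel.

    use Wikibase\DataModel\Entity\PropertyId;

    // Option 1: in-memory lookup, no database access at all.
    $lookup = new InMemoryDataTypeLookup();
    $lookup->setDataTypeForProperty( new PropertyId( 'P15' ), 'string' );
    $lookup->setDataTypeForProperty( new PropertyId( 'P23' ), 'commonsMedia' );

    // Option 2: a PHPUnit mock with a callback, so a single stub
    // answers for every property used by the fixtures.
    $types = array( 'P15' => 'string', 'P23' => 'commonsMedia' ); // example mapping
    $mock = $this->getMock( 'Wikibase\DataModel\Entity\PropertyDataTypeLookup' );
    $mock->expects( $this->any() )
        ->method( 'getDataTypeIdForProperty' )
        ->will( $this->returnCallback( function( PropertyId $id ) use ( $types ) {
            return $types[ $id->getSerialization() ];
        } ) );

Either lookup can then be handed to whatever builds the serializer, so the test never has to resolve property data types from the repo database in the first place.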
[22:09:26] I may actually keep that in memory store code for later tests since RdfBuilder has lookups too [22:09:32] DanielK_WMDE: thanks for your help [22:52:08] is there a way to action=purge all items and pages? [22:52:38] A bot could do that. [22:53:13] sjoerddebruin: right! [22:56:51] despens: we essentially do that sometimes when we deploy new code [22:56:54] gah [22:57:32] aude: good to know this is professional practice :) [22:58:18] we avoid doing it, if possible, but necessary when change parts of the layout etc [22:58:35] * aude wouldn't do this on wikipedia [22:59:09] aude: is this bot's code published? [22:59:28] despens: it's just a setting, to invalidate parser cache [23:00:39] aude: where can i find this setting? [23:01:22] it's $wgCacheEpoch which goes in local settings [23:02:24] thnx!! [23:02:30] sure [23:25:12] PROBLEM - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1459 bytes in 0.178 second response time [23:28:29] [13Diff] 15JeroenDeDauw comment on commit 14acd15a0: ![pfft](https://cloud.githubusercontent.com/assets/146040/6517064/9ab97f4a-c397-11e4-8d0e-925b5c32f66c.jpg)... 02http://git.io/xA5W [23:29:04] [13WikibaseDataModel] 15JeroenDeDauw created 06revert-390-noVendor (+1 new commit): 02http://git.io/xA5w [23:29:04] 13WikibaseDataModel/06revert-390-noVendor 14d8036ba 15Jeroen De Dauw: Revert "Don't access vendor/bin in composer ci" [23:29:11] [13WikibaseDataModel] 15JeroenDeDauw closed pull request #397: Revert "Don't access vendor/bin in composer ci" (06master...06revert-390-noVendor) 02http://git.io/xA5r [23:30:23] wmde/WikibaseDataModel/revert-390-noVendor/d8036ba : Jeroen De Dauw The build has errored. http://travis-ci.org/wmde/WikibaseDataModel/builds/53271363 [23:30:35] [13Diff] 15JeroenDeDauw created 06revert-37-noVendor (+1 new commit): 02http://git.io/xAdk [23:30:35] 13Diff/06revert-37-noVendor 14f6cbad8 15Jeroen De Dauw: Revert "Don't access vendor/bin in composer ci" [23:30:48] [13Diff] 15JeroenDeDauw 04deleted 06revert-37-noVendor at 14f6cbad8: 02http://git.io/xAdZ [23:35:46] aude: Any idea why we had two dispatch lag problems tonight? [23:35:52] Dispatcher looks fine to me [23:35:58] also not to many edits [23:42:26] (03PS1) 10Smalyshev: Refactor tests to be data-driven [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/194750 [23:51:11] [13WikibaseDataModel] 15JeroenDeDauw created 06empyrefs (+1 new commit): 02http://git.io/xApq [23:51:11] 13WikibaseDataModel/06empyrefs 14b6e4cf1 15jeroendedauw: Ignore empty references when added to ReferenceList [23:51:41] [13WikibaseDataModel] 15JeroenDeDauw opened pull request #398: Ignore empty references when added to ReferenceList (06master...06empyrefs) 02http://git.io/xApc [23:53:04] wmde/WikibaseDataModel/empyrefs/b6e4cf1 : jeroendedauw The build passed. http://travis-ci.org/wmde/WikibaseDataModel/builds/53273649 [23:57:52] [13Diff] 15JeroenDeDauw pushed 1 new commit to 06master: 02http://git.io/xAjV [23:57:52] 13Diff/06master 14a92cf40 15Jeroen De Dauw: Update .travis.yml
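For reference on the purge question near the end of the log: aude's $wgCacheEpoch hint is a LocalSettings.php setting, so a minimal sketch looks like the lines below. The explicit date is just an example; the filemtime variant is the commonly seen pattern for making every change to the settings file act as a site-wide parser cache purge.

    # LocalSettings.php: parser cache output older than this TS_MW timestamp
    # is considered stale and re-rendered on the next view.
    # Either pin an explicit date...
    # $wgCacheEpoch = '20150305000000';
    # ...or tie it to the mtime of this file, so that touching LocalSettings
    # invalidates everything parsed before that moment:
    $wgCacheEpoch = max( $wgCacheEpoch, gmdate( 'YmdHis', @filemtime( __FILE__ ) ) );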