[00:32:44] something seems broken now
[00:32:58] I added Commons: https://www.wikidata.org/wiki/Q100003
[00:33:13] but here it stays not connected: https://commons.wikimedia.org/wiki/Category:Gronsveld
[04:36:50] .
[04:36:54] !nyan
[04:36:54] ~=[,,_,,]:3
[05:13:56] hehehe
[05:13:58] !nyan
[05:13:58] ~=[,,_,,]:3
[05:14:03] d'awwwww
[07:15:44] (PS4) Henning Snater: Major rewrite of SpecialSetLabelDescriptionAliases [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191570 (owner: Thiemo Mättig (WMDE))
[07:28:16] (PS5) Henning Snater: Major rewrite of SpecialSetLabelDescriptionAliases [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191570 (owner: Thiemo Mättig (WMDE))
[07:51:43] How do I set a target as a string in Pywikibot (i.e. pywikibot.?)
[08:14:25] Hi! When processing MW API results for data properties of https://www.wikidata.org/wiki/Q14657 I see strange data structures. For example, there is a "country" / "Sweden" / "end time" subproperty with the value 'time': '+00000001721-01-01T00:00:00Z'
[08:15:06] While http://www.wikidata.org/wiki/Q490 has the same "country" / "Italy" / "end time" subproperty set to no value.
[08:15:27] There is nothing wrong with that; however, the API result contains a list of values, not a single one.
[08:15:54] That means it's possible to have both "no value" and a time value as values of the same subproperty.
[08:15:59] Why is that?
[08:16:17] How can I reliably detect whether it is "no value" or a real time value?
[08:21:52] Ok, I made the assumption that when there is only one value and its value is 'novalue', then it's a fake subproperty.
[08:39:23] https://www.wikidata.org/w/index.php?title=Q4633448&diff=prev&oldid=197995978
[08:39:26] My bot's alive
[08:39:28] It's aliiiive
[08:44:12] MediaWiki-extensions-WikibaseRepository, Wikidata: update documentation of high-level data model - https://phabricator.wikimedia.org/T75603#1052472 (Snaterlicious) Trying to capture a high-level, conceptual view on the data model; I would like to use that for documentation. Please comment: {F44112} (The low-l...
[08:48:03] hoi hoo
[08:48:34] hi GerardM-
[08:49:05] my surprise is that these elections are seen as politicians, because I always use claim[31:5] when I add positions
[08:49:51] by the way, Reasonator just went down
[08:50:09] Time to debug the bot, how fun D:
[08:51:11] such things only happen when things really turn into the abyss
[08:51:44] I am not that bothered.. with many million edits Reasonator is quite stable
[08:52:14] I hardly use Reasonator
[08:56:04] your loss
[08:56:18] I promise you that you are more effective at Wikidata with Reasonator
[08:56:56] I promise you that you are more effective at Wikidata with Reasonator
[08:57:37] I could not have done these numbers of edits with this quality without Reasonator and its associated tools
[08:58:20] without Reasonator you do not have a clue what statements exist on an item with many statements
[08:58:42] it is too much, all over the page and unsorted
[09:00:11] for me Wikidata is only for editing
[09:07:25] how else do you find 7,384 painters who are not known as such?
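(Editor's aside: a minimal Pywikibot sketch for the 07:51:43 question above. For a property whose datatype is "string", the target passed to Claim.setTarget() is just a plain Python string; the property id P123 and the value below are placeholders, not taken from the conversation.)

    # Sketch: add a claim with a string value in Pywikibot.
    # Assumes the property (P123 here is hypothetical) has datatype "string".
    import pywikibot

    site = pywikibot.Site('wikidata', 'wikidata')
    repo = site.data_repository()
    item = pywikibot.ItemPage(repo, 'Q42')

    claim = pywikibot.Claim(repo, 'P123')  # a string-datatype property
    claim.setTarget('some string value')   # plain str, not an ItemPage
    item.addClaim(claim, summary='Adding a string claim')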
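(Editor's aside: on the 08:14-08:21 exchange above, the reliable test is the snak's "snaktype" field, not the shape of the value list: every snak in the API output is marked "value", "novalue" or "somevalue", and only "value" snaks carry a "datavalue". A sketch over wbgetentities-style JSON; P582 is the real "end time" property, the function name is illustrative.)

    # Sketch: distinguish an explicit "no value" from a real time value.
    def end_time_qualifiers(claim):
        """Yield end-time ("end time" = P582) qualifier values of one claim."""
        for snak in claim.get('qualifiers', {}).get('P582', []):
            if snak['snaktype'] == 'value':
                yield snak['datavalue']['value']['time']  # e.g. '+00000001721-01-01T00:00:00Z'
            elif snak['snaktype'] == 'novalue':
                yield None  # explicitly "no value", not merely missing data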
[09:12:13] [WikibaseDataModel] thiemowmde created altFallBackDocs (+1 new commit): http://git.io/AZfS
[09:12:13] WikibaseDataModel/altFallBackDocs f626e1d Thiemo Mättig: More descriptive Term/AliasGroup docs
[09:12:51] [WikibaseDataModel] thiemowmde opened pull request #384: More descriptive Term/AliasGroup docs (master...altFallBackDocs) http://git.io/AZfN
[09:14:56] (CR) Thiemo Mättig (WMDE): [C: 1] "Thanks for the fix, Henning." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191570 (owner: Thiemo Mättig (WMDE))
[09:20:15] wmde/WikibaseDataModel/altFallBackDocs/f626e1d : Thiemo Mättig The build passed. http://travis-ci.org/wmde/WikibaseDataModel/builds/51485869
[09:36:03] (PS1) Hoo man: Update lua.wiki docs with latest changes from MediaWiki.org [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191859
[09:39:55] (PS2) Hoo man: Move duplicate code to UsageAccumulator classes [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191335 (owner: Thiemo Mättig (WMDE))
[09:41:52] (CR) Hoo man: [C: -1] "-1 to get your attention" (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191335 (owner: Thiemo Mättig (WMDE))
[09:42:32] (CR) jenkins-bot: [V: -1] Move duplicate code to UsageAccumulator classes [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191335 (owner: Thiemo Mättig (WMDE))
[09:54:19] when is the moment to delete an item like https://www.wikidata.org/wiki/Q19324483 ??
[09:55:19] MediaWiki-extensions-WikibaseRepository, Wikidata: Special:EntitiesWithoutLabel shouldn't use numerical offsets - https://phabricator.wikimedia.org/T90098#1052542 (hoo) NEW
[10:00:34] Wikidata, wikidata-query-service: Analyze indexes used for the engine - https://phabricator.wikimedia.org/T87307#1052561 (Smalyshev) stalled>Resolved Closing as not relevant after Titan's demise.
[10:01:08] Wikidata, wikidata-query-service: Update links/refs Data model according to discussed on MW Summit - https://phabricator.wikimedia.org/T87627#1052563 (Smalyshev) Open>declined
[10:03:06] (CR) Aude: [C: 2] Update lua.wiki docs with latest changes from MediaWiki.org [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191859 (owner: Hoo man)
[10:05:34] (Merged) jenkins-bot: Update lua.wiki docs with latest changes from MediaWiki.org [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191859 (owner: Hoo man)
[10:09:17] DanielK_WMDE: https://phabricator.wikimedia.org/T89169
[10:18:58] MediaWiki-extensions-WikibaseRepository, Wikidata: Undefined index in HtmlTimeFormatter.php on line 111 - https://phabricator.wikimedia.org/T90100#1052606 (aude) NEW
[10:23:42] MediaWiki-extensions-WikibaseRepository, Wikidata: Undefined index in HtmlTimeFormatter.php on line 111 - https://phabricator.wikimedia.org/T90100#1052619 (hoo) That was me testing on testwikidata... https://test.wikidata.org/wiki/Special:EntityData/Q68.json
[10:30:45] Wikidata-Quality, Wikidata: Unique Value Constraint Check - https://phabricator.wikimedia.org/T90102#1052646 (Jonas.keutel) NEW a: Andreasburmeister
[10:44:08] * aude glares at hoo :)
[10:45:28] huh? Shall I come over?
[10:46:08] hoo testing on test.wikidata...
[10:46:30] Yeah, I'm evil like that :P
[10:46:38] Hi! http://www.wikidata.org/wiki/Q9009 has the "dissolved or abolished" property as if it no longer exists. However, the city actually exists.
[10:47:04] May I delete this property?
[10:47:35] Wikidata: RDF mapping should not assert that .../entity/Q123 is-a Wikidata item - https://phabricator.wikimedia.org/T89949#1052731 (daniel) Nik tells me that the HA features in Virtuoso are only available in the closed source enterprise version. That basically means WMF is not going to use it.
[10:48:34] Here's a really abolished city, which is correct: https://www.wikidata.org/wiki/Q243456
[10:52:56] Wikidata-Quality, Wikidata: Handle constraint parameter "now" - https://phabricator.wikimedia.org/T90107#1052736 (Jonas.keutel) NEW
[10:53:10] MediaWiki-extensions-WikibaseRepository, Wikidata: Undefined index in HtmlTimeFormatter.php on line 111 - https://phabricator.wikimedia.org/T90100#1052751 (aude) this is already fixed on master: https://gerrit.wikimedia.org/r/#/c/190998/
[10:53:19] MediaWiki-extensions-WikibaseRepository, Wikidata: Undefined index in HtmlTimeFormatter.php on line 111 - https://phabricator.wikimedia.org/T90100#1052752 (aude) Open>Resolved
[11:07:27] Found a whole bunch of "dissolved or abolished" cities which actually do exist.
[11:08:01] I will probably need to disable that filter in the script. However, some actually dissolved or abolished cities will then get into the list of modern cities.
[11:08:25] QuestPC: Whatever wrong data you find, feel free to correct it
[11:08:58] hoo: I am still not sure - imagine the town actually was abolished during some major war, then rebuilt from scratch?
[11:09:26] mh... you could add a qualifier that states that, if you have sources
[11:09:30] hoo: However, maybe there should be separate properties for a city that was abolished and not rebuilt and a city which was abolished and rebuilt.
[11:09:52] I doubt that... I think that should be expressed with qualifiers
[11:10:02] hoo: I don't know; it's a commercial project, and I am afraid that checking the sources / history of each city will take a lot of time.
[11:11:45] MediaWiki-Maintenance-scripts, § Wikidata-Sprint-2015-01-21, § Wikidata-Sprint-2015-01-08, § Wikidata-Sprint-2015-02-03, Wikidata, MediaWiki-Sites: Maintenance script for importing site definitions. - https://phabricator.wikimedia.org/T87176#1052820 (hoo) Open>Resolved Patch merged
[11:11:47] § Wikidata-Sprint-2015-01-21, § Wikidata-Sprint-2015-02-03, Wikidata: document xml schema for sites definition - https://phabricator.wikimedia.org/T87183#1052822 (hoo)
[11:12:06] § Wikidata-Sprint-2015-01-21, § Wikidata-Sprint-2015-02-03, Wikidata: document xml schema for sites definition - https://phabricator.wikimedia.org/T87183#1052825 (hoo) Open>Resolved Patch merged
[11:13:07] QuestPC: What do you mean with commercial project?
[11:17:13] hoo: I need the list of modern countries / cities for a commercial project, so they will not want to wait for too long.
[11:17:31] hoo: Still, I hope my info will help you a bit.
[11:17:37] my reports.
[11:19:43] It seems that lots of existing German cities which have "dissolved or abolished" are related to Napoleonic rule in Germany around 1806. So that's "politically dissolved or abolished", but not "physically", as with some Ancient Roman cities.
[11:21:21] So probably that value is picked up by some bot; however, the bot just cannot understand the difference.
[11:22:38] MediaWiki-Maintenance-scripts, § Wikidata-Sprint-2015-01-21, § Wikidata-Sprint-2015-02-03, Wikidata, MediaWiki-Sites: Maintenance script for exporting site definitions. - https://phabricator.wikimedia.org/T87178#1052864 (hoo) Open>Resolved Patch merged
[11:22:39] § Wikidata-Sprint-2015-01-21, § Wikidata-Sprint-2015-02-03, Wikidata: document xml schema for sites definition - https://phabricator.wikimedia.org/T87183#1052866 (hoo)
[11:23:11] [Time] thiemowmde force-pushed pad4Year from 63cb30d to 176fa98: http://git.io/AZK1
[11:23:11] Time/pad4Year 5588673 Thiemo Mättig: Enforce minimal year padding to 4 digits
[11:23:11] Time/pad4Year 4a578ee Thiemo Mättig: Simplify data providers
[11:23:11] Time/pad4Year 176fa98 Thiemo Mättig: Update docs
[11:23:14] Lydia_WMDE: DanielK_WMDE: Do we care about the workboards of past sprints?
[11:23:20] Some tasks have two sprints assigned
[11:23:28] hoo: that is how it should be
[11:23:41] if they were in a past sprint they should keep the tag
[11:24:18] So a sprint workboard gets "frozen" after the sprint ended?
[11:24:25] pretty much
[11:24:34] ok
[11:29:48] [Time] thiemowmde created timeDocsAgain (+1 new commit): http://git.io/AZi7
[11:29:48] Time/timeDocsAgain 492c762 Thiemo Mättig: More specific TimeValue::$precision documentation
[11:31:30] [Time] thiemowmde opened pull request #36: More specific TimeValue::$precision documentation (master...timeDocsAgain) http://git.io/AZPu
[11:33:39] [Time] thiemowmde created timeNonEmpty (+1 new commit): http://git.io/AZXZ
[11:33:39] Time/timeNonEmpty 8e2e7cd Thiemo Mättig: Disallow empty strings in TimeValue
[11:34:27] (CR) Daniel Kinzler: Implement RDF export for items and RDF dumps (Turtle only for now) (8 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 (owner: Smalyshev)
[11:38:45] [Time] thiemowmde opened pull request #37: Disallow empty strings in TimeValue (master...timeNonEmpty) http://git.io/AZ1j
[11:44:18] MediaWiki-extensions-WikibaseClient, Wikidata: Improve interlanguage links for talk pages - https://phabricator.wikimedia.org/T30604#1052904 (aude) https://www.mediawiki.org/wiki/Requests_for_comment/Associated_namespaces might be of some use or help with this, to know in a more structured way which namespace...
[11:48:15] [Time] thiemowmde force-pushed calModel from 16c6cc0 to 6ef2a23: http://git.io/AZyb
[11:48:15] Time/calModel 6ef2a23 Thiemo Mättig: Correct calendar model documentation
[11:50:50] (CR) Hoo man: Always link to the repo if we don't have any language links (13 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/168632 (https://phabricator.wikimedia.org/T61391) (owner: Hoo man)
[11:54:55] (PS10) Hoo man: Always link to the repo if we don't have any language links [extensions/Wikibase] - https://gerrit.wikimedia.org/r/168632 (https://phabricator.wikimedia.org/T61391)
[11:55:40] (CR) Hoo man: "Addressed Daniel's comments, removed the last use case for the nolanglinks css." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/168632 (https://phabricator.wikimedia.org/T61391) (owner: Hoo man)
[11:56:25] [Time] thiemowmde force-pushed gregorian from 0079ed3 to 5c2c0fd: http://git.io/AZHr
[11:56:26] Time/gregorian 5c2c0fd Thiemo Mättig: Do not always show "(Gregorian)" suffix
[12:03:40] (CR) Henning Snater: [C: 2] Major rewrite of SpecialSetLabelDescriptionAliases [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191570 (owner: Thiemo Mättig (WMDE))
[12:04:48] hoo: https://gerrit.wikimedia.org/r/#/c/190409/
[12:05:26] (CR) Daniel Kinzler: Implement RDF export for items and RDF dumps (Turtle only for now) (2 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 (owner: Smalyshev)
[12:07:03] (Merged) jenkins-bot: Major rewrite of SpecialSetLabelDescriptionAliases [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191570 (owner: Thiemo Mättig (WMDE))
[12:08:39] [WikibaseDataModel] thiemowmde pushed 1 new commit to master: http://git.io/AZdC
[12:08:39] WikibaseDataModel/master 7fbdb7c Thiemo Mättig: Update RELEASE-NOTES.md
[12:10:51] Wikidata: RDF mapping should not assert that .../entity/Q123 is-a Wikidata item - https://phabricator.wikimedia.org/T89949#1052955 (mkroetzsch) >>! In T89949#1052731, @daniel wrote: > Nik tells me that the HA features in Virtuoso are only available in the closed source enterprise version. That basically means...
[12:17:50] wikimedia/mediawiki-extensions-Wikibase/master/364c306 : jenkins-bot The build is still failing. http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/51501404
[13:05:01] (PS11) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393
[13:17:55] Wikidata: RDF mapping should not assert that .../entity/Q123 is-a Wikidata item - https://phabricator.wikimedia.org/T89949#1053063 (Manybubbles) >>! In T89949#1052955, @mkroetzsch wrote: >>>! In T89949#1052731, @daniel wrote: >> Nik tells me that the HA features in Virtuoso are only available in the closed so...
[13:25:13] [WikibaseDataModel] thiemowmde created idParserTest (+1 new commit): http://git.io/AncX
[13:25:13] WikibaseDataModel/idParserTest 137b07e Thiemo Mättig: Rename/refactor DispatchingEntityIdParserTest
[13:26:37] [WikibaseDataModel] thiemowmde opened pull request #385: Rename/refactor DispatchingEntityIdParserTest (master...idParserTest) http://git.io/AnCw
[13:27:30] wmde/WikibaseDataModel/idParserTest/137b07e : Thiemo Mättig The build failed. http://travis-ci.org/wmde/WikibaseDataModel/builds/51509302
[13:31:56] [WikibaseDataModel] thiemowmde created idParserExceptions (+1 new commit): http://git.io/Anl1
[13:31:56] WikibaseDataModel/idParserExceptions c9bf392 Thiemo Mättig: More specific exception messages in DispatchingEntityIdParser
[13:34:21] wmde/WikibaseDataModel/idParserExceptions/c9bf392 : Thiemo Mättig The build passed. http://travis-ci.org/wmde/WikibaseDataModel/builds/51510045
[13:36:22] [WikibaseDataModel] thiemowmde opened pull request #386: More specific exception messages in DispatchingEntityIdParser (master...idParserExceptions) http://git.io/An4q
[13:37:54] Wikidata-Quality, Wikidata: check if references are missing - https://phabricator.wikimedia.org/T90136#1053130 (soeren.oldag) NEW
[13:38:24] Wikidata-Quality, Wikidata: Check if references are missing - https://phabricator.wikimedia.org/T90136#1053130 (soeren.oldag)
[13:42:22] [WikibaseDataModel] thiemowmde force-pushed idParserExceptions from c9bf392 to 62ac7fd: http://git.io/AnRG
[13:42:22] WikibaseDataModel/idParserExceptions 62ac7fd Thiemo Mättig: More specific exception messages in DispatchingEntityIdParser
[13:45:23] [WikibaseDataModel] thiemowmde force-pushed idParserTest from 137b07e to a5086f2: http://git.io/An0g
[13:45:23] WikibaseDataModel/idParserTest a5086f2 Thiemo Mättig: Rename/refactor DispatchingEntityIdParserTest
[13:48:04] wmde/WikibaseDataModel/idParserTest/a5086f2 : Thiemo Mättig The build was fixed. http://travis-ci.org/wmde/WikibaseDataModel/builds/51511751
[14:00:15] (PS12) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393
[14:02:44] (CR) jenkins-bot: [V: -1] Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 (owner: Smalyshev)
[14:05:06] (PS13) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393
[14:11:48] Wikidata: Fatal error: Class 'Wikibase\Utils' not found in PropertySuggester/GetSuggestions.php on line 153 - https://phabricator.wikimedia.org/T90137#1053177 (aude) NEW
[14:12:24] Wikidata: Fatal error: Class 'Wikibase\Utils' not found in PropertySuggester/GetSuggestions.php on line 153 - https://phabricator.wikimedia.org/T90137#1053184 (aude) https://gerrit.wikimedia.org/r/#/c/190485/ removed the method and the class has also since been removed / renamed.
[14:20:12] Wikidata: Fatal error: Class 'Wikibase\Utils' not found in PropertySuggester/GetSuggestions.php on line 153 - https://phabricator.wikimedia.org/T90137#1053207 (aude) https://github.com/Wikidata-lib/PropertySuggester/pull/117
[14:21:12] with --group Wikibase, tests now pass :)
[14:22:13] [WikibaseDataModel] thiemowmde force-pushed removeClaim from 2f017a0 to e5e7e4a: http://git.io/qLLyQg
[14:22:13] WikibaseDataModel/removeClaim 50b1f14 Thiemo Mättig: Stop Statement requiring a Claim
[14:22:13] WikibaseDataModel/removeClaim b7baee8 jeroendedauw: Remove Claim class
[14:22:13] WikibaseDataModel/removeClaim e5e7e4a Thiemo Mättig: Revert variable name changes and merge conflict fragments
[14:29:42] [WikibaseDataModel] thiemowmde created useClaim (+1 new commit): http://git.io/An9i
[14:29:42] WikibaseDataModel/useClaim 5a76574 Thiemo Mättig: Drop all unused imports in the 3.0 branch
[14:30:09] [WikibaseDataModel] thiemowmde opened pull request #387: Drop all unused imports in the 3.0 branch (3.0.x-dev...useClaim) http://git.io/An9Q
[14:31:00] hoo: have you seen the "Can't connect to local MySQL server through socket '/tmp/mysql.sock'" error for hhvm in travis?
[14:31:11] known issue that we fixed in other git repos?
[14:32:01] (PS14) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393
[14:40:01] aude: Don't think so
[14:40:31] hoo: ok
[14:40:37] https://travis-ci.org/Wikidata-lib/PropertySuggester/jobs/51515653
[14:40:45] shall investigate
[14:49:38] hi, folks! does anyone know if there's a tool to see which pages in a category *don't* have a Wikidata item?
[14:49:54] i'm lazy today, and i have a category with 80 pages but 73 wikidata items... i don't wanna go through them manually to find out
[14:50:28] [WikibaseDataModel] thiemowmde created dupeCode (+1 new commit): http://git.io/AnpT
[14:50:28] WikibaseDataModel/dupeCode e088104 Thiemo Mättig: Move implementations from Claim to Statement
[14:51:11] Jhs: that sounds really interesting to do
[14:52:07] Jhs: can't autolist do that?
[14:52:19] Jhs: I don't know the category, but I can help you :)
[14:52:40] [WikibaseDataModel] thiemowmde opened pull request #388: Move implementations from Claim to Statement (3.0.x-dev...dupeCode) http://git.io/AnhJ
[14:52:46] DanielK_WMDE, it's autolist2 that only finds 73 items on 80 pages
[14:53:41] Jhs: and it can't show you the difference?...
[14:53:48] too bad :/
[14:53:54] no, it only shows the ones with items
[14:54:16] i'm using it on the no.wikipedia category Kategori:Jernbanestasjoner_på_Østfoldbanen
[14:54:21] should be easy to do in the database via page_props
[14:56:03] i'm guessing https://tools.wmflabs.org/wikidata-todo/creator.html can do it, it will probably ignore existing pages (anything else wouldn't make sense)
[14:56:24] (CR) JanZerebecki: [C: 2] Set Change timestamp based on RC timestamp. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191649 (owner: Daniel Kinzler)
[14:57:15] wmde/WikibaseDataModel/dupeCode/e088104 : Thiemo Mättig The build failed. http://travis-ci.org/wmde/WikibaseDataModel/builds/51519567
[14:58:45] Jhs: http://en.wikipedia.org/w/api.php?action=query&generator=categorymembers&gcmtitle=Category:Australian_Labor_Party_politicians&list=pageswithprop&pwppropname=wikibase_item&pwpprop=value|title
[14:58:46] (Merged) jenkins-bot: Set Change timestamp based on RC timestamp. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191649 (owner: Daniel Kinzler)
[14:58:55] tells you which ones have an item
[14:59:12] not sure / don't think it easily tells you which don't
[14:59:52] but there might be a way to formulate it with the api
[15:04:04] § Wikidata-Sprint-2015-02-03, MediaWiki-extensions-WikibaseClient, Wikidata: Lua: Create a mw.wikibase.renderSnak method for rendering arbitrary Snaks - https://phabricator.wikimedia.org/T76213#1053370 (daniel) Open>Resolved
[15:04:05] MediaWiki-extensions-WikibaseRepository, MediaWiki-extensions-WikibaseClient, Wikidata: data quality and trust - https://phabricator.wikimedia.org/T76230#1053372 (daniel)
[15:04:06] MediaWiki-extensions-WikibaseClient, Wikidata: Allow Snak rendering in user language instead of content language for multilingual wikis - https://phabricator.wikimedia.org/T88924#1053371 (daniel)
[15:04:43] (PS1) JanZerebecki: New Wikidata build 2015-02-20 [extensions/Wikidata] - https://gerrit.wikimedia.org/r/191884
[15:06:27] wikimedia/mediawiki-extensions-Wikibase/master/5817775 : jenkins-bot The build was fixed. http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/51520463
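(Editor's aside: a sketch of the API-only approach hinted at in the exchange above: run generator=categorymembers together with prop=pageprops and report the pages where the wikibase_item page prop is absent. The requests library and the category from the conversation are used for illustration; continuation handling is omitted for brevity.)

    # Sketch: list pages in a category that are not connected to a Wikidata item.
    import requests

    API = 'https://no.wikipedia.org/w/api.php'
    params = {
        'action': 'query',
        'generator': 'categorymembers',
        'gcmtitle': 'Kategori:Jernbanestasjoner på Østfoldbanen',
        'gcmlimit': 'max',
        'prop': 'pageprops',
        'ppprop': 'wikibase_item',
        'format': 'json',
    }
    pages = requests.get(API, params=params).json()['query']['pages']
    for page in pages.values():
        if 'pageprops' not in page:  # no wikibase_item prop -> not connected
            print(page['title'])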
[15:07:00] (CR) jenkins-bot: [V: -1] New Wikidata build 2015-02-20 [extensions/Wikidata] - https://gerrit.wikimedia.org/r/191884 (owner: JanZerebecki)
[15:07:14] MediaWiki-extensions-WikibaseClient, Wikidata: For "special" precision in Wikidata interface, show dimension in meters - https://phabricator.wikimedia.org/T89218#1053382 (thiemowmde) +/-0.014000802722428° is something between 0 and 1555.644746936 meters. How is that helpful?
[15:07:28] (CR) Daniel Kinzler: [C: -1] "Looks good so far, but needs tests for the new functionality." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393 (owner: Smalyshev)
[15:08:15] (Abandoned) JanZerebecki: MWException -> Exception [extensions/Wikidata] - https://gerrit.wikimedia.org/r/184023 (owner: Ori.livneh)
[15:09:20] aude, DanielK_WMDE: Magnus' creator tool fixed it. Although it didn't preload the category like it should, it worked when i copypasted the list of pages
[15:10:16] Jhs: at least it's something
[15:10:26] there certainly could be a tool to make this easier
[15:13:38] Wikidata: Provide test data for development installations - https://phabricator.wikimedia.org/T90148#1053402 (daniel) NEW
[15:14:07] Lydia_WMDE: --^
[15:14:24] DanielK_WMDE: there are the elements
[15:14:37] think it still might work to import them, but they don't have claims
[15:14:48] since we had elements from the time before claims existed
[15:14:53] aude: yes, and the script we used to import them is horrible, and i want to kill it
[15:14:57] heh :)
[15:15:03] replace it*
[15:15:08] or with import
[15:15:13] replace? by what?
[15:15:18] yea, importDump.php
[15:15:25] another script? or import dump
[15:15:28] there is now a config switch that enables entity import
[15:15:38] that already works
[15:15:39] that might only work for empty wikis
[15:15:42] does it?
[15:15:45] yes
[15:15:55] elements work for any wiki
[15:16:45] because they don't do anything meaningful
[15:17:59] they have all the site links and easily can have properties + claims
[15:18:05] statements*
[15:19:27] * aude would do it with a bot :)
[15:19:56] does anyone know how to kickstart ToolScript ?
[15:20:42] GerardM-: don't know :(
[15:20:57] Hallo.
[15:20:59] First time I encounter a protected item page on Wikidata: https://www.wikidata.org/wiki/Q16503
[15:21:27] And I don't see any indication that it's protected, except the lack of the edit links.
[15:21:42] (CR) Daniel Kinzler: [C: 2] Drop unused code from deprecated Wikibase\Term class [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191016 (owner: Thiemo Mättig (WMDE))
[15:21:45] it can be a temporary bug
[15:21:50] it happens
[15:22:40] The explanation from 2013 is that the protection is "To disable the automated page-move-change-sitelink that happens twice in a month"
[15:23:03] I don't completely understand what the problem is, but I suspect that it's not relevant any more.
[15:23:14] aharoni: yea, we have a bug open for the protection indicator issue, i think.
[15:23:19] it's a problem with caching...
[15:23:26] DanielK_WMDE: Thanks.
[15:23:43] Wikidata: Provide test data for development installations - https://phabricator.wikimedia.org/T90148#1053440 (aude) I would make this a maintenance script that pulls the items via the wikidata api. if there are id conflicts for the properties, they can be renumbered and any usages of the properties (or refer...
[15:23:43] GerardM-: url?
[15:23:46] aharoni: for the editorial issue of if and why this should be protected: no idea
[15:23:52] I am more curious about the puzzling protection explanation. Does this sound like something that is still relevant?
[15:23:59] Sounds more technical than editorial.
[15:24:08] But I can bring it up on its talk page.
[15:24:29] Wikidata: Provide test data for development installations - https://phabricator.wikimedia.org/T90148#1053443 (aude) and sites table contents should be available via api, although populateSitesTable script generally works fine as well to replicate the contents of wikidata's sites table.
[15:24:30] (Merged) jenkins-bot: Drop unused code from deprecated Wikibase\Term class [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191016 (owner: Thiemo Mättig (WMDE))
[15:24:45] aharoni: you need to purge
[15:24:48] aharoni: ah, i think i know what it is.
[15:24:48] it's a bug
[15:25:04] aharoni: when you move a page on wikipedia, the sitelink automatically gets adjusted on wikidata
[15:25:25] https://phabricator.wikimedia.org/T85252
[15:25:30] DanielK_WMDE: Is it a problem? What's special about this page? Lots of pages move all the time.
[15:25:39] aharoni: but if you move a page to archive it, you don't want that. You don't want the sitelink to go to the archive page.
[15:25:42] GerardM-: or enough information so i can reproduce?
[15:26:14] aharoni: i *think* the archiving uses page moves, so the history of the page doesn't grow too big (which causes trouble when deleting, among other things)
[15:27:14] jzerebecki: https://travis-ci.org/Wikidata-lib/PropertySuggester/jobs/51515653
[15:27:32] might be something about travis, our setup scripts for it, or hhvm, not sure
[15:28:41] retriggering https://travis-ci.org/Wikidata-lib/PropertySuggester/builds/48742558
[15:28:51] aharoni: https://phabricator.wikimedia.org/T70947
[15:32:33] jzerebecki: the old jobs for PropertySuggester don't work, since there is an incompatibility with wikibase master
[15:33:10] although the hhvm error is encountered before even running tests
[15:33:36] DB connection error: Can't connect to local MySQL server through socket '/tmp/mysql.sock' (2) ().
[15:36:49] aude: that generally means mysql isn't running, or the socket is not located in the right place
[15:36:59] could also mean the system is out of handles, though i find that unlikely
[15:37:14] strange, as this error doesn't happen for wikibase jobs on travis
[15:37:35] maybe something in the setup actually kills mysql?
[15:38:06] perhaps
[15:38:30] could also be a file permission issue
[15:38:44] https://stackoverflow.com/questions/5376427/cant-connect-to-local-mysql-server-through-socket-var-mysql-mysql-sock-38
[15:39:21] aude: hm actually, /tmp/mysql.sock? Really? That's a strange path, the standard seems to be /var/lib/mysql/mysql.sock
[15:39:48] haha, suggested fix: ln -s /var/lib/mysql/mysql.sock /tmp/mysql.sock
[15:42:37] DanielK_WMDE: interesting...
[15:44:31] Wikidata, wikidata-query-service: Install Java 8 on a Jenkins node - https://phabricator.wikimedia.org/T85964#1053581 (hashar) Resolved>declined I have removed Java 8 on all Jenkins slaves per comment on https://gerrit.wikimedia.org/r/#/c/183222/ Will remove the Wikidata/Gremlin jobs as well since they...
[15:45:42] Wikidata: update and consolidate json dump documentation - https://phabricator.wikimedia.org/T87329#1053591 (Lucie) So far, there is no part talking about the snaks-order. Was that on purpose or just because it wasn't there at the point it got last updated? So, should I add a part about snak-orders? Fixing t...
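(Editor's aside: on the socket error discussed above - besides the symlink workaround, a client that insists on /tmp/mysql.sock can usually be pointed at the real socket explicitly. A sketch assuming the PyMySQL client and the Debian default socket path; adjust both to the actual setup.)

    # Sketch: connect through an explicit unix socket instead of the
    # client's compiled-in default (/tmp/mysql.sock in the error above).
    import pymysql

    conn = pymysql.connect(
        unix_socket='/var/run/mysqld/mysqld.sock',  # wherever mysqld actually listens
        user='root',
        password='',
        database='testdb',
    )
    print(conn.get_server_info())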
[15:46:25] Lydia_WMDE: are we still enabling language links for wikibooks on tuesday?
[15:46:30] * aude sees no reason not to
[15:46:39] aude: yes, still on track :)
[15:48:13] yay :)
[15:55:19] Wikidata: update and consolidate json dump documentation - https://phabricator.wikimedia.org/T87329#1053670 (daniel) yes, please document how we represent order, if we do. There are four places where we (should) do this: # The order of statements (implies the order of statement groups) # The order of qualifie...
[15:55:49] good morning
[15:55:56] DanielK_WMDE: hi
[15:56:10] hey dennyvrandecic!
[15:57:49] had time to check the rdf serialization and want to chat about it?
[15:59:51] dennyvrandecic: caught up in code reviews right now
[16:00:05] sure, just ping me whenever you want :)
[16:00:51] dennyvrandecic: stas has been working on rdf export from php code. maybe have a look at that: https://gerrit.wikimedia.org/r/#/c/191393/
[16:01:47] dennyvrandecic: it introduces several "flavors" of rdf output, including one that is similar to the full blown one WDT produces, and one using something like "simple statements".
[16:02:24] i just hope that the simple statements are never used for qualified statements
[16:04:42] dennyvrandecic: currently, they are used for statements with "best" rank. but they use predicates from a separate namespace, so the semantics can be defined appropriately.
[16:05:11] that is fine, as long as they have no qualifiers :)
[16:05:16] aude: for new release: https://github.com/Wikidata-lib/PropertySuggester/pull/118
[16:05:41] dennyvrandecic: even if they do have qualifiers. this mode is intended specifically for the query engine. it would be bad if we wouldn't find any wives of george washington, just because the marriage is qualified with start and end date
[16:06:22] dennyvrandecic: i understand the reasoning behind omitting qualified statements, but that would not be useful for our use case.
[16:06:22] and if he had two, with both being best, he's a bigamist :)
[16:06:43] dennyvrandecic: that's your own inference :)
[16:06:46] MediaWiki-extensions-WikibaseClient, Wikidata: When sitelinks change, update link data in cached ParserOutput without re-parsing the page. - https://phabricator.wikimedia.org/T89965#1053704 (FriedhelmW) This would slow down editing, as Wikidata had to wait until all languages caches have been updated.
[16:06:58] and french guyana will be in Europe
[16:07:00] if you ask for wives of GW, you may get multiple results.
[16:07:05] which is exactly what is intended
[16:07:38] dennyvrandecic: french guyana is in yourup if you follow the political structure.
[16:07:45] not if you follow the geographical
[16:07:51] the latter exists, but is not well maintained
[16:08:05] err.
[16:08:16] s/yourup/europe/
[16:08:19] Germany is named after Bavaria
[16:08:19] how did that happen?
[16:08:36] pronunciation -> text, that was an easy to explain slip
[16:08:57] yea, my brain sometimes doesn't think before typing ;)
[16:09:22] Wikidata: PropertySuggester travis failure for mysql + hhvm due to DBConnectionError - https://phabricator.wikimedia.org/T90178#1053706 (aude) NEW
[16:09:43] MediaWiki-extensions-WikibaseClient, Wikidata: When sitelinks change, update link data in cached ParserOutput without re-parsing the page. - https://phabricator.wikimedia.org/T89965#1053714 (hoo) >>! In T89965#1053704, @FriedhelmW wrote: > This would slow down editing, as Wikidata had to wait until all langua...
[16:09:47] so I assume uncurrent statements should not be marked preferred?
[16:09:55] i.e. Germany borders Czechoslovakia?
[16:10:40] No, they shouldn't
[16:10:50] e.g. in the client we show the preferred statements per default
[16:10:55] dennyvrandecic: indeed. historical entries should generally not be preferred.
[16:11:02] so e.g. the current mayor, not all the former ones
[16:11:09] ok, but Germany namedAfter Bavaria ?
[16:11:11] it is, in some languages.
[16:11:11] so?
[16:11:30] there's the concept of a best statement in the data model these days
[16:11:38] what's a best statement?
[16:12:01] dennyvrandecic: "Q183" is named after bavaria. "Germany" isn't.
[16:12:02] without the qualifiers this amounts to the same
[16:12:03] dennyvrandecic: "best" is preferred, if there is a preferred one, normal otherwise.
[16:12:17] The best statement(s) are just the statements with one property id that have the highest rank
[16:12:19] hoo, DanielK_WMDE: thanks for the explanation
[16:12:19] oh yes
[16:12:19] never deprecated
[16:13:05] dennyvrandecic: no, it doesn't amount to the same. the name "Germany" is not derived from Bavaria. But Q183 has a name that is derived from Bavaria (in some language)
[16:13:21] i see no problem with that
[16:13:24] no, you don't have the qualifier anymore
[16:13:30] remove the bracket from your sentence
[16:13:33] that's what I am saying
[16:14:07] it's now Q183 is named after Q980
[16:14:10] no qualifiers
[16:14:44] and i'm saying that "Q183 is named after Q980" is simply true.
[16:15:12] yes.
[16:15:16] which is a true statement
[16:15:43] hmm, I have ... issues with that, but let's see if i can find a better example :)
[16:16:57] dennyvrandecic: you can probably find an example where this leads to counter-intuitive results. But the result of omitting all qualified statements from the result of direct queries is far more annoying and misleading.
[16:16:58] Jerusalem is in Palestine (applies to part: East Jerusalem)
[16:17:22] I am not only saying counter-intuitive
[16:17:24] dennyvrandecic: if this was the *only* output, that would be very misleading
[16:17:46] Well, it will also be Jerusalem is in Israel
[16:18:04] I am worried that this simplified output will be the basis for the query engine
[16:18:15] and the query engine will become the main access point to Wikidata
[16:18:17] it will be. that's exactly the intention
[16:18:46] And this means that all the nice things of the Data Model are completely thrown out in its most important application
[16:18:58] dennyvrandecic: If you decide to leave out stuff with qualifiers, I'm fairly sure people will start to remove them from data that's "important"
[16:19:16] dennyvrandecic: ah! maybe it helps to think of the query database separately from the output. we want the simplified statements as our database for queries. that does not mean we can't show the qualifiers in the output.
[16:20:10] hmm... maybe that helps... let me think
[16:21:03] dennyvrandecic: following your logic, Jerusalem would not be in any country. And France would be missing half its borders. That would be surprising and misleading
[16:21:16] it wouldn't have any flags, either.
[16:21:18] well, or we figure out how to deal with qualifiers
[16:21:46] we can, but it makes queries a *lot* more complex, both to write and to execute
[16:22:03] yes, that's true for the datamodel as well
[16:22:13] a simple spo model would have been easier to implement from the beginning
[16:22:19] but it would be equally insufficient
[16:22:38] dennyvrandecic: omitting all qualified statements provides good semantic precision, but the recall is horrible, and unpredictable.
[16:22:45] things would just be left out for random reasons
[16:23:25] I am not disagreeing on that
[16:23:38] dennyvrandecic: sure - which is why the simplified model is just that - simplified. Useful for some applications. Insufficient for others.
[16:23:57] It's great that we *have* all the depth and complexity.
[16:24:00] but it will become the only way to query wikidata live
[16:24:14] no, we don't have that depth and complexity in the query engine anymore
[16:24:19] but having a simplification is pointless if it does not produce useful results.
[16:24:31] if you would say you would provide query endpoints for both, i would be happy
[16:24:36] i think we can live with false-ish positives better than with a ton of false negatives.
[16:24:57] i think errors are worse than incompleteness in some cases
[16:25:02] that is actually the plan, but it's not clear yet whether it's feasible wrt performance
[16:25:16] what I am saying.
[16:25:21] please try to list such cases somewhere :)
[16:25:34] If you had both, and launched both at the same time, I'd be completely happy
[16:26:00] But I am worried you will only launch the simplified one, and move the full one to an undefined future several months, years behind that
[16:26:10] it might actually be the same endpoint and the same database. just different statements using different namespaces
[16:26:20] which is equal to throwing out the whole datamodel as is and replacing it with a simple spo model
[16:26:36] (if we use sparql, which, to my surprise, looks like a good option right now)
[16:27:02] well, as said, if the full one launches as well, then it is fine
[16:27:04] no, that's not equivalent. we are not throwing out the data.
[16:27:19] the query engine doesn't define the content. it just defines how to find it.
[16:27:45] just because the full text index doesn't index images, doesn't mean images on wikipedia pages are useless
[16:27:52] same thing, really.
[16:27:52] I heard that one before, about Google and its importance for the Web :)
[16:28:03] if it's not on Google, it doesn't exist
[16:28:17] MediaWiki-extensions-WikibaseRepository, MediaWiki-extensions-WikibaseClient, Wikidata: Figure what to do with MWException within Wikibase - https://phabricator.wikimedia.org/T88360#1053890 (JanZerebecki) See T86704.
[16:28:21] which is why i say false positives are better than false negatives :)
[16:28:35] i'm arguing for *more* results. to hide *less*
[16:28:43] so, for historic states, what's the preferred head of state?
[16:28:51] none
[16:28:56] which means all
[16:28:59] are best
[16:29:00] so if you ask for the emperor of rome, you get all of them
[16:29:04] yes
[16:29:20] ok, so if i ask for the country where hindenburg was head of state i get nazi germany
[16:29:43] if i ask for the country where weizsaecker was head of state, i get none
[16:30:07] because hindenburg is best for nazi germany, but weizsaecker is not best for germany
[16:30:27] yes.
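(Editor's aside: the "best rank" semantics explained at 16:11-16:12 above, and just exercised by the Hindenburg/Weizsäcker example, in a few lines over Wikibase entity JSON: the preferred statements of a property if any exist, otherwise the normal ones, never the deprecated ones. A sketch; the function and variable names are illustrative.)

    # Sketch of "best rank" selection as described above.
    def best_statements(entity, property_id):
        statements = entity.get('claims', {}).get(property_id, [])
        preferred = [s for s in statements if s['rank'] == 'preferred']
        if preferred:
            return preferred
        return [s for s in statements if s['rank'] == 'normal']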
[16:30:29] With best statements you don't ask such "was" questions
[16:30:51] well, i just did :) and you too, for rome
[16:30:55] but it's a valid concern. it needs to at least be clearly documented
[16:30:55] you only ask for the latest state of something. Which for countries is now or the time of desolution
[16:30:59] * dissolution
[16:31:03] and we may want an easy way to get out of the problem
[16:31:24] hoo: no, not at the time of dissolution. that's exactly the point.
[16:31:46] DanielK_WMDE: Mh...
[16:31:49] dennyvrandecic: we *could* always include preferred and normal. would be worth considering.
[16:32:05] we could duplicate the property, using different namespaces, to keep them separate
[16:32:32] then you have the choice of asking for "best" or for "all" (not deprecated), even with simple statements
[16:33:06] but that doesn't capture the qualifiers at all
[16:33:28] DanielK_WMDE: How'd you query best statements with that?
[16:33:50] the query engine would need to know about falling back to normal from preferred :S
[16:34:02] hoo: you use wdpb:P123 for best, and wdpa:P123 for all
[16:34:17] no, you just duplicate the statements
[16:34:25] if we are using sparql anyway, why not educate the users about the data model and allow them to query for qualifiers
[16:34:29] dennyvrandecic: correct.
[16:34:33] if they want to skip qualifiers now, they can
[16:34:47] yeah, that would also work (at the slight cost of data duplication)
[16:35:01] dennyvrandecic: if we have both models imported, then we can do that, if performance holds up
[16:35:04] i'd like that
[16:35:13] but queries would be useful without that, too.
[16:35:27] the way simple queries were originally designed would provide a LOT less detail than even that
[16:35:50] Wikidata: PropertySuggester travis failure for mysql + hhvm due to DBConnectionError - https://phabricator.wikimedia.org/T90178#1053996 (aude) same problem occurs for the mediawiki core hhvm + mysql job on travis: https://travis-ci.org/wikimedia/mediawiki/jobs/51528090
[16:36:09] yes, but simple queries was a very different beast
[16:36:31] it only returns a set of entities, and then you can pick your data from that
[16:36:36] Release-Engineering, Wikidata: Travis failure for mysql + hhvm due to DBConnectionError - https://phabricator.wikimedia.org/T90178#1054012 (aude)
[16:37:01] i was telling people to implement that pretty much from the middle of my tenure there, but i completely failed to get that done
[16:37:11] and i think the idea of simple queries is pretty much dead
[16:37:18] what you are working on now is complex queries
[16:37:23] immediately
[16:37:28] without the first step of simple queries
[16:37:57] without the chance to gather experience from that, etc.
[16:38:26] you are just skipping the simple query step, probably because you think "oh, we can do the complex thing right now"
[16:38:51] dennyvrandecic: the idea is still to just return a set of entities.
[16:39:24] that changes the thing
[16:39:28] we'd support more ways to query them, but in terms of output, it's the same (for now)
[16:39:57] sparql would allow you to return not only a set of entities, but also values for those entities
[16:40:16] if the result is always just a set of QIDs, then everything is fine
[16:40:29] just don't make the result a table of QIDs with values
[16:40:45] that's one projection too far
[16:41:48] dennyvrandecic: for the head-of-state example, what values would these be?
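(Editor's aside: the trade-off being debated here, shown as two query styles via the SPARQLWrapper library. The endpoint URL is hypothetical, and the wdt:/p:/ps:/pq: prefixes mirror the vocabulary the Wikidata query service later settled on, not anything fixed at the time of this conversation; prefix declarations are omitted as a real endpoint would predefine them. P35 is "head of state", P582 "end time".)

    # Sketch: "simple/best" statements lose qualifiers; full statements keep them.
    from SPARQLWrapper import SPARQLWrapper, JSON

    SIMPLE = """
    SELECT ?country ?head WHERE {
      ?country wdt:P35 ?head .                # best-rank only, qualifiers gone
    }"""

    FULL = """
    SELECT ?country ?head ?end WHERE {
      ?country p:P35 ?statement .             # full statement node
      ?statement ps:P35 ?head .
      OPTIONAL { ?statement pq:P582 ?end . }  # the end-time qualifier survives
    }"""

    endpoint = SPARQLWrapper('https://query.example.org/sparql')  # hypothetical
    endpoint.setQuery(FULL)
    endpoint.setReturnFormat(JSON)
    results = endpoint.query().convert()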
[16:42:03] (i can imagine some extra output that would be useful, especially for top-n queries)
[16:42:47] actually, the extra output should be the full entity, maybe filtered by property. but not just the value of the main snak, completely throwing out the qualifiers
[16:43:27] head of state: select ?s ?p where { ?s headofstate ?p }
[16:43:32] would lead to a table
[16:43:32] aude: now also increased the version :) https://github.com/Wikidata-lib/PropertySuggester/pull/118
[16:43:50] germany koehler, rome trajan, rome caesar, usa obama, etc.
[16:43:56] and lose all qualifiers
[16:44:16] it would be a list of Q-ids
[16:44:28] how we present that would depend on the context
[16:44:43] in Lua, as a list of Entity objects... if we get the code efficient enough to do that.
[16:44:50] currently, we run out of memory in Lua quickly
[16:45:05] jzerebecki: made a tag for property suggester
[16:45:09] thx
[16:45:12] now we can update the build
[16:45:20] a list of QIDs sounds good
[16:45:24] on the web interface, we could just show label + description, but may as well show the appropriate statement(s) with qualifiers.
[16:45:27] that would alleviate my concerns
[16:45:52] why is a list of QIDs better than a list of labels + descriptions?
[16:46:06] (i agree that showing the statement would be better, of course)
[16:46:51] meh, labels and descriptions are also fine
[16:46:55] I am not concerned about those
[16:47:07] I am concerned about losing the qualifiers when they are displayed
[16:47:22] and just the head-of-state query I gave above would do exactly that
[16:47:47] oh, for my understanding:
[16:48:03] is the idea to allow anyone to ask any sparql queries, or would it be only certain templates you define?
[16:48:39] [WikidataBuildResources] JanZerebecki created upgrade-property-suggester (+1 new commit): http://git.io/AcxE
[16:48:39] WikidataBuildResources/upgrade-property-suggester 08d0718 Jan Zerebecki: Update property-suggester dependency
[16:49:05] [WikidataBuildResources] JanZerebecki opened pull request #21: Update property-suggester dependency (master...upgrade-property-suggester) http://git.io/Acxo
[16:50:22] if it is the latter, then there is quite no real problem, because you can indeed ensure that only qids are returned by the query engine, which can then be used to retrieve the items and return these to the user
[16:51:20] [WikidataBuildResources] filbertkm pushed 1 new commit to master: http://git.io/AcpF
[16:51:20] WikidataBuildResources/master 3954308 Katie Filbert: Merge pull request #21 from wmde/upgrade-property-suggester...
[16:54:03] (PS2) JanZerebecki: New Wikidata build 2015-02-20 [extensions/Wikidata] - https://gerrit.wikimedia.org/r/191884
[16:54:10] dennyvrandecic: we will restrict the sparql in some ways. but we have not yet decided in which ways, and how far. we haven't even decided that there *will* be sparql.
[16:54:16] though it does look like it at the moment.
[16:54:41] understood
[16:55:21] I would just like to know that there is someone on the team who champions the datamodel and why it was made
[16:55:36] and not just throwing it out because it is inconvenient to implement for queries
[16:55:37] Basically, if BlazeGraph updates fast enough, and we don't encounter any other deal breaker, that's what we'll use, and it comes with sparql. If we run into something nasty, we'd probably go for Neo4J. Or even GraphX, but that's pretty low level. We'd end up writing a lot of db level stuff.
[16:55:57] dennyvrandecic: well, uh... that would be me, I think :)
[16:56:19] i'm *very* fond of qualifiers, though it's a bit annoying that their semantics are so squishy.
[16:56:35] MediaWiki-General-or-Unknown, operations, Services, Analytics, Wikidata, wikidata-query-service: Reliable publish / subscribe event bus - https://phabricator.wikimedia.org/T84923#1054081 (bd808)
[16:56:37] they are qualifying the statement :) nothing squishy about it
[16:56:38] i'd like to be able to interpret them as proper preconditions. which you often can, but not always.
[16:56:51] well, they are not, indeed :)
[16:56:56] they might be
[16:57:15] well, many can be interpreted that way, and it would be quite nice to be able to make use of that
[16:57:42] if they were preconditions then we wouldn't have such huge losses, because in that case the losses would be justified
[16:58:22] indeed
[16:59:00] btw, why not lucene/elasticsearch?
[16:59:10] i mean, is that even being considered?
[17:01:41] dennyvrandecic: yes it was. you'll have to ask nik for the details (he's also the person who implemented the elastic backend for cirrus, so he knows this stuff)
[17:02:21] is he ever on irc? or is there a document that says why not lucene, or rather why titan?
[17:02:21] if I understand correctly, the issue is that we want to do (possibly even recursive) traversal, and that is something elastic is really bad at
[17:02:30] yes, that is correct
[17:02:50] manybubbles: meet dennyvrandecic :)
[17:02:59] manybubbles is nik? ah, thanks, didn't know
[17:03:04] dennyvrandecic: hi!
[17:03:04] yea
[17:03:19] we did meet a few times, just didn't know the irc handle
[17:03:39] manybubbles: btw, are you back at the office?
[17:03:41] yeah - sorry for not replying to that document - things have been pretty crazy in the past, well, month
[17:03:47] DanielK_WMDE: yeah
[17:03:55] only did 2.5 hours at the museum
[17:04:00] Wikidata: update and consolidate json dump documentation - https://phabricator.wikimedia.org/T87329#1054090 (Lucie) Also, the new example might be a bit too long. I didn't want to change the data, that's why I just pasted the whole json of the Item in there, but it seems to be a bit too much. For claims and...
[17:04:05] manybubbles: if you have the time, what's the reason against lucene/elastic as the search backend?
[17:04:12] (PS1) Lucie Kaffee: Update json dump documentation [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191907 (https://phabricator.wikimedia.org/T87329)
[17:04:22] i understand, no recursion and no property chains, but is that all?
[17:04:50] dennyvrandecic: well, that, and that we'd have to build something to use it, or use a dead project like titan for it
[17:05:15] it's not that I dislike elasticsearch - I'm a contributor there. well, I was, less so the past few months
[17:05:28] BatchEntityRevisionLookup as an interface name?
[17:05:31] DanielK_WMDE: ^
[17:05:31] or EntityRevisionBatchLookup
[17:05:57] mhm, I am not sure you would need to build so much on top of it to use it
[17:06:06] hoo: EntityRevisionBatchLookup. but no strong preference
[17:06:19] lookup by property-value and by conjunctions of property-values works out of the box
[17:06:21] ok :)
[17:06:23] * aude agrees with DanielK_WMDE .... EntityRevisionBatchLookup
[17:06:29] :)
[17:06:46] doesn't this cover already like most of the use cases?
[17:07:13] dennyvrandecic: we'd need to build a mapping to elastic's document model. for rdf, we need that mapping anyway.
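(Editor's aside: a sketch of the "works out of the box" claim above - a property-value conjunction lookup with the elasticsearch-py client, over a hypothetical mapping in which each item is one document carrying a claims object of property id -> value ids. The index name, mapping, and the exact query shape are assumptions for illustration, not anything from the conversation.)

    # Sketch: items that are instances of human (P31=Q5) AND painters (P106=Q1028181),
    # assuming a hypothetical one-document-per-item index called "wikidata".
    from elasticsearch import Elasticsearch

    es = Elasticsearch('http://localhost:9200')
    query = {
        'bool': {
            'filter': [
                {'term': {'claims.P31': 'Q5'}},
                {'term': {'claims.P106': 'Q1028181'}},
            ]
        }
    }
    hits = es.search(index='wikidata', body={'query': query})['hits']['hits']
    print([hit['_id'] for hit in hits])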
[17:07:20] dennyvrandecic: i've even hacked together property-value lookup in elastic
[17:07:32] but not sure how well it handles traversals, combining queries etc
[17:07:43] DanielK_WMDE: the mapping would be rather trivial, an item is a document
[17:07:51] no traversals, but if you wanted something simple you could build it
[17:07:56] for simple query, we could have started with elastic
[17:08:02] yes
[17:08:06] but moving beyond that now for wikigrok etc. use cases
[17:08:21] aude: but one point of having simple queries is that they are easy to use by 3rd parties.
[17:08:27] DanielK_WMDE: true
[17:08:34] wikigrok requires traversal?
[17:08:37] i still might hack more on it, as i like the geo stuff in elastic
[17:08:37] i don't want to end up with sql *and* elastic *and*
[17:08:44] *and* rdf...
[17:08:47] true
[17:08:50] dennyvrandecic: the trouble is that in this case you don't get stuff like automatic join order optimizing
[17:08:54] dennyvrandecic: yes
[17:09:05] * aude likes postgres + elastic, but they might not do everything nor work out of the box as well
[17:09:08] dennyvrandecic: find all of the people whose father doesn't list them as a child
[17:09:10] and/or
[17:09:33] ironically, postgresql is probably a less difficult target
[17:09:33] manybubbles: ok, use case convinced
[17:09:56] but what would the table structure be in postgres?
[17:09:57] it looks like virtuoso is actually a sql system manhandled into an rdf store
[17:10:10] one big table with triples, or one table per property?
[17:10:14] dennyvrandecic: yea, sparql would be really cool for finding inconsistencies and constraint violations
[17:10:26] there is a geosparql to postgis implementation
[17:10:32] dennyvrandecic: probably one table per value type
[17:10:40] and i think non-geo stuff also, but don't know how good it is etc
[17:10:41] sparql would be awesome, no question, I just wonder if it scales to the frequency of updates
[17:10:58] no idea without experimenting with it
[17:11:03] DanielK_WMDE: yes, true, that or one table per property
[17:11:06] Does it make sense to have LATEST_FROM_MASTER in the batch lookup? I doubt it does
[17:11:12] one table per value type still might get... a bit unwieldy
[17:11:13] dennyvrandecic: I _think_ virtuoso talks about it being one table per graph. s, p, v.
[17:11:39] dennyvrandecic: can't do one table per value because values can be added on the fly, and DBA would flip tables if we added tables on the fly
[17:12:02] manybubbles: understood :(
[17:12:25] fixed set of tables leads to a lot of joins with itself on a gigantic table
[17:12:52] hoo: it may make sense to be able to specify that in the constructor.
[17:12:58] but not per call
[17:13:00] dennyvrandecic: yup. leads you to want a system where adding a property isn't a big deal
[17:13:15] dennyvrandecic: my favorite page on wikipedia: https://en.wikipedia.org/wiki/List_of_emoticons contains a flip table one
[17:14:20] dennyvrandecic: yeah - I'm not saying it's a good idea, just that it's on the list. I'll be talking to the systap folks on Tuesday. I'll make sure to talk about dynamic.
[17:14:22] thanks DanielK_WMDE for reviewing my sites patch
[17:14:23] DanielK_WMDE: Ok, so no need for that to be in the interface
[17:14:27] manybubbles: but nothing for "hug" or "embrace"?! my favorite version is {{denny}}. em-brace :)
[17:14:31] probably will update it this weekend
[17:14:59] aude: i like the design, looks fluffy now :)
[17:15:11] yay
[17:15:21] trying to keep it baby steps
[17:15:27] ah man, why is not everything perfect?
[17:17:10] well, I'll let you all work, just saying: think about the qualifiers... Q5098257
[17:19:16] it's almost beer o'clock :) or at least food
[17:20:38] one more review...
[17:21:26] (CR) Aude: [C: 1] "shall merge probably tomorrow, when i'll be around just in case this breaks beta" [extensions/Wikidata] - https://gerrit.wikimedia.org/r/191884 (owner: JanZerebecki)
[17:22:24] dennyvrandecic: Thanks! going to get some dinner once DanielK_WMDE finishes that last review.
[17:22:59] § Wikidata-Sprint-2015-02-03, Wikidata: Fatal error: Class 'Wikibase\Utils' not found in PropertySuggester/GetSuggestions.php on line 153 - https://phabricator.wikimedia.org/T90137#1054168 (JanZerebecki) Open>Resolved
[17:25:05] About query support of Wikibase: is https://github.com/wmde/WikibaseQuery abandoned in favor of a new query backend, or is it planned to have two backends?
[17:26:27] And a related question: is https://github.com/wmde/Ask planned to be supported by a query backend, or is it a dead project?
[17:26:29] Tpt: that's the frontend. development on that is stalled, but not abandoned.
[17:27:13] So we would have simple queries backed by WikidataQuery and more powerful queries using another backend?
[17:27:20] Wikidata: update and consolidate json dump documentation - https://phabricator.wikimedia.org/T87329#1054183 (Aklapper)
[17:27:24] Tpt: There's also WikibaseQueryEngine, which we may or may not use with the "new" backend. The SQL binding that is implemented there may be discontinued.
[17:27:56] Tpt: the future of simple query is a bit undecided. it depends on how far the full query capability develops.
[17:28:06] mh... WikiPageEntityLookupHelper as a class I make use of in the WikiPage*Lookup classes using composition
[17:28:12] i still want *very* simple queries, but for that, we don't need WikibaseQuery.
[17:28:14] we
[17:28:31] we'd just put that into the repo directly, and make it very lightweight and stupid
[17:28:37] ok, so if full queries work well enough, there would be only a "full query" backend?
[17:29:02] Tpt: yes. though there may still be a simple frontend, that would use the same backend.
[17:29:14] it's all a bit up in the air right now, while we are still evaluating technologies
[17:29:43] ok. Thank you. And will the Ask language still be used, or dropped in favor of something else?
[17:30:00] hoo: use "Helper" only if you really can't think of anything that would be descriptive of what that class is doing.
[17:30:09] actually, that's an indication that something is wrong.
[17:30:20] (but i have used "Helper" occasionally)
[17:30:45] yeah, it's nasty
[17:30:53] but I want to consolidate duplicate code
[17:30:53] Tpt: there may be a binding for Ask, but it currently doesn't look like it will be a priority.
[17:31:09] ok. I asked this question because I'm investigating query capabilities for my custom EntityStore https://github.com/ProjetPP/WikibaseEntityStore
[17:31:13] I could make two helpers... one to get latest page ids, that would be easy
[17:31:28] the other thing will either be duplicate code or a messy new class, I'm afraid
[17:31:54] hoo: call it "helper" then. we can argue over that patch, once i see the code :)
[17:32:02] yeah, I guess
[17:32:39] (CR) Daniel Kinzler: [C: 2] Generalize client's SnaksFinder (2 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/190809 (owner: Hoo man)
[17:32:52] hoo: ---^ left a couple of comments there
[17:33:53] Yeah, I see
[17:35:02] (Merged) jenkins-bot: Generalize client's SnaksFinder [extensions/Wikibase] - https://gerrit.wikimedia.org/r/190809 (owner: Hoo man)
[17:35:58] I'll write it down to address them once I have the other stuff out of my way
[17:51:32] (PS15) Smalyshev: Implement RDF export for items and RDF dumps (Turtle only for now) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/191393
[19:04:43] * JeroenDeDauw sees Daniel recommend "helper" as name
[19:04:46] >_>
[19:49:29] MediaWiki-extensions-WikibaseClient, Wikidata: For "special" precision in Wikidata interface, show dimension in meters - https://phabricator.wikimedia.org/T89218#1055014 (Multichill) >>! In T89218#1053382, @thiemowmde wrote: > +/-0.014000802722428° is something between 0 and 1555.644746936 meters. How...
[20:05:56] MediaWiki-extensions-WikibaseRepository, Wikidata: BadMethodCallException from line 244 of EntityViewPlaceholderExpander.php - https://phabricator.wikimedia.org/T90268#1055071 (aude) NEW
[20:07:11] MediaWiki-extensions-WikibaseRepository, Wikidata: InvalidArgumentException from line 57 of SuggestionGenerator.php - https://phabricator.wikimedia.org/T90269#1055092 (aude) NEW
[20:09:52] MediaWiki-extensions-WikibaseRepository, Wikidata: HttpError from line 359 of EntityDataRequestHandler.php - https://phabricator.wikimedia.org/T90270#1055099 (aude) NEW
[20:46:07] Lydia_WMDE: --> wikidata-l
[21:30:56] Tpt: cheers for the pull req!
[21:32:04] addshore: As a heavy user of your bot libs, it's normal that I do some pull requests
[21:32:13] :)
[21:32:36] makes me happy that they are used :D
[22:03:45] [WikibaseDataModel] JeroenDeDauw pushed 1 new commit to 3.0.x-dev: http://git.io/Alux
[22:03:45] WikibaseDataModel/3.0.x-dev d532c09 Jeroen De Dauw: Merge pull request #387 from wmde/useClaim...
[22:10:05] [WikibaseDataModel] JeroenDeDauw pushed 1 new commit to master: http://git.io/Al2j
[22:10:05] WikibaseDataModel/master f062701 Jeroen De Dauw: Merge pull request #385 from wmde/idParserTest...
[22:10:22] [WikibaseDataModel] JeroenDeDauw closed pull request #384: More descriptive Term/AliasGroup docs (master...altFallBackDocs) http://git.io/AZfN
[22:10:36] [WikibaseDataModel] JeroenDeDauw deleted altFallBackDocs at f626e1d: http://git.io/Alas
[22:10:40] Wikidata, Services, wikidata-query-service: Investigate & design public API, possibly using MQL - https://phabricator.wikimedia.org/T85181#1055497 (GWicke) a: GWicke>None
[23:45:30] Wikibase-DataModel-JavaScript, Wikidata, Librarization, MediaWiki-Vendor: Add data-values/javascript 0.6.0 to mediawiki/vendor - https://phabricator.wikimedia.org/T88436#1011591 (Legoktm) >>! In T88436#1013289, @Krinkle wrote: > I propose we stop using mediawiki-vendor for the majority of Media...