[00:24:08] (PS1) Hoo man: Fix MockRepository::getLatestRevisionId for redirected entities [extensions/Wikibase] - https://gerrit.wikimedia.org/r/216514 [00:24:33] jzerebecki: In case you're there ^ [00:24:39] That is blocking frimelle right now :D [00:26:17] (CR) jenkins-bot: [V: -1] Fix MockRepository::getLatestRevisionId for redirected entities [extensions/Wikibase] - https://gerrit.wikimedia.org/r/216514 (owner: Hoo man) [00:33:50] hoo: jenkins says no [00:36:07] Reedy: Yeah... turns out our API modules rely on the broken behavior of the mock entity revision lookup [00:36:23] So that's stuff that would have never ever worked in the real world [00:36:54] lol [02:34:29] MediaWiki-extensions-WikibaseRepository, Wikidata: RedirectCreationInteractor relies on wrong behavior on MockRepository::getLatestRevisionId - https://phabricator.wikimedia.org/T101625#1343737 (hoo) NEW a: hoo [02:34:46] (PS2) Hoo man: Fix behaviour of EntityRevisionLookup::getLatestRevisionId implementations [extensions/Wikibase] - https://gerrit.wikimedia.org/r/216514 (https://phabricator.wikimedia.org/T101625) [02:40:38] is tools.wmflabs.org down?
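hoo's fix above lives in Wikibase's PHP code, but the contract being corrected is language-neutral: a mock entity revision lookup must not hand back a plain revision id for a redirected entity as if it were a normal entity, because real lookups report the redirect. Here is a minimal Python sketch of that contract; the class and method names and the exception-based convention are assumptions for illustration, not Wikibase's actual API.

```python
class EntityRedirect(Exception):
    """Signals that the requested entity is a redirect (assumed convention)."""
    def __init__(self, target):
        super().__init__(f"entity redirects to {target}")
        self.target = target


class MockRepository:
    """Toy stand-in for a mock entity revision lookup (the real one is PHP)."""

    def __init__(self):
        self._revisions = {}   # entity id -> latest revision id
        self._redirects = {}   # source entity id -> target entity id

    def put_entity(self, entity_id, revision_id):
        self._revisions[entity_id] = revision_id

    def put_redirect(self, source_id, target_id):
        self._redirects[source_id] = target_id

    def get_latest_revision_id(self, entity_id):
        # The bug under discussion: the broken mock returned a revision id
        # for redirected entities as if they were ordinary entities, and
        # API code silently relied on that. The fixed behaviour surfaces
        # the redirect instead.
        if entity_id in self._redirects:
            raise EntityRedirect(self._redirects[entity_id])
        if entity_id not in self._revisions:
            return None  # unknown entity
        return self._revisions[entity_id]
```

Code that previously "worked" against the broken mock (e.g. a redirect-creation interactor) would fail against this stricter contract, which matches jenkins rejecting the first patch set.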
[02:40:46] (PS1) Hoo man: Use WikibaseRepo::getRedirectCreator in CreateRedirect [extensions/Wikibase] - https://gerrit.wikimedia.org/r/216518 [02:41:16] oh typical, it finally decides to respond when I give up and ask :P [02:42:58] nikki, it's very slow at the moment [02:43:05] someone in ops is looking into it [02:46:40] ah [05:13:15] (PS8) Lucie Kaffee: [WIP] Redirect creation when using wbmergeitems [extensions/Wikibase] - https://gerrit.wikimedia.org/r/214917 (https://phabricator.wikimedia.org/T59745) [05:13:16] (CR) jenkins-bot: [V: -1] [WIP] Redirect creation when using wbmergeitems [extensions/Wikibase] - https://gerrit.wikimedia.org/r/214917 (https://phabricator.wikimedia.org/T59745) (owner: Lucie Kaffee) [05:13:23] Using wikibase - wish to get flag (property P41) for India (Q668) - after using "local entity = mw.wikibase.getEntityObject()" I use return "entity.claims.P41[1].mainsnak.datavalue.value" - wondering if this is correct? - thanks! [05:13:23] [1] https://www.wikidata.org/wiki/Template:PERSONDATA [05:13:24] Thanks will try elsewhere... have a great day.
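The Lua question above went unanswered in channel. The path `entity.claims.P41[1].mainsnak.datavalue.value` is essentially the right shape (Lua tables are 1-indexed), but it only works when the first snak actually carries a value. The same traversal over the JSON entity structure can be sketched in Python (0-indexed); the sample entity below is hand-made for illustration, not real API output.

```python
def first_claim_value(entity, prop):
    """Return the first mainsnak value for `prop`, or None if absent.

    Mirrors the Lua path entity.claims.P41[1].mainsnak.datavalue.value,
    but skips "novalue"/"somevalue" snaks, which have no datavalue and
    would make the naive path blow up.
    """
    for claim in entity.get("claims", {}).get(prop, []):
        snak = claim["mainsnak"]
        if snak.get("snaktype") == "value":
            return snak["datavalue"]["value"]
    return None


# Hand-made sample mimicking the claim structure for Q668 (India), P41 (flag image).
entity = {
    "id": "Q668",
    "claims": {
        "P41": [
            {"mainsnak": {"snaktype": "value",
                          "datavalue": {"type": "string",
                                        "value": "Flag of India.svg"}}}
        ]
    },
}
```

Guarding on `snaktype` is the piece most quick Lua snippets forget.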
[09:16:22] Wikidata, Wikidata-Gadgets: Merge.js should ignore sitelink conflicts and rely on API - https://phabricator.wikimedia.org/T101629#1343825 (matej_suchanek) NEW [09:17:35] MediaWiki-extensions-WikibaseRepository, Wikidata: Special:MergeItems should report all conflicts at once - https://phabricator.wikimedia.org/T91210#1343835 (matej_suchanek) [09:17:38] MediaWiki-extensions-WikibaseRepository, Wikidata: Merge with conflicting sitelinks is not rejected - https://phabricator.wikimedia.org/T87524#1343836 (matej_suchanek) [09:17:40] Wikidata, Wikidata-Gadgets: Merge.js should ignore sitelink conflicts and rely on API - https://phabricator.wikimedia.org/T101629#1343834 (matej_suchanek) [09:22:59] Wikidata, Wikidata-Gadgets: Merge.js should ignore sitelink conflicts and rely on API - https://phabricator.wikimedia.org/T101629#1343842 (matej_suchanek) [10:14:27] MediaWiki-extensions-WikibaseRepository, Wikidata, Wikidata-Sprint-2015-06-02: Selecting value with keyboard is making wrong edits again - https://phabricator.wikimedia.org/T98471#1343863 (Lydia_Pintscher) It is in this sprint and has highest priority.
[10:53:02] [WikibaseDataModel] thiemowmde deleted fpNaming at b0cffa8: http://git.io/vIc89 [10:54:47] [WikibaseDataModel] thiemowmde deleted revert-363-claimListAccess at e57a776: http://git.io/vIc4U [10:55:42] [WikibaseDataModel] thiemowmde deleted allofthegenerics at e648cc3: http://git.io/vIc43 [10:56:59] Wikidata, Wikidata-Quality: Wikibase "Quality" seems to not be a correct name for this extension - https://phabricator.wikimedia.org/T101621#1343899 (Bugreporter) There's already a task to rename it [10:57:44] Wikidata, Wikidata-Quality: Wikibase "Quality" seems to not be a correct name for this extension - https://phabricator.wikimedia.org/T101621#1343900 (Bugreporter) [10:57:47] Wikidata, Wikidata-Quality: (Re-)Naming our extension(s) - https://phabricator.wikimedia.org/T97423#1343901 (Bugreporter) [10:59:17] Wikidata: Investigate why empty references get into Wikidata - https://phabricator.wikimedia.org/T92383#1343904 (Bugreporter) Now we should find out how many empty references there are in Wikidata. [11:03:10] [WikibaseDataModel] thiemowmde deleted unDeprecate at cdb36d3: http://git.io/vfy0h [11:15:37] when I read Magnus's latest blogpost it sounds like resignation to me [11:15:53] http://blog.magnusmanske.de/?p=307 [11:44:36] (PS9) Lucie Kaffee: [WIP] Redirect creation when using wbmergeitems [extensions/Wikibase] - https://gerrit.wikimedia.org/r/214917 (https://phabricator.wikimedia.org/T59745) [11:45:54] GerardM-: I think he's not so willing to add things anymore before the release of WDWS. [11:46:25] (CR) jenkins-bot: [V: -1] [WIP] Redirect creation when using wbmergeitems [extensions/Wikibase] - https://gerrit.wikimedia.org/r/214917 (https://phabricator.wikimedia.org/T59745) (owner: Lucie Kaffee) [12:08:24] sjoerd what wdws ? [12:08:42] Wikidata Query Service. [12:08:54] https://www.mediawiki.org/wiki/Wikidata_query_service [12:09:04] The official WDQ.
;0 [12:09:08] ;)* [12:10:01] it makes so much sense ... for him to do this [12:11:39] he does not get support and his work is largely dismissed [13:41:47] Wikidata, Datasets-General-or-Unknown, Labs, Labs-Infrastructure, and 2 others: Add Wikidata json dumps to labs in /public/dumps - https://phabricator.wikimedia.org/T100885#1344066 (Hydriz) [14:13:57] MediaWiki-extensions-WikibaseRepository, Wikidata: labels are not shown for redirect - https://phabricator.wikimedia.org/T96553#1344126 (Lydia_Pintscher) [14:14:00] MediaWiki-extensions-WikibaseRepository, Wikidata: Make EntityView UI aware of redirects - https://phabricator.wikimedia.org/T70567#1344125 (Lydia_Pintscher) [14:59:38] Wikidata, Datasets-General-or-Unknown, Patch-For-Review, Wikidata-Sprint-2015-05-05, Wikidata-Sprint-2015-06-02: Add wb_changes_subscription to xml dumps - https://phabricator.wikimedia.org/T98742#1344199 (Lydia_Pintscher) Any estimate when this will go live? [18:27:38] Amir1: Do you use http://scikit-learn.org/stable/ for your Wikidata toys? [18:27:55] multichill: hey, no [18:28:08] Which one do you use? [18:28:11] because it's hard to use them in labs [18:28:32] Wikidata, Datasets-General-or-Unknown, Patch-For-Review, Wikidata-Sprint-2015-05-05, Wikidata-Sprint-2015-06-02: Add wb_changes_subscription to xml dumps - https://phabricator.wikimedia.org/T98742#1344420 (ArielGlenn) It's already live. Next time tables are dumped, this will show up.
[18:28:37] I wrote all of them from scratch and I only used built-in libraries like math [18:29:07] hmmm, that sounds a bit like reinventing the wheel [18:29:17] yes [18:29:21] I didn't like it either [18:29:44] but for revision scoring we use scikit [18:30:01] ok [18:30:33] Would be nice if it could be used to suggest claims to add based on similar items [18:31:28] Amir1: Looks like my bot already added about 50K new statements :-) [18:32:11] btw it doesn't work for ANNs, Revision Scoring uses decision trees so they can use scikit [18:32:24] ANNs? [18:32:48] Artificial neural network = what Kian uses [18:35:03] multichill: I was thinking about creating a system that will be the backbone of AI in Wikidata [18:35:19] but I'm very busy now [18:35:43] hopefully in summer [18:39:46] Running https://tools.wmflabs.org/multichill/queries/wikidata/noclaims_per_wiki.sql now Amir1. Let's see which wikis to focus on ;-) [18:40:02] awesome [18:40:28] * sjoerddebruin is doing a lot of that by hand [18:40:39] I want to work on some AI-based stuff to speed it up [18:40:48] probably tonight :) [18:41:30] I should probably write down the noclaims approach I'm doing right now. Almost forgot how I set it up myself [18:41:57] Yesterday I was like, I need a bot to do that, turns out that I already wrote one I forgot about ;-) [18:42:42] sjoerddebruin: For Dutch Wikipedia I'm not sure how much automatic stuff is left now. We have less than 30.000 claimless items and a lot of them are quite hard and broad [18:42:46] :)) [18:43:00] multichill: Yes, that's why I'm working on it by hand. [18:43:13] Already processed most of last month. [18:43:27] I would love to cut https://nl.wikipedia.org/wiki/Gebruiker:Multichill/Geen_claim in half by bot [18:43:30] Also focusing on fywiki, only 3300. ;) [18:43:50] On the other hand, we have a lot of info already so the AI stuff might be able to help out here [18:44:22] Yeah, but that stuff also works with categories. And most of them are badly categorized.
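multichill's idea of suggesting claims based on similar items, with sjoerddebruin's caveat that it leans on (often badly maintained) categories, can be sketched as a toy majority-vote heuristic: look at items sharing a category with the target and suggest whatever claims most of them carry. This is an illustration only — not how Kian or the revision-scoring tools actually work, and the threshold is an arbitrary assumption.

```python
from collections import Counter


def suggest_claims(target_categories, labelled_items, min_share=0.6):
    """Suggest (property, value) claims for an item.

    labelled_items: iterable of (category_set, claim_list) pairs for items
    that already have claims. A claim is suggested when at least
    `min_share` of the items sharing a category with the target have it.
    Toy heuristic: bad categorisation directly poisons the suggestions,
    which is exactly the concern raised in the chat.
    """
    votes = Counter()
    similar = 0
    for categories, claims in labelled_items:
        if target_categories & categories:  # shares at least one category
            similar += 1
            for claim in claims:
                votes[claim] += 1
    if similar == 0:
        return []
    return [claim for claim, n in votes.items() if n / similar >= min_share]


# Hand-made sample data (hypothetical items, real property/item ids).
items = [
    ({"People", "Writers"}, [("P31", "Q5")]),                  # human
    ({"People"},            [("P31", "Q5"), ("P27", "Q183")]), # human, Germany
    ({"Lakes"},             [("P31", "Q23397")]),              # lake
]
```

For a target in the "People" category this suggests ("P31", "Q5") — instance of human — since both similar items carry it, while the citizenship claim stays below the threshold.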
[18:45:44] But working is hard with an autolist that keeps displaying merged and deleted items. [18:51:25] Yeah, that sucks [18:52:42] And these things take time. https://www.wikidata.org/w/index.php?title=Q20054237&type=revision&diff=221546003&oldid=221520858 [19:01:57] Wikidata, Datasets-General-or-Unknown, Patch-For-Review, Wikidata-Sprint-2015-05-05, Wikidata-Sprint-2015-06-02: Add wb_changes_subscription to xml dumps - https://phabricator.wikimedia.org/T98742#1344430 (Lydia_Pintscher) Open→Resolved a: Lydia_Pintscher Awesome! Thank you! Then we can... [19:02:23] Wikidata, Datasets-General-or-Unknown, Patch-For-Review, Wikidata-Sprint-2015-05-05, Wikidata-Sprint-2015-06-02: Add wb_changes_subscription to xml dumps - https://phabricator.wikimedia.org/T98742#1344433 (Lydia_Pintscher) a: Lydia_Pintscher→None [19:02:32] Wikidata, Datasets-General-or-Unknown, Patch-For-Review, Wikidata-Sprint-2015-05-05, Wikidata-Sprint-2015-06-02: Add wbc_entity_usage table to xml dumps - https://phabricator.wikimedia.org/T98743#1344434 (Lydia_Pintscher) Open→Resolved [19:28:39] Lydia_WMDE: Would it be possible to slip in Commons somewhere at https://www.wikidata.org/wiki/Wikidata:Arbitrary_access? :-) [19:29:09] multichill: hehe yeah. daniel is working on the remaining multilingual issues [19:29:17] Right [19:29:32] when they're fixed we'll set a date [19:29:47] ok [19:31:01] Lydia_WMDE: heyho :) [19:31:13] benestar: hey [19:40:55] Lydia_WMDE: Is the watchlist still working? [19:41:14] I don't see any Wikidata edits in my Watchlist. I just did one to verify and nothing seems to show up [19:41:29] multichill: do you have the extended watchlist? [19:41:34] there it doesn't work yet [19:41:39] I switched to the normal one to check [19:42:06] and not working? [19:44:46] Ah, found it [19:45:13] what was the issue?
[19:45:15] Damn, I had to change like 4 things :P [19:45:24] -.- [19:45:30] we need to fix that then [19:46:15] Disable advanced watchlist, click the Wikidata box and show it in my watchlist [19:46:20] What's up with " Show Wikidata edits by default in recent changes and watchlist (does not work yet with enhanced changes)"? [19:46:44] I would expect that in https://en.wikipedia.org/wiki/Special:Preferences#mw-prefsection-watchlist [19:46:54] Where you say "show bots", show myself, etc [19:47:42] And what's really annoying is that the interface doesn't remember that I want to see Wikidata [19:53:12] mpfh [19:53:15] it really should [19:53:51] ok we'll look at these as part of making enhanced recent changes work (one of the next things) [19:55:41] Yah, query done. Amir1 / sjoerddebruin : https://www.wikidata.org/wiki/User:Multichill/No_claim#Some_statistics for the per-site numbers [19:55:49] Looks like we need to learn Japanese :-( [19:56:03] I teach it to Kian ;) [19:56:52] Looking at the full list makes me assume that wikisource still has some quick wins [20:49:31] Hello! I have a question. Why is https://www.wikidata.org/wiki/Q11310500 notable if it doesn't have a valid sitelink or fulfill a structural need? All I see are statements. [20:49:36] I'm a bit new to Wikidata. [20:50:40] MJ94: No, since it isn't used elsewhere or linked to an article, it doesn't look notable. [20:50:49] it should be merged into https://www.wikidata.org/wiki/Q16234949 [20:50:54] MJ94: feel free to https://www.wikidata.org/wiki/Wikidata:Requests_for_deletions [20:51:11] wikimedia/mediawiki-extensions-Wikibase/master/5a8572e : Translation updater bot The build is still failing. http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/65805944 [20:51:11] it only doesn't have a sitelink because someone removed it in order to add it to another article [20:51:31] Josve05a: {{done}} [20:51:31] How efficient, MJ94! [20:51:46] lol'd :P [20:51:54] nikki: hmm?
[20:52:15] sorry nikki, didn't see your message [20:54:37] MJ94: see https://www.wikidata.org/w/index.php?title=Q11310500&diff=191458706&oldid=166534386 and https://www.wikidata.org/w/index.php?title=Q16234949&diff=191458713&oldid=190945808 - the user deleted the sitelink from the item you linked and then added it to a different item, presumably because they didn't know how to merge them [20:55:02] (which seems to happen quite often :() [20:55:04] SMalyshev: SPARQL endpoint is down :( gives 502 bad gateway [20:55:23] benestar: I'll take a look [20:55:35] ty [20:55:48] nikki, MJ94: well, no harm no foul [20:55:54] nikki: how did you know to find the one they added it to? [20:56:11] it looks like they are different items altogether [20:56:58] benestar: restarted it... looks like it was manually stopped, but I don't remember doing it... [20:57:04] Sorry, let me rephrase [20:57:14] strange [20:57:27] I looked at the history to find out which page it was linked to originally, then checked to see if the page had been deleted, it hadn't, so I clicked the wikidata link in the sidebar to find out what item it's now linked to [20:57:31] nikki: How did you know that they had added it elsewhere? Just by checking their contribs? [20:57:33] works now, thanks [20:57:54] ah, clever [20:57:55] thanks [20:58:21] benestar: np. Eventually, I'll add monitoring to it :) [20:58:49] SMalyshev: some status page would be great which would also show the lag and the number of triples
http://status.wikimedia.org/ [21:02:12] or do something like https://status.github.com/ ^^ [21:02:27] I can see where Wikidata could get addictive fast [21:03:08] benestar: it's not production yet but I'm looking into if it has something in labs [21:03:43] SMalyshev: what do you think about https://tools.wmflabs.org/bene/sparql/ ? [21:04:01] just created a small prototype how sparql queries could be generated more easily [21:04:07] leave a field empty for variables [21:04:30] nikki: still around? [21:04:34] yep [21:04:54] nikki: is there a category or list of items with no sitelinks? [21:05:29] I don't know, I'd quite like to know that too :) [21:05:39] benestar: that's a good start, yeah. but some explanation wouldn't hurt. [21:05:48] benestar: also, some way to link predicates [21:06:20] benestar: i.e. if I want to know what people Elvis Presley was somehow related to, I get this: SELECT ?predicate1 WHERE { [21:06:20] wd:Q610926 ?predicate1 wd:?x . [21:06:20] wd:?x wdt:P31 wd:Q5 . [21:06:20] } [21:06:44] I'd like it to be ?x not wd:?x [21:06:52] i.e. support variables [21:07:04] without variables SPARQL is not that useful [21:07:17] benestar: but overall it's a great idea [21:07:50] nikki: would https://www.wikidata.org/wiki/Q18734917 be notable? [21:08:05] benestar: oh and also so that I can get ?x in select [21:08:47] I would expect so, because of the oxford biography index number [21:09:13] although I'm not the best person to ask, I don't like deleting things :) [21:09:38] nikki: what do you like? :) [21:09:45] having lots of data! :D [21:09:57] :D [21:10:55] SMalyshev: for variables just leave the field empty ;) [21:11:09] it will then include them in the select (not really intuitive :S) [21:11:11] benestar: but that would produce *different* variables [21:11:22] oh, I see what you mean [21:11:52] SELECT ?predicate1 ?object2 ?subject3 WHERE { [21:11:52] wd:Q303 ?predicate1 ?object2 . [21:11:52] ?subject3 wdt:P31 wd:Q5 . 
[21:11:52] } [21:12:05] what I want is that ?object2 and ?subject3 be the same variable [21:14:15] SMalyshev: removed that magic variable creation. If you enter a string which is not Qxxx or Pxxx it assumes it is a variable [21:17:16] benestar: for P31 as predicate it shows wd:P31 but I think it needs to be wdt:P31 [21:17:27] ty [21:17:39] also, SELECT should deduplicate, SPARQL does not allow duplicates in select [21:17:47] like [21:17:48] SELECT ?something ?x ?x WHERE { [21:17:48] wd:Q303 ?something ?x . [21:17:48] ?x wd:P31 wd:Q5 . [21:17:48] } [21:19:00] yes, just fixed that ;) [21:22:15] MJ94: more seriously though, one of the goals of wikidata (as I understand it) is to be able to create lists from the data and there are lots of things which aren't notable enough for their own wikipedia article which would need to be notable enough for us if we want to be able to include it in a list, so for me the important thing is identifying the item, not having sitelinks or links from another item (although obviously [21:22:16] those do help a lot) [21:22:57] that's just my opinion though, some things that I've said I think are notable other people have said they don't think are notable [21:42:05] How would I use `wbsetclaimvalue` to set the datavalue property of the mainsnak of claim Q4115189$66001212-4f38-5752-744f-17ea89417127 to something different? API always returns "Invalid snak (Can only construct StringValue from strings)" [21:43:16] action: wbsetclaimvalue | claim: Q4115189$66001212-4f38-5752-744f-17ea89417127 | value: {"value":"\" onmouseover = \"alert('There is a security vulnerable. 
Please contact the next admin.')\" target=\".png","type":"string"} [21:44:05] snaktype: value | summary: test | token: | baserevid: 221430772 [21:44:39] https://www.wikidata.org/wiki/Special:ApiSandbox#action=wbsetclaimvalue&format=json&claim=Q4115189%2466001212-4f38-5752-744f-17ea89417127&value={%22value%22%3A%22\%22%20onmouseover%20%3D%20\%22alert%28%27There%20is%20a%20security%20vulnerable.%20Please%20contact%20the%20next%20admin.%27%29\%22%20target%3D\%22.png%22%2C%22type%22%3A%22string%22}&snaktype=value&summary=test&token=FILLINTOKENHERE&baserevid=221430772 [21:45:18] If they're not notable enough for Wikipedia, nikki, what are they notable for? Wikipedia's notability guideline is pretty good, IMO, although that may sound snobby? [21:50:44] MediaWiki-extensions-WikibaseRepository, Wikidata: Misleading error message when using wrong datatype in wbcreateclaim - https://phabricator.wikimedia.org/T93668#1344517 (Rillke) Just hit this with `wbsetclaimvalue`; still confused of what WDB? expects me to feed-in. IRC-log: (11:41:59 PM): How would I... [22:04:49] I guess I could also use one of the JS components installed to Commons but to be frank, this is an utter mess. None of the links on Special:Version goes to a documentation page. Instead I should clone each repo and build the docs myself? This is quite wiki-user friendly.
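SMalyshev's two review points on benestar's query-builder prototype — item predicates should get the `wdt:` prefix, and a variable used in two patterns must stay one variable, listed once in SELECT — can be sketched as a tiny query assembler. This is a toy, not benestar's actual code, and it assumes the `wd:`/`wdt:` prefixes are predeclared, as they are on the WDQS endpoint.

```python
def build_query(triples):
    """Assemble a SPARQL SELECT from (subject, predicate, object) triples.

    Tokens starting with '?' are variables and pass through unchanged, so
    the same token names the same variable everywhere. Entity ids get the
    wd: prefix; ids in predicate position get wdt: (truthy statements).
    SELECT lists each variable exactly once, since SPARQL forbids
    duplicate variables in the projection.
    """
    def term(token, is_predicate=False):
        if token.startswith("?"):
            return token
        return ("wdt:" if is_predicate else "wd:") + token

    seen, select_vars, patterns = set(), [], []
    for s, p, o in triples:
        patterns.append(f"  {term(s)} {term(p, True)} {term(o)} .")
        for token in (s, p, o):
            if token.startswith("?") and token not in seen:
                seen.add(token)
                select_vars.append(token)
    return ("SELECT " + " ".join(select_vars) + " WHERE {\n"
            + "\n".join(patterns) + "\n}")
```

For the Elvis-style example from the chat, `build_query([("Q303", "?p", "?x"), ("?x", "P31", "Q5")])` yields one shared `?x` across both patterns instead of the broken `wd:?x` / split-variable forms shown above.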
[22:10:56] MJ94: hm, I could probably write a lot in answer to that :) I guess it mainly comes down to wikipedia's aim is different from wikidata's, wikipedia wants to create nice informative articles to read, wikidata wants to provide structured data [22:11:08] things which can be linked to other datasets are useful as a way of, well, linking datasets and making the data useful outside of wikimedia projects [22:11:32] other things can be useful as a way of representing the data in wikipedia in a structured way [22:11:47] nikki: I was under the impression that Wikidata wanted to collect data from other Wikimedia projects :) [22:11:52] e.g. wikipedia might say the members of some obscure football team aren't notable enough and simply list them on the article for the football team, because that's a better way to present the data for someone to read than lots of little stubs. whereas wikidata would have separate items for each person and link them as a member of the football team - it's actually still the same data, but we can't apply wikipedia's criteria if [22:11:52] we want to store the same data [22:11:58] Is its aim larger than that? [22:12:22] Just in case `wbsetclaimvalue` is not the right choice: I am trying to replace file use in Wikidata when files are moved on Commons. As a last resort, I will simply upload the whole Item. Or to be more specific Commons users will upload Item entries in whole, hehe. 
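On Rillke's `wbsetclaimvalue` failure above: the "Can only construct StringValue from strings" error is consistent with passing a full DataValue serialization (`{"value": ..., "type": "string"}`) where, as far as I can tell, the API expects only the inner value, JSON-encoded — for a string property just a JSON string. A hedged sketch of building the POST parameters (the exact parameter handling is an assumption; check the live API docs):

```python
import json


def set_string_claim_params(claim_guid, new_value, token, baserevid=None):
    """Build POST parameters for action=wbsetclaimvalue on a string claim.

    Note the `value` field: it is the JSON encoding of the bare string
    itself (e.g. '"Some file.png"'), NOT a {"value": ..., "type": ...}
    wrapper — passing the wrapper is what appears to trigger
    "Can only construct StringValue from strings".
    """
    params = {
        "action": "wbsetclaimvalue",
        "format": "json",
        "claim": claim_guid,
        "snaktype": "value",
        "value": json.dumps(new_value),
        "token": token,
    }
    if baserevid is not None:
        params["baserevid"] = str(baserevid)
    return params
```

These parameters would then be POSTed to the wiki's api.php with an edit token; for Rillke's file-renaming use case, `new_value` would be the new Commons file name.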
[22:12:43] nikki: oh, interesting [22:22:12] Wikibase-DataModel-JavaScript, Wikidata: Missing online documentation - https://phabricator.wikimedia.org/T101662#1344543 (Rillke) NEW [22:23:27] oh and linking to other datasets is useful as a way of being able to check whether the data we have is correct (there's apparently a group of people working on that) and also as a way of fetching more data [22:29:55] (which I guess is mostly useful as a way of checking data of the things which are linked to wikipedia, but it's also good as a way to try and avoid data ending up in the wrong place, since when we don't have someone else's id, it's not clear whether we don't have that item at all, or whether the item just doesn't have someone else's id added yet)