[08:04:43] I'm not sure if anyone's around, so I'll just put this here: I'm trying to parse the JSON dump of wikidata, and the very first line fails to parse due to this string: "{"language":"ml","value":"\u0d05\u0d23\u0d4d\u0d21\u0d15\2dc\uac04\...
[08:04:58] specifically the "\2dc"
[08:06:21] I can't find any docs on the character encoding of the JSON dump, so I'm not sure if this is a case of my error or something wrong with the dump
[08:32:20] !nyan
[08:32:20] ~=[,,_,,]:3
[08:35:57] wikidata-query-service, Wikidata: Define which data the query service would store - https://phabricator.wikimedia.org/T86278#965363 (Lydia_Pintscher) Things I can think of right now - maybe more: * definitely should also import aliases. (Might already but not in the description here so making sure.) * probabl...
[08:52:40] (CR) Henning Snater: [C: +2] Fix fingerprint values in entityview when scraping from html [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183567 (https://phabricator.wikimedia.org/T75749) (owner: Aude)
[08:56:15] (Merged) jenkins-bot: Fix fingerprint values in entityview when scraping from html [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183567 (https://phabricator.wikimedia.org/T75749) (owner: Aude)
[08:59:16] hi guys, anyone here happen to be an admin on the English Wikipedia?
[08:59:44] I want to make a Wikidata-related template edit
[09:00:04] https://en.wikipedia.org/wiki/Template_talk:Infobox_person#Template-protected_edit_request_on_9_January_2015
[09:11:37] (PS2) Henning Snater: Fix siteselector tests [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183239 (owner: Adrian Lang)
[09:16:29] (CR) Henning Snater: [C: +2] Fix siteselector tests [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183239 (owner: Adrian Lang)
[09:19:19] (Merged) jenkins-bot: Fix siteselector tests [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183239 (owner: Adrian Lang)
[09:23:39] Wikidata-Sprint-2015-01-08§, MediaWiki-extensions-WikibaseRepository, Wikidata: languages mixed up in "other languages" box - https://phabricator.wikimedia.org/T75749#965407 (Snaterlicious) Open→Resolved
[09:46:16] h/wqw
[10:12:18] Wikidata: Have the populateSitesTable.php maintenance script auto-detect potentially enforced protocols - https://phabricator.wikimedia.org/T85859#965465 (Lydia_Pintscher) p:Triage→Normal
[10:31:36] MediaWiki-extensions-WikibaseRepository, Wikidata: Redesign Item UI for Wikidata repo - https://phabricator.wikimedia.org/T54136#965490 (Lydia_Pintscher) p:High→Low
[10:41:52] DanielK_WMDE_: hangout in a few minutes?
[11:13:02] sjoerddebruin: Finished your update yesterday, btw ;)
[11:13:25] aude: REVIEWWWW https://github.com/Wikidata-lib/PropertySuggester-Python/pulls
[11:13:41] I've already applied both PRs to create the update tonight :P
[11:17:15] Lydia_WMDE: sorry, was distracted
[11:17:23] np
[11:17:26] working on the subscription service, see the board
[11:17:33] cool
[11:17:33] making good progress
[11:18:18] hoo: done
[11:19:09] aude: Thanks :)
[11:24:46] hoo: will test then
[11:33:03] MediaWiki-extensions-WikibaseRepository, Wikidata: suggestion shown twice in suggester - https://phabricator.wikimedia.org/T84915#965533 (Lydia_Pintscher) Open→Resolved a:Lydia_Pintscher
[11:42:16] aude: Is there any way to do memory-based profiling on the cluster?
[11:42:42] I doubt there is... but I really need it :S
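[Editor's note: the parse failure reported at 08:04 is caused by an illegal JSON escape, not by the character encoding — "\2dc" is a backslash followed by "2", which is not one of the escapes JSON permits ("\"", "\\", "\/", "\b", "\f", "\n", "\r", "\t", or "\uXXXX" with four hex digits), so the dump line itself is malformed. A minimal Python sketch for locating such escapes; the regex is a rough heuristic and the trailing-comma handling assumes the one-entity-per-line layout of the dump:]

    import json
    import re

    # Flags escape sequences that are not legal JSON: either \u without
    # four hex digits, or a backslash before a disallowed character.
    BAD_ESCAPE = re.compile(r'\\(?:u(?![0-9a-fA-F]{4})|[^"\\/bfnrtu])')

    def parse_dump_line(line):
        line = line.rstrip().rstrip(',')  # entity lines end with a comma
        try:
            return json.loads(line)
        except ValueError:
            for m in BAD_ESCAPE.finditer(line):
                print('illegal escape at offset %d: %r'
                      % (m.start(), line[m.start():m.start() + 8]))
            raise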
[11:43:00] might be
[11:43:20] i would ask ori or bd808|BUFFER or tim
[11:43:35] * aude would like a session about this at the summit
[11:43:37] Good idea
[11:43:49] and how to make best use of xhprof
[11:44:10] locally, i have a hack in my local settings that tells me memory usage
[11:44:25] same way it tells me response time
[11:44:48] wikidata-query-service, Wikidata: Define which data the query service would store - https://phabricator.wikimedia.org/T86278#965535 (Tobi_WMDE_SW) I assume qualifiers are covered?
[11:45:05] * egonw is working on the WD4R proposal
[11:46:37] Lydia_WMDE: the link in https://phabricator.wikimedia.org/T86278 is broken
[11:47:05] Tobi_WMDE_SW: fixing
[11:47:21] wikidata-query-service, Wikidata: Define which data the query service would store - https://phabricator.wikimedia.org/T86278#965537 (Lydia_Pintscher)
[11:47:39] wikidata-query-service, Wikidata: Define which data the query service would store - https://phabricator.wikimedia.org/T86278#964965 (Lydia_Pintscher) Yeah they are.
[12:10:49] is there some insight in the number of changes per day?
[12:11:34] or per whatever time period?
[12:11:57] that is hard to determine from "Recent changes"
[12:12:01] egonw: You mean just the number of edits?
[12:12:05] yes
[12:13:06] I can give you edits/month in a bit
[12:13:28] yeah, that would do!
[12:14:56] A lot. :P
[12:15:17] that works for me, but I don't think the reviewers will accept this :)
[12:17:00] egonw: http://fpaste.org/167622/14208057/ that's per month
[12:17:08] I can also give you per day, if you need that
[12:17:16] but not sure it's very useful
[12:17:31] per month will be fine...
[12:17:34] you can calculate that...
[12:17:58] some idea of the growth in edits per day over a month would be useful too, but not essential
[12:18:16] 201405 is huge, wasn't that the month that GerardM screwed up?
[12:18:33] oh!
[12:18:47] this is not just for one month, but for many!
[12:18:48] Super!
[12:19:34] hoo: you should tweet that table! :)
[12:19:41] love to RT it
[12:19:57] egonw: Running per day stats now
[12:20:01] maybe I'll plot both
[12:20:07] cool! suggestion:
[12:20:15] save as CSV and then upload to Figshare
[12:20:26] then I can cite those numbers in the proposal
[12:20:40] and I think it may even automatically plot it... not sure...
[12:24:43] (PS1) Daniel Kinzler: Report progress while populating entity_usage table [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183836
[12:25:38] hoo: https://twitter.com/egonwillighagen/status/553527945445511170
[12:25:52] hoo: something better citable really welcome!
[12:26:03] egonw: On that :)
[12:26:37] (PS1) Daniel Kinzler: Bump default batch size for populateEntityUsage.php [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183837
[12:26:50] (Put a . before @wikidata, so all your followers can see it)
[12:27:01] Now only people who follow @Wikidata will see it.
[12:27:34] sjoerddebruin: really? didn't know that... I've seen people use that, but tools use .@ but not @ ?
[12:27:49] It's just a mention now.
[12:28:00] yes, got it... thanx!
[12:28:08] will do that from now on...
[12:28:17] It's the most common way to do this. :)
[12:28:51] yeah, customs change over time, and I missed this one...
[12:28:59] the education is appreciated!
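[Editor's note: for the "maybe I'll plot both" idea above, a minimal sketch of charting the per-day CSV (WikidataEditsDay.csv, linked just below) with pandas/matplotlib; the column names and date format are assumptions, since the layout of the CSV isn't shown in the log:]

    import pandas as pd
    import matplotlib.pyplot as plt

    # Assumed layout: one "day,edits" pair per row, days as YYYYMMDD.
    edits = pd.read_csv('WikidataEditsDay.csv', names=['day', 'edits'])
    edits['day'] = pd.to_datetime(edits['day'], format='%Y%m%d')

    edits.plot(x='day', y='edits', legend=False)
    plt.ylabel('edits per day')
    plt.title('Wikidata edits per day')
    plt.savefig('wikidata-edits-per-day.png')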
[12:29:04] * egonw feels old now :)
[12:30:29] (CR) Aude: Report progress while populating entity_usage table (1 comment) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183836 (owner: Daniel Kinzler)
[12:30:37] egonw: https://people.wikimedia.org/~hoo/WikidataEditsDay.csv
[12:31:40] deleted edits etc. are included
[12:31:44] going to upload it to Figshare?
[12:31:48] yes, that is good!
[12:31:58] the context is realtime pulling in those changes...
[12:32:10] which thus has to be a process taking 0.1s at most
[12:32:25] and must include deleting things in the derivative
[12:35:59] (PS1) Daniel Kinzler: Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184)
[12:36:51] egonw: Sorry, I've never worked with figshare
[12:36:59] but feel free to do whatever you want with that data
[12:38:01] http://figshare.com/ is easy... you can just upload it there, but the advantage is a clear license statement, and that people can credit you... (it gets a DOI and the conclusions can be added to the Wikidata entry in Wikipedia :)
[12:38:37] (CR) jenkins-bot: [V: -1] Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184) (owner: Daniel Kinzler)
[12:38:53] (PS2) Daniel Kinzler: Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184)
[12:41:56] (CR) jenkins-bot: [V: -1] Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184) (owner: Daniel Kinzler)
[12:44:20] (PS3) Daniel Kinzler: Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184)
[12:47:56] (CR) jenkins-bot: [V: -1] Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184) (owner: Daniel Kinzler)
[12:48:49] (PS4) Daniel Kinzler: Introducing changes subscription infrastructure. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183839 (https://phabricator.wikimedia.org/T86184)
[13:07:48] egonw: will you upload it? I'd love to tweet a chart
[13:10:39] hoo: can I release the CSV as CCZero?
[13:10:51] Lydia_WMDE, hoo: then I can take this forward this afternoon
[13:11:03] \o/
[13:11:10] (upload to Figshare and make graphical plot...)
[13:11:29] but only if hoo will not do this himself... because credit goes to him for the numbers
[13:12:42] egonw: Go ahead, I won't find the time for further number crunching today ;)
[13:47:58] !nyan | Lydia_WMDE
[13:47:58] Lydia_WMDE: ~=[,,_,,]:3
[13:48:32] <3
[13:48:39] :D
[13:50:07] ugh. i think i just found a *really* nasty issue with usage tracking. or rather, with all custom data updates registered in a parser output object...
[13:50:09] eek.
[13:52:18] ohnoes
[13:53:01] Can we pretend it's not a problem until it hits us *hard*? :P
[13:53:46] sjoerddebruin: any autolist outages today? :)
[13:54:01] It seems fine today.
[13:54:07] yay :)
[13:54:36] But the lag is still there. :/
[13:56:47] Lydia_WMDE: still investigating. looks like a core bug
[13:56:52] would affect a LOT of things
[13:57:04] ok
[13:57:22] hoo: bad hoo! :D
[13:57:54] Yeah! bad hoo!
[13:58:00] Lydia_WMDE: why is he bad now?
[13:58:03] Yeah! bad hoo!
[13:58:27] :D
[13:58:45] * JeroenDeDauw blames hoo for the rain
[13:58:57] JeroenDeDauw: he suggested ignoring problems until they hit us hard in production ;-)
[13:59:07] No rain over here, so obviously not my fault :D
[13:59:13] lol
[13:59:38] hoo: no, it means it's your fault ofc. No rain at your place, rain here, so you are being evil
[14:00:09] sound logic that
[14:00:11] +1
[14:00:13] Lydia_WMDE: the hail is also kind of a problem and it did hit me hard earlier :|
[14:00:21] hah
[14:00:35] staying inside ftw
[14:01:12] Lydia_WMDE: yeah. But I want to be inside the office...
[14:01:24] i see
[14:01:30] And that evil Abraham has again not included a teleporter in our budget
[14:01:43] bah
[14:01:44] fail
[14:02:20] Do we get a 3d printer now?
[14:02:26] I need to... uhm ... print my code
[14:03:39] this seems to cause problems: https://gerrit.wikimedia.org/r/#/c/174628/
[14:04:15] "Causes bug 1"
[14:04:20] lovely
[14:08:58] aude, hoo: it seems like the "prepared" parser output is missing our handlers.
[14:09:09] data update handlers, i mean
[14:09:20] which makes sense, they cannot be serialized
[14:09:26] so i suppose they get skipped
[14:09:32] ugh!
[14:09:42] this is a pretty massive issue...
[14:10:20] just the usage tracking?
[14:10:25] or other things at this point?
[14:11:09] argh
[14:11:13] already in production?
[14:11:17] yes
[14:11:22] $wgAjaxEditStash
[14:11:25] seems to be a setting
[14:11:30] Lydia_WMDE: Think that's the label issues we saw
[14:11:40] yea. found that. turning it off doesn't prevent stale stuff from the cache being used :)
[14:11:45] will submit a patch for that
[14:12:25] hoo: ahhhhhh
[14:12:26] ok
[14:12:54] what label issue?
[14:13:13] aude: The last edit one did to labels will never show up
[14:13:17] but the one before that
[14:13:31] ugh
[14:13:41] aude, hoo: so, the issue is this: extensions can put custom DataUpdates into the ParserOutput. They will be executed when the content is saved. They get lost when the ParserOutput goes to memcached (at least the ones that can't be serialized).
[14:14:07] usually, that's not a problem, because the PO is generated, then content is saved, then stuff is cached. the cached version is used only for display, so we don't need the data updates there
[14:14:38] but now, the content is parsed via the API "preemptively", while the user is still on the edit page.
[14:14:54] so WikiPage gets the PO from the cache for processing - and loses all the DataUpdates
[14:15:29] nobody noticed, because the one standard update, LinksUpdate, is added to the list on the fly, based on the links in ParserOutput
[14:15:33] so... yay.
[14:15:33] $useCache
[14:15:34] I see... very nasty
[14:15:34] = false
[14:15:38] we can't make this serializable
[14:16:48] we'd need to somehow restore the data updates
[14:16:59] we could use serializable updates, but that would kill CI
[14:17:02] arg
[14:17:12] will file a bug. i don't have a solution right now
[14:17:23] maybe i can get aaron later, to discuss it
[14:25:26] we crossed 14000 active users \o/
[14:25:30] highest ever
[14:25:41] 13700 at the end of december
[14:26:19] Nice :)
[14:26:45] There's a few wikidata-related OOMs in the logs
[14:26:49] Not like spamming or anything
[14:27:06] Ok... but why on earth are those logged
[14:27:10] but the CA ones not
[14:27:31] looks like 11 in total in the logs
[14:27:34] only one isn't wikidata
[14:28:04] What logs exactly are you referring to?
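[Editor's note: the failure mode Daniel describes at 14:13–14:15 is language-agnostic — members that can't be serialized get silently dropped when an object makes a round trip through the cache. A toy Python analogue of ParserOutput::__sleep, purely illustrative and not how MediaWiki itself is implemented:]

    import pickle

    class ParserOutputLike:
        """Stand-in for MediaWiki's ParserOutput."""

        def __init__(self):
            self.links = ['Q64']
            # Callbacks akin to custom DataUpdates; lambdas can't be pickled.
            self.data_updates = [lambda: print('track entity usage')]

        def __getstate__(self):            # like ParserOutput::__sleep
            state = self.__dict__.copy()
            del state['data_updates']      # silently skipped
            return state

        def __setstate__(self, state):
            self.__dict__.update(state)
            self.data_updates = []         # gone after the cache round trip

    cached = pickle.loads(pickle.dumps(ParserOutputLike()))
    assert cached.links == ['Q64']
    assert cached.data_updates == []       # the updates were lost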
[14:28:10] fatal logs on fluorine
[14:28:15] looks like they're all in job runners
[14:29:57] oh, fatal.log exists again :)
[14:30:19] 11 entries in ~12 hours doesn't seem bad going to me
[14:30:23] exception log is spammier
[14:30:33] Reedy: That's because it's incomplete
[14:31:13] fatals from appservers probably aren't logged at all
[14:31:33] lol
[14:31:35] The once I forced earlier today all didn't appear on fluorine
[14:32:22] * ones
[14:36:56] hm.. can't add values with url datatype
[14:37:09] broken on wikidata.org, beta and test
[14:37:09] on Wikidata?
[14:37:13] but works for me locally
[14:37:45] hoo: getting {"servedby":"mw1231","error":{"code":"failed-save","info":"","*":"See https://www.wikidata.org/w/api.php for API usage"}}
[14:37:49] mh, it worked for me earlier today
[14:37:57] oh *grrr*
[14:38:01] I'm having a look
[14:40:17] Tobi_WMDE_SW: https://www.wikidata.org/w/index.php?title=Q4115189&action=history
[14:40:22] worked for me just now
[14:40:34] What item did you try to edit and what property did you try to add exactly?
[14:44:20] hoo: whuut? it works for links to wikidata but not for others..
[14:44:27] tried on the sandbox item
[14:44:43] tried adding a reference with property "referenced url"
[14:44:49] but also tried adding a claim
[14:45:14] works for https://www.wikidata.org/wiki/Q4115189 but does not work for e.g. http://www.google.com
[14:45:46] Still can't reproduce :(
[14:45:50] Used https://twitter.com/wikidata/ as url now
[14:46:26] hoo: can you try http://www.google.com?
[14:46:29] feeling mocked
[14:46:31] :)
[14:46:45] sure
[14:46:55] does not work for me
[14:47:00] hoo: and can you try as anonymous user?
[14:47:06] i don't get any details as an error
[14:47:21] aude: yes, exactly
[14:47:28] phew.. relieved
[14:47:34] uhm... worked for me?
[14:47:43] but why is it working for hoo?
[14:47:48] * aude tries on master
[14:47:55] is wmde's ip blacklisted or something
[14:48:05] i am logged out
[14:48:08] aude: I have the same issue on test and beta
[14:48:17] works for me on my local machine
[14:48:22] https://www.wikidata.org/w/index.php?title=Q4115189&diff=186865167&oldid=186864801
[14:48:54] works logged in but appears broken
[14:48:57] hoo: aude: strange thing is, after hoo added a specific url it starts working for me..
[14:49:12] so, http://www.google.com is now working for me after hoo added it
[14:49:16] http://snag.gy/XDOCu.jpg
[14:49:19] but not other urls
[14:49:22] where is the link for reference url?
[14:49:29] oh now it is there
[14:50:04] could it be spam blacklist or abuse filter or some of these?
[14:50:13] http://twitter.com
[14:50:17] malformed url
[14:50:27] https is ok
[14:50:34] aude: It says that if you have extra spaces in the front
[14:50:37] doesn't trim
[14:50:42] gr
[14:50:52] But I think that's that way for some time now
[14:52:18] hoo: aude: so why do all the urls hoo has added work now? https://www.wikidata.org/wiki/Q4115189 https://twitter.com/wikidata/ and http://www.google.com are now working for me but nothing else..
[14:52:26] does not like http://openstreetmap.org
[14:52:37] aude: let hoo add it once, then it will work
[14:53:03] worked
[14:53:09] aude: try again
[14:53:12] :)
[14:53:18] works!
[14:53:20] omg
[14:53:20] hm
[14:53:23] sooo
[14:53:25] what?
[14:53:53] did you try logged out yet or so?
[14:53:56] That's very weird
[14:53:59] i am logged out
[14:54:04] oh
[14:54:05] hoo: I tried logged in and logged out
[14:54:34] Lydia_WMDE: any idea why I can't add tasks on phabricator to the #wikidata tag?
[14:54:56] addshore: you need special javascript
[14:55:03] addshore: You have to do that manually: Scrape the ID of the wikidata project and then add that by hand
[14:55:13] ahhhh! where can I find this funky javascript? :D
[14:55:18] Lydia_WMDE: ^
[14:55:19] addshore: or go here: https://phabricator.wikimedia.org/project/view/125/ and click "new task"
[14:55:35] then it has #wikidata in by default
[14:56:08] https://phabricator.wikimedia.org/T75737
[14:56:25] https://phabricator.wikimedia.org/T75737#821012
[14:56:43] :D
[14:57:13] Is there a bug open for the funky thing that happens with sitelink sections when empty and you click on edit then click on cancel and then can't edit that section again?
[14:57:18] Or should I make one? ;p
[14:57:31] addshore: does it happen on test.wikidata?
[14:57:35] addshore: is that on wikidata.org or on beta?
[14:57:40] or test?
[14:57:40] wikidata.or
[14:57:42] g
[14:57:46] otherwise, doubt we will bother to fix between now and tuesday
[14:57:48] *checks test*
[14:57:51] addshore: can you try on test and/or beta?
[14:58:20] hoo: any ideas what's with the url datatype thing?
[14:58:35] doesn't happen on test :)
[14:58:41] As long as I can't reproduce it, no...
[14:58:47] Let me try logged out
[14:59:03] would be nice if something was in the error logs
[14:59:17] hoo: yeah try logged out and use urls you didn't try before
[15:00:01] Tobi_WMDE_SW: Logged out it doesn't work
[15:00:10] hm.. ok
[15:00:18] at least you can reproduce now
[15:00:20] :)
[15:00:41] ""
[15:02:40] hoo: http://figshare.com/articles/Wikidata_edits_per_day_2012_2014_/1286885
[15:02:49] MediaWiki-extensions-WikibaseRepository, Wikidata: Automatically drop redundant sitelink to redirect when merging - https://phabricator.wikimedia.org/T85347#965726 (Addshore) @Lydia_Pintscher, if discussion leads to us wanting to do this feel free to assign me and I'll take a poke when I find the time!
[15:03:10] hoo: please let me know your full name if you'd like to be added as "author"
[15:03:23] egonw: hoo man
[15:03:26] * Reedy giggles
[15:03:44] :P
[15:03:44] MediaWiki-extensions-WikibaseRepository, Wikidata: Automatically drop redundant sitelink to redirect when merging in wbmergeitems api module - https://phabricator.wikimedia.org/T85347#965730 (Addshore)
[15:03:47] Marius Hoch
[15:03:49] if you want
[15:04:19] ok, added
[15:07:15] so, we are hitting dieStatus
[15:07:23] from our error reporter
[15:11:50] (PS2) Hoo man: Update FormatAutocomments hook for I36c3a9e5 [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[15:13:08] MediaWiki-extensions-WikibaseRepository, Wikidata: Automatically drop redundant sitelink to redirect when merging in wbmergeitems api module - https://phabricator.wikimedia.org/T85347#965750 (Lydia_Pintscher) This seems fine to me. Subscribing a few more people for input.
[15:14:15] (CR) jenkins-bot: [V: -1] Update FormatAutocomments hook for I36c3a9e5 [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[15:15:06] Mobile-Web, Wikidata, MediaWiki-extensions-WikibaseRepository: Image thumbnail urls should be included where applicable in wikidata API response for commonsMedia - https://phabricator.wikimedia.org/T76827#965756 (Addshore) For example https://upload.wikimedia.org/wikipedia/commons/thumb/c/c9/Fuerteventura_sun...
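[Editor's note: a sketch of reproducing the failing save from 14:37 through the API directly, so the otherwise-empty "failed-save" error can be inspected; it targets the sandbox item used above, the property ID P854 ("reference URL") is an assumption — any URL-datatype property would do — and wbcreateclaim expects the value as a JSON-encoded string:]

    import json
    import requests

    API = 'https://www.wikidata.org/w/api.php'
    session = requests.Session()

    # Anonymous CSRF token, matching the logged-out case tested above.
    token = session.get(API, params={
        'action': 'query', 'meta': 'tokens', 'format': 'json',
    }).json()['query']['tokens']['csrftoken']

    result = session.post(API, data={
        'action': 'wbcreateclaim',
        'entity': 'Q4115189',              # the sandbox item from the log
        'property': 'P854',                # assumed URL-datatype property
        'snaktype': 'value',
        'value': json.dumps('http://www.openstreetmap.org'),
        'token': token,
        'format': 'json',
    }).json()

    print(result.get('error', result))     # surfaces the "failed-save" code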
[15:16:40] MediaWiki-extensions-WikibaseRepository, Wikidata: Automatically drop redundant sitelink to redirect when merging in wbmergeitems api module - https://phabricator.wikimedia.org/T85347#965758 (Sjoerddebruin) Would be great if this also works with the real duplicates (same sitelink on two items), but I like the...
[15:17:37] addshore: thanks :)
[15:18:52] (PS3) Hoo man: Update FormatAutocomments hook for I36c3a9e5 [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[15:19:34] (CR) Hoo man: "Fixed tests, avoided string concat with false." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[15:19:52] (CR) Hoo man: [C: +1] "Would like someone to sign-off my changes." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[15:22:35] aude: Are you still investigating that API thing?
[15:23:10] somewhat, i can try
[15:23:15] api thing? :P
[15:23:22] but nasty without anything in the logs
[15:23:58] * aude has things like spam blacklist and abusefilter enabled locally and doesn't have the issue
[15:24:07] Let me try locally
[15:24:45] suppose if it is reproducible on beta, then might be able to do more debugging there
[15:25:10] works locally :/
[15:25:27] although i always get flagged as "suspicious" when editing logged out on beta
[15:25:57] can reproduce there
[15:28:15] on beta, i see stuff like Fatal error: Class undefined: ProfilerSimpleText
[15:28:47] likely unrelated though
[15:31:08] something about captcha
[15:31:18] get(fancycaptcha:dirlist:wikidatawiki
[15:31:23] oh, that's possible
[15:31:24] [memcached] result: NOT FOUND
[15:31:39] Cache miss for mwstore://captcha-backend/captcha-render/e/8/1 captcha listing
[15:31:47] My Wikidata account of course has the right to skip captchas
[15:31:59] But yours should as well
[15:32:04] logged out not
[15:32:23] Sure, that's why it stopped working for me logged out
[15:32:32] MediaWiki-API, Wikidata, MediaWiki-Page-editing: ApiStashEdit silently loses custom DataUpdates. - https://phabricator.wikimedia.org/T86305#965789 (Lydia_Pintscher)
[15:32:53] MediaWiki-API, Wikidata, MediaWiki-Page-editing: ApiStashEdit silently loses custom DataUpdates. - https://phabricator.wikimedia.org/T86305#965790 (daniel) The only fix I can think of offhand would be to check if there are custom updates, and if there are, don't stash the edit. That would only happen after bu...
[15:34:10] with a new account, logged in, doesn't work
[15:34:19] probably has to do with captcha
[15:34:46] :/
[15:35:02] i can put urls into the description and aliases
[15:36:51] How did we handle that before?
[15:37:06] Our UI never supported showing captchas
[15:39:18] never supported it
[15:40:40] jzerebecki: http://earth.nullschool.net/#current/wind/surface/level/orthographic=-11.85,51.78,1490
[15:42:49] * aude pulls all my extensions
[15:43:00] could be some recent change to one of them
[15:43:00] Lydia_WMDE: BBQ weather... :P
[15:43:18] aude: I think the captcha thing was recently largely updated by Tim
[15:43:20] could be related
[15:43:25] haha
[15:43:32] true
[15:43:48] hoo: beta has entire debug log, if you want to look
[15:43:53] could be something else there also
[15:43:58] oO all of it?
[15:44:04] pretty much yes
[15:44:08] "web.log"
[15:44:22] see CentralAuthHooks::onUserLoadFromSession: no token or session
[15:44:26] also
[15:44:33] don't know if that is normal, since i am logged out
[15:45:09] aude: I think it is
[15:45:17] we have that in production extremely often as well
[15:45:28] k
[15:45:29] * hoo is doing too much CentralAuth recently :P
[15:46:00] aude: Where do the logs reside nowayday?
[15:46:05] * nowadays
[15:46:26] might just be beta config
[15:46:28] got it
[15:46:36] * aude tries adding a blacklist link to the community portal
[15:46:45] "Error": "Incorrect or missing CAPTCHA"
[15:46:57] but i got captchas on wikidata
[15:48:22] oooo
[15:48:31] (PS1) Hoo man: Add missing space to wfDebugLog call in SettingsArray [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183860
[15:48:34] get the same error on test.wikidata
[15:49:00] but then do see a captcha on the bottom, next to the save button
[15:50:27] [WikidataBrowserTests] tobijat pushed 1 new commit to master: http://git.io/0u7epg
[15:50:27] WikidataBrowserTests/master 8384f5f Tobias Gritschacher: Adjust sitelink tests to work with SiteIdToInterwiki Gadget
[15:51:00] aude: test.wikidata? Or beta?
[15:51:07] test
[15:51:14] https://test.wikidata.org/wiki/Wikidata:Test_1417775411.63
[15:51:34] i see the error, which probably triggers dieStatus
[15:51:43] in wikibase api
[15:52:02] but shows an error here and also strangely a captcha
[15:56:43] Wikidata, MediaWiki-extensions-WikibaseClient: track entity usage on client pages - https://phabricator.wikimedia.org/T49288#965841 (daniel)
[15:56:57] Wikidata, MediaWiki-extensions-WikibaseClient: Use ArticleEditUpdates hook for usage tracking, instead of ParserAfterParse - https://phabricator.wikimedia.org/T86308#965844 (Lydia_Pintscher)
[15:57:10] MediaWiki-extensions-WikibaseRepository, Wikidata: Track the subscriptions - https://phabricator.wikimedia.org/T86185#965847 (daniel)
[15:57:29] Wikidata, MediaWiki-extensions-WikibaseClient: Use ArticleEditUpdates hook for usage tracking, instead of ParserAfterParse - https://phabricator.wikimedia.org/T86308#965848 (daniel) p:Normal→High
[15:58:08] die Status, hm. What gender is Status in German?
[16:00:06] Nemo_bis: male: der Status
[16:00:53] :)
[16:01:12] Like Italian, for once
[16:01:24] (PS5) Daniel Kinzler: Introducing changes subscription management. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183835 (https://phabricator.wikimedia.org/T86184)
[16:03:24] DanielK_WMDE_: Still around?
[16:04:59] (CR) Anomie: [C: +1] "Your changes look fine to me." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[16:05:37] (CR) jenkins-bot: [V: -1] Introducing changes subscription management. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183835 (https://phabricator.wikimedia.org/T86184) (owner: Daniel Kinzler)
[16:06:26] mariushoch/mediawiki-extensions-Wikibase/master/76df0e7 : jenkins-bot The build passed. http://travis-ci.org/mariushoch/mediawiki-extensions-Wikibase/builds/46458409
[16:07:10] (CR) Hoo man: [C: +2] Update FormatAutocomments hook for I36c3a9e5 [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[16:07:51] mariushoch/mediawiki-extensions-Wikibase/travisTest/db64873 : Marius Hoch The build passed. http://travis-ci.org/mariushoch/mediawiki-extensions-Wikibase/builds/46458523
[16:11:51] * aude back later
[16:12:00] (Merged) jenkins-bot: Update FormatAutocomments hook for I36c3a9e5 [extensions/Wikibase] - https://gerrit.wikimedia.org/r/159327 (owner: Anomie)
[16:12:41] aude: cu :)
[16:13:35] (PS6) Daniel Kinzler: Introducing changes subscription management. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183835 (https://phabricator.wikimedia.org/T86184)
[16:15:41] [ValueView] snaterlicious created inputautoexpandheight (+1 new commit): http://git.io/Tsa6Iw
[16:15:41] ValueView/inputautoexpandheight 8dabcb3 snaterlicious: `$.fn.inputautoexpand`: Fixed height expansion mechanism
[16:17:51] [ValueView] snaterlicious opened pull request #142: `$.fn.inputautoexpand`: Fixed height expansion mechanism (master...inputautoexpandheight) http://git.io/m5Zu7Q
[16:23:17] (PS1) Henning Snater: Using textarea for entering description [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183866
[16:23:24] (CR) jenkins-bot: [V: -1] Using textarea for entering description [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183866 (owner: Henning Snater)
[16:24:08] (Draft1) Henning Snater: Altered initialization order in entityview widgets [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183852
[16:25:08] Wikimedia-Site-requests, Wikidata: Enable "Other projects sidebar" by default on frwiki - https://phabricator.wikimedia.org/T85971#965910 (greg) TpT and hoo and aude: That patch enables it by default, yes, but will that (by some other magical logic in the Other projects sidebar code) remove it from the list o...
[16:25:09] Can I change a sttement to no value using Widar?
[16:25:14] *statement
[16:25:59] Wikimedia-Site-requests, Wikidata: Enable "Other projects sidebar" by default on frwiki - https://phabricator.wikimedia.org/T85971#965912 (hoo) >>! In T85971#965910, @greg wrote: > TpT and hoo and aude: That patch enables it by default, yes, but will that (by some other magical logic in the Other projects sid...
[16:26:10] Wikimedia-Site-requests, Wikidata: Enable "Other projects sidebar" by default on frwiki - https://phabricator.wikimedia.org/T85971#965918 (greg) (I removed the planned date for that on the deploy calendar, of this coming Monday, until that question/issue is resolved)
[16:26:38] Wikimedia-Site-requests, Wikidata: Enable "Other projects sidebar" by default on frwiki - https://phabricator.wikimedia.org/T85971#965924 (greg)
[16:27:21] (PS4) Henning Snater: Rearranged entity view header [extensions/Wikibase] - https://gerrit.wikimedia.org/r/183498
[16:35:30] Wikimedia-Site-requests, Wikidata: Enable "Other projects sidebar" by default on frwiki - https://phabricator.wikimedia.org/T85971#965946 (greg) >>! In T85971#965912, @hoo wrote: >>>! In T85971#965910, @greg wrote: >> TpT and hoo and aude: That patch enables it by default, yes, but will that (by some other ma...
[16:37:17] MediaWiki-API, Wikidata, MediaWiki-Page-editing: ApiStashEdit silently loses custom DataUpdates. - https://phabricator.wikimedia.org/T86305#965949 (Anomie) > Since DataUpdate objects are generally not serializable, ParserOutput skips them during serialization (see ParserOutput::__sleep). I don't know about "...
[16:37:43] If not, I'll just use Python
[17:17:10] aude: we should add deployment of arbitrary access to the deployment calendar for greg-g's peace of mind. any idea when you and/or hoo want to enable it?
[17:19:08] Do we have a list of performance deployment blockers?
[17:19:57] i don't think so
[17:20:04] :) :)
[17:20:34] GeorgeEdwardC: When can we expect your name on WD:RFA?
[17:20:48] * aude back
[17:21:07] i would like to add the table to wikidata (since it's a client) next week, probably tuesday
[17:21:17] and enable the tracking there for arbitrary access
[17:21:27] wikidata already has arbitrary access but no tracking yet
[17:21:50] then, depending on how things go, enable on commons in february
[17:21:50] aude: fine with me. greg-g ^
[17:22:20] then staged, like we did originally with "phase1" and "phase2" on the wikipedias etc
[17:22:24] yeah
[17:22:30] aude: kk, please add to [[wikitech:Deployments]] :)
[17:22:43] ok
[17:22:54] Before we hit any larger wiki we should at least resolve some of the deployment blockers
[17:23:04] hoo: sure
[17:23:19] I wish I had more time :(
[17:24:18] (btw, would like something at the developers summit about performance, profiling-xhprof and what more there is to help us, in terms of metrics)
[17:24:58] \o/
[17:26:09] sjoerddebruin: Well :P
[17:26:56] GeorgeEdwardC: If people make good deletion requests, they should be able to do them themselves.
[17:27:22] Ok, I'll go for it
[17:27:47] Jeez, wish this was so easy on nlwiki.
[17:29:09] hmmm, we might want to deploy the arbitrary access for wikidata on wednesday
[17:29:32] just so we can have our own slot (which are more available earlier in the day)
[17:29:38] aude: You mean UsageTracking?
[17:30:13] ah yes
[17:30:20] usage tracking of arbitrary access
[17:31:39] maybe 1pm for us (or when we come back from lunch) on wednesday
[17:31:50] daniel will be around if there are problems
[17:32:29] aude: That will conflict with the kick-off meeting
[17:32:35] AFAIS
[17:32:55] that is a bit later
[17:33:09] it's 13:30
[17:33:14] that gives you 30min
[17:33:14] oh
[17:33:18] ok :)
[17:34:14] * aude looking at wrong week
[17:35:29] 6pm then
[17:35:54] Also Wednesday?
[17:36:09] sure
[17:36:24] that way we have whatever new code deployed already (although might be irrelevant for this)
[17:36:24] that also works for me
[17:36:27] ok
[17:38:06] might also take an hour on monday to enable this on test.wikidata first
[17:38:21] Ok
[17:38:36] We also have the other projects sidebar to frwiki on Monday
[17:38:58] we do
[17:39:19] maybe earlier on monday, like 3pm for this
[17:39:21] should be easy
[17:40:08] and also test2 etc
[17:40:24] Yeah.
[17:40:33] I'm not going to be around, but that shouldn't be an issue
[17:40:36] both test and test2?
[17:40:44] or should we leave one without it
[17:40:49] mh
[17:40:50] just so we are testing the other way
[17:41:40] think i will do test + test.wikidata
[17:41:44] can leave test2 for now
[17:42:02] I doubt it will matter, but I tend towards keeping one wikipedia like it is
[17:42:09] yep
[17:42:21] scheduled
[17:43:41] Wikidata, wikidata-query-service: Define which data the query service would store - https://phabricator.wikimedia.org/T86278#966086 (Smalyshev) Only things explicitly mentioned in the wiki are covered now. So, qualifiers are covered, but references, sitelinks, badges and aliases are not. I'd like to hear mo...
[17:43:57] done with the calendar
[17:44:12] :)
[17:44:41] Also our google calendar?
* aude doesn't look at that
[17:56:11] hi all, i'd like to split https://www.wikidata.org/wiki/Q9022297 in two, one for categories covering all municipalities in estonia, one for categories covering only rural municipalities (vallad)
[17:56:26] how do i go about it, should i discuss it somewhere first?
[17:57:42] some wikipedia versions mix them, some have completely separate category trees for cities and rural municipalities
[17:57:51] hi
[17:58:05] Hi matej
[17:58:25] essin: I don't know how controversial this can be, so it is up to you
[17:59:49] Maybe discuss it in the project chat.
[18:00:06] sjoer: Writing it now ;)
[18:00:12] etwp has https://et.wikipedia.org/wiki/Kategooria:Eesti_omavalitsused for all municipalities (no wikidata link) and https://et.wikipedia.org/wiki/Kategooria:Eesti_vallad for the rural municipalities
[18:01:15] ok so start with the no-link item
[18:01:41] and then just move sitelinks where they should be
[18:01:49] wikidata-query-service, Wikidata: Define which data the query service would store - https://phabricator.wikimedia.org/T86278#966209 (Lydia_Pintscher) Didn't you have qualifiers in the proposal already? Anyway. They should be in ;-) Some examples: References: "Give me all statements referenced to Nature" or "...
[18:07:13] Lydia_WMDE: Would it also be possible to find unreferenced statements then ^
[18:07:50] sjoerddebruin: mention it there and we can see if stas can get that to work :)
[18:07:58] Will do.
[18:08:01] thx
[18:08:24] There is no easy way to do that now, if I'm right...
[18:08:34] idunno tbh
[18:12:40] Lydia_WMDE: so, the qualifiers are in the proposal
[18:12:44] sjoerddebruin: when you say the project chat, do you mean this channel, or elsewhere?
[18:12:59] essin: [[Wikidata:Project chat]]
[18:13:07] thx
[18:13:10] Oh, don't we have a linkie bot here?
[18:13:37] Lydia_WMDE: I'm not sure though what we'd do with the aliases. Are we going to do lookups by name? Because I was assuming that's already covered by elastic, etc.
[18:14:13] SMalyshev: hey :) you have the labels in there right? aliases are basically the same.
[18:14:20] SMalyshev: or am i misunderstanding?
[18:14:26] matej_suchanek: i don't expect it to be very controversial, but i haven't done anything on wikidata involving so many wp language versions before
[18:14:27] and yeah i misread your comment at first
[18:15:17] Lydia_WMDE: I have the labels because otherwise I'd have to remember whether Q32 is George Washington, USA or Luxembourg
[18:15:26] ok
[18:15:36] so I need something human-readable to see it. But I don't really need 10 of them :)
[18:16:06] well if you have them you can make use of them ;-) for queries like give me all people named george washington
[18:16:08] SMalyshev: elastic would tie in with titan, right, if we want to combine query of label/alias with something else?
[18:16:34] but yeah not sure if it is absolutely necessary
[18:16:34] * aude wants to find all the items with "x" label or alias that are instance of place
[18:16:41] but if you have one it seems to make sense to also have the other
[18:17:14] aude: Titan has ElasticSearch index, yes, but not all combinations are supported by index. We can query ES directly also if there's something we know ES supports but Titan doesn't
[18:17:23] SMalyshev: another thing i was just thinking of: do you already have novalue and somevalue covered as special value types?
SMalyshev: ok
[18:17:54] aude: note though, the more complex the query gets, the more chances are it'll take forever :)
[18:18:16] * aude nods
[18:18:27] Lydia_WMDE: yes, novalue and somevalue are covered as special values
[18:18:33] cool
[18:19:00] maybe jan and daniel can also chime in on labels/aliases later
[18:19:07] ok
[18:19:24] Lydia_WMDE: do you have by chance some examples of pages with badges, for testing?
[18:19:29] sure
[18:19:30] sec
[18:19:46] i would want to take another look at wdq and all the things it covers
[18:19:53] or terminator
[18:19:59] https://www.wikidata.org/wiki/Q84
[18:20:09] there in the sitelinks at the bottom
[18:20:34] grey and yellow
[18:20:36] Lydia_WMDE: aha, thanks. So badges come with sitelinks, right?
[18:20:40] right
[18:20:49] essentially, badge is a property of a sitelink
[18:20:51] they are an attribute of the article that is linked
[18:20:57] yeah
[18:21:04] they can have several
[18:21:07] cool.
[18:21:22] there is a limited number of badges
[18:21:27] it is a defined list
[18:21:38] but they can be used for any sitelink
[18:22:30] Lydia_WMDE: WDQ has only "has link"/"has no link" type of query - do we need anything else?
[18:22:49] i _think_ not
[18:22:56] (taking into account the addition of badges of course)
[18:23:17] if people want the specific link they'd go via a different api
[18:23:19] ok, then I'll make links a simple yes/no list for now
[18:23:47] Lydia_WMDE: do we want to actually have the link name or just the fact the link exists?
[18:24:00] i think the fact that the link exists is enough
[18:24:18] because if you want the item with a specific article name there is a special page for that
[18:24:48] or if you want the article name for a specific item on a given site then that is also via another api request
[18:25:41] badges aren't even covered yet, in terms of querying via existing api or special page or anything :(
[18:25:49] nor in a database table
[18:25:54] yeah
[18:25:57] it's on our todo and would be really nice
[18:26:10] Would be great to match categories with badges.
[18:26:20] yeah
[18:26:28] categories?
[18:27:08] Our featured articles are in a category.
[18:27:13] Our (nlwiki)
[18:27:14] probably out of scope for query but suppose catgraph (of clients) + wikidata?
[18:27:36] yeah
[18:27:47] maybe some tool could be made to cross-reference using wikidata query + something else
[18:28:25] matej_suchanek: the more i look at it, the more i think the current object is mostly considering all municipalities, so maybe it would be easier to create a new object for rural municipality categories, move the etwp sitelink there, and add the general etwp category as a sitelink on the existing object
[18:30:25] I don't see any category data in the source data
[18:31:19] but i haven't worked with statements before, could i just leave it blank or would that be considered as creating an incomplete object?
[18:31:49] SMalyshev: yeah i don't think that is in scope for query
[18:31:55] needs some more tools
[18:33:05] I see also WDQ doesn't do references now, right?
[18:33:28] * Lydia_WMDE doesn't know
[18:33:55] Is there any way to assign "no value" using AutoList?
[18:35:34] * Lydia_WMDE is away for a bit - laundry and food
[18:35:39] I'm planning to make a bot for replacing "102:327591" with "102:(no value)"
[18:38:23] GeorgeEdwardC: Autolist only works with values sadly.
[18:38:49] aude: can you edit the deploy page with dates, please?
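[Editor's note: on "badges come with sitelinks" above — although there is no way to query by badge yet, wbgetentities already exposes badges per sitelink, which is enough to fetch them for a known item. A sketch using Q84 (London), the example from the log:]

    import requests

    r = requests.get('https://www.wikidata.org/w/api.php', params={
        'action': 'wbgetentities',
        'ids': 'Q84',               # London, the example given above
        'props': 'sitelinks',
        'format': 'json',
    })
    sitelinks = r.json()['entities']['Q84']['sitelinks']

    # Each sitelink carries a (possibly empty) list of badge item IDs;
    # badges are themselves items from the defined list Lydia mentions.
    badged = {site: link['badges']
              for site, link in sitelinks.items() if link['badges']}
    print(badged)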
[18:39:03] it's "not yet scheduled" as it has dates :) [18:39:10] not "not yet scheduled", that is [18:39:20] it's scheduled :) [18:39:37] Oh, I'll probably need to get some Python then [18:41:28] greg-g: "probably february" = don't have a date yet, imho [18:41:47] * aude don't want to overpromise [18:56:11] GeorgeEdwardC: I know that WDQ represents novalue as '4294967295' but I have never tried adding such values [18:58:18] But I need a way to add novalue with Widar, it doesn't seem to be possible [19:05:35] matej_suchanek / GeorgeEdwardC : -1 [19:05:48] So you could try adding Q-1 [19:08:03] Right [19:09:00] So P102:Q-1? [19:09:11] Or P102:327591 [19:09:20] *Q-327591 [19:11:30] Once I've got this figured out I'll request the flag and hopefully it can be done in one run [19:31:27] ok, now i posted at https://www.wikidata.org/wiki/Wikidata:Project_chat#Municipalities_of_Estonia.2C_splitting_a_category_object , i'll check back later [19:34:25] GeorgeEdwardC: your AutoList task will be "-P102:Q327591 (break) P102:Q-1", thank you multichill again [19:34:59] Ok I'll run one test [19:35:08] Then request the flag for the other nearly 700 [19:36:56] Running... [19:37:36] I see that you want to be one of sysops... [19:37:58] matej_suchanek: https://www.wikidata.org/w/index.php?title=Q10296812&diff=prev&oldid=186911081 [19:38:00] Didn't work [19:38:13] I see... [19:39:00] But yes :-) [19:39:47] great imo [19:40:48] just realised we should have tested on the sandbox... [19:41:18] Yeah, I guess [19:41:26] Didn't do any harm though [19:41:51] I mean if we can't do it with Widar, I'd just learn some PyWikiBot [19:45:53] on my home wiki, I used PyWikiBot only for one task but I found it too complex for me (two yrs ago)... no it may worth learning using it [19:46:13] *now it may be [19:51:35] matej_suchanek: You're welcome. I'm happy to see you found it! [19:51:39] https://commons.wikimedia.org/wiki/Special:Contributions/BotMultichill <- munge munge munge [19:56:48] 3Wikidata, MediaWiki-Page-editing, MediaWiki-API: ApiStashEdit silently loses custom DataUpdates. - https://phabricator.wikimedia.org/T86305#966553 (10daniel) @anomie: adding the guarantee that DataUpdates must be serializable would make the mechanism a lot more complex (right now, it's an easy way to provide ca... [20:02:11] I need to go, bye! [20:45:51] * jeblad yawns [20:45:56] why [21:02:44] Oh, wait, we had office hour today [21:03:01] multichill: Nope, that's next week [21:03:50] Wikidata on IRC next Friday at 18:00 UTC <- always confusing [21:04:03] Yeah, I was confused as well :P [21:04:13] For me that reads as "the next day that equals Friday equals today" :P [21:04:16] But Lydia posted the link to that time converter thing and it had the date in the URL [21:15:49] hoo|away: If people do stuff like https://www.wikidata.org/w/index.php?title=User:Inwind&curid=18888399&diff=186923059&oldid=179779975 you know the item suggester isn't working properly [21:16:36] It's horrible. [21:17:06] I don't get why male and female are not showing up as first hits [21:17:14] It's sorted on number of sitelinks. [21:17:21] Should be number of incoming internal links. 
Even better to take the left-hand side into account
[21:25:40] mh :/
[21:26:38] wtf
[21:27:20] Either my home network starts getting unstable or my network port on this notebook is dying :S
[22:02:05] You will get close to random guesses if you only use incoming links to the items in the same property
[22:02:33] You need dependency to get the probabilities right
[22:05:08] There is at least one solution by using neural nets but the complexity is prohibitively heavy
[22:06:03] JeroenDeDauw could probably do it in 2.71 nights or 3.14 days..
[22:06:27] But back to do something useful...
[22:07:49] Hi aude, Lydia_WMDE, Tobi_WMDE_SW_NA, DanielK_WMDE_ and whoever else
[22:08:51] hey jeblad
[22:10:17] So there is an upcoming lightweight "theorem checker"?
[22:10:59] Seems like it is only doing named entity checks, but it is good enough to catch a lot of cases
[22:12:27] I would prefer to have somewhat better description than this https://www.wikidata.org/wiki/Wikidata:Project_chat#Data_model_-_consistency_checks_-_detecting_false_information
[22:30:24] "theorem checker" is certainly a much more powerful tool than what Lydia is talking about there
[22:30:43] the student's tool is just checking consistency
[23:09:32] dennyvrandecic_: not even really consistency. just some very basic constraints, like range and domain of properties.
[23:13:51] (PS1) Daniel Kinzler: Use ArticleEditUpdates hook for usage tracking [extensions/Wikibase] - https://gerrit.wikimedia.org/r/184002 (https://phabricator.wikimedia.org/T86308)
[23:16:28] (PS11) Daniel Kinzler: Batched term lookups for RC lists [extensions/Wikibase] - https://gerrit.wikimedia.org/r/180140 (https://phabricator.wikimedia.org/T74310)
[23:19:16] DuesenFaq (DanielK_WMDE?): Oh, I thought they were checking consistency with external databases like MusicBrainz
[23:22:53] dennyvrandecic_: right, they also do that. i mean they are not checking internal consistency, looking for contradictions.
[23:25:59] DanielK_WMDE: yes, which is why I thought a theorem checker would be quite a big word for what they are doing
[23:27:44] right :)
[23:28:05] i was just interpreting "consistency" in a different way
[23:28:15] yeah, it has many meanings
[23:28:47] i think you had the meaning from ontology in mind, checking an ontology for internal contradictions
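[Editor's note: to make "basic constraints, like range and domain of properties" concrete — a toy domain check in Python. The choice of P69 ("educated at") with Q5 ("human") as the expected subject class, and Q937 as the test item, is illustrative and not taken from the student project discussed above:]

    import requests

    API = 'https://www.wikidata.org/w/api.php'

    def instance_of(qid):
        """Return the numeric ids of all P31 (instance of) values of qid."""
        r = requests.get(API, params={
            'action': 'wbgetclaims', 'entity': qid,
            'property': 'P31', 'format': 'json',
        })
        return {c['mainsnak']['datavalue']['value']['numeric-id']
                for c in r.json().get('claims', {}).get('P31', [])
                if c['mainsnak']['snaktype'] == 'value'}

    # Domain check: the subject of an "educated at" (P69) statement
    # should be an instance of human (Q5).
    subject = 'Q937'                       # Albert Einstein
    if 5 not in instance_of(subject):
        print('%s violates the domain constraint of P69' % subject)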