[00:06:48] hoo: i guess looks ok
[00:07:09] would be nice if we could retry at the entity level
[00:07:34] Yeah
[00:07:36] but not sure how
[00:07:45] me neither
[04:23:01] hi, good evening, does any of you speak Spanish...??
[04:23:25] hehe, I typed that wrong...
[04:23:51] I meant to ask whether any of you speaks Spanish?
[13:46:25] moin :)
[13:54:40] hey aude!
[13:55:50] jzerebecki: which build of wikidata does jenkins test against for the ArticlePlaceholder repo (and the MediaInfo repo)? I'm mostly wondering whether this change is visible to jenkins on other repos: https://gerrit.wikimedia.org/r/#/c/289393/
[13:56:48] DanielK_WMDE: it doesn't use the build; it uses plain Wikibase, or whatever dependencies are specified
[13:57:45] jzerebecki: oh, it uses master? ok then
[13:57:54] yup
[13:58:34] frimelle: wfLogWarning
[14:57:35] jzerebecki: where are the phpcs rules we use for wikibase? do you know how to test them locally? running `vendor/bin/phpcs -sp --standard=phpcs.xml --extensions=php` seems to do nothing.
[14:58:04] DanielK_WMDE: i think they are in wikibase
[14:58:37] we have our own phpcs.xml file
[14:58:50] aude: i also tried that. also seems to do nothing
[14:58:54] oh really
[14:59:06] surely phpstorm can run phpcs automatically
[14:59:36] probably somehow
[14:59:40] but that's not the point.
[14:59:49] i'm trying to find out why (and since when) certain rules fail.
[15:00:04] DanielK_WMDE: composer cs
[15:00:20] composer.json has aliases
[15:00:24] aude: this report is bogus: https://integration.wikimedia.org/ci/job/composer-php55-trusty/5450/console
[15:00:51] the line it is complaining about doesn't have a call-by-reference param. it has a param that is an array that contains a reference.
[15:01:09] the new hook, to be precise: https://gerrit.wikimedia.org/r/#/c/293506/3/client/includes/WikibaseClient.php
[15:01:39] jzerebecki: also seems to do nothing
[15:03:06] what if you run it manually
[15:03:08] .
[15:03:12] ./vendor/bin/phpcs -sp --standard=phpcs.xml --extensions=php --ignore=extensions/ValueView,vendor,.git .
[15:03:15] ?
[15:03:47] DanielK_WMDE: at least use short array syntax?
[15:04:00] maybe cs is getting confused
[15:04:43] aude: it failed for short array syntax. since the same line works with old array syntax in the same file, i amended to use the old syntax
[15:04:58] hmmm
[15:05:00] aude: this is the diff for which it failed: https://gerrit.wikimedia.org/r/#/c/293506/1/client/includes/WikibaseClient.php
[15:05:09] still waiting for the result for the new syntax
[15:05:16] but anyway, this is clearly broken. where does this come from?
[15:10:40] it's complaining now about whitespace
[15:10:41] https://integration.wikimedia.org/ci/job/composer-php55-trusty/5458/consoleFull
[15:10:52] also get that when i run composer cs locally
[15:11:14] jzerebecki: Wait, you're backporting?
[15:11:57] aude: ?
[15:13:43] hoo: i'm not deploying wikidata stuff
[15:13:55] But Jan is?
[15:16:10] yes
[15:16:19] shouldn't I?
[15:16:24] aude: running locally doesn't work at all for me for some reason. with the old array syntax, phpcs no longer complains about pass-by-references. fixed the missing whitespace.
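For reference, a minimal sketch of the pattern the sniff tripped over; the hook name and variable are assumed, the real code is in the gerrit change linked above. The key point from the discussion: the argument is an array that contains a reference, not a pass-by-reference parameter, so the warning is a false positive.

```php
// Minimal sketch (assumed names): Hooks::run() receives an array that
// *contains* a reference -- there is no call-by-reference parameter here,
// so phpcs's pass-by-reference sniff reports a false positive.
$entityTypeDefinitions = [ /* entity type => callbacks */ ];

// Short array syntax: the sniff (wrongly) complained about this line.
Hooks::run( 'WikibaseClientEntityTypes', [ &$entityTypeDefinitions ] );

// Long array syntax: the equivalent call passed the same check.
Hooks::run( 'WikibaseClientEntityTypes', array( &$entityTypeDefinitions ) );
```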
[15:16:45] DanielK_WMDE: just composer install inside wikibase
[15:16:47] and composer cs
[15:16:56] I didn't want these to linger any longer, because I'm still nervous that there is actually some negative effect
[15:17:03] jzerebecki: You can
[15:17:13] but you also need to pick up the other change (as noted on the ticket)
[15:17:22] I'm also going to deploy later today
[15:17:27] yes thx
[15:17:30] but you can already pick that up now, if you want
[15:17:41] hoo: which one?
[15:17:55] https://wikitech.wikimedia.org/wiki/Deployments#Thursday.2C.C2.A0June.C2.A009
[15:18:01] ArticlePlaceholder to two more wikis
[15:18:17] Will also require an AP bump and (maybe) Wikibase backports
[15:18:26] so I would have just piggybacked this
[15:20:29] or put it in SWAT
[15:20:33] but whatever
[15:20:39] hoo: can you cherry-pick the backports?
[15:22:03] aude: Doubt we need a separate deploy for that
[15:22:33] jzerebecki: Sure… just the master connection stuff, right?
[15:25:09] hoo: i meant the stuff needed for AP
[15:25:18] hoo: I do not know what that is
[15:25:39] aude: could you have a look at a patch? https://gerrit.wikimedia.org/r/#/c/293124/
[15:25:45] I'll have to look at the diff myself, but maybe nothing, yet
[15:26:08] aude: the idea is to have a mechanism to register the namespaces for the MediaInfo type automatically, without requiring anything in LocalSettings.
[15:26:20] can take a look today
[15:26:27] sounds good
[15:53:52] hoo: *sigh* selenium tests consistently fail. no use. perhaps I can try again tomorrow.
[15:56:53] :/
[15:57:00] Will have to override them, then
[16:00:04] that is indeed an option, as it is only selenium with one scenario.
[16:09:04] ugh :(
[16:14:15] hoo: done submitting the two. what about yours?
[16:14:26] Still working on that with Lucie
[16:14:34] And tried to talk with Lydia
[16:14:39] but she seems unresponsive
[16:15:30] hoo: we have three wikis today. :)
[16:15:46] Three?
[16:16:12] lv, gu, nn
[16:16:22] You're assigned to the tickets
[16:16:44] hoo: i think lydia is busy this week (not working, other stuff)
[16:16:50] but maybe around later
[16:17:13] Oh, I see
[16:17:42] Phabricator's HTML emails are annoying
[16:18:13] Oh… that was easy to change: https://phabricator.wikimedia.org/settings/panel/emailformat/
[16:20:03] There's no ticket for nnwiki
[16:20:06] or maybe I'm just blind
[16:21:31] https://phabricator.wikimedia.org/T130997 Don't worry hoo, that's what you've got me for :3
[16:32:32] hoo: will you do the rest? I think I need sleep.
[16:32:54] Yeah
[16:33:24] thx
[16:36:28] * aude sitting outside in the most perfect weather conditions :)
[16:50:31] I'll be back for the deploy
[16:51:10] ok
[17:20:34] hmmm.... the entityNamespaces setting might be complicated for commons
[17:20:54] * aude files a task
[17:26:46] why can't i add Wikidata as a tag on Phabricator?
[17:27:11] https://phabricator.wikimedia.org/T137444
[17:27:31] andre__: ^
[17:29:15] aude: The old "too many matches" bug
[17:29:22] https://phabricator.wikimedia.org/typeahead/class/PhabricatorProjectDatasource/?q=wikidatad&raw=wikidata&__ajax__=true&__metablock__=7
[17:29:34] There you can find the ID of the Wikidata project (PHID-PROJ-7ocjej2gottz7cikkdc6)
[17:29:39] and then just a little DOM magic
[17:29:42] hmmmm
[17:30:03] Not exactly nice, though
[17:30:15] this hasn't been a problem for me until just now (and a long time ago)
[17:30:28] Yeah, we had the same problem a long time ago
[17:30:43] guess the phab. update yesterday(?) introduced that regression again
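The registration mechanism floated at 15:26:08 isn't quoted in the channel, but the general idea, an extension declaring its own entity namespaces so nothing has to go into LocalSettings.php, could look roughly like this. A hypothetical sketch only, not the actual patch in gerrit change 293124: the namespace IDs, constants, and settings-key wiring are all assumptions.

```php
// Hypothetical sketch only -- not the actual patch in gerrit 293124.
// The extension declares its namespaces via the CanonicalNamespaces hook
// and wires them into Wikibase's entityNamespaces setting itself, so a
// wiki admin never has to touch LocalSettings.php for this.
// NS_MEDIAINFO and the IDs 6600/6601 are assumed values for illustration.
define( 'NS_MEDIAINFO', 6600 );
define( 'NS_MEDIAINFO_TALK', 6601 );

$wgHooks['CanonicalNamespaces'][] = function ( array &$namespaces ) {
	$namespaces[NS_MEDIAINFO] = 'MediaInfo';
	$namespaces[NS_MEDIAINFO_TALK] = 'MediaInfo_talk';
};

// Map the entity type to its namespace in the repo settings.
$wgWBRepoSettings['entityNamespaces']['mediainfo'] = NS_MEDIAINFO;
```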
[17:31:00] :(
[17:45:26] aude: waiting for upstream to fix https://phabricator.wikimedia.org/T76732 entirely
[17:46:37] andre__: Well, it wasn't a (visible) issue until the last update
[17:46:39] bump priority?
[17:57:40] andre__: it's been okay for a while
[18:42:16] I totally underestimated the waiting time for jenkins
[19:20:44] aude: Around?
[19:21:41] yeah
[19:22:06] Can you quickly have a look at https://gerrit.wikimedia.org/r/293551?
[19:22:10] The change is trivial
[19:22:16] just saw https://phabricator.wikimedia.org/T137404
[19:22:32] EntityNamespaceLookup::getEntityNamespace was changed from wmf3 to master
[19:22:40] so we need to adapt to that on the AP branch
[19:22:43] if we want to backport
[19:22:45] trivial fix
[19:23:06] And yes, that blocks the train… will have a look after I'm done with my AP deploy
[19:23:25] Or you can have a look now
[19:23:31] but please, review my AP change
[19:23:42] can i +2?
[19:23:50] {{done}}
[19:23:50] How cool, aude!
[19:23:50] Yes, go ahead
[19:23:59] * hoo waits for jenkins again
[19:56:50] hi there, is anyone working with introspection via SPARQL?
[20:06:55] introspection via SPARQL? I wonder what you mean
[20:08:37] aude: Are you looking into the interwiki UI mess?
[20:08:41] I'm having other troubles
[20:08:49] and doubt I want to work much longer today
[20:09:55] rom1504: yeah, I mean querying wikidata via query.wikidata.org, trying to get the most-used properties for a given class and things like this
[20:10:17] of course, while trying not to hit those obnoxious query timeouts (aaaargh!)
[20:11:41] ah
[20:11:42] I tried to get my own server running, but it's a lot worse: once I get the data imported (a lot of hours), queries tend to fail earlier than on the query.wikidata.org server :-(
[20:12:14] sounds like something that would be easier to achieve by processing the dump yourself
[20:12:24] any hints and thoughts are really appreciated
[20:13:09] hoo: i wonder how widespread the problem is?
[20:13:13] mmmh, working directly with the dumps and forgetting SPARQL is my last resort
[20:13:21] aude: The UI one?
[20:13:27] https://ca.wikipedia.org/wiki/Uapit%C3%AD has Wapati in the langlinks table and Elk as a sitelink on wikidata
[20:13:45] not sure i understand content translation but am looking
[20:13:47] I don't have a lot of experience with wikidata, but processing most of DBpedia is easier by working on the dump without a triplestore, because triplestores are usually optimized/configured to work better with smaller queries
[20:14:39] rom1504: oh, lovely DBpedia, with all that duplicity :-)
[20:14:42] aude: I ran into huge issues when trying to deploy the fix for the master connection bug
[20:14:42] repking: if your query is basically processing every entity (or every entity in a class like Q5) then the query service may not work for you because of the timeouts
[20:14:45] it is probably possible to increase the timeout in the config though
[20:14:53] And I didn't even go for Wikipedias yet
[20:15:34] rom1504: yes it is, if you run your own server
[20:15:55] (well, it's possible to increase it on query.wikidata.org too, but we won't do it for now :)
[20:16:07] hoo: :(
[20:16:29] (that works for most things for DBpedia, but for some things you end up using tons of RAM (if you are asking for a good portion of the KB in a query, for example))
[20:17:15] aude: Sadly I don't think we can do anything about it… it's just MariaDB going mental
[20:17:50] repking: you mean the mapping/raw properties thing?
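The query being discussed, "most-used properties for a given class", can be sketched like this; a minimal sketch, not repking's actual setup. The class (Q146, house cat, chosen to keep the result set small) and the User-Agent string are assumptions. As noted above, running this over a huge class like Q5 (human) is exactly what hits the WDQS timeout, which is where a dump processor such as Wikidata Toolkit becomes the better tool.

```php
<?php
// Minimal sketch: ask the public WDQS endpoint for the most-used direct
// properties on instances of a class, and print them with usage counts.
$sparql = <<<'QUERY'
SELECT ?property (COUNT(*) AS ?uses) WHERE {
  ?item wdt:P31 wd:Q146 .
  ?item ?property ?value .
  # restrict ?property to direct-claim (wdt:) predicates
  ?p wikibase:directClaim ?property .
}
GROUP BY ?property
ORDER BY DESC(?uses)
LIMIT 25
QUERY;

$url = 'https://query.wikidata.org/sparql?format=json&query=' . rawurlencode( $sparql );
// Identify ourselves; the UA string is an arbitrary placeholder.
$context = stream_context_create( [
	'http' => [ 'header' => "User-Agent: property-stats-sketch/0.1\r\n" ],
] );
$data = json_decode( file_get_contents( $url, false, $context ), true );

foreach ( $data['results']['bindings'] as $row ) {
	echo $row['property']['value'], "\t", $row['uses']['value'], "\n";
}
```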
[20:17:51] SMalyshev: I tend to narrow the graph by typical class-based properties, like nationality and things like that for Q5, but it's a drag, since what I'm trying to get is exactly those very popular properties
[20:19:00] repking: for statistics and such, if the criteria are simple, it may indeed be simpler to use a dump processor, such as Wikidata Toolkit maybe
[20:19:02] I think I should go for what rom1504 suggests and work with the raw data, but I don't like that
[20:19:18] the query service is optimized for complex conditions that return a relatively small number of results.
[20:20:08] mmmh, I'll give Wikidata Toolkit a try, thx
[20:20:14] otherwise, try to see how many records you are processing... if it's millions, then it will be slow. If it's not that many, maybe you need a better query :)
[20:20:58] my goal, in some sort of future, is a reasoning machine of some sort
[20:22:24] I mean, a program that will treat items by their class and is able to offer information
[20:23:00] meaningful information, like comparing items with others of the same class or context
[20:23:51] that's the "top view" purpose of the research :-)
[21:01:34] aude: All ArticlePlaceholder deploys done for today
[21:01:49] Going to take a break now… ping me if you need CR or so
[21:07:48] ok
[21:44:26] hmmm, there's a bug in ArticlePlaceholder :(
[21:45:33] hm? :(
[21:48:10] try searching for a non-existing title
[21:48:31] https://lv.wikipedia.org/w/index.php?search=kittens%21%21%21%211&title=Special:Search
[21:48:43] it's probably a trivial fix
[21:48:43] ouch
[21:48:58] * aude accidentally pastes a git hash into wikipedia search
[21:49:10] but it's only where ArticlePlaceholder is enabled
[21:49:20] 2016-06-09 21:48:37 [V1nkNQpAIDcAAGpAzbEAAAAI] mw1185 lvwiki 1.28.0-wmf.4 exception ERROR: [V1nkNQpAIDcAAGpAzbEAAAAI] /w/index.php?search=kittens%21%21%21%211&title=Special:Search MWException from line 1548 of /srv/mediawiki/php-1.28.0-wmf.4/includes/db/Database.php: DatabaseBase::makeList: empty input for field page_title {"exception_id":"V1nkNQpAIDcAAGpAzbEAAAAI"}
[21:49:36] yeah
[21:53:27] i wonder if the interwiki issue is related to https://gerrit.wikimedia.org/r/#/c/293099/
[21:53:37] and the second link is supposed to be Simple English?
[21:53:55] * aude adds Simple English wiki links to my repo
[21:54:32] pretty sure
[21:56:04] that's it
[21:59:41] hoo https://gerrit.wikimedia.org/r/#/c/293627/
[21:59:42] :/
[22:00:41] checking that reverting is enough
[22:01:12] it is, but we are stuck with caching for about an hour
[22:49:52] hoo: if you are still around, i have a patch for ArticlePlaceholder in a minute
[22:51:04] I am :)
[22:55:48] https://gerrit.wikimedia.org/r/#/c/293639/
[22:59:12] looking now
[23:00:48] tried not to touch too much, yet make the code clearer about what this fixes
[23:01:04] and is doing
[23:03:56] aude: Looks good… did you test it?
[23:04:07] yes, with different types of searches
[23:04:26] Ok
[23:04:29] if we hurry, we can get it into SWAT
[23:04:32] the control flow is still a little weird
[23:04:36] yeah
[23:04:38] but ok
[23:04:56] i didn't want to touch too much
[23:05:03] renderTermSearchResults should not do the notability check… it does too many
[23:05:05] but ok for now
[23:05:08] agree
[23:11:28] * aude cries........ mosquitoes in the office :P
[23:11:45] meh
[23:11:51] office = outside
[23:11:56] hehe
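The stack trace above names the failure mode: MediaWiki's DatabaseBase::makeList() throws when handed an empty array for a field condition, so a search that yields no candidate titles blows up before the database is even queried. A minimal sketch of the kind of guard the fix needs, with hypothetical names, not the actual patch in gerrit change 293639:

```php
// Hypothetical sketch of the guard: if the term search produced no titles,
// skip the page-existence lookup entirely instead of handing makeList()
// an empty array for the page_title condition (which throws MWException).
private function filterExistingPages( IDatabase $dbr, array $titles ) {
	if ( $titles === [] ) {
		return []; // nothing to look up; makeList() would throw on this
	}

	return $dbr->selectFieldValues(
		'page',
		'page_title',
		[ 'page_namespace' => NS_MAIN, 'page_title' => $titles ],
		__METHOD__
	);
}
```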