[05:22:53] PROBLEM - puppet last run on wdqs1001 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[05:50:53] RECOVERY - puppet last run on wdqs1001 is OK: OK: Puppet is currently enabled, last run 21 seconds ago with 0 failures
[10:43:02] aude: addshore: Aleksey_WMDE: good news: hamcrest autoload-dev is now properly injected for the Wikidata build :}}}
[10:43:06] I have upgraded composer
[10:43:13] latest build is https://gerrit.wikimedia.org/r/#/c/340698/
[10:43:33] fails at https://integration.wikimedia.org/ci/job/mwext-testextension-hhvm/38654/console
[10:45:32] (reference is https://phabricator.wikimedia.org/T158674 )
[11:29:54] Food! https://www.lieferando.de/lieferservice-muglia-berlin#!
[11:32:05] DanielK_WMDE! Jonas_WMDE! leszek_wmde?
[11:32:27] Thiemo_WMDE Sabzi Tikka
[11:32:31] rice
[12:11:50] interesting new blog post on OSM + Wikidata https://www.openstreetmap.org/user/sabas88/diary/40577
[13:29:01] hashar: thanks :-)
[14:05:21] SMalyshev: seems that wdqs is still unstable (or at least not as stable as we could wish). "Query timeout limit reached" reported by Auregann_WMDE
[14:06:00] SMalyshev: load on the servers (CPU / IO / memory) seems well under control
[14:07:17] one of the problematic queries reported is "Airports within 100km of Berlin", which looks simple, but the geosearch might be more expensive than it looks (I have no idea)
[14:10:02] aude: https://gerrit.wikimedia.org/r/#/c/340745/1
[14:10:09] leszek_wmde: --^
[14:11:12] I don't see any recent OOME, so the increase of heap size seems to help
[14:11:15] gehel: SMalyshev may not be awake yet, it's still early in the US.
[14:11:41] DanielK_WMDE: yep, I know... but I expect he will read the backlog when waking up...
[14:11:50] * gehel hopes so...
[14:12:07] ok then!
[14:12:15] i think he will
[14:12:30] I'm officially not working today... and I can't think of anything I can do right now...
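[Editor's note: for context on why an "airports within 100km of Berlin" query can be costly, a query of this shape on query.wikidata.org would typically use the `wikibase:around` geo service. This is a hedged sketch, not the actual query Auregann_WMDE ran; the item IDs are assumptions (wd:Q64 = Berlin, wd:Q1248784 = airport).]

```sparql
# Sketch only: airports within 100 km of Berlin on WDQS.
# Assumed IDs: wd:Q64 = Berlin, wd:Q1248784 = airport.
# The geo service must consider every item with a coordinate (P625)
# inside the radius before the type filter applies, which is one way
# a "simple-looking" query can be more expensive than it looks.
SELECT ?airport ?airportLabel ?dist WHERE {
  wd:Q64 wdt:P625 ?berlin .                     # Berlin's coordinates
  SERVICE wikibase:around {
    ?airport wdt:P625 ?coord .
    bd:serviceParam wikibase:center ?berlin ;
                    wikibase:radius "100" ;     # kilometres
                    wikibase:distance ?dist .
  }
  ?airport wdt:P31/wdt:P279* wd:Q1248784 .      # instance of (a subclass of) airport
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
ORDER BY ?dist
```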
[14:14:08] timeouts are expected on expensive queries, not sure why those are expensive...
[14:14:51] thanks gehel :) it's surprising because I often use this one and had no problem so far
[14:14:55] Auregann_WMDE: if you have a list of queries that systematically time out, we could have a look at why they are expensive... and maybe correlate those queries with suspicious activity on the wdqs servers...
[14:15:11] if you could open a phab task, I'm sure we can have a look!
[17:53:39] I want to add the enwiki sitelink to this query, "Turkish women who have no page on tr wiki but do have one on enwiki": http://tinyurl.com/jqf25d7
[17:53:54] can somebody help me?
[18:00:02] mavrikant: how about this? http://tinyurl.com/hba953w
[18:00:38] hm, I guess you need to filter out the commons links
[18:01:39] yes. can you give the full query, not just the result?
[18:02:38] found the link in the bottom left corner
[18:03:07] this is better: http://tinyurl.com/gwm37ta
[18:04:42] pintoch: awesome, thanks a lot.
[18:04:50] yw :-)
[18:26:27] I am adding a limit but it still gives a "Query timeout limit reached" error: http://tinyurl.com/j84hgg7
[18:26:46] I just need 100 pages
[18:28:21] mavrikant: shouldn't your "?s" be "?item" instead?
[18:28:51] (with that version, it works for me)
[18:29:23] Hi -- can anyone help me debug why my MediaWiki+Wikibase installation gives an HTTP 500 error when I try to add an item or property? Here is my $wgDebugLogFile: http://sprunge.us/ALgc Any ideas would be appreciated!
[18:30:24] pintoch: you are right. I am sorry, this was my fault.
[18:36:59] gehel: query timeout is normal :)
[19:10:35] SMalyshev: that was my intuition, but I don't have much to back it up...
[21:15:33] So for my problem, I did try running composer to make sure I had all dependencies, but it didn't fix the problem. MatmaRex on #mediawiki referred me here.
I think the problem is this: "[fatal] [21618be2] PHP Fatal Error: Class 'Wikimedia\Rdbms\SessionConsistentConnectionManager' not found". What can I do to debug / fix this?
[22:16:08] https://soundcloud.com/thieverycorporation/sets/the-temple-of-i-i-1
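[Editor's note: the "?s" vs "?item" mix-up discussed at 18:28 is a classic cause of "Query timeout limit reached" even with a LIMIT — an unbound variable turns the sitelink pattern into a cross product over every article, and LIMIT only truncates results after the pattern has been evaluated. A hedged sketch of the corrected query shape follows; it is not mavrikant's actual query, and the item IDs are assumptions (wd:Q5 = human, wd:Q6581072 = female, wd:Q43 = Turkey).]

```sparql
# Sketch only: Turkish women with an enwiki article but no trwiki article.
# Assumed IDs: wd:Q5 = human, wd:Q6581072 = female, wd:Q43 = Turkey.
SELECT ?item ?enArticle WHERE {
  ?item wdt:P31 wd:Q5 ;          # instance of: human
        wdt:P21 wd:Q6581072 ;    # sex or gender: female
        wdt:P27 wd:Q43 .         # country of citizenship: Turkey
  # The sitelink must join on ?item -- a stray ?s here is unbound
  # and would match every enwiki sitelink in the store.
  ?enArticle schema:about ?item ;
             schema:isPartOf <https://en.wikipedia.org/> .
  FILTER NOT EXISTS {
    ?trArticle schema:about ?item ;
               schema:isPartOf <https://tr.wikipedia.org/> .
  }
}
LIMIT 100
```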