[03:55:19] WDQ is down again... last time it was connection issues with the SQL server... now it is not responsive
[03:58:30] given the error I assume it is the SQL server
[03:58:43] Getting WDQ data... Warning: fopen(http://wdq.wmflabs.org/api?q=Claim%5B31%3A5%5D+and+noclaim%5B69%3A1641788%5D): failed to open stream: HTTP request failed! HTTP/1.1 502 Bad Gateway in /data/project/catscan2/public_html/omniscan.inc on line 132
[03:58:45] Call Stack: 0.0025 894768 1. {main}() /data/project/autolist/public_html/index.php:0; 0.1665 1628312 2. Pagelist->loadWDQ() /data/project/autolist/public_html/index.php:190; 0.1705 1629440 3. fopen() /data/project/catscan2/public_html/omniscan.inc:132; 0 items loaded.
[03:58:46] Combining datasets...
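A hedged sketch of what the failing loadWDQ() call above boils down to, with the 502 checked for instead of surfacing as a raw PHP warning. This is assumed code, not the actual omniscan.inc, and the 'items' key is an assumption about the shape of WDQ's JSON response.

    <?php
    // Hedged sketch, not the actual omniscan.inc code: fetch a WDQ result and fail
    // cleanly when the backend answers 502 Bad Gateway.
    // Assumption: WDQ returns JSON with an 'items' array of item numbers.
    $query = 'CLAIM[31:5] AND NOCLAIM[69:1641788]';
    $url   = 'http://wdq.wmflabs.org/api?q=' . urlencode($query);

    $json = @file_get_contents($url);   // suppress the warning; check the return value instead
    if ($json === false) {
        error_log("WDQ request failed (backend down or 502): $url");
        exit(1);
    }

    $data = json_decode($json, true);
    if (!is_array($data) || !isset($data['items'])) {
        error_log('WDQ returned an unexpected payload');
        exit(1);
    }
    printf("%d items loaded.\n", count($data['items']));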
[05:05:03] !log deployment-prep upgrade hhvm-tidy to 0.1-2
[05:05:09] Logged the message, Master
[06:40:43] PROBLEM - Puppet failure on tools-exec-02 is CRITICAL: CRITICAL: 44.44% of data above the critical threshold [0.0]
[07:05:44] RECOVERY - Puppet failure on tools-exec-02 is OK: OK: Less than 1.00% above the threshold [0.0]
[07:53:59] GerardM-: hey! is wdq still down?
[07:54:30] now it is not
[07:54:39] I did send a mail to Magnus as well
[07:54:53] what I do know is that it is not stable
[07:55:27] wait
[07:55:36] I was looking at the wrong window
[07:55:41] checking again
[07:56:11] YuviPanda: judging by the time it takes, it is likely down
[07:56:17] heh
[07:56:20] yeah, let me go restart it
[07:56:26] it's probably run out of memory again
[07:56:56] I am ever SO pissed off that this needs to run out of memory
[07:57:27] nothing to do with you, but this is stable software that uses more memory when there is more use
[07:57:53] !screen
[07:57:53] script /dev/null
[07:58:54] GerardM-: actually, it's not out of memory
[07:59:04] GerardM-: I restarted it anyway
[07:59:05] but it broke
[07:59:49] thanks for that bit of news
[08:00:25] GerardM-: are there docs somewhere for this? Magnus emailed me how to restart it if it goes down, but it would be nice to put this on wikitech so other labs ops folks can do so as well
[08:01:29] I wish there were
[08:01:35] I am not aware of any
[08:02:12] heh
[08:02:16] I think I should puppetize it
[08:02:23] and then find ways to scale it to a couple of machines
[08:02:30] so that it doesn't run out of memory as much
[08:02:45] GerardM-: I guess enough people depend on this for this to be ok...
[08:03:34] GerardM-: it's back after a restart btw
[08:04:42] checking if it works for me
[08:10:08] GerardM-: https://wikitech.wikimedia.org/wiki/User:Yuvipanda/Restarting_magnus_wdq instructions on restarting wdq and who can do it. I just wrote that down
[08:11:10] it is back now (functionality)
[08:11:35] GerardM-: cool!
[08:24:51] YuviPanda: I cc-d you on a mail
[08:24:57] what else can I do?
[08:37:56] GerardM-: nothing much atm, I guess :(
[08:38:07] ok
[08:38:16] GerardM-: I am thinking we should 1. set up a magnuswikidataquery project, 2. have at least 2 machines running this, 3. load balance between those two
[08:38:21] I have at least done SOMETHING
[08:38:37] GerardM-: if you look at the list of people who can restart it, it's pretty large :D and covers almost all timezones
[08:38:40] YuviPanda: the system is designed to work like that
[08:38:53] GerardM-: in addition to that list there's me, core.n and andrew who can also restart it, now that they know how
[08:38:58] have multiple instances of the same software
[08:39:01] GerardM-: yeah, except it's running on only one machine now
[08:39:13] so I can blog about that
[08:39:22] do you mind?
[08:39:33] I will be positive
[08:39:47] in that we can do this to bring us to the next level
[08:40:22] GerardM-: sure! I consider everything I say on public channels public and fully bloggable/shareable :)
[08:40:47] GerardM-: I'd say step 1. puppetize, 2. load balance
[08:41:12] ... I prefer to ask anyway ... I do not want embarrassment but I do want good feelings ... it is the only thing that moves us forward
[08:42:36] GerardM-: I appreciate that, but for me, consider this a "no need to ask in future for public channels (not private channels or PMs) for such things"
[08:43:17] then I have to keep a list about it, because I forget
[08:43:29] allow me to be nice
[08:44:13] GerardM-: :) OK!
[08:47:29] :( it broke again
[08:51:27] GerardM-: heh, I've restarted it again, but this probably isn't an out-of-memory issue
[08:51:45] many people are using it
[08:52:00] and it does report on the memory it uses for a query
[08:52:55] Tool-Labs-tools-Other: Wikiviewstats does not support Wikidata - https://phabricator.wikimedia.org/T63833#823179 (Andyrom75) It seems that Hedonil do not connect since 08/2014. Can anyone support the resolution of db connection on his behalf? This is the error message: "Again something is messed up after Too...
[08:53:36] GerardM-: a temporary solution might be to set up a script to restart it every hour, but that's terrible
[08:53:52] not providing a service is worse
[09:17:38] YuviPanda: you may like this blogpost ... different subject though http://ultimategerardm.blogspot.nl/2014/12/wikidata-bangladesh-university-of.html
[09:18:28] my question to you ... would YOUR old university be interested in tagging all the old students in Wikidata?
[09:18:58] once you indicate what university it is, I will see what I can do
[09:19:18] GerardM-: heh, my college is only about 7 years old and doesn't have much :)
[09:19:29] GerardM-: https://en.wikipedia.org/wiki/KCG_College_of_Technology
[09:19:42] I wonder if it should be deleted
[09:20:18] no
[09:20:28] do you have a picture of the building?
[09:21:52] Wikidata now has a label in Tamil
[09:22:01] You can now find it on the Tamil Wikipedia
[09:22:11] கே சீ ஜீ தொழில்நுட்ப கல்லூரி
[09:22:20] nice
[09:22:56] do you have a picture of it that you can share?
[09:23:47] hmm, don't think so
[09:23:51] at least not one of the building
[09:26:46] You probably know how to get one
[09:27:50] GerardM-: true, but to keep my sanity intact I try not to deal with that place as much as possible :)
[09:28:13] ok
[09:29:08] https://tools.wmflabs.org/reasonator/?q=Q6326776&live I just added the place and the fact that Karapakkam is in Chennai
[09:33:49] GerardM-: yay
[09:35:13] I would mark all engineering colleges in India as an engineering college if my tool worked
[09:35:20] it is down yet again
[09:35:29] no it is not
[09:35:32] it is slow
[09:36:30] running
[09:36:44] over a thousand of them
[10:10:40] GerardM-: Could use a bit more help with this puzzle https://tools.wmflabs.org/autolist/autolist1.html?q=CLAIM%5B195%3A190804%5D%20AND%20NOCLAIM%5B170%5D ;-)
[10:10:51] (finding the painters who painted these)
[10:11:36] I am busy giving the engineering colleges in India a place
[10:12:06] a result of a conversation above with yuvi :)
[10:13:57] http://tools.wmflabs.org/autolist/index.php?language=en&project=wikipedia&category=&depth=12&wdq=&mode=undefined&statementlist=&run=Run&label_contains=&label_contains_not=&chunk_size=10000
[10:14:02] how about that?
[10:14:33] "painting by Benjamin Wolff" gives me 4 results
[10:24:45] GerardM-: https://www.wikidata.org/w/index.php?search=painting+by+Benjamin+Wolff&title=Special%3ASearch&go=Go ?
[10:25:05] I used WDQ
[10:25:08] it works well
[10:25:19] if there are more, I can do them all in one go
[10:31:07] how do you do that when there is no painter?
[10:31:14] Gerrit Kamphuysen is mentioned four times
[12:35:30] YuviPanda: hi
[12:35:48] do you know of some PHP-based skeleton that is ready to use on Tool Labs, with OAuth and stuff?
[12:36:03] I need to make some PHP app but I am too lazy to start it from scratch
[12:36:16] petan: sadly no.
[12:36:38] we should have some uniform Tool Labs skeleton that would be ready to fork :P
[12:36:51] I hate having to care about CSS, HTML and all that
[12:37:23] Completely agree
[12:37:40] Sets up config, some HTML/CSS and a helper for jsub
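A hedged sketch of the kind of uniform skeleton being asked for here. This is assumed code, not an existing Tool Labs project; the config.ini file and its tool_name key are made-up names, and OAuth is left out because it would need a registered consumer and a client library, so only the config + HTML + jsub-helper shape mentioned above is shown.

    <?php
    // Hedged sketch of a minimal Tool Labs tool skeleton as discussed above:
    // load a config file, render some HTML, and provide a small jsub helper.
    // config.ini and tool_name are hypothetical names for this sketch.
    $config = parse_ini_file(__DIR__ . '/config.ini') ?: array('tool_name' => 'my-tool');

    // Helper for submitting work to the grid; jsub is the Tool Labs job wrapper.
    // Each argument is escaped separately so the command line stays intact.
    function submit_job(array $argv) {
        $parts = array_map('escapeshellarg', $argv);
        return shell_exec('jsub ' . implode(' ', $parts));
    }
    ?>
    <!DOCTYPE html>
    <html>
      <head>
        <meta charset="utf-8">
        <title><?php echo htmlspecialchars($config['tool_name']); ?></title>
      </head>
      <body>
        <h1><?php echo htmlspecialchars($config['tool_name']); ?></h1>
        <!-- per-tool markup goes here -->
      </body>
    </html>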
[13:44:41] PROBLEM - Puppet failure on tools-exec-09 is CRITICAL: CRITICAL: 22.22% of data above the critical threshold [0.0]
[13:46:37] PROBLEM - Puppet failure on tools-exec-05 is CRITICAL: CRITICAL: 44.44% of data above the critical threshold [0.0]
[13:47:11] PROBLEM - Puppet failure on tools-redis is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[13:47:21] PROBLEM - Puppet failure on tools-exec-10 is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[13:48:28] PROBLEM - Puppet failure on tools-exec-03 is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[13:48:42] PROBLEM - Puppet failure on tools-exec-06 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [0.0]
[13:50:06] PROBLEM - Puppet failure on tools-webgrid-02 is CRITICAL: CRITICAL: 12.50% of data above the critical threshold [0.0]
[13:50:07] PROBLEM - Puppet failure on tools-exec-11 is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[13:50:44] PROBLEM - Puppet failure on tools-master is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [0.0]
[13:51:18] PROBLEM - Puppet failure on tools-webgrid-05 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [0.0]
[13:51:53] PROBLEM - Puppet failure on tools-exec-wmt is CRITICAL: CRITICAL: 44.44% of data above the critical threshold [0.0]
[13:52:21] PROBLEM - Puppet failure on tools-exec-13 is CRITICAL: CRITICAL: 22.22% of data above the critical threshold [0.0]
[13:53:25] PROBLEM - Puppet failure on tools-mail is CRITICAL: CRITICAL: 44.44% of data above the critical threshold [0.0]
[13:53:49] PROBLEM - Puppet failure on tools-exec-15 is CRITICAL: CRITICAL: 60.00% of data above the critical threshold [0.0]
[13:54:25] PROBLEM - Puppet failure on tools-exec-gift is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[13:54:57] PROBLEM - Puppet failure on tools-exec-12 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [0.0]
[13:54:59] PROBLEM - Puppet failure on tools-exec-08 is CRITICAL: CRITICAL: 30.00% of data above the critical threshold [0.0]
[13:55:31] PROBLEM - Puppet failure on tools-trusty is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[13:55:35] PROBLEM - Puppet failure on tools-webgrid-tomcat is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[13:56:27] PROBLEM - Puppet failure on tools-exec-14 is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[13:56:49] PROBLEM - Puppet failure on tools-webgrid-03 is CRITICAL: CRITICAL: 22.22% of data above the critical threshold [0.0]
[13:57:19] PROBLEM - Puppet failure on tools-webgrid-01 is CRITICAL: CRITICAL: 44.44% of data above the critical threshold [0.0]
[13:57:27] PROBLEM - Puppet failure on tools-exec-07 is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[13:58:35] PROBLEM - Puppet failure on tools-dev is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[13:59:43] PROBLEM - Puppet failure on tools-login is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[14:00:50] PROBLEM - Puppet failure on tools-exec-04 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [0.0]
[14:00:50] PROBLEM - Puppet failure on tools-webgrid-04 is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[14:01:22] PROBLEM - Puppet failure on tools-exec-catscan is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[14:01:26] PROBLEM - Puppet failure on tools-exec-cyberbot is CRITICAL: CRITICAL: 22.22% of data above the critical threshold [0.0]
[14:01:44] PROBLEM - Puppet failure on tools-exec-02 is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[14:01:56] PROBLEM - Puppet failure on tools-submit is CRITICAL: CRITICAL: 40.00% of data above the critical threshold [0.0]
[14:02:06] PROBLEM - Puppet failure on tools-exec-01 is CRITICAL: CRITICAL: 55.56% of data above the critical threshold [0.0]
[14:03:32] PROBLEM - Puppet failure on tools-shadow is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [0.0]
[18:26:00] hi, how can I avoid the new query limitation in the databases on Tool Labs? I can't get a script to run because of that
[18:31:14] and I also have two tools broken because of the query limitations
[18:44:50] these query limitations don't make sense: I have to run the same script many times until it manages to run the query (I'm using jstart to do that), which uses much more memory than running the query only once
[19:20:52] danilo: what limitations?
[19:24:49] I can't make heavy queries, the script returns the error "Lost connection to MySQL server during query"
[19:27:03] in yesterday's log of this channel I found this comment: guillom: I know Sean put in some query memory limits to avoid the bigger problem of database desync - you may be hitting against it if your query is particularly heavy.
[19:28:25] I am having the same problem
[19:36:58] danilo: why not just fix your queries?
[19:38:43] I'm doing research; I need to search the whole recentchanges table
[19:38:46] odds are your queries could use a bit of work
[19:39:03] danilo: what are you searching for?
[19:41:28] counting edits, grouped by some params
[19:41:49] I have a tool that also does that but it is broken now: http://tools.wmflabs.org/ptwikis/Patrulhamento_de_IPs
[19:42:47] danilo: odds are if you give me more information than the vague "my stuff is broken" I can help you fix your tools
[19:44:52] my English is bad, it is hard for me to explain the details
[19:49:47] is there some way to make the MySQL server allow heavy queries, something like the "SLOW OK" comment in queries that was used on the Toolserver?
[19:50:26] danilo: you're doing it the wrong way; instead of fixing the server you need to fix your query
[19:51:10] danilo: and I doubt that you're hitting the query killer; odds are your script is doing something else wrong
[19:52:51] danilo: especially if you're just limiting yourself to the RC table
[20:10:00] Betacommand: the two queries of the broken tool: https://bitbucket.org/danilomac/ptwikis/src/83ad41b6744c4cebee81f12e586a8c17491859a0/tools/Patrulhamento_de_IPs.py?at=master#cl-170
[20:15:50] danilo: eswiki?
[20:15:57] or ptwiki?
[20:16:03] the tool makes a chart of IP edits, patrolled IP edits and rollback edits per hour for the last week and per day for the last 30 days
[20:16:05] ptwiki
[20:18:34] danilo: the first thing I would do is split that into three different queries
[20:19:21] why are you using SUM(rc_user = 0)?
[20:20:44] to get the number of IP edits
[20:21:07] that's the wrong way to do that
[20:21:56] ok, I will split it into three queries
[20:23:38] danilo: using COUNT() is far better than SUM for getting the number of results
[20:24:06] SUM makes the database add all the values of rc_user together
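A hedged sketch of the rewrite being suggested here: filter in WHERE and count with COUNT(*) rather than summing a boolean over every row. This is assumed code, not danilo's actual tool; the replica host and database names follow the usual Tool Labs convention, and the seven-day cutoff is only illustrative.

    <?php
    // Hedged sketch of one of the split-out queries: count IP edits for the last
    // week with COUNT(*) and a WHERE filter instead of SUM(rc_user = 0).
    // Assumptions: standard Tool Labs replica naming and replica.my.cnf credentials.
    $cfg = parse_ini_file(getenv('HOME') . '/replica.my.cnf');
    $db  = new mysqli('ptwiki.labsdb', $cfg['user'], $cfg['password'], 'ptwiki_p');

    $sql = "SELECT COUNT(*) AS ip_edits
            FROM recentchanges
            WHERE rc_user = 0
              AND rc_timestamp > DATE_FORMAT(NOW() - INTERVAL 7 DAY, '%Y%m%d%H%i%s')";
    $row = $db->query($sql)->fetch_assoc();
    echo "IP edits in the last week: {$row['ip_edits']}\n";

Running each metric as its own query of this shape also lines up with the "split that into three different queries" suggestion above.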
[20:44:29] danilo: if you continue to get the "server has gone away" messages let me know; it's not the server killing your query
[20:46:35] Betacommand: ok
[20:48:38] Betacommand: I get those messages. Do I understand this right, that this is an error and it is not my mistake?
[20:57:29] annika_: no, this tool always worked fine; it stopped working a few days ago. I never got these messages before when running queries, and now I'm receiving the same errors in more than one query
[21:00:23] problems with mariadb today?
[21:00:25] ERROR 2006 (HY000): MySQL server has gone away
[21:00:26] No connection. Trying to reconnect...
[21:00:26] Connection id: 819396
[21:00:26] Current database: commonswiki_p
[21:09:01] Steinsplitter: yes, I'm having the same problem
[21:13:46] danilo: Steinsplitter: file a ticket
[21:55:52] Betacommand: I have split the first query into three and removed the second query, but I'm still getting the same error
[22:21:54] where do I file a ticket?
[23:11:43] danilo: https://phabricator.wikimedia.org/
[23:13:07] thanks
[23:20:31] Change on mediawiki a page OAuth (obsolete info)/ja was created, changed by Shirayuki link https://www.mediawiki.org/w/index.php?title=OAuth+(obsolete+info)%2fja edit summary: Created page with "OAuth"
[23:20:43] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305487 edit summary: Created page with "プラットフォーム"
[23:21:02] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305489 edit summary: Created page with "== タイムライン =="
[23:21:20] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305491 edit summary: Created page with "==背景=="
[23:21:29] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305493 edit summary: Created page with "== コミュニケーション =="
[23:22:23] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305495 edit summary: Created page with "==関連項目=="
[23:22:54] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305497 edit summary: Created page with "== 以前の議論 =="
[23:46:42] Change on mediawiki a page OAuth (obsolete info) was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305501 edit summary: translation tweaks
[23:54:53] Change on mediawiki a page OAuth (obsolete info) was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305502 edit summary: translation tweaks
[23:55:25] Change on mediawiki a page OAuth (obsolete info) was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305503 edit summary: Marked this version for translation
[23:55:28] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305504 edit summary: Updating to match new version of source page
[23:55:38] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305507 edit summary: Importing a new version from external source
[23:55:44] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305511 edit summary: Importing a new version from external source
[23:55:45] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305527 edit summary: Importing a new version from external source
[23:55:45] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305539 edit summary: Importing a new version from external source
[23:55:46] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305556 edit summary: Importing a new version from external source
[23:55:46] Change on mediawiki a page OAuth (obsolete info)/en was modified, changed by FuzzyBot link https://www.mediawiki.org/w/index.php?diff=1305559 edit summary: Importing a new version from external source
[23:56:07] Change on mediawiki a page OAuth (obsolete info)/ja was modified, changed by Shirayuki link https://www.mediawiki.org/w/index.php?diff=1305574 edit summary: Created page with "状態"