[11:40:50] !log tools.mabot Updated mabot to ff78955.
[11:40:51] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.mabot/SAL
[16:55:12] !help
[16:55:12] audiodude: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[16:55:28] hi audiodude !
[16:55:36] hello
[16:55:39] do you need help with anything?
[16:55:49] I'm rewriting a tool and the new Python version is getting "(2013, 'Lost connection to MySQL server during query')" while the old tool has no issues
[16:56:00] the query is very large, probably half a million rows
[16:57:11] This is the code in question: https://pastebin.com/i5H0pAd6
[16:57:25] the error happens on line 16, during cursor.execute
[16:57:58] according to this MySQL help article (https://dev.mysql.com/doc/refman/5.7/en/error-lost-connection.html) it's most likely the `net_read_timeout` variable
[16:58:10] but I know that we can't adjust that globally
[17:01:16] this is the corresponding code in the old Perl version: https://github.com/openzim/wikimedia_wp1_bot/blob/master/backend/toolserver_api.pl#L168
[17:01:52] arturo: so that's what I need help with :)
[17:02:07] oh I see
[17:03:55] wikidb is a pymysql connection to the enwiki_p replica, just FYI
[17:04:04] audiodude: I recommend you open a phabricator task
[17:04:21] (or wait here for other people familiar with that stuff to show up)
[17:05:09] why not both? :)
[17:05:22] :-) sure
[17:05:55] audiodude: Perl and Python handle objects very differently. It's possible the Python version is slower... slowing down the query until it gets cut off (unless you got caught in the middle of some sort of maintenance).
[17:06:03] Is the Perl version still running?
[17:06:28] the Perl version is still running, yes
[17:06:41] it wasn't a network fluke or maintenance, it's repeatable
[17:07:20] the `slow ok` comment doesn't really do anything anymore, btw
[17:08:59] cool, I'll kill that when I get a chance
[17:10:22] Have you tried doing fetchone calls instead?
[17:10:48] instead of fetchmany?
[17:10:59] No. But it doesn't even get to the fetch code
[17:11:04] it dies at cursor.execute
[17:11:22] That's... odd
[17:13:05] That places the problem more on the library than the method
[17:17:31] audiodude: what are the params for your connection?
[17:17:36] That's not in your paste
[17:18:36] https://github.com/openzim/wikimedia_wp1_bot/blob/master/lucky/lucky/wiki_db.py
[17:19:58] maybe the problem is with DictCursor?
[17:20:54] I suspect so.
[17:23:41] You could try an unbuffered dict cursor: SSDictCursor
[17:23:54] It introduces some new bits of fun, of course :)
[17:25:12] I created a phabricator task anyway, but I'll try switching that out: https://phabricator.wikimedia.org/T226038
[17:28:15] halfak: Hello, if I'm not wrong you are the maintainer of python-mwoauth. I have filed an issue, can you take a look at it?
[17:28:45] Hey! Yes. I can take a look.
[17:28:46] halfak: https://github.com/mediawiki-utilities/python-mwoauth/issues/31
[17:29:23] halfak: Thanks :)
[20:57:32] !log tools.quickcategories deployed 99f9d3f4c2 (copy nav below lists)
[20:57:34] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.quickcategories/SAL
[22:11:16] Hi, can I gain access to https://gerrit.wikimedia.org/r/analytics/analytics.wikimedia.org I want to make some small changes
[22:16:36] terrrydactyl, this might help? (assuming you know about gerrit generally) https://wikitech.wikimedia.org/wiki/Analytics/analytics.wikimedia.org
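(Editor's note on the SSDictCursor suggestion at [17:23:41]: with pymysql's default buffered cursors, execute() pulls the entire result set to the client before returning, which is consistent with a half-million-row query dying inside cursor.execute rather than during fetch. A minimal sketch of the streaming approach follows; the batching helper and the commented-out connection parameters are illustrative placeholders, not the tool's actual code.)

```python
def stream_rows(cursor, query, batch_size=1000):
    """Execute `query` and yield rows one at a time, fetching in batches.

    With an unbuffered cursor (e.g. pymysql's SSDictCursor), rows are
    pulled from the server as they are consumed instead of all being
    buffered in client memory during execute().
    """
    cursor.execute(query)
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        for row in batch:
            yield row

# Hypothetical pymysql usage (host/db/paths are placeholders):
#
# import pymysql
# import pymysql.cursors
#
# conn = pymysql.connect(
#     host='enwiki.analytics.db.svc.wikimedia.cloud',
#     db='enwiki_p',
#     read_default_file='~/replica.my.cnf',
#     cursorclass=pymysql.cursors.SSDictCursor,  # unbuffered, dict rows
# )
# with conn.cursor() as cursor:
#     for row in stream_rows(cursor, 'SELECT page_id, page_title FROM page'):
#         ...  # process one row at a time
```

One of the "new bits of fun" mentioned above: with an unbuffered cursor you must consume (or discard) the whole result set before issuing another query on the same connection.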