[00:23:50] 6Labs, 10Tool-Labs: Delete tool labs project "commonstools" - https://phabricator.wikimedia.org/T99868#2021209 (10Fastily) 5Open>3Resolved a:3Fastily Tim just [[ https://wikitech.wikimedia.org/w/index.php?title=User_talk:Tim_Landscheidt&oldid=293552#commonstools | took care of this ]].
[00:49:50] legoktm: bd808 I'm not running master but a version patched by myself
[00:50:28] liangent: cool. I sent a warning email out
[00:50:33] but I (ir?)regularly pull from master and re-apply my patches on top of it
[00:51:13] liangent: https://lists.wikimedia.org/pipermail/labs-l/2016-February/004291.html
[00:51:50] Do you run MW on the Tool grid?
[00:51:51] bd808: I've already specified -l release=trusty for all my jobs
[00:51:58] perfect
[00:52:08] https://zh.wikipedia.org/w/index.php?title=User:Liangent-bot/crontab/liangent-php&diff=38262111&oldid=38240689
[00:53:06] I can read the crontab lines in that diff :)
[00:53:20] the rest is beyond my skills
[00:55:09] bd808: the remaining lines are descriptions of jobs
[00:56:05] I figured as much. And seeing that makes me less scared about you using MW on the job grid. Custom maintenance jobs make sense.
[00:56:32] I was wondering if anyone was trying to run an actual wiki on the grid
[00:57:30] bd808: well I'm running it too...
[00:57:53] not an actual one, but effectively a mirror of wikis
[00:58:16] we could probably get you a lot more performance on a Labs instance.
[00:58:27] NFS is sloooooow
[00:58:41] bd808: well I'm not really depending on it
[00:58:48] *nod*
[00:59:04] the most-used part is the maintenance scripts
[00:59:26] tools.wmflabs.org/liangent-php/index.php/enwiki?title=Main_Page should be loading after a long wait...
[01:01:22] wow. that's even worse than I'd guessed it would be
[01:01:34] for slow I mean
[01:04:00] bd808: anyway I disabled a bunch of caches
[01:04:15] because they won't be updated or cleared when db content changes
[01:04:56] so do you pull page content from the dumps or somewhere else?
[01:05:06] * bd808 is just curious
[01:05:46] bd808: I pull from the db replica
[01:06:06] for page text as well as write operations, I either patched or hooked into those methods
[01:06:15] and interact with the real server using the api
[01:06:59] interesting. subbu is working on setting up something similar to test Parsoid changes but I think he decided that it wasn't possible to use the replica DBs
[01:07:32] He's planning on loading dumps into his own db instead
[01:07:45] Sounds like you have figured out a way that actually works
[01:08:45] bd808: ya and in those customized maintenance scripts (as bots), I just do WikiPage::factory( ... )->doEdit() to push
[01:13:55] 6Labs, 10Tool-Labs, 10pywikibot-core: pywikibot broken on wikimedia labs - https://phabricator.wikimedia.org/T126666#2021312 (10Ladsgroup) OK, I think [[https://github.com/pywikibot/Pywikibot-nightly-creator/pull/3|PR]] will fix the issue for now. Feel free to merge it. I know it's not the best option possible....
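The setup liangent describes above, reading from the Labs database replicas and pushing edits back through the live API, is a recurring Tool Labs pattern. Below is a minimal Python sketch of that read/write split. The replica host name, the `~/replica.my.cnf` credentials file, and the final edit call are assumptions based on common Tool Labs conventions; liangent's actual code is PHP inside MediaWiki (the `WikiPage::factory( ... )->doEdit()` call quoted above), so this is an analogue, not their implementation.

```python
# Hypothetical sketch of the read-from-replica / write-via-API split
# discussed above. Host, db, and credential-file names follow Tool Labs
# conventions; none of this is liangent's actual (PHP) code.
import configparser
import os

import pymysql
import requests

cfg = configparser.ConfigParser()
cfg.read(os.path.expanduser('~/replica.my.cnf'))

# Read side: the replica databases are read-only mirrors of production.
conn = pymysql.connect(
    host='enwiki.labsdb',
    db='enwiki_p',
    user=cfg['client']['user'].strip("'"),
    password=cfg['client']['password'].strip("'"),
    charset='utf8mb4',
)
with conn.cursor() as cur:
    cur.execute(
        "SELECT page_id, page_title FROM page "
        "WHERE page_namespace = 0 LIMIT 5")
    pages = cur.fetchall()

# Write side: edits have to go through the real server's API, since the
# replicas are read-only.
session = requests.Session()
api = 'https://en.wikipedia.org/w/api.php'
token = session.get(api, params={
    'action': 'query', 'meta': 'tokens', 'format': 'json',
}).json()['query']['tokens']['csrftoken']
# After logging in, an edit would be pushed with something like:
# session.post(api, data={'action': 'edit', 'title': ..., 'text': ...,
#                         'token': token, 'format': 'json'})
```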
[04:48:42] !log tools.stashbot Upgraded to 00ef845 (T108720)
[04:48:45] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master
[04:50:46] !log tools.stashbot Live hack cut-n-paste code error (T108720)
[04:50:49] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master
[04:54:00] !log tools.stashbot Third time is a charm (T108720)
[04:54:03] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master
[04:56:50] !log tools.stashbot Testing task mentions (T108720)
[04:56:53] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master
[05:07:19] 6Labs, 10Tool-Labs: Python2 on trusty exec nodes does not support SNI - https://phabricator.wikimedia.org/T126714#2021612 (10bd808) 3NEW
[05:20:09] 10Wikibugs: Figure out whether we can use the new Phabricator API updates to get comment link anchors instead of screenscraping - https://phabricator.wikimedia.org/T126715#2021628 (10Legoktm) 3NEW
[05:23:24] 6Labs, 10Tool-Labs: Python2 on trusty exec nodes does not support SNI - https://phabricator.wikimedia.org/T126714#2021636 (10Legoktm) If you `pip install requests[security]`, that should install the missing python libraries. We'd have to upgrade to 2.7.9 or higher for it to be fixed properly in Python.
[06:09:14] !log tools.stashbot Testing configurable comment format (T108720)
[06:09:17] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master
[06:27:47] OMG
[06:27:51] THAT'S AWESOME
[06:27:59] * legoktm hugs bd808
[08:25:11] 6Labs: Instance discourse.search.eqiad.wmflabs in SHUTDOWN state - https://phabricator.wikimedia.org/T126191#2021724 (10AdHuikeshoven) 5Resolved>3Open
[08:25:12] 6Labs, 10Tool-Labs: Python2 on trusty exec nodes does not support SNI - https://phabricator.wikimedia.org/T126714#2021726 (10valhallasw) `requests[security]` installs: ``` extras_require={ 'security': ['pyOpenSSL>=0.13', 'ndg-httpsclient', 'pyasn1'], }, ``` pyOpenSSL and pyasn1 are installed gl...
[08:25:42] 6Labs, 10Tool-Labs: Python2 on trusty exec nodes does not support SNI - https://phabricator.wikimedia.org/T126714#2021727 (10valhallasw) (that won't help in your virtualenv, though, unless you created it with `--system-site-packages`)
[08:25:47] 10Labs-Other-Projects: Succesful pilot of Discourse on https://discourse.wmflabs.org/ as an alternative to wikimedia-l mailinglist - https://phabricator.wikimedia.org/T124690#2021728 (10AdHuikeshoven)
[08:25:49] 6Labs: Instance discourse.search.eqiad.wmflabs in SHUTDOWN state - https://phabricator.wikimedia.org/T126191#2007200 (10AdHuikeshoven)
[08:27:18] 10Wikibugs: Figure out whether we can use the new Phabricator API updates to get comment link anchors instead of screenscraping - https://phabricator.wikimedia.org/T126715#2021731 (10valhallasw) Yep, should be doable. {T1176} also still needs to be converted.
[11:54:24] 10Tool-Labs-tools-Other: WIWOSM: Options cannot go away - https://phabricator.wikimedia.org/T126673#2022054 (10Aklapper)
[11:56:16] 10Tool-Labs-tools-Other: WIWOSM: Options cannot go away - https://phabricator.wikimedia.org/T126673#2020465 (10Aklapper) The map comes from WIWOSM on tools.wmflabs.org (and is not part of MediaWiki). I'm not sure who maintains it or where the bugtracker is...
[11:59:49] hi guys. i'm seeing an error i don't understand using the database from tool labs
[12:00:19] could someone help me see it?
[12:01:06] http://pastebin.com/U3cybbX4
[12:08:55] it seems i can't export info from the databases into a file using my user
[12:09:20] YuviPanda: is it like this?
[12:29:07] Maybe someone can help me? After the last mediawiki update at group2, my bot's login stopped working
[12:33:15] if someone can help me with this, please ping my bouncer, I will answer when I'm back
[13:00:20] Hello! I've been trying to validate a gerrit account since yesterday! Now I discover that it's Google-designed; is that the reason why it is so....
[13:03:33] 6Labs, 10Labs-Infrastructure: Instance console does not gives output / keystroke access - https://phabricator.wikimedia.org/T64847#2022210 (10hashar) p:5Normal>3Low "we" Labs ops. If an instance is locked on boot, the console access does not let you connect or does not accept key strokes. This task is...
[15:31:02] How come "jstart" takes about a minute to complete recently?
[15:33:56] not sure, can you give me an example of a specific command?
[15:42:57] I'm not having the same experience now
[15:46:07] jynus, Krenair: https://etherpad.wikimedia.org/p/keystoneldap
[15:46:42] if you want to do any kind of backups first, those are not immediate
[15:47:14] jynus: the backup process is slow, you mean?
[15:47:18] andrewbogott: what is that? :)
[15:47:21] usually
[15:47:24] but hot
[15:47:28] the keystone db is tiny and mostly consists of empty tables, so I don’t think it’ll be a problem.
[15:47:33] ok
[15:47:39] paravoid: that’s the step-by-step for today’s maintenance
[15:47:47] what is it about?
[15:47:53] https://phabricator.wikimedia.org/T115029
[15:48:46] paravoid: looks like I sent the announcement to labs-l and wikitech-l but not ops.
[15:48:58] I read labs-l too but I must have missed it, sorry
[15:49:15] subj: Wikitech maintenance and CI downtime tomorrow, 2015-02-12 16:00 UTC
[15:49:29] interesting
[15:49:39] messy :(
[15:50:25] andrewbogott, I guess we should start merging the patches
[15:50:29] Krenair: I propose that we +2 those branch patches in gerrit before I turn off keystone, because otherwise we’ll lose jenkins auto-merge.
[15:50:36] ah, yes, as you say :)
[15:51:30] Also you have very little trust in mysql's speed- which will be lower, but I do not think *user-noticeable* lower
[15:51:51] jynus: mysql isn’t the issue, it’s all the REST calls
[15:51:57] ah!
[15:51:58] jynus: just emailed labs-l explaining that
[15:52:53] jynus: my description of this obscured that there are two changes: OSM will be reading/writing via keystone AND keystone will be using the mysql backend.
[15:52:57] It’s the first part that’s slow
[15:53:05] (previously OSM read/wrote to ldap directly)
[15:53:19] ok
[15:58:06] 6Labs, 10Labs-Infrastructure, 7Tracking: Labs instances sometimes freeze - https://phabricator.wikimedia.org/T124133#2022561 (10chasemp) re: 'tools-webgrid-lighttpd-1208.tools.eqiad.wmflabs It all ties back to NFS but some details. ``` top - 20:06:16 up 18 days, 20 min, 0 users, load average: 157.00,...
[15:58:24] 6Labs, 10Tool-Labs: tools-webgrid-lighttpd-1208.eqiad.wmflabs hangs - https://phabricator.wikimedia.org/T125770#2022568 (10chasemp) 5Open>3Resolved a:3chasemp https://phabricator.wikimedia.org/T124133#2022561
[15:58:28] 6Labs, 10Labs-Infrastructure, 7Tracking: Labs instances sometimes freeze - https://phabricator.wikimedia.org/T124133#2022571 (10chasemp)
[16:00:42] Krenair: ok, I’m going to start disabling wikitech things
[16:00:49] -ops channel
[16:02:55] ok, it takes 4 seconds to perform that backup, we could survive :-)
[16:03:37] I'm accustomed to at least 1 TB of data
[16:11:18] labs-morebots: how’s it going?
[16:11:18] I am a logbot running on tools-exec-1207.
[16:11:19] Messages are logged to wikitech.wikimedia.org/wiki/Server_Admin_Log.
[16:11:19] To log a message, type !log .
[16:33:35] !log deployment-prep starting to ship logs from elasticsearch to logstash (https://gerrit.wikimedia.org/r/#/c/269100/)
[16:33:41] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Deployment-prep/SAL, Master
[16:33:51] Luke081515|away: Did you figure out the problem with your bot? What framework are you using? The most common problem we have seen so far with bots and 1.27.0-wmf.13 is the token not being URL encoded. Second most common is not handling Set-Cookie headers that are deleting cookies.
[16:39:06] ebernhardson: you around?
[16:41:59] MusikAnimal: yp
[16:42:19] I don't think prefixsearch is returning redirects
[16:42:33] it seems it does *sometimes*
[16:44:28] I guess I have to use opensearch
[16:44:34] MusikAnimal: opensearch === prefixsearch
[16:44:49] MusikAnimal: the only difference is that opensearch is a public spec for the response format that is, imo, insane
[16:45:19] well opensearch seems to do the job, while prefixsearch does not
[16:45:27] MusikAnimal: demo urls?
[16:45:48] http://tools.wmflabs.org/pageviews and try to search for "Gooogle"
[16:45:55] this worked at one point, it does not anymore
[16:46:32] 10Tool-Labs-tools-Other: WIWOSM: Options cannot go away - https://phabricator.wikimedia.org/T126673#2022692 (10Nnemo) I have Safari on iOS 9.2.1 on an iPad in landscape.
[16:47:05] 10Tool-Labs-tools-Other, 7Accessibility: WIWOSM: Options cannot go away - https://phabricator.wikimedia.org/T126673#2022694 (10Nnemo)
[16:48:22] I mean, opensearch seems to go very fast
[16:49:22] 6Labs, 10Labs-Infrastructure, 10labs-sprint-117, 10labs-sprint-118, and 3 others: Clean up after ldap->mysql keystone migration - https://phabricator.wikimedia.org/T126758#2022698 (10Andrew) 3NEW a:3Andrew
[16:49:23] it says at mw:API:Prefixsearch that "Depending on the search engine backend, [the best-matching titles] might include typo correction, redirect avoidance, or other heuristics."
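An aside on the 1.27.0-wmf.13 bot-login breakage bd808 describes above (tokens not URL-encoded, Set-Cookie deletions ignored): a minimal Python sketch of the legacy action=login flow that avoids both failure modes. The wiki URL and credentials are placeholders; `requests.Session` does the cookie bookkeeping, and posting the token as form data keeps it properly encoded.

```python
# Sketch of the legacy action=login flow, illustrating fixes for the two
# failure modes mentioned above. Credentials are placeholders.
import requests

api = 'https://en.wikipedia.org/w/api.php'
session = requests.Session()   # honors Set-Cookie, including deletions

# Step 1: ask for a login token; the reply has result=NeedToken.
r1 = session.post(api, data={
    'action': 'login', 'lgname': 'ExampleBot', 'format': 'json',
})
token = r1.json()['login']['token']

# Step 2: post the token back as form data, so any special characters
# in it are URL-encoded by requests rather than sent raw.
r2 = session.post(api, data={
    'action': 'login', 'lgname': 'ExampleBot',
    'lgpassword': 'not-a-real-password', 'lgtoken': token,
    'format': 'json',
})
print(r2.json()['login']['result'])   # 'Success' if all went well
```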
[16:50:07] MusikAnimal: i toyed with it, but i only get the exact same results from prefix vs opensearch for google
[16:50:10] https://en.wikipedia.org/w/api.php?action=opensearch&format=json&search=goog&cirrusUseCompletionSuggester=yes
[16:50:16] https://en.wikipedia.org/w/api.php?action=query&list=prefixsearch&format=json&pssearch=goog&cirrusUseCompletionSuggester=yes
[16:50:32] they are formatted differently, but the same titles in the same order
[16:50:37] 6Labs, 10Labs-Infrastructure: Clean up after ldap->mysql keystone migration - https://phabricator.wikimedia.org/T126758#2022708 (10Krenair)
[16:50:51] 6Labs, 10Labs-Infrastructure, 10MediaWiki-extensions-OpenStackManager: Clean up after ldap->mysql keystone migration - https://phabricator.wikimedia.org/T126758#2022698 (10Krenair)
[16:53:59] hmm I don't know if it's my implementation, or what. Here's the diff https://github.com/MusikAnimal/wiki-tools/commit/cad3afdacb9897141091b0f6ff79056ecf5e2978
[16:54:09] one will return redirects, the other does not
[16:55:29] ah, I guess it's the "articleSuggestionCallback" that does the trick
[16:55:36] https://en.wikipedia.org/w/api.php?callback=articleSuggestionCallback&action=opensearch&format=json&search=Gooog&redirects=return
[16:56:05] that versus https://en.wikipedia.org/w/api.php?action=opensearch&format=json&search=Gooog&redirects=return
[17:00:41] ebernhardson: is this worthy of a phab report? I think we just need to make opensearch accept the "redirects=return" parameter like opensearch does
[17:01:15] *prefixsearch accept redirects=return
[17:06:40] 6Labs, 10Labs-Infrastructure, 10labs-sprint-117, 10labs-sprint-118, and 3 others: Lock down access for new keystone role model - https://phabricator.wikimedia.org/T126765#2022794 (10Andrew) 3NEW a:3Andrew
[17:19:52] 6Labs, 10Labs-Infrastructure, 6operations: labservices1001 ran out of disk space - https://phabricator.wikimedia.org/T126572#2022935 (10Dzahn) a:3Andrew
[17:24:51] bd808: The reason for the problem was wmf.13, I used the old method via API to get cookies. (I'm using my own framework), but another user gave me a piece of code to fix this... going to test this now
[17:24:58] MusikAnimal: still looking, it's not directly related to redirects=return, that doesn't explain the difference in number of results. Something else odd is going on
[17:25:09] compare https://en.wikipedia.org/w/api.php?callback&action=opensearch&search=Gooog vs https://en.wikipedia.org/w/api.php?action=opensearch&search=Gooog
[17:25:29] the callback query param shouldn't have any effect on the results...but it seems to :S
[17:25:35] yeah you need the callback
[17:25:36] ha
[17:26:21] Luke081515|Busy: awesome. If you need more help ping anomie and/or open a Phabricator task. We are trying to help bot owners deal with the SessionManager changes as quickly as we can.
[17:27:21] bd808: Thanks for your offer, but I realized that it works now, the bot made some edits a few seconds ago :)
[17:27:31] sweet
[17:27:41] for fun, i only need that callback from the web request, running an api request through the prod debug console doesn't expand out the redirects ...
[17:27:48] bd808: But one question: Is there a plan when bot owners have to switch to bot passwords?
[17:27:53] ebernhardson: interesting
[17:28:39] MusikAnimal: well, the truly odd thing is based on how i know our search works, to get those two very different responses a completely different request had to be made from mediawiki to elasticsearch, but i'm not seeing what would flip between them
[17:29:05] Luke081515|Busy: There is no firm date. When AuthManager ships then it will be possible for wikis to start changing config in such a way that the new auth flow will be required for bots not using OAuth or bot passwords.
[17:29:32] Luke081515|Busy: We will try to make lots of announcements before that happens
[17:30:04] You should be able to switch now however by generating a bot password and changing the username and password you use to auth
[17:30:05] bd808: So it depends on the wiki config, ok.
[17:30:09] yeah
[17:30:13] ebernhardson: yeah as I said, the doc seems to imply it resolves redirects. And in fact removing redirects=return doesn't change anything
[17:30:17] ebernhardson: To add to your confusion, when I access those two URLs they give the same results.
[17:31:02] bd808: What do you think of creating a phab project, where bot owners can subscribe, and the project will be added as CC if there is an important API change?
[17:31:04] Luke081515: AuthManager will allow a wiki to offer or require things like password expiration and 2factor auth. Features like that will break using the old login api
[17:31:09] anomie: the second one doesn't return a result for "Gooogle" as it should
[17:31:28] MusikAnimal: It does for me...
[17:31:46] both logged in and anon.
[17:31:52] Luke081515: we already have https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
[17:32:09] anomie: this one? https://en.wikipedia.org/w/api.php?action=opensearch&search=Gooog
[17:32:11] Hi all. I keep forgetting where to find this: I am looking for documentation on the format of the passwd file that would let your bot log itself in when it gets logged out
[17:32:15] I'd rather not make lots of fragmented places to update and look
[17:32:17] MusikAnimal: Yes
[17:32:24] * Luke081515 should better subscribe to that list^^
[17:32:25] what the
[17:32:45] yeah that's what I was saying earlier, that I think it's inconsistent
[17:33:12] there's definitely no "Gooogle" (three o's) for me at https://en.wikipedia.org/w/api.php?action=opensearch&search=Gooog
[17:33:37] this is how i'm receiving it: http://i.imgur.com/Dr1o0tD.png
[17:33:39] something very odd ...
[17:35:27] please make wikibugs work in -operations again
[17:35:39] (the usual thing: when it gets restarted it stops talking there)
[17:35:44] ebernhardson: try in incognito
[17:36:02] when I am logged out the first one (without callback) shows the redirects
[17:36:43] MusikAnimal, ebernhardson: Does getting the different results depend on some beta feature or user preference? Because adding the "callback" parameter causes the API to force the request to be processed as if you're not logged in.
[17:37:19] MusikAnimal: oh silly me ... it's the beta feature.
[17:37:32] although how it varies based on that callback= param ... doesn't seem right
[17:37:49] ah yes, the completion suggester?
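To make anomie's point above concrete (callback= forces the request to be processed as logged-out), here is a small Python sketch comparing the two calls. One caveat: a plain script carries no login cookies in the first place, so both results should match here; the divergence MusikAnimal saw only appears from a logged-in browser session with the completion-suggester beta feature enabled.

```python
# Sketch comparing the same opensearch query with and without callback=.
# From a cookie-less script both should return the same titles; the
# difference discussed above comes from logged-in beta features.
import requests

api = 'https://en.wikipedia.org/w/api.php'

plain = requests.get(api, params={
    'action': 'opensearch', 'search': 'Gooog', 'format': 'json',
}).json()

jsonp = requests.get(api, params={
    'action': 'opensearch', 'search': 'Gooog', 'format': 'json',
    'callback': 'articleSuggestionCallback',
})

print(plain[1])         # suggested titles (can include 'Gooogle')
print(jsonp.text[:60])  # JSONP response, wrapped in the callback name
```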
[17:37:52] MusikAnimal: yes
[17:38:14] well as nice as it is, for my purposes I don't want completion suggester =P
[17:38:44] but I'd prefer not to have to do any hacks either, which I'm guessing is the case with `callback`
[17:38:55] MusikAnimal: soon-ish it will be the default for main namespace, i suppose we'll have to work up some sort of bypass
[17:39:13] MusikAnimal: hopefully you can understand that in terms of showing users what they want in an autocomplete, misspellings are almost never it :)
[17:39:21] oh definitely
[17:39:35] a great feature for readers
[17:40:37] https://www.mediawiki.org/wiki/API:Opensearch says `redirects=return` will return the redirect. I guess that broke with the beta feature
[17:41:03] !log tools.stashbot Upgraded to c599c8f to use requests[security] for SNI support (T126714)
[17:41:06] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master
[17:41:08] 6Labs, 10Tool-Labs: Python2 on trusty exec nodes does not support SNI - https://phabricator.wikimedia.org/T126714#2023074 (10Stashbot) {nav icon=file, name=Mentioned in SAL, href=https://tools.wmflabs.org/sal/log/AVLWkmHBhQaf1CQcCd40} [2016-02-12T17:41:03Z] Upgraded to c599c8f to use requests[security]...
[17:41:38] MusikAnimal: if you look at the code in ApiOpenSearch it actually never checks for redirects=return
[17:41:54] oh wonderful haha
[17:42:03] 6Labs, 10Tool-Labs, 15User-bd808: Python2 on trusty exec nodes does not support SNI - https://phabricator.wikimedia.org/T126714#2023083 (10bd808) 5Open>3Resolved a:3bd808 Using `requests[security]` works like a charm. Thanks @legoktm.
[17:42:40] MusikAnimal: doesn't look like it ever has, it's an oddly named thing :P All it does is check if redirects === 'resolve', and if it is set then it will take any results from the search engine and see if they are redirects. But it never tells the search engine about that argument
[17:43:26] ebernhardson: well maybe we could implement that into prefixsearch? easier said than done, I'm sure
[17:43:43] or say, the `redirects=return` just bypasses the beta feature
[17:43:47] MusikAnimal: well, depends what you want. Part of it is the search data model, redirects don't exist as their own pages
[17:43:57] MusikAnimal: instead for every real page it contains a list of incoming redirects
[17:44:14] we want any and every page to be returned that matches the search input
[17:44:14] err, i dunno if that makes sense :P elasticsearch is a document model rather than columns/rows like an sql db
[17:44:42] or top 10 or whatever, not necessarily EVERY page, but you get the point
[17:47:14] MusikAnimal: i've added https://phabricator.wikimedia.org/T126782 i imagine we can get to it in the next week or two
[17:47:49] awesome, thank you!
[18:09:44] 6Labs, 6operations: Some labs instances IP have multiple PTR entries in DNS - https://phabricator.wikimedia.org/T115194#2023269 (10dduvall)
[18:18:18] Damianz_: hi are you here
[18:18:29] Damianz_: I need ur help :o u got mac right?
[18:22:28] petan: did you figure out how to get the session handling working?
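For anyone landing here from T126714: the fix stashbot shipped above amounts to installing the `requests[security]` extras and letting requests pick them up automatically. A minimal sketch, with a placeholder URL; any SNI-requiring HTTPS host demonstrates the same thing.

```python
# On Python < 2.7.9 the stdlib ssl module cannot send the SNI extension,
# so TLS vhosts sharing an IP serve the wrong certificate. Installing
#
#     pip install 'requests[security]'
#
# pulls in pyOpenSSL, ndg-httpsclient, and pyasn1, which requests then
# uses transparently; no code changes are needed beyond the install.
import requests

# Placeholder URL; substitute the SNI-requiring endpoint you care about.
resp = requests.get('https://phabricator.wikimedia.org/')
resp.raise_for_status()
print(resp.status_code)  # 200 once SNI works
```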
I guess there is an open question waiting for your answer ;)
[18:54:36] I just wanted to say which idiot created this task and put me in CC when I figured out
[18:54:37] it was me
[18:55:43] 6Labs, 10whatcanidoforwikimedia.org: Project wcidfwm (What can I do for wikimedia) - https://phabricator.wikimedia.org/T115092#2023435 (10Petrb) I think it could be, @yuvipanda yes it does and it's open source so it probably could be reused.
[18:56:16] 6Labs, 10whatcanidoforwikimedia.org: Project wcidfwm (What can I do for wikimedia) - https://phabricator.wikimedia.org/T115092#2023442 (10Luke081515) This project was requested for whatcanidoforwikimedia.org, see T124814. I think we should create a new project for the other site, because the authors are differ...
[19:21:06] PROBLEM - Puppet failure on tools-k8s-bastion-01 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [0.0]
[19:21:36] PROBLEM - Puppet failure on tools-grid-master is CRITICAL: CRITICAL: 42.86% of data above the critical threshold [0.0]
[19:30:14] ^puppet is lagging but only 42m and manual runs fine
[19:30:50] chasemp: thanks, I was just firing up ssh :-)
[19:31:56] 6Labs, 10Tool-Labs, 6Project-Creators: Migrate Tools access request process to Phabricator - https://phabricator.wikimedia.org/T72625#2023708 (10Luke081515) p:5Triage>3Normal
[19:35:32] 6Labs, 10Tool-Labs, 6Project-Creators: Create a project to bring Jyothis' old toolserver tools back to life - https://phabricator.wikimedia.org/T125934#2023735 (10Luke081515) p:5Triage>3Low
[19:41:40] RECOVERY - Puppet failure on tools-grid-master is OK: OK: Less than 1.00% above the threshold [0.0]
[19:46:01] RECOVERY - Puppet failure on tools-k8s-bastion-01 is OK: OK: Less than 1.00% above the threshold [0.0]
[19:56:50] !log tools nfs traffic shaping pilot round 2
[19:56:54] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL, Master
[20:16:15] hi everyone -- Brian from OSGeo.org .. I have started a firefight about "to Github or not to Github" today .. wise words from WMF welcome
[20:17:09] .. actually I exaggerate, people are discussing vigorously but no fighting
[20:17:26] thx
[20:17:33] "provider independence" and "caring about our user's privacy"
[20:21:13] (03PS1) 10Legoktm: Ignore "Stashbot" [labs/tools/wikibugs2] - 10https://gerrit.wikimedia.org/r/270349
[20:24:30] dbb_: you might also be interested in cpython, which is just moving towards github
[20:24:59] basically, the big plus of github is that 'everyone' is on there, and it's easy to submit a pull request
[20:35:01] Hey, who is user marcmiq... I am asking because he is running an intensive script that has slowed down the tools server
[20:35:38] marmick: ^
[20:44:02] huji: tell me
[20:49:37] marmick: python cira_pageviews_dataset_generator.py enwiki is using large amounts of CPU and memory
[20:49:45] ok, one sec
[20:50:37] done
[21:12:54] 6Labs, 6operations, 5Patch-For-Review: audit labs versus production ssh keys - https://phabricator.wikimedia.org/T108078#2024073 (10Dzahn) maybe we can get this merged and ask CI team to add it as a non-voting check on operations/puppet and see how it works. then if we like it, just change non-voting to voting
[21:16:33] kaldari, how goes it?
[21:17:13] CP678: good
[21:17:33] kaldari, have you looked at the new code yet?
[21:17:49] just briefly
[21:18:08] it looks like a big improvement, at least organizationally
[21:18:20] I'm thinking I have to move the entire analyzePage function into a modular class too.
[21:18:36] analyzePage is wiki specific.
[21:18:47] what does analyzePage do?
[21:18:58] It does the entire analysis of the page
[21:19:23] And some stuff such as data formatting and parameter assignments are enwiki specific.
[21:19:42] So, um, labs all messed up right now?
[21:20:00] Or is it just me
[21:20:00] I tried moving that off on its own, but there's just so many variables and problems structuring that off.
[21:20:34] kaldari, so I think it's easier to simply perfect the function, duplicate it, and modify it accordingly for a different wiki.
[21:21:12] analyzePage is the parent function that handles page analysis.
[21:21:35] It calls the subsequent functions that parse the page for the needed information.
[21:21:47] And API and DB connections are separate too.
[21:23:27] CP678: Yay!
[21:23:56] kaldari, so if anything is broken, it's likely there, and not the analyzePage function
[21:24:17] So duplicating that function and modifying it shouldn't make code maintenance too tedious.
[21:25:12] that's good to hear
[21:26:07] CP678: I'm impressed you've been able to accomplish so many improvements on the bot, especially since I imagine you're back in school now.
[21:26:21] kaldari, I was considering other alternatives to function duplication, because I hate that idea, but placing on-wiki configuration needlessly adds a level of complexity, and there are still formatting considerations to deal with.
[21:26:28] Fixed it by destroying the instance, sorry guys
[21:26:51] kaldari, I am. I took an exam 2 days ago, as well as had an interview.
[21:28:20] kaldari, for example, Wikipedia uses the Wayback template in certain cases, but dewiki uses Vorlage:Webarchiv
[21:28:43] Completely different parameters the bot needs to be familiar with, and date formats too.
[21:29:07] CP678: Yeah, I think having a modular class that changes per wiki is a decent solution, especially since there is so much variation in how different wikis flag dead links.
[21:30:00] kaldari, the primary function of analyzePage is to get an array of data about each link on a page, inject new data into the array, and have the parser class generate new strings to replace the old ones.
[21:30:15] Based off of the array
[21:30:47] kaldari, some variables I do want to add are edit summary variables.
[21:31:57] So the edit summaries can be changed on wiki. I also want to enhance the edit text generator. The bot allows for magic words in the on-wiki config variables, and I have a pretty ugly setup going.
[21:33:00] kaldari, and then we need an enhanced 404 detection class.
[21:33:29] That's more reliable than simple header-code detection
[21:34:07] Then Cyberbot will go through another BRFA on enwiki to test the new 404 detection class.
[21:36:03] kaldari, gotta go
[21:36:15] Cya!
[22:18:40] tgr: Maybe you as the task author can take a look at https://phabricator.wikimedia.org/T109643#1961558 ? Thnaks
[22:18:43] *Thanks
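CP678 doesn't show the planned 404-detection code above, so here is a purely illustrative Python sketch of what "more reliable than simple header-code detection" can mean in practice: treat hard HTTP errors as dead, and additionally catch common soft-404 patterns (a 200 response that silently redirected to the site's front page, or that serves "not found" boilerplate). The phrase list and heuristics are guesses for demonstration, not Cyberbot's implementation.

```python
# Illustrative soft-404 checker; the phrases and heuristics here are
# assumptions for demonstration, not Cyberbot's actual logic.
import requests
from urllib.parse import urlparse

SOFT_404_PHRASES = ('page not found', 'no longer available',
                    'does not exist')

def looks_dead(url, timeout=15):
    try:
        resp = requests.get(url, timeout=timeout, allow_redirects=True)
    except requests.RequestException:
        return True                          # connection-level failure
    if resp.status_code >= 400:
        return True                          # hard 404/410/5xx
    # Soft-404 heuristic 1: silently redirected to the site's front page.
    if resp.history and urlparse(resp.url).path in ('', '/'):
        return True
    # Soft-404 heuristic 2: "not found" boilerplate served with a 200.
    body = resp.text.lower()
    return any(p in body for p in SOFT_404_PHRASES)

print(looks_dead('https://example.org/some-missing-page'))
```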