[02:25:26] is the mediawiki-commits mailing list active and just doesn't retain archives? http://lists.wikimedia.org/pipermail/mediawiki-commits/
[02:27:11] yes
[02:27:17] there is no point archiving that list
[02:27:28] (and it broke MM in the past once)
[02:28:17] ok, do you know what bot/script runs it? we'd like to have pywikibot commits go to our own mailing list
[02:30:07] you need a change-merged hook in gerrit
[02:30:10] see https://gerrit.googlecode.com/svn/documentation/2.1.6/config-hooks.html#change-merged
[02:30:30] i'm not sure where the mediawiki-commits one is defined; probably in operations/puppet
[02:30:34] grep for it and see if it turns up
[02:30:50] ok, will look in there, thanks :D
[02:31:24] yeah, it's there
[02:33:49] found https://github.com/wikimedia/operations-puppet/blob/2d26b5ace6bbab966e79dff0b0f5ee288bc430ba/files/gerrit/hooks/change-merged
[02:34:32] now I need to find where it's called from...
[02:35:47] first add it to templates/gerrit/hookconfig.py.erb
[02:36:10] feel free to add anyone you don't like to this list:
[02:36:12] # These users are annoying since they're bots and such. Don't report their comments...
[02:36:12] spammyusers = ["jenkins-bot"]
[02:36:12] # ...or any of their actions
[02:36:14] reallyspammyusers = ["L10n-bot"]
[02:36:33] heh, i'm constantly a step behind you. i just found that file
[02:37:30] ori-l: oh wait, are you talking about https://gerrit.wikimedia.org/r/#/c/70780/ ?
[02:37:56] then you should add a test to files/gerrit/hooks/hookhelper_test.py
[02:38:05] irc will be taken care of, i want to set up email
[02:39:07] oh, i'm an idiot
[02:39:11] i sent you on a wild goose chase
[02:39:21] it's actually a completely different config: http://review.coreboot.org/Documentation/user-notify.html
[02:40:02] i'd probably have to run that for you
[02:40:47] Servers seem to be a bit on the slow side.
[02:42:25] ori-l: hm.
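The change-merged hook and the "spammy users" list discussed above can be sketched roughly as follows. This is a hypothetical sketch only, following the option names in the Gerrit hook documentation linked above; the real Wikimedia hook in operations/puppet (files/gerrit/hooks/change-merged) does considerably more, and the example invocation uses made-up values.

```python
#!/usr/bin/env python
import argparse

# These users are annoying since they're bots and such.
# Don't report any of their actions.
REALLY_SPAMMY_USERS = ["L10n-bot"]

def should_report(submitter):
    """Skip notifications for bot accounts."""
    return submitter not in REALLY_SPAMMY_USERS

def main(argv=None):
    # Gerrit invokes hooks/change-merged with these options.
    parser = argparse.ArgumentParser()
    for opt in ("--change", "--change-url", "--project",
                "--branch", "--submitter", "--commit"):
        parser.add_argument(opt)
    args, _ = parser.parse_known_args(argv)
    if should_report(args.submitter):
        # A real hook would send mail or post to IRC here.
        print("[%s] %s merged on %s"
              % (args.project, args.change_url, args.branch))

# Example invocation with made-up values:
main(["--change", "70780",
      "--change-url", "https://gerrit.wikimedia.org/r/70780",
      "--project", "pywikibot/core", "--branch", "master",
      "--submitter", "legoktm", "--commit", "2d26b5a"])
# prints: [pywikibot/core] https://gerrit.wikimedia.org/r/70780 merged on master
```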
well i think what we want is all merged commits going to our pywikipedia-svn mailing list. i'm reading that it would need to be done via the "project level settings" option?
[02:44:13] do you have access to the pywikibot conversion account?
[02:44:22] you need direct push rights to modify the email notification settings
[02:46:40] ori-l: no, valhallasw is running it. i'll talk to him when he gets online then. thanks for the pointers :D
[02:50:22] legoktm: to which e-mail address should it send notifications?
[03:08:20] !log on fenari: moved away the outdated source copy that was at /home/wikipedia/common . Will update noc.
[03:08:30] Logged the message, Master
[03:26:14] ori-l: sorry, was having dinner. it would need to go to pywikipedia-svn@lists.wikimedia.org.
[12:56:48] What is the tool that lets me see how many people have visited a page?
[13:15:23] Technical_13: stats.grok.se?
[13:16:06] Yep.. that was what I was looking for.
[13:16:15] in #wikimedia...
[13:16:25] Thanks.
[13:16:27] :)
[13:17:00] I noticed it looked buggy a week or two ago..
[13:17:08] It was adding like 300 to the current day..
[13:17:15] was very odd.
[13:17:23] Is there a place to report bugs for that?
[13:17:58] no idea
[13:20:34] so it's a third-party thing and not related to wikimedia or tools or labs in any way?
[13:21:25] i think so
[13:21:46] I think there are plans to have similar data in Labs at some point, but we're not there yet.
[13:21:59] Technical_13: http://stats.grok.se/about
[13:23:26] Thanks, you two..
[13:59:16] Have I understood correctly that the pagelinks table contains the outgoing links for each page?
[13:59:29] sorry, wrong chan
[13:59:59] or is it?
[14:00:12] https://www.mediawiki.org/wiki/Manual:Pagelinks_table
[14:00:44] in that table there are multiple pl_from entries with the same id, one for each outgoing link that page contains?
[14:03:21] i.e.
is this a valid SQL query for that table: INSERT INTO pagelinks (pl_from, pl_namespace, pl_title) VALUES (42, 0, "NASA"), (42, 0, "ESA") ?
[14:04:44] i.e. the page with id 42 has two internal links, to pages named "NASA" and "ESA"?
[14:09:19] yes
[14:10:32] the linking page is identified by page.page_id, the target page by name - to allow "red links" to pages that do not exist yet
[14:12:19] saper: good, thanks
[14:14:57] does having a [[Category:]] inside a page count as an internal link too?
[14:16:14] ShiningThrough: [[Category:Foo]] doesn't, but [[:Category:Foo]] does...
[14:16:41] ShiningThrough: [[Category:Foo]] puts the page in the category, and [[:Category:Foo]] creates a link to the category.
[14:18:33] hmm, something is wrong with my counting algorithm
[14:21:12] for example, this page https://ro.wikipedia.org/w/index.php?title=Cordial&action=edit has 16 outgoing internal links but my counting code only reports 2 :(
[14:23:22] I only count 14...
[14:23:46] [[Tiraspol]] is used 3 times, but only counts as one...
[14:23:50] ShiningThrough: pagelinks shows links that point TO the page
[14:24:49] ok, forget it, it can be used in both ways
[14:25:19] https://ro.wikipedia.org/wiki/Special:Ce_se_leag%C4%83_aici/Cordial does only have two pages pointing to it...
[14:25:41] AH
[14:26:51] so what I count are incoming links, not outgoing
[14:36:45] is there a page where I can get *a count* of how many pages link to a certain page? like the one Technical_13 pinpointed, but one that also shows the count
[14:37:16] a page?
[14:37:55] let me check... not sure if wp has that available (my home wiki uses an extension that adds that count to the page I already linked...)
[14:38:47] i.e. I mean, how do I determine how many pages link to the page specified in this query? https://ro.wikipedia.org/w/index.php?title=Special%3ACe+se+leag%C4%83+aici&target=Academia_Rom%C3%A2n%C4%83&namespace= There are too many to count manually...
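The pagelinks discussion above can be tried out in miniature. Below is a toy sketch with SQLite: the real MediaWiki table has more columns and indexes (and later MediaWiki versions reworked this schema), so treat it as an illustration of the outgoing-vs-incoming distinction only, not the production layout.

```python
import sqlite3

# Toy, simplified reproduction of the pagelinks table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pagelinks (
        pl_from      INTEGER NOT NULL,  -- page_id of the linking page
        pl_namespace INTEGER NOT NULL,  -- namespace of the link target
        pl_title     TEXT    NOT NULL   -- target title (may not exist: red link)
    )
""")
# Page 42 links to NASA and ESA; page 7 also links to NASA.
conn.executemany(
    "INSERT INTO pagelinks (pl_from, pl_namespace, pl_title) VALUES (?, ?, ?)",
    [(42, 0, "NASA"), (42, 0, "ESA"), (7, 0, "NASA")],
)

# Outgoing links of page 42: rows with pl_from = 42.
outgoing = conn.execute(
    "SELECT COUNT(*) FROM pagelinks WHERE pl_from = 42"
).fetchone()[0]

# Incoming links to [[NASA]]: rows whose target is (0, 'NASA').
incoming = conn.execute(
    "SELECT COUNT(*) FROM pagelinks WHERE pl_namespace = 0 AND pl_title = 'NASA'"
).fetchone()[0]

print(outgoing, incoming)  # prints: 2 2
```

The same confusion from the chat shows up here: filtering on pl_from gives what a page links to, while filtering on (pl_namespace, pl_title) gives what links to it.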
I just want to check that my code reports the number correctly...
[14:39:23] https://ro.wikipedia.org/w/index.php?title=Cordial&action=info maybe?
[14:40:33] using index.php, umm... I bet there is an answer on http://www.mediawiki.org/wiki/Manual:Parameters_to_index.php somewhere...
[14:43:24] however, ShiningThrough, I would think that using the https://www.mediawiki.org/wiki/API would be more productive for you.
[14:43:45] yeah, the answer is to use a bigger &limit parameter and copy the entire string into a text editor :P
[14:48:29] guys, where did the db.php file move to? (or its datacenter sub-files?)
[15:26:58] DaBPunkt: not sure i understand your question
[15:27:07] DaBPunkt: it is in git, operations/mediawiki-config.git
[15:27:30] DaBPunkt: and published on http://noc.wikimedia.org/conf/
[15:29:46] hashar: is there a way to find these files in the shell on fenari?
[15:30:01] we don't use fenari anymore
[15:30:03] but tin.eqiad.wmnet
[15:30:12] files are under /a/common/wmf-config iirc
[15:30:21] or on fenari that would be /home/wikipedia/common/wmf-config
[15:33:00] hashar: yes, that's where it was in the past, but there is no /home/wikipedia/common/ on fenari anymore.
[15:34:20] ah
[15:34:24] so it got phased out
[15:34:31] definitely use tin.eqiad.wmnet then
[15:35:37] ah ok. thanks
[15:36:21] mm, I have no access there afaics
[16:25:08] DaBPunkt: they are all in operations/mediawiki-config.git
[16:25:18] DaBPunkt: the only ones missing are passwords :]
[18:09:14] bits.wikimedia.org (bits-lb.esams) gives me lemons^W 503s.
[18:10:07] both in minified and debug mode
[19:42:30] I got a French-localised ref error on en.wiki, reproducible for me at the bottom of http://en.wikipedia.org/wiki/Wikipedia_talk:Articles_for_creation/%22The_Lamentation_of_Christ%22_by_Peter-Paul_Rubens
[19:44:12] MartijnH: that happens when someone with the French interface language set in their prefs saves such a change
[19:44:20] MartijnH: i think there's a bug for that somewhere
[19:44:37] ah, ok
[19:45:17] MartijnH: i just made it display in Hebrew by visiting https://en.wikipedia.org/wiki/Wikipedia_talk:Articles_for_creation/%22The_Lamentation_of_Christ%22_by_Peter-Paul_Rubens?action=purge&uselang=he
[19:45:25] MartijnH: you can just purge to make it English again
[19:46:20] strangely, the error is gone now :/
[19:46:43] with no new edits to the page
[19:47:56] oh. heh.
[19:50:05] you fixed it \o/
[22:32:19] http://uk.wikipedia.org/w/api.php?maxlag=5600&format=xml&prop=info&redirects=&titles=%C2%E8%B4%F3%F0%B3%E2%F9%E8%ED%E0 why am i not getting a normal answer?
[22:35:05] Base-w: you forgot action=query
[22:35:07] https://uk.wikipedia.org/w/api.php?maxlag=5600&format=xml&action=query&prop=info&redirects=&titles=%C2%E8%B4%F3%F0%B3%E2%F9%E8%ED%E0
[22:38:56] valhallasw: thanks, something in the framework then
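Both API threads in this log (counting the pages that link to a title, and the broken uk.wikipedia URL that was missing action=query) come down to building the right api.php query string. A hypothetical sketch, assuming names like backlinks_url of my own invention and not pywikibot's actual code:

```python
from urllib.parse import urlencode

def backlinks_url(site, title):
    """Build an api.php request for the pages linking to a title."""
    params = {
        "action": "query",    # the part Base-w's URL above was missing
        "list": "backlinks",  # API counterpart of Special:WhatLinksHere
        "bltitle": title,
        "bllimit": "max",     # as many results per request as allowed
        "format": "json",
    }
    return "https://%s/w/api.php?%s" % (site, urlencode(params))

def count_backlinks(batches):
    """Total the backlinks across already-fetched response batches."""
    return sum(len(b["query"]["backlinks"]) for b in batches)

print(backlinks_url("ro.wikipedia.org", "Academia Română"))
```

For targets with many incoming links (like the Academia Română example), one request is not enough; the caller has to follow the API's continuation values across several requests and feed all the batches to the counter.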