[01:23:01] So what we need is the opposite of a "thank" link.
[01:23:04] For edits that fuck up my talk page.
[01:24:23] 'hate'
[01:24:44] Elsie: but that's an experimental feature
[01:25:13] so what's up with m.wmf.org?
[01:25:18] doesn't really seem to work for me
[01:25:58] Example URL?
[01:27:20] http://m.wikimediafoundation.org/wiki/Home ?
[01:27:42] that's intentional, odder
[01:27:50] [[Category:Desktop-only pages]]
[01:27:59] for pages that don't look good in mobile view
[01:28:25] what's the /point/ of this category anyway
[01:28:45] it's the freaking main page, for Christ's sake
[01:29:20] the first thing I do when I see a broken main page on my mobile device is to go somewhere else
[01:49:08] Yes, what is the point of such a category? All the pages seem broken from mobile one way or another.
[02:33:10] odder: Is it broken on mobile devices?
[03:32:45] Elsie: 'Sass'?
[04:00:35] I'm the sassmaster.
[04:00:50] I had to part -e3 over that word. ;-)
[04:00:59] Now if only I could part the others.
[04:02:46] So many channels.
[04:33:55] So many.
[04:34:28] too many
[07:25:37] Why on Earth do templates inside Wikipedia for other languages have language-specific attributes!!!???
[07:25:46] https://de.wikipedia.org/w/index.php?title=Ibi_%28Spanien%29&action=edit
[07:25:59] gentilicio provincia nombre ?
[07:26:21] and the coord template has North South and West East in that language as well !! {{coord|38|37|38|N|00|34|31|O|type:city|display=inline,title}}
[07:26:44] Why O? Because Osten (Ost) = East in German?
[07:27:05] Y U NO USE ENGLISH!!!!
[07:28:55] ...
[07:29:54] damn now I have to figure out how north/south east/west is spelled out in each language
[07:30:00] ffs
[07:30:30] and that article is in German but has Spanish-named attributes inside the infobox
[07:30:40] SAY WHAT!
[07:41:48] and the https://en.wikipedia.org/wiki/Template:Coord page doesn't mention at all that you can have N/S W/E internationalized!!
[07:42:34] That's crazy!!!!!!
[07:43:08] ori-l: TOTALLY!
[07:43:39] hashar: morning :)
[07:43:46] hi
[07:44:17] and I thought Coord was meant to be phased out in favor of Wikidata
[07:44:51] there are zillions of other ways that articles use to specify coordinates
[07:45:53] some of them are language-specific!!!
[07:46:13] WHY ON EARTH!!
[07:46:29] * BadDesign sighs
[07:46:32] ori-l: I wrote yet another python tool :D
[07:46:48] hashar: 0-day warez! where?
[07:46:52] 'Multiple exclamation marks,' he went on, shaking his head, 'are a sure sign of a diseased mind.' -- in Eric
[07:46:57] ori-l: to check Zuul/Gerrit/Jenkins consistency (aka an extension missing triggers in Zuul or jobs in Jenkins)
[07:47:34] ori-l: it is mostly a hack, I need to find out with analytics what they need as input (I guess json/csv) and what they can do with the raw data.
[07:47:56] show! :P
[07:48:00] i've been having fun with ruby for vagrant
[07:48:20] have you checked it out recently? it does some nifty things
[07:50:17] hashar:
[07:50:20] 00:47 show! :P
[07:50:20] 00:48 i've been having fun with ruby for vagrant
[07:50:20] 00:48 have you checked it out recently?
it does some nifty things
[08:10:36] ori-l: haven't looked at vagrant
[08:10:51] I should but I am too lazy to learn yet another tech
[08:11:22] it isn't really another tech; it's just a piece of glue that pieces together virtualbox with puppet
[08:11:40] the 'nice stuff' is mostly just how i laid out mediawiki config files and puppet helpers
[08:11:56] but i know what you mean :)
[08:12:08] i'll show it to you sometime
[08:23:11] ori-l: I almost installed it yesterday but fedora hates VirtualBox (or vice versa)
[08:23:29] I installed it on f18 and it was ok
[08:23:35] looks easy enough
[08:23:39] yes the first time is ok
[08:23:46] then every kernel upgrade destroys it
[08:23:55] hmm haven't checked that
[08:25:29] oh, cool, thanks for trying it!
[08:26:04] i know guest additions require building kernel modules, but that's on the guest vm.. does it need to do something like that for the host as well?
[08:26:08] wouldn't surprise me
[08:26:30] ori-l: vagrant/puppet was a very nice idea. I'm now sort of 'addicted' to puppet, and rebuilding my personal VPS from the ground up with just puppet (no command line modifications)
[08:26:32] pretty sweet
[08:26:32] what the hell, in German they use {{Coordinate ? https://de.wikipedia.org/wiki/Vorlage:Coordinate ?
[08:26:36] ori-l: also shouldn't you be sleeping?
[08:27:06] YuviPanda: heh, I did the same recently
[08:27:18] and yes, probably
[08:27:22] ori-l: how did you manage passwords?
[08:28:47] ah Nemo_bis, regarding the list of translators in CREDITS https://gerrit.wikimedia.org/r/#/c/68381/
[08:29:15] Nemo_bis: isn't translatewiki.net offering a dynamic list of users that participated in a project translation?
[08:30:09] ori-l: it's not vagrant itself, it's VirtualBox trying to use the kernel which got removed
[08:30:14] hashar: no
[08:30:43] hashar: what we have is https://translatewiki.net/w/i.php?title=Special:SupportedLanguages which is dynamically generated for all projects based on a manual (partial) list of names
[08:31:25] OMFG
[08:31:35] 1) it's not complete [this is just a config though], 2) it's extremely slow, 3) it's for all projects, 4) it doesn't consider whether your translation is still alive
[08:32:15] Nemo_bis: so maybe twn could use a special page to list translators for a given project :-]
[08:32:26] Special:TranslationCredits/mediawiki or something similar
[08:32:46] that is nicer than having a bot commit an update to CREDITS every single day :-)
[08:38:15] hashar: I doubt such a bot will ever appear :)
[08:39:26] creating a special page would equally be very low priority, I don't dare to ask
[08:46:30] hashar: I moved the list to https://translatewiki.net/wiki/Translating:MediaWiki/Credits , feel free to reply on https://gerrit.wikimedia.org/r/#/c/68381/ when you have more time
[08:49:31] Why on earth does every language use a different format for specifying coordinates https://nl.wikipedia.org/wiki/Sjabloon:Co%C3%B6rdinaten ?
[08:51:57] madness!!!
[08:53:33] :((
[08:53:39] because all projects are independent
[08:53:51] this is extremely stupid
[08:54:14] it's extremely inconvenient
[08:54:25] for someone working across projects
[08:54:33] you could be working on all the wiktionaries
[08:54:38] that too
[08:55:03] BadDesign: the power of collaboration!
[08:55:04] different templates and formats for each one, if you wanted to get the etymology/definition/synonyms/related words out of each one...
[08:55:18] and yet I believe it's best that the projects are independent that way
[08:55:21] How can I help unify all the ways in which the main coordinate of the article can be specified across all the languages?
[08:55:37] BadDesign: scrap pl.wp if that's what you want, we have only one template there and consistent parameters in infoboxes <3
[08:55:44] BadDesign: by using the API
[08:55:48] scrape, rather.
[08:57:26] MatmaRex: What is pl.wp ?
[08:57:41] polish wikipedia?
[08:58:09] ok, let me try to parse that dump
[08:59:40] yep
[08:59:44] MatmaRex: you still have {{koordynaty instead of {{Coord https://pl.wikipedia.org/wiki/Szablon:Koordynaty
[08:59:49] but there's a separate api for coordinates anyway
[08:59:59] BadDesign: yeah, so? the name is translated
[09:00:13] in the dumps it is koordynaty ?
[09:00:27] the template is also slightly more flexible than the en.wp version, but most inputs should be the same
[09:00:35] yes, but i think {{coord}} redirects
[09:00:56] and most coordinates are stated using infobox parameters
[09:01:24] If I use the API how do I determine for which pages I should make HTTP requests? If there is no consistent way to determine that an article has a main coordinate in each language
[09:02:04] how long have we been having this discussion over and over? months?
[09:02:21] probably
[09:02:48] BadDesign: prop=coordinates api?
[09:03:02] MatmaRex: for English Wikipedia I need to make 10 million HTTP requests
[09:03:38] if I can't determine beforehand which article has a main coordinate
[09:03:40] no, only 20000 if you do it right
[09:03:42] or maybe 2000
[09:03:55] you can use the allpages generator with the coordinates api
[09:03:59] yes I could check if the article text has {{Coord with display=*title*
[09:04:03] to do it in chunks of 5000
[09:04:05] but there are other ways
[09:04:39] ah you mean sending more values in the &articles= parameter
[09:04:53] BadDesign: https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&prop=coordinates&format=json&colimit=1&coprimary=primary&generator=allpages&gaplimit=max
[09:05:37] there happens to be just one article with coordinates in the first 5000 there
[09:05:39] or is it 500
[09:05:46] use the continue parameter to get them all
[09:05:59] a couple thousand requests, apparently around a second to generate each.
[09:06:07] a few hours of scraping, at worst.
[09:06:14] and all of it with a nice api
[09:06:42] the api really is more powerful than people give it credit for, eh.
[09:07:00] I'll give this a second try, as my first attempt to use the API was a failure
[09:08:46] If I send articles with titles in German it will return coordinates for the German version of the article, not the English one, right?
[09:09:45] it will return results for whatever wiki you run it on
[09:10:15] assuming that users of this wiki took the steps necessary to make it work (adding certain tags in their {{coord}} templates)
[09:10:20] i think all large wikis work
[09:10:24] ah, so I just need to change the https://lang.
part
[09:10:39] yeah
[09:11:16] hm, i'm not sure if the link i gave you loads correctly
[09:11:21] i meant /w/api.php?action=query&prop=coordinates&format=json&colimit=10&generator=allpages&gaplimit=max
[09:11:51] yeah, I know; the link you gave is just the sandbox
[09:12:05] yeah, but when i clicked on it myself it didn't load the generator part
[09:12:12] and that is rather important here :)
[09:12:22] oh, I have the same problem
[09:13:40] MatmaRex: that query returns all pages in the wiki along with coordinates if there are any?
[09:13:56] yes
[09:14:03] the first 500 of all pages, actually
[09:14:15] you can use the gapcontinue parameter to get the next 500, etc
[09:15:01] (if it returns only 50, then that means your account doesn't have bot rights, you could request them at the wiki you're using)
[09:15:10] uh, no, wait.
[09:15:28] 500 is normal (not a bot). bots can go up to 5000.
[09:16:00] there doesn't seem to be a way to only returns articles with coordinates
[09:16:05] *return
[09:16:32] only the other way around
[09:16:36] yeah, i don't think there's a way
[09:16:47] hm, or maybe
[09:18:07] (i'm digging, one sec)
[09:21:19] BadDesign: no, apparently not. https://www.mediawiki.org/wiki/Extension:GeoData#Enumerating_pages_with_or_without_coordinates says this is disabled now
[09:22:46] speaking of which, MaxSem, I know fi.wiki had added #coordinates on the 2nd (no impact on servers that I could see), dunno about other wikis
[09:26:17] my previous attempt used http://en.wikipedia.org/w/api.php?action=query&prop=coordinates&colimit=500&format=json&titles= with 50 titles per request without using the continue parameter, maybe this was the problem
[09:26:44] cause it gave connection reset by peer after a few minutes
[09:28:25] I don't understand the colimit parameter
[09:28:43] Does it mean I can send 500 articles per request?
[09:28:45] max
[09:28:55] no, it's the number of coordinates that might be returned
[09:29:01] some articles can have more than one
[09:29:14] but which article has 500 coordinates?
[09:29:19] 5000 for bots?
[09:29:21] huh?
[09:29:36] and the "titles=" parameter limits you to 50 comma-separated titles to check, 500 for bots.
[09:29:42] well, there are some, let me look
[09:30:08] https://en.wikipedia.org/wiki/Goiânia_accident has a bunch
[09:30:26] BadDesign, use allpages as a generator so that you don't need to provide the titles manually
[09:30:47] (not 500, though :P)
[09:30:55] speaking of coordinates, can we add them to dumps apergos?
[09:31:48] huh?
[09:32:00] they are embedded in some template or other right?
[09:32:09] they're in the DB
[09:32:13] MatmaRex: I have a file with all the article titles from the dumps
[09:32:14] I see
[09:32:33] can you bz that please?
[09:32:48] if you have a pointer to which db table, even better
[09:34:05] apergos: that'd be awesome if you could add a new XML tag () to the pages in the dumps
[09:34:14] it won't be soon
[09:34:29] but if it's in the queue it will eventually get looked at
[09:35:21] later is better than never
[09:35:41] actually, MaxSem
[09:35:48] i just looked at geodata - why doesn't it use page_props?
[09:36:30] can you imagine page_props for a page with 2000 coordinates?
[09:36:52] hm
[09:36:58] btw, that {{#coordinates is now in the template I believe ? Should probably be moved into the Module.
[09:36:58] yeah, scratch that. :P
[09:37:54] the primary vs all problem :D
[09:39:44] apergos, https://bugzilla.wikimedia.org/show_bug.cgi?id=51225
[09:41:01] thanks
[09:41:14] what else is that table used for?
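
A minimal sketch (not part of the log) of the harvesting loop MatmaRex describes above, assuming Python with the requests library. The user agent, request pacing, and maxlag value are illustrative assumptions, and prop=coordinates only exists on wikis running the GeoData extension; the idea is to walk every page with generator=allpages and follow the continuation value until the wiki is exhausted.

    import time
    import requests

    API = "https://en.wikipedia.org/w/api.php"  # swap the "lang." part per wiki
    HEADERS = {"User-Agent": "coord-harvester-sketch/0.1 (example contact)"}

    def harvest_coordinates():
        params = {
            "action": "query",
            "format": "json",
            "generator": "allpages",
            "gaplimit": "max",       # 500 pages per request, 5000 with bot rights
            "prop": "coordinates",
            "colimit": "max",        # total coordinates per response, not per page
            "coprimary": "primary",  # only each page's main ("title") coordinate
            "maxlag": 5,             # back off when the database servers lag
        }
        while True:
            data = requests.get(API, params=params, headers=HEADERS, timeout=60).json()
            if "error" in data:      # e.g. a maxlag retry request; wait and try again
                time.sleep(5)
                continue
            for page in data.get("query", {}).get("pages", {}).values():
                for coord in page.get("coordinates", []):
                    yield page["title"], coord["lat"], coord["lon"]
            # 2013-era continuation: the next gapcontinue value arrives in a
            # "query-continue" block (newer MediaWiki returns a "continue" block).
            cont = data.get("query-continue", {}).get("allpages")
            if not cont:
                break
            params.update(cont)
            time.sleep(1)            # be gentle: roughly one request per second

    if __name__ == "__main__":
        for title, lat, lon in harvest_coordinates():
            print("%s\t%s\t%s" % (title, lat, lon))

With gaplimit=max this works out to a few thousand requests for a wiki the size of en.wp, which matches the "a few hours of scraping, at worst" estimate above.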
[09:41:28] nothing, just the coordinates
[09:42:26] http://www.mediawiki.org/wiki/Category:MediaWiki_database_tables
[09:42:28] not there
[09:42:46] would be nice to have an entry :-)
[09:43:23] https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FGeoData.git/6d12d8cd1b5f9da1d1428a37aedb0d5033c507af/sql%2Fexternally-backed.sql
[09:43:51] isn't it for core tables only?
[09:43:59] since these are against the pageid, it's probably best to dump the whole table
[09:44:08] instead of folding it into the dumps
[09:44:53] only geo_tags is needed, the remaining tables are for loading into solr
[09:45:20] yes, I mean dumping that table, instead of writing values into tags in the xml
[09:46:27] I dunno about core and non-core tables
[09:46:50] if we really don't put non-core tables in the manual, they should at least have a category and entries of their own
[09:47:37] I don't find the table anywhere on mediawiki.org though
[09:52:34] Using https://nl.wikipedia.org/w/api.php?action=query&prop=coordinates&format=json&colimit=500&generator=allpages&gaplimit=max&gapcontinue=-heem_%28toponiem%29 I can only get 500 pages max per request, right?
[09:53:35] then I need to check if any of those pages have coordinates and after I did that I need to fire a new HTTP request with the gapcontinue parameter
[09:53:57] I presume the API is keep-alive aware, right?
[09:54:19] 5000 if you are a bot and authenticate
[09:54:50] don't forget to honor maxlag
[09:55:03] what are the requirements for becoming a bot?
[09:55:21] global bot req can be done on... meta?
[09:55:26] you'd have to ask here
[09:55:27] *there
[10:58:39] apergos: Could I use https://www.mediawiki.org/wiki/Manual:Pywikipediabot to gain bot privileges?
[10:59:33] you still need to apply and get community approval or whatever
[10:59:58] ah, got it
[11:00:24] https://meta.wikimedia.org/wiki/Bot_policy#Global_bots
[11:00:43] hmm this says only interwiki etc. but maybe if you are not making edits they will be willing to make an exception
[11:01:56] dunno, I would chat about the idea with folks there
[11:02:48] I want only read-only access
[11:03:00] I know
[11:03:33] another thing you could do is see if you could get tool-labs or toolserver access
[11:03:47] and have the ability to query the database directly for at least that table
[11:04:00]
[11:04:24] I would try for labs but not toolserver..
[11:04:29] http://www.mediawiki.org/wiki/Wikimedia_Labs/Tool_Labs
[11:12:05] laptop overheated
[11:15:45] Fun.
[11:20:31] guess it's time to go get some compressed air
[11:20:37] and maybe a laptop cooler pad
[11:20:38] tah
[11:25:32] I crack mine open, clean it out, and put fresh thermal paste in when mine starts acting wonky.
[11:50:30] hm, one can give feedback and ask to enable AFT on other wikis... by using an en.wiki talk page?
[11:57:07] no, it's no different than any other shell request
[11:57:16] you use bugzilla with a link to consensus
[11:59:23] p858snake|l: see the monthly report
[12:00:04] seems to imply so: "we plan to make AFT5 available to other wiki projects in coming weeks, as outlined in the release plan. For tips on how to use Article feedback, visit the testing page, and let us know what you think on this talk page."
https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2013/June#Editor_engagement_features
[12:00:27] busy flirting with someone via twitter dms :p
[12:01:21] tbh that just reads that they are going to turn it on wherever they want like normal, and if people want to know how to use it, to view the page
[12:01:33] p858snake|l: stop answering me
[12:01:55] https://www.mediawiki.org/wiki/Article_feedback/Version_5/Release_Plan_2013 lists some projects
[12:41:44] gave it a good cleaning but didn't have thermal paste, we'll see how it goes
[12:42:03] not too much dust in there tbh
[12:54:43] you put on a heatsink without thermal paste?
[12:56:13] or rather didn't remove it because of the lack I think
[12:56:25] mark: of course you "can".......
[13:04:35] no, I didn't try to remove and put new
[13:04:46] just a regular cleaning
[13:06:03] the sony zs have these tiny little vents on the left side only, and the one fan, so it's no wonder
[14:23:01] I so enjoy Gerrit patches loading for a minute or more
[14:28:07] odder: don't we all
[14:55:24] odder: You're obviously not pressing F5 enough
[14:56:04] Reedy: he smashed F5 so hard it broke?
[16:55:18] hi
[16:55:21] Hello, I've got a question. I cannot join wikidata because of a dns-lookup failure. Anything I can do about it?
[16:55:41] (^ probably the Amsterdam cluster, if it matters)
[16:56:22] Reedy: any idea when the DNS changes will be over?
[16:56:35] over?
[16:56:37] ? what's the dns lookup failure
[16:57:06] IIRC mark moved eu traffic for wikivoyage and wikidata to ESAMS earlier today
[16:57:46] natuur12: Use a decent DNS provider?
[16:58:08] but what failure are you seeing ? what's the result of your lookup?
[16:58:12] I just use google chrome and ziggo as my internet provider
[16:58:41] the result is that wikidata cannot be displayed on my computer
[16:58:47] What OS are you on?
[16:59:44] reedy: windows 8 64 bit
[16:59:50] what, is that still ongoing?
[16:59:52] ridiculous
[17:00:15] natuur12: Open a command prompt, type nslookup www.wikidata.org and press enter
[17:00:31] Then pastebin the output and send us the link http://p.defau.lt/new.html
[17:05:30] Reedy: done
[17:05:41] Link?
[17:05:51] http://p.defau.lt/?ThTmCmLck1hLi8uQlKdacQ
[17:06:13] Did you type that out manually?
[17:06:52] yes, because I could not copy paste
[17:07:11] Click on the icon in the top left
[17:07:19] Edit -> Select all
[17:07:21] then press enter
[17:07:47] Reedy: He might be using Damn Typofull Linux :D
[17:07:59] it doesn't matter
[17:08:13] his internet provider / dns server caught the NXDOMAIN during the few minutes I broke it this morning
[17:08:21] which should not be cached for more than 10 mins
[17:08:29] but some dns servers ignore that
[17:09:07] Reedy: http://p.defau.lt/?44r8ljF5Q6AWRORHEKvTfw
[17:09:27] mark: Hence [17:57:46] natuur12: Use a decent DNS provider?
[17:09:28] ;)
[17:09:36] yeah
[17:10:09] natuur12: unfortunately there's not much we can do at this point, most likely it will start working again within a day
[17:10:31] oke ty
[17:10:47] natuur12: you might be able to work around it
[17:10:53] Restarting your router might help if that's where the DNS entry is being cached (shown by 192.168.0.1 being the resolver)
[17:11:00] if you really need this now
[17:11:02] it's possible
[17:11:50] Already tried to restart the router. So bad luck for me
[17:12:16] what is your internet provider?
[17:12:25] Ziggo
[17:12:29] ah ziggo
[17:12:39] It's all Dutch to me
[17:13:07] you could try configuring 8.8.8.8 as your dns resolver
[17:13:10] (google dns)
[17:13:21] 8.8.8.8 all the way ~
[17:13:25] might be better than ziggo's anyway ;)
[17:33:53] robla, DarTar, AaronSchulz, James_F, Reedy - so there seems to be an issue with loss of revtag data
[17:34:10] https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#VisualEditor_tag_not_working_correctly has the original reports
[17:35:43] Let's see
[17:36:13] how can I check out the exact version of mediawiki (with extensions) that is deployed on wikipedia
[17:36:45] including CommonSettings.php, InitialiseSettings.php etc...
[17:36:51] !link [[Special:Version]]
[17:36:51] https://meta.wikimedia.org/wiki/Help:Link
[17:36:57] ...
[17:37:25] Eloquence: Which is revtag? tag_summary or change_tag?
[17:37:33] Reedy: change_tag
[17:37:43] If the latter, https://bugzilla.wikimedia.org/show_bug.cgi?id=40867 might be relevant
[17:37:46] tag_summary is derived from change_tag
[17:38:02] I know Special:Version, but that does not seem to include CommonSettings and other stuff here http://noc.wikimedia.org/conf/highlight.php?file=CommonSettings.php
[17:38:20] Eloquence: DarTar Index changes have been made today to that very table
[17:38:29] Reedy: all historical revtags I was tracking (gettingstarted, VE, mobile edits) seem to have been cleared until 7/12 midnight UTC
[17:38:31] never mind, I found it: https://git.wikimedia.org/tree/operations%2Fmediawiki-config.git
[17:38:43] springle: About? When did you start the schema changes for enwiki?
[17:39:14] Reedy, yesterday, around 1500 PDT
[17:41:01] Reedy, but wgOldChangeTagsIndex is not yet live. tables have both old and new indexes. due to lack of primary key, pt-online-schema-change needed to do it in two steps, adds, then drops
[17:41:16] Reedy: I need to run, should be back online within an hour, if you guys find the cause, pls cc me on the post mortem
[17:41:19] That would probably explain the problem
[17:43:19] Eloquence: Most likely not lost, but just inaccessible via MW
[17:43:34] That is reassuring.
[17:44:02] springle, sorry, can you briefly recap what the DB work is you're doing here?
[17:44:09] https://bugzilla.wikimedia.org/show_bug.cgi?id=40867
[17:44:13] Fixing MW legacy crap
[17:44:18] :)
[17:44:56] Eloquence, https://bugzilla.wikimedia.org/show_bug.cgi?id=40867. completing unmigrated wikis listed in comment 6
[17:45:56] got it. please note that tag data is in heavy use and especially critical right now due to the visualeditor rollout
[17:46:06] hope we can recover this soon.
[17:46:38] If the index migrations are finished, we can just merge the config change and that should be that
[17:46:48] Reedy, Eloquence, I'm not clear where the problem is. will https://gerrit.wikimedia.org/r/#/c/73435/ fix it?
[17:47:20] ie, should I not be making the schema changes, or should I be expediting the config changes? to be clear, both old and new indexes still exist
[17:47:35] This is somewhat of a rare case
[17:48:33] Hm. If they both exist, things using those indexes should be working as before
[17:50:30] Do we have a rev_id/rc_id for a revision/recentchanges entry where the tags should be there?
[17:50:39] lemme poke
[17:51:08] https://en.wikipedia.org/w/index.php?title=Steven_Cojocaru&diff=prev&oldid=563615370
[17:51:23] this is a VE dirty diff, so it should definitely have the VisualEditor tag
[17:51:43] It does
[17:51:43] mysql:wikiadmin@db1056 [enwiki]> select * FROM change_tag where ct_rev_id = 563615370;
[17:51:43] +-----------+-----------+-----------+--------------+-----------+
[17:51:43] | ct_rc_id  | ct_log_id | ct_rev_id | ct_tag       | ct_params |
[17:51:43] +-----------+-----------+-----------+--------------+-----------+
[17:51:45] | 589674173 |      NULL | 563615370 | visualeditor |      NULL |
[17:54:26] i see the same result forcing old or new index
[17:56:13] Well, the data is there, so MW is at fault in some way or another
[17:56:19] oh man, don't tell me i broke something with the tags
[17:56:26] no, wait, that's not even deployed on en.wp yet
[17:56:31] wmf9?
[17:56:39] it was merged like yesterday
[17:56:41] 10?
[17:56:43] my changes are live on mw.org
[17:56:47] wmf10 then
[17:56:55] but related VE changes are live everywhere
[17:57:09] Oh?
[17:57:15] (i think)
[17:57:25] but that still shouldn't have broken anything
[17:57:42] unless someone accidentally killed the message or something
[17:57:42] the tag display appears erratic right now. in this case you see the mobile edit tag in the history: https://en.wikipedia.org/w/index.php?title=Monsignor_Fraser_College&action=history - in this case you do not see the visualeditor tags in the history: https://en.wikipedia.org/w/index.php?title=Steven_Cojocaru&action=history
[17:59:41] If the new indexes are there, we might as well merge https://gerrit.wikimedia.org/r/#/c/73435/ for starters
[17:59:53] am I the only one who gets a gateway timeout looking at http://en.wikipedia.org/w/index.php?title=Special:Contributions/Addbot&offset=20130708223322&limit=500&target=Addbot ?
[18:00:28] Apparently
[18:00:32] ^ Addbot file namespace contributions past the first 500
[18:01:05] tried on two computers and getting the same timeout lol
[18:05:25] I try looking at the contributions 100 at a time, and after a few pages of going back, it gateway timeouts again
[18:05:56] springle: Reedy: I've asked greg-g to coordinate the response on this issue, so he'll be checking in on this, but I'm assuming you all have it
[18:06:05] (this issue being rev tagging)
[18:09:18] somebody did check that the tables weren't accidentally truncated? maybe on some slaves? :P
[18:10:28] MatmaRex: I just checked that one of them that MW wasn't showing was at least there
[18:11:07] springle: see MatmaRex's last question?
[18:11:43] greg-g, MatmaRex, checking now
[18:16:33] | visualeditor | 76887 |
[18:16:33] | visualeditor-needcheck | 518 |
[18:16:44] | gettingstarted edit | 18612 |
[18:16:46] etc
[18:16:53] They're there on the master. Doesn't look like data loss
[18:16:56] MatmaRex, slaves seem intact
[18:17:01] So, Reedy, MatmaRex, springle: just to clarify: The tag is correctly being saved in the db for edits made with VE, it just isn't coming out of the db and being displayed?
[18:17:15] Seems to be the case
[18:17:22] good
[18:17:36] and funnily, it's only not displayed for older edits, apparently :/
[18:17:59] * RoanKattouw looks up
[18:18:05] also, they're still displayed on other wikis, i think: https://www.mediawiki.org/w/index.php?title=Project:Sandbox&action=history
[18:18:21] greg-g asked me to come in here so I can help out if needed
[18:18:54] So AIUI we have tags still being stored in the DB and still being stored for new edits as well, just not displayed in MW?
[18:19:03] Seemingly
[18:19:09] MatmaRex: mw.org is wmf10; is there a wmf9 wiki where it still works?
[18:19:53] https://en.wikipedia.org/w/index.php?title=Steven_Cojocaru&action=history&year=2013&month=-1&tagfilter=visualeditor
[18:20:07] ^ That shows the revisions with the tag. It just doesn't show the tag on the actual revision
[18:20:14] RoanKattouw: works on pl.wp: https://pl.wikipedia.org/w/index.php?title=Wikipedysta:Miloszk22/brudnopis5&curid=2986516&action=history
[18:20:23] Minor bug seems rather minor
[18:20:28] Hell, https://en.wikipedia.org/w/index.php?title=Special:RecentChanges&tagfilter=visualeditor works
[18:20:32] And Special:Tags lists it
[18:20:32] Yup
[18:21:06] list( $tagSummary, $newClasses ) = ChangeTags::formatSummaryRow( $row->ts_tags, 'history' );
[18:21:12] this is from HistoryAction
[18:21:17] it just doesn't get any simpler
[18:21:22] ts_tags has to be empty in there
[18:21:41] Eloquence: Did you try https://en.wikipedia.org/w/index.php?title=Steven_Cojocaru&action=history&year=2013&month=-1&tagfilter=visualeditor or anything like the links Roan just posted?
[18:21:54] it's the only way for ChangeTags::formatSummaryRow to return an empty string
[18:22:06] (or, well, otherwise falsy)
[18:23:34] Hmm
[18:23:45] And I suppose the tag filters use a different table or something?
[18:23:57] hm, there are two tables for tags
[18:24:03] Yeah
[18:24:05] one which has comma-separated lists as text
[18:24:06] Three I believe
[18:24:07] tag_summary and change_tags
[18:24:08] and one with the mapping
[18:24:50] there are change_tag and tag_summary?
[18:25:01] -- A table to track tags for revisions, logs and recent changes.
[18:25:03] -- Rollup table to pull a LIST of tags simply without ugly GROUP_CONCAT
[18:26:39] Right, valid_tag
[18:26:45] Which I suppose is not of interest here
[18:27:23] but nothing checks that table afaik
[18:27:36] apart from listDefinedTags
[18:27:44] I've gotta go out. Won't be too long
[18:27:45] called on Special:Tags, i guess
[18:30:30] I have a meeting, back later
[18:49:56] could somebody verify that the two change tags tables are consistent?
[18:50:03] for rev_id = 563615370, for example
[18:50:43] that is, select * from tag_summary where ts_rev_id = 563615370;
[18:51:02] and select * from change_tag where ct_rev_id = 563615370;
[18:51:33] i really see no code path that could lead to tags not being displayed if they exist D:
[18:53:06] MatmaRex, tag_summary no results. change_tag, 1 result
[18:53:25] welp
[18:53:27] that's not good
[18:53:48] as dumbledore said, it seems i'm right, and i've never before wished to be wrong this much
[18:54:25] somebody killed the table :/ (or i really forgot how to write sql queries)
[18:58:21] MatmaRex: what's the next step here?
[18:59:35] greg-g: well, i'd sure like to know :P
[18:59:51] either something cleared the table, or we discovered a very unlikely mysql bug
[18:59:52] MatmaRex: what springle said.
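
A hedged sketch, not part of the log and not what was actually run in production, of the "repopulate tag_summary from change_tag" idea raised just below: tag_summary is only a rollup of change_tag, so the comma-separated lists can be rebuilt with one GROUP_CONCAT per tagged row. It assumes direct database access via pymysql, placeholder connection details, and the 1.21-era schema quoted above; any real repair would be tested on a copy first.

    import pymysql

    # REPLACE assumes tag_summary keeps its unique indexes on ts_rc_id /
    # ts_log_id / ts_rev_id (as in core tables.sql), so stale rows are
    # overwritten rather than duplicated.
    REBUILD_SQL = """
        REPLACE INTO tag_summary (ts_rc_id, ts_log_id, ts_rev_id, ts_tags)
        SELECT ct_rc_id, ct_log_id, ct_rev_id,
               GROUP_CONCAT(ct_tag ORDER BY ct_tag SEPARATOR ',')
        FROM change_tag
        GROUP BY ct_rc_id, ct_log_id, ct_rev_id
    """

    def rebuild_tag_summary(host, db, user, password):
        conn = pymysql.connect(host=host, db=db, user=user, password=password)
        try:
            with conn.cursor() as cur:
                cur.execute(REBUILD_SQL)
            conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        # Placeholder connection details, not the real production hosts.
        rebuild_tag_summary("db.example.org", "enwiki", "wikiadmin", "secret")

On a table the size of enwiki's change_tag this would be done in batches rather than one statement, but the GROUP_CONCAT rollup is the core of the idea.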
[19:00:37] if the first thing is the case, then it's either backups or rebuilding the table from the other one, which luckily didn't disappear
[19:01:00] but this i'll have to leave for someone with access to teh servahs
[19:01:26] or of course i could be wrong, so it would be great if someone could verify if this even makes sense
[19:04:46] StevenW: ? clarify?
[19:05:06] greg-g: StevenW: i assume steven got the same results for the db queries
[19:05:14] (thanks for double-checking :) )
[19:05:14] ah
[19:06:00] so, we aren't without the data, it just isn't where the code thinks it is, right?
[19:06:46] there are backups probably anyway
[19:06:59] the data is stored in two tables, with content exactly the same in both
[19:07:01] greg-g, unsure. tag_summary is lacking some records? MatmaRex ?
[19:07:04] but in different formats
[19:07:16] the tag counts from change_tag are reliable, but are missing from tag_summary
[19:07:26] one table is used for getting lists of the tags, the other for filtering by tags
[19:07:29] so it should be a matter of repopulating the latter from the former
[19:07:34] the first one is borked, the second seems okay
[19:08:14] there are tag_summary and change_tag respectively
[19:08:15] greg-g: ^
[19:09:03] now there's a question of what the hell happened, of course, since that's one of the few pieces of information which are duplicated in this way :D
[19:10:32] right right
[19:10:58] I just asked springle to open a bug to document current thinking and a plan of action to fix. We'll have binasher review when he's online
[19:11:19] MatmaRex: Reedy your help appreciated with the bug if springle needs anything :)
[19:11:55] i really don't know much about how this works, you know ;)
[19:12:18] well, hey, you're talking a lot, so I figured :P
[19:12:53] I'm going afk for a few (lunch and such)
[19:20:17] MatmaRex, https://bugzilla.wikimedia.org/show_bug.cgi?id=51254 . tracking down binasher now for his input
[19:21:28] (i moved the bug to the Wikimedia product and cc'd james)
[19:34:11] binasher, https://bugzilla.wikimedia.org/show_bug.cgi?id=51254
[19:34:34] thanks
[19:36:01] 12 19:32:17 < jeremyb> can you confirm your email if you're not logged in? (i.e. does the confirmation link still work?)
[19:36:04] no answer in #mediawiki :( (this is re a WMF wiki)
[19:36:26] not that i was terribly patient :P
[19:36:47] jeremyb: i think it should
[19:36:56] (i missed this on #mediawiki due to bot flood)
[19:36:57] hrmmmm
[19:37:03] i figured it was bot flood
[19:37:18] does it not?
[19:37:27] you don't have OTRS, right?
[19:38:03] let me paraphrase if not
[19:40:22] nope
[19:41:08] turns out there wasn't much private info there anyway so i didn't even paraphrase :)
[19:41:19] > I created a new account with the user Name "${username}" and my email address and it logged me in for a few minutes. Once I was ready to make my
[19:41:22] > updates to the wiki page it logged me out. From my email I got a confirm email link that was invalid. then when I tried to re-login it wouldn't
[19:41:25] > login with my password. So I tried the forgot password and that failed saying there is no email address associated to my account which is bogus.
[19:41:28] >
[19:41:30] > Please fix this if you would so that any articles I update in the future will use my correct user name!
[19:42:16] MatmaRex: i verified the account was created recently. i guess i'll just reply so that they can verify the mail's not spoofed and then we can get an op to manually confirm it?
[19:42:24] unless someone has a better idea
[19:43:21] I want to know whether this is a bug or not
[19:43:34] there was originally a photo on commons under the name https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo.jpg
[19:44:01] but since I wanted to copy a photo from enwp to commons with the same name (but a different photo), I renamed the original photo to https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo_%282%29.jpg
[19:44:12] after that I moved the other photo to commons
[19:44:28] but now I see the photo of #2 on #1: https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo.jpg
[19:44:33] jeremyb: hm, there was a bug with centralauth which did this to some users' autocreated accounts
[19:44:51] (create accounts without e-mail and with unknown password)
[19:44:53] (but on the thumbnail you see photo #2, although I cannot access that - purging doesn't help)
[19:44:59] MatmaRex: this is a manual creation. only one wiki attached to the whole account (and no unattached)
[19:45:27] jeremyb: i dunno, test this? it should work as far as i know :)
[19:45:38] MatmaRex: right... :)
[19:46:13] Trijnstel: errr, what does #2 on #1 mean? how do i know you're seeing the same thing i am?
[19:46:45] jeremyb: well, I asked other users what they saw and it turned out they saw the same as me, but we can check it again ;)
[19:47:14] right now you should see the same photo on both links (apart from the thumbnail of photo 1)
[19:47:23] i don't
[19:47:40] i see thumbs for the verisions that match the bigger pic at the top
[19:47:47] versions*
[19:52:34] jeremyb: hmm, which photo do you see here? https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo.jpg (on top)
[19:52:47] and which here: https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo.jpg#filehistory
[19:52:51] (what's on them?)
[20:10:01] What is the Gerrit equivalent of adding a page to a watchlist?
[20:10:29] (And how do I do that.)
[20:13:00] odder: You star a change. Click the star.
[20:13:55] Matthew_: any hints as to where I can find it?
[20:14:15] odder: above the commit message, to the left
[20:14:26] odder: It's in the upper left.
[20:14:28] That
[20:14:37] oh, it's empty
[20:14:42] my screen really is awful
[20:14:46] (I was looking for it when you pinged, I haven't logged into Gerrit in months XD )
[20:15:06] so I will get notifications when things change on the patch, right?
[20:15:31] (I would assume not, and that it's possible with 'Watched changes')
[20:15:57] yes, and it's the same when you're a reviewer
[20:16:07] but there's an open bug about notifications about new patchsets not going out
[20:16:21] you'll still get a notif about jenkins running the tests, though.
[20:16:40] MatmaRex: so what's 'Watched changes' for?
[20:17:00] or maybe not. i can't find it
[20:17:16] odder: users can remove others from the reviewer lists
[20:17:30] i think at least everybody with +2 rights, and maybe more
[20:17:43] or if you want to watch something sneakily.
[20:37:12] Trijn|away: fixed?
[20:38:58] binasher: springle just checking, are you all still moving forward with the tag table issue, or should Reedy/someone do more code/cause diagnosing?
[20:43:12] Trijn|away: http://i.imgur.com/UEQNTmL.png http://i.imgur.com/JZ9cgxK.png
[20:43:45] that's how it looked for me from the beginning.
(but i modified some parts of the page so everything would fit in one screenshot instead of being below the fold)
[20:44:05] jeremyb: hmm, it doesn't look like it here
[20:44:10] but maybe it will tomorrow or so
[20:44:15] or after I removed the cookies
[20:44:17] I don't know
[20:44:33] Trijn|away: no, it's not cookies
[20:44:37] greg-g, yes, we think we have identified a bug in pt-online-schema-change. now looking at rebuilding tag_summary
[20:44:43] Trijn|away: but again, i don't know what you're seeing...
[20:45:07] Trijn|away: you could make screenshots like i did or use more detailed descriptions :)
[20:45:39] springle: coolio
[20:46:18] springle: no relation to ron pringle, right?
[20:48:21] https://bugzilla.wikimedia.org/show_bug.cgi?id=16066
[20:48:31] funny how stuff was turned on its head in the past few years
[20:55:54] jeremyb, not that i know of
[20:57:55] jeremyb: I see on https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo.jpg the photo of https://commons.wikimedia.org/wiki/File:Roberto_De_Vicenzo_%282%29.jpg (though only on top; the thumbnail looks good)
[21:01:42] paravoid: can you take a look?
[21:01:52] or i guess it's kinda late there
[21:02:16] purge didn't fix it. i can repro when i set esams in my hosts. eqiad is fine
[21:08:14] uhm
[21:08:16] that's weird
[21:08:47] cp3005 backend has the wrong image
[21:08:57] but cp1064 backend has the right one
[21:09:03] this means it wasn't purged
[21:09:17] but the more weird part is that the "wrong" version is not on the file history
[21:09:26] AaronSchulz: hey
[21:11:17] oh god
[21:11:23] this smells varnish cache corruption
[21:12:07] hm, unless this was actually renamed recently
[21:12:28] (cur | prev) 11:27, 12 July 2013 Trijnstel (talk | contribs) m . . (459 bytes) (0) . . (Trijnstel moved page File:Roberto De Vicenzo.jpg to File:Roberto De Vicenzo (2).jpg: conflicts with a local file on enwp) (undo)
[21:13:05] and action=purge didn't fix it?
[21:15:13] paravoid: Is cp1064 hooked up to all the right purge streams?
[21:15:32] something's wrong with vhtcpd I think
[21:15:33] I say this because I noticed that the new cp10* servers for bits weren't in the dsh node list and so the command line purge script didn't work for them
[21:15:47] Not directly related, only potentially forgot-to-list-new-server related
[21:33:46] RoanKattouw: it's esams that's broken not eqiad though
[21:34:05] re cp1064
[22:59:23] gn8 folks
[23:40:46] with SUL2, which side of GlobalBlock will login.wikimedia.org sit?
[23:41:37] csteipp: --^^