[02:23:28] !log LocalisationUpdate completed (1.20wmf7) at Sun Jul 15 02:23:27 UTC 2012
[02:23:39] Logged the message, Master
[02:46:33] !log LocalisationUpdate completed (1.20wmf6) at Sun Jul 15 02:46:33 UTC 2012
[02:46:41] Logged the message, Master
[07:01:14] We seem to have a problem.
[07:01:16] "the maximum replication lag is 167 second(s). (Parts of the wiki may appear to be 167 second(s) out-of-date)."
[07:01:26] and "504 Gateway Time-out"
[07:06:58] The wiki is not happy.
[07:08:02] Any tech staff around?
[07:18:49] matanya: did you break the wiki? ;)
[08:00:08] I guess that was db12?
[08:00:15] I see no replag issues now
[08:00:20] http://noc.wikimedia.org/dbtree/
[08:02:19] sorry I didn't respond earlier: there was no page for it and I am occasionally (gasp!) afk :-D
[08:07:17] apergos: the wiki knew you were gone, so it chose that moment to cause trouble.
[08:07:29] figures :-)
[10:56:32] where is the master username database in Extension:CentralAuth kept in the case of Wikimedia sites?
[10:57:42] In Consumium we are going to have all languages in one wiki with the upcoming Universal Language Selector, but have a "commons" type of separate wiki for files, so obviously we need CentralAuth
[10:58:42] in which wiki (or where) does the "master" file reside??
[10:58:47] <- confuzed
[10:59:59] I think I'll make consumium.org/wiki/ and consuploads.consumium.org/wiki/ (which is like Commons), in case we need to split the wiki into langs later on, so it's obvious to have one wiki devoted to file uploads
[11:01:22] I suppose in the case of 2 or 3 (if we decide to use the same userbase as http://develop.consumerium.org/wiki/, which has hundreds if not thousands of bots registered before the upgrade and CAPTCHA installation)
[11:02:29] currently only 1 wiki user database exists.. so there is incentive to take the easier way to shared login, but I sorta want CentralAuth for street cred.
[11:08:39] jubo2: use sharedtables if you can over centralauth
[11:09:17] p858snake|l: I sure can if it is preferable
[11:09:37] how do I count how many users there are..?
[11:10:15] a cheap and easy way I would guess would be a count statement over your users table
[11:11:32] I need to evaluate starting over from zero against using what we have.. We have famous people as users, like User:Brion, User:Anthere, User:Taw, and others.. some got bitten by trolls..
[11:11:34] hmm..
[11:11:47]
[11:12:00] count (*) from users;
[11:12:08] starting from zero?
[11:13:19] p858snake|l: there are lots of spam accounts registered, but they're not doing only vandalism but also inputting ideas and such... Until Spring 2012 the registration of accounts had no CAPTCHA at all
[11:13:42] But I think a thousand or so spam accounts really aren't much more than a drop in the bucket
[11:15:14] if this highly political wiki goes on-stream _and_ attracts consumer/editor interest and thus grows to be a large wiki, I suppose we could keep our existing user database with accounts starting from Spring 2003 onwards..
[11:16:09] I ssh to the server
[11:18:57] sez I have an error in my SQL syntax
[11:22:14] In uni CS I did relational algebra optimization with wetware, pencil, paper and eraser.. and I got a good degree in that, and now MySQL sez "Syntax error" when I try to "count (*) from user;"
[11:22:43] always push up the selects as high as possible.. that was one of the rules of thumb
[11:23:09] and the intersections and shit as low as possible
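[Side note: the "Syntax error" above comes from the missing SELECT keyword, and in a default MediaWiki schema the table is named `user`, singular, not `users`. A minimal sketch of the intended query, assuming a stock MediaWiki database:]

    -- Count all registered accounts; MediaWiki's table is `user`, singular.
    SELECT COUNT(*) FROM user;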
[11:25:50] p858snake|l: do the sharedtables need to be on the same server or is it possible to distribute the load?
[11:27:16] 1358 users
[11:28:44] I'll dump the table to a .sql file and grep through it for this new http://develop.consumerium.org/wiki/The_/on/troll to see if I can find something interesting
[11:29:48] it makes its way around the CAPTCHA and the SpamBlacklist; sorta seems like it was always partially written by a bot
[11:30:26] approx once a day it posts an article using various accounts, but it is clearly the same thing
[11:30:59] could also consider installing the CheckUser extension .. that'd be most useful..
[11:32:46] Yeah.. there is a gigaton of usernames created on 2012-02-28 ..
[11:40:28] last time I looked at the user stats I think we had 150 users and now it's 1358 .. locking the editing did not disable registering new accounts..
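[Side note: a sketch of how one could quantify that 2012-02-28 spike, assuming the default MediaWiki schema, where user_registration holds a 14-character YYYYMMDDHHMMSS timestamp string:]

    -- Count accounts registered on 2012-02-28; user_registration is a
    -- 14-digit YYYYMMDDHHMMSS timestamp string in a stock MediaWiki schema.
    SELECT COUNT(*) FROM user
    WHERE user_registration LIKE '20120228%';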
[11:48:08] domas: what advantage over specialized triplestores does one get by storing RDF triplets in MySQL?
[11:49:23] dunno
[11:49:38] specialized triplestores usually don't have that much engineering thrown into them
[11:51:09] otoh, what is a specialized triple store? :)
[12:01:57] http://en.wikipedia.org/wiki/Special:Search?go=Go&search=Triplestore apparently states whether there are any
[12:04:04] domas: I trust you are familiar with the LinkedWiki extension, which provides the possibility to build an ontology and datasets and query 'em with SPARQL, a query language for RDF subject/predicate/object triplets ..
[12:05:14] we are going to use this heavily.. prolly set up our own SPARQL endpoint like DBpedia.org has, as a public service, if someone wants to run queries against the ontology and datasets
[15:16:38] [[Tech]]; 68.173.113.106; /* Native version of MediaWiki? */ ; https://meta.wikimedia.org/w/index.php?diff=3908977&oldid=3899946&rcid=3394653
[15:28:19] [[Tech]]; Krinkle; /* inconsistend align= behavior for class="wikitable" */ re; https://meta.wikimedia.org/w/index.php?diff=3909002&oldid=3908977&rcid=3394669
[15:36:31] hey
[15:36:49] anyone here know js?
[16:07:24] Steven_Zhang, I know some basic things about it. what's the problem exactly though?
[16:33:23] Krenair: trying to add a wikilink
[18:30:48] Evening guys :) Do you do any technical work with Wikimedia's mail servers, please? I was just checking my inbox and found an email sent to my old Wikipedia email address, offering me a free trial of a manhood enlargement drug.
[18:31:17] Just wondering if you're aware of it, and if not, whether you'd be interested in getting the message source and checking it in case any of your servers have been compromised
[18:31:48] apparently, it came from an @wikimedia.org address, which may (or may not) be fake: irsdemolish@wikimedia.org
[18:46:45] BarkingFish, the wikimedia mail servers have SPF records
[18:47:19] from the headers it should be easy to determine whether it was fake (99.999%) or somehow it was not
[18:47:38] Right. Well I have no idea what one of those is. All I know is I got this email; I have no idea how to read the headers whatsoever
[18:49:07] sigh, I could look it up for you
[18:49:22] but you'd need to send me an attached copy of the mail or something similar
[18:49:33] which you may not like to do
[18:50:10] I can send you a copy of the message source
[18:50:58] someone sending an email can put whatever they want in the From address - president@whitehouse.gov or what have you
[18:51:47] heh, I have recently been receiving Obama mails
[18:52:07] the funny thing is, they seem legit
[18:52:20] Platonides, I will check anyway - instead of sending it to you, I'll pastebin it as an unlisted post
[18:52:39] BarkingFish, as long as the headers are there, it would work
[18:53:57] the whole of the message source is there, from x-store-info downwards
[18:54:28] the only thing I obliterated was my email address, bar the domain
[18:54:43] http://pastebin.com/9Y0MUqnp
[18:55:32] it comes from Korea
[18:55:46] not through wikimedia servers
[18:55:54] ok, well that tells me all I need to know :) It's not one of yours.
[18:56:09] I couldn't read a mail header if you got me to take a degree course in it
[18:56:58] the key lines are the Received ones
[18:57:03] Received: from ICU ([112.217.244.242]) by SNT0-MC2-F29.Snt0.hotmail.com with Microsoft SMTPSVC...
[18:57:09] that's the first one
[18:57:20] so that's where hotmail received the email from
[18:57:42] 112.217.244.242 -> DACOM BORANET, DACOM Bldg., 706-1, Yoeksam-dong, Kangnam-ku, Seoul
[18:58:35] right, that will help me sort out the cruft in the future, instead of bothering you guys :) Thanks for the assist, Platonides
[19:53:37] I wonder if Steven ever got an answer (JS). maybe he's on a plane now
[20:06:43] anyone getting DB errors?
[20:07:55] matanya: elaborate?
[20:07:59] I may be going off shortly
[20:08:08] not that I have access anyway
[20:08:18] https://bugzilla.wikimedia.org/38414
[20:08:47] matanya: repeatedly?
[20:08:56] yes
[20:09:06] @replag
[20:09:08] jeremyb: No replag currently. See also "replag all".
[20:09:08] @replag all
[20:09:10] jeremyb: [s1] db38: 0s, db36: 0s, db32: 0s, db59: 0s, db60: 0s, db12: 0s; [s2] db52: 0s, db53: 1s, db54: 1s, db57: 1s; [s3] db39: 0s, db34: 0s, db25: 0s, db11: 0s
[20:09:11] jeremyb: [s4] db31: 0s, db22: 0s, db33: 0s, db51: 0s; [s5] db35: 0s, db45: 0s, db44: 0s, db55: 0s; [s6] db47: 0s, db43: 0s, db46: 0s, db50: 0s; [s7] db37: 0s, db56: 0s, db58: 0s, db26: 0s
[20:11:23] now it suddenly worked
[20:11:25] weird
[20:11:47] huh
[20:13:28] well, I'm off
[20:13:52] see ya
[20:19:56] !log Running copyFileBackend.php for commons (shard 7)
[20:20:06] Logged the message, Master
[22:22:03] The lag is going down, which is GREAT!
[23:32:29] gn8 folks
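[Side note on the replag readings above: on a plain MySQL replica of this era, a per-replica delay figure can be checked directly on the replica itself; the Seconds_Behind_Master field is roughly what tools like the @replag bot report. A sketch, assuming access to the replica's mysql client:]

    -- Show replication status on a MySQL replica; Seconds_Behind_Master
    -- is the per-replica lag figure quoted in the readings above.
    SHOW SLAVE STATUS\G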