[01:03:44] Reedy: who would I speak to for https://bugzilla.wikimedia.org/show_bug.cgi?id=41270? [01:04:22] Reply on the bug? [01:05:10] I mean, who is in charge of the DNS settings? [01:06:35] dns-admin@wikimedia.org [01:06:55] No one specifically [01:07:03] It's why we have an ops team [01:08:28] Specifically, how do the Wikimedia geolookup servers work? [01:08:42] do they use their own database, or some external database? [01:12:27] No idea [01:13:16] Best to ask these questions either during the SF working day, or a european working day ;) [01:22:29] Hello. [01:22:29] http://en.wikipedia.org/w/index.php?title=2012_Atlantic_hurricane_season&action=history [01:22:39] Some users (including me) have had difficulty editing this page. [01:22:57] As you can see, when some edit, the entire page gets deleted for no apparent reason [01:26:05] hi TheAustinMan [01:26:08] andre__: ^ [01:26:31] hm? [01:27:12] TheAustinMan: https://bugzilla.wikimedia.org/show_bug.cgi?id=41280 [01:27:13] TheAustinMan: I am bringing andre__'s attention to your situation [01:27:31] ok [01:27:46] sumanah, and I wonder who else to CC on that ticket. [01:28:01] lemme look [01:34:42] andre__: cc robla, because this is high-priority and caused by the current deploy [01:34:43] checking who else [01:35:06] andre__: I'm using https://gerrit.wikimedia.org/r/#/q/message:edit,n,z to get ideas [01:36:56] andre__: Sam Reed because he is release manager, Daniel Kinzler because he's lead on Wikidata and Aude because she seems to be taking care of a lot of Wikidata stuff this week [01:36:57] sumanah: tried that already but I queried git for "conflict" instead [01:37:04] ah [01:38:02] andre__: maybe adding "status:merged" to the search helps too [01:38:40] Daniel Werner, Alex Monk, Umherrirender [01:38:53] for this bug, that should be enough :) [01:38:58] and now I have to go to dinner [02:53:33] Jasper_Deng: Or you can ask wikitech-l.
[02:53:45] Jasper_Deng: I assume you're talking about geoiplookup.wikimedia.org or whatever? [02:53:48] I think it moved to bits. [02:53:48] yeah [02:53:53] There are probably notes on wikitech. [02:54:00] I think they bought a database. [02:54:02] Or a subscription. [02:54:13] But I don't know definitively. [02:55:12] Brooke: Not much on wikitech [02:55:19] https://bits.wikimedia.org/geoiplookup [02:55:20] http://www.gossamer-threads.com/lists/wiki/wikitech/214373 [02:55:29] Probably needs better docs on wikitech and/or Meta-Wiki. [02:55:35] Much like every other Wikimedia service. [02:57:17] yep, it uses WM's own database [02:57:26] In that case, getting an IPv6 database shouldn't be that hard [02:58:39] Jasper_Deng: I'm not sure what you mean. [02:58:45] "WM's own database"? [02:58:52] own geolookup database [02:59:00] (own /instance/, I should say) [02:59:19] I don't believe Wikimedia made its own database. [02:59:29] no we didn't [02:59:35] it's maxmind [03:01:08] https://meta.wikimedia.org/wiki/Wikimedia_services [03:01:39] Ryan_Lane: how much effort would it take to do IPv6 geolookups? [03:01:42] Oh, right. [03:01:43] databases do exist [03:01:50] I was going to e-mail about status.wm.o... [03:03:39] Nimsoft CA or some noise? [03:03:48] It was much cuter when it was called WatchMouse, y'know. [03:08:12] Jasper_Deng, Ryan_Lane: https://meta.wikimedia.org/wiki/Geo_IP_lookup [03:08:44] not much more than what's on Wikitech [03:08:45] Jasper_Deng: If there's a relevant wikitech page, you can add it there. :-) [03:09:08] Well, yeah. I just made that page just now. It'll need time to grow! [03:09:14] just just just [03:12:58] Ryan_Lane: Is IPv6 support for geolookup on the roadmap? [03:14:40] http://www.maxmind.com/en/news_ipv6 [03:15:26] Brooke: So if MaxMind does support IPv6 lookup, what other obstacles? [03:15:42] Cost? Dunno. [03:15:55] I've no idea what that arrangement between WMF and MaxMind is. 
[03:16:04] It doesn't seem like WM would have to buy new services [03:16:09] May have been a donation; may have been a one-time purchase; may be a subscription. [03:16:45] oh.... software revisions needed too.... [03:16:58] https://www.maxmind.com/en/ipv6 [03:17:24] There's a city database too [03:17:42] I'd imagine, though, that IPv6 geolookups would be highly distorted by tunnelbrokers [03:17:46] I believe Wikimedia is city-level. [03:18:00] I wonder what the business model is here... [03:18:22] If sites like Wikimedia expose the info via https://geoiplookup.wikimedia.org/ [03:18:56] To wikitech-l? [03:19:05] If you're curious. [03:21:03] idk... perhaps after Wikivoyage is sorted out [03:32:10] Brooke: just updated http://wikitech.wikimedia.org/view/IPv6_deployment#Current_IPv6_deployment_status with this [03:32:35] Cool. [03:54:42] Bug? As a steward, I see special pages I should not see in SpecialPages... [03:54:53] However, I cannot use them. [03:54:57] which special pages? [03:55:13] For example, right now, I see Special:LockDB in the list itself. [03:55:16] At enwiki. [03:55:26] hhm... [03:55:36] Along with Special:CheckUser [03:55:39] * Jasper_Deng points Bsadowski1 to bugzilla.wikimedia.org [04:16:57] Bsadowski1: I'm not sure that's a bug. Stewards are intended to be the roots of Wikimedia. They should have access to everything... [04:17:25] yeah, but that shouldn't happen on the technical level [04:17:33] What shouldn't? [04:17:33] is probably a CA bug [04:17:46] Stewards shouldn't see CU by default [04:17:52] How is it a CentralAuth bug? [04:18:00] b/c CA does global groups [04:18:01] I can't "use" it, but I can see the Special page in the list [04:18:20] Special pages not accessible to me aren't visible as they should be [04:18:21] same with LockDB (which should only be viewable by system admins" [04:18:25] )* [04:26:58] strange that Special:LockDB still exists [04:28:35] you know it was in MW 1.1 [04:28:45] wow... 
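The geoiplookup endpoint discussed above serves a tiny JavaScript assignment (backed by MaxMind data) rather than plain JSON. A minimal consumer sketch, assuming a response of the form `Geo = {...}`; the field names and sample values here are illustrative assumptions, not confirmed in the log:

```python
import json

# Hypothetical body of a response from https://bits.wikimedia.org/geoiplookup;
# the service assigns an object literal to a JS global named Geo. The field
# names below are assumptions for illustration.
sample = 'Geo = {"city": "San Francisco", "country": "US", "lat": "37.78", "lon": "-122.42"}'

def parse_geoiplookup(body):
    """Strip the 'Geo =' JavaScript prefix and decode the JSON object literal."""
    prefix, sep, payload = body.partition("=")
    if sep != "=" or prefix.strip() != "Geo":
        raise ValueError("unexpected geoiplookup response")
    return json.loads(payload)

geo = parse_geoiplookup(sample)
print(geo["country"])  # → US
```

Since the payload is a plain JSON object literal, `json.loads` handles it once the assignment prefix is stripped; a real client would also have to cope with empty or partial responses for IPs the database can't map.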
[04:29:13] TimStarling: its still around, the rights just aren't assigned in a default install to use it iirc [04:29:23] its the developer group iirc [04:29:28] Which no longer exists. [04:29:43] it exists, just commented out [04:29:45] And it requires a writeable lock file, which has been "broken" on Wikimedia wikis for ages. [04:30:05] p858snake|l: It exists where? [04:30:27] localsettings or defaultsettings i think [04:30:34] Once you remove everyone from a custom user group and comment it out, it doesn't exist. [04:30:47] someone must love it, to rewrite it with HTMLForm [04:30:59] or was that just done with all of them? [04:31:25] TimStarling: For some reason, I see Special:LockDB and Special:UnlockDB in the special pages list :/ [04:31:30] I shouldn't be able to as a steward. [04:31:43] I don't see them [04:31:54] Brooke: depends on your definition of exists, since it is there in the code, i count that as existing [04:31:57] Bsadowski1: try clearing your cache? [04:32:08] Heh. [04:32:16] How is it a cache issue? [04:32:17] actually [04:32:24] I have a problem too [04:32:26] Just one? [04:32:27] GlobalBlock shouldn't show up on my list [04:32:38] Unblock shouldn't show up, and UserRights shouldn't [04:32:45] It's not showing up as a "Restricted" special page [04:32:52] "Shouldn't" is a strong word. [04:32:53] It's not bolded. [04:32:59] ^ [04:33:07] same symptoms, only on Meta [04:33:14] Nor does Special:Oversight actually [04:33:22] CheckUserLog shows up on my list too [04:33:36] so does CheckUser [04:33:45] but /only/ on Meta [04:34:05] Well, it shows up for me at English.. [04:34:05] * Bsadowski1 checks other wikis. [04:34:33] same problems on enwiki too [04:34:48] to Bugzilla [04:36:31] this is not a security problem, and only affects global groups, correct?
[04:36:38] hmm [04:36:40] well, actually idk about global groups [04:36:47] because both of us are part of at least one global group [04:37:29] I know for a fact that Special:CheckUser would not show up for me as a steward. [04:37:59] and neither for me as a mere global rollbacker (with some additional permissions here and there) [04:38:57] https://bugzilla.wikimedia.org/show_bug.cgi?id=41294 [04:38:57] https://meta.wikimedia.org/wiki/Special:GlobalGroupPermissions/Global_sysops and https://meta.wikimedia.org/wiki/Special:GlobalGroupPermissions/steward do not contain CheckUser and siteadmin [04:39:26] neither does global rollback [04:39:29] so.... [04:39:30] hence the bugzilla report [04:40:29] TimStarling: Are those entries on the Restricted special pages thing in the config? [04:42:07] making Special:Specialpages work for stewards is not really that high on my priority list at the moment [04:42:18] Bsadowski1, TimStarling: It turns out that it also affects me logged-out [04:42:24] so it's not specific to global permissions [04:42:41] yea but [04:42:43] meh [04:43:02] yes, making it work for anonymous users could be fairly high up there [04:43:45] This is probably not related to whatever user groups one's in [04:53:09] Bsadowski1: Diagnosis - was broken by https://gerrit.wikimedia.org/r/#/c/26357/ [04:53:55] Ah. [04:54:09] Weird I didn't notice it until now. [04:54:35] Good detective work, Jasper_Deng. [04:54:56] don't credit me [04:55:06] see the bug link, the commenter found it [04:55:18] Oh :P [04:55:30] I would credit you, except I already worked that out a few minutes ago [05:05:35] garbage collection is slow... [05:05:58] not sure how it can compress hard-linked objects anyway [05:05:59] maybe it is duplicating them all [05:18:57] Bsaodwski1: Should be fixed now [05:19:01] Bsadowski1* [05:20:28] :) [05:20:34] is it fixed? [05:20:47] Yep [08:15:23] mark: ? [08:15:38] mark: we still have the ipv6 testscript on en.wp I guess we could remove it ?
[08:25:49] thedj[work]: yes [08:26:43] mark: thx for the confirmation [13:20:22] hello, please see http://de.wikipedia.org/wiki/Wikipedia_Diskussion:Projektneuheiten#PostEdit_.2A_Syntaxfehler [13:20:46] its about a bug in the js, which is breaking user scripts on wikipedia [13:24:32] se4598: Taking care [13:24:46] http://meta.wikimedia.org/w/index.php?title=Translations:Fundraising_2012/Translation/Ways_to_give/21/ru&action=edit — “[65c7f7b7] 2012-10-23 13:24:10: Fatal exception of type MWException” — what? [13:25:14] other languages that don’t exist give this error too [13:27:06] se4598: looks like it was fixed on friday but not sure it's deployed yet [13:27:08] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/PostEdit.git;a=commitdiff;h=2b77087b8efc598f26205462f7e28e225f021547 [13:27:27] hoo: ^ [13:28:24] aude: Just noticed myself, the code is no longer there... someone just has to synch the extension [13:30:46] http://wikitech.wikimedia.org/view/Server_admin_log [13:31:16] although that is php-1.21wmf2 [13:32:27] dewiki is still on php-1.21wmf1 so could still have the bug [13:32:29] https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=blob;f=wikiversions.dat;hb=9c313a38fd0354ef717458f67ee5c2f26e64be8d [13:33:25] http://bits.wikimedia.org/static-1.21wmf1/extensions/PostEdit/resources/ext.postEdit.js :( [13:34:22] ori-l: ping [14:18:21] enhydra: I'll dig up a stack trace for it [14:19:17] Nikerabbit: [14:19:17] [14:25:00] http://meta.wikimedia.org/w/index.php?title=Translations:Fundraising_2012/Translation/Ways_to_give/21/ru&action=edit — “[65c7f7b7] 2012-10-23 13:24:10: Fatal exception of type MWException” — what? [14:19:17] [14:25:28] other languages that don’t exist give this error too [14:19:59] Reedy, your clock is 14 seconds late [14:20:02] Reedy: can haz backtrace? 
[14:20:14] oh you are working on it [14:20:14] cool [14:20:35] root@ubuntu64-shell-esxi:/home/reedy# ntpdate 0.uk.pool.ntp.org [14:20:35] 23 Oct 15:20:29 ntpdate[10216]: step time server 176.74.25.243 offset -14.420851 sec [14:20:42] enhydra: it's on a vm, so it's unsurprising [14:20:56] Reedy, any vm can have an ntpd [14:21:03] (can and should!) [14:21:10] it has [14:21:17] I've no idea how often it updates (nor overly care) [14:21:23] so it drifts [14:21:29] weird. is it running? [14:21:38] (just curious) [14:22:11] reedy is awake [14:22:12] According to opendns, pool.ntp.org is the second most requested domain from my IP... [14:22:28] 13:33 < aude> http://bits.wikimedia.org/static-1.21wmf1/extensions/PostEdit/resources/ext.postEdit.js :( [14:22:50] can you update PostEdit on php-1.21wmf1 ? [14:22:54] yeah [14:23:24] ARRRRRRRRRRRRRRRRRRRRRRFFFGGHGGGGGGGGGGGGGGHHHHHHHHHHHHHHHHH [14:23:31] 2012-10-23 14:23:25 mw70 enwiki: [238f8828] /w/api.php?action=parse&page=Special:WantedCategories&format=json&redirects&prop=text%7Cheaditems%7Cdisplaytitle%7Crevid%7Csections Exception from line 104 of /usr/local/apache/common-local/php-1.21wmf2/includes/WikiPage.php: Invalid or virtual namespace -1 given [14:23:35] enwiki is spamming those [14:23:44] lovely [14:23:50] Reedy: and twn gets spammed by the linkcache bug [14:23:56] http://p.defau.lt/?pxOvEcN9GN1w5kTK4sA1kw [14:24:00] 2012-10-23 14:23:38 srv230 metawiki: [6d5c7d52] /w/index.php?title=Translations:Fundraising_2012/Translation/Ways_to_give/21/ru&action=edit Exception from line 16 of /usr/local/apache/common-local/php-1.21wmf2/includes/content/TextContent.php: TextContent expects a string in the constructor. [14:24:10] Nikerabbit: ContentHandler! [14:24:24] oh noes! [14:24:43] aude: you know "LinkCache does not know about this title" [14:24:51] https://bugzilla.wikimedia.org/show_bug.cgi?id=41303 [14:24:52] yes :( [14:25:23] Why the hell is the api getting a fucktonne of requests to parse special pages?
:| [14:26:00] aude: is that backtrace bug in translate or contenthandler [14:26:04] it seems that something is not expecting null [14:26:45] Nikerabbit: i think the bug was there previously but it's giving an exception now [14:28:02] aude: so what do we fix? [14:29:24] Nikerabbit: the fundraising thing above? [14:30:05] could someone explain to me what the challenges are in allowing wikis to be renamed? [14:30:44] aude: do we not die on null, or do we prevent passing null there, or both? [14:31:14] die on null is always evil :D [14:31:40] basically, dying is a bad thing... [14:31:57] unless it's the only way to prevent damage [14:35:35] Nikerabbit: do they all/most involve SMW_ParseData, like what you posted https://bugzilla.wikimedia.org/show_bug.cgi?id=37209 [14:35:54] aude: backtrace is always the same [14:36:10] ok [14:36:43] let me poke at the text content thing first [14:37:07] i'm not sure the correct thing to do in that case for the link cache [14:37:13] without looking at the code there [14:53:28] Reedy: https://gerrit.wikimedia.org/r/#/c/29552/ [14:54:32] if you can check that.... that fixes it there though i'm trying to see if the problem can happen anywhere else in edit page [14:55:16] and more of a minor bug but still one that users noticed: https://gerrit.wikimedia.org/r/#/c/29561/ [14:55:45] will in a few minutes :) [14:55:54] these bugs that don't appear on stack trace worry me just a bit more [14:55:54] ok [15:18:20] I got an error on commons by deleting a file (the file is deleted, but there is still a link visible, so something went wrong) [15:18:23] A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was: [15:18:25] (SQL query hidden) [15:18:27] from within function "LocalFile::lock". Database returned error "1205: Lock wait timeout exceeded; try restarting transaction (10.0.6.41)".
[15:18:37] and it happened when deleting this file: https://commons.wikimedia.org/wiki/File:Bruno_Pacillo_Di_Ruggiero_%22El_Magnifico%22_cirugano_plastico.jpg [15:19:00] is this a known bug? [15:19:34] That one comes and goes [15:19:46] But Ops are looking at other file related issues [15:20:14] do I have to mention it somewhere? [15:22:09] Nope, not at the moment [15:25:29] okay [15:25:35] thanks for replying that quick :) [15:29:34] could someone explain to me what the challenges are in allowing wikis to be renamed? [15:30:34] Time? Effort? Motivation? [15:30:45] I dunno. When was the last time we renamed any wiki? [15:31:08] never [15:31:30] there are a number of wikis that were agreed to be renamed and have been waiting in line [15:32:11] For starters, it looks like renaming databases isn't possible in mysql [15:32:39] So, that'd require dumping and reimporting the database, which will take time, and cause downtime [15:32:39] http://dev.mysql.com/doc/refman/5.1/en/rename-database.html [15:32:52] Otherwise we have to add yet more hacks to handle renames (ugh) [15:32:57] one of the oldest is https://bugzilla.wikimedia.org/show_bug.cgi?id=34217 I think [15:33:25] I realise it is not the easiest task but the problem grows with time as wikis themselves grow [15:34:10] Reedy: Those are mostly small wikis, dumping and reimporting those shouldn't take ages [15:34:14] I suppose a wiki can be made read only to be copied and reimported under the new name but IIRC the issues exceeded such problems [15:34:41] I wish to create a list of problems so that the issue can be slowly resolved [15:35:23] Ideally labs would be the place to try this [15:36:46] hoo: true, but nevertheless it's not instant [15:37:05] And the oldest is 8 months? [15:37:07] Pfft [15:37:20] Reedy do you have the time to list the problems on some wiki page? perhaps meta?
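Since MySQL dropped RENAME DATABASE (the manual page Reedy links explains why), a wiki rename is either the dump-and-reimport he describes, or moving each table with RENAME TABLE, which only touches metadata and is near-instant on the same server. A sketch generating those statements, with hypothetical database names and a trimmed table list:

```python
# MySQL has no safe RENAME DATABASE, but RENAME TABLE can move a table
# across databases on the same server as a metadata-only operation.
# The database and table names below are illustrative, not a real schema dump.
def rename_database_statements(old_db, new_db, tables):
    """Build one RENAME TABLE statement per table in the old database."""
    return [
        f"RENAME TABLE `{old_db}`.`{t}` TO `{new_db}`.`{t}`;"
        for t in tables
    ]

stmts = rename_database_statements("alswiki", "gswwiki", ["page", "revision", "text"])
print(stmts[0])  # → RENAME TABLE `alswiki`.`page` TO `gswwiki`.`page`;
```

Even with this workaround, grants, views, triggers, and the wiki's configuration still have to be updated separately, which is part of why these renames kept stalling.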
[15:37:43] TBH, the whole process should be fairly simple [15:38:00] the oldest is more like 6 years https://bugzilla.wikimedia.org/show_bug.cgi?id=8217 [15:38:58] bugzilla does not have a special field for this [15:39:02] they are marked as enhancement or something [15:39:18] perhaps wiki rename may be a worthwhile new option on bugzilla [15:39:26] because it is hard to track them [15:39:29] just a thought [15:40:03] https://bugzilla.wikimedia.org/19986 is the corresponding tracking bug [15:40:08] Bug 19986 - Wikis waiting to be renamed (tracking) [15:40:37] oh [15:40:37] that works too [16:52:59] brion do you have a minute? [16:53:39] at https://bugzilla.wikimedia.org/show_bug.cgi?id=19986 domas was mentioned as a reply to your remark [16:53:42] did this happen? [16:53:55] is there an update for this bug? [16:54:01] it is for wiki renaming [16:56:39] lemme look [16:56:41] ugh renaming [16:57:02] yeah afaik it's still tricky to do, don't know if anybody's gotten to it yet [16:57:57] no they havent [16:58:07] there are requests waiting going back 6+ years [16:58:57] perhaps it could be a priority task since the problem gets worse with time as wikis grow [17:00:20] ToAruShiroiNeko: we had a doc written a long time ago https://wikitech.wikimedia.org/view/Rename_a_wiki [17:00:56] ToAruShiroiNeko: that would be a nice project for sure but I guess it is low priority [17:03:42] hashar I see, my worry is that it won't ever be implemented at this rate [17:05:29] ToAruShiroiNeko: so you will need to raise the issue in the community [17:05:31] get your voice heard [17:05:44] and make that a higher priority [17:06:00] thats fine but the community already wishes this [17:06:07] there are so many bugs [17:06:54] 14 rename requests exist I believe [17:07:14] 3 being for chinese wikis [17:07:29] err 1 being for 3 chinese wikis [17:07:35] Hello ToAruShiroiNeko.
[17:07:36] so 16 wikis [17:07:37] talk about it on wikitech-l ToAruShiroiNeko [17:07:42] hello Dereckson [17:07:49] that is really the best way to get your voice heard [17:07:49] hashar good idea, will do :) [17:08:18] I am sure you will get lots of tips and hints to get that project running [17:08:18] What hashar means is you need developers to be aware of the rename need as a normal / high priority item and not as a low one. [17:08:35] Or better, find someone who would code it. [17:09:10] And wikitech-l is a good place, as theoretically, it's what people interested in MediaWiki/Wikimedia development read. [17:09:15] ToAruShiroiNeko: you can point people to https://wikitech.wikimedia.org/view/Rename_a_wiki (I have added the link to bug 19986) [17:09:34] Dereckson: there's nothing really to code.. [17:09:51] it is more about writing a nice "how to" :-] [17:09:58] test it [17:10:00] test it again [17:10:12] eventually try out in prod :-] [17:11:58] Reedy: I thought about a script automating the different steps. I imagine we have to create a new db, import old db tables, replace the old name with the new name in config files, etc. [17:12:15] stupid labs [17:12:22] Config is just find and replace [17:12:29] found out that mwdeploy is not in the same group as the beta operators :-] [17:12:35] arghgh [17:12:49] As for the database, it can be done with a 1 liner [17:14:01] (speaking about wikitech, have you created my account?)
[17:16:00] out for now [17:21:15] sorry got distracted by dinner [17:22:17] Dereckson: No, need a username and email address to use [17:22:46] I just subscribed to the mailing list and waiting for approval :/ [17:23:18] oh just email confirmation [17:23:18] nice [17:47:03] its on the mailing list now [17:56:46] Reedy: when you get a chance, some patches to look at [17:56:47] https://gerrit.wikimedia.org/r/#/c/29552/ [17:56:55] https://gerrit.wikimedia.org/r/#/c/29597/ [17:57:07] https://gerrit.wikimedia.org/r/#/c/29616/ [17:57:51] the last two don't seem as urgent and first one, if you want to double check it, that would be nice [18:03:39] aude: will do [18:09:27] Reedy: ok [18:09:38] * aude is going home but will be online again in a bit [18:28:02] hi binasher. aaron was saying over at https://bugzilla.wikimedia.org/show_bug.cgi?id=39675 not to invest too much time with the archive table since some major changes might be on the way. so, what I'm thinking now is that I should just add the proposed fields, add capability for ApiQueryDeletedrevs to retrieve those properties, and submit that patch. There would be no queries (yet) requiring those fields to be indexed, alt [18:28:02] depending on what other features people want in those special pages and API. So, I guess just don't add any indexes for now? [18:30:04] i would still like to have a primary key added. without that, i can't perform online schema changes, i'd like that to be fixed if i have to go through the trouble of running a migration on that table [18:31:00] oh, so index the primary key, but not the other stuff (e.g. ar_logid, etc.) by the way, do you care if it's ar_logid or ar_log_id? we have an rc_logid so i wasn't sure which convention to follow [18:31:48] binasher: [18:33:33] rc_logid isn't a PK [18:33:50] leucosticte: i think ar_log_id [18:34:56] is that going to be auto increment? 
[18:37:10] https://bugzilla.wikimedia.org/show_bug.cgi?id=15441 Some tables lack unique or primary keys, may allow confusing duplicate data [18:38:28] binasher, Reedy: ar_log_id will be a dupe of logging.log_id, and will be generated by WikiPage::doDeleteArticleReal. Is that what you were wondering would auto-increment? Also, should those other fields, ar_log_timestamp, ar_log_user, ar_log_user_text, and ar_log_comment be added, or should we just let people query the logs separately for that data? [18:40:05] right now, i'm not adding anything that would run queries with those fields in the WHERE clause, but they could be properties returned by ApiQueryDeletedrevs [18:44:54] leucosticte: wait, the bugzilla ticket proposed adding an ar_id column as the pk [18:45:18] ar_log_id != pk [18:45:47] binasher: right, and then there was going to be ar_log_id as extra info, in case someone wanted to filter or sort by deletion action [18:46:08] would ar_id be auto increment? [18:46:51] I was going to use ADD COLUMN ar_id int unsigned NOT NULL PRIMARY KEY AUTO_INCREMENT, [18:46:53] binasher: [18:49:57] ok [19:19:00] Reedy: were there any updates on the fatal errors of http://meta.wikimedia.org/w/index.php?title=Translations:Fundraising_2012/Translation/Ways_to_give/21/ru&action=edit ? [19:19:57] I didn't do anything about it... [19:20:41] I wonder if it's one of audes commits.. [19:22:40] Nikerabbit: https://bugzilla.wikimedia.org/show_bug.cgi?id=41303 https://gerrit.wikimedia.org/r/#/c/29597/ [19:40:10] Reedy: Nikerabbit it's the TextContent constructor one [19:40:10] it expects a string [19:40:39] https://gerrit.wikimedia.org/r/#/c/29597/ [19:41:01] i made it a little nicer for these use cases [20:23:57] AaronSchulz: Any idea why populateFilearchiveSha1.php would seemingly just stop on enwiki? id 1540046 done (up to 2504918), 61.481% [20:28:38] not really [20:33:41] Reedy: does the script run as wikiadmin?
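binasher's requirement above is that without a primary key (the proposed ar_id), online schema-change tools can't safely operate on the archive table; in MySQL the fix is the one-line ALTER quoted at 18:46:51. A toy illustration using Python's bundled SQLite, which cannot add a PRIMARY KEY via ALTER TABLE and so needs a rebuild-and-copy; the column set is a simplification of the real table:

```python
import sqlite3

# MySQL one-liner from the discussion above:
#   ALTER TABLE archive ADD COLUMN ar_id int unsigned NOT NULL PRIMARY KEY AUTO_INCREMENT;
# SQLite stands in here and can't add a PRIMARY KEY via ALTER TABLE,
# so we rebuild the table and copy rows across instead.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE archive (ar_namespace INTEGER, ar_title TEXT)")
cur.executemany("INSERT INTO archive VALUES (?, ?)", [(0, "Foo"), (1, "Bar")])

# Rebuild with a surrogate auto-increment key, copy, then swap names.
cur.execute("""CREATE TABLE archive_new (
    ar_id INTEGER PRIMARY KEY AUTOINCREMENT,
    ar_namespace INTEGER,
    ar_title TEXT)""")
cur.execute("""INSERT INTO archive_new (ar_namespace, ar_title)
               SELECT ar_namespace, ar_title FROM archive""")
cur.execute("DROP TABLE archive")
cur.execute("ALTER TABLE archive_new RENAME TO archive")

rows = cur.execute("SELECT ar_id, ar_title FROM archive ORDER BY ar_id").fetchall()
print(rows)  # → [(1, 'Foo'), (2, 'Bar')]
```

This rebuild-and-copy pattern is roughly what online schema-change tools do behind the scenes, except they chunk and sync rows via a unique key, which is exactly why the missing primary key blocks them.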
[20:34:10] Not sure [20:34:19] It was working fine for the first 61% [20:36:20] Reedy: is it on hume? [20:36:53] i was running it on fenari [20:37:02] tried it on hume to the same end [20:37:06] so you killed it? [20:37:07] Reedy: yo ... pm [20:37:10] I saw same wikiadmin processes go away [20:37:17] *some [20:44:14] hello [20:44:25] hi Ravingmad_ [20:44:37] i know this is not the best place to ask my question but someone might have the answer [20:44:43] can anyone give me information about "closed shell system" please ? [20:45:39] Ravingmad_: what kind of information do you need? if you just want a definition you can probably use Wikipedia or a search engine, right? :) [20:46:45] yep .. but i cant find a definition for it .. some two lines information about what the closed shell system .. who uses it and a simple idea on how to reach it ? [20:47:28] Ravingmad_: since that doesn't have to do with *Wikimedia* technology specifically you should probably find someplace else to ask this question [20:48:41] yeah i guess you are right .. however i cant find anyone in any chat room who can answer this .. can you recommend me a chatroom please ? [20:50:39] Ravingmad_: you should read http://catb.org/esr/faqs/smart-questions.html and you might want to try #linux - no guarantees but they can probably point you somewhere. [20:50:58] thanks very much :) [20:51:07] You're welcome. Good luck [20:52:15] sumanah: It looks like a 4chan joke ;-) [20:52:41] multichill: What does? [20:53:34] "Closed Shell Systems", see http://i.imgur.com/Zay5g.png [20:53:52] thx multichill ill check the link now :) [20:54:18] what's this :) ? [20:55:35] I'm ... whatever the opposite of enlightened is. I'm enblightened. [20:55:39] Thanks multichill [21:07:14] Exception from line 104 of /usr/local/apache/common-local/php-1.21wmf2/includes/WikiPage.php: Invalid or virtual namespace -1 given. [21:07:16] Why is this happening again? [21:48:16] mutante: ☕?
[21:55:32] that smart-questions guide has some interesting perspectives. E.g., it says to ask questions to mailing lists rather than individuals. My thought was always that if my question turned out to be dumb, then it would limit the annoyance of others and my own embarrassment if I'd only sent it to one person. And if the docs were on a wiki, I could update them myself with the answer, so it wouldn't be necessary to give the whole [21:55:33] "the complete picture of what questions are asked most often" [22:34:09] gn8 folks [23:04:05] fsck the caches