[00:17:03] in CentralNotice, country names are used in the language of someone's preferences; does anyone know how I can use them elsewhere too
[00:17:03] ?
[00:17:36] Presumably via CLDR?
[00:17:58] how does that work?
[00:18:15] Where are you trying to use them?
[00:18:24] It's all in PHP code...
[00:19:18] I want to create a multilingual table on Commons
[00:20:20] I suspect then the answer is currently some variation of "you can't"
[00:20:33] ah :p
[00:20:52] couldn't find it in translatewiki
[00:22:03] File a bug?
[00:22:38] It could possibly be exposed via JS or some parser function type thing
[00:22:44] Though, it wouldn't cache so well
[00:24:28] under which product should I place it? MediaWiki or MediaWiki extensions?
[00:24:54] File it as an enhancement for MediaWiki extensions -> CLDR
[00:25:07] as that's where the data comes from
[00:31:13] Reedy: https://bugzilla.wikimedia.org/show_bug.cgi?id=53786
[00:35:21] thanks for the help
[06:54:30] hi! is there any way to get statistics about how many users are using a particular gadget? (for sv-wikt)
[06:55:58] yeah, you can ask a toolserver user to run a query
[06:56:39] what query?
[06:57:08] (I'm a toolserver user)
[06:57:16] * legoktm looks
[06:58:28] select count(*) from user_properties_anonym where up_property='propname';
[06:59:08] where propname is one of https://dpaste.de/SKcUU/raw/ (that's from svwiktionary_p)
[06:59:42] oh wait
[06:59:44] it would be like
[06:59:50] select count(*) from user_properties_anonym where up_property='propname' and up_value=1;
[07:01:05] ok thanks. I haven't been using mysql on toolserver very much, but I think I'll figure it out
[07:01:53] i can run the query if you want
[07:02:03] but it would be $ sql svwiktionary_p
[07:02:06] then type in the query
[07:02:08] and wait :)
[07:06:08] so: does select count(*) from user_properties_anonym where up_property LIKE 'gadget-avaktivera_nytt_uppslag'; produce the number of users using that gadget?
[07:08:13] select count(*) from user_properties_anonym where up_property = 'gadget-avaktivera_nytt_uppslag' and up_value=1; <-- should do it
[07:10:08] yeah... all of them had up_value=1 anyway
[07:10:15] what does ts_user_touched_cropped mean?
[07:15:28] i think that's a randomizer or something
[07:15:41] oh no
[07:16:01] that's the value of "user_touched"
[07:16:01] but it's cropped to prevent privacy
[07:16:01] er, for privacy
[07:16:01] that way you can kind of select to use results for only current accounts or something
[07:16:05] what does "cropped" mean here?
[07:16:34] is that more or less when the user was last active? or when s/he changed the pref?
[07:17:09] https://www.mediawiki.org/wiki/Manual:User_table#user_touched
[07:17:26] normally that's a full timestamp, but it is cropped to YYYYMM
[07:17:52] it sure seems to include DD (YYYYMMDD)
[07:18:54] oops yeah, i misread
[07:21:55] sorry... my program crashed
[07:22:45] but I don't get why so many ts_user_touched_cropped have 20130406 (https://dpaste.de/uys1k/raw/)
[07:22:50] it doesn't make sense
[07:24:12] that's when they last updated their preferences i guess?
[07:24:57] the sample size is that big, but why would everybody update at the same time?
[07:27:56] no clue
[07:28:05] note that it also updates when a user receives a message on their talk page
[07:28:07] maybe it's just that it goes 5 months back and no more
[07:28:36] can a tech please fix the database problem on commons....?
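To get the figures for every gadget at once, rather than counting one property at a time, the per-gadget totals can be grouped in a single query. This is only a sketch in the spirit of the queries above: it assumes the same anonymised toolserver view (user_properties_anonym) and that gadget preferences all carry the 'gadget-' prefix, as in the dpaste list. It would be run the same way, via $ sql svwiktionary_p and pasting the query.

    -- sketch: number of users with each gadget enabled, largest first
    SELECT up_property, COUNT(*) AS enabled_users
    FROM user_properties_anonym
    WHERE up_property LIKE 'gadget-%'
      AND up_value = 1
    GROUP BY up_property
    ORDER BY enabled_users DESC;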
[07:29:02] so those haven't *actually* edited on 20130406 - they just haven't edited since
[07:33:41] nope - at least for other wikis it goes back further
[07:36:13] i cannot delete a picture *lalalalal*
[07:36:22] commons db broken :/
[07:42:20] Steinsplitter: including the error would be extremely useful
[07:45:31] https://bugzilla.wikimedia.org/show_bug.cgi?id=53770 The target filename is invalid
[07:46:44] we have had a lot of such problems in the last few weeks (and this is indeed a blocker)
[07:51:08] p858snake|l: why does wmf not fix these problems. i am angry...
[07:51:31] the techs are deleting the image manually from the db... (the last time)....
[07:51:45] but we need to fix the problem
[08:29:25] another new media storage bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=53768
[08:30:29] increase of internal_api_error_DBQueryError (moving files),
[08:30:29] internal_api_error_FileBackendError (deleting files) (all via API) at Wikimedia
[08:30:29] Commons
[08:30:30] :/
[08:36:10] Steinsplitter: I attached the server-side exceptions to the bug report
[08:50:31] hashar: :-)
[08:56:20] Steinsplitter: not much I can do
[08:57:09] hashar_: i know
[09:08:24] hashar: is this ok? O_o https://ganglia.wikimedia.org/latest/?r=month&cs=&ce=&c=Miscellaneous+eqiad&h=lanthanum.eqiad.wmnet&tab=m&vn=&mc=2&z=medium&metric_group=ALLGROUPS
[09:08:54] Nemo_bis: nope
[09:09:12] Nemo_bis: thank you very much for noticing that issue
[09:09:20] Nemo_bis: that is a kernel bug I am afraid :(
[10:03:39] hello, I have a recurring problem where connections to the resource loader on bits.wikimedia.org fail to complete. is this a known issue and is there anything that can be done about it?
[12:43:52] bugzilla admin around? :-)
[12:45:14] Steinsplitter: andre__ maybe. Otherwise you can file a bug against the Bugzilla product in bugzilla :-]
[12:45:40] Steinsplitter, yes, around
[13:18:18] lag lag lag
[13:23:15] where?
[13:25:35] @lag all
[13:28:01] closedmouth: that might be paravoid's accident, but it shouldn't be anything major
[13:39:22] hi, thumbnail server is down?
[13:41:01] what?
[13:41:31] https://commons.wikimedia.org/wiki/Special:NewFiles does not show one single file
[13:41:41] paravoid: ^^
[13:41:41] From Europe, at least
[13:42:05] * does not show one single *thumb*
[13:42:37] looking
[13:43:14] http://commons.wikimedia.org/wiki/Special:NewFiles
[13:43:25] error generating thumbnail :(
[13:43:58] it doesn't show thumb images
[13:44:02] de.wp users say they have had no thumbs since 15:00 UTC+2
[13:44:40] only newly uploaded images
[13:44:43] FWIW some old files are also affected https://commons.wikimedia.org/wiki/File:DWD_Erfurt-Weimar_1988_10554.svg
[13:44:50] yep, "error generating thumbnail" for me too
[13:45:22] :(
[13:46:31] Fix it!
[14:00:31] Fix it! Whatever happened, it happened at 12:43, 5 September 2013
[14:02:36] https://commons.wikimedia.org/w/index.php?title=Special:NewFiles&dir=prev&offset=20130905124243
[14:06:46] it's being investigated in #wikimedia-operations, it seems.
[14:13:24] fixed, thanks paravoid
[15:02:10] apergos, parent5446: hello
[15:02:16] hello
[15:02:36] hey
[15:03:25] so, i started testing trwiki, but it's 100 GB of uncompressed XML, so it's going to take a while to run it
[15:03:45] right
[15:04:55] i also tested current dumps, and i think that the old primitive way (LZMA separate) is probably the best option
[15:05:12] for current, seems likely
[15:05:23] with LZMA in groups, updating the dump would require quite a lot of recompression
[15:05:34] it sure would
[15:05:58] and zdelta is only slightly better than LZMA separate
[15:06:28] yeah I can't imagine it would make much difference
[15:07:24] what hardware are you running on?
[15:10:06] I ask because on our hardware we get through the entire tr wiki dump from scratch (well with prefetch) in < 1 day
[15:10:13] my home PC, it's a quad-core intel i5 2.8 GHz
[15:10:21] memory?
[15:10:50] the recompression step (bz2 -> 7z) is about 5 hours
[15:10:53] 8 GB, but the dumps don't use that fully
[15:10:59] no they wouldn't
[15:11:21] they are intended to be, if not efficient, at least not dreadfully inefficient, so we can run several on a box
[15:12:10] anyways lemme know if you want me to build and run something, I can stick it on our larger server in tampa which is currently idle
[15:12:27] though until today, i had all nodes of indexes in memory, which won't work for these big dumps
[15:12:36] nope! :-)
[15:12:52] which is why benchmarking these is going to give us some good data
[15:13:01] they are the ones that are most painful for everyone after all
[15:13:12] yeah
[15:13:16] (tr wiki isn't that huge, it's sort of mid-sized)
[15:14:11] though what i'm doing now is creating the whole dump from scratch, which won't be the usual situation
[15:14:19] updating an existing dump should be much faster
[15:14:24] (but that's the whole point)
[15:14:27] right
[15:14:59] however if disaster strikes we need to know we can generate them without taking 3 months
[15:15:08] right
[15:16:23] is there anything you can work on in the meantime while you wait for the tr tests?
[15:16:36] yeah, what i want to test next is how LZMA with groups behaves for updates
[15:16:49] do you know of a small but active wiki that i could use?
[15:17:02] (tenwiki doesn't change, AFAIK)
[15:19:00] right, tenwiki is closed
[15:19:01] mm
[15:20:14] it doesn't have to be as tiny as tenwiki, but trwiki is way too big
[15:21:07] el.wikinews, it has a few changes every day anyways
[15:21:17] non latin script which you might not love
[15:21:42] <^d> apergos: Yo. What are you guys looking for?
[15:21:59] small wiki with people that make edits regularly, for dump testing purposes
[15:22:31] mediawiki?
[15:22:35] the script shouldn't be a problem, i don't need to read it
[15:23:11] <^d> apergos: simplewiktionary
[15:23:14] es wikiversity is another good one
[15:23:40] <^d> Has daily edits, < 30k total pages across all namespaces.
[15:24:01] that's even smaller, sure
[15:24:12] <^d> Reedy: mw.org is kinda medium sized. It has > 100k pages.
[15:24:20] <^d> Like 130k-ish, I believe.
[15:24:25] mw is too big for this
[15:24:35] https://noc.wikimedia.org/conf/highlight.php?file=small.dblist
[15:24:54] what determines 'small'?
[15:25:09] <^d> If it's not big.
[15:25:15] :-/
[15:25:42] ss_total_pages < 10000
[15:25:50] i guess elwikinews should work okay for this
[15:25:53] ah ha
[15:26:21] Not sure when it was last updated. But it won't be too far out
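The "small" cutoff quoted just above comes straight from the site statistics, so it can be checked directly for any candidate wiki. A throwaway sketch, assuming the replica for that wiki (say elwikinews_p) exposes the standard MediaWiki site_stats table and that the 10000-page figure really is the small.dblist threshold:

    -- sketch: does this wiki fall under the small.dblist cutoff?
    SELECT ss_total_pages, ss_total_pages < 10000 AS is_small
    FROM site_stats;

A wiki returning is_small = 1 would be in the same size bucket as the ones on small.dblist.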
[15:26:28] if not es wikiversity or simple wiktionary
[15:26:40] <^d> simplewiktionary is my favorite data to test with.
[15:26:41] oh, not a daily cron?
[15:26:44] <^d> I can still read it :p
[15:26:53] I usually test with non latin script so el something
[15:26:55] <^d> Decent enough data sample to work with.
[15:29:27] sumanah: You should try using SASL so that your cloak is applied immediately. Although it doesn't matter too much unless your sensitive about privacy.
[15:29:35] *you're
[15:29:56] parent5446: you are right! I should
[15:30:06] once upon a time I looked it up and it was too much hassle, but I should try again
[15:30:32] * sumanah looks up "xchat sasl"
[15:30:43] Mhm. It's a hassle, but once it's set up you don't have to deal with it.
[15:31:13] akosiaris: etherpad down, do you know why?
[15:31:23] Oh, hey, Sumanah.
[15:31:36] hi AdamCuerden. Thank you for your recent Signpost work! Has it been enjoyable?
[15:31:39] apergos: hmm, the zdelta version is still running out of memory for trwiki, i have to look at that
[15:32:03] otherwise, i can't think of anything else for today
[15:32:27] I got nothin
[15:32:29] parent5446: ?
[15:33:18] Yes, actually
[15:33:38] !log etherpad.wikimedia.org down (error 503)
[15:33:41] Logged the message, Master
[15:33:44] I mean, you get access to a lot of things through it.
[15:34:08] Nope I'm good
[15:34:34] And I've come up with a plan for slow tech weeks like this one
[15:34:43] svick, apergos: See you both tomorrow.
[15:34:53] ok, see folks
[15:34:55] bye
[15:36:16] AdamCuerden: oh? I'd like to hear more
[15:36:44] AdamCuerden: a couple years ago, Harry Burt interviewed the GSoC students during their internships - that might be an interesting thing to do for a few of the weeks in Sept
[15:36:49] Well, there are a lot of underused features on Wikipedia, I think you'll agree
[15:37:01] Yes
[15:37:16] So I'm getting people who are experts on them to write tutorials.
[15:37:27] Coooooool
[15:38:46] This week has a tutorial on Accessibility; a future Signpost - depends on how much tech news there is in the next few weeks - is going to handle Guided Tours
[15:40:36] Nice!
[15:40:41] Also planning to corner a LilyPond developer and get a clear tutorial on importing LilyPond into Wikipedia; the format is *nearly* identical, but you have to reformat some of the headers slightly.
[15:40:56] good idea!
[15:41:12] I expect that one to get a copy-paste into help files
[15:42:28] You know, I can't help but think you're damn good at finding people to do things who'll actually care about doing it well. =)
[15:43:11] You're too kind. :-)
[16:40:56] Elsie: are you able to load http://status.wikimedia.org/ ? does it say something of interest?
[16:41:50] works for me
[16:42:01] everything is "Service is operating normally"
[17:10:27] manybubbles: will you be in SF next week, week after?
[17:10:53] chrismcmahon: next week including monday after lunch
[17:12:39] manybubbles: great, I'm hoping we can talk about browser tests with Zeljko too. I've been running some CirrusSearch tests this morning to get caught up. I found at least one glitch, I don't think the search for "JavaScript disabled" is giving the result you expect.
[17:13:43] chrismcmahon: cool! that one is problematic because chad merged it and we let the fix for it languish forever. the fix is still waiting in gerrit for a conversation next week
[17:13:59] but I'd love to talk in person about them
[17:15:23] manybubbles: I'd still like to think about some refactoring now for future maintenance, but you've been moving pretty quickly, I'm just hand-waving right now, I'll have some concrete points for next week.
[17:16:14] chrismcmahon: I certainly understand. I'm adding regression tests for everything as I go and I spend all day fixing things so I'm probably hard to keep up with.
[17:18:30] manybubbles: the Language team has been doing browser tests for UniversalLanguageSelector, and we're slated to talk about design with them next week as well. I'm looking forward to it, it's nice to see this framework being used by other people. I'd like that to continue long into the future.
[17:41:44] anyone know how to deal with an "unpacker error" on git review? " ! [remote rejected] HEAD -> refs/publish/master/bug/35981 (n/a (unpacker error))"
[17:42:15] RoanKattouw: ^
[17:42:30] WTF
[17:42:35] I've never seen that one before
[17:42:39] Me neither
[17:42:42] (happened to me too)
[17:43:12] StackOverflow suggests permissions issues on the git server
[17:43:16] * RoanKattouw goes to take a look on ytterbium
[17:43:54] <^d> No, that's not it.
[17:43:58] RoanKattouw: Yeah, all the stuff I found on Google suggests the same
[17:44:01] <^d> I'm already looking.
[17:44:04] OK
[17:44:17] <^d> demon@ytterbium:/var/lib/gerrit2/review_site/logs$ grep -i 'missing unknown' error_log | wc -l
[17:44:17] <^d> 353
[17:44:29] <^d> ^ That's what's happening.
[17:45:27] Thanks for investigating Chad
[18:02:39] <^d> RoanKattouw: So, gerrit's complaining about particular sha1s missing. Seems to coincide with people's various complaints.
[18:02:57] <^d> Refreshing a few times seems to make them appear, can't replicate 100% of the time.
[18:03:47] <^d> Ah dammit, I see what I did wrong.
[18:04:11] <^d> I wasn't replicating refs/*:refs/*, I was defaulting with mirror => true.
[18:04:13] <^d> Insufficient.
[18:04:15] <^d> Grrr.
[18:04:17] <^d> How to fix.
[18:04:35] <^d> Ah, I know.
[18:13:19] <^d> Well that didn't work.
[18:13:47] <^d> But they all seem to work after sufficient reloading.
[18:13:56] * ^d blames a cache
[18:24:47] By the way, anyone have any code they'd like to have featured in next week's Tech report?
[18:25:06] AdamCuerden: btw, the Campaign stuff is going to be deployed next week.
[18:25:38] Awesome. I think that'll be worth a writeup when it launches.
[18:25:46] I mean, the big writeup, at the top
[18:30:04] It'll affect quite a number of people, eventually.
[18:31:51] It will depend on when it launches, mind: I need enough time to write it up once I can play with it =)
[18:32:03] But that's basically which of two weeks
[18:34:59] AdamCuerden: :D it will go live on commons on Tuesday
[18:35:32] AdamCuerden: I heard you were planning on doing a feature on the GuidedTour extensions?
[18:40:05] Yes
[18:40:17] That'll be the next week without a big release
[18:40:24] ah, nice
[18:41:10] I figure that there's not going to be a major release every week to lead with, but there's a ton of underused features out there.
[18:41:41] So why not?
[18:41:52] AdamCuerden: makes good sense!
[18:42:08] The one disadvantage is I have to find out about them, but...
[18:42:19] hang around on IRC enough.... :D
[18:42:22] Ay
[18:42:34] AdamCuerden: you should probably hang out in #wikimedia-dev too
[18:42:47] Good idea
[18:43:28] And, of course, I watch the tech reports and the mailing lists
[18:44:16] The only fault in the tech reports for my purpose is that they contain a lot of things that aren't necessarily big enough for a general audience.
[18:44:41] AdamCuerden: true, but then they won't miss out on the big things
[18:44:45] Well, that and I sometimes need slightly more information.
[18:44:46] Aye
[18:44:57] Mind you...
[18:45:11] That said, I have gotten some good tips that aren't on there.
[18:45:21] like GuidedTour?
[18:45:33] Well, that wouldn't be expected to be on there.
[18:45:43] yeah, been there for a while
[18:45:49] But they don't cover things like gadgets or site-specific things.
[18:46:06] AdamCuerden: aaah, true.
[18:46:14] So they're a great start, but I still need to ask around. =)
[18:46:36] Reminds me. I should check for Wikidata news as well
[18:46:56] I'm cheating on that one. I just read Gerard's blog
[18:47:11] hah!
[18:47:20] they've a chat channel too
[18:56:08] What is their channel?
[18:56:22] I should probably add them as a source
[18:56:27] AdamCuerden: let me find, moment
[18:56:52] AdamCuerden: #wikimedia-wikidata
[18:57:10] AdamCuerden: something I can help with?
[18:57:15] <- Wikidata admin.
[18:59:12] Yes hi, how's it going?
[19:00:46] Moe_Epsilon: legoktm AdamCuerden is doing the Signpost tech reports, and is trawling around to catch new stories.
[19:01:03] I know :)
[19:01:56] AdamCuerden: https://www.wikidata.org/wiki/Wikidata:Status_updates#footer
[19:02:08] they're published every friday
[19:03:55] Ooh, that's really useful
[19:04:10] I'm trying to keep updates on Wikidata in the signpost
[19:04:23] Sorry, was talking to Gerard =)
[19:04:33] :D
[19:06:01] AdamCuerden: you also probably want to talk to Lydia_WMDE, she knows about all the stuff that's going on. Main thing right now is the url datatype is being tested on test.wikidata.org, and will soon (hopefully!) be live on wikidata itself
[19:07:05] I take it the URL datatype will allow, for example, one to set a standard value for someone's main website on themselves?
[19:07:54] ...You know what? I don't think the Signpost has ever really covered, in simple terms, what Wikidata is.
[19:08:22] Or, if it has, it's long enough ago that it's time for a refresher
[19:08:34] * AdamCuerden pencils that in as a guest article to seek out.
[19:08:37] we're approaching the 1 year anniversary very quickly
[19:08:39] :D
[19:08:48] When is the 1 year anniversary?
[19:08:58] Revision as of 15:50, 25 October 2012 (edit)
[19:09:15] Right. I'm just going to pencil that into my notes.
[19:09:51] url datatype will allow any property to be created that links to a url. so a property might be "website". mainly this is going to be used for references, so we can source webpages
[19:09:54] You realise, though, that by telling me that, you've delayed the Wikidata spectacular about a month.
[19:10:12] now you just have to make it more spectacular ;)
[19:10:14] *nod* Well, I did figure it would have a variety of uses.
[19:12:24] I'm putting in the Wikidata datatype news, by the way
[19:12:52] :)
[20:10:14] Reedy: Can you confirm that you deployed what YuviPanda wanted you to?
[20:10:22] marktraceur: yes!
[20:11:24] Sexy.
[20:11:27] greg-g: ^^
[20:23:01] * greg-g nods
[20:32:09] Does anyone have know what the file maintenance/dictionary/mediawiki.dic is for?
[20:32:55] Reedy: ^
[20:33:11] sorry about the bad grammar :)
[20:33:55] o_0
[20:33:58] I like to end my sentences with prepositions for for for
[20:46:18] kaldari: If I had to guess, captcha images *or* random image names?
[20:47:21] Hm, no
[20:47:22] that sounds like a far-fetched guess, but I have no idea
[20:47:31] I hadn't looked at the file
[20:47:37] That looks ... odd
[20:47:42] And it's not referenced anywhere
[20:47:53] and it has a bunch of non-core stuff in it
[20:48:00] it's very mysterious
[20:48:07] NSA
[20:48:11] * greg-g rus
[20:48:13] +n
[20:48:24] (they tried to steal my n)
[20:48:39] must be a list of back door passwords
[20:49:10] I'll just delete it
[20:49:14] whew
[20:49:42] Is it tracked?
[20:50:00] Huh, it is
[20:50:41] and all of a sudden, my laptop is running slow (took about 30 seconds to load gedit, I have a freaking ssd man)
[20:57:21] apparently mediawiki.dic is just a dictionary file for IDEs
[20:57:29] it's not actually used by MediaWiki itself
[20:57:37] marktraceur: ^
[20:57:48] Hah
[20:57:57] kaldari: Someone checked it in accidentally?
[20:58:05] marktraceur: yurik checked it in intentionally
[20:58:09] heh
[20:58:14] Pffahahaha
[20:58:27] marktraceur: apparently every other person here uses phpstorm, which uses that
[20:58:39] "every other person"?
[20:58:50] 1 in 2
[20:58:53] Show of hands?
[20:58:54] or so i was told.
[20:59:07] never heard of it :)
[20:59:10] * marktraceur plays a recording of crickets
[20:59:11] <^d> This totally sucks.
[21:00:09] so, kaldari, the disambiguations thing can be merged, right?
[21:00:11] <^d> Anybody around who's good with git?
[21:00:21] MatmaRex: should be
[21:00:26] ^d: yeah, ^d is
[21:00:27] ^d: oh wait...
[21:00:34] okay, let me just verify and merge
[21:00:41] Yay!
[21:00:43] gah
[21:00:47] git review fails
[21:00:50] * MatmaRex glares at ^d
[21:01:10] <^d> Well git review is always full of fail, but yes I know.
[21:01:18] let's do it the old-fashioned way, then
[21:04:02] actually, kaldari
[21:04:12] kaldari: can you fix the link in release notes to be https? :D
[21:04:20] sure...
[21:04:41] kaldari: and what is that "Dependency: " thing in commit message for?
[21:04:46] <^d> greg-g: What do we do when I don't know what to do?
[21:05:02] ^d: rm -rf
[21:05:10] ^d: we ask joey hess?
[21:05:10] :)
[21:05:16] it creates a dependency for the merge
[21:05:37] <^d> I asked #lame_upstream_channel but they didn't respond.
[21:05:42] in this case making ProofreadPage not depend on the core code if I remember
[21:06:13] kaldari: as in, gerrit handles this?
[21:06:45] yeah, either gerrit or jenkins
[21:07:05] hm. nice, i didn't know about it.
[21:07:25] shit that http->https pushes the char count for the column to 81 :)
[21:07:39] ignooore it
[21:07:49] or change ") (" to "; "
[21:08:11] or drop the www. :D
[21:08:12] I don't want Krinkle to be mad :)
[21:08:20] I just wrapped the line
[21:08:32] * YuviPanda makes a commit with 119 chars
[21:09:01] 119?!! Inconceivable
[21:09:13] pep8 has been modified to say up to 120 is okay :P
[21:09:13] so
[21:09:37] kaldari: +2-d
[21:09:42] * MatmaRex hides
[21:09:47] I'm not going to be the first person to commit a > 80 char line to the Release Notes :P
[21:09:57] I'll leave that to someone braver
[21:10:14] Reedy, what about a special pages manual update :P
[21:10:27] last update was 24th Aug
[21:10:48] 100, not 120
[21:10:53] kaldari: the up to 120 isn't applicable to non-code anyways :)
[21:11:00] kaldari: you wouldn't be, i think
[21:11:40] e.g. the 20th line has 96 characters :D
[21:11:48] yikes ;)
[21:12:16] the longest is 102
[21:13:27] it feels good to remove 230 lines from core :)
[21:14:24] :)
[21:14:32] :D
[21:14:48] kaldari: while we're removing code...
[21:15:05] kaldari: https://gerrit.wikimedia.org/r/#/c/80716/ ?
[21:16:13] MatmaRex: Can you add a TODO comment to the Nostalgia extension?
[21:19:56] there are skins that don't use SkinTemplate?
[21:28:03] ^d: gerrit is still having issues?
[21:29:09] <^d> We're still trying to figure things out, yes.
[21:29:30] ok :/
[21:48:15] AdamCuerden: sorry - am traveling with bad internet - as others said url is most important atm and maybe the interview gerard did
[21:56:18] James_F: Just closed bug 6754 :)
[21:58:41] kaldari: Ha. :-)
[21:59:11] only took 7 years
[22:06:55] !b 6754
[22:06:55] https://bugzilla.wikimedia.org/6754
[22:07:33] Cool!
[23:37:47] +b *!*@wikimedia/Jamesofur$##FIX_YOUR_CONNECTION