[00:05:26] hello, how can i export a wikipedia article and import it to my mediawiki on my server? I have the latest mediawiki installed, however every time i try to export a wikipedia article and import it, the page displays with errors, due to the templates i believe. even when i choose to export including the templates i still get the same errors once i import it.
[00:19:31] I'm assuming some of those templates use lua, which won't export.
[07:46:20] who wants some bug spam? :) "Bug 9436 - Vertical writing support in MediaWiki" ( https://bugzilla.wikimedia.org/show_bug.cgi?id=9436 ) being done sure sounds like it would be useful, huh? ...
[09:04:05] Does the Dutch API, i.e. http://nl.wikipedia/w/api.php?action=query&prop=coordinates&format=json&colimit=500&generator=allpages&gaplimit=max, return the JSON response with field names in English? Because I see that the Dutch wiki doesn't return any coordinates on that API endpoint
[09:04:49] i.e. no a single article has the 'coordinates' field
[09:04:51] *not
[09:09:42] there's no such thing as a dutch API
[09:11:09] indeed :)
[09:15:03] from the top endpoints only the Dutch one doesn't give any coords
[09:16:18] (at least from the ones I've tested)
[09:17:48] the generator=allpages only generates pages from the specified language, right?
[09:23:58] Is there an API to get the number of articles from each language Wikipedia is available in?
[09:24:08] s/from/for
[09:49:34] Does anyone know what the throttle limits are on logins? :>
[09:51:47] heh, would you look at that, it tells you in the message ;p
[11:08:22] andre__: I made Ironholds post a screenshot of how it should look
[11:30:18] thanks
[12:17:31] andre__: is there a way to filter all bugmail from some components, disregarding other settings like being the reporter, cc or voter, or watching a user?
[12:19:54] Nemo_bis: filter in your bugmail? yes, by using the bugmail headers. subscribing in Bugzilla's UI?
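The coordinates scan discussed above can be sketched as follows. This is a minimal illustration, not code from the log: only the URL construction for `prop=coordinates` with `generator=allpages` is shown, and the helper name is invented. Two notes on the question itself: JSON field names in the MediaWiki API are English on every wiki (only page content is localized), and the hostname needs the full `nl.wikipedia.org` form.

```python
# Minimal sketch (not from the log) of the coordinates scan discussed
# above: prop=coordinates driven by generator=allpages. Only the URL
# construction is shown; fetching and JSON parsing are left out.
from urllib.parse import urlencode

API = "https://nl.wikipedia.org/w/api.php"  # note the full hostname

def coordinates_query_url(continue_params=None):
    """Build one request URL; pass the previous response's continuation
    values (e.g. gapcontinue) to page through all titles."""
    params = {
        "action": "query",
        "prop": "coordinates",
        "format": "json",
        "colimit": 500,
        "generator": "allpages",
        "gaplimit": "max",
    }
    if continue_params:
        params.update(continue_params)  # continuation from last response
    return API + "?" + urlencode(params)
```

Many pages simply carry no coordinates, so a sparse result from one wiki is not by itself evidence of an API problem.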
not yet, but I want that: https://bugzilla.wikimedia.org/show_bug.cgi?id=37105
[12:34:00] kaulen (bugzilla) being rebooted in 5 minutes.
[12:38:27] andre__: does it allow unwatching too?
[12:41:20] Nemo_bis, what does that mean?
[12:42:39] andre__: unwatching a component, so that I don't receive notifications about activity on bugs in that component that I'm cc'd to, etc.
[12:43:16] Ah. No, but an ignore list is an interesting idea that you could file upstream
[12:43:59] Is the pageid for an article the same between languages? Does the https://en.wikipedia.org/wiki/NASA article from the EN wiki have the same ID as the https://hu.wikipedia.org/wiki/NASA article from the Hungarian wiki?
[12:44:05] Nemo_bis, though I wonder where a good place would be; it's rather a bmo extension than a real upstream one. I normally refer to https://bugzilla.mozilla.org/show_bug.cgi?id=634531
[12:45:15] BadDesign: no, no
[12:45:22] bugzilla back up
[12:46:25] andre__: I'll just use email headers then, hoping I will resist looking into the trash :)
[12:46:34] hehe
[12:46:56] hmm, almost 20k messages in trash, may be time for a cleanup
[13:03:08] py
[13:35:54] Is there a generator that only returns articles no all pages? That is, for English Wikipedia I want the 4,281,923 articles, not the 30,647,182 that generator=allpages returns; https://meta.wikimedia.org/wiki/List_of_Wikipedias#All_Wikipedias_ordered_by_number_of_articles
[13:36:00] *not all pages
[13:38:26] generator? err
[13:38:33] not sure what you're after; might it be {{NUMBEROFARTICLES}}?
[13:38:51] in the http://en.wikipedia.org/w/api.php?action=query&prop=coordinates&format=json&colimit=500&generator=allpages&gaplimit=max
[13:39:01] API request
[13:39:33] generator=allpages doesn't return 4 million articles
[13:39:41] it goes beyond that
[13:39:49] oh sorry, no idea
[14:07:21] I can clearly see that it goes beyond 4.2 million articles, because the count of HTTP requests times 500 gives a number beyond 4,281,923
[14:26:41] wikitech is gone?
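The articles-vs-all-pages question above has a two-part answer in the API: `meta=siteinfo` with `siprop=statistics` reports each wiki's official article count directly, and `generator=allpages` can be restricted to the main namespace with redirects filtered out, which gets much closer to "articles" than the raw 30M+ page set. A sketch with hypothetical helper names:

```python
# Sketch (hypothetical helpers) of the two queries that address the
# articles-vs-all-pages question above. Only URL construction is shown.
from urllib.parse import urlencode

def siteinfo_stats_url(host):
    """The wiki's own article count (and other stats) via meta=siteinfo."""
    params = {"action": "query", "meta": "siteinfo",
              "siprop": "statistics", "format": "json"}
    return "https://%s/w/api.php?%s" % (host, urlencode(params))

def main_namespace_pages_url(host):
    """generator=allpages limited to namespace 0, skipping redirects,
    which approximates 'articles' rather than all pages."""
    params = {"action": "query", "generator": "allpages",
              "gapnamespace": 0, "gapfilterredir": "nonredirects",
              "gaplimit": "max", "format": "json"}
    return "https://%s/w/api.php?%s" % (host, urlencode(params))
```

Running `siteinfo_stats_url` per language against the list on the Meta page linked above would also answer the 09:23 question about per-language article counts.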
[14:26:45] I get a 404
[14:27:02] https://wikitech.wikimedia.org/wiki/ >> The requested URL /wiki/ was not found on this server.
[14:28:04] petan: http://www.downforeveryoneorjustme.com/https://wikitech.wikimedia.org/wiki/
[14:28:20] petan: use the HTTP version
[14:28:27] https is not working on that site
[14:28:52] lol BadDesign
[14:28:54] that site sucks
[14:29:01] it's apparently down for everyone
[14:29:06] just that site is claiming it's up
[14:29:14] how does it check it, by doing a ping? :D
[14:34:20] dunno
[14:36:00] most probably a HEAD request
[14:48:00] Hrm.
[15:03:15] petan, re your email, did you ask freenode for a higher limit?
[15:03:26] they used to give those out if you asked nicely and could justify it
[15:04:06] What's he after? A higher join limit?
[15:23:44] Elsie, yes, for wm-bot
[15:24:08] The GCs should be able to request that...
[15:24:16] If you can't find a useful staff member.
[15:28:18] do gadgets still have problems?
[15:28:33] As far as I am aware, no.
[15:28:39] If you find some, please report them
[15:28:50] sumanah: i'm just responding to the /topic
[15:29:09] ah
[15:40:58] Elsie: AFAIK petan works under the assumption that GCs don't exist or are totally inactive
[15:43:15] Nemo_bis lol what
[15:43:37] Nemo_bis where did you get that nonsense from? I am working with GCs a lot
[15:43:37] petan: this is what I remember from previous discussions with you
[15:43:40] ah, good to hear
[15:43:46] are you talking about the garbage collector?
[15:43:57] nope, unless that's my mind
[15:44:05] in that case I have no idea what you mean
[15:44:32] better so, must be very old information
[15:44:41] Thehelpfulone yes, I asked freenode many times; they told me it's technically impossible
[16:54:55] hello?
[16:54:59] excuse me?
[16:56:22] Music123_: Hi there. Can we help you?
[16:56:42] can i use autowikibrowser without registering?
[16:57:09] Why would you want to do that, Music123_?
[16:57:10] i want to change a category
[16:57:17] for eg.
category:a -> category:B
[16:57:24] uh.
[16:57:27] because
[16:57:44] category:a is the wrong name
[16:57:47] so move
[16:57:53] to category:B
[16:57:59] BUT
[16:58:02] Music123: you can do that as an anonymous editor, just manually.
[16:58:04] TOO MANY ARTICLES
[16:58:10] then... no, you should register.
[16:58:17] OMG
[16:58:33] um, same as other language wikipedias?
[16:58:43] is the rule the same?
[16:58:48] what rule?
[16:58:54] must register
[16:59:00] for awb
[16:59:12] to use AWB? I'd imagine so, if it works on other language Wikipedias, which I have not tried.
[16:59:30] how to set other language in awb
[16:59:31] Can you tell us why you don't want to log in to do this, Music123?
[16:59:38] uh.
[16:59:52] i actually have an id
[16:59:55] but
[17:00:08] i don't know how to register
[17:00:26] and i'm not a native English speaker
[17:00:38] (omg, ~er)
[17:01:53] um
[17:02:02] does it just log in to the english wiki?
[17:06:59] excuse
[17:07:01] me
[17:09:06] uh
[17:09:10] excuse me?
[17:12:22] abcMusic123: hi. have you already looked at https://en.wikipedia.org/wiki/Wikipedia:AutoWikiBrowser ?
[17:12:37] abcMusic123: there is a user manual you could read: https://en.wikipedia.org/wiki/Wikipedia:AutoWikiBrowser/User_manual
[18:07:09] upgradey time
[18:07:55] New MediaWiki deployment to enwiki?
[18:08:08] or is that phase 3?
[18:09:01] Everything non-'pedia
[18:09:03] Hi
[18:09:46] I was sent here from wikipedia-en-help
[18:10:53] I have this problem: sometimes when I try to copy-paste, the content is copied two times
[18:11:30] It also happens when copying using the mouse (right click + paste), not only ctrl-v
[21:37:07] greg-g or Reedy: Yo. WikiLove seems to be broken on Commons.
[21:37:14] I think it's wmf10-related.
[21:37:42] JS console errors?
[21:38:05] wee
[21:38:13] VE just went out, too
[21:38:27] https://bugzilla.wikimedia.org/show_bug.cgi?id=51399
[21:38:55] oh, on commons
[21:39:08] And test.wikipedia.org.
[21:39:33] https://git.wikimedia.org/commit/mediawiki%2Fextensions%2FWikiLove.git/a2a081819ff15d3c8039e753d1f92634457e6037
[21:40:04] hrm
[21:40:41] Off-hand, either that change is broken or there's a JS cache something going on.
[21:40:48] hrm
[21:41:04] that's from 6/29, should have been out there for a while... no?
[21:41:23] greg-g: The week of 7/4 was skipped, remember?
[21:41:23] NFI.
[21:41:25] or I guess no, the week of the 4th was a no-deploy week
[21:41:37] yeah
[21:41:37] alright then
[21:41:49] Hi Roan. :-)
[21:42:01] Howdy
[21:42:07] Can someone test that (reverting it) to see if it fixes the issue? able to reproduce locally anywho?
[21:42:33] s/Can/Will/
[21:42:55] Please?
[21:43:00] :)
[21:43:04] Getting from git.wm.o to gerrit.wm.o is painful.
[21:43:14] yeah, 'tis annoying
[21:43:22] https://git.wikimedia.org/blobdiff/mediawiki%2Fextensions%2FWikiLove.git/0675f00b28608bbf4aaec586343bfa735c4d0a45/WikiLove.hooks.php is the actual commit
[21:43:24] this is where I liked the unified experience of Launchpad
[21:43:41] https://git.wikimedia.org/commit/mediawiki%2Fextensions%2FWikiLove.git/0675f00b28608bbf4aaec586343bfa735c4d0a45 shows the Change-Id
[21:43:49] And so https://gerrit.wikimedia.org/r/#q,I6d2fa44550361ebda12c602c487d6a38bff1c479,n,z is the Gerrit link
[21:44:01] Oh, I was using the commit hash.
[21:44:11] Silly me.
[21:44:16] In Gerrit, searching for the commit hash should also work
[21:44:17] hrmmm
[21:44:25] But only the commit hash of the actual commit, not the merge commit (yes, this sucks)
[21:44:36] (Indeed.)
[21:44:52] It's worse that git.wm.o doesn't provide that link in its interface.
[21:45:09] Oh, Kaldari just commented.
[21:45:17] Who's able to revert and deploy?
[21:45:56] I can do it if someone will give me a deployment window. greg-g: ^
[21:46:51] kaldari: yeah, we're open now to 4pm
[21:47:22] I re-opened .
[21:48:03] thanks
[21:48:32] Yay regressions.
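The git.wm.o-to-gerrit.wm.o hop described above (find the Change-Id footer in the commit message, then search Gerrit for it) is mechanical enough to script. A hypothetical helper, using the `#q,<Change-Id>,n,z` URL form quoted in the log; the sample commit message in the usage note is illustrative:

```python
# Hypothetical helper for the git.wm.o -> gerrit.wm.o hop: pull the
# Change-Id footer from a commit message and build the Gerrit search
# URL in the #q,<id>,n,z form quoted above.
import re

def gerrit_search_url(commit_message):
    """Return the Gerrit search URL for a commit's Change-Id, or None
    if the message has no Change-Id footer."""
    m = re.search(r"^Change-Id: (I[0-9a-f]{40})\s*$", commit_message, re.M)
    if m is None:
        return None
    return "https://gerrit.wikimedia.org/r/#q,%s,n,z" % m.group(1)
```

As noted at 21:44, searching Gerrit for the plain commit hash also works, but only for the real commit, not the merge commit; going via the Change-Id sidesteps that.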
[21:58:47] https://gerrit.wikimedia.org/r/#/c/73871/ for those following along at home.
[21:59:12] James_F: https://twitter.com/danlev/status/356894986035396608/photo/1
[21:59:34] StevenW: Argh.
[21:59:36] RoanKattouw: ^^^
[21:59:41] He says same issue in Chrome, FF, Safari for him
[21:59:46] Logged in?
[21:59:52] StevenW: Yeah.
[21:59:54] Gadget issue?
[22:00:11] Potentially. He's [[User:Danlev]] if you want to reach out.
[22:00:29] StevenW: Roan says he's managed to try to break jQuery with a gadget.
[22:00:30] StevenW: He's running some sort of Gadget that's overridden $ with something that isn't jQuery
[22:00:45] Gadget or custom user JS, right?
[22:00:50] Or just gadget?
[22:01:01] danlev, this name is not new to me
[22:01:03] Or user JS
[22:01:42] kaldari: Want to talk about WikiLove for a second?
[22:01:44] yep, he's a nice user, worth contacting
[22:01:54] possibly in -dev
[22:01:57] sur
[22:01:59] sure
[22:02:34] Are you going to fix WikiLove's JS?
[22:02:40] It's clearly giving broken input
[22:02:46] yeah, just reverted the regression and was about to deploy it
[22:03:37] hoo: Which JS?
[22:03:42] PHP, actually
[22:03:52] https://gerrit.wikimedia.org/r/#/c/73871/
[22:03:52] The hooks file seems to have caused the recent regression.
[22:04:03] kaldari: Nah, the bug is in JS
[22:04:10] oh?
[22:04:10] PHP is acting totally right
[22:04:28] You're just using the bug the old version had in a very creative way
[22:04:57] hoo: would it be OK if we fix it after the revert is deployed?
[22:05:13] I guess so; the other bug probably wasn't critical, was it?
[22:05:18] No.
[22:05:25] The recent regression is high priority. The older issue can wait.
[22:05:26] Ok, go ahead then
[22:05:30] just want to get it back to semi-functional for now :)
[22:05:35] Right.
[22:06:01] hoo: You should prepare a better commit. :P
[22:06:22] Tests in this area might also be nice.
[22:06:28] hoo: Yeah, if you want to go ahead and patch it up correctly, I'll be happy to review it
[22:07:51] kaldari: Is it currently possible to give WikiLove to other users than the one whose page we're viewing=
[22:07:54] *?
[22:08:27] yes, but only via an API call, not via the WikiLove interface
[22:08:29] There's an API module.
[22:08:35] Drat, beaten.
[22:08:42] kaldari: Ok, but we don't have to care about that in the JS?
[22:08:52] correct
[22:09:28] the one tricky part is that we want to be able to give WikiLove from both the User page and the User Talk page, but it should always be posted to the User Talk page.
[22:09:32] https://bugzilla.wikimedia.org/show_bug.cgi?id=51399#c9
[22:09:41] And it should work from user subpages.
[22:09:44] And prevent self-love.
[22:10:43] Elsie: you anti-narcissist?
[22:10:57] I love me some self-love.
[22:11:07] good, so then
[22:14:18] I have a lot of virtual beer to clean up after this
[22:14:24] :P
[22:16:15] > echo Title::newFromText( 'User:User:Hoo' )->getBaseText();
[22:16:15] User:Hoo
[22:16:20] kaldari: That's why it worked
[22:16:27] Damn, I love those side effects :P
[22:17:03] ha, that's convenient
[22:17:36] jshint is crying, but I got a fix ;)
[22:17:52] lol, I imagine so. That code is pretty old
[22:20:34] in WikiYears, at least
[22:23:43] kaldari: I kept it adding User: automatically for b/c (theoretically it's possible to use the JS with multiple users... just nobody seems to do it, but unless we remove that, we should support it)
[22:23:52] Untested, but I scanned the code a lot
[22:23:57] https://gerrit.wikimedia.org/r/73883
[22:24:12] thanks!
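The Title::newFromText() exchange above is why the unconditional "User:" prefix happened to work: the title parser collapses the doubled namespace prefix as a side effect. A toy Python illustration, not the actual WikiLove code, of the idempotent prefixing such a fix would rely on instead:

```python
# Toy illustration (not the actual WikiLove code) of idempotent
# namespace prefixing: add "User:" only when it isn't already there,
# instead of relying on the title parser to collapse "User:User:Hoo".
def prefix_user(name, prefix="User:"):
    """Return the name with exactly one leading namespace prefix."""
    return name if name.startswith(prefix) else prefix + name
```

A real implementation would also need the case- and localization-insensitive namespace matching that MediaWiki's title parsing performs, which this sketch deliberately omits.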
[22:24:33] added myself
[22:30:09] OK, should be fixed on Commons and test.wiki now
[22:31:26] now to deploy to wmf9
[22:31:44] actually, never mind
[22:31:44] I'm done :)
[22:31:49] \o/
[22:32:26] greg-g: finished with deployment
[22:38:27] hoo: Unfortunately, I have a hard deadline of 5pm on some mobile code review I have to do, so I won't be able to review the WikiLove fix immediately. It looks fine to me; I mostly just need to test it, and also make sure the WikiLove features in the PageTriage extension still work correctly.
[22:38:52] Ok, I guess it's not urgent :)
[22:39:08] Might be able to look at it after 5
[22:39:12] code freeze isn't today anyway :P