[00:28:03] helderwiki, poke [00:28:10] hi! [00:28:35] good night Krenair :-) [00:29:26] helderwiki, want me to do https://en.wikipedia.org/wiki/MediaWiki_talk:Guidedtour-tour-twa1.js ? [00:29:50] sure! :-) [00:30:08] If you have the time, there are a few similar requests [00:30:14] https://en.wikipedia.org/wiki/Category:Wikipedia_protected_edit_requests [00:31:45] hm. Well as long as they're just replacing the use of deprecated functionality or something similarly uncontroversial, I can probably do some of those [00:32:24] yeah.. That batch of requests was mostly to update wgFoo to mw.config.get('wgFoo') in some scripts [00:33:28] one or two also have mw.util.wikiGetlink or some other deprecated method, but they are all about deprecations [03:58:44] I am trying to clear my watchlist, but it's so big that the transaction is failing. What can I do? [03:58:46] Request: POST http://en.wikipedia.org/wiki/Special:EditWatchlist/clear, from 10.64.0.105 via cp1053 cp1053 ([10.64.32.105]:3128), Varnish XID 1903534509 [03:58:49] Forwarded for: 2601:1:400:531:9cfd:387c:2b4e:4bf, 208.80.154.75, 10.64.0.105 [03:58:51] Error: 503, Service Unavailable at Tue, 15 Jul 2014 03:57:16 GMT [04:03:25] Magog_the_Ogre: let me see... [04:05:13] Magog_the_Ogre: you have 173668 watchlist entries >.> [04:05:29] Magog_the_Ogre: want me to clear it?
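(Editorial note: the batch of edit requests described above, replacing bare wgFoo globals with mw.config.get('wgFoo') and renaming mw.util.wikiGetlink to mw.util.getUrl, amounts to a table of regex rewrites. A minimal sketch, assuming simplified patterns: these are illustrative, not the exact rules used in those requests, and the wgFoo rule is naive in that it would also rewrite matching names inside strings and comments.)

```javascript
// Illustrative migration table: deprecated usage -> modern replacement.
// Simplified example patterns, not the exact enwiki edit-request rules.
var migrations = [
	// bare wgFoo globals -> mw.config.get( 'wgFoo' )
	// (naive: also matches inside string literals and comments)
	[ /\b(wg[A-Z]\w*)\b/g, "mw.config.get('$1')" ],
	// deprecated mw.util.wikiGetlink -> mw.util.getUrl
	[ /\bmw\.util\.wikiGetlink\b/g, 'mw.util.getUrl' ]
];

// Apply each rewrite rule in order to a script's source text.
function migrate( source ) {
	return migrations.reduce( function ( text, rule ) {
		return text.replace( rule[ 0 ], rule[ 1 ] );
	}, source );
}
```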
[04:10:12] well, let me know [04:32:19] i hope James_F|Away has not expired [09:06:33] Hm http://google-opensource.blogspot.it/search/label/gci [09:06:59] odder: Poland http://google-opensource.blogspot.it/2014/06/my-google-code-in-grand-prize-trip.html [09:10:40] guillom: I think that's worth a retweet from @wikimedia or @mediawiki, our winner wrote a wonderful post ;) [10:10:46] Nemo_bis: Will look after lunch :) [10:23:52] [[Tech]]; Urheber- Mimi Lyrik 07.07.2014; /* Meine Vorstellung, Mimis Lyrik */ new section; https://meta.wikimedia.org/w/index.php?diff=9203567&oldid=9196654&rcid=5433147 [10:26:54] https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.Api.plugin.edit um, is that all? I'm now just googling for examples, as I am trying to do something different from a new section [10:27:53] svetlana: have you tried [[mw:RL]] [10:28:46] found https://en.wikipedia.org/wiki/MediaWiki:Guidedtour-tour-twa5.js .... Nemo_bis, it just links to the doc. thing [10:31:56] ...... [10:38:48] Nemo_bis, what did you want to say? [10:39:43] Nemo_bis, see https://www.mediawiki.org/wiki/ResourceLoader/Default_modules#mediawiki.api.edit and please don't hide your thoughts from me, I'm sure they would be useful [10:42:13] It does link to the doc.wikimedia. yadda yadda I mentioned originally, and neither of them have examples. -- I have trouble seeing how this channel works; people aren't even remotely verbose. [10:47:57] Nemo_bis, ?? [10:49:54] and how am I supposed to write something? people even /stop talking/ to me. [11:05:41] svetlana: I'm confused what you're stuck with [11:05:46] https://doc.wikimedia.org/mediawiki-core/master/js/source/mediawiki.api.edit.html#mw-Api-plugin-edit-method-newSection [11:06:09] Drop the section: 'new' and you've got full page editing essentially [11:06:13] Reedy, hi. I'd like to prepend new text, not add a new section. [11:06:20] You've not said what you're trying to do [11:06:34] Have you looked at https://en.wikipedia.org/w/api.php ? 
[11:06:48] prependtext - Add this text to the beginning of the page. Overrides text [11:06:49] appendtext - Add this text to the end of the page. Overrides text. [11:07:44] I did now. I did too. But my js experience is brittle: I have used the api from Perl before with success. With js, for the 'edit' action, I couldn't find a single example (other than what I linked above) which would involve retrieving page content first, and then doing something with it. (it is supposed to prepend it to page top, after the first line). [11:08:05] And the thing I linked is a little messy, so I wanted to know whether there's an easier way to do that. [11:08:24] https://www.mediawiki.org/wiki/API:Client_code [11:09:15] https://www.mediawiki.org/wiki/API:Client_code#JavaScript ? [11:10:51] That's some sort of wrappers and extra libs; I could look at their inner workings, but it's even /more/ reading. Hm. (I think I could go and put all existing user scripts onto mw.org and get people to attach them to the documentation, as examples, in some way.) But I'll read them too. [11:11:30] svetlana: that api call would probably look similar to this (haven't checked out all params): [11:11:34] var api = new mw.Api(); [11:11:35] api.postWithEditToken( { [11:11:36] action: 'edit', [11:11:38] title: 'page title', [11:11:39] summary: 'summary text', [11:11:40] prependtext: 'text to prepend' [11:11:41] } ); [11:12:59] I guess https://en.wikipedia.org/wiki/MediaWiki:Guidedtour-tour-twa5.js is pretty close and is not overcomplicated. I will read on edit tokens too and figure things out from there. Thank you mlitn. [11:15:28] svetlana: that indeed does something very similar [11:15:51] it just appends instead of prepending, and manually adds the token (api.postWithToken does that for you) [11:16:06] yup [11:31:00] I don't remember when autofocus was added to Special:Search; it breaks the Page Down key...
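(Editorial note: a read-modify-write sketch of the edit svetlana describes — fetch the page, insert text after the first line, save. The string handling below is the only part shown in full; the mw.Api calls in the comment are a hedged sketch from memory, with a placeholder page title, and may need adjusting.)

```javascript
// Pure helper: insert new text on its own line, after the first line
// of the page source.
function insertAfterFirstLine( pageText, addition ) {
	var nl = pageText.indexOf( '\n' );
	if ( nl === -1 ) {
		// single-line page: just append on a new line
		return pageText + '\n' + addition;
	}
	return pageText.slice( 0, nl + 1 ) + addition + '\n' + pageText.slice( nl + 1 );
}

/*
// Hypothetical in-wiki usage (assumed API shapes, not verified here):
var api = new mw.Api();
api.get( { action: 'query', prop: 'revisions', rvprop: 'content', titles: 'Some page' } )
	.then( function ( data ) {
		var page = data.query.pages[ Object.keys( data.query.pages )[ 0 ] ];
		var newText = insertAfterFirstLine( page.revisions[ 0 ][ '*' ], '{{notice}}' );
		return api.postWithEditToken( {
			action: 'edit', title: 'Some page', text: newText,
			summary: 'insert notice after first line'
		} );
	} );
*/
```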
[11:31:16] I think we've already fixed this before, or perhaps it was on Special:ListFiles [11:39:06] ^d: :) https://meta.wikimedia.org/wiki/Fulltext_index_statistics_for_en_Wikipedia [12:58:56] helderwiki, around? [13:08:17] Hi, is there any known issue with user preferences? I seem to be unable to save most of them... [13:10:01] I don't think so [13:11:24] The only things I can save are my gender, the "yournick" box, and two email toggles [13:11:47] Out of the few things I have tried, though. Also, I can only save them if I change one at a time [13:16:03] The things I can't save include prefershttps, yourlanguage/variant, tog-fancysig, recentchangesdays, prefs-watchlist-days, and seemingly anything related to echo [13:16:11] Even if I change them one at a time... [13:16:44] hey Krenair [13:17:15] helderwiki, I did a bunch of those enwiki mw.util.wikiGetlink -> mw.util.getUrl changes [13:18:05] * helderwiki is opening the diffs [13:20:13] helderwiki, you also added some unrelated files to the request which I didn't do [13:22:18] everything seems ok [13:22:35] I posted them in the same request just because the change was the same [13:22:52] but if you prefer I can repeat the request on the two other talk pages [13:28:07] Krenair: https://en.wikipedia.org/w/index.php?diff=617045371 [13:28:07] https://en.wikipedia.org/w/index.php?diff=617045396 [13:30:31] Krenair: BTW: you inspired me to make a request at https://meta.wikimedia.org/wiki/Steward_requests/Global_permissions#Global_editinterface_for_Helder.wiki [13:30:43] yeah, I saw [13:36:38] helderwiki: if you're going to fix LinkFA on all wikis, I vote you :D [13:36:53] Is it broken? [13:36:59] Yes, since December [13:37:27] what kind of brokenness? [13:37:35] where can I see it?
[13:37:41] (or not to see it ;-) [13:37:43] Doesn't display anything [13:37:49] It needs to do something like id = "interwiki-" + InterwikiLinks[i].firstChild.getAttribute( "lang" ); https://bi.wikipedia.org/wiki/MediaWiki:Common.js [13:38:05] On a few hundred wikis, it's still checking for classes which no longer exist, so it fails [13:38:29] Sometimes, CSS and template need to be fixed as well [13:39:11] For instance on bi.wiki I have no idea why it's not working e.g. on https://bi.wikipedia.org/wiki/Germany [13:39:59] I knew I had seen this before... https://en.wikipedia.org/wiki/MediaWiki_talk:Common.js/Archive_20#LinkFA_is_broken [13:40:02] Sure [13:40:08] That's why I'm asking you ;) [13:41:42] Usually I fix these things on ptwiki (home) and enwiki (more accessed), to maximize the chances someone will fix it on smaller wikis... [13:42:10] Sure, that works if you're ok with waiting 10 years [13:42:33] heh [13:43:17] https://toolserver.org/~hoo/globalPageHistory.php is down but I verified that hundreds of wikis using LinkFA have not modified their JS since before December, hence are certainly broken (list: https://www.wikidata.org/wiki/Q16467 ) [13:43:26] but there is more chance a fix will go from enwiki to smaller wikis than the other way around [13:43:39] Actually en.wiki took it from vi.wiki :P [13:43:52] Nemo_bis: Ok... pity that I don't find time to revamp that tool [13:44:09] hoo: luckily I made most of my searches beforehand :D [13:44:43] We'll miss the Toolserver for many years still [13:44:50] I wonder how the progress on Wikidata badges is going.. [13:45:28] helderwiki: very well, we got a lot of things in last week.
Bene* is heavily working on it :) [13:45:39] if it is just a matter of waiting for one month or so, maybe we won't have to fix the linkFA [13:45:45] :-) [13:46:19] We had a lot of discussions within the Wikidata team about this already [13:47:01] The script won't be needed once we 1) have badges support enabled on Wikidata.org and 2) enough badges have been imported so that we can switch the wikipedias over [13:48:32] do you know if anyone is preparing a bot for that task? Is the backend/API part ready for someone to work on that? [13:49:23] Not sure about the bot, but the API is ready, yes... we just wait for the UI to "catch up" so that we can enable it on Wikidata [13:51:01] Sigh. As a result, the feature has been broken for one year. [13:51:22] Initially because "they'll learn how to fix it themselves", then "oh we'll supersede it in N months anyway" [13:51:40] Nemo_bis: ? [15:22:47] Krinkle: Hi! [15:23:31] I saw this usage of $.when and found no documentation about the $.ready promise (deferred), but I assume it is ok to use also in this case: [15:23:32] https://en.wikipedia.org/w/index.php?diff=616236009&oldid=616232754 [15:23:55] *This case: https://en.wikipedia.org/w/index.php?oldid=617046908&diff=617058302 [15:33:16] dtm: No expiry from me. [16:18:01] So where is codfw? :) http://www.openstreetmap.org/node/2644971019#map=13/33.0010/-96.8922 [16:42:32] Nemo_bis, https://wikitech.wikimedia.org/wiki/Codfw -> http://www.cyrusone.com/data-center-locations/dallas-data-center-carrollton.php -> 1649 West Frankford Rd. [16:51:33] Nemo_bis, https://www.openstreetmap.org/changeset/24164759 [16:51:44] any idea why http://meta.m.wikimedia.org/wiki/meta is accessed ~ 250 M times per day? http://tools.wmflabs.org/wikiviewstats/?lang=meta&datefrom=2014-06-15&autofilter&type=world [16:51:54] thanks Krenair [16:51:56] seems like a bug to me [16:52:24] it started sometime in July 2013... [16:53:05] since then each month that page gets ~ 7 G views, that's way more than the en_wiki mainpage!
[16:54:11] jorn: probably a bug in the tool, cf. http://stats.grok.se/meta.m/latest30/Meta [16:55:51] Nemo_bis: i do my own aggregation of the raw stats and also see "meta.mw: meta 7447788120" as top page... [16:56:13] aggregation from what? [16:56:27] I suggest that you grep http://dumps.wikimedia.org/other/pagecounts-ez/merged//pagecounts-2014-06-views-ge-5-totals.bz2 [16:56:31] http://dumps.wikimedia.org/other/pagecounts-raw/ [16:56:56] Nemo_bis: will do [16:59:50] Nemo_bis: but maybe this convinces you as well: http://stats.grok.se/meta.mw/latest30/meta [17:00:49] that's not meta-wiki [17:00:50] $ zgrep -Ec "meta.m [Mm]eta " pagecounts-20140715-160000.gz [17:00:50] 0 [17:01:31] sorry, got that link from http://tools.wmflabs.org/wikiviewstats [17:02:19] err, now i'm confused... [17:03:51] what is meta.mw / meta then? [17:03:55] bd808: still no login on beta labs [17:05:04] jorn: wikipedia mobile: ".mw" [17:05:07] http://dumps.wikimedia.org/other/pagecounts-raw/ [17:05:39] Nemo_bis: yupp, i meant as a link... isn't it http://meta.m.wikimedia.org/wiki/meta [17:05:47] (watch out for the redirect) [17:06:11] http://meta.m.wikimedia.org/wiki/meta?redirect=no [17:06:33] "it"? [17:06:43] wiki_P_edia mobile [17:07:06] bd808: I'm thinking I should restart memcache just to see what happens [17:09:51] Nemo_bis: sorry if i'm confusing... let me try again: i observe a very very high number of views in the raw pageview stats for a page corresponding to a line starting with "meta.mw meta ". I see this in my own aggregation, on http://tools.wmflabs.org/wikiviewstats/?lang=meta&datefrom=2014-06-15&autofilter&type=world and on http://meta.m.wikimedia.org/wiki/meta [17:10:19] i _think_ the page that is accessed so often is: http://meta.m.wikimedia.org/wiki/meta?redirect=no [17:10:38] helderwiki: $.ready should be safe to use. [17:10:54] helderwiki: I'm providing official api.jquery.com documentation as I speak. [17:11:06] helderwiki: I'm writing documentation for api.jquery.com as I speak.
[17:11:08] Nemo_bis: and grepping through erik's aggregates i can't find a line starting with "meta" [17:11:28] Krenair: Great! Thank you! [17:11:36] I mean, Krinkle [17:12:00] [api.jquery.com] Krinkle opened pull request #530: Document jQuery.ready.promise() (master...issue/205) http://git.io/EYQM4w [17:12:41] helderwiki: Click Subscribe on https://github.com/jquery/api.jquery.com/issues/205 [17:13:04] jorn: I'm not sure erik's aggregates consider the wikimedia.org domain, which is very dirty [17:13:06] done [17:13:17] especially meta, which has several billion fake visits from CentralNotice [17:15:36] Hello. I'm not sure where I should report errors with tools on the wmf labs - enwp10 is currently not working, e.g. see: http://tools.wmflabs.org/enwp10/cgi-bin/list2.fcgi?run=yes&projecta=Physics&importance=Top-Class&quality=FA-Class [17:16:15] Nemo_bis: hmm ok, so what about those > 2 K reported pageviews per second... should i report that somewhere ;) [17:17:32] jorn: I still didn't see you find them in the raw logs [17:17:40] I see 1 in an hourly log [17:18:01] Nemo_bis: just found them in a second run with bzcat and grep in erik's aggregates... seems to have been a problem with less ?!? [17:18:32] Nemo_bis: found this line in erik's aggregates: [17:18:37] meta.mw meta 7556585586 [17:18:50] if you want i can also go through one of the hour logs... [17:18:56] And what do you think that is? [17:19:38] i hope a bug in the stats writer [17:19:47] if not it's accesses to http://meta.m.wikimedia.org/wiki/meta?redirect=no [17:19:54] helderwiki: I noticed your request for editinterface [17:19:54] sigh [17:20:02] let's chat in PM [17:20:11] 19.05 < Nemo_bis> jorn: wikipedia mobile: ".mw" [17:20:16] 19.06 < Nemo_bis> wiki_P_edia mobile [17:20:55] Nemo_bis: i read that but i'm not sure i'm wrong here [17:31:04] Nemo_bis: ah: it seems that under the lines "^.mw: " the full wikipedia mobile stats are aggregated...
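(Editorial note: the pagecounts lines being grepped above use a simple space-separated format — project code, page title, view count, and bytes transferred. A parsing sketch follows; the field order is as I recall it from the dumps documentation and worth double-checking, and the aggregate files discussed above may omit the bytes column.)

```javascript
// Parse one line of a pagecounts-raw dump.
// Assumed format (verify against dumps.wikimedia.org docs):
//   "<project> <title> <count> [<bytes>]"
function parsePagecountsLine( line ) {
	var parts = line.trim().split( /\s+/ );
	return {
		project: parts[ 0 ], // e.g. "meta.m" (mobile Meta) or "meta.mw"
		title: parts[ 1 ],
		views: parseInt( parts[ 2 ], 10 ),
		// aggregate files may not carry the bytes column
		bytes: parts.length > 3 ? parseInt( parts[ 3 ], 10 ) : null
	};
}
```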
[17:31:35] if that's the case it's heavily inconsistent with the rest of the file... [17:34:11] (cause that seems like a line that would belong in the projectcounts files instead of the pageviews) [17:34:32] The aggregation file is not authoritative [17:39:35] chrismcmahon: Did you restart memcached servers yet? I'm working on some scap failures in beta but those should not be related to wiki logins. [17:40:02] bd808: I have not, 3 other things going on [17:40:09] * bd808 nods [17:40:26] bd808: I think Ori is hacking on something though, I just gave him a beta labs login [17:40:37] cool beans [17:41:11] hoo|away, Lydia_WMDE or helderwiki, do you know if the wikidata badges will also be available in a form consumable by ULS? See https://www.mediawiki.org/wiki/Talk:Universal_Language_Selector/Design/Interlanguage_links#Featured_articles.2Flists (would be useful to have a ULS bug depending on the appropriate wikidata bug) [17:43:57] I don't know, Nemo_bis [18:00:46] Nemo_bis: I don't quite understand what that section should tell me [18:00:48] Nemo_bis: should be possible [18:00:52] can you elaborate further? [18:06:01] hoo: ULS would need to extract from interlanguage link attributes which of them are FA / GA and prioritise them [18:06:54] Ah, I see... ULS can of course use Wikidata's data [18:07:13] They just need to find a sane way to bind to Wikidata (like an extension to ULS) [18:10:00] hoo: an extension? link attributes are the only sensible option [18:10:17] so my question is, are link attributes already planned, is there a bug report for them [18:10:30] What do you mean by "link attributes" [18:10:39] attributes on the a tag [18:10:59] there will be classes set by Wikibase [18:11:25] so you will have wb-badge-Q12345 on them (or something like that, not sure about the exact names) [18:11:57] what's Q12345 [18:12:04] the badge [18:12:12] that is?
[18:12:15] just a dummy number, thought that was obvious [18:12:26] could be featured article [18:12:35] or whatever [18:12:35] of course, but does it mean it's free tagging? [18:13:00] or the ID of the item corresponding to the current page? [18:13:03] Yes, as long as the item is an allowed badge [18:13:10] AKA No [18:13:27] What defines the allowed badges? [18:13:35] A setting [18:13:39] Where [18:13:45] we will start with just two: featured and good articles [18:13:54] but if the community wants/needs more, there might be more [18:14:16] Wikibase Repo or Wikibase Client setting? [18:14:39] Repo of course, the client just consumes what's set in the repo [18:15:06] It's not so obvious, the client could ignore unwanted data [18:15:24] That makes it easier [18:15:27] it can to a certain degree [18:15:47] it can't remove the classes [18:15:57] but the visible badges etc. of course [18:17:39] Ugh. Then it's complex again [18:18:06] not really, no [18:18:25] although it might sound complex right now it's fairly simple [18:18:28] No? Can you file a bug in ULS for how it should be used then? Not clear to me [18:19:04] I don't really understand what ULS tries to do and to what degree ULS is ok with binding to Wikibase/Wikimedia [18:19:10] Maybe an ULS global relying on the fact that it's sync'ed both with Wikibase Repo and Wikibase Client config [18:19:39] hoo: https://www.mediawiki.org/wiki/Universal_Language_Selector/Design/Interlanguage_links#How_it_works [18:20:31] Nemo_bis: Ah... and over there it cares for FA etc. articles more than for "normal" ones?
[18:21:34] hoo: not currently, that's the point [18:21:45] Ah, but they want to [18:22:00] well, that won't be hard, if you're ok with binding to Wikibase [18:22:43] you could just have two configuration vars like: var featuredArticleSelector = '.wb-badge-Q12345'; [18:23:00] then you can just do $( featuredArticleSelector ) in your code and you have all FA articles [18:23:21] that will work on all Wikimedia wikis (and with different settings on all wikis with Wikibase) [18:28:28] https://bugzilla.wikimedia.org/show_bug.cgi?id=4901 [18:29:25] * edsu waves [18:29:54] q: should twitter.com/congressedits be worried about ip spoofing [18:36:23] hoo: is there a summary of what's actually going to be implemented? :) https://bugzilla.wikimedia.org/show_bug.cgi?id=40810 is not very consumable [18:37:32] and https://www.wikidata.org/wiki/Wikidata:Development_plan#Badges doesn't seem up to date [18:38:30] Nemo_bis: That's not up to date... [18:38:40] jorn: thanks for http://de.wikipedia.org/w/index.php?title=Wikipedia_Diskussion:Wiki_ViewStats&diff=next&oldid=130815882 [18:39:22] Nemo_bis: We only agreed on how to actually implement stuff late last week, so there's no more documentation for that than you can see in gerrit [18:40:28] now don't tell he that's you :) [18:43:19] hoo: I'm updating the page [18:48:20] s/he/me/ [18:49:48] Lydia_WMDE and hoo, https://www.wikidata.org/wiki/Wikidata:Development_plan#Badges [18:51:00] Nemo_bis: Not quite correct... will fix it [18:51:40] ;) thanks [18:52:06] Then I'll file my bugs after dinner when I can access the corrected plan :) [18:54:23] hoo: speaking of plans, how's the "in other projects" sidebar stuff going? I see a -1 from Jeroen on one of the changes but I haven't looked deeper [18:54:43] that's a good question...
I'll have to look myself [18:55:11] :) [18:55:12] We talked about this a bit last week and hope to find a way forward [18:55:16] cool [18:55:18] but I haven't yet looked at progress since [18:59:49] [[Tech]]; Verdy p; /* Meine Vorstellung, Mimis Lyrik */; https://meta.wikimedia.org/w/index.php?diff=9207770&oldid=9203567&rcid=5433709 [19:15:17] csteipp, when you have a moment, can you talk to me about the plans to introduce a content security policy to mediawiki [19:27:40] legoktm: How do I get your patch looked at/pushed through faster? [19:27:45] 1clickspam [19:27:48] busy right now, sorry [19:28:04] No problem. Though I only wanted direction. Not action. [19:28:05] :P [19:30:55] hoo|away: further tweaked https://www.wikidata.org/w/index.php?title=Wikidata%3ADevelopment_plan&diff=144903859&oldid=144899811 [19:31:05] I don't understand what a "generic class" is [19:39:00] Nemo_bis: a css class like "featured", as opposed to "badge-q3477165" [19:39:21] maybe "generic" isn't the best term to use here [19:42:42] hey guys [19:42:48] can anyone help with https://bugzilla.wikimedia.org/show_bug.cgi?id=59242 ? [19:43:05] there is a database missing after implementing UploadWizard on ro.wp [19:44:52] * marktraceur not totally surprised [19:45:24] DanielK_WMDE: but how does it differ from "These CSS classes can be configured per badge (like Q120 => good article)" then [19:45:25] Reedy: ^^ [19:45:27] + [19:45:42] YuviPanda: You too - does the database schema update hook do that right? [19:46:34] marktraceur: we don't use update.php on the cluster... [19:47:11] marktraceur: campaigns one? it did. [19:47:43] legoktm: Figures. That's why Reedy had to be reminded about BetaFeatures every damn time :) [19:48:13] marktraceur: https://wikitech.wikimedia.org/wiki/How_to_do_a_schema_change [19:48:15] read the top [19:48:21] Sigh. [19:48:35] legoktm: He's even *in* SFO now! [19:48:38] We're all in danger. [19:51:38] nope, just you! [19:52:06] who's in SFO now?
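(Editorial note: hoo's earlier idea of a configurable selector per badge, combined with the wb-badge-Q<item> class scheme under discussion, could look like this on the gadget side. Both item IDs below are dummies — Q12345 is hoo's own placeholder and Q67890 is equally hypothetical — and the class naming was explicitly flagged in the discussion as not final.)

```javascript
// Hypothetical gadget-side config following the wb-badge-<item id>
// class scheme discussed above (class names were not final at the time).
var badgeClasses = {
	featured: 'wb-badge-Q12345', // dummy item ID from the discussion
	good: 'wb-badge-Q67890'      // equally hypothetical
};

// Given the class attribute of an interlanguage link, report which
// configured badges it carries.
function badgesFor( className ) {
	var classes = className.split( /\s+/ );
	return Object.keys( badgeClasses ).filter( function ( badge ) {
		return classes.indexOf( badgeClasses[ badge ] ) !== -1;
	} );
}
```

In a live gadget one would instead select the links directly, e.g. $( '.' + badgeClasses.featured ), as hoo suggests above.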
[19:53:00] oh, roan, nvm [19:53:26] legoktm: You're a manageable drive away [19:53:36] Nemo_bis: sorry, got the terminology backwards. the "generic" class would be the one derived from the item id, e.g. wb-badge-Q1234. [19:53:37] I don't know what " [19:53:39] TODO: add greg-g email requesting running mwscript on machine terbium " means :) [19:53:56] Heh. [19:54:05] $ wiki blame [19:54:08] spage [19:54:10] ;) [19:54:13] Hm. [19:54:28] Alas, he's not here [19:54:54] Shall I fix rowiki? [19:55:30] yesplz [19:56:00] strainu: Fixed [19:56:09] thanks Reedy [20:13:31] Reedy: do you remember if there is a config variable to exclude certain codes from interwiki links [20:13:45] I mean, from the sidebar [20:15:34] James_F: hellos! [20:16:10] dtm: How are you? [20:17:08] Nemo_bis: All settled now? [20:17:33] hoo: https://www.wikidata.org/w/index.php?title=Wikidata%3ADevelopment_plan&diff=144903859&oldid=144899811 is still the last diff [20:17:52] Nemo_bis: Will change later on [20:18:26] James_F: well i just might ask you the same thing. i have just discovered a giant pile of user scripts and i have mega menus and a CLOCK! and i joined Wikipedia 1.0. [20:18:57] James_F: furthermore, i have spread the joyous news of your ideas to others who are interested in a wikidata backing store and such [20:21:04] dtm: Awesome. :-) [20:23:33] James_F: yeah so i don't really have much of a clue what wikidata is but. i think WP 1.0 is what i'm looking for, to make wikipedia not suck and the wikipedian experience hopefully not suck. quality control, basically. but overall we need to utilize the current platform to build a new one. so obviously i want to work with you in standardizing the existing citation formats. [20:24:39] James_F: we need to expand the existing citation templates to suit forward compatibility and portability. and define that as a spec or a project name, and then actually edit existing citations to suit it. and hopefully create tools to help automate that.
probably starting with articles that are in WP 1.0's set. [20:24:45] amirite [20:24:56] dtm: Well, that's one way to do it. [20:25:19] dtm: I worry that that would end up being a lot of effort spent only for the benefit of the English Wikipedia, and not all our projects. [20:25:34] James_F: how? [20:26:00] James_F: how could there be any other way and why does it have anything to do with just one wiki [20:26:22] dtm: "the existing citation formats" is not a standard between wikis. [20:26:41] James_F: what, do others not use {{cite web}} and such? [20:26:43] dtm: "the existing citation templates" similarly – indeed, some pretty major languages don't even have citation templates. [20:26:46] dtm: Indeed. [20:26:48] what! [20:27:16] okay but anyway, citations are inherently language dependent anyway, and wikidata is supposed to be transcendent of any one format anyway [20:27:25] like it's supposed to give each citation a serial number etc [20:27:39] So I think a route that involves moving a bunch of the structured data into Wikidata and shows it in an organised way to users as a "cite" makes sense. 
[20:27:50] and that serial ID would be associated with each wiki's citation [20:27:55] dtm: other wikipedias often use something very similar, but subtly different [20:28:04] MatmaRex: yikes [20:28:09] okay but we can have a common core [20:28:11] dtm: like code that was vcopied from en.wp in 2009 and then tweaked for five years ;) [20:28:33] there's always going to be a common core of features [20:28:35] copied* [20:28:37] it's like evolution across continents due to tectonic plate movements :) [20:28:50] title, first, last, url etc [20:29:02] isbn, oclc [20:29:07] * greg-g shudders at "common core" out of context [20:29:11] :-o [20:29:13] greg-g: sorry [20:29:15] :) [20:29:21] * YuviPanda makes greg-g take a standardized test [20:29:33] * dtm delivers greg-g some brain bleach in a firehose [20:29:36] I worked on educational metadata standard(s) for about 2 years [20:32:28] greg-g: i'm so sorry [20:32:49] i meant to say "data normalization" ^_^ [20:36:03] :) [20:36:12] go on, I don't want to kill the conversation :) [20:44:28] greg-g: well i was kinda waiting on James_F [20:50:32] * James_F doesn't really have much to say that isn't already on-wiki. [20:56:41] James_F: .....where? [20:57:07] dtm: https://www.mediawiki.org/wiki/Citoid and related pages. [20:57:25] James_F: kthx [21:06:11] mwalker: Yeah, I'd like to. No firm plans right now, but I would -1 any patches that make it more difficult. First step is to allow unsafe inlines, and just use it for privacy protection. [21:09:25] the reason I ask is because the CentralNotice caching RfC is reliant on adding another inline script for performance reasons [21:09:44] in the text I sort of fobbed it off as a problem to fix once we identify how we're going to address it for resourceloader [21:10:00] csteipp, ^ [21:10:34] Well, I can't say I like it, but I expect it will keep working for the near future. [21:11:38] Are you using OutputPage's addScript (or whatever that function is)?
Or are you writing out the script from javascript? [21:11:58] it'll be using addScript [21:12:43] well; hmm... it has to create a request [21:12:45] so it'll do both :( [21:13:03] I can get around this if I can make a request to the local wiki's varnish [21:13:39] mark, thoughts about polluting the root of our application urls? e.g. have varnish be able to route something like //en.wikipedia.org/banners? [21:14:41] Cool. Daniel Friesen had a patch that set CSP to allow unsafe inlines only on pages that explicitly add scripts, and disallow on the rest. We might do that as a first step. But again, that's at least 6-12 months out. [21:18:44] James_F: do you have any thoughts on Wikipedia 1.0? are you aware of it? does it serve as a testbed for any of your ideas? i am not yet clear whether it serves any separate codebase or just content. [21:19:33] dtm: I do, I am, it doesn't. It's "just" content. [21:19:45] dtm: It's a huge amount of work that lots of people have put in, however. :-) [21:19:54] James_F: ok [21:20:09] James_F: are you involved? i just discovered it yesterday [21:20:18] James_F: does it have an irc channel? [21:20:48] dtm: I'm not, and I don't know. [21:20:59] James_F: you said you want to rewrite Cite (Cite.php), right? are there published plans on that [21:21:19] dtm: It's not planned out yet, no. [21:26:09] James_F: is it a high priority [21:26:28] you said you expect it to happen within some months, so i figured that means there's an idea of priority [21:28:21] dtm: The auto-filling-in of references will be within a few months, yes. [21:28:34] James_F: \o/ [21:28:35] The completely-rewrite-Cite-and-start-again work isn't as high priority. [21:40:26] James_F: is there any movement toward a "house style" of citation [21:42:52] dtm: That's more an editorial decision than something from the technology perspective. [21:44:58] James_F: yeah.
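(Editorial note: for readers who didn't follow the CSP thread above — csteipp's proposed first step, allowing inline scripts while still restricting where external scripts may load from, would correspond to a response header along these lines. The directive values are purely illustrative, not an actual proposed Wikimedia policy; a real header is sent on a single line and is folded here only for readability.)

```http
Content-Security-Policy: default-src 'self';
    script-src 'self' 'unsafe-inline' *.wikimedia.org;
    style-src 'self' 'unsafe-inline'
```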
[21:51:49] James_F: out of curiosity, how much software design that goes on over there keeps Wikipedia in mind exclusively, as opposed to other Wikimedia projects or all the possible use-cases of MediaWiki? [21:52:29] Nemo_bis: https://www.wikidata.org/wiki/Wikidata:Development_plan#Badges Good now? [21:52:36] harej: Most is focussed on "the standard use case", but in practice Wikipedia is the standard use case for most tools. [21:54:07] Have you thought of something that would be useful for, say, Wikibooks? [21:54:10] or Wikisource? [21:54:19] (These sound like loaded questions but they're not :) [21:55:15] Wikisource in particular would be interesting. Huge potential for contributions but it's very difficult to use—worse than Wikipedia. [21:58:45] harej: Yes, we talk about tools specific to wiki families other than Wikipedia sometimes, but not much. [22:00:07] To some extent there's a need for those wikis' denizens to sell changes to us. [22:00:10] And with that, meeting. [22:10:47] harej: for Wikisource there was some talk with Tpt, I'm not sure if it's documented [22:12:31] VisualEditor on Wikiquote, Wikibooks, Wikiversity, Wikispecies and Wikivoyage should be rather easy, but other than Wikiquote and some languages of Wikibooks those are rather dead/unvisited. Wiktionary and Wikisource are trickier because of their crazy templates and ProofreadPage. :) [22:19:17] hoo: ah, yes, now it's extremely clear. Thank you very much for the patience! [22:19:45] Hopefully now that even Nemo_bis has understood, you'll get fewer questions from others as well. :D [22:20:20] heh. We really need to get this communicated well in advance to avoid problems. [22:20:50] technically we should be able to have all wikis switch to the new system with 0 problems :) [22:21:15] hoo: ugh, turns out I have another question. Will the icons be added like src=something PHP side?
I'd like them to survive ULS if they're copied as-is from the sidebar to ULS' language selector panel, see https://bugzilla.wikimedia.org/show_bug.cgi?id=64797 [22:21:48] Nemo_bis: They will be set as list icons from CSS (for most wikis) [22:21:57] some wikis will probably choose to override that [22:22:00] eg. ruwiki [22:22:05] Right [22:22:28] So again, just keeping the classes should make them work wherever they are displayed, no? [22:22:50] Yeah, but the stuff local wikis do might break ULS's view [22:23:05] we can't guarantee the CSS rules from local wikis :/ [22:23:50] Well, once one has clearly communicated what will be available, there isn't much more we can do. [22:24:56] true [22:29:07] hoo: unrelatedly, I got questions from some WMIT members about the accessibility work that got done; is there a summary I can point them to? [22:30:13] Nemo_bis: Yes, not sure they published it yet, let me have a look [22:33:32] Nemo_bis: In German: https://upload.wikimedia.org/wikipedia/commons/0/0e/TAO_Silberwissen_Schlussbericht_End_2013_10_05.pdf [22:33:59] hoo: great, I'll submit it to our German-speaking former board member ;) [22:34:30] He always complains he doesn't know German *that* well, so we try to make him exercise [22:35:06] Nemo_bis: Nice :) My work was only a subset of that (p. 31-32 are what I wrote back then AFAIS) [23:14:24] thedj: ? [23:36:48] If I go to https://en.wikipedia.org/wiki/Special:MyPage/common.js?action=edit and CodeEditor is enabled, how can I make $('#wpTextbox1').val('NEWCODE') work? [23:36:49] https://www.mediawiki.org/wiki/Thread:Extension_talk:CodeEditor/How_to_set_the_content_of_the_textbox%3F [23:37:22] Has anyone been able to do that?
[23:39:17] hrm [23:39:52] oh :D [23:40:04] you probably need to do this after the document has finished loading, let me see [23:40:43] I'm trying this in the console, svetlana [23:40:50] so the page is already loaded [23:41:13] (using Firefox 30) [23:41:34] https://en.wikipedia.org/wiki/User:Gryllida/common.js working example [23:41:46] ah [23:42:40] yep $('#wpTextbox1').val('NEWCODE') in fx console also works [23:43:15] didn't work [23:43:27] does it tell you anything, an error or something? [23:43:34] nothing [23:43:50] try to do it when you're editing a talk page, not a codeEditor window [23:44:05] look at what $('#wpTextbox1') says, should say it found an object [23:44:21] or try document.getElementById('wpTextbox1') and see if that shows up properly [23:44:28] what is more intriguing is that after executing $('#wpTextbox1').val('NEWCODE'); the code $('#wpTextbox1').val(); returns "NEWCODE" but what I see in the textbox is still the old code [23:44:42] you probably have more than one element with that id on that page [23:45:37] nope, $('#wpTextbox1').length === 1 [23:46:05] and I'm trying these commands without logging in, so it is not one of my custom JS [23:47:13] try it on https://en.wikipedia.org/wiki/Special:MyPage?action=edit and see if it works there [23:48:10] CodeEditor is not enabled in normal wikipages [23:48:37] because if you're editing a js page, the code editor hid the normal edit box; whatever you set is not visible (for some reason it only hid it, but did not remove it from the document tree) [23:48:39] only in JS, CSS or Lua pages, so I can't test in that page is not [23:48:45] *since it is not there [23:49:22] you're not expecting that to work to edit codeEditor's box contents; its box is a set of DIVs with one DIV for each line of your code [23:50:24]
[23:50:24] <div class="ace_line"><span class="ace_comment">// importScript('User:Gryllida/afch2/v2.js');</span></div>
[23:50:24] <div class="ace_line"><span class="ace_comment">// importScript('MediaWiki:Gadget-afch.js')</span></div>
[23:50:36] that's what code editor's window is [23:51:04] what I needm specifically, is to fix jsUpdater, to make it work even if CodeEditor is enabled: https://pt.wikibooks.org/wiki/User:Helder.wiki/Tools/jsUpdater.js [23:51:10] *need [23:51:35] that script has a list of /regex/ --> replacement which it applies to JS code [23:51:59] but the resulting code is not being saved to the textbox if I keep CodeEditor enabled [23:52:01] =/ [23:52:21] oldText = $('#wpTextbox1').val(); [23:52:21] conversion = jsUpdater.doConversion(oldText, patternIDs); [23:52:40] you need to do this on each of these lines -- enumerate on document elements with the ace_line class [23:52:57] o.O [23:53:01] or probably with the ace_comment class [23:53:03] I would consider that a bug [23:53:26] or probably ace_comment class [23:54:42] it should not remove our ability to get/set the content of the textbox [23:54:52] you may want to look into whether codeEditor syncs with the box1 when saving the page [23:54:59] but it surely doesn't sync the other way round [23:55:10] and editing the lines the user can see would be a nice option to have codeeditor do the rest [23:55:47] but it surely doesn't sync the other way round [23:56:00] it /might/ have a subroutine which does that, but I don't know where to look for it [23:56:42] I don't think it would even know when to call it since it hid that box and doesn't expect it to be edited by users in the first place [23:59:39] i wonder why "$( ".ace_comment" ).each(function( index ) { [23:59:39] $( this ).val('newcode'); [23:59:39] });" doesn't work [23:59:53] wait, it's a span, it is probably not a .val()
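(Editorial note: a possible way around the problem debugged above, hedged because it is untested here: MediaWiki's jQuery.textSelection plugin is, as far as I know, overridden by WikiEditor/CodeEditor, so routing writes through textSelection( 'setContents', ... ) should reach ACE's buffer, whereas a bare .val() only touches the hidden underlying textarea. The helper takes the textbox object as a parameter so the fallback logic can be exercised on its own.)

```javascript
// Set the contents of an edit box in a CodeEditor-safe way.
// Assumption (not verified in this log): CodeEditor hooks into
// jQuery.textSelection, so 'setContents' updates the ACE buffer,
// while .val() writes only to the hidden textarea behind it.
function setEditorContents( $textbox, text ) {
	if ( typeof $textbox.textSelection === 'function' ) {
		$textbox.textSelection( 'setContents', text );
	} else {
		$textbox.val( text ); // plain textarea fallback
	}
	return $textbox;
}
```

Usage in a script like jsUpdater would then be setEditorContents( $( '#wpTextbox1' ), newCode ) instead of $( '#wpTextbox1' ).val( newCode ).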