[00:03:26] does anyone else have anything that may be able to help me with timeouts on a page with a lot (whole lot) of {{#invoke}}s on it timing out on saving?
[00:05:54] what's the page? could you consolidate the multiple invocations somehow?
[00:07:06] might be hard. There is a template on en.wiki that currently calls loads and loads of templates (one for each row of a table). It's so big it breaks, so I thought it was a good idea to convert part of it to Lua. That works somewhat on preview, but breaks on attempting to save
[00:07:28] the original is http://en.wikipedia.org/w/index.php?title=Template:AFC_statistics
[00:07:55] i'm attempting to convert Template:AFC statistics/row
[00:08:27] on trying to save the converted page on a testpage, it times out after exactly one minute
[00:08:49] Error: ERR_READ_TIMEOUT, errno [No Error]
[00:11:15] hrm, dunno. maybe rewrite it so it's a single invocation that accepts all rows at once?
[00:11:59] I was looking in the documentation for how I could accomplish that
[00:12:10] but I'm not sure
[00:12:28] I'll probably run into some argument limit somewhere, which is bound to exist
[00:13:10] yeah, i was just looking for that :P
[00:14:04] and it'll get quite horrible anyway. I'll think of something different
[00:14:15] not quite sure what though
[00:15:28] would you happen to know by the way the equivalent of subst:'ing in scribunto? This really seems like something that should be cached.
[00:15:52] otoh, that would mean that all those invokes need to get expanded at save time, which would probably take even longer
[00:16:23] gn8 folks
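For illustration, the "single invocation that accepts all rows at once" idea suggested above might look roughly like the following Scribunto module. This is a minimal sketch, not the actual Template:AFC statistics/row logic; the module name, the one-row-per-numbered-argument convention, and the ";" field delimiter are all assumptions made for the example.

    -- Module:AFCstats (hypothetical name): build the whole statistics table
    -- in one #invoke instead of one #invoke per row.
    local p = {}

    function p.table(frame)
        -- Prefer the arguments passed to the wrapper template, so the
        -- wikitext side stays {{#invoke:AFCstats|table|row 1 data|row 2 data|...}};
        -- fall back to direct arguments when invoked without a wrapper.
        local parent = frame:getParent()
        local args = parent and parent.args or frame.args
        local out = { '{| class="wikitable sortable"' }
        -- ipairs walks the numbered arguments 1..n in order, one per row.
        for _, row in ipairs(args) do
            local cells = {}
            -- Assumed convention: fields within a row are separated by ";".
            for field in mw.text.gsplit(row, ';', true) do
                cells[#cells + 1] = mw.text.trim(field)
            end
            out[#out + 1] = '|-\n| ' .. table.concat(cells, ' || ')
        end
        out[#out + 1] = '|}'
        return table.concat(out, '\n')
    end

    return p

This trades many small Lua engine start-ups for one larger call. Whatever per-template argument-count limit exists (the one worried about above) still applies, and any "|" inside a row's data would have to be escaped before being packed into a single parameter.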
[03:27:44] does wikimedia have data of visits to project pages with useragent and referrer?
[03:28:26] I'd think that's private information, but I know for a fact CheckUser does keep user agent data (don't take my word for it, I don't work for the foundation)
[03:29:09] the squid logs have that information, but i don't think they are stored anywhere
[03:29:25] you could say that we "have" that data in the form of a stream
[03:30:00] squid? I thought there's a webserver, but a proxy sounds unexpected
[03:31:38] reverse-proxy
[03:32:57] i.e., not making connections on behalf of clients to random sites on the internet, but making connections to mediawiki servers as needed and caching responses
[03:33:27] anyways, why do you need that data? some information is available in aggregate form
[04:14:08] do you have a URL of the aggregate form please? I'm just curious how users find pages, i.e., what the referrers are
[04:14:24] (and how many of the visits actually are people)
[04:22:40] ... would like to know this to improve usability, just so I know what navigation features are used most
[04:24:09] gry: you've seen stats.wikimedia.org ?
[04:33:44] yes, no useragents and no referrers on this one :) it's intuitive but not the complete thing
[04:36:49] there are useragents there
[04:37:10] hmm
[04:38:53] where?
[04:41:05] i'm looking :)
[04:51:57] gry: http://stats.wikimedia.org/#requests ?
[04:52:28] that's a list of wikimedia projects and a png, I clicked a few things and didn't see the information I'm after in there
[04:53:07] http://stats.wikimedia.org/wikimedia/squids/SquidReportGoogle.htm
[04:53:08] http://stats.wikimedia.org/wikimedia/squids/SquidReportDevices.htm
[04:53:26] http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm
[04:53:41] http://stats.wikimedia.org/wikimedia/squids/SquidReportOperatingSystems.htm
[04:53:44] ...
[04:54:21] please look harder. and if what you're looking for is really not there then give more detail about what you want to know
[04:54:43] and also find some ways to make this stuff easier to find for the next person :)
[04:55:08] but i really don't know much better than you besides that i refused to believe it wasn't there
[04:57:20] hello superm
[04:57:45] hmm, those look nice, except I'm looking for stats for one page. Not clients on all projects, but clients that visited one page. (and where they came to it from)
[04:58:59] you'll need someone NDA'd to do the analysis for you. AFAIK
[05:00:44] I'd be somewhat inclined to get such information about once a month, but if such a request requires escalating the issue, it is a bit odd
[05:01:38] is there a reason it's intended to be kept confidential? it's not like it would shape a certain user profile. It's anonymous.
[05:01:50] gry: i suggest you move this to #wikimedia-analytics and try again during the work day US Pacific time
[05:02:34] ok, thanks for your hints and the pointer, will try
[05:02:39] sure. it's just not been generated in aggregate. so doing the analysis you want requires access to raw logs
[05:06:41] odd that this and that channel are logged (wouldn't useful info be documented anyway?) but some folks manage to use forums, so there's not much to be surprised about
[05:07:09] thanks again
[05:07:12] * gry dies off for a bit
[07:39:57] Is it just me or are pages taking a while to load?
[07:40:01] ""
[07:40:02] ool
[07:40:04] oO*
[07:41:48] Special:RecentChanges took 10 seconds to load
[07:41:49] wtf
[07:52:30] is this still an issue?
[07:52:44] Not anymore.
[07:52:48] grr forgot to take my phone off vibrate, even though I put it right next to me
[07:52:49] meh
[07:52:58] ok, well we seem to have had a hiccup, no idea why
[07:53:10] It was strange though. It could have been me. But I am using OpenDNS for my DNS servers.
[07:53:26] They're usually reliable.
[07:53:40] no, I don't think it was you, see the ops channel for the hiccup warnings
[07:54:28] Where is that again?
[07:57:28] wikimedia-operations
[07:57:34] ah
[07:57:40] it's logged
[07:57:58] the outage and recovery were so fast it wasn't possible to do anything about it
[07:59:25] so it's not the same as hegesippe's slow watchlist?
[08:00:22] I don't know about the slow watchlist but this included bits
[12:25:32] Reedy, ?
[12:26:02] Could you take a look at https://bugzilla.wikimedia.org/show_bug.cgi?id=46264?
[13:38:13] (not for wikimedia to be honest) Where are user rights edited?
[13:38:21] I thought in LocalSettings?
[13:39:25] #mediawiki ?
[13:40:08] yeah, I know, I was already here, lazy
[17:33:05] woosters: ping re the emgt call
[18:16:25] !log payments cluster updated from c19cc66 to fe4fb96b
[18:16:33] Logged the message, Master
[19:32:58] guillom: Re. https://www.mediawiki.org/w/index.php?title=Admin_tools_development&diff=661025&oldid=646111 - that may be the "standard tasks header", but it's completely wrong. Most of those tasks aren't "open", they're being worked on, and I'd worry that it would encourage people to waste effort on things already in-flight...
[19:33:11] guillom: But I don't want to just blindly revert you or break anything. :-)
[19:34:42] James_F: I understand; the thing is, we need to link to open tasks from https://www.mediawiki.org/wiki/Product_development , and if the section is the same on all pages, it makes it much easier to maintain. I assumed that the "doing" and "done" markers were enough to avoid confusion.
[19:35:16] guillom: Huh. Interesting.
[19:35:27] guillom: Maybe I could split out the "open tasks" from the rest inside the Roadmap?
[19:36:03] James_F: that works for me, but I actually thought that would be too disruptive an edit. If you prefer that, I'm all for it :)
[19:36:12] guillom: Nah, 'tis fine. :-)
[19:36:49] James_F: Would you have time to do the splitting? You'd know better than me.
[19:36:58] guillom: Yeah, no problem at all.
[19:37:06] Thank you!
[19:42:54] guillom: Done - does that help?
[19:44:38] James_F: ugh, are you telling me that there was *already* an "Open tasks" section that was split out, and that I didn't see it?
[19:44:43] Guys, I am wondering if there could be a reason why I am unable to access Gerrit? (I may have missed something)
[19:45:07] guillom: Mostly. It was "Other tasks" and some of it was in-flight (moved that out). Don't worry about it. :-)
[19:45:20] Ignore what I said. It fixed itself :)
[19:45:20] guillom: Anyway, must run.
[19:45:25] ok
[19:45:34] thanks
[19:46:49] James_F|Away: I guess I just don't understand the diff: https://www.mediawiki.org/w/index.php?title=Admin_tools_development&action=historysubmit&diff=661070&oldid=661025
[19:47:13] Oh, well.
[20:46:13] guillom: See also https://www.mediawiki.org/w/index.php?title=Admin_tools_development/Roadmap&diff=prev&oldid=661071 which is the important bit. :-)
[20:52:49] hello
[20:53:20] notpeter / binasher: i have a question about db access
[20:53:29] toolserver gets a new ip range
[20:53:53] do you have to make another access entry for the replication?
[20:54:39] nosy: i actually have no idea
[20:55:36] nosy: i don't know how toolserver manages to connect to our databases, and if it gets nat'd to an internal ip or not
[20:56:08] LeslieCarr: do you know how toolserver is provided network access to our internal dbs?
[20:57:03] binasher: we make a tunnel to private ip addresses via amaranth
[20:57:43] amaranth is the web server in the us
[20:58:11] i'm not aware of what amaranth is.. what's the fqdn?
[20:58:34] amaranth.toolserver.org
[20:58:39] but this ip won't change
[20:58:43] afaik
[20:59:21] but it looks like we would be safe just to test what happens if we come through our new range and probably switch back if we need access entries
[20:59:37] oh, *that*'s why it was a spof for replication...
[21:00:05] jeremyb_: ?
[21:00:18] binasher: i am not sure honestly
[21:00:57] nosy: amaranth was completely unresponsive for a bit a few months ago. that explains why that meant no replication
[21:01:08] amaranth.toolserver.org is a box in pmtpa?
[21:01:11] yes
[21:01:26] interesting
[21:01:41] nosy: if you're involved with toolserver databases, I have a couple of unrelated questions
[21:01:56] oh boy, i'll better leave :D
[21:02:01] ok, ask
[21:03:12] nosy: no changes in grants needed, they're granted via amaranth
[21:03:26] LeslieCarr: paravoid: amaranth should get a security audit
[21:03:39] binasher: why are you telling me?
[21:03:43] :)
[21:03:54] haha
[21:04:02] it's in the sandbox vlan
[21:04:07] nosy: we have a bunch of related DNS records
[21:04:13] it's all scary
[21:04:14] $name.db.ts.wikimedia.org
[21:04:28] aawiki-p.db.ts.wikimedia.org is an alias for sql-s3.toolserver.org.
[21:04:29] etc.
[21:04:39] are these still needed?
[21:04:48] LeslieCarr: how is it sandboxed? can only toolserver ip's connect externally?
[21:04:56] is it a known issue that interwiki links are breaking?
[21:04:57] lemme see the actual rules
[21:05:06] second question is if the whole ts.wikimedia.org zone is needed
[21:05:07] it's in a vlan with some stateless fw rules
[21:05:09] jira 1H IN CNAME web.amaranth.toolserver.org.
[21:05:10] confluence 1H IN CNAME web.amaranth.toolserver.org.
[21:05:10] fisheye 1H IN CNAME web.amaranth.toolserver.org.
[21:05:10] wiki 1H IN CNAME web.amaranth.toolserver.org.
[21:05:18] also no. MartijnH: examples please?
[21:05:21] jira.ts.wikimedia.org etc. that is :)
[21:05:45] LeslieCarr, on [[Leave_(U.S._military)]]
[21:05:51] paravoid: aawiki-p.db.ts.wikimedia.org - i don't have any idea if any tool uses this hostname
[21:06:00] I get Languages
[21:06:00] <0>
[21:06:03] sql-s3.toolserver.org would be what i recommended
[21:06:12] can I remove them?
[21:06:20] that is almost certainly not what it's supposed to be ;)
[21:06:51] paravoid: well, do it, and then i'll know what happened if users have questions
[21:07:02] what about the rest of the records?
[21:07:09] jira, confluence etc.
[21:07:17] everything under ts.wikimedia.org basically
[21:07:18] we still have these
[21:07:34] but i always use jira.toolserver.org etc
[21:07:41] I'm not talking about *.toolserver.org
[21:07:43] just ts.wikimedia.org
[21:07:49] can I just ditch that?
[21:08:04] if you like
[21:08:10] yes, I'd like that :)
[21:08:25] then go ahead
[21:08:32] at least i know of it :D
[21:09:57] basically it's allowed to access the database servers and has ssh to a few db servers
[21:10:35] Hi.
[21:10:41] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/78117 doesn't have CSS for me logged in.
[21:11:14] thanks Susan for the frontpage change
[21:11:30] No problem. :-) I think it looks much better.
[21:11:48] paravoid: you are involved in labs?
[21:12:03] if you have any questions about our structures just ask
[21:12:08] bits a bit awry?
[21:12:31] 503 :/
[21:12:43] hoo: I'm getting no CSS at https://www.mediawiki.org/wiki/Special:Code/MediaWiki/78117
[21:12:49] I'm not sure if it's just me.
[21:12:51] gah. Well, it works on the second load, and I accidentally deleted the evidence. I'll see if I can reproduce the issue, but I have little hope
[21:13:08] That happens once every 3 months maybe... and of course just when I want to debug some live JS -.-
[21:13:14] Hmm, working now...
[21:13:25] bits seems funky, though.
[21:13:30] It seems to be blocking page load.
[21:13:35] Or delaying, rather.
[21:14:07] https://ganglia.wikimedia.org/latest/graph.php?c=Bits%20application%20servers%20eqiad&m=cpu_report&r=hour&s=by%20name&hc=4&mc=2&st=1363641221&g=network_report&z=medium
[21:14:35] :-/
[21:15:21] notpeter: Is bits having issues? (Is this known?)
[21:16:40] Susan: there were just a couple of gracefuls of apaches (see -operations)
[21:16:50] that might account for the drops
[21:16:57] are problems ongoing?
[21:17:05] nosy: not much, no
[21:17:08] nosy: why?
[21:17:25] nosy: and how can I get a very confident "yes please get rid of all *.ts.wikimedia.org"? :)
[21:17:31] notpeter: Seems better now.
[21:17:36] https://ganglia.wikimedia.org/latest/graph.php?c=Bits%20application%20servers%20eqiad&m=cpu_report&r=hour&s=by%20name&hc=4&mc=2&st=1363641221&g=network_report&z=medium seems to be stabilizing.
[21:17:38] Susan: cool
[21:17:42] :-)
[21:17:50] nosy: should I ask more people perhaps?
[21:18:12] there was also a very brief rendering problem. and I don't know every way that mediawiki connects things...
[21:18:16] could also be the cause
[21:18:18] but
[21:18:21] glad that it's recovering!
[21:21:26] paravoid: if you don't mind you could write a mail to toolserver-l@lists.wikimedia.org
[21:22:00] notpeter: You are responsible for bits, it seems?!
[21:22:14] or i can, stating the records are cleaned up
[21:22:29] the bits must flow....
[21:22:35] afair the users should not use them anyway
[21:22:49] hoo: yep, notpeter is the one in charge of bits
[21:22:49] paravoid: Do we have request stats for *.ts.wikimedia.org?
[21:22:57] no.
[21:23:18] Well, those would likely be helpful. :-)
[21:23:38] notpeter: I guess you are into the numbers, hang on a second
[21:23:50] I know there was a move a long time ago away from *.wikimedia.org domains for security and security-through-segregation reasons, but some tools are very old and still quite popular.
[21:24:04] It'd be nice to know if anything is routing through these old host names.
[21:24:55] How much JS/CSS is usually transferred on page load? 50KiB? 100KiB? 200KiB? (Just guessing into the dark)
[21:25:21] nfi
[21:25:33] Compressed?
[21:25:37] Which wiki?
[21:25:41] Logged in or logged out?
[21:25:42] Susan: Yes, Wikipedias
[21:25:46] logged out
[21:25:53] Not all Wikipedias are crated equal. ;-)
[21:25:55] created, too.
[21:26:16] Susan: I just want to know whether adding about 190KiB per request is a problem
[21:26:24] ...
[21:26:26] not all requests
[21:26:27] Heh.
[21:26:36] That sounds like a lot.
[21:26:43] 190KiB is compressed size?
[21:26:44] Yes, I just thought of that
[21:26:51] Minified and gzipped?
[21:27:23] um, in all cases less is better
[21:27:33] Except free knowledge!
[21:27:45] but i think we tend to let each community have a lot of choice on how slow they make their frontpage ;)
[21:27:53] enwiki mainpage
[21:27:54] 62 requests ❘ 16.1 KB transferred ❘ 1.48 s (onload: 1.50 s, DOMContentLoaded: 741 ms)
[21:27:58] ok, revised: in data transfer cases, less is better
[21:27:59] :)
[21:28:03] :-)
[21:28:24] I'm not sure whether that size is before or after uncompressing
[21:28:29] 16.1 includes en.wikipedia.org and other domains (like bits)?
[21:28:36] And that must be gzipped, right?
[21:28:47] yup, and images and stuff
[21:28:47] so each 1500 bytes (~1.5 kB) is a potential round trip time
[21:29:15] Ok, only around 50 KiB when compressed
[21:29:19] with ?debug=true
[21:29:20] 163 requests ❘ 660 KB transferred ❘ 1.1 min (onload: 14.38 s, DOMContentLoaded: 13.82 s)
[21:29:44] hoo: What are you wanting to add?
[21:29:46] Reedy: How many pages w/o langlinks do we have? Do they get many views?
[21:29:48] Another JS library?
[21:30:00] Chrome has these useful things called "Developer tools"
[21:30:09] Susan: Well, the Wikibase wmf12 deploy will add a lot of JS
[21:30:17] To wikidata.org?
[21:30:22] Or to all sites?
[21:30:30] Susan: All clients (= Wikipedias)
[21:30:49] Link to changeset(s)?
[21:30:55] Susan: too many
[21:31:02] What's the JS doing?
[21:31:16] Susan: Linking articles on the client
[21:31:17] Susan: Loading
[21:31:22] :D
[21:31:32] Don't [[links]] do that already?
[21:31:46] Do you have a link to the relevant JS?
[21:31:47] Susan: I mean langlinks, different language versions
[21:32:15] https://github.com/wikimedia/mediawiki-extensions-Wikibase/blame/master/client/resources/wbclient.linkItem.js (the main "problem" is its dependencies)
[21:32:17] nosy: ts.wikimedia.org is gone
[21:32:28] nosy: let me know if unfixable-from-your-side problems arise :)
[21:32:35] paravoid: ok :)
[21:32:48] Someone should definitely e-mail toolserver-l about that.
[21:33:00] i'll do
[21:33:02] or anyone else in #-operations
[21:33:09] Cool, thanks, nosy. :-)
[21:33:18] thanks :)
[21:33:23] I'm reluctant to :)
[21:34:27] Reedy: So, in a nutshell: I don't have to worry? :P
[21:34:37] i think you do
[21:34:49] Adding 30% or so more uncompressed is a big increase
[21:35:02] Depends what that works out as being compressed...
[21:35:18] It's around 50KiB compressed
[21:35:50] but it only loads on pages w/o langlinks
[21:37:08] * hoo loves firefox's new PDF.js :)
[21:38:39] That's like 300% more
[21:39:01] Reedy: gnah... that's going live on Wednesday
[21:39:15] We could change it to only load for logged-in users easily, though
[21:39:17] I'm sure Susan will complain for you
[21:39:46] Who approved the changes to Wikibase?
[21:40:07] Susan: The wikidata team? Various persons, why?
[21:40:08] greg-g: Yo.
[21:40:23] aude: ^
[21:40:24] hoo: I think that kind of increased load might cause performance problems.
[21:40:49] Susan: Are you into bits as well?
[21:40:56] Not really.
[21:41:01] Except when it breaks.
[21:42:03] Susan: ? (on a call, but please leave a message ;) )
[21:42:28] greg-g: hoo says there's a good amount of additional JS intended to go out on Wednesday of this week.
[21:42:40] greg-g: I'm wondering if that needs to be flagged as a scaptrap.
[21:42:43] Only for pages w/o langlinks and only WPs
[21:42:53] How many pages is that?
[21:42:59] (Do we know?)
[21:43:11] Reedy: ping call? Core team?
[21:43:18] Susan: Probably many of them are barely viewed... but we (I) don't have stats
[21:43:21] Oh, cute, the new tool to add interwikis?
[21:43:23] Reedy: we're talking about the branching from today
[21:43:23] It's something like 3 million pages on en.wiki without langlinks, isn't it?
[21:43:28] Nemo_bis: Yeah :D
[21:43:36] hoo: wonderful!
[21:43:37] Susan: I have hardly an idea
[21:43:45] It'd be nice to know. :-)
[21:44:11] Is the JS lazy-loaded?
[21:44:16] Or could it be?
[21:44:57] Susan: We have several things we *could* do
[21:45:00] hoo: only on the 27th for non-en Wikipedias, right?
[21:45:05] I just don't know whether we need to do them
[21:45:19] Nemo_bis: non-WPs don't have the WB client
[21:45:35] Non-English Wikipedia, he's saying.
[21:45:45] yep
[21:45:47] Non-English Wikipedias, rather.
[21:46:00] No, the WB client is supposed to go live on the same day on all WPs, no? Reedy?
[21:46:16] hoo: I'm not on ops, but I think the additional load on bits should definitely be flagged for review.
[21:48:17] Susan: There are various things we could do, I'm just not sure which, and I dunno whether we need to backport them
[21:48:23] hoo: where is this JS going to be deployed?
[21:48:33] greg-g: Wikipedias
[21:49:15] hoo: all? what is this for? I don't see it on the Deployments calendar: https://wikitech.wikimedia.org/wiki/Deployments
[21:49:58] I didn't read the full scrollback, the call literally just ended.
[21:50:12] greg-g: Well, it's just a wikibase client change
[21:51:17] hoo: so, sorry, I just want some clarity here: is the code being deployed on the WMF cluster of apaches that serve all Wikipedias OR on the wikidata machine(s)?
[21:51:41] greg-g: WMF cluster
[21:51:44] if A (and even B, if of sizeable importance): it should really be on the Deployments calendar
[21:51:56] yeah, this should *definitely* be on the Deployments calendar, full stop.
[21:52:54] greg-g: Well, it's going to be part of "MediaWiki general deployment window (1.21) [Sam / Aaron / RobLa]" probably
[21:53:08] (I'm rather clueless here)
[21:53:29] is the code already in the wmf12 branch? https://www.mediawiki.org/wiki/MediaWiki_1.21/wmf12
[21:54:04] greg-g: It's in Wikibase's wmf12 branch which is not yet part of core wmf12
[21:54:34] if it isn't in the wmf12 branch that was deployed today by Reedy, then it won't/shouldn't be deployed on Wed
[21:54:43] see: https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap#Schedule_for_the_deployments
[21:55:07] greg-g: "Wikidata (repo + switching on all Wikipedias)" I guess that's us
[21:55:22] ahhhhhhh
[21:55:33] hoo: sorry, I was confused
[21:56:04] greg-g: you weren't entirely wrong. hoo, are you all planning to deploy stuff to all wikis next Wednesday that won't be tested on the smaller wikis this week?
[21:56:04] so, yeah, this is probably scaptrap-worthy, to answer Susan's question
[21:56:32] Please don't use relative time.
[21:56:53] The deployment schedule says Wikidata will be updated/deployed Wednesday, March 20.
[21:57:43] Susan: that may have been a thinko by one of us (likely me)
[21:58:08] Sure. :-)
[21:58:35] robla: I think the plan is to have all WPs at the same version, but I'm not into that... I just came here to mention that a rather big amount of JS is going to be added to *some* pages
[21:58:35] anyway, disappearing into a phone call right now
[21:58:39] all time is relative !
[21:59:09] hoo: will you/others familiar with the code be available during that deployment window on Wed?
[21:59:32] greg-g: I can hang around, probably. Which time (UTC) is that?
[21:59:35] greg-g: It's not clear whether there is a Wikidata deployment window on Wednesday, March 20.
[21:59:48] greg-g: Can you ask the Wikidata folks if that schedule is accurate?
[21:59:52] I'm not really into deployments, though
[22:01:10] ok... here's what the schedule should be: * Wed 27th - completion of Wikidata phase II on IT, HU, HE Wikipedia and ru, tr, uk, uz, hr, bs, sr, sh
[22:01:17] greg-g: ^
[22:01:51] the March 20 date is incorrect and can be removed. may want to confirm that tomorrow with the Wikidata team generally
[22:02:07] will do on that part (confirming with wikidata)
[22:02:11] ok... now really I've gotta go.
[22:02:47] Fine... I was just as clueless about that date as you, I just knew about my JS thing
[22:03:00] greg-g: And updating https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap#Schedule_for_the_deployments (which is the source of March 20).
[22:03:01] thanks for the heads-up Susan/hoo, I'll confirm with Reedy/Wikidata, I assume the "Wikidata (repo + switching on all Wikipedias)" for Wed 3/20 was a copy/paste error by Reedy today
[22:03:09] yeah
[22:03:09] Cool. :-)
[22:03:16] Thanks for looking into this.
[22:03:28] 'tis my job ;)
[22:07:46] https://wikimediafoundation.org/wiki/20110808_country_specific_test/no/NO is giving me an error message.
[22:07:57] The green CANNOT_FORWARD one.
[22:08:00] It went away on refresh.
[22:08:05] I hate computers.
[22:11:50] Susan: Sounds like it's mutual
[22:13:19] * Susan cages Reedy.
[23:22:38] Guys, does anybody know what the table enwiki.ep_articles is for?
[23:24:38] probably EducationProgram
[23:25:43] TimStarling: is that still active?
[23:26:33] It is
[23:28:44] DaBPunkt: it's even expanding to other territories, it seems
[23:29:06] ok, tnx
[23:36:09] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/EducationProgram.git;a=blob;f=sql/EducationProgram.sql;h=073ca005ebbfe0fd9360f69c36787275d0ff1811;hb=HEAD
[23:36:16] ep_articles is in there.
[23:42:48] Susan: tnx. My google search was not successful