[15:26:10] btw guillom I removed something about socialcoding4good from the status msg for my engineering report section because it seems like sc4g wants to be a little more quiet for the moment (need to get more clarification on that but that's my understanding) [15:26:53] sumanah, ok. Since the page was public, I didn't think it was an issue. [15:27:30] understood [15:27:36] guillom: that is a very reasonable inference [15:27:42] guillom: which was my inference as well till I asked [15:27:46] :) [15:27:47] guillom: I guess it's softlaunchy? [15:27:50] ok [18:22:40] TheCount: I'm curious about the nick change, of course [18:22:59] Meeting in another channel. :) [18:27:59] sumanah, am I wrong or your last email to cultural-partners was not to wikisource-l as well? [18:28:24] Nemo_bis: I am not on wikisource-l [18:28:28] Nemo_bis: feel free to forward [18:28:40] sumanah, ok, so I'm giving you a link :) [18:28:43] Nemo_bis: and link to https://bugzilla.wikimedia.org/show_bug.cgi?id=32695 [18:29:28] sumanah, it should be http://lists.wikimedia.org/pipermail/wikisource-l/2011-February/000939.html [18:30:09] I am reading that now [18:30:25] Platonides: ^^ [18:32:19] sumanah, I think Alex wrote something to wikitech-l as well [18:33:40] Nemo_bis: feel like adding links to that bug in bugzilla? [18:34:06] ok [18:53:36] Good morning [18:57:04] Anyone here familiar with how centralnotice works internally ? [18:57:34] I can't find a way to export variables with a banner from php to the front. [18:57:45] good evening [18:57:49] Roan might know [18:57:56] Else Kaldari (But he's not about) [18:58:11] I don't, really [18:58:14] Ask Kaldari [18:58:49] It can't be in mw.config since it has to be cacheable, but there is an ajax request going on to get the banner, that's where I think custom variables can be added [18:59:35] which can then be used by javascript in building the banner. 
[18:59:37] Ping him an email and ask him to go on irc [19:00:15] Hello, and welcome to the weekly features meeting! [19:00:43] Oh right! [19:00:49] (the features meeting) [19:02:05] ohai! [19:03:14] http://etherpad.wikimedia.org/FeaturesTeam20111129 [19:05:40] RoanKattouw: Uploading WCN to commons/wikitech today. [19:05:48] RL2 presentation that is [19:06:50] yay [19:06:59] I watched the recording of our Wikimania talk today [19:08:07] i wasn't too happy with those recordings. [19:08:22] it was like, "zoomed in on the guy talking" and then the slides got lost and ignored. [19:08:38] Ours were bad in a different way [19:09:03] we could make a literal video out of that :P [19:09:06] There's a cut somewhere where about 20 seconds of video is missing [19:09:26] The slides are visible but hard to see [19:09:33] however, my understanding is that this is still a far better situation than the 2010 wikimania videos :) [19:09:45] There's a light spot obscuring the top left corner and the colors are bland and hard to distinguish [19:09:47] yeah, much better [19:09:47] the one of howie and me and swalling is missing the first 2 or 3 minutes. [19:09:47] sumanah: Heck yeah [19:09:49] also, they actually had tripods [19:09:59] It STILL bothers me that I've never seen my own 2010 presentation [19:10:00] unlike any video I've taken of our presentations :) [19:10:29] RoanKattouw: you're closer to Poland than I am, for interrogation/chasing/etc purposes :) [19:10:55] Believe me, we've tried [19:12:46] So, to the topic of the features meeting: I am once again announcing I'll be working very little this week, it's college application crunch time [19:58:06] Krinkle: howdy [19:58:21] kaldari: Hey, thanks for coming in.
[19:58:37] I can't talk for long though [19:58:59] So I'm working on the CentralNotice banner for research (Dario) [19:59:17] I can't find a way to add variables to the data exported to the front-end when a banner is requested [19:59:26] I need to add data that is queried from MySQL [19:59:54] Krinkle: have you heard from Jerome if eventually they need to have the user_name or user_id passed as a parameter? [20:00:19] and you want the data in the banner request itself? [20:00:49] that would ruin the caching [20:01:39] what sort of data are you trying to collect? [20:01:57] kaldari: Well, I don't want to force any method in particular. Basically I need to know certain variables (i.e. account age, boolean true/false if user has 20 edits in last X days, and some others). These values can be cached in that they need not be live, but fixed at time of banner launch [20:02:02] however it is both user and banner specific. [20:02:36] you need this data for every banner impression, or just for people who click the banner? [20:03:08] every, since whether or not to show this banner (or another) depends on it. Only users with certain user group memberships and account age are shown this banner. [20:03:30] rectify [20:03:40] Without "(or another)" [20:03:56] It's fine that if this banner is chosen, in javascript either something or nothing gets shown. [20:04:12] ah, that's a tricky one to implement. The only way you can do it currently is display a blank banner, evaluate the user with client-side js, and then conditionally display the banner. [20:04:37] I got that far, but my problem is getting that data to client-side JS [20:04:38] actually, that's not a very clear explanation... [20:04:46] only for logged-in users though [20:05:04] banners are created entirely on-wiki it seems. [20:05:14] There are some obscure api methods you can use via ajax [20:05:37] lemme find it...
[20:05:38] Krinkle, if you find a solution please document it somewhere, because many asked this before [20:05:57] Nemo_bis: yes, will do [20:06:02] thanks [20:06:38] http://www.mediawiki.org/wiki/Extension:UserDailyContribs [20:07:07] it's enabled on the cluster and has an API for querying user data [20:07:33] the API is fairly limited though [20:08:37] http://en.wikipedia.org/w/api.php search for "userdailycontribs" [20:08:38] Hm.. I was thinking more towards a PHP solution (i.e. api.php(or Special:NoticeFoo)?getdatafor=banner_id&format=json) [20:08:51] but that api might be enough [20:09:14] There's no easy way to do it server side [20:09:33] You would have to rewrite some of SpecialBannerController [20:09:34] I'd have to do quite a few api queries though. some to userdailycontribs, some to userinfo, some elsewhere [20:10:01] why's that? [20:10:16] oh you need a bunch of data huh? [20:11:06] yeah [20:11:23] There might be some undocumented params in UserDailyContribs, but probably not for everything you need [20:12:07] *Krinkle waits for gDocuments to load [20:12:47] Krinkle: Take a look at http://meta.wikimedia.org/w/index.php?title=Special:NoticeTemplate/view&template=2011EditorSurvey [20:13:29] it looks like you can get registration date from that API [20:13:49] although, I must warn you account registration date isn't 100% reliable. [20:13:55] some accounts are missing that data [20:14:04] especially really old ones [20:14:33] but I think it's less than 1% of accounts [20:14:41] dario might know better than me [20:14:46] argh, I can't find the document. [20:15:16] kaldari: I'm on a skype call, will read you later [20:15:27] DarTar: The specs were in a google document, right ? [20:15:49] For some reason the mail thread about it got spread over three email accounts. [20:16:14] DarTar: Could you verify and/or fix that you got me as ttijhof@wikimedia.org instead of *@gmail.com ?
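[Editor's note] The approach being discussed — blank banner plus client-side JS that calls the userdailycontribs API over Ajax — can be sketched roughly as follows. This is a minimal illustration, not the actual banner code; the helper name is invented, and the parameter set (`action=userdailycontribs`, `user`, `daysago`, `format`) is taken from the example API URL quoted later in the log.

```javascript
// Illustrative sketch: build the query URL a banner's client-side JS
// would fetch to get a user's recent edit counts. In the real banner
// this would be requested with something like $.getJSON( url, callback ).
function buildUserDailyContribsUrl(apiBase, user, daysAgo) {
  var params = new URLSearchParams({
    action: 'userdailycontribs', // API module from Extension:UserDailyContribs
    user: user,                  // username to look up
    daysago: String(daysAgo),    // window: last N days counted back from now
    format: 'json'
  });
  return apiBase + '?' + params.toString();
}
```

The key constraint from the discussion is that this lookup happens in the browser, after the cached banner HTML arrives, so the server-rendered part stays identical for every user.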
[20:16:20] (from now on) [20:16:36] http://en.wikipedia.org/w/api.php?action=userdailycontribs&user=Krinkle&daysago=90&format=xml [20:17:00] That should get you most of the way there [20:17:35] kaldari: very nice [20:18:05] "To be eligible, an en:wp user must:" [20:18:10] * be an admin on en:wp (group=sysop); [20:18:16] or [20:18:27] * on en:wp, have a total number of edits of at least 300 prior to the launching date of the study + of which at least 20 edits within the last 180 days prior to the launch of the study; [20:18:37] or [20:18:40] * have had an en:wp account registered for less than 30 days prior to the launching date of the study; and in any case [20:18:48] * and in any case, not be a bot (group=bot). [20:18:50] be careful about copying code from that editor survey banner though, it sets some cookies and does other weirdness as well, which you probably don't want. [20:19:11] I'll look out for that [20:19:19] geez, that's complicated :P [20:19:57] yeah, I'm somewhat confused by some of it though, why include anyone under 300 edits that's less than 30 days old ? [20:20:07] but I think you can get it all from the API and the user's groups [20:20:10] but for accounts older than 30 days, only if more than 300 edits [20:20:26] kaldari: yep, user groups are exported in mw.config already [20:20:38] wgUserGroups [20:20:49] which is nice [20:22:02] I gotta run, need anything else? [20:22:21] kaldari: Don't think so, this is enough stuff to keep me busy and hopefully finish it all. 
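[Editor's note] The eligibility rules quoted above (sysop, or 300+ edits with 20+ in the last 180 days, or an account under 30 days old, and never a bot) reduce to a short boolean check. A hedged sketch — the field names (`groups`, `editCount`, `editsLast180`, `accountAgeDays`) are illustrative, not the names used in the actual banner:

```javascript
// Sketch of the survey eligibility rules discussed in the log.
// Returns true if the user falls into any of the three target groups.
function isEligible(user) {
  if (user.groups.indexOf('bot') !== -1) {
    return false; // in any case: bots are excluded
  }
  if (user.groups.indexOf('sysop') !== -1) {
    return true; // group 1: en.wp admins
  }
  if (user.editCount >= 300 && user.editsLast180 >= 20) {
    return true; // group 2: established editors, recently active
  }
  return user.accountAgeDays < 30; // group 3: new accounts, any edit count
}
```

As noted later in the log, `wgUserGroups` is already exported in `mw.config`, so the group checks need no extra API call; only the edit counts do.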
[20:22:27] Thanks so much [20:22:28] cool [20:23:01] yuck, now for getting it up on prototype :P [20:23:18] and manipulating accounts to get test subjects that can actually match the parameters [20:23:33] Krinkle: if you feel like documenting your solution: http://meta.wikimedia.org/wiki/Help:CentralNotice [20:25:32] kaldari: will do once I verify it works for me :) [20:28:07] re [20:28:38] Krinkle: updating my address book [20:30:01] Krinkle: do you want me to ping Jerome and ask him to join us in this channel? [20:30:18] he's online on skype [20:30:32] well, I'm feeling like I can go back to work now. Is there something you'd like to discuss? [20:31:17] I see you had a question regarding the account reg date + edit count [20:32:06] so if the specs are confusing we should ask Jerome to confirm what they need [20:32:47] personally I found the parameters a bit odd, but I don't know the survey so.. [20:32:54] but sure, let's ask to be sure :) [20:32:58] before I knock myself out [20:33:41] ok hang on [20:37:48] Krinkle: For the editor survey banner, couldn't you just use autopromote to resolve those conditions server side? [20:38:05] APCOND_* allow complex expressions involving edit count and account age [20:38:10] it's not boolean, the values need to be sent to the survey server [20:38:19] Oh [20:38:19] the individual values that is [20:38:23] Right [20:39:01] RoanKattouw: first time we do this, that's why we have Tech + Community + Legal + RCom involved [20:39:15] DarTar: What ARE you doing, anyway? [20:39:17] which reminds me, DarTar have you checked out (or otherwise know there is no issue with) sending all this information to a third party from clicking a banner ? [20:39:40] oh never mind DarTar , we already did that.
It's all in the public API [20:40:49] RoanKattouw: running a CN banner to send eligible registered users to an external survey + behavioral economics exp run by Harvard/Sciences Po [20:40:54] the documentation is here: http://meta.wikimedia.org/wiki/Research:Dynamics_of_Online_Interactions_and_Behavior [20:41:03] OK [20:41:33] DarTar: Do you have a link to the google doc or etherpad with the specifications ? [20:41:44] I can't find it, searched both my google accounts [20:41:54] Krinkle: sure, hang on (Jerome is joining in a moment btw) [20:41:57] ok [20:42:07] http://etherpad.wikimedia.org/HarvardSciencesPo [20:42:20] That's the one ! [20:43:14] DarTar: Rrr, dinner's calling. Back in 30 minutes [20:43:22] np [20:56:17] hey Jerome_ [20:57:02] Krinkle-away: Jerome_ is here if you need to ask him about the editor metrics when you're back from dinner [21:00:50] anybody speak spanish well enough to help somebody in #mediawiki? i'm pretty rusty [21:04:24] Hi Jerome_ [21:05:18] so two things. A question about the parameters, and a small announcement about one aspect that can't be implemented (or is unlikely to be) [21:05:51] first, when looking over the parameters I found something odd, may have been intentional but just making sure this is the case [21:06:53] so basically the three groups targeted: 1) experienced wikipedians who became 'sysop', 2) users with recently active accounts over 30 days old with 300+ edits, 3) users with accounts less than 30 days old, regardless of edit count. [21:06:55] correct ? [21:07:46] It sounds like you could simplify #2 to "users with 300+ edits" and get the same result [21:08:08] RoanKattouw: 300+ edits, 30 days old and 20 edits in last 3 months [21:08:08] The "over 30 days" part is redundant, it doesn't actually exclude anyone [21:08:24] Right but still [21:08:41] yeah, but it's not a problem to check account age the same time.
[21:08:54] You don't need to include age > 30d as a criterion if anyone with age <= 30d already gets a free pass due to #3 [21:09:09] right [21:09:16] *RoanKattouw recognizes he's splitting mathematical hairs [21:09:34] makes sense [21:10:41] Jerome_: okay, It appeared to me as if there were 3 groups that almost cover all users but with small gaps in between that didn't make sense, but it appears there are no small gaps. It's fine as it is, I'm not going to doubt that, these specifications are written and provided, no problem [21:11:39] Jerome_: So the second thing is, the specifications include "at the launching date of the study". That may be a problem as the API for these statistics only takes the number of days from 'now' back in the past (e.g. last day, last month, last 192 days...) [21:12:25] not "between 182 days ago and 2 days ago" [21:16:07] Oh I'm really sorry! Somebody just rushed into my office to ask some urgent questions. [21:16:19] I'm catching up now on what you guys wrote... [21:18:03] 2) users with recently active accounts over 30 days old with 300+ edits [21:18:14] This is not what's in the connection protocol [21:18:29] from what I remember the protocol says "300+ edits, 30 days old and 20 edits in last 3 months" [21:18:55] "total number of edits of at least 300 prior to the launching date of the study + at least 20 edits within the last 180 days prior to the launch of the study" [21:19:06] right, 6 months (180 days) [21:19:30] Which practically means 300+ edits, 180+ days old, 20+ edits within those last 180 days [21:19:48] (prior to the launching of the study) [21:19:53] right?
[21:20:11] No [21:20:17] 180 days old is not necessary [21:20:29] it doesn't say the account has to be 180 days old [21:20:42] I could make a thousand edits in the next hour and qualify [21:21:37] I made it "30 days" in my description but that won't be in the actual program code, it's implicit as group "3" says "less than 30 days old, any edit count" so it doesn't matter how old the account, since less than 30 they may not qualify as group 2 but they auto-qualify as group 3. [21:21:55] RoanKattouw: Then you'd qualify as group 3 also [21:22:10] Oh, right, age < 30 [21:22:15] So the question is, does the survey want to target users with 300+ edits that are between 30 and 180 days old ? [21:22:24] No. [21:22:55] but they do target users with any number of edits less than 30 days old. [21:23:08] That's what I explained in my last mail. There was a small ambiguity there, which is not too detrimental... (users who make 300+ edits in less than 180 days are very few I guess) [21:23:16] Yes. [21:23:25] So 3 categories to sum up. [21:23:26] the problem is that, due to the inability to check "at point of launch date" the numbers will change over time [21:23:50] making it kinda weird. If my account is 28 days old with 400 edits, I will qualify today as "group 3" but in 3 days I won't qualify under any group. [21:24:26] Ah. So we're unable to subject our metrics to one specific point in time (i.e. the launching date of the study)? [21:24:43] It should be possible to agree on a cut-off date for acct creation, right? [21:25:10] The "n days prior to the start" thing isn't feasible for edit counts but it is feasible for account age [21:25:35] Krinkle, do you second that: The "n days prior to the start" thing isn't feasible for edit counts but it is feasible for account age [21:26:31] Sure, account age is available as a timestamp.
We cannot query "account less than 30 days old at point x" but we can query "account created before date X" (so we calculate the date before doing the query, no problem) [21:26:37] If this is so, what we can do is check for account creation subject to the launching date of the study but have our edit count metrics determined in a dynamic fashion. [21:26:53] Jerome_: Right [21:27:12] This will not change much in the sample of eligible users I guess... [21:27:20] It's important that users cannot disappear from the target group over time, being added is fine I guess. [21:28:13] So if you're a sysop you're a sysop. That's not subject to change. So group one is not a problem. [21:28:19] edit count can decrease over time (deletion of stuff), however the API I intend to use for this is a statistical API, one that does not decrease so we're safe there too. [21:28:34] sysops can get desysopped, but that's rare [21:28:53] <^demon> user_editcount doesn't decrease on page deletion iirc. [21:28:58] Well, I should say uncommon [21:28:59] i.e. the edit count is not queried live, we have edit counts per day per user in a special statistics table. [21:29:12] ^demon: dunno when, but I know that it can (or at least could in the past) decrease, which makes sense. [21:29:26] Then account creation for less than 30 days prior to the launching date of the study: we can do it. So this group is safe too. [21:30:49] So what will be determined dynamically are the criteria for group 2: 300+ edits, 180+ days old, 20+ edits within those last 180 days [21:31:29] yeah, the total edit count is pretty safe, but the tricky one is 180 days. [21:31:35] Do you think that the people who will qualify under those 3 requirements will change over an 8-day period...
[21:31:53] should that be last 180 days as of now, or last 180 days + number of days since launch [21:32:17] By the way, those 3 requirements are pretty much those which were used to qualify as a voter or candidate for the last board of trustees election. [21:32:19] I suggest we do the latter, otherwise users could disqualify halfway through the banner presentation if they made more edits in the past than they do now. [21:33:16] Jerome_: yes, but those kinds of votes are either manually checked afterwards through a direct database query, or it's done server side on submission. Not client-side through a limited API that is loaded for almost all users of the site on any page :) [21:34:00] Krinkle: Okay I see. [21:36:01] Hm.. while I was testing this locally I just realized we may have a more important problem. I was changing the numbers in the API query and realized.. anyone can do that from the client side if they wanted to. Then I remembered the specification mentioned a token that we should use and generate an md5 hash. [21:36:03] Krinkle: So what you want to avoid is users with 300+ edits who made 20+ edits in their last 180 days as of the launch of the study but not anymore a few days after? [21:36:09] so that people can't fake this [21:36:14] What you want is keep those in the sample, right? [21:36:41] Yes. We have a secret key. [21:37:49] Krinkle: So this should prevent any problem, right? [21:37:50] Jerome_: Use case: User X made 20+ edits throughout last 180 days, tomorrow the last of those 180 days is no longer the last 180 days. [21:38:18] Jerome_: yes that should prevent any problem with data tampering [21:38:40] however... since we're on the client side.. the token is currently public in my local testing [21:38:44] it's right there in the JS [21:39:11] Krinkle: then this guy does not qualify anymore if we check the metrics dynamically. I am with you on this. [21:39:48] Do you think this is an issue? My guess was that very few users would be concerned by this...
[21:40:33] Jerome_: yeah that could potentially happen, but it's not a big deal and there's a simple fix. Instead of querying dynamically "last 180 days", we could query "days=180+(days since launch), last {days} days:" [21:41:58] Krinkle: "days=180+(days since launch)" => this is okay for me I guess. [21:43:51] Krinkle: so do you want to implement the system this way? [21:44:27] Krinkle: if so, I'll just inform our team here to let them know and make sure that doesn't trigger any limitations on our side. [21:44:33] Krinkle: sounds good? [21:46:30] yes [21:47:07] so regarding the hash, the reason for the hash is to make sure users can't submit random data [21:47:49] getting that hash without exposing the token is never watertight on the client side, i.e. not gonna happen. I could figure out a server side way (perhaps write a new mini-API) [21:48:02] however, alternatively I could make the banner give you a user_id [21:48:17] then your end could query the API for this user and verify [21:49:19] So I was currently answering to Dario's last mail, basically proposing that we use numeric user ids as login instead of usernames. [21:50:02] Does that help? [21:51:38] yes [21:52:05] well, it doesn't matter for my end but I figure it might help on your end due to the non-latin characters you mentioned being problematic [21:52:11] what is the username used for ? [21:53:23] What we want is to avoid double participations. So we check that the same login does not complete the study twice. [21:53:42] Previously login=username. Now we propose login=numeric user id. [21:54:18] right so the hash isn't only used to avoid invalid submissions with fake numbers (i.e. user=newbie20000 usergroup=sysop (yeah right)) [21:54:28] but is also used to verify that the user is indeed that user [21:54:33] yes. 
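[Editor's note] The "days = 180 + (days since launch)" workaround agreed above keeps the window's starting edge pinned to the launch date even though the API can only count backwards from "now". A small sketch (function and variable names are illustrative):

```javascript
// Sketch of the widening-window workaround: the API only supports
// "last N days from now", so N grows by one each day after launch,
// keeping the 180-day window anchored at the study's launch date.
var MS_PER_DAY = 24 * 60 * 60 * 1000;

function daysAgoParam(launchMs, nowMs) {
  var daysSinceLaunch = Math.floor((nowMs - launchMs) / MS_PER_DAY);
  return 180 + daysSinceLaunch; // value passed as the API's daysago param
}
```

This way a user who had 20+ edits in the 180 days before launch can never drop out of the sample mid-campaign; at worst a few extra users qualify as time passes, which the discussion deems acceptable.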
[21:55:14] so using user_ids and letting your end get this data based on the user_id doesn't help as a user could submit the form with a different user_id if there is no check hash [21:55:16] so we need a hash [21:55:19] either way [21:55:55] It would be better, yes. [21:56:00] which means server side since JS isn't good at MD5 and we obviously shouldn't expose the token/salt [21:57:07] CentralNotices are flexible, very flexible. But the one aspect that is currently consistent is that it's entirely created on-wiki and rendered client-side. [21:57:42] So could you tell me with very simple words what's the problem with the secret key? :) [21:58:02] (I don't write code indeed!) [21:58:13] It's trivial to create a little API for this that just returns md5( $input-from-ajax . 'token here'); [21:58:30] oh sorry, I'll translate that to english [21:58:32] :) [21:58:54] Thanks! ;) [21:59:40] So the problem is that right now our CentralNotices are entirely created through wiki-pages (which are public) and rendered through JavaScript (i.e. the program code is sent to the browser and rendered there). This makes them very flexible and makes it very easy to cache since the part that's done on the server is the same for every user, and in the browser it does something different depending on who you are [22:00:23] However if we'd generate this check hash from within the browser, then the browser needs to be given the secret key as well [22:00:28] Okay. [22:00:50] In which case it's not exactly secret any more [22:00:55] which destroys the purpose of the key since once it's in the browser anyone with a little programming skills can easily see it and make their own fake hashes. [22:00:59] I see. So the secret key would be made public information to anybody who is knowledgeable about this. [22:01:30] well, no, we're not going to do that. In my opinion we either find a way around this or don't use check hashes at all.
[22:01:53] The computation would have to be done server-side [22:01:57] That shouldn't be too hard, right? [22:02:16] Yeah sure! Okay. So you propose to write up a short API that is not publicly accessible for the purpose of generating and sending the secret key. [22:02:21] Generate md5( $username . '|' . $editcount . '|supersekritkey' ); on the server and export it to JS [22:04:02] RoanKattouw: Well, PHP doesn't know about the banner at time of parsing a normal page (and shouldn't due to caching). Neither is there currently a way in CentralNotice to send extra data in the Ajax response since all properties about CentralNotices are kept in raw-html ns-mediawiki messages. [22:04:24] so hack it [22:04:31] so we'd simply export a mw.config variable in Extension:CentralNotice regardless of the banner for a little while [22:04:32] yeah, [22:04:36] Hack something in that exports that hash unconditionally [22:04:38] Yeah [22:04:58] that's still very safe, at least safe enough for the banner to verify and avoid fake submissions [22:04:59] awesome [22:06:47] So bottom line: it is possible to generate the secret key and send it to our server on the server side? [22:07:46] yes, the secret is kept on the server [22:07:52] Cool! :) [22:08:30] So are there any further things you'd like to discuss? [22:09:09] I'm happy to talk through whatever concerns/questions you may have. [22:09:20] Nope, seems like we're getting everything we intended. [22:10:18] Great. I will send this discussion to our IT team so that they are updated on what you're up to. [22:10:26] ok [22:10:47] Then, I'll send you an updated Connection protocol in a bit. [22:11:01] Mind if I read it first ? There's a bunch of stuff in between that was later made redundant, may wanna keep that out. [22:11:19] (in the IRC conversation) [22:11:51] Would you like to take the basic gist out of it and send me a clear transcript?
[22:12:04] sure, that works too [22:12:08] I'll mail it to you shortly [22:13:12] Or I could send it as a reply to the mail group thread we already have [22:13:13] Thank you Krinkle! This is a great (and very nice) idea of yours! :) [22:13:34] You're welcome. [22:13:51] Sure. You can write up a group update on the definition of the user metrics + secret key implementation. [22:14:04] I think you would do this way better than myself... [22:21:45] RoanKattouw: UserDailyContribs seems to use both Ymd and Y-m-d mixed when interacting with the date column in the database (which is of mysql type "DATE"). [22:21:51] which would you recommend using as API format ? [22:22:06] Ugh, MySQL DATE, really? [22:22:21] Yep, wholly [22:22:23] API output: use TS_ISO_8601 [22:22:35] input for basedate [22:22:45] API input: feed the input value through wfTimestamp(), so it's "anything accepted by wfTimestamp()" [22:22:56] right [22:23:02] I'm not sure whether we have a TS_ constant for MySQL dates though. TS_DB is for Postgres [22:23:08] which is strtotime-like ?
[22:23:13] The main reason for this is that people aren't supposed to use DATE [22:23:16] No, not quite [22:23:24] wfTimestamp() accepts all TS_* formats [22:24:05] For most purposes, something that accepts ISO 8601 (timezoneless) as well as MW format (YYYYMMDDhhmmss) as well as UNIX timestamps is good enough [22:24:09] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/UserDailyContribs/UserDailyContribs.hooks.php?revision=91244&view=markup#l46 [22:24:09] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/UserDailyContribs/UserDailyContribs.php?revision=99005&view=markup#l49 [22:24:22] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/UserDailyContribs/patches/UserDailyContribs.sql?revision=73480&view=markup#l1 [22:24:25] those three show it [22:24:30] Ymd, Y-m-d, DATE [22:25:01] Right [22:25:11] well gmdate() takes a UNIX timestamp as the second param [22:25:35] So you could use gmdate( 'Y-m-d', wfTimestamp( TS_UNIX, $params['basetimestamp'] ) ); (or Ymd) [22:26:20] mysql doesn't care ? [22:26:58] I don't know [22:27:00] Read the MySQL docs [22:27:06] yeah, will do :) [22:27:57] so I'd prefer if the basedate didn't require giving a phony timestamp. Perhaps basedate=Ymd, and then $params['basedate'].'000000' ? [22:28:12] Hrmph [22:28:33] Then you wouldn't have to convert it twice [22:28:37] or is it too standard in the API to give dates always full, in that case I have no problem sticking to that convention [22:28:42] I would prefer that basedate be a timestamp [22:28:51] Because it makes validation so much easier [22:28:57] alrighty [22:29:18] If basedate is Ymd you're gonna either stick it directly into the query (eek) or convert it in a 3-step loop [22:30:08] oh no, I'm not gonna stick it into the query [22:30:15] I was gonna feed it in place where you had put it [22:30:27] gmdate( 'Y-m-d', wfTimestamp( TS_UNIX, $params['basedate'].'000000' ) ); [22:31:54] no PARAM_TYPE 'timestamp' ?
[22:32:15] I think there is one [22:32:27] I thought so [22:32:32] Yeah so that's the timestamp conversion loop^Wchain I was talking about [22:33:30] 'timestamp' is used indeed by several core APIs, it just wasn't listed in the switch() for PARAM_TYPE help message [22:33:57] or it's in a separate switch(), nevermind :) [22:34:14] got it [22:34:18] perfect! [22:38:04] Krinkle: I just sent the updated version of the connection protocol. Just to let you know. :) [22:38:13] Thx [22:39:13] If anything is unclear, please just send me an e-mail saying "Jerome come back to IRC, we need to talk!", okay? [22:39:29] Sure [22:39:50] Cool. Keep us updated on this conversation then! [22:40:00] And thanks a lot for your work! [22:41:56] You're welcome.
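[Editor's note] The timestamp conversion chain settled on above — accept a flexible timestamp as input, normalize it, and format it as `Y-m-d` for the MySQL `DATE` column, as in PHP's `gmdate( 'Y-m-d', wfTimestamp( TS_UNIX, $params['basedate'] ) )` — can be illustrated with this sketch. JavaScript is used only for illustration; the function name is invented, and only the MW 14-digit format and UNIX seconds are handled here, whereas `wfTimestamp()` accepts more formats:

```javascript
// Normalize a timestamp to the Y-m-d string a MySQL DATE column expects.
function toMysqlDate(input) {
  var d;
  if (/^\d{14}$/.test(String(input))) {
    // MW format: YYYYMMDDhhmmss (only the date part matters here)
    var s = String(input);
    d = new Date(Date.UTC(+s.slice(0, 4), +s.slice(4, 6) - 1, +s.slice(6, 8)));
  } else {
    // UNIX timestamp in seconds
    d = new Date(Number(input) * 1000);
  }
  return d.toISOString().slice(0, 10); // e.g. "2011-11-29"
}
```

Validating on a full timestamp and converting at the last step (rather than accepting `Ymd` directly) mirrors the preference voiced in the log: one well-known input format, one conversion point, and nothing user-supplied pasted into the SQL query.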