[03:49:39] [[Tech]]; Bomberswarm2; /* cast generator */ new section; https://meta.wikimedia.org/w/index.php?diff=15281352&oldid=15276376&rcid=7309586
[05:20:48] hi
[05:21:47] I need to import tasks from a crappy custom ticket tracking system into Phabricator's Maniphest, and it seems you guys have experience with that from the Bugzilla migration :)
[05:23:20] how did you manage to change the creation timestamp of tasks and comments in phab?
[05:24:11] hmm
[05:24:32] in https://phabricator.wikimedia.org/T12543 I see all comments are from bzimport and credited to the original user in the comment text itself... is there no way around that? :/
[05:26:41] PovAddict: In that case, the user didn't associate his account with his old Bugzilla email.
[05:27:10] oh, now I see the very last comment is attributed properly
[05:27:14] Notice the bottom comment? We only migrated early last year.
[05:27:33] at first I thought all comments were attributed to bzimport
[05:27:46] because skimming is the best I can do at 2:20am
[05:27:54] No. You picked an odd bug to look at. Hold on.
[05:27:58] :)
[05:28:13] I literally typed random numbers after a T
[05:28:34] https://phabricator.wikimedia.org/T69069 Here is one that was imported fairly completely.
[05:28:57] that looks pretty great :o
[05:29:24] I wasn't part of the migration, so I can't offer any information beyond that. My apologies.
[05:30:11] any chance the scripts are somewhere public?
[05:30:31] Probably so. Let me look.
[05:35:20] PovAddict: I believe it is in https://phabricator.wikimedia.org/diffusion/PHTO/browse/master/ ; however, I'm not positive.
[05:35:29] https://phabricator.wikimedia.org/T259 was our migration task.
[05:35:43] I saw T259 and a few of its blockers
[05:37:51] Okay.
[05:38:32] Again, I'm sorry I wasn't involved in the migration and don't know more.
[05:39:24] Username "chasemp" appears to be the go-to person, but he doesn't appear to be on right now.
[05:41:53] hm, migration date was Nov 22 2014?
[05:42:45] Sounds about right.
[05:43:05] then I'm not sure if https://phabricator.wikimedia.org/T69069 proves much, since most comments and changes were done after that :D
[05:46:02] About there, yes. https://blog.wikimedia.org/2014/11/24/welcome-to-phabricator-wikimedias-new-collaboration-platform/
[05:46:14] but anyway
[05:46:18] it seems y'all did do what I need to do
[05:47:52] Yep.
[05:48:09] Again, I don't know too much. casemp will, but he is not on right now and I don't know when he will return.
[05:48:27] well, I don't know what timezone he is in, but this is generally a bad time in most places :D
[05:48:34] so that's expected
[05:50:35] *chasemp
[05:50:36] Yes.
[06:02:22] I thought Conduit had been modified to allow the bot to make changes it's not usually allowed to
[06:03:01] looks like that's not the case; bugzilla_create.py uses Conduit to create the task and then raw SQL to edit fields like the creation timestamp
[11:13:00] [[Tech]]; Ruslik0; /* cast generator */; https://meta.wikimedia.org/w/index.php?diff=15282460&oldid=15281352&rcid=7310821
[18:17:51] tgr, anomie: I think there are still occasional errors when a user tries to do OAuth with a just-created account.
[18:18:33] Unless it was a different user with the same IP (possible, as it's a student in a Wiki Ed class), I think it happened to this user: https://en.wikipedia.org/wiki/Special:Log/Happy2016
[18:19:04] ragesoss: Can you be more specific than "occasional errors"?
[18:19:22] "error":"mwoauthdatastore-access-token-not-found","message":"No approved grant was found for that authorization token."
[18:20:17] ragesoss: Also, I note only mediawiki.org and the test wikis are on wmf.11 at this point...
[18:20:26] anomie: that's what breaks the mediawiki omniauth gem (which is the fault of that gem, for not handling the error better).
[18:20:33] anomie: I don't think this is new.
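[Editor's note: the two-step trick mentioned at 06:03:01 — create via Conduit, then backdate via raw SQL — can be sketched as below. This is a hedged illustration, not code from bugzilla_create.py; the table and column names (`maniphest_task`, `dateCreated`, `dateModified`) reflect Phabricator's MySQL schema as commonly documented, and the helper only builds the statement rather than executing it.]

```python
def backdate_task_sql(task_id, created_epoch):
    """Build the raw UPDATE used to backdate an imported Maniphest task.

    Conduit's maniphest.createtask always stamps tasks with "now", so an
    importer has to reach behind the API and rewrite the row in MySQL
    after the task exists. Timestamps are Unix epoch seconds.
    """
    sql = (
        "UPDATE maniphest_task "
        "SET dateCreated = %s, dateModified = %s "
        "WHERE id = %s"
    )
    return sql, (created_epoch, created_epoch, task_id)

# Example: backdate task 42 to the original tracker's creation time
# (1416614400 is 2014-11-22 00:00 UTC, the migration date named above).
sql, params = backdate_task_sql(42, 1416614400)
```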
[18:21:09] it's just that I didn't have enough info to understand why my system occasionally blew up during user login.
[18:21:55] but we added some new debugging code to figure it out, and that's the message I found this morning from the first failed login since the debugging was in place.
[18:21:57] so it's like: register on enwiki -> authorize OAuth on enwiki -> try an API request to enwiki -> mwoauthdatastore-access-token-not-found ?
[18:22:06] tgr: right.
[18:22:35] (We also added a useful error message telling the user to try again, so it looks like it worked on the second try for that user.)
[18:23:18] specifically, it's register on enwiki with a returntoquery that immediately takes the user to the authorize URL.
[18:24:50] https://dashboard.wikiedu.org/users/auth/mediawiki_signup
[18:25:35] or for test.wikipedia: https://dashboard-testing.wikiedu.org/users/auth/mediawiki_signup
[18:27:25] master-slave lag, maybe?
[18:27:38] if the API request follows the authorization very closely
[18:29:52] yeah, it could be almost immediate if the user clicks quickly. Or pretty much immediate every time, if that routine fires before the user clicks Allow.
[18:30:27] (that is, when going through the new account flow from our site)
[18:30:31] how often does this happen? could you tell the difference immediately if it was fixed?
[18:30:54] (immediately = in a few days)
[18:31:00] yes.
[18:31:14] happens several times a day, during this busy sign-up time.
[18:31:24] cool, can you file a phab task for it?
[18:31:30] will do.
[18:41:43] Nikerabbit: Do you have a link to the AbuseFilter surfacing work? One of the stewards was asking me about that recently.
[18:42:42] csteipp: sure
[18:43:02] csteipp: top-level ticket https://phabricator.wikimedia.org/T114621
[18:43:59] Nikerabbit: Cool, thanks
[18:44:14] csteipp: https://phabricator.wikimedia.org/T114621#1968787 might be a useful update... there is quite a lot of chatter in the task
[18:44:31] Can we kill AbuseFilter altogether?
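[Editor's note: if replica lag is indeed the culprit, the workaround the dashboard already hints at (asking the user to try again) can be automated client-side with a short retry loop. A minimal sketch, assuming a hypothetical `api_request` callable that raises `OAuthTokenNotFound` whenever the API answers mwoauthdatastore-access-token-not-found; these names are illustrative, not from any real client library.]

```python
import time


class OAuthTokenNotFound(Exception):
    """Raised when the API returns mwoauthdatastore-access-token-not-found."""


def request_with_retry(api_request, attempts=3, delay=0.5):
    """Retry a just-authorized OAuth request a few times.

    If the first API call lands on a lagged replica before the freshly
    approved access token has replicated, back off briefly and retry
    instead of failing the whole login flow.
    """
    for attempt in range(attempts):
        try:
            return api_request()
        except OAuthTokenNotFound:
            if attempt == attempts - 1:
                raise  # still failing after all attempts: real error
            time.sleep(delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
```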
*sigh*
[18:45:03] hoo: I wouldn't oppose it myself, but...
[18:46:27] I know...
[18:46:39] I wish I had the time to write an alternative and stuff
[18:49:01] eh, what's wrong with AbuseFilter?
[18:50:49] it has its own language
[18:52:16] it is tightly integrated with our editors
[18:52:35] (meaning action=edit, VE and friends, not the users)
[18:53:40] Nemo_bis, not sure I understand what you're trying to say
[20:24:46] Hi! Is this the right chatroom to talk about an API application problem?
[20:26:21] depends; what's the problem?
[20:26:53] I don't know how to upload a PDF file to Wikipedia
[20:27:02] API:Upload
[20:28:01] do you have to use the API? can you use https://commons.wikimedia.org/wiki/Special:UploadWizard ?
[20:28:41] I got the error: code badupload_file, info: "File upload param file is not a file upload; be sure to use multipart/form-data for your POST and include a filename in the Content-Disposition header"
[20:29:02] I don't upload to Commons
[20:29:11] no UploadWizard
[20:29:17] doctaxon: Are you using multipart/form-data as it suggests?
[20:29:39] what is multipart/form-data?
[20:30:07] doctaxon: A way to transmit data to our API. How are you uploading the file if not with that?
[20:30:40] Why not just use Special:Upload?
[20:31:32] I cannot use Special:Upload because my connection is too slow for a 100 MB file
[20:32:59] MarkTraceur: I use it like stated in API:Upload: api.php?action=upload&filename=wta16.pdf&file=$contents&token=$token
[20:33:24] doctaxon: In a GET request?
[20:33:29] yes
[20:33:39] Well that's silly
[20:33:58] doctaxon: What HTTP client are you using?
[20:34:02] no, in a POST request
[20:34:05] sorry
[20:34:44] HTTP client?
[20:34:58] doctaxon: Do you know what HTTP is?
[20:35:00] I am starting it from tools.taxonbot
[20:35:09] Uhhh
[20:35:10] in a shell
[20:35:13] What application do you use to talk to the API?
[20:35:49] I am using the Tcl framework
[20:36:06] hello hoo
[20:36:20] doctaxon: Okay, look up how the framework supports multipart/form-data
[20:37:28] I am not able to look it up; I didn't write the framework
[20:37:40] If you have pywikibot set up, you can use that. It's easy with
[20:37:49] it
[20:38:04] I don't understand any Python
[20:38:06] doctaxon: There's this cool new thing called Google
[20:38:38] doctaxon: no need to know Python
[20:38:46] how would Google help?
[20:39:13] doctaxon: Search for "tcl multipart form data upload"
[20:42:22] hoo: I had pywikibot, but the login data collided with the Tcl framework. I couldn't run both Tcl and pywikibot on tools.taxonbot
[20:44:38] hoo: or I'd have to install pywikibot some alternative way
[20:45:18] pywikibot has upload.py AFAIR
[20:45:34] but I haven't set it up in a while, so I'm no help there
[20:48:33] is there another way to upload a big file over a slow connection?
[20:49:17] There are many, but I'm not aware of other easy ones
[20:49:31] easier than pywikibot, that is
[20:50:56] hoo: not necessarily via a script, but with some tool or special page or whatever else there is?
[20:54:02] hm... Commonist maybe?
[20:54:18] it's not about Commons
[20:54:22] There are a few, but nothing else comes to mind
[20:54:29] the thing needs to go to dewp
[20:54:30] Commonist can handle other wikis too, AFAIR
[20:54:35] ah, I see
[21:13:17] hoo: Commonist is running, hoping for success
[21:13:38] :)
[21:15:33] hoo: it just occurred to me, I didn't fill in anything for the file title, and I didn't find a field for it either
[21:17:25] You can add all of that later
[21:17:45] yeah, that's not so bad
[21:18:38] hoo: it should be exactly 99.91 MB
[21:43:42] hoo: Commonist uploaded it cleanly and put it on user:Doc Taxon/gallery. That seems to be the default. Thanks for your help
[21:44:21] Great, you're welcome
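[Editor's note: the root cause of doctaxon's badupload_file error above is that the file contents were sent as an ordinary URL parameter (`file=$contents`) instead of as a multipart/form-data part. Whatever the client language, the request has to be split as sketched below; the helper merely separates plain form fields from file parts the way API:Upload expects, and the `requests.post` call in the comment is illustrative, not executed.]

```python
def build_upload_request(filename, file_bytes, token):
    """Split a MediaWiki action=upload call into form fields vs. file parts.

    MediaWiki rejects uploads whose file content arrives as a plain
    parameter (badupload_file); the bytes must travel as a
    multipart/form-data part carrying a filename in its
    Content-Disposition header.
    """
    data = {
        "action": "upload",
        "format": "json",
        "filename": filename,
        "token": token,  # CSRF token goes in the form fields
    }
    files = {
        # field name -> (filename for Content-Disposition, raw bytes)
        "file": (filename, file_bytes),
    }
    return data, files

# Illustrative only (not executed here): with the requests library,
#   requests.post("https://de.wikipedia.org/w/api.php", data=data, files=files)
# sends a proper multipart/form-data POST.
data, files = build_upload_request("wta16.pdf", b"%PDF-1.4 ...", "exampletoken+\\")
```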