[01:30:07] brion: Hey
[01:30:14] howdy
[01:30:35] I mentioned over on phab, and I think I'd mentioned to Dispenser.
[01:30:58] There were also a bunch with missing 160P.webm transcodes.
[01:31:12] yeah, the 160p.webm is relatively new too, it'll be included in the batch runs :)
[01:31:29] i just need to add one more throttle control on the maint script and i'll start it up tomorrow
[01:31:30] I just re-ran that search, and it was about 9k files that were still missing that 'entry' after dispenser did those purges.
[01:33:36] should bring that down over the next few days :)
[01:33:41] https://commons.wikimedia.org/wiki/File:Bibiana_spiega_quello_che_facciamo-YouTube.webm <- as an example… unless someone yells, I'm purging those now.
[01:34:19] Revent: go for it
[01:34:19] (not going particularly fast, about 30 a minute or so)
[01:37:05] So I don't have to do any more work. Great!
[01:37:31] brion: It might be worthwhile writing a script to rerun (at a sane rate) 'all' transcodes at a particular resolution if the targets ever change again.
[01:38:02] tho… I guess the pages would have to be re-purged first. (ick)
[01:38:14] Revent: requeueTranscodes.php can do exactly that :D
[01:38:20] Noice.
[01:38:26] that's the one i'm tweaking for this
[01:38:31] and removes erroneous (Vorbis=>WebM) and upscaled (240p => 1080p)
[01:38:37] !cookie brion
[01:38:41] om nom nom
[01:39:08] Dispenser: yep, it should be able to clear those out too. currently running a pass over audio
[01:39:55] I did it for Commons. But IIRC enwiki + others still have them.
[01:39:59] I'm glad that the couple of 'explosions' over the last month or two ended up with this stuff getting fixed.
[01:40:27] yeah, it's been funky too long. just can't let it keep blowing up or it eats people's time and work
[01:40:49] Dispenser: ah good point, i'll schedule a generic cleanup run on the other wikis too
[01:41:59] You guys would, lol, laugh your asses off at the script I'm using to do these, btw.
[01:43:00] I wrote mine to use Chrome's Web Inspector ;-)
[01:43:14] hehehe
[01:43:22] bash and curl :P
[01:43:37] :D
[01:44:05] I just sucked the link into openoffice, and concatted the rest of the commands around each line.
[01:44:09] *the list
[01:44:33] haha brilliant :)
[01:48:22] A bit crazier here: https://commons.wikimedia.org/wiki/User:Dispenser/GIF_check A bash script, pulls numbers out of serialized PHP*, outputs an HTML table
[01:48:24] * Apparently img_duration is stored as a 32-bit float, but written out as a 64-bit double
[01:53:34] Some of these are not getting purged (meh)… bash chokes on some of the filenames I think.
[01:54:31] There's probably some utf-safe way to feed the stuff to curl
[02:07:30] Percent encode, but bash is really bad for strings. Most documentation leaves out the quoting needed for security (Bad: wc -l $1, Good: wc -l "$1")
[02:08:42] Yeah, doing that would mar the 'beauty' of my lame coding method. :P
[02:09:35] I don't think many at all are confusing it.
[02:11:00] I really should take the time to relearn 'real' programming…
[02:12:35] Also super weird corner cases. e.g. wget has an option to percent decode without replacing characters, but it chokes on some Arabic filenames we have.
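For the record, the "utf-safe way to feed the stuff to curl" mentioned above is percent-encoding. A minimal Python sketch of building an `action=purge` request body for the MediaWiki action API (the function name, batching, and the Arabic example filename are illustrative; in practice you would POST the body, e.g. with `curl -d` or `requests.post`):

```python
from urllib.parse import urlencode

def purge_body(titles):
    """Percent-encode an action=purge POST body for the MediaWiki API.

    urlencode() turns UTF-8 titles into %XX sequences, so filenames with
    spaces, quotes, or Arabic script survive where naive shell
    interpolation into a curl command line chokes.
    """
    return urlencode({
        "action": "purge",
        "format": "json",
        # The API accepts multiple titles per request, joined by '|'.
        "titles": "|".join(titles),
    })

body = purge_body(["File:Bibiana_spiega_quello_che_facciamo-YouTube.webm",
                   "File:مثال.webm"])  # hypothetical Arabic filename
# Then, roughly: curl -d "$body" https://commons.wikimedia.org/w/api.php
```

The same idea from pure shell would be `curl --data-urlencode`, which also percent-encodes for you.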
#Useless
[03:32:09] @seen Deskana
[03:32:10] Josve05a: Last time I saw Deskana they were talking in the channel, they are still in the channel #wikimedia-discovery at 2/8/2017 8:08:38 PM (1d7h23m31s ago)
[03:43:50] 0.O
[03:48:12] @seen notaspy
[03:48:12] JustBerry: Last time I saw NotASpy they were quitting the network with reason: Ping timeout: 255 seconds N/A at 2/10/2017 12:51:22 AM (2h56m49s ago)
[07:55:39] [[Tech]]; ArchiverBot; Bot: Archiving 1 thread (older than 30 days) to [[Tech/Archives/2017]].; https://meta.wikimedia.org/w/index.php?diff=16306786&oldid=16291217&rcid=8964367
[14:06:56] [[Tech]]; 109.240.97.168; [none]; https://meta.wikimedia.org/w/index.php?diff=16307762&oldid=16306786&rcid=8966015
[14:09:04] [[Tech]]; AlvaroMolina; Undo revision 16307762 by [[Special:Contributions/109.240.97.168|109.240.97.168]] ([[User talk:109.240.97.168|talk]]) Spam; https://meta.wikimedia.org/w/index.php?diff=16307764&oldid=16307762&rcid=8966018
[14:46:51] https://www.irccloud.com/pastebin/k778yBfW/
[14:58:05] Rodejong: context
[15:02:46] andre_?
[15:06:40] Rodejong: Exactly. You dropped a link without any context. :)
[15:06:57] Question: Where do we report system language alterations? I mean, in Danish it says ''hollandsk'' in the language menu, which should be ''nederlandsk'' (see: [[:da:Nederlandsk (sprog)]])
[15:06:57] https://da.wikipedia.org/wiki/Nederlandsk_(sprog)
[15:07:24] where to see the "language menu"?
[15:07:36] left menu
[15:08:37] You mean what's listed under "Andre sprog" / "In other languages"?
[15:08:42] yep
[15:08:58] you can also see it here:
[15:09:01] https://usercontent.irccloud-cdn.com/file/YJsKUEV5/
[15:09:16] I wonder if that's UniversalLanguageSelector territory or if that also just grabs the names from CLDR or such
[15:09:33] That's wikidata, so you can just change that entry yourself there?
[15:09:39] No
[15:09:44] Try it
[15:10:17] You will see "Nederlands" when you edit it. The system adds "(hollandsk)" behind it
[15:10:33] Ah
[15:10:35] https://www.wikidata.org/wiki/Q7411
[15:11:19] The article we have on Wikipedia calls it "nederlandsk (sprog)" (sprog=language)
[15:11:41] so the system should use the same name
[15:11:44] not the alias
[15:15:48] So I do not know where to report this
[15:18:12] Rodejong: Can you provide a link where to see the problem?
[15:18:25] https://www.wikidata.org/wiki/Q7411
[15:19:02] Rodejong: Where on that page?
[15:19:10] Rodejong: Beforehand you talked about the sidebar.
[15:19:17] There is no sidebar for me on that page.
[15:19:28] At least none that offers "In other Languages".
[15:19:47] "native label"
[15:21:17] Rodejong: "label i originalsproget" on https://www.wikidata.org/wiki/Q7411?uselang=da says "Nederlands (hollandsk)".
[15:21:35] Correct.
[15:21:56] That should show - Nederlands (nederlandsk)
[15:21:57] Rodejong: And I asked where to see the original problem that you want to solve.
[15:22:38] In infoboxes, where we have Dutch names, it shows "name (hollandsk)"
[15:24:08] So if you cannot edit that item, have you asked in #wikidata maybe?
[15:24:16] and I'd *still* like to see the *original* problem.
[15:24:22] In a sidebar, on the left. A link to that problem.
[15:26:11] No, that is not Wikidata
[15:26:26] When you edit the item, it shows only "Nederlands"
[15:26:49] When you save it, the system puts "(hollandsk)" behind it
[15:27:30] Rodejong: Okay. Can you provide a link where to see the original problem? (Last time I ask :)
[15:30:12] Rodejong: Wikidata uses CLDR for language names. Language names are shown when the label is not in your preferred language (i.e. language fallback was applied).
[15:31:05] yeah, but when it shows the fallback language, it shows "hollandsk" instead of "nederlandsk"
[15:31:23] andre__ looking for the article where I saw that
[15:31:54] wb-monolingualtext-language-name
[15:32:30] Rodejong: maybe that's just what's in the CLDR database?
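What the thread is circling around: Wikibase appends the display name of the label's language (taken from CLDR data, in the viewer's UI language) whenever the label is not in the UI language. A toy sketch of that behavior, where the hand-copied dictionary merely stands in for the real CLDR localenames tables and the function is illustrative, not actual Wikibase code:

```python
# Tiny stand-in for CLDR's per-locale language-name tables.
# The real data lives in cldr-localenames-*/main/<locale>/languages.json.
CLDR_LANGUAGE_NAMES = {
    "da": {"nl": "hollandsk"},   # the disputed entry: Danish name for Dutch
    "en": {"nl": "Dutch"},
    "nl": {"nl": "Nederlands"},
}

def render_label(label, label_lang, ui_lang):
    """Mimic the fallback display: append the language's name (in the
    UI language) when the label is not in the UI language."""
    if label_lang == ui_lang:
        return label
    lang_name = CLDR_LANGUAGE_NAMES.get(ui_lang, {}).get(label_lang, label_lang)
    return f"{label} ({lang_name})"

# A Danish-language UI viewing the Dutch-language label "Nederlands":
render_label("Nederlands", "nl", "da")  # → "Nederlands (hollandsk)"
```

So the "(hollandsk)" suffix is not stored on the Wikidata item at all; it comes from the language-name lookup, which is why editing the item never shows it.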
[15:33:24] So where do I report it then?
[15:33:36] Where is CLDR?
[15:34:02] A bug in CLDR? To the Unicode Consortium, I guess. But I'm not convinced it's actually a bug in CLDR...
[15:34:27] https://github.com/unicode-cldr/cldr-localenames-modern/blob/master/main/nl/languages.json says "nl": "Nederlands"
[15:34:38] though that's not the language name... hm...
[15:36:06] Rodejong: The thing is: I'm on https://da.wikipedia.org/wiki/Micah_Richards . I look at "Andre sprog" (Other languages). I see 9 languages listed and click "29 mere". I get a popup. When I type "holland" it displays "Nederlands". So I'm wondering how to reproduce.
[15:36:08] Internally, Wikibase uses Language::fetchLanguageName
[15:36:17] which in turn uses Language::fetchLanguageNamesUncached
[15:36:25] ...which loads language names from various sources.
[15:37:33] Rodejong: I'd say report it on Phabricator, and tag it with Wikidata and MediaWiki-extensions-CLDR
[15:38:22] Okay. I'll try that
[15:38:29] Rodejong: Or going to https://da.wikipedia.org/wiki/Hollandsk I see "Nederlands" listed under "Andre sprog". No "hollandsk" in there. So steps to reproduce and see the underlying problem you'd like to solve are needed :)
[15:46:51] http://www.unicode.org/cldr/charts/latest/summary/da.html
[15:47:16] better, http://www.unicode.org/cldr/charts/latest/summary/da.html#126
[15:57:26] okay, thanks. I'll add that
[15:59:09] andre__ I now see where it went wrong. I was confused about which menu. I was talking about the infobox, but answered yes to the question about the language listing
[15:59:23] Sorry about that
[16:00:51] :)
[16:12:39] https://phabricator.wikimedia.org/T157809
[22:01:12] Confusing: https://www.mediawiki.org/w/index.php?title=Technical_Collaboration_Guidance/Private_planning&diff=next&oldid=2370646
[22:21:04] Anyone familiar with Wikimedia apps? (asking in a few channels, so mind the slight repetition here)
[22:21:15] Anyone familiar with Wikimedia Maps?
(asking in a few channels, so mind the slight repetition here) *(meant Maps, whoops)
[22:24:11] #wikimedia-interactive ?
[22:24:32] JustBerry: The question though is what your followup question would be. Don't ask to ask, just ask.
[22:26:22] JustBerry: somewhat yes.
[22:30:55] hm, gerrit seems mighty slow right now...
[22:30:55] git review is taking a minute to push a change o_O
[22:31:41] whoops
[22:31:43] hm, gerrit seems mighty slow right now...
[22:31:45] git review is taking a minute to push a change o_O
[22:36:48] andre__: abbe98[m] here goes: I have a list of latitude/longitude coordinates. I want to plot them on a world map. I was using basemap (a toolkit in the matplotlib package for Python). I was told that there may be scalability problems on the grid, so it was recommended that I use Wikimedia Maps. Regarding Wikimedia Maps, there is little reference made to an API at https://www.mediawiki.org/wiki/Maps besides "GeoData extension allows articles to specify geographical coordinates, and expose them via search API." The GeoData extension API seems to return articles that are relevant to a specific geographical coordinate, rather than return a world map with dots marking a list of coordinates. I see some related stuff being used here (https://github.com/kartotherian/kartotherian/) with OpenStreetMap. I've been told that client-side JS might be the way to address the aforementioned scalability issue. Any ideas/thoughts?
[22:38:45] JustBerry: hmm, tbh i'm not 100% sure how the two intersect :S The wikimedia maps service will mostly allow you to grab pre-rendered tiles at various zoom levels. The GeoData extension is for attaching latitude/longitude to articles, but it sounds like you have your own coordinates already
[22:39:11] JustBerry: there might be some way to provide all the coordinates to the maps integration stuff that will plop down the coordinates, but i'm not sure where
[22:39:22] ebernhardson: precisely why I thought that mediawiki maps wasn't the best solution here
[22:39:30] ebernhardson: well that's what I'm wondering
[22:39:35] I mean I saw the geodata extension
[22:40:06] but the API associated with that is to, for example, give the coordinates of the golden gate bridge, and the golden gate bridge (and articles of nearby entities) are returned
[22:40:27] JustBerry: if it helps at all, here is a demo i put together a while ago of using the search api's to find coordinates and plotting them on our map tiles: https://people.wikimedia.org/~ebernhardson/mapsearch.html#Arkansas|
[22:40:39] the javascript is all un-minified and pretty straightforward
[22:40:45] ebernhardson: thank you. I will take a look
[22:42:09] JustBerry: your other chance would be to ask MaxSem in #wikimedia-interactive i suppose
[22:42:39] @seen maxsem
[22:42:39] JustBerry: Last time I saw MaxSem they were quitting the network with reason: Remote host closed the connection N/A at 2/10/2017 4:51:36 AM (17h51m3s ago)
[22:42:51] ebernhardson: don't think they're around atm... perhaps later?
[22:42:58] ergh
[22:43:08] oh, usually he's working atm but might be taking a long weekend or some such
[22:43:21] ebernhardson: ah okay
[22:44:54] MaxSem: !
[22:45:01] ebernhardson: max has arrived o/
[22:45:04] better now? :P
[22:45:14] MaxSem: I'll pastebin you the convo above? we were talking about maps
[22:45:20] I was on my backup nick
[22:45:28] ebernhardson said I may want to reach out to you
[22:45:35] MaxSem: you have the chat history?
[22:45:52] yep
[22:46:04] MaxSem: okay
[22:46:05] and you still haven't explained your use case
[22:46:19] what do you need these for?
[22:46:29] MaxSem: okay let me explain top to bottom
[22:46:34] end-user inputs an article
[22:46:45] tool gets list of all users, ips, internal ips
[22:46:50] sorts them as such
[22:46:53] geolocates the ips
[22:46:56] prints all that out
[22:47:08] makes a list of the coords of the ips
[22:47:10] one list for lats
[22:47:13] one list for lons
[22:47:18] plot them on a world map
[22:47:20] output the map
[22:47:22] the end (for now)
[22:47:24] MaxSem: ^^
[22:47:49] output where?
[22:47:51] basically the tool is called spiarticleanalyzer: the end-user inputs an article and the tool gives information that would be relevant to someone doing sock-related investigation
[22:48:15] MaxSem: output on tools.wmflabs.org/spiarticleanalyzer using webservice python (have a flask app.py which webservice reads on startup)
[22:48:25] so, HTML?
[22:48:27] if you already have the tool portion, the search demo i posted might not be too far off. Essentially you just want to load leaflet.js, set a bunch of markers, and fit the map to the bounds of the markers?
[22:48:27] yeah
[22:48:37] once again, you don't need to generate pics
[22:48:42] ebernhardson: sure
[22:48:50] markers on a map is enough
[22:48:51] uuuhhh... gerrit... slooooowwwww :(
[22:48:58] MaxSem: okay yes...
[22:49:01] that works too
[22:49:12] DanielK_WMDE: cuz andrew's trying to upload a patch too lol
[22:49:28] it feels like 1000 people are trying...
[22:49:39] and hashar left :/
[22:49:51] DanielK_WMDE: /msg andrewbogott ;p
[22:50:01] been having similar issues ^^
[22:50:10] but okay MaxSem, so I'm plotting on openstreetmap or something then?
[22:50:21] that's what wikimedia maps seems to be based off of, right ebernhardson?
[22:50:40] ыоу цан усе оур тилес
[22:50:44] well the python code essentially (with the list of users) outputs html
[22:50:46] you can use our tiles
[22:50:55] it basically outputs a string of html
[22:50:58] so just plop in js
[22:51:01] ig
[22:51:19]
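The plan ebernhardson sketches above (Flask tool outputs an HTML string; load leaflet.js, set a bunch of markers, fit the map to their bounds) could look roughly like this in Python. The function name is illustrative, not the real spiarticleanalyzer code, and the CDN and tile URLs are assumptions; Wikimedia's tile service has its own usage policy:

```python
import json

# Wikimedia's public tile endpoint (assumed; check the maps usage policy).
TILE_URL = "https://maps.wikimedia.org/osm-intl/{z}/{x}/{y}.png"

def coords_to_map_html(coords):
    """Render a self-contained Leaflet page: one marker per (lat, lon)
    pair, then fit the viewport to the bounding box of all markers."""
    pts = json.dumps([[lat, lon] for lat, lon in coords])
    return f"""<!DOCTYPE html>
<html><head>
<link rel="stylesheet" href="https://unpkg.com/leaflet/dist/leaflet.css"/>
<script src="https://unpkg.com/leaflet/dist/leaflet.js"></script>
<style>#map {{ height: 100vh; }}</style>
</head><body>
<div id="map"></div>
<script>
  var map = L.map('map');
  L.tileLayer({json.dumps(TILE_URL)},
              {{attribution: 'Wikimedia maps | OpenStreetMap'}}).addTo(map);
  var pts = {pts};
  pts.forEach(function (p) {{ L.marker(p).addTo(map); }});
  map.fitBounds(L.latLngBounds(pts));  // zoom to cover all markers
</script>
</body></html>"""
```

A Flask view would simply return this string, which matches the "tool outputs a string of html, so just plop in js" exchange at the end.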