[00:36:55] Dispenser: no mass-group-article visualization like https://xtools.wmflabs.org/articleinfo/en.wikipedia.org/Apple_Inc. ?
[10:37:36] Hi folks -- wondering if someone can give me a pointer. I'm trying to extract the raw page data via the API. So far so good, except when using 'prop': 'extracts' with my query, it removes raw links (e.g. [[Media:Somefile.csv | Some File]] only appears as "Some File" in the response).
[10:45:37] zabtard_, isn't extracts supposed to remove such stuff?
[10:45:55] extracts is not raw page data
[10:47:18] hmm, I thought it was supposed to return the raw data.
[10:48:43] do you know what API function/parameters would allow me to do that?
[10:49:29] zabtard_, you just want the source of the page?
[10:50:23] Yeah -- I think I just found out how to do it
[10:50:39] well, maybe not
[10:50:54] https://www.mediawiki.org/wiki/Extension:TextExtracts#API the API docs say this is plain-text/limited
[10:51:42] you can get the source this way: https://www.mediawiki.org/w/api.php?action=query&titles=MediaWiki&prop=revisions&rvlimit=1&rvprop=content
[10:52:19] Yeah -- I saw that, and choosing raw or wiki formatting still removed the wiki macros' raw text, but not the wiki formatting
[10:52:39] re: the extract function
[10:52:43] digging into your link now
[10:54:30] I'll give it a quick try. I had just moved to a query action, and it's spitting out the raw HTML now, which isn't ideal.
[10:54:39] [[Tech]]; Tacsipacsi; /* Ability to upload file(wiki azb) */ why is it needed?; https://meta.wikimedia.org/w/index.php?diff=18956316&oldid=18952606&rcid=13340345
[10:55:49] so are you looking for the raw wikitext or the HTML that it translates to?
[10:57:13] HTML
[10:57:55] extract gives me the wiki formatting (pipes, etc.) for tables, but macros get trashed to their visible name.
[10:58:09] I actually am trying the query function again. I was using parse earlier, which was giving me the raw HTML
[10:58:51] sweet!
[10:58:58] Thanks -- that finally did it!
[10:59:15] || [[Media:test.csv|Test CSV]] now appears as I would expect
[12:17:43] Do we have a phab task for the zombie sitenotices? There is one on eswiki today
[12:19:21] About Heartbleed, asking everyone to change their passwords
[12:21:53] chicocvenancio: https://phabricator.wikimedia.org/T218826?
[12:22:04] (which is a duplicate of https://phabricator.wikimedia.org/T218918)
[12:32:35] Thanks Lucas_WMDE
[12:34:26] BTW, I asked admins there to make null edits of MediaWiki:Sitenotice, but I still see the Heartbleed banner
[12:35:06] if I understand daniel's fix correctly, it would use any random revision of the message page
[12:35:13] so even a non-null edit wouldn't guarantee anything :/
[12:35:21] will hopefully be fixed software-side soon
[12:37:37] thanks Lucas_WMDE, yesterday it affected ptwiki and a null edit did the trick
[12:43:01] hm, okay
[14:55:06] the sitenotice cache problem is one of those evergreen bugs which keeps returning :)
[15:20:03] "there are two hard problems in computer science"
[15:25:10] * apergos would like to rename these to evergroan bugs
[15:25:27] evergreen is a nice thing: trees, always green and doing nice things to the air... unlike these bugs
[15:43:41] [[Tech]]; E THP; /* Ability to upload file(wiki azb) */; https://meta.wikimedia.org/w/index.php?diff=18956841&oldid=18956316&rcid=13341394
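A minimal sketch of the two approaches discussed above at 10:51-10:58: prop=revisions for the raw wikitext (links like [[Media:test.csv|Test CSV]] stay intact) and action=parse for the rendered HTML. The endpoint and page title are the examples from the log; the helper names and the use of the requests library are assumptions, and error handling is omitted.

```python
import requests

API = "https://www.mediawiki.org/w/api.php"  # any MediaWiki API endpoint

def get_wikitext(title):
    """Raw wikitext of the latest revision of a page (prop=revisions)."""
    data = requests.get(API, params={
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "rvlimit": 1,
        "format": "json",
        "formatversion": 2,
    }).json()
    return data["query"]["pages"][0]["revisions"][0]["slots"]["main"]["content"]

def get_html(title):
    """Rendered HTML of a page (action=parse), which the parse action returns."""
    data = requests.get(API, params={
        "action": "parse",
        "page": title,
        "prop": "text",
        "format": "json",
        "formatversion": 2,
    }).json()
    return data["parse"]["text"]

# e.g. print(get_wikitext("MediaWiki")[:200])
```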
[17:58:31] hey guys! something has changed on the DBs? two tools that I use aren't working, with GlobalContribs returning an SQL error: Warning: parse_ini_file(/replica.my.cnf): failed to open stream: No such file or directory in /data/project/guc/labs-tools-guc/src/Settings.php on line 35 Error: MySQL login data not found at
[17:58:40] and ipcheck doesn't load the wikis to login
[18:02:56] Tks4Fish: is that about toolforge?
[18:03:10] Tks4Fish: then #wikimedia-cloud
[18:03:17] ah, okay
[18:03:20] thanks :)
[19:24:37] [[Tech]]; Tacsipacsi; /* Ability to upload file(wiki azb) */ non-free files are unwelcome; https://meta.wikimedia.org/w/index.php?diff=18957514&oldid=18956841&rcid=13342380
[20:37:11] is there a standard IRC bot setup? an eggdrop plugin for the Wikipedia API perhaps? or something server-side (on my server or Wikimedia's) to convert Wikipedia reports and events into something a bot can subscribe to, like an RSS feed?
[20:37:41] i want an IRC bot's announcements for new articles created within a wikiproject or category, or edits to a watchlist
[20:49:30] dtm: https://www.mediawiki.org/wiki/EventStreams
[20:55:24] andre__: thank you
[20:56:27] andre__: okay, so for a non-programmer, how about the rest of my inquiry? is there an existing IRC bot setup or server data export for a bot?
[21:01:19] https://meta.wikimedia.org/wiki/IRC/Bots maybe?
[21:36:56] dtm: perhaps wm-bot is what you want?
[21:37:14] dtm: https://meta.wikimedia.org/wiki/Wm-bot
[21:44:26] i apologize for my lack of google-fu
[21:45:28] andre__, Tks4Fish: thanks
[21:46:56] np :)
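For dtm's question, a minimal sketch of subscribing to the EventStreams recent-change feed that andre__ linked and filtering it to new-page creations on one wiki, which is roughly the kind of feed an IRC bot would relay. It assumes the third-party sseclient package, and the enwiki/new-page filter is just an illustration.

```python
import json
from sseclient import SSEClient as EventSource  # pip install sseclient

URL = "https://stream.wikimedia.org/v2/stream/recentchange"

# Follow the recent-change stream and print new page creations on en.wikipedia;
# a bot would post these lines to an IRC channel instead of printing them.
for event in EventSource(URL):
    if event.event != "message" or not event.data:
        continue
    try:
        change = json.loads(event.data)
    except ValueError:
        continue
    if change.get("wiki") == "enwiki" and change.get("type") == "new":
        print("{user} created {title}".format(**change))
```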