[04:02:06] is parsoid something i can access on WP, or what? i'm trying to parse wikitext in javascript
[04:09:29] what was the rationale behind not using the built-in datetime/date field types, and instead using fixed-width binary fields for dates?
[04:10:16] just curious
[05:20:27] Hi damo22.
[05:21:25] https://lists.wikimedia.org/pipermail/wikitech-l/2016-May/085550.html sort of addresses this.
[05:25:26] thanks Leah
[05:32:30] Many of the oddities in MediaWiki's database schema are related to MySQL oddities.
[05:32:59] For example, we store bytestrings instead of Unicode strings due to previous limitations in MySQL's support of higher-plane Unicode.
[05:33:02] As I understand it.
[05:38:59] that makes sense
[05:40:08] i am just surprised that wikipedia uses a relational database at all, considering how big it is... i would have thought it would use a flat-file approach with some kind of aggregator that selects the bits that are requested
[05:42:25] you can't exactly do a mysqldump and re-insert that easily if something goes wrong
[05:44:57] i'm guessing there is a reason why most of the dumps are provided in XML; maybe there is a plan to use raw XML in the future?
[07:48:48] [[Tech]]; 23.91.70.7; TECHNO TAKES YOUR BREATH AWAY; https://meta.wikimedia.org/w/index.php?diff=15648070&oldid=15644066&rcid=7833053
[07:49:39] [[Tech]]; Ruslik0; Reverted changes by [[Special:Contributions/23.91.70.7|23.91.70.7]] ([[User talk:23.91.70.7|talk]]) to last version by 202.79.203.111; https://meta.wikimedia.org/w/index.php?diff=15648071&oldid=15648070&rcid=7833054
[07:49:49] [[Tech]]; 23.91.70.7; /* techno banging on tiles, clattering on pots */ new section; https://meta.wikimedia.org/w/index.php?diff=15648072&oldid=15648071&rcid=7833055
[07:51:54] [[Tech]]; 23.91.70.7; fvck off - go fvck yourself; https://meta.wikimedia.org/w/index.php?diff=15648080&oldid=15648072&rcid=7833063
[07:53:55] [[Tech]]; Ajraddatz; Reverted changes by [[Special:Contributions/23.91.70.7|23.91.70.7]] ([[User talk:23.91.70.7|talk]]) to last version by Ruslik0; https://meta.wikimedia.org/w/index.php?diff=15648083&oldid=15648080&rcid=7833068
[18:08:51] anomie: are https API queries logged too?
[18:09:46] Betacommand: How do you mean?
[18:11:00] anomie: for UA checks
[18:13:11] Betacommand: They won't trigger the https-expected "feature", obviously, but any other "feature" will get logged no matter the access method.
[18:13:44] I set a new UA for my tools and am getting zero results from the API special page for the new UA
[18:15:24] Betacommand: Are you sure you're triggering anything that gets logged in that?
[18:15:50] anomie: what all gets logged?
[18:16:46] Betacommand: Anything that's marked deprecated, for one. Or if you look at the code for the API modules, look for calls to logFeatureUsage().
[18:17:04] Ah ok, then that's why.
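On the 04:02 Parsoid question: Parsoid runs as a service on the Wikimedia cluster, and its output is reachable through the public REST API, so wikitext can be rendered from JavaScript without parsing it yourself. A minimal sketch, assuming the /transform/wikitext/to/html endpoint the REST API exposes (Node 18+ global fetch, or a browser):

    // Sketch: ask the Wikimedia REST API (backed by Parsoid) to turn
    // wikitext into HTML. The endpoint path is the documented
    // /transform/wikitext/to/html route; see https://en.wikipedia.org/api/rest_v1/
    const endpoint = 'https://en.wikipedia.org/api/rest_v1/transform/wikitext/to/html';

    async function wikitextToHtml(wikitext) {
      const res = await fetch(endpoint, {
        method: 'POST',
        // URLSearchParams sends application/x-www-form-urlencoded for us
        body: new URLSearchParams({ wikitext }),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.text(); // Parsoid HTML, with data-mw/RDFa annotations
    }

    wikitextToHtml("'''Hello''' [[world]]").then(html => console.log(html));

At the time of this log, Parsoid was itself a Node.js application, so it could also be installed and run locally rather than called over HTTP.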
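On the 04:09 question about the date columns: the fixed-width binary fields hold timestamps in MediaWiki's own format, a 14-character YYYYMMDDHHMMSS string in UTC (e.g. rev_timestamp BINARY(14)), which sorts correctly as a plain bytestring. A sketch of decoding one:

    // MediaWiki stores timestamps as fixed-width 14-character UTC strings,
    // YYYYMMDDHHMMSS, rather than native DATETIME columns.
    function parseMwTimestamp(ts) {
      const m = /^(\d{4})(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})$/.exec(ts);
      if (!m) throw new Error(`not a MediaWiki timestamp: ${ts}`);
      const [, y, mo, d, h, mi, s] = m;
      return new Date(Date.UTC(+y, +mo - 1, +d, +h, +mi, +s));
    }

    console.log(parseMwTimestamp('20160506051225').toISOString());
    // -> 2016-05-06T05:12:25.000Z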
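As an example of what logFeatureUsage() catches: action=tokens has been deprecated since MediaWiki 1.24 in favour of action=query&meta=tokens, so a call like the sketch below both returns a deprecation warning and gets the calling user agent an entry behind Special:ApiFeatureUsage. The Api-User-Agent value and the exact warning text shown are illustrative:

    // Sketch: hit a deprecated API module and inspect the warning the
    // server attaches; this is the kind of call logFeatureUsage() records.
    const api = 'https://en.wikipedia.org/w/api.php';

    async function callDeprecated() {
      const res = await fetch(`${api}?action=tokens&type=csrf&format=json`, {
        // Api-User-Agent is the convention for identifying browser-based
        // clients; 'example-tool/0.1' is a made-up UA for illustration.
        headers: { 'Api-User-Agent': 'example-tool/0.1' },
      });
      const data = await res.json();
      // e.g. { tokens: { '*': 'action=tokens has been deprecated. ...' } }
      console.log(data.warnings);
    }

    callDeprecated();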
[18:38:08] Hey folks - I’d like to run a query to identify the total number of characters in the articles translated by the ContentTranslation tool. Is there a way to achieve that without importing a SQL dump locally? i.e. a read-only SQL tool somewhere?
[18:39:17] jmadler: sure, the Analytics folks run some read-only database copies
[18:39:50] where can I find more info on that?
[18:40:16] is there a page on wikitech?
[18:40:17] i'm trying to find the documentation page. it must be somewhere ;)
[18:40:26] awesome :)
[18:41:05] jmadler: here: https://wikitech.wikimedia.org/wiki/Analytics/Data_access
[18:41:25] jmadler: and to be more precise: https://wikitech.wikimedia.org/wiki/Analytics/Data_access#Analytics_slaves
[18:42:59] ok great - awesome. Any tips for running queries across all projects’ DBs?
[18:44:23] other than repeating the query 700 times with UNION, which is what I did the one time I needed to do that, not really. perhaps the Analytics people know a saner way
[18:45:06] ok, thanks. appreciate it :)
[20:00:36] https://www.mediawiki.org/wiki/Talk:Technical_Collaboration_Guideline/Vision
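For the ContentTranslation question at 18:38, one possible approach on the read-only replicas, assuming that CX-made revisions carry the contenttranslation change tag and accepting rev_len (which counts bytes, not strictly characters) as a proxy. The hostname and credentials are placeholders, and the query uses the change_tag schema of the time, with the tag name in ct_tag. A sketch with the mysql npm package:

    // Sketch: total byte length of revisions tagged `contenttranslation`
    // on one wiki's replica. Host/credentials are placeholders; take the
    // real ones from the Data access page linked above.
    const mysql = require('mysql');

    const conn = mysql.createConnection({
      host: 'enwiki.labsdb',   // placeholder replica host
      user: 'u1234',           // placeholder credentials
      password: 'secret',
      database: 'enwiki_p',
    });

    const sql = `
      SELECT SUM(rev_len) AS total_bytes, COUNT(*) AS revisions
      FROM revision
      JOIN change_tag ON ct_rev_id = rev_id
      WHERE ct_tag = 'contenttranslation'`;

    conn.query(sql, (err, rows) => {
      if (err) throw err;
      console.log(rows[0]); // { total_bytes: ..., revisions: ... }
      conn.end();
    });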
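And for the 18:42 cross-project question, an alternative to a 700-branch UNION is looping over the per-wiki databases client-side and aggregating in the client. The wiki list and hostname scheme below are placeholders (in practice the list comes from files like all.dblist):

    // Sketch: run the same aggregate on every project database and sum
    // the results client-side instead of building one giant UNION.
    const mysql = require('mysql');
    const util = require('util');

    const wikis = ['enwiki', 'dewiki', 'frwiki']; // ...~700 more in practice

    async function totalAcrossWikis(sql) {
      let total = 0;
      for (const wiki of wikis) {
        const conn = mysql.createConnection({
          host: `${wiki}.labsdb`, // placeholder host scheme
          user: 'u1234',
          password: 'secret',
          database: `${wiki}_p`,
        });
        const query = util.promisify(conn.query).bind(conn);
        const rows = await query(sql);
        total += Number(rows[0].total_bytes) || 0;
        conn.end();
      }
      return total;
    }

    totalAcrossWikis(
      `SELECT SUM(rev_len) AS total_bytes
       FROM revision JOIN change_tag ON ct_rev_id = rev_id
       WHERE ct_tag = 'contenttranslation'`
    ).then(t => console.log('grand total bytes:', t));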