[06:23:20] andre__: do the (un)publish buttons on http://www.google-melange.com/gci/dashboard/google/gci2014#all_org_tasks work for you? I used them once but now they seem partly covered by the headers and do nothing
[09:25:07] err, phab's notifs are coming from no-reply@phabricator.wikimedia.org
[09:25:22] but at the end of the message, it says to reply
[09:25:31] shouldn't that no-reply be changed?
[09:34:38] The Reply-To: header points you to a unique specific address
[10:02:29] oh
[10:05:04] Glaisher: It doesn't mean that the no-reply address isn't confusing, though!
[10:05:35] yeah https://phabricator.wikimedia.org/T85127
[12:50:24] Nemo_bis: I've never tried them, I only had to delete two tasks so far that got fixed in the meantime :-/
[15:05:25] Hi folks,
[15:06:05] I tried to create tables for huwiki in my XAMPP install according to a maybe out-of-date description.
[15:06:11] My command is:
[15:06:13] java -client -classpath mwdumper.jar;mysql-connector-java-5.1.22\mysql-connector-java-5.1.22-bin.jar;commons-compress-1.4.1.jar org.mediawiki.dumper.Dumper "--output=mysql://127.0.0.1/huwiki?user=root&password=****" --format=sql:1.5 huwiki-latest-pages-meta-current.xml.bz2
[15:06:59] Error message:
[15:07:00] Exception in thread "main" java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'huwiki.text' doesn't exist at org.mediawiki.importer.XmlDumpReader.readDump(XmlDumpReader.java:92) at org.mediawiki.dumper.Dumper.main(Dumper.java:142) Caused by: org.xml.sax.SAXException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'huwiki.text' doesn't exist at org.m
[15:07:13] Where did I go wrong?
[15:09:59] binaris: the "text" table has been replaced by "page" and "revision"
[15:10:06] http://www.mediawiki.org/wiki/Manual:Text_table
[15:10:24] (since your error message complains about "text" not existing)
[15:11:10] mutante: No, it's still there... I think you mean the cur table
[15:11:26] I just copied the command from the manual. I didn't specify any table names myself.
[15:11:35] "Table 'huwiki.text' doesn't exist"
[15:11:37] Maybe the format?
[15:11:59] mutante: It does on the cluster
[15:12:03] "All fields in the text table except old_id, old_text and old_flags are not needed anymore and can be safely deleted."
[15:13:20] Is there another way to create the database structure?
[15:14:36] mutante: But those three fields are still there and used
[15:14:53] in fact we still have all fields in most production DBs as no one cared enough to drop them
[15:14:58] XAMPP? I'd drop the "maybe" from "out of date"
[15:15:27] Hmm, I wanted to use meta=filerepoinfo to get the file repositories of a site: https://en.wikipedia.org/w/api.php?action=help&modules=query%2Bfilerepoinfo
[15:15:35] ugh, mwdumper, even worse then
[15:15:36] Is that important? It has a proper MySQL installation.
[15:15:43] But that help text is itself inconsistent
[15:15:56] and especially that apiurl is missing now is quite a bummer
[15:15:59] binaris: Did you actually install a MediaWiki with the db name huwiki?
[15:16:26] xzise: https://en.wikipedia.org/w/api.php?action=query&meta=filerepoinfo&friprop=name|displayname
[15:16:32] No, just the database itself.
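[Editor's note: as the error above shows, mwdumper expects MediaWiki's tables (including text) to already exist. A minimal sketch of one way to create them without running the installer, assuming a MediaWiki tarball is at hand; the version in the path and the database name are illustrative:]

    # Create the target database, then load the stock schema that ships
    # with MediaWiki tarballs as maintenance/tables.sql.
    mysql -u root -p -e "CREATE DATABASE huwiki DEFAULT CHARACTER SET binary;"
    mysql -u root -p huwiki < mediawiki-1.24.0/maintenance/tables.sql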
[15:16:48] mwdumper doesn't know the correct database structure
[15:16:58] http://wikitravel.org/wiki/en/api.php?action=query&meta=filerepoinfo&friprop=name|displayname|rootUrl|local|apiurl|articlepath|server&format=jsonfm for example
[15:17:07] Unless you're using a ~10 years old MediaWiki, that is
[15:17:19] hoo, well if I want to write a script to automate it, names and displaynames aren't helpful
[15:17:41] xzise: ... so? Just ignore them
[15:17:53] ?
[15:18:05] xzise: If you don't need them, just ignore them
[15:18:21] Well, then I have nothing left
[15:18:35] So is mwdumper unusable?
[15:18:41] xzise: What information are you looking for, then?
[15:19:40] binaris: mostly; it sometimes manages to import some stuff from which you can build the rest
[15:20:17] hoo, a URL to the articles themselves, like in the interwikimap for example (because there is already code to parse that URL)
[15:20:47] xzise: which articles?
[15:20:53] There is 'descBaseUrl' which looks like it, but the documentation itself doesn't say what it is
[15:21:04] of that shared repository
[15:21:19] the iwm returns something like //en.wikipedia.org/wiki/$1
[15:21:22] that's server + articlepath
[15:21:32] and then replace the $1 in there with your article name
[15:21:39] I know
[15:21:58] That is what I can parse, but that is from the siprop interwikimap
[15:22:03] Hm, it seems this wasn't fully updated with the WMUK migration experience: https://meta.wikimedia.org/wiki/Data_dumps/mwdumper
[15:22:04] mwdumper seemed to work for conversion, I have got huwiki-latest-pages-meta-current.sql, but I couldn't test it. Will that fail, too?
[15:22:51] Now I need to parse the filerepoinfo, and well, apart from some substantial changes in the API itself
[15:25:51] Is there anywhere a tables.sql for the current MW version that I can simply download? Or shall I install MW and let it create the database?
[15:26:25] I didn't want to create a mirror, just make queries.
[15:27:08] maybe you could use Wikimedia Labs and an existing database
[15:27:15] without having to do it yourself
[15:27:51] That's another approach. Not bad, but I should be able to create a db structure locally.
[15:28:42] That's extremely unlikely to be a viable approach
[15:28:50] What sort of queries are you interested in?
[15:29:01] then it might be easiest if you run the installer and let it create it, then mysqldump and just use the very first part of the dump file where it creates the tables
[15:29:33] what's "extremely unlikely" about having labsdb?
[15:29:55] i thought they were just recently sanitized
[15:30:37] Nemo_bis: no concrete plan yet, we have plenty of tasks to be supported by db queries.
[15:30:49] binaris: who's "we"?
[15:30:56] Huwiki
[15:31:56] Well, the question is not about queries, but about installing the db structure to be able to import the dump.
[15:32:35] The best docs available are https://meta.wikimedia.org/wiki/Data_dumps/mwdumper
[15:32:55] But many people tried this in the last decade, and most of them failed
[15:34:50] I've never heard of someone willingly pursuing recurring mwdumper pain
[15:35:20] That can't be the proper way, nor is it acceptable. This is just a database, not the Holy Grail.
[15:35:47] Nemo_bis: I did
[15:36:05] akosiaris: great, then I'll mention it on the meta-wiki page
[15:36:05] well, the "willingly" part may not be absolutely true
[15:36:26] binaris: that = ?
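[Editor's note: a hedged sketch of the URL construction discussed above, restricted to the friprop values that appear in the log; the repository values and page title in the comments are illustrative:]

    # Ask for just the machine-usable repository fields ...
    curl 'https://en.wikipedia.org/w/api.php?action=query&meta=filerepoinfo&friprop=name|server|articlepath&format=json'
    # ... then build an article URL the same way as for the interwikimap:
    # given server="//commons.wikimedia.org" and articlepath="/wiki/$1",
    # substituting "File:Example.jpg" for $1 yields
    # //commons.wikimedia.org/wiki/File:Example.jpg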
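[Editor's note: a minimal sketch of the installer-then-mysqldump route suggested at 15:29; the database names are illustrative:]

    # --no-data dumps only the CREATE TABLE statements, i.e. the
    # "very first part" of a full dump, with no rows.
    mysqldump --no-data -u root -p wiki_from_installer > tables.sql
    mysql -u root -p huwiki < tables.sql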
[15:36:39] http://wikipapers.referata.com/wiki/List_of_tools has some options, fwiw
[15:36:58] That = constant failure
[15:38:08] That's a result, not a way
[15:39:15] Is it too much to ask for a downloadable tables.sql for each MW version, full of CREATE commands? I mean, it's not a big deal to update it occasionally when the structure changes.
[15:39:25] And I encourage you to think of your aims first. Because mwdumper will not create most of MediaWiki's tables
[15:39:57] (create/populate)
[15:43:37] nsinvert on Special:NewPages seems to be failing
[15:43:58] https://meta.wikimedia.org/wiki/Special:NewPages?namespace=1198&nsinvert=1&tagfilter=&username=
[15:51:47] https://phabricator.wikimedia.org/T85145
[16:25:21] I installed MediaWiki (though I didn't want to) and now I have a database.
[16:25:33] I began the import and got another error: ERROR 1062 (23000) at line 55: Duplicate entry '1' for key 'PRIMARY'
[16:27:39] binaris, on some Wikimedia server? or is that MediaWiki on some private website?
[16:27:43] if the latter: #mediawiki
[16:28:02] (plus better steps - which version of MW, how you "began import", etc)
[16:28:02] My own computer.
[16:28:13] should be asked in #mediawiki then, with more info
[16:28:18] this channel is for Wikimedia wikis
[16:28:26] he is from hu.wikimedia
[16:28:45] You missed the previous talk. :)
[16:28:54] heh. oh yeah I did. Sorry
[16:29:14] Sorry, I'll shut up... :P
[16:29:14] I installed the latest MW, and want to import huwiki from the dump.
[17:01:03] binaris: did you drop the tables as the docs say?
[17:01:35] Oh no!
[17:05:37] Very nice that the new MW forced me to give root a password, and now my own phpMyAdmin throws me out for inactivity, which was not the case until an hour ago! Grrgghh!
[17:11:11] Nemo_bis: thanks, that must have been the issue. Now it is running and I hope it will complete.
[17:59:08] Partial success: loading content stopped after 43 minutes, I don't know at what percentage. ERROR 1153 (08S01) at line 1118: Got a packet bigger than 'max_allowed_packet' bytes
[18:09:15] Does anyone know whether setting "max_allowed_packet" in my.ini takes effect without restarting MySQL?
[18:11:09] And that was also covered in the docs. :P
[18:11:22] That's what happens when one doesn't read the links Nemo_bis gives. :D
[18:22:33] [[Tech]]; Ganímedes; /* Several problems in es:WP */ new section; https://meta.wikimedia.org/w/index.php?diff=10811003&oldid=10790550&rcid=5801566
[18:31:35] Nemo_bis: I haven't really read through the whole thing, I thought the whole process would be much simpler. Google found me an alternative to restarting.
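[Editor's note: a hedged sketch of the two fixes discussed above. Emptying the installer-created content tables resolves the duplicate-key error, and SET GLOBAL raises the packet limit at runtime without a server restart; the database name and size are illustrative:]

    # Empty the tables the installer pre-populated (the Main Page it
    # created collides with the dump's primary keys).
    mysql -u root -p huwiki -e "TRUNCATE page; TRUNCATE revision; TRUNCATE text;"
    # Raise the packet limit dynamically; only connections opened after
    # this statement see the new value, so reconnect before importing.
    mysql -u root -p -e "SET GLOBAL max_allowed_packet = 128*1024*1024;"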
[18:33:38] Chi non ha testa abbia gambe ("those who have no head must have legs"), as we say ;)
[18:35:50] [[Tech]]; AKlapper (WMF); /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10811065&oldid=10811003&rcid=5801600
[18:36:17] [[Tech]]; AKlapper (WMF); /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10811071&oldid=10811065&rcid=5801604
[20:50:36] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812435&oldid=10811071&rcid=5802620
[21:12:44] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812541&oldid=10812435&rcid=5802665
[21:18:42] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812594&oldid=10812541&rcid=5802696
[21:21:55] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812608&oldid=10812594&rcid=5802700
[21:24:05] might be useful if it were in English...
[21:33:55] [[Tech]]; Krenair; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812661&oldid=10812608&rcid=5802729
[21:41:18] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812690&oldid=10812661&rcid=5802739
[21:43:23] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812698&oldid=10812690&rcid=5802740
[21:43:51] [[Tech]]; AKlapper (WMF); /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812702&oldid=10812698&rcid=5802741
[21:52:10] [[Tech]]; Krenair; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10812723&oldid=10812702&rcid=5802749
[23:20:53] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10813066&oldid=10812723&rcid=5802889
[23:22:03] [[Tech]]; AKlapper (WMF); /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10813070&oldid=10813066&rcid=5802892
[23:24:27] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10813080&oldid=10813070&rcid=5802893
[23:26:49] [[Tech]]; AKlapper (WMF); /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10813085&oldid=10813080&rcid=5802896
[23:46:46] [[Tech]]; Ganímedes; /* Several problems in es:WP */; https://meta.wikimedia.org/w/index.php?diff=10813139&oldid=10813085&rcid=5802914