[00:00:45] 400 to go
[00:01:07] 300
[00:01:42] 200
[00:01:59] 150
[00:02:11] 100
[00:02:56] * hoo missed it by 3
[00:03:31] who won?
[00:03:41] https://www.wikidata.org/w/index.php?oldid=100000000
[00:03:51] a merge change
[00:03:58] oooh
[00:04:01] by a human :D
[00:04:03] will have to delete
[00:04:16] At least that :
[00:04:23] yay Sjoerddebruin
[00:04:31] congrats sjoerddebruin
[00:04:35] :O
[00:04:47] :D
[00:05:56] https://www.wikidata.org/w/index.php?diff=100000000
[00:06:12] congratulations! :)
[00:06:39] Thanks. :)
[00:06:43] Lydia_WMDE: Missing out on all the number fun?
[00:07:55] 100 M edits, 27 M statements, 9 M subjects, 43 M labels, 33 M descriptions. We're doing good for a first year.
[00:08:33] dennyvrandecic1: 13.8 (or so) million items
[00:08:35] :)
[00:09:12] yeah, I started using the counting of items less
[00:09:30] subjects (i.e. items with at least one statement) is a bit more interesting
[00:09:35] dennyvrandecic: I deleted more than 59k this week :P
[00:09:52] thank you :)
[00:11:46] I wonder how long we need for 1 × 10^9 ... might be a fun long-term bet thing :P
[00:12:17] hmm, good question
[00:12:41] my assumption is 2017
[00:12:46] early 2017
[00:12:54] If we keep the edit rate it would be 126 months, but I doubt that
[00:13:23] mh
[00:13:51] awww, i got logged out
[00:14:06] why?
[00:14:21] why what? :P
[00:14:27] why did i get logged out?
[00:14:28] Too many things to ask about
[00:14:30] * aude didn't do it
[00:14:44] reached the timeout for a "remember me" session all of a sudden?
[00:14:52] doubt it
[00:14:56] not likely but who knows
[00:14:57] loss of session data
[00:15:28] my bot script logs users out in its destructor, so whenever I use my real account in it for some reason it logs me out over and over :P
[00:18:02] I say we reach the 1 billion in mid 2019
[00:18:23] anyone else got a bet? Come on :P
[00:19:51] early 2017
[00:20:11] make a wikipage for collecting this pool
[00:20:35] dennyvrandecic1: Exactly my idea... but two bets are a bit weak :P
[00:20:39] * hoo eyes aude
[00:22:17] * aude thinks 2017 or sooner :)
[00:22:35] aude: got that :)
[00:22:43] Lydia_WMDE: maybe? :P
[00:23:03] or Eloquence
[00:24:40] * aude has to go to the airport in a few hours... should sleep :)
[00:25:03] aude: heh :) You can do that in-flight... I hope
[00:25:15] doubt it
[00:25:57] see folks tomorrow or so
[00:26:16] I doubt it :D Cu, have a pleasant trip
[00:26:35] but we'll mit in SF
[00:26:47] yep
[00:26:52] * meet
[00:26:57] damn German
[00:30:30] * hoo accidentally opened Firebug on Facebook... :/
[00:32:02] dennyvrandecic1: https://www.wikidata.org/wiki/User:Hoo_man/10%5E9 ;)
[00:32:26] If anyone also has a bet about when we will reach the #1,000,000,000 feel free to amend :P
[01:11:02] 163.148742675781GB
[01:11:12] 99GB to catch up with dewiki
[01:11:19] hoo: > today
[01:11:53] Reedy: heh... I guess that's something we can do much faster than edit 10^9 :P
[01:12:21] Noting that doesn't include ES
[01:12:31] enwiki 786.582473754883 GB
[01:12:33] :D
[01:12:40] Sure, the json is *big*
[01:13:28] SUM( page_len )
[01:13:29] 18255901590
[01:16:05] that's some big json objects
[01:16:18] ttkay: That also included non ns-0 pages
[01:16:26] https://www.wikidata.org/w/index.php?title=Q2572407&action=history luckily nobody yet dared to delete :P
[01:16:49] * ttkay always liked to break up large datasets into a newline-delimited stream of individual json records
[01:17:50] ttkay: http://tools.wmflabs.org/hoo/dbq/dbq-207.gz
[01:18:11] That must be great for you then... all commons files in the first column and their metadata in the second
[01:18:18] about 11GiB uncompressed or so :P
[01:18:49] metadata converted to JSON, cause php's serialization isn't really *this* portable
[01:18:56] even with work's excellent network connection this is taking a few minutes to download :-)
[01:19:13] JSON's an excellent choice, imo
[01:19:20] heh :)
[01:20:39] ES has about 180GiB of data (not taking compression into account and relying on rev_len/ar_len)
[01:21:28] mmm ES .. how many underlying Lucene instances?
[01:21:52] ES is still in MariaDB/MySQL AFAIK
[01:22:23] oh? interesting, didn't know ES could use a relational database for its underlying indexes
[01:22:41] wish I'd known that at my previous job .. Lucene's a pita
[01:22:42] ttkay: ES = External storage, not elastic search
[01:22:46] OH!
[01:22:52] sorry, I made a bad assumption
[01:29:26] ahh, I see what you mean .. filename/tab/metadata/newline
[01:29:50] excellent .. I have some tools that already take this format
[01:30:17] ttkay: The commons meta stuff? :P
[01:30:56] hoo - yeah, looking at decompressed dbq-207
[01:31:11] http://ciar.org/ttk/codecloset/select/ should jfw .. testing that now
[01:32:00] ttkay: heh :) I wrote a small PHP script to get the PHP serialized data from the commons DB into JSON
[01:32:19] line wise... went faster than expected, actually
[01:32:49] cool
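The line-wise conversion hoo mentions is only described in the log, never shown. A minimal sketch of that kind of PHP-serialized-to-JSON conversion could look like the following; the stdin/stdout plumbing, the assumption that each dump row is "filename<TAB>serialized blob" with embedded newlines escaped, and all names are illustrative guesses, not hoo's actual script.

```php
<?php
// Hypothetical sketch: read "filename<TAB>php-serialized-metadata" rows and
// emit "filename<TAB>json-metadata" rows, one line at a time. This is NOT
// the script discussed above; input format and I/O are assumptions.

$in  = fopen( 'php://stdin', 'r' );
$out = fopen( 'php://stdout', 'w' );

while ( ( $line = fgets( $in ) ) !== false ) {
	// Split only on the first tab: the serialized blob may itself contain tabs.
	$parts = explode( "\t", rtrim( $line, "\r\n" ), 2 );
	if ( count( $parts ) !== 2 ) {
		continue; // skip malformed rows
	}
	list( $fileName, $serialized ) = $parts;

	// img_metadata-style blobs are PHP-serialized; decode them...
	$metadata = @unserialize( $serialized );

	// ...and re-encode as JSON, which is far more portable than PHP's
	// native serialization format.
	fwrite( $out, $fileName . "\t" . json_encode( $metadata ) . "\n" );
}

fclose( $in );
fclose( $out );
```

Reading and writing one row at a time keeps memory use flat, which matters for a dump that is around 11 GiB uncompressed.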
[01:36:47] good night people ;)
[05:20:28] can anyone help ?
[05:21:07] anyone around ?
[05:23:30] hello
[05:23:32] what do you need?
[05:24:05] hi
[05:26:02] I need to find a property code for the parallel Wikivoyage article names
[05:26:30] for example P373 is for wikivoyage article names
[05:26:47] hmm
[05:26:48] I need the P code for wikivoyage article name
[05:27:11] https://www.wikidata.org/wiki/Special:MyLanguage/Wikidata:List_of_properties/all has all properties
[05:27:15] lemme see if I can find it
[05:27:19] * I meant P373 is for the wikicommons article names
[05:27:35] There might not be one for wikivoyage
[05:27:48] :(
[05:27:57] commons needed to link between category pages and gallery pages, which is probably why that category exists
[05:28:12] wikivoyage articles can be added to the same items as the wikipedia articles
[05:28:20] s/category/property/
[05:30:01] no P code ?
[05:30:17] I couldn't find one on that page I linked to
[05:30:35] ohh you mean just use the wikipedia item P code ?
[05:30:46] yeah! (Q code ;-) )
[05:30:58] the search bar actually works for finding those now
[05:31:12] though, I guess it has for a long time now... :p
[05:32:08] I cannot find P373 on that list
[05:32:22] if the wikicommons item doesn't appear on that list
[05:32:36] true. Not sure how else you could find it though.
[05:32:48] ...then maybe the Wikivoyage one also appears somewhere else...?
[05:33:21] my thought would be that it doesn't exist
[05:33:40] like I said, the interwiki links go on the same page so I don't imagine there would be different items
[05:34:44] oh well
[05:34:51] thanks for the help
[05:35:20] sorry I can't be more helpful :(
[05:58:10] P373 is a text string for the commons category
[05:58:14] it is not a link of any sort
[06:56:01] This "ping all the crats" thing is a bit annoying.
[07:53:50] rschen7754: eh? https://www.wikidata.org/w/index.php?diff=100065320&oldid=100058358&rcid=100310855
[07:54:31] legoktm: Probably due to the fact the task was being run on Succu and not the bot account?
[07:54:38] legoktm: well there were 2 issues at stake: he was running a fully automatic bot on his main account, and it did not seem that the particular task was approved
[07:55:38] ok
[07:57:05] probably at this point, best to just wait the rest of the 24 hours and then just approve it
[07:58:20] i run semi-automated scripts on my own account all the time, but semi-automatic means that i can pull the plug if something goes wrong or people complain
[08:35:09] * JohnLewis pokes Lydia_WMDE, aude and addshore
[08:35:23] * Lydia_WMDE pokes JohnLewis
[08:35:34] Lydia_WMDE: Dev summary? :)
[08:35:44] right...
[08:35:49] give me 15 mins? :D
[08:35:57] Let me guess 'Vacations again' :p
[08:36:02] heh
[08:36:02] Oh, 15 minutes. Alright :p
[08:36:03] no
[08:36:07] people are back now
[08:36:10] :D
[08:36:14] I know addshore is :p
[08:36:25] yeah
[08:36:29] he's still catching up though
[08:36:33] aaaaanyway
[08:36:37] getting on it
[12:08:42] DanielK_WMDE: Also merging Jens' 1-based index patch into my stuff now... :P
[12:09:07] This is going to be a complete overhaul... but we will keep a legacy interface + the new one :)
[12:25:50] morning JohnLewis
[12:26:22] addshore: Morning
[12:26:59] addshore: The poke earlier was about the dev update but Lydia_WMDE was here :p
[12:27:04] [=
[12:27:20] so do not ping everyone next time :)
[12:27:58] Stryn: Hush :p She wasn't here when I pinged her the first 100 times :p
[12:28:08] okay :P
[12:30:03] tata for now
[12:49:03] addshore: around?
[12:53:10] hoo: He just went out about 10 minutes ago.
[12:53:40] mh :/
[12:53:52] * JeroenDeDauw wonders if an addshore will appear at the office
[13:28:34] JeroenDeDauw: Outside of Wikibase: Is there a good way to access the wiki's global site id?
[13:29:04] hoo: idk, seems more like a MW thing than a WB one
[13:29:29] really depends on the context as well
[13:29:57] I want to implement a function in Scribunto that gives the site global id
[13:32:01] JeroenDeDauw: --^
[13:32:13] Alternative is that I create a mw.wikibase.globalSiteId
[13:34:36] Or... I guess I'll just hide that bit of information from the user and ask Daniel on monday... I guess the global site ids have to be integrated with core, rather than with Wikibase
[13:34:43] * then
[13:42:00] meh, presentation layer stuff is so fiddly
[13:42:50] JeroenDeDauw: Yeah... and passing Entity data back and forth with Lua can get messy as well...
[13:44:46] JeroenDeDauw: In my current version the user has a (in Lua) entity:getSnak( 'P123' ) function (returns the main snak)... then I have to pass that back to PHP to format it, which is bad, bad, bad
[13:45:42] so my current approach is in case the user passes that back to PHP just read the property id from the Snak given by Lua and get a new Snak from the Entity in PHP
[13:47:37] JeroenDeDauw: --^ I know that sounds scary... do you think it would be a clean approach to give a getClaim( propertyId ) function to the Lua users and then implement the Snak parser upon that... mh :/
[13:50:13] There can be more than one claim per property. I am not sure what you are asking either way. Lack of context and grammar errors :)
[13:50:22] Are there high level docs on the lua thing?
[13:50:30] Or some sane readable code?
[13:50:44] JeroenDeDauw: Not yet, I'm currently rewriting the Lua stuff to be saner
[13:51:03] but it's still bat shit insane, I guess :/
[13:53:22] JeroenDeDauw: Or asked the other way round... I have an array representation of a Snak, how do I best get a Snak object from that
[13:55:42] hoo: you send the request to the future, where it will get handled by https://github.com/wmde/WikibaseDataModelSerialization/blob/master/src/Wikibase/DataModel/Deserializers/SnakDeserializer.php
[13:55:52] be sure to pass in a callback that returns to the present
[13:56:12] And to compile your PHP with CTC support
[13:57:19] ... that's totally an option :P
[13:57:32] There is some crappy old thing somewhere in Wikibase that does the same, though I want to see it die soon
[13:58:29] JeroenDeDauw: mh... what if I do it a little more like Jens did: https://gerrit.wikimedia.org/r/#/c/101201/7/client/includes/scribunto/Scribunto_LuaWikibaseLibraryImplementation.php
[13:58:39] that would save us the back and fourth pain
[13:58:56] * forth
[13:59:53] oh ffs, one more abomination pretending to be a class
[14:00:03] hoo: so you want to add another collaborator to this thing?
[14:00:29] JeroenDeDauw: I've actually split that stuff out into a new class, which I first hoped to be less messy but *runs*
[14:03:47] JeroenDeDauw: That could look like this (Lua side): local entity = canIhazSomeEntityObject()
[14:03:47] entity:getProperties / or getClaims -> { 'P1', 'P42' }
[14:03:47] entity:renderFor( 'P42' ) -> ...
[14:03:57] Like that? :P
[14:10:33] hoo: it is still not clear to me what the exact question is, and I am still unaware of the context, so cannot judge if it is the right question to begin with. That clearly should be IcanHazSomeEntityObject() though
[14:13:34] hoo: sounds good to me, just don't put too much into one change set
[14:13:54] (that's re "complete overhaul")
[14:16:21] DanielK_WMDE: That's going to end up in a rebase mess, as I won't get immediate feedback
[14:16:40] but it's possible, I can split it up
[14:25:51] DanielK_WMDE: I guess I can make like 3 commits from it... and there's probably going to be an extra one with integration tests to save us from a LuaCalypse like we had before
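hoo's idea of reading only the property id from the Snak that Lua hands back and then fetching a fresh Snak from the PHP-side Entity is likewise only described, not shown. A rough sketch under that assumption might look like this; the function name, the 'property' array key, and the DataModel namespaces and method names reflect my recollection of the code base of that era and should be treated as assumptions rather than a verified API reference.

```php
<?php
// Rough sketch of the described approach: ignore everything in the array
// Lua passes back except the property id, and look the main Snak up again
// on the PHP-side Entity. Namespaces/method names are assumptions.

use Wikibase\DataModel\Entity\Entity;
use Wikibase\DataModel\Snak\Snak;

/**
 * @param Entity $entity  Entity already loaded on the PHP side
 * @param array  $luaSnak Array representation of a Snak coming back from Lua
 *
 * @return Snak|null Main snak of the first claim for that property, or null
 */
function getMainSnakForLuaSnak( Entity $entity, array $luaSnak ) {
	if ( !isset( $luaSnak['property'] ) ) {
		return null;
	}
	$propertyId = $luaSnak['property']; // e.g. 'P123'

	foreach ( $entity->getClaims() as $claim ) {
		$mainSnak = $claim->getMainSnak();
		if ( $mainSnak->getPropertyId()->getSerialization() === $propertyId ) {
			return $mainSnak;
		}
	}

	return null;
}
```

With this shape only the property id ever crosses the Lua/PHP boundary, which sidesteps the back-and-forth (de)serialization pain discussed above.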
[15:53:37] https://www.wikidata.org/wiki/Wikidata:Project_chat#New_project_to_reflect_the_information_that_is_available_because_of_Wikidata
[16:14:09] Is it just me, or does composer not retrieve the ValueView extension automatically?
[16:14:37] thewarriorwiki: It does, it's not in /vendor but /extension
[16:14:52] * extensions
[16:15:01] Alright, I'll give it another shot brb
[16:15:59] Alsom is there a page somewhere with noce install instructions? Or should I just update https://www.mediawiki.org/wiki/Wikibase#Installation as I figure it out?
[16:16:10] Also* nice*
[16:17:09] thewarriorwiki: That should actually be enough for a simple one-wiki setup
[16:18:02] My composer.json file is quite simple, but it doesn't get ValueView in vendor or extensions
[16:18:04] { "require": { "php": ">=5.3.2", "wikibase/wikibase": "*" }, "minimum-stability" : "dev" }
[16:19:27] oh... you fetch Wikibase using composer already... JeroenDeDauw will probably know more about that
[16:23:11] hoo: Alright, I'm quite excited to try mucking around with Wikibase but so far I know absolutely nothing about how to install it correctly. Is there a common composer.json file that people use, checked in somewhere?
[16:24:55] thewarriorwiki: Well, the hard part is probably getting Wikibase in the first place... but I think there's doc. about that, hang on
[16:25:35] https://www.mediawiki.org/wiki/Extension:Wikibase_Repository
[16:25:58] howdy hoo :)
[16:26:06] and https://www.mediawiki.org/wiki/Extension:Wikibase_Client
[16:26:15] hi addshore :)
[16:27:17] apparently grrrit-wm is broke? :P
[16:27:21] (PS2) Addshore: Merge References for Statements in wbmergeitems [extensions/Wikibase] - https://gerrit.wikimedia.org/r/106897
[16:27:27] or not ^^ :P
[16:28:23] hoo: was there a bug open for that?
[16:28:50] addshore: Sure... the one which denny commented on
[16:29:07] https://bugzilla.wikimedia.org/show_bug.cgi?id=58850
[16:29:47] [=
[16:29:57] (PS3) Addshore: Merge References for Statements in wbmergeitems [extensions/Wikibase] - https://gerrit.wikimedia.org/r/106897
[16:30:00] wow... the tests log is a little messy :P
[17:12:14] "Error: Property Entity unserializer expects a "datatype" field"
[17:12:33] addshore: Is the repo UI working for you?
[18:16:26] (PS1) Hoo man: Refactor the PHP part of the Lua library [extensions/Wikibase] - https://gerrit.wikimedia.org/r/106905
[18:18:00] (PS2) Hoo man: Refactor the PHP part of the Lua library [extensions/Wikibase] - https://gerrit.wikimedia.org/r/106905
[18:21:45] (PS3) Hoo man: Refactor the PHP part of the Lua library [extensions/Wikibase] - https://gerrit.wikimedia.org/r/106905
[18:25:52] (Abandoned) Hoo man: (Bug 54324) [DON'T MERGE] Sequences in Lua should start with 1 [extensions/Wikibase] - https://gerrit.wikimedia.org/r/96232 (owner: Jens Ohlig)
[18:28:34] (PS4) Hoo man: Refactor the PHP part of the Lua library [extensions/Wikibase] - https://gerrit.wikimedia.org/r/106905
[18:29:19] DanielK_WMDE: https://gerrit.wikimedia.org/r/106905
[19:05:44] Hey everybody
[19:07:43] Is anybody free for a few quick questions?
[19:56:00] ChinRake: never ask to ask, just ask!
[20:05:28] !ask
[20:05:28] https://www.mediawiki.org/wiki/Extension:Ask
[20:05:33] >_>
[20:05:41] !del ask
[20:05:45] !ask del
[20:05:46] Successfully removed ask
[20:06:05] JeroenDeDauw: I doubt that extension can fulfill that specific task, yet :P
[20:06:26] hoo|away: it is not an extension
[20:06:44] did you decouple that?
[20:08:22] [wmde/Ask] JeroenDeDauw pushed 1 commit to master [+0/-0/±1] http://git.io/2AcJ5A
[20:08:23] [wmde/Ask] JeroenDeDauw 820838c - Update README.md
[20:09:26] hoo: from what?
[20:09:36] MediaWiki? :P
[20:09:53] wasn't it an extension some time back
[20:13:26] hoo: nope. this code has nothing to do with mw
[20:18:09] not bad :)
[21:28:02] Bug query: Is the "Unresponsive script" timeout error a known problem, or should I submit a bug? I.e. I get this error on most large Wikidata pages, in Firefox, and have done for quite a few months: http://i.imgur.com/BnjJ1qZ.png
[21:28:34] quiddity: Known bug which is being worked on
[21:28:40] thanks hoo :)
[21:29:48] (I did look around bugzilla, but couldn't see anything. Possibly I was using the wrong keywords, or was searching the wrong product/component. Arrrg, bugzilla! >.< )
[21:30:17] heh, will find the bug for you ;)
[21:33:46] quiddity: https://bugzilla.wikimedia.org/show_bug.cgi?id=54098 ;)
[21:34:11] perfect, much thanks hoo :)
[22:35:23] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1545 (master - 67a353e : Translation updater bot): The build was fixed.
[22:35:23] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/e6438e416a17...67a353e52d8d
[22:35:23] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/16794931