[07:19:57] PROBLEM - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1419 bytes in 0.306 second response time
[08:21:57] RECOVERY - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is OK: HTTP OK: HTTP/1.1 200 OK - 1413 bytes in 0.118 second response time
[11:55:34] Wikidata is sloooooww.
[14:04:57] Yes.
[14:15:46] Does the Wikidata API provide information about item content? I.e. whether an item contains some property, what value that property has in that item, etc.? The API request action=query&prop=pageprops doesn't provide this info
[15:24:53] mbh: Yes, https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities
[15:43:07] jem: thanks. How do I find out in which Wikipedias an article has "Featured" or "Good" status? https://www.wikidata.org/w/api.php?action=wbgetentities&format=xml&sites=enwiki&titles=Dishonored&props=info%7Csitelinks%7Csitelinks%2Furls%7Caliases%7Clabels%7Cdescriptions%7Cclaims%7Cdatatype doesn't provide this info
[15:58:01] mbh: it does
[15:58:02] Q17437796
[15:58:14] Look for that... the badges are what you're looking for
[16:05:24] thanks. Do badges indicate only FAR/GAR status, or something else?
[16:08:14] mbh: They can indicate several statuses. We have badges for: good article, featured article, recommended article, featured list, featured portal
[16:10:22] OK. Is it possible to find articles by badge, i.e. to find all articles in one language section, or in all language sections, with some badge?
[16:11:13] mbh: (Using built-in functionality) No; what exactly do you mean?; no
[16:11:29] Using one of the SPARQL endpoints: Yes to (probably) all of these
[16:11:55] sjoerddebruin: So we almost have unconnectedpages support in Pywikibot.
[16:12:04] <3
[16:12:14] Should a bot create new items for pages at the Dutch Wikipedia at some point, or is the manual work keeping up?
[16:12:32] I want to create a list of articles that have FAR/GAR status in the maximum number of language sections
[16:13:18] multichill: Trying to keep up with it.
[16:13:30] Last time I checked there were like 900 articles unconnected.
[16:13:40] That's not too bad at all
[16:14:25] 1128 atm https://tools.wmflabs.org/wikidata-todo/duplicity.php
[16:14:59] mbh: There are several ways to do that, but I guess none of them are really straightforward... have you worked with SPARQL before?
[16:15:13] No
[16:15:24] Weird to see that such a heavy Wikidata user as ruwiki has 13766 unconnected pages.
[16:15:58] How do you count? https://nl.wikipedia.org/w/index.php?title=Speciaal:OngekoppeldePaginas&limit=500&offset=3500 still yields results
[16:16:10] Only ns:0?
[16:16:36] Yup, with the link I just gave.
[16:18:14] hoo: In any case, isn't the simplest and most straightforward path to go through all items and count the number of relevant badges on every item?
[16:18:59] If you want to go through ~20M items, sure
[16:19:29] More like 18.7M
[16:20:18] If you use the JSON dump, that should be doable in a reasonable amount of time; if you want to use the API, you should take a different approach
[16:20:40] by only looking at items that link to one of the badge items
[16:20:46] multichill: I think the best thing you can build soon is a tool for importing disambiguation pages.
[16:20:50] we're talking about the API; isn't an easier way to use a Labs account with access to the DB replicas?
[16:21:21] sjoerddebruin: As in creating items, or marking these items as disambiguation pages?
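The badge information discussed above sits on each sitelink in the wbgetentities response. A minimal sketch of reading it, assuming the `requests` package; the helper name is my own, and the example output in the comment is illustrative rather than an actual API response.

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def sitelink_badges(site, title):
    """Return {site ID: [badge item IDs]} for the item linked from the given page."""
    params = {
        "action": "wbgetentities",
        "sites": site,
        "titles": title,
        "props": "sitelinks",   # each sitelink entry carries its badges
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    badges = {}
    for entity in data.get("entities", {}).values():
        for sitelink in entity.get("sitelinks", {}).values():
            badges[sitelink["site"]] = sitelink.get("badges", [])
    return badges

# sitelink_badges("enwiki", "Dishonored") should map every linked wiki to its
# badge items, e.g. {"enwiki": ["Q17437796"], "dewiki": [], ...}; a non-empty
# list marks a wiki where the article carries a badge.
```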
[16:21:42] Hm, but first try to find out whether it already exists.
[16:22:18] by searching for the same pagename (with P31) or "pagename (disambiguation)"
[16:23:20] I can mass-import them with a tool, but most pages can't be imported because there is already an item with the same label. You can disable that check, but then you need to merge them afterwards.
[16:23:34] mbh: You don't have that data there either... you have the dumps at hand and the metadata
[16:23:41] but no data about badges specifically
[16:24:40] where are the JSON dumps, here: https://dumps.wikimedia.org/wikidatawiki/20150704/ ?
[16:24:56] Meh, only 22 pages without an item.
[16:25:52] If we want a better workflow, people should be asked to add statements after connecting pages with the widget. Now you only get this: https://www.wikidata.org/w/index.php?title=Q20745782&action=history
[16:26:24] Oh, sweet, disambiguation is a page property :-D
[16:27:04] Btw, you need to ignore items with disambiguation and last name as P31.
[16:27:12] Those still need cleaning up.
[16:27:29] I only work on items without any claims
[16:27:53] Still talking about my idea. ;)
[16:46:52] Dammit, only 4 hits for the disambiguation pages
[16:48:40] I always got like 100 every time I checked.
[16:48:45] Things have changed then...
[16:50:41] I'm only looking at pages that already have an item
[16:52:16] Ah, ok
[16:53:05] hoo: so, I can't get a list of articles with badges through requests to the DB replicas?
[16:53:37] Kind of, yes... you can get a list of all items that link to one of the badge items
[16:53:59] all badged items will be among these (but not all of these will necessarily be badged items)
[17:33:36] JeroenDeDauw: ping
[18:35:11] Lydia_WMDE: anything noteworthy?
[18:35:29] Twitter returns nothing, which is unusual for a week :p
[18:37:30] JohnFLewis: About to send out the weekly?
[18:37:40] hoo: yeah
[18:38:10] was out all day yesterday and it escaped my mind this morning :)
[18:39:40] mh... looking through it, I don't think I have anything to add
[18:41:37] sjoerddebruin: Have you ever come across http://cths.fr/an/prosopo.php?id=106451 ?
[18:41:52] That site or that person?
[18:42:09] hoo: slow week then, guess I'll send
[18:42:21] JohnFLewis: wait
[18:42:23] I haven't been working all week (until today)
[18:42:32] So yes, very slow week for me at least
[18:42:55] JohnFLewis: Can you include my oversight thingy again? It's still running
[18:43:05] sjoerddebruin: if I have to :p
[18:43:38] The site
[18:43:44] multichill: nope
[18:43:48] JohnFLewis: Everyone was recovering from Wikimania
[18:44:02] You can include the two CC0 releases (MOMA and Walter) JohnFLewis
[18:46:48] Ugh, this is taking a long time. :P https://www.wikidata.org/wiki/Wikidata:Property_proposal/Authority_control#Catalogus_Professorum_Academiae_Groninganae-identificatiecode
[18:47:10] * sjoerddebruin wants to have stuff done, but hooray for bureaucratic things
[18:49:06] sjoerddebruin: Go ahead, no opposes, just create it
[18:49:32] I would do it if I weren't fighting autolist
[18:50:16] JohnFLewis: yeah there is. let me go and fill up twitter
[18:50:21] Not sure if I can call this consensus tbh
[18:50:59] https://www.wikidata.org/wiki/Help:Properties#Creating_a_new_property notes that there should be some kind of support.
[18:51:36] Lydia_WMDE: can you add them to https://www.wikidata.org/wiki/Wikidata:Status_updates/2015_08_01
[18:51:48] hoo: pong?
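A rough sketch of the JSON-dump approach hoo suggests above, assuming a locally downloaded wikidata-*-all.json.gz entity dump with its usual one-entity-per-line layout. Q17437796 is the badge item mentioned in the log; Q17437798 is assumed here to be the good-article badge and is worth double-checking.

```python
import gzip
import json
from collections import Counter

BADGES = {"Q17437796", "Q17437798"}  # assumed featured/good article badge items

def badge_counts(dump_path):
    """Count, per item, how many sitelinks carry one of the badges above."""
    counts = Counter()
    with gzip.open(dump_path, "rt", encoding="utf-8") as dump:
        for line in dump:
            line = line.strip().rstrip(",")
            if not line or line in ("[", "]"):
                continue   # skip the array brackets around the entity list
            entity = json.loads(line)
            badged = sum(
                1 for sitelink in entity.get("sitelinks", {}).values()
                if BADGES & set(sitelink.get("badges", []))
            )
            if badged:
                counts[entity["id"]] = badged
    return counts

# badge_counts("wikidata-20150713-all.json.gz").most_common(50) would give the
# items whose articles are badged in the most language sections, roughly the
# list asked for above.
```

The SPARQL route mentioned at 16:11 should give the same ranking in a single GROUP BY query, since the query service models badges directly on sitelinks (via the wikibase:badge predicate, as far as I recall the RDF mapping).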
[18:51:48] it'll be easier for me because I'm working on another project atm :)
[18:51:58] ok
[18:52:06] !nyan | Lydia_WMDE
[18:52:06] Lydia_WMDE: ~=[,,_,,]:3
[18:52:25] Lydia_WMDE: quit impersonating the real Lydia_WMDE, I know you are a fake one!
[18:52:34] -.-
[18:52:52] Go ahead and prove you are not a fake Lydia_WMDE then :)
[18:52:58] #wolololo
[18:54:47] how to prove whether Lydia_WMDE is real or not: get her reaction to http://wallhd.in/wp-content/uploads/2015/01/cats_nature_my_cat_wallpapers_desktop_hq_backgrounds.jpg
[18:54:50] sjoerddebruin: At least you have one +1
[18:55:26] Maybe it's time for nikki to learn how to create a property... ;)
[18:55:34] hoo: what do you want?
[18:57:27] !lydia
[18:57:27] \o/
[18:57:32] sjoerddebruin: you killed it!
[18:57:41] oh no, people are talking about me :P
[18:57:42] And then brought back a zombie version?
[18:57:57] nikki: well of course, how could they not?
[18:59:58] I'm not that important!
[19:00:05] You are. <3
[19:00:41] Yeah <3
[19:12:03] .
[19:12:12] !
[19:12:18] @
[19:13:58] JeroenDeDauw: http://wikiba.se/usage/ is not up to date
[19:14:03] But it's not in git... so :S
[19:14:59] Was a copy of https://www.mediawiki.org/wiki/Wikibase/Installation
[19:15:40] hoo: not in git?
[19:15:49] look at the top right? ;p
[19:16:21] I know, I've cloned that
[19:16:25] but it's not there
[19:16:32] oh
[19:16:34] hehe
[19:16:37] Probably got deleted
[19:16:40] bad me
[19:16:50] Let me properly update the site :)
[19:17:11] We probably have incoming links to that
[19:17:28] Can we redirect it to the original?
[19:22:20] JohnFLewis: done
[19:22:32] Lydia_WMDE: awesome
[19:22:34] hoo: submit a PR then ;p
[19:22:46] meh :P
[19:23:19] Is the .htaccess in that repo as well?
[19:23:41] hoo: look at the bloody repo then >_>
[19:23:46] lazy hoo is lazy
[19:23:59] it's not
[19:26:25] Lydia_WMDE: https://phabricator.wikimedia.org/tag/%C2%A7_wikidata-sprint-current/ is that our latest sprint?
[19:29:49] hoo: yes
[20:50:25] Great, more mess to clean up... https://www.wikidata.org/w/index.php?title=Q20746866&action=history
[21:07:43] sjoerddebruin: Looking for a mess? Check out https://www.wikidata.org/wiki/Q18122643 :P
[21:08:04] ugh
[21:08:41] I don't get https://www.wikidata.org/wiki/Wikidata:Database_reports/Most_linked_category_items
[21:08:47] Why would anyone want to do that?
[21:09:06] It feels weird to me.
[21:09:46] Do you use https://www.wikidata.org/wiki/User:Ivan_A._Krestinin/To_merge/nlwiki ?
[21:09:49] Looks interesting
[21:10:07] I use https://www.wikidata.org/wiki/User:Pasleim/projectmerge/enwiki-nlwiki
[21:10:15] FakirNL uses the other one
[21:11:59] I wonder why https://www.wikidata.org/wiki/Q2668146 and https://www.wikidata.org/wiki/Q16143609 are not merged
[21:12:07] He seems to have moved some sitelinks around
[21:12:44] Oh wait, two series
[21:13:30] Nice mess to solve
[21:13:39] Yup
[21:15:40] Who was asking about badges again? They should check out https://www.wikidata.org/wiki/User:Pasleim/Badge_statistics
[21:15:59] mbh
[21:23:02] ffs https://www.wikidata.org/w/index.php?title=Q18015470&action=history
[21:25:38] Tagging painters. Hope I didn't get too many false positives
[21:47:22] Hi :) I have an offline mirror of Wikipedia in 4 languages (4 MediaWiki installations), and I want interwiki links to be available on the 4 sites. I tried to do it using Wikibase and the JSON dump of Wikidata.
[21:47:30] Right now, I can't find a proper tool to load the data from the JSON into the 5th MediaWiki I installed (to be used as the Wikidata/Wikibase repo)
[21:47:37] Is there any existing way of doing it?
[21:48:27] I think aude was working on a way to do that
[21:48:32] dunno what the state of that is
[21:48:40] She's probably not here right now
[21:53:23] Thanks :) Is there any other way to create interwiki links based on the JSON?
[21:53:26] Or any other dump?
[21:55:31] mh
[21:55:41] You could use the XML dump and try importing that
[21:55:50] Not going to be super nice, but it will work
[21:56:25] Well, a few days ago I imported enwiki, so nothing will surprise me :P
[21:56:27] You will need to set the repo setting "allowEntityImport" to true before/while importing the XML dump
[21:56:44] But apart from that, just using the dump will probably work
[21:57:09] You should take the one with all pages, otherwise properties will be missing, which is going to be messy
[21:57:30] Cool, thanks
[21:57:43] So I need to import the dump into another MediaWiki instance
[21:57:55] The one that has the Wikibase repo on it
[21:57:57] yes
[21:58:15] K, thanks ^^
[22:31:37] PROBLEM - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1426 bytes in 0.194 second response time
[22:32:09] meh
[22:41:09] LOL
[22:48:09] Should be good again in a bit
[22:48:19] I'm running a huge load of extra dispatchers
[22:50:31] Good again
[23:02:38] RECOVERY - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is OK: HTTP OK: HTTP/1.1 200 OK - 1418 bytes in 0.184 second response time
[23:57:36] Easy chocolate (yet incredibly annoying bug when testing update.php on SQLite): https://gerrit.wikimedia.org/r/228755
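On the interwiki question above: the path hoo describes (import the all-pages XML dump into the repo wiki after setting the repo option "allowEntityImport" to true) is probably the cleanest. As a lighter alternative that only needs the JSON entity dump, a sketch along the following lines could pull out the sitelinks of the four mirrored wikis and turn them into classic [[xx:Title]] interwiki links; the site IDs below are placeholders for whatever the four languages actually are, and the dump-layout assumptions are the same as in the earlier sketch.

```python
import gzip
import json

# Placeholder site IDs for the four mirrored wikis and their language codes.
SITES = {"enwiki": "en", "dewiki": "de", "frwiki": "fr", "ruwiki": "ru"}

def interwiki_map(dump_path):
    """Map (site, title) -> [(other language code, other title), ...]."""
    links = {}
    with gzip.open(dump_path, "rt", encoding="utf-8") as dump:
        for line in dump:
            line = line.strip().rstrip(",")
            if not line or line in ("[", "]"):
                continue
            sitelinks = json.loads(line).get("sitelinks", {})
            local = [(s, sitelinks[s]["title"]) for s in SITES if s in sitelinks]
            if len(local) < 2:
                continue   # nothing to interlink
            for site, title in local:
                links[(site, title)] = [
                    (SITES[other], other_title)
                    for other, other_title in local if other != site
                ]
    return links

# Each entry can then be appended to the mirrored page as wikitext
# ("[[de:Titel]]", "[[fr:Titre]]", ...) or written into the langlinks table,
# whichever fits the local setup better.
```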