[05:07:39] https://query.wikidata.org/ <-- not working for me, i getting timeouts
[05:07:51] s/i/i'm/
[06:04:42] PROBLEM - Blazegraph Port on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:04:43] PROBLEM - Blazegraph process on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:04:43] PROBLEM - Updater process on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:04:53] PROBLEM - SSH on wdqs1003 is CRITICAL: Server answer
[06:05:02] PROBLEM - configured eth on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:05:13] PROBLEM - Disk space on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:05:23] PROBLEM - dhclient process on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:05:33] PROBLEM - DPKG on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:05:33] PROBLEM - Check systemd state on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:05:43] PROBLEM - puppet last run on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:06:03] PROBLEM - WDQS HTTP Port on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:20:06] https://query.wikidata.org/ <-- "java.lang.OutOfMemoryError: unable to create new native thread"
[06:23:49] edward: Looking
[06:24:09] Ah, I see
[06:26:20] hoo: you're also getting errors?
[06:27:42] PROBLEM - IPMI Sensor Status on wdqs1003 is CRITICAL: CHECK_NRPE: Error - Could not complete SSL handshake.
[06:31:34] edward: Yeah… I'm looking into this, but can't resolve that myself
[06:31:44] thanks
[06:47:54] PROBLEM - Host wdqs1003 is DOWN: PING CRITICAL - Packet loss = 100%
[06:48:32] RECOVERY - dhclient process on wdqs1003 is OK: PROCS OK: 0 processes with command name dhclient
[06:48:42] RECOVERY - Host wdqs1003 is UP: PING OK - Packet loss = 0%, RTA = 0.28 ms
[06:48:42] RECOVERY - DPKG on wdqs1003 is OK: All packages OK
[06:48:43] RECOVERY - Check systemd state on wdqs1003 is OK: OK - running: The system is fully operational
[06:49:02] RECOVERY - SSH on wdqs1003 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u3 (protocol 2.0)
[06:49:04] RECOVERY - configured eth on wdqs1003 is OK: OK - interfaces up
[06:49:12] RECOVERY - WDQS HTTP Port on wdqs1003 is OK: TCP OK - 0.000 second response time on 127.0.0.1 port 80
[06:49:22] RECOVERY - Disk space on wdqs1003 is OK: DISK OK
[06:50:43] RECOVERY - puppet last run on wdqs1003 is OK: OK: Puppet is currently enabled, last run 32 seconds ago with 0 failures
[06:57:39] hi
[06:57:42] RECOVERY - IPMI Sensor Status on wdqs1003 is OK: Sensor Type(s) Temperature, Power_Supply Status: OK
[06:58:20] is there someone here
[06:58:36] sure
[06:59:18] is this thing working ?
[06:59:52] it is
[07:00:19] i need help
[07:00:34] are you free
[07:01:28] Just ask your question(s) and someone will eventually answer
[07:02:37] I made query and i test it and its working
[07:03:39] but when i post it in wiki nothing happened
[07:05:12] Mojackjutaily: what does 'post it in the wiki' mean?
[07:05:49] umm
[07:05:58] https://ar.wikipedia.org/wiki/%D9%88%D9%8A%D9%83%D9%8A%D8%A8%D9%8A%D8%AF%D9%8A%D8%A7:%D9%85%D8%B4%D8%B1%D9%88%D8%B9_%D9%88%D9%8A%D9%83%D9%8A_%D9%85%D9%88%D8%B3%D9%8A%D9%82%D9%89/%D8%A3%D8%B9%D9%85%D8%A7%D9%84_%D8%AA%D8%AD%D8%B1%D9%8A%D8%B1%D9%8A%D8%A9/%D8%B4%D8%B1%D9%83%D8%A7%D8%AA_%D8%A7%D9%84%D8%AA%D8%B3%D8%AC%D9%8A%D9%84%D8%A7%D8%AA
[07:06:01] here
[07:07:56] PROBLEM - High lag on wdqs1003 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0]
[07:08:26] Mojackjutaily: you're trying to use the {{Wikidata list}} template on arwiki and it isn't working?
[07:08:26] [1] https://www.wikidata.org/wiki/Template:Wikidata_list
[07:08:43] edward: Yes
[07:11:15] Mojackjutaily: the {{Wikidata list}} transclusion looks broken
[07:11:16] [2] https://www.wikidata.org/wiki/Template:Wikidata_list
[07:11:38] there should be nothing between the opening '{{' and Wikidata list
[07:11:53] you have a heading and {{#time: Y-m-d H:i|{{REVISIONTIMESTAMP}}}} in there
[07:11:53] [3] https://www.wikidata.org/wiki/Template:REVISIONTIMESTAMP
[07:12:29] check the example on https://ar.wikipedia.org/wiki/%D9%82%D8%A7%D9%84%D8%A8:Wikidata_list
[07:12:57] yah i used heading and {{#time: Y-m-d H:i|{{REVISIONTIMESTAMP}}}} before and it was working
[07:12:58] [4] https://www.wikidata.org/wiki/Template:REVISIONTIMESTAMP
[07:14:34] the 'Wikidata list' template needs to be included in the page. you need to fix your page so the template is included properly
[07:14:53] nothing between '{{' and 'Wikidata list'
[07:19:43] look here i made exactly same list
[07:19:56] https://ar.wikipedia.org/wiki/%D9%88%D9%8A%D9%83%D9%8A%D8%A8%D9%8A%D8%AF%D9%8A%D8%A7:%D9%85%D8%B4%D8%B1%D9%88%D8%B9_%D9%88%D9%8A%D9%83%D9%8A_%D9%85%D9%88%D8%B3%D9%8A%D9%82%D9%89/%D8%A3%D8%B9%D9%85%D8%A7%D9%84_%D8%AA%D8%AD%D8%B1%D9%8A%D8%B1%D9%8A%D8%A9/%D8%A3%D9%87%D9%85_%D8%A7%D9%84%D8%A3%D9%84%D8%A8%D9%88%D9%85%D8%A7%D8%AA_%D8%BA%D9%8A%D8%B1_%D8%A7%D9%84%D9%85%D9%88%D8%AC%D9%88%D8%AF%D8%A9
[07:20:39] not exactly but similar
[07:20:45] sorry, my mistake
[07:21:01] no its ok
[07:21:14] i was reading the include wrong because the text is right-to-left
[07:21:26] lol
[07:21:49] yah Arabic is different
[07:23:22] just delete all the Arabic letters
[07:50:04] no one was able to figure it out?
[09:49:31] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1962 bytes in 0.129 second response time
[09:59:31] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1947 bytes in 0.096 second response time
[12:34:18] RECOVERY - High lag on wdqs1003 is OK: OK: Less than 30.00% above the threshold [600.0]
[12:54:12] o/ timezone-appropriate greetings, all. Still looking for someone who can spend some time with me looking at implementation details for https://phabricator.wikimedia.org/T177022
[14:29:33] Technical Advice IRC meeting starting in 30 minutes in channel #wikimedia-tech, hosts: @addshore & @Christoph_Jauera_(WMDE) - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:52:40] marktraceur: that might be DanielK_WMDE, addshore or Thiemo_WMDE
[15:52:46] (sorry only seeing it now)
[15:52:55] ohalo
[15:53:17] I still have that ticket open in my browser! :D Is there a WIP patch to look at yeT?
[15:53:18] itsanaddshore! \o/
[15:53:24] hi Lydia_WMDE !
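The fix edward is describing for the {{Wikidata list}} page boils down to this: the transclusion has to open with the template name directly after the braces, e.g. "{{Wikidata list|sparql=...|columns=...}}" followed later by "{{Wikidata list end}}", with any heading or "{{#time: Y-m-d H:i|{{REVISIONTIMESTAMP}}}}" placed outside the braces. Below is a small sketch (Python, using the mwparserfromhell library; both page texts are invented examples, not the actual arwiki page) of why the broken form stops being recognisable as a "Wikidata list" transclusion at all.

    # Sketch: why extra text between '{{' and the template name breaks the list.
    # Both page texts are made-up examples, not the real arwiki page.
    import mwparserfromhell

    good = ("{{Wikidata list|sparql=SELECT ?item WHERE { ?item wdt:P31 wd:Q5 }"
            "|columns=label:Name}}\n{{Wikidata list end}}")
    bad = ("{{== Albums ==\n{{#time: Y-m-d H:i|{{REVISIONTIMESTAMP}}}}\n"
           "Wikidata list|columns=label:Name}}")

    for text in (good, bad):
        names = [str(t.name).strip() for t in mwparserfromhell.parse(text).filter_templates()]
        # Only the clean version contains a template whose name is exactly
        # "Wikidata list", which is what list-updating tools look for.
        print("Wikidata list" in names, names)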
[15:53:28] its cold :(
[15:53:36] ohnoes
[15:53:51] it is too warm here for the season
[15:53:53] so come to berlin :D
[15:54:37] hi Lydia_WMDE
[15:54:37] addshore: Not really, I'm a little confused how to hook things up
[15:54:47] hey Harmonia_Amanda :)
[15:55:15] addshore: I wrote some lookup service classes, but how to get those to be discoverable by generalized code in wbgetentities is beyond me right now
[16:03:52] marktraceur: is it on gerrit? :D
[16:04:34] addshore: Not yet, I can dump that in if you want. One moment.
[16:04:51] yeh, just to give me an idea of which lookups you have done
[16:05:29] addshore: https://gerrit.wikimedia.org/r/384996
[16:09:14] Okay, so I think step 1 marktraceur would be to alter wbgetentities so that it can look up entities using titles on the local site (totally ignoring media info for now)
[16:09:32] so wbgetentities&localtitle=Q1|Property:P2
[16:10:06] You'll probably have to touch the getentities api module class a bit to make it use some entityid lookups (which it doesn't currently use) and probably a title parser of some sort
[16:11:03] Step 2) would be to allow multiple entityid lookups to be registered / configured for the api module, so when wikibase is used alongside the mediainfo extension, mediainfo would register the lookups you have in that gerrit patch, and then magically wbgetentities would be able to look them up too!
[16:12:14] You might need some sort of DispatchingEntityIdLookup in wikibase too, as it doesn't look like there is one currently
[16:13:15] For what that should look like, you can probably take a look at DispatchingEntityLookup which is in datamodel services
[16:15:13] addshore: OK...that makes sense then
[16:15:48] feel free to add me to any patches :) I'm quite busy this week, but next week I'll have much more time!
[16:16:05] addshore: Yeah no problem, we're getting there slowly but surely
[16:16:23] Yup. I imagine much of Wikibase can be a steep learning curve
[16:17:18] addshore: One thing I'm wondering at the moment is, once you have a localtitles=File:Foobar.jpg request come in, how do you decide it's a MediaInfo ID that you're looking for? I imagine we can't just assume that, but maybe I'm not quite right
[16:18:04] At least I can imagine an extension that links file pages to entities representing the subject instead of the file itself, or the author, or whatever
[16:19:18] So, I believe right now each entity has a namespace, Items are main, properties are in the property namespace and i guess media info are in the file namespace
[16:19:34] correct
[16:19:39] I guess that could be part of the logic in the dispatching lookup
[16:19:41] though that should change right?
[16:19:42] addshore: MediaInfo:M4 is how I see them now
[16:19:42] M4: Phabricator project labels - https://phabricator.wikimedia.org/M4
[16:19:47] ty stashbot
[16:19:57] Lydia_WMDE: I also assume that assumption is always going to be correct?
[16:20:06] mediainfo should be integrated in the file page
[16:20:09] addshore: I imagine that will change once MCR becomes a thing
[16:20:12] marktraceur: ooooh, i guess that is because we done have MCR yet
[16:20:16] ^ that.
[16:20:21] though it'd still be an exclusive namespace
[16:20:22] *dont
[16:21:02] yeh, its a bit more annoying / ugly without mcr done I guess
[16:21:02] Maybe I'm overthinking things, but is it possible that there may be two Wikibase-like content slots on the file namespace?
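addshore's "step 2" above amounts to a dispatcher that routes a title to one of several registered entity-ID lookups (Wikibase already has DispatchingEntityLookup in the data-model services to crib from, and the real code is PHP). The sketch below is only a language-neutral illustration of that dispatching idea in Python; the class name, method names and the toy lookups are invented for the example and are not the actual Wikibase interfaces.

    # Illustrative sketch of a dispatching entity-ID lookup: pick a registered
    # lookup based on the title's namespace. All names here are invented.
    from typing import Callable, Dict, Optional


    class DispatchingEntityIdLookup:
        """Routes a page title to whichever lookup was registered for its namespace."""

        def __init__(self) -> None:
            # namespace prefix ("" = main) -> callable returning an entity ID or None
            self._lookups: Dict[str, Callable[[str], Optional[str]]] = {}

        def register(self, namespace: str, lookup: Callable[[str], Optional[str]]) -> None:
            self._lookups[namespace] = lookup

        def get_entity_id_for_title(self, title: str) -> Optional[str]:
            namespace, sep, _ = title.partition(":")
            if not sep:                 # no prefix -> main namespace, e.g. "Q42"
                namespace = ""
            lookup = self._lookups.get(namespace)
            return lookup(title) if lookup else None


    # Toy wiring: items live in the main namespace, properties in "Property",
    # and a MediaInfo extension would register something for "File".
    dispatcher = DispatchingEntityIdLookup()
    dispatcher.register("", lambda t: t if t.startswith("Q") else None)
    dispatcher.register("Property", lambda t: t.split(":", 1)[1])
    dispatcher.register("File", lambda t: "M123")   # stand-in for a MediaInfoIdLookup
    print(dispatcher.get_entity_id_for_title("File:Foobar.jpg"))  # -> M123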
[16:21:25] i don't have info about any plans right now
[16:21:29] but not impossible i guess
[16:22:16] I suppose the relevant question is, is it impossible enough that we don't care
[16:22:26] atm i'd say yeah
[16:22:35] OK, that makes life easier
[16:22:44] but DanielK_WMDE might disagree
[16:22:49] I can assume a File<-->MediaInfo 1:1 link, and change it to MCR in the future
[16:22:52] but yeah i guess for now work with that
[16:23:17] If we really need a content-slot disambiguation parameter for the wbgetentities API later on, we can do it
[16:23:25] I believe in us
[16:23:28] *nod*
[16:23:30] \o/
[16:23:43] :D
[16:24:08] marktraceur: yeh, I think that assumption is okay in MediaInfo for now, with a reminder to kill it once we have mcr
[16:24:30] I wonder what the current record is for "// TODO" in a patch
[16:24:37] lol
[16:26:59] :D marktraceur I'm sure we could write some code to go and find that out for us!
[16:27:54] time well spent ;-)
[16:29:55] Lydia_WMDE: I am going to try and look at the docker wikibase stuff for the birthday next week :D not leaving it until the last minute at all!
[16:30:06] addshore: <3
[16:30:07] :P
[16:30:45] Does anyone know of a wiki which uses Wikidata descriptions via Lua or the parser function?
[16:34:52] hoo, ask on #wikidata
[16:35:05] Okay, we are on #wikidata.
[16:51:51] marktraceur, addshore: i missed the original question.
[16:52:05] why should we have two wikibase entity slots on the file namespace?
[16:53:06] marktraceur: wbgetentities does not use pages or titles or namespaces or slots. it's entirely based on entity IDs.
[16:53:07] DanielK_WMDE: Admittedly that's a contrived example of one way our assumptions could come back to bite us, but as an example, one could store information about the file separately from information about a piece of artwork in the file, on the same wiki...but again, contrived
[16:54:02] DanielK_WMDE: We're talking about wbgetentities using filenames (and wbgetentities *does* have a titles/sites parameter combination currently)
[16:54:24] marktraceur: it would work if they use different entity types. having the *same* entity type in two slots would require the slot to be determined from the ID somehow.
[16:55:16] DanielK_WMDE: Yeah, I think the ultimate answer is either to return both (and let the client figure out its mistakes) or to accept a "slot" parameter that can disambiguate which slot the client wanted
[16:55:30] marktraceur: ah, titles/sites for addressing by connected page, right. nothing to do with slots. This does *not* give you the entity stored on the page you give. it loads the entity *connected* to that page.
[16:55:33] Like localtitles=File:Foobar.jpg&slot=mediainfo or whatever
[16:55:47] for data items, this is very different (typically even on a different wiki)
[16:56:15] DanielK_WMDE: Right. Do you think things will get completely screwy if we later try to change wbgetentities from handling connected pages to a wikibase content slot within a page?
[16:56:26] for mediainfo, it will eventually be equivalent, since the mediainfo on the file description page is conceptually "connected" to that very page. but that's really a confusing edge case.
[16:57:16] marktraceur: i see no need to change that.
[16:57:38] marktraceur: what should become more flexible is how wikibase determines which pages are "connected".
[16:57:38] Maybe that use case will be completely trivial with the MCR work, and this workaround is only needed in the interim?
[16:57:58] what workaround, precisely?
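A concrete way to see the distinction DanielK_WMDE draws here: wbgetentities either takes entity IDs directly, or a sites/titles pair that resolves to the entity *connected* to that client page through a sitelink, not an entity stored on the page itself. A quick sketch against the public Wikidata API (the requests package and the Berlin example are assumptions of this illustration, not something from the log):

    # Two ways of addressing the same entity through wbgetentities.
    import requests

    API = "https://www.wikidata.org/w/api.php"

    # 1) Directly by entity ID.
    by_id = requests.get(API, params={
        "action": "wbgetentities", "ids": "Q64",
        "props": "labels", "languages": "en", "format": "json",
    }).json()

    # 2) By connected client page: the enwiki sitelink "Berlin" resolves to the
    #    item connected to that page (Q64 again), not to anything stored on enwiki.
    by_sitelink = requests.get(API, params={
        "action": "wbgetentities", "sites": "enwiki", "titles": "Berlin",
        "props": "labels", "languages": "en", "format": "json",
    }).json()

    print(sorted(by_id["entities"]), sorted(by_sitelink["entities"]))  # ['Q64'] ['Q64']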
[16:58:29] DanielK_WMDE: The connection between file pages and MediaInfo entities, and the changes I'm making to the API to make that connection meaningful
[16:58:55] marktraceur: i do not think any changes to the api are needed. i would try to avoid making any. why do you think they are needed?
[16:59:43] marktraceur: i'm proposing to allow SiteLinkLookup to return a MediaInfoId when called for local/File:Foo.jpg
[17:00:11] if you have a SiteLinkLookup that does that, the API has no need to change, right?
[17:00:36] DanielK_WMDE: I mean, in theory, yes, but I don't think the core code that corresponds to SiteLinkLookup handles 'local' at all
[17:00:54] I'm not even sure we could trust the string 'local' to necessarily be available or consistent across sites
[17:01:03] sure, the repo is a client to itself. it can be addressed like any other client - using its wiki ID
[17:01:14] (commonswiki, in that case)
[17:01:27] I've certainly tried that locally and had no luck
[17:01:32] Also on the structured-commons beta
[17:01:35] But SiteLinkLookup currently assumes that only Items can be "connected" to pages.
[17:01:39] that would need to be changed
[17:02:21] that may be a config issue then. it definitely should work. if not, that's a bug.
[17:02:55] it should currently work for *items*. it doesn't work for mediainfo. because that currently can't be "connected".
[17:03:01] i'm proposing to change that
[17:03:03] Right
[17:03:28] Well, if you point me at how to change that I can get into it
[17:04:28] In SiteLinkLookup and its implementations, all references to ItemId need to be replaced by EntityId.
[17:05:01] then we have to decide how the "connection" between mediainfo and file page should work. we could store it in the db, or do it programmatically.
[17:05:55] marktraceur: actually... there is already MediaInfoIdLookup and FilePageLookup. That's the functionality we need. We want to fold that into the SiteLinkLookup interface somehow.
[17:06:17] I suppose we'll need some sort of multi-lookup implementation of SiteLinkLookup.
[17:07:03] hm, only really if Items and MediaInfo live in the same repo. should be possible, even if we don't need that in production
[17:09:21] DanielK_WMDE: As far as I can tell there's only one implementation of SiteLinkLookup right now, so if it's just a matter of trying a lookup with whatever lookup services have been registered, that seems pretty straightforward
[17:10:01] i count four, but only one "real" implementation
[17:10:28] yes, that would work for now, as long as the wiki only has MediaInfo *or* Items.
[17:10:46] if it has both, it needs a SiteLinkLookup that somehow figures out where to look for what.
[17:11:32] ah - we can't easily store the mediainfo "connections" in the db, even if we wanted: the table doesn't track the entity type. it assumes "item" for everything.
[17:12:16] so yea - an implementation of SiteLinkLookup based on MediaInfoIdLookup and FilePageLookup should do the trick for now
[17:15:55] marktraceur: if implementing SiteLinkLookup turns out to be problematic, an alternative (yet not-so-pretty, imho) option is to hook into ItemByTitleHelper somehow.
[17:16:10] oh, and that thing needs to at least be renamed
[17:16:20] it no longer guarantees Items.
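What gets agreed here, "an implementation of SiteLinkLookup based on MediaInfoIdLookup and FilePageLookup", is essentially an adapter that answers "which entity is connected to this local file page" and has to hand back an EntityId rather than only ItemIds. The sketch below is again Python pseudocode only (the real interfaces are PHP); the class name, method names and the toy lookups at the end are invented for illustration.

    # Sketch of the proposed adapter: resolve "entity connected to a local file
    # page" by delegating to the two existing MediaInfo lookups. Names invented.
    from typing import Callable, Optional, Tuple


    class MediaInfoSiteLinkLookup:
        def __init__(self,
                     media_info_id_lookup: Callable[[str], Optional[str]],
                     file_page_lookup: Callable[[str], Optional[str]],
                     local_site_id: str) -> None:
            self._to_id = media_info_id_lookup    # File page title -> MediaInfo ID
            self._to_page = file_page_lookup      # MediaInfo ID -> File page title
            self._site = local_site_id            # e.g. "commonswiki"

        def entity_for_link(self, site_id: str, title: str) -> Optional[str]:
            # Note the generalisation discussed above: this returns an "M..." ID,
            # i.e. any EntityId, where SiteLinkLookup today only deals in ItemIds.
            if site_id != self._site or not title.startswith("File:"):
                return None
            return self._to_id(title)

        def link_for_entity(self, entity_id: str) -> Optional[Tuple[str, str]]:
            page = self._to_page(entity_id)
            return (self._site, page) if page else None


    # Toy wiring with fake lookups, just to show the flow of calls:
    lookup = MediaInfoSiteLinkLookup(lambda t: "M42", lambda m: "File:Foo.jpg", "commonswiki")
    print(lookup.entity_for_link("commonswiki", "File:Foo.jpg"))  # -> M42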
[17:16:35] DanielK_WMDE: I can't just dump MediaInfoIdLookup into core Wikibase, though - probably need to register lookup classes in extensions somehow
[17:17:13] Could pretty easily add a test function/closure that determines whether the class will return useful results for a page
[17:18:40] well, the simplest way would be to allow the implementation of SiteLinkLookup to be replaced based on config.
[17:18:54] Wikibase doesn't use wiring files for that. if it did, this would be trivial
[17:21:01] DanielK_WMDE: But since it doesn't, lookup registration seems like a reasonable way to go about it?
[17:21:47] marktraceur: yes
[17:22:05] Cool.
[17:22:17] So here goes attempt #3... :)
[17:22:59] marktraceur: look for "new SiteLinkTable". that code should depend on config/registry somehow.
[17:23:13] note that there is no good concept of registries in wikibase (or in mw core).
[17:23:17] it's all mixed up with configuration
[17:23:27] one day i'll fix that...
[17:23:47] DanielK_WMDE: Yeah, I figure it'll end up being some hook or something
[17:24:05] Which is not ideal but also not the worst thing in the world
[17:24:33] yea, and it's easy enough to change the registration interface, as long as there isn't a ton of 3rd party things using it
[17:25:29] Honestly probably fine to just use a hook rather than a registration...the MediaInfo code could just determine whether a file page is being looked up and return a MediaInfoEntity or whatever
[17:25:51] (the class names are starting to bleed together a little bit at this point)
[17:27:36] Though it seems like the first hurdle is to figure out what the actual wiki ID is for the local site...
[17:29:57] marktraceur: wfWikiID() will work in a pinch.
[17:30:20] DanielK_WMDE: Well, I assume it's wikiid from meta=siteinfo, but that's not working on structured-commons
[17:30:29] http://structured-commons.wmflabs.org/api.php?action=wbgetentities&sites=commonswiki&titles=File:LighthouseinDublin.jpg
[17:30:43] it only works if the sites table is set up correctly...
[17:30:51] Ah, so it's not my fault. :)
[17:30:52] another nasty thing i really want to fix :(
[17:32:53] marktraceur: i think for now, wfWikiID() is your best bet
[17:33:11] DanielK_WMDE: In the code sure, I just meant for API clients
[17:33:18] Though the Sites stuff may get in the way, if not set up correctly
[17:33:21] Sounds like I was doing it right but the sites table is the problem
[17:33:38] ah, right. the client needs to actually specify.
[17:33:54] i guess it would make sense to allow the client to use some kind of alias for "local".
[17:33:57] maybe "local" :)
[17:34:00] or "*" or something
[17:34:31] maybe we could allow the sites parameter to be omitted entirely.
[17:37:41] I'm honestly not sure if the sites table is meant to hold information about the local site
[17:38:04] marktraceur: it definitely is.
[17:48:07] marktraceur: just copy sites and site_ids from somewhere sane. should be the same wiki family, though
[18:00:54] I mean, I don't have anywhere to copy from that would have my localhost testwiki set up as a site
[18:01:28] But also for some fun reason, the addSite script has decided that the filepath I entered is int(1), and now I get to deal with that
[18:05:46] Finally.
[18:05:58] Yeah, I don't think that's all.
[19:54:07] How does a nulledit on Wikidata work?
[19:54:49] Some reports are based on SPARQL now and sometimes the number of statements in pageprops isn't correct.
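On the question above of what an API client should actually pass as sites=: the wiki reports an ID for itself under meta=siteinfo (the "wikiid" marktraceur mentions), and the wbgetentities call then only resolves if the sites table knows about that ID. A hedged sketch, again with the requests package; the beta host is the one quoted in the log and may no longer exist:

    # Sketch: read the wiki's own ID from siteinfo, then use it as `sites`.
    # Whether the second call succeeds depends on the sites table being set up.
    import requests

    API = "http://structured-commons.wmflabs.org/api.php"   # host from the log; may be gone

    site_info = requests.get(API, params={
        "action": "query", "meta": "siteinfo", "siprop": "general", "format": "json",
    }).json()
    wiki_id = site_info["query"]["general"]["wikiid"]        # e.g. "commonswiki"

    entities = requests.get(API, params={
        "action": "wbgetentities", "sites": wiki_id,
        "titles": "File:LighthouseinDublin.jpg", "format": "json",
    }).json()
    print(wiki_id, list(entities.get("entities", {})))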
[21:10:36] sjoerddebruin: nulledits should "just work", but there is no way to create one via the UI, I'm afraid.
[21:10:43] well, you could edit and revert yourself ;)
[21:11:27] I don't like that. I need to do LinksUpdate, but there is no API entry for that AFAIK.
[21:19:06] sjoerddebruin / DanielK_WMDE_ : Null edit with link refresh using the api
[21:19:23] That's what I used for updating page_props in the past
[21:20:01] https://github.com/multichill/toollabs/blob/master/bot/wikidata/page_props_purge.py#L46
[21:20:39] Ah, let me write a script then.
[21:21:30] https://www.wikidata.org/w/api.php?action=help&modules=purge
[21:21:37] Yeah, I've found it now.
[21:21:44] You can just re-use that python script
[21:21:54] But I want to have a button in the UI. :P
[21:22:32] Oh, that should just be a little bit of javascript
[21:27:45] Hm, I don't think it makes a huge difference. I'm still depending on all the different WDQS servers. :|
[22:51:44] https://grafana.wikimedia.org/dashboard/db/wikidata-dispatch?refresh=1m&orgId=1&from=now-24h&to=now
[22:52:03] Warned someone who edited two days with over 200 edits per minute now.
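For the null-edit question above: what the linked page_props_purge.py script does is not an edit at all but an API purge with a forced links update, which re-runs LinksUpdate and so rebuilds page_props without creating a revision. A minimal sketch with the requests package (Q4115189, the Wikidata sandbox item, is just an example target; anonymous purges may be rate-limited):

    # Sketch: "null edit with link refresh" via the purge API.
    import requests

    resp = requests.post("https://www.wikidata.org/w/api.php", data={
        "action": "purge",
        "titles": "Q4115189",     # example target: the Wikidata sandbox item
        "forcelinkupdate": 1,     # re-run LinksUpdate so page_props is rebuilt
        "format": "json",
    })
    print(resp.json())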