[04:07:03] Hello
[09:44:48] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1972 bytes in 0.103 second response time
[09:46:05] !admin Hi, can someone help me identify and fix the issues with the formatter URL and regex in this proposal? Would also appreciate any comments on the proposal itself. Thanks.
[09:46:05] Attention requested HakanIST sjoerddebruin revi
[09:46:20] https://www.wikidata.org/wiki/Wikidata:Property_proposal/D%26B_Hoovers_company_profile
[09:51:17] what's the problem, exactly?
[10:01:40] gotitbro: the problem is that if we store "776cc66357fe4df3" only, we can't form the URL
[10:02:02] So we probably need to store the whole of "vivaldi_technologies_as.776cc66357fe4df3" or whatnot as the value
[10:02:29] Your current suggestion would give us http://www.hoovers.com/company-information/cs/company-profile.776cc66357fe4df3.html which does not work
[10:13:10] and erm
[10:13:41] bit offtopic: I think we should modify the topic to say "attention of admin (i.e. block, protection, delete)"
[10:13:54] I think 80% of the pings I've got were not related to my function as admin
[10:15:10] AlexZ: ^ pls? :DDDD
[11:10:59] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1971 bytes in 0.084 second response time
[12:02:06] hi! working with OAuth, so we have a few questions:
[12:02:06] * [this consumer key](https://www.wikidata.org/wiki/Special:OAuthListConsumers/view/5ad72f099629357c183a1b850534c807) seems to have stopped working for the first step (Special:OAuth/initiate): we get this error in response `Error: oauth_callback must be set, and must be set to "oob" (case-sensitive), or the configured callback must be a prefix of the supplied callback.` while nothing has changed on our side. Meanwhile,
[12:02:07] this key is still valid when used with previously initialized tokens
[12:03:40] * what is the process to add new rights to existing consumer keys? we would like to add the right to create items to both https://www.wikidata.org/wiki/Special:OAuthListConsumers/view/5ad72f099629357c183a1b850534c807 and https://www.wikidata.org/wiki/Special:OAuthListConsumers/view/02151058803c2fa05a7c34b81f1873bd , to be able to solve this issue: https://github.com/inventaire/inventaire/issues/104
[12:05:20] Lucas_WMDE any clue?
[12:33:10] @revi This was my original proposal: https://www.wikidata.org/w/index.php?title=Wikidata:Property_proposal/D%26B_Hoovers_company_profile&oldid=678460325
[12:33:40] Pigsonthewings changed it, was my original proposal correct?
[12:34:45] well, that's not really an admin question, and I don't know about the specific case here
[12:35:33] alright, thanks
[15:25:35] number 12 on https://www.wikidata.org/wiki/Help:FAQ#Editing ... am I doing it wrong or does prefixing with P: no longer work?
[15:26:00] sjoerddebruin: someone was asking a couple of days ago about the missing property suggestions, do you know anything about that?
[15:26:10] nikki: see the project chat
[15:26:37] https://www.wikidata.org/wiki/Wikidata:Project_chat#Some_features_temporarily_disabled
[15:27:17] thanks
[15:46:12] Hello there! the IRC office hour will take place in 15 minutes on #wikimedia-office :)
[15:47:51] sjoerddebruin: nikki: currently in the process of re-enabling
[15:58:10] sjoerddebruin: nikki: should be back now
[15:58:22] yay!
[16:03:13] IRC office hour happening now :)
[19:52:36] hello.
[19:52:49] wikidata got problems? (:
[19:54:49] Could you be more specific?
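A note on the formatter-URL question from earlier (09:46–12:34): a Wikidata external-ID property stores only the identifier value, and the property's formatter URL (P1630) contains a $1 placeholder that the stored value is substituted into, which is why the value has to contain everything that varies in the final link. The sketch below is not from the discussion; since the proposed Hoovers property did not exist yet, it uses P2002 (Twitter username) as a stand-in and builds the full URL in SPARQL on the query service.

```sparql
# Minimal sketch (not the proposed property): how a formatter URL turns a
# stored external-ID value into a full link. P2002 is only a stand-in here.
SELECT ?item ?id ?url WHERE {
  ?item wdt:P2002 ?id .                 # the stored identifier value
  wd:P2002 wdt:P1630 ?formatterUrl .    # e.g. "https://twitter.com/$1"
  # capture the whole id and substitute it for $1 in the formatter URL
  BIND(IRI(REPLACE(?id, "^(.+)$", ?formatterUrl)) AS ?url)
}
LIMIT 10
```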
[19:55:25] Yes, we do. SothoTalKer is around, for example D:
[19:56:30] (no really, what broke? Loads fine for me but didn't try editing)
[19:56:39] reosarevok: don't mess wiz ze germans
[19:57:26] i sometimes get a sparql error when checking for statement constraints o.o
[19:57:57] WikidataFacts: ^
[19:58:13] distinct values constraint Help Discuss
[19:58:13] The SPARQL query resulted in an error.
[19:58:23] hm
[19:58:37] I’d have to check if we log any more details than that…
[19:58:47] does it work if you reload the page or is it persistent?
[19:59:23] a simple F5 reload does nothing, but a cache purge resets it.
[20:00:23] hm
[20:00:31] I didn’t think we cached those errors :/
[20:00:37] well, it is random. on a cache purge it can either be a) all good, b) some identifiers get the warning, c) all get the warning
[20:01:13] but a simple F5 is always the same
[21:23:16] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1976 bytes in 0.152 second response time
[21:52:48] anyone still awake? (:
[21:53:11] * abian is sleeping
[21:53:11] who is familiar with queries? (:
[21:53:45] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1967 bytes in 0.060 second response time
[21:54:01] icinga-wm isn't sleeping :)
[21:54:57] you are cheating
[21:55:38] Ouch!
[21:55:56] Tell me :)
[21:56:03] But not an expert...
[21:56:34] i want something like "thingy has P1 and P2 but not P3"
[21:57:35] You can use OPTIONAL { thingy wdt:P3 ?ouch } FILTER(!BOUND(?ouch))
[21:58:12] what does it do? :)
[21:58:39] "thingy doesn't have a statement using P3" :)
[21:59:30] let me try ;p
[22:04:21] abian: perfect <3
[22:04:29] Yay :D
[22:06:53] whoops
[22:08:39] limiting it to 100 still takes it over 30 seconds to finish (:
[22:08:45] or: FILTER NOT EXISTS { thingy wdt:P31 ?anything. }
[22:08:55] or MINUS { thingy wdt:P31 ?anything. }
[22:09:02] I usually use MINUS, but not sure which is more efficient
[22:11:01] hmm. https://www.w3.org/TR/sparql11-query/#neg-notexists-minus
[22:12:06] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1967 bytes in 0.067 second response time
[22:12:23] it’s “probably” going to be equivalent on your query :)
[22:12:57] likely, it's basically just a simple query to check for a missing property ;)
[22:13:05] yeah :)
[22:13:12] can you paste a link to the query?
[22:13:30] WikidataFacts: is there any difference between "?anything" and "[]"?
[22:13:47] if ?anything doesn’t occur anywhere else in the query, not really
[22:13:54] I've generally used MINUS { ?item wdt:P1234 [] }
[22:14:13] you can use [] as a shorthand for an anonymous variable, so to speak
[22:14:21] I just didn’t want to make the example too confusing ;)
[22:14:26] Sure :)
[22:14:41] aww
[22:14:56] filter not exists takes too long :x
[22:15:19] I have no idea what is most efficient, but it's probably not a filter / bound, because those seem to take forever all the time I use them :/
[22:15:23] See if MINUS works
[22:15:33] i do that now
[22:15:47] SothoTalKer: if you post the query I could check for other things to optimize too
[22:15:54] e.g. removing the label service often helps
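For readers following along, the three "has P1 and P2 but not P3" shapes mentioned above, written out as complete queries one could paste into the query service. SothoTalKer's actual query was never pasted here, so P1/P2/P3 are placeholders rather than the real properties; these are sketches of the patterns, to be run one at a time.

```sparql
# Sketches only; P1/P2/P3 are placeholders, run each query separately.

# 1) OPTIONAL + !BOUND (the workaround abian suggested)
SELECT ?item WHERE {
  ?item wdt:P1 ?a ;
        wdt:P2 ?b .
  OPTIONAL { ?item wdt:P3 ?ouch }
  FILTER(!BOUND(?ouch))
}
LIMIT 100

# 2) FILTER NOT EXISTS
SELECT ?item WHERE {
  ?item wdt:P1 ?a ;
        wdt:P2 ?b .
  FILTER NOT EXISTS { ?item wdt:P3 [] }
}
LIMIT 100

# 3) MINUS
SELECT ?item WHERE {
  ?item wdt:P1 ?a ;
        wdt:P2 ?b .
  MINUS { ?item wdt:P3 [] }
}
LIMIT 100
```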
[22:15:58] (and then there are tricks to add it back)
[22:16:15] sure, the solution provided by abian worked, though (:
[22:16:22] grmbl
[22:16:40] Huh, surprised that BOUND didn't time out, but neat :D
[22:16:40] it displeases me that the hack works while the dedicated features (FILTER NOT EXISTS / MINUS) don’t :D
[22:17:06] We really need to get an endpoint like this for MusicBrainz...
[22:17:08] i set the limit to 100 for all queries
[22:17:42] reosarevok: we got samj1912 :p
[22:18:07] I don't think he's done any SPARQL :) And he has enough with SOLR
[22:18:10] WikidataFacts: there you go: http://tinyurl.com/y8ka4m9q
[22:18:24] it's using the hack (:
[22:18:44] yeah, super fast without the label service: http://tinyurl.com/y96vmszh
[22:18:49] But it would be lovely to have something like this we can just give to users, like "here, do whatever you want with this, get read-only stuff, you can't mess our data up, all is good"
[22:19:09] No more hand-written reports!
[22:19:29] and now we add the label service back – tada! http://tinyurl.com/ycyybrlq
[22:19:36] (don’t ask why the outer query needs the LIMIT as well)
[22:19:38] Anyway, when you want to retrieve As that are not Bs, and there are many, many Bs and just a few As, will the !BOUND option be more efficient?
[22:20:14] WikidataFacts: so, what you're saying is "I hate hacks, except this specific hack with the label service"? :p
[22:20:15] SothoTalKer: 10k results in under ten seconds: http://tinyurl.com/y9fbekux
[22:20:32] reosarevok: in my view that’s not a hack, it’s a manual optimization :)
[22:20:47] That's the name for the ones that we like, right? ;)
[22:20:56] doing an OPTIONAL just to check that it *doesn’t* match just feels so wrong to me
[22:21:59] what's the difference exactly? why is one slow and one fast :p
[22:22:16] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1954 bytes in 0.063 second response time
[22:22:51] with the WITH … AS %results syntax, we’re first running the query without the label service, and then we add the label service only to those 100/10000 results that were found
[22:23:13] I’m not sure what’s wrong with the first version, but for some reason the LIMIT and the label service don’t seem to interact well
[22:23:24] Why don't we always apply that hack internally?
[22:23:31] The SERVICE part isn't standard...
[22:23:34] funny you should ask that :)
[22:24:47] abian: https://phabricator.wikimedia.org/T166139
[22:27:57] Thanks for the link! :)
[22:28:28] Couldn't a middleware read the queries (text) and rewrite them?
[22:28:32] ... when possible
[22:28:56] well, we already have query rewriters
[22:29:11] that’s how SELECT ?foo ?fooLabel works
[22:29:29] there’s an optimizer which adds `?foo rdfs:label ?fooLabel` to the label service
[22:29:35] so you don’t have to mention it manually
[22:29:47] (in Blazegraph, an optimizer is just anything that rewrites a query)
[22:29:55] Oh, I did wonder how that magically worked
[22:30:09] Ah, okay, truly interesting :)
[22:30:37] indeed.
[22:30:50] why is getting labels so slow?
[22:37:02] not sure
[22:37:12] perhaps because it has to go through the language fallbacks
[22:40:17] one could try whether getting only the English label is faster?
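For reference, the "manual optimization" described at 22:22:51 looks roughly like the sketch below. This is not WikidataFacts' actual tinyurl query; the selection criterion (P31/Q5) and the missing property (P1234, echoing the earlier MINUS example) are placeholders. The named subquery runs first without the label service, and labels are then fetched only for the rows it returned; restricting the label service to a single language, as the closing question suggests, is one more thing worth trying.

```sparql
# Sketch of the WITH … AS %results pattern (Blazegraph named subquery, as used
# on query.wikidata.org). P31/Q5 and P1234 are placeholders, not the real query.
SELECT ?item ?itemLabel
WITH {
  # runs first, without the label service
  SELECT ?item WHERE {
    ?item wdt:P31 wd:Q5 .            # some cheap selection criterion
    MINUS { ?item wdt:P1234 [] }     # the "missing property" check
  }
  LIMIT 100
} AS %results
WHERE {
  INCLUDE %results .
  # labels are fetched only for the 100 rows above; "en" only, avoiding fallbacks
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
# the outer LIMIT was also needed in the example above, for reasons left unexplained
LIMIT 100
```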