[08:27:58] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1969 bytes in 0.092 second response time
[08:48:18] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1971 bytes in 0.074 second response time
[09:08:56] cheers
[09:09:55] is there a bot that adds the inverse property for 'edition or translation of' and 'edition'?
[09:36:18] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1976 bytes in 0.070 second response time
[09:41:27] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1966 bytes in 0.106 second response time
[09:48:47] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1963 bytes in 0.079 second response time
[09:53:48] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1966 bytes in 0.068 second response time
[12:21:41] WikidataFacts: I was wondering about your last comment on https://phabricator.wikimedia.org/T185895
[12:22:26] by generic, do you mean it would apply to mediawiki in general, or just to wikidata in general?
[12:23:06] and what counts as failing to look up? only when it would return "Main page" or also when there's no MediaWiki:Mainpage/xx page?
[12:26:25] I just stumbled on it and tried in various languages on various projects and wow, what a mess...
[12:27:08] https://commons.wikimedia.org/?uselang=gn this is my favourite so far
[12:47:13] another question regarding queries :-)
[12:48:18] if an item has a statement with multiple values, that item shows up in the results multiple times. Is there an easy way to combine them, so that it is only returned once?
[12:52:59] you have to group the results
[12:53:24] you can use sample() to get a random value (e.g. if they're all the same), group_concat() to combine them into a single string
[12:53:33] there's other stuff like min and max for numbers
[12:55:11] do you have an example for group_concat()?
[12:57:02] i guess i found an example.
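A minimal sketch of the grouping being described here; the item and property used (Q42, Douglas Adams, and P106, occupation) are just convenient illustrations, not taken from the query linked below, and it assumes the query service's built-in prefixes:

    SELECT ?item (SAMPLE(?itemLabel) AS ?name)
           (GROUP_CONCAT(DISTINCT ?occupationLabel; separator=", ") AS ?occupations)
    WHERE {
      VALUES ?item { wd:Q42 }        # any item with a multi-valued statement will do
      ?item wdt:P106 ?occupation .   # occupation: typically several values per person
      ?item rdfs:label ?itemLabel . FILTER(LANG(?itemLabel) = "en")
      ?occupation rdfs:label ?occupationLabel . FILTER(LANG(?occupationLabel) = "en")
    }
    GROUP BY ?item                   # one row per item instead of one row per occupation value

Without the GROUP BY, the item appears once per occupation; with it, GROUP_CONCAT collapses the values into a single comma-separated string and SAMPLE picks one of the (identical) labels.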
[12:59:32] http://tinyurl.com/y94ymd5t
[13:00:06] (you could of course just group by name instead of sampling it, but for the sake of making an example I didn't :P)
[13:00:10] that looks complicated :D
[13:03:46] hm, looks like it does not work for subqueries :-( https://jira.blazegraph.com/browse/BLZG-8993
[13:50:07] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1965 bytes in 0.072 second response time
[14:10:28] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1965 bytes in 0.070 second response time
[14:16:57] nikki: by “failing to look up” I meant this condition: https://gerrit.wikimedia.org/g/mediawiki/core/+/master/includes/Title.php#589
[14:17:38] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1972 bytes in 0.074 second response time
[14:37:58] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1953 bytes in 0.066 second response time
[14:45:18] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1976 bytes in 0.095 second response time
[14:57:03] owl:sameAs links in the LOD: https://sameas.cc/explicit/img
[14:59:43] (Yes, Wikidata is there, but only linked to DBpedia)
[15:00:38] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1942 bytes in 0.087 second response time
[15:22:14] abian: interesting https://upload.wikimedia.org/wikipedia/commons/2/29/2018-06-10_sameAs_DBpedia.png
[15:22:37] A bunch of Wikipedias are stacked together and then a few are very isolated
[15:23:36] Wiktionary too
[15:23:40] *DBpedias, no?
[15:24:42] Ah, no, the Wikipedias are explicitly there too, right :D
[15:25:27] Yes, I mean the DBpedias for Wikipedias
[15:25:56] The isolated ones seem to be all very small, so perhaps there's just not enough content to be meaningful
[15:27:31] It seems they just don't exist :/
[15:27:47] http://sc.dbpedia.org/
[15:27:58] Ah, makes sense :)
[15:27:58] data.europeana.eu
[15:28:03] is very isolated too
[15:32:23] Hmm https://lod-cloud.net/dataset/wikidata
[15:33:08] OMG, Wikidata already appears in the LOD Cloud :O
[15:35:14] Next to the bioportal black hole
[15:35:20] xDD
[15:41:34] Nemo_bis: Edit conflict on the task :)
[15:43:11] Sorry
[15:43:20] I'd expect more links though
[15:43:46] I mean, more targets
[15:46:49] Hahaha, no problem
[15:46:52] Yeah, probably
[15:47:30] But I don't know what algorithms they apply
[15:47:59] The numbers for some datasets seem to be mere estimations
[15:48:10] Links to flickr-wrappr 8,800,000 triples
[15:48:12] Links to freebase 3,400,000 triples
[15:48:25] DBpedia, for example
[15:49:55] Now that I read https://twitter.com/johnmccrae/status/990958137845919745 again, it almost looks like they ask people to manually submit a JSON which contains the number of links to their IDs for each domain
[15:50:44] "links": [ { "target": "doi", "value": "12414003" }, { "target": "geonames-semantic-web", "value": "3541065" }, { "target": "viaf", "value": "1576879" },
[15:50:49] http://lod-cloud.net/versions/2018-30-05/lod-data.json
[15:50:51] That's what I mean
[15:51:11] So far I thought the links were counted programmatically
[15:55:44] Pfff, VIAF has just two targets: DBpedia and the DNB
[15:57:16] I don't understand ¯\_(ツ)_/¯
[15:58:00] But, anyway, Wikidata is finally there
[16:13:18] Nemo_bis: yeah, that was a lot of fun :/ see https://www.wikidata.org/wiki/User:Lucas_Werkmeister_(WMDE)/LOD_Cloud
[16:13:32] also, looks like I fucked up the image on https://lod-cloud.net/dataset/wikidata -.-
[16:15:01] WikidataFacts: Anyway, you've improved things a lot, you'll agree :)
[16:15:27] :)
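For reference, a programmatic count of the kind assumed at 15:51 could, at least for Wikidata's side of one of the targets in that JSON (VIAF, via its external-ID property P214), be approximated on the query service. This is only a sketch of the idea, not how lod-cloud.net actually derives its numbers, and an unbounded count like this may well time out on the public endpoint:

    SELECT (COUNT(?viaf) AS ?links)   # number of Wikidata→VIAF links
    WHERE {
      ?item wdt:P214 ?viaf .          # P214 = VIAF ID
    }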
[18:51:49] hi
[18:55:41] hi SothoTalKer
[18:57:14] :-)
[19:00:39] hi :)
[19:32:37] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1968 bytes in 0.065 second response time
[19:37:38] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1953 bytes in 0.076 second response time
[19:55:08] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1966 bytes in 0.063 second response time
[20:00:08] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1945 bytes in 0.070 second response time
[20:07:28] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1965 bytes in 0.084 second response time
[20:17:38] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1971 bytes in 0.061 second response time
[21:40:04] hi!
[23:48:28] SMalyshev, have there been any major changes in the wdqs? I am trying to reimport everything, and the process basically dies in write thrashing - 10+G/s writing, and nothing else, and the system is down to a crawl.
gc logs: 1st and 2nd links work ok, 3rd - current
[23:48:32] http://gceasy.io/my-gc-report.jsp?p=c2hhcmVkLzIwMTgvMDYvMTAvLS13ZHFzLWJsYXplZ3JhcGhfanZtX2djLnBpZDExNDE3LmxvZy4xLmd6LS0yMy00Ny0xNA==
[23:48:49] http://gceasy.io/my-gc-report.jsp?p=c2hhcmVkLzIwMTgvMDYvMTAvLS13ZHFzLWJsYXplZ3JhcGhfanZtX2djLnBpZDExNDE3LmxvZy43Lmd6LS0yMy0yNy00MA==
[23:48:57] http://gceasy.io/my-gc-report.jsp?p=c2hhcmVkLzIwMTgvMDYvMTAvLS13ZHFzLWJsYXplZ3JhcGhfanZtX2djLnBpZDExNDE3LmxvZy44LmN1cnJlbnQuZ3otLTIzLTMwLTE=
[23:50:36] (1st & 2nd were generated at the beginning of the process, when it went through most of the data very well - first consuming osm data, then wikidata data, all from the same ttl splits dir)
[23:55:30] yurik: no major changes