[07:29:53] Hi :)
[07:30:14] How does Wikidata decide which entry to put #1 on its search bar?
[07:31:27] e.g. if I type "Ink", how does it decide to rank ink the dye over ink the movie?
[07:33:59] I guess by entity number
[07:35:00] The search engine does some kind of relevance scoring. I don't think it's purely the item ID.
[07:35:01] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1968 bytes in 0.095 second response time
[07:45:10] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1942 bytes in 0.168 second response time
[08:07:19] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1975 bytes in 0.078 second response time
[08:12:19] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1961 bytes in 0.072 second response time
[13:05:23] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1965 bytes in 0.083 second response time
[13:10:22] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1971 bytes in 0.078 second response time
[13:22:32] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1967 bytes in 0.088 second response time
[13:42:37] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1947 bytes in 0.079 second response time
[17:56:28] 6
[17:56:39] accidental
[18:11:52] https://news.vice.com/en_us/article/vbq38d/google-is-listing-nazism-as-the-first-ideology-of-the-california-republican-party
[18:12:22] Not our fault.
:)
[19:29:57] Harmonia_Amanda: hi :)
[19:30:16] hi!
[19:36:25] WikidataFacts: In for a challenge? See https://www.wikidata.org/wiki/Wikidata:Request_a_query#Combine_two_queries ;-)
[19:38:27] That's the one we were messing around with at the hackathon
[19:38:54] or maybe Harmonia_Amanda can solve it....
[19:39:27] multichill: shhh, I'm making WikidataFacts work right now
[19:43:46] multichill: http://tinyurl.com/y9jo4gcf maybe?
[19:44:49] nikki: hey!
[19:44:52] quit being faster than me :(
[19:44:55] no!
[19:44:57] :D
[19:45:03] multichill: http://tinyurl.com/yd3s6nxq
[19:45:21] * Harmonia_Amanda closes her half-written query
[19:47:30] you can also put OPTIONAL around one of the INCLUDEs if you want to find ones which are in one but not the other
[19:49:02] hm... for wd I'm getting two rows, one with and one without a link
[19:54:26] nikki: WikidataFacts: The WITH AS stuff, why??? :P
[19:54:35] why not?
[19:55:25] Why do you have to use it?
[19:55:57] it's an easy way to join two queries
[19:56:08] and sometimes means the query stops timing out
[19:56:39] Ok, so the engine will just run two queries and in the end do a join?
[19:56:59] Instead of some rapidly expanding time and memory hog?
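[Editor's note: the WITH ... AS %name / INCLUDE pattern discussed above is Blazegraph's named-subquery extension, which the Wikidata Query Service runs on. A hedged sketch of the shape being described — the items and sitelink filter here are illustrative, not taken from the tinyurl queries in the chat:

```sparql
# Sketch of a Blazegraph named subquery on WDQS (standard wd:/wdt:/schema: prefixes assumed).
# Each named subquery is evaluated once, then joined wherever it is INCLUDEd.
SELECT ?item ?article
WITH {
  SELECT ?item WHERE { ?item wdt:P31 wd:Q146 . }            # first query: instances of house cat (illustrative)
} AS %cats
WITH {
  SELECT ?item ?article WHERE {
    ?article schema:about ?item ;
             schema:isPartOf <https://en.wikipedia.org/> .   # second query: items with an enwiki article
  }
} AS %enwiki
WHERE {
  INCLUDE %cats .
  OPTIONAL { INCLUDE %enwiki . }   # per nikki's tip: OPTIONAL keeps items that are in one set but not the other
}
```

As noted later in the log, two plain subqueries placed side by side in the WHERE clause should give equivalent results; the named form mainly documents the "run two queries, then join" intent and can avoid timeouts.]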
Cute
[19:57:27] I don't know what goes on behind the scenes, but I assume that's how it works
[19:57:46] Looking at https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples we should add an example like this
[19:58:15] This is still a WIP query, so probably not the best one to add
[19:58:35] you could probably do that without the WITH/AS/INCLUDE stuff, using regular subqueries instead
[19:59:12] This seems to do what I want: run two queries independently and compare the results
[20:00:46] two subqueries side-by-side should be equivalent
[20:01:01] and I don’t think the optimizer would do anything stupid in this case
[20:01:04] but I haven’t checked
[20:01:41] don't give the optimiser ideas :P
[20:04:38] Haha, found a nice edge case at https://www.wikidata.org/wiki/Q17195901
[20:04:46] It's a canal AND a street :P
[20:45:49] hi, we are currently getting Wikimedia\Assert\ParameterTypeException from line 89 of /srv/mediawiki/w/extensions/Wikibase/vendor/wikimedia/assert/src/Assert.php: Bad value for parameter $maxSerializedEntitySizeInBytes: must be a integer
[20:46:04] on MediaWiki 1.30 using PHP 7.0, wondering does anyone know how to fix that please?
[20:54:42] paladox: sounds like your maxSerializedEntitySize in $wgRepoSettings might not be set correctly?
[20:55:47] ah
[20:55:48] thanks
[20:55:50] will check
[20:57:37] WikidataFacts: we don't have wgRepoSettings set.
[20:59:23] hm
[21:06:57] we found the problem
[21:07:07] fixed in https://github.com/miraheze/mw-config/commit/de10430540082b9819e836f6f409ef8505805577
[21:07:41] ok good
[21:07:59] so MediaWiki supports strings in that setting but Wikibase doesn’t?
[21:32:18] WikidataFacts: not sure, but changing it to not be a string fixed it
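[Editor's note: the fix discussed above amounts to making the size setting an integer. A hypothetical LocalSettings.php fragment showing the shape of the change — the exact global changed in the Miraheze commit is not quoted in the log, and the values below are illustrative; Wikibase documents its repo settings array as $wgWBRepoSettings, with maxSerializedEntitySize measured in kilobytes and defaulting to $wgMaxArticleSize:

```php
// Hypothetical sketch, not the actual Miraheze change.
// Wikibase asserts the resulting byte count is an int, so a quoted
// value like '2048' triggers the ParameterTypeException above.
$wgMaxArticleSize = 2048;                             // int, not '2048'
$wgWBRepoSettings['maxSerializedEntitySize'] = 2048;  // kilobytes, int
```

This matches the closing exchange: MediaWiki core tolerates a numeric string in such settings, while Wikibase's type assertion does not.]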