[01:53:07] gehel: I tried using GAS with a high nthreads and somehow I see only one core having a load, did you happen to encounter this problem?
[04:26:41] Ok, so GROUP BY was the culprit
[04:26:46] and ORDER BY, I guess
[11:33:28] Lucas_WMDE: I've seen spurious constraint violations which say "The SPARQL query resulted in an error" quite a few times today... not sure what's going wrong there
[11:34:30] it seems to work again after a while, so some sort of network error or rate limiting problem, maybe?
[11:34:59] eek
[11:35:09] LOTS of HTTP 429 errors apparently? https://grafana.wikimedia.org/dashboard/db/wikidata-quality?orgId=1&from=1531654488640&to=1531827288640&refresh=10s&panelId=9&fullscreen
[11:35:19] which is Too Many Requests, great
[11:36:01] fun
[11:36:03] but it looks like we have way more SPARQL requests than usual? https://grafana.wikimedia.org/dashboard/db/wikidata-quality?orgId=1&from=1531222552361&to=1531827352361&refresh=10s&panelId=8&fullscreen
[11:36:14] did someone add lots of “format” constraints, perhaps?
[11:37:44] * Lucas_WMDE checks RecentChanges
[11:38:33] hm, it would be really neat to have all P2302 edits tagged with some special tag…
[11:39:16] that sounds like it should be possible (more or less... I guess it's hard to catch things like restoring an older version)
[11:39:23] but don't ask me how
[11:43:11] hm, there were some changes to the type constraints of P179 “series”
[11:44:17] but no new constraints
[11:44:27] and changing the set of types shouldn’t have an effect on the number of queries
[11:45:48] yeah, I'm not seeing anything either
[11:46:39] it looks like we’re using it a lot more for type checks
[11:46:46] which could also be caused by any change in the type hierarchy
[11:47:23] if some direct subclass relation became an indirect relation, that could cause some common check to go over the threshold where we give up following the type links in WBQC and instead ask the query service
[11:47:48] e.g.
if the chain from “book” to “creative work” grew from 4 to 6 links, or whatever
[11:49:39] that sounds like it'll be annoying to find...
[11:50:51] yeah
[11:50:59] and even if I find it… so what?
[11:51:09] it’s not like I could just revert that change
[11:53:07] maybe the change was wrong, though! or maybe we can improve the constraint or the hierarchy
[11:53:33] perhaps this is the time to actually investigate what the best cutoff is
[11:53:48] instead of the current value, which really is just a wild guess on my part
[11:56:58] haha
[11:58:11] anyway, if you do find out what change it was, it would be nice to know
[12:03:05] https://quarry.wmflabs.org/query/28310
[12:03:07] let’s see…
[12:04:03] It would be great if a cutoff weren't necessary
[12:04:21] And if we had the graph of classes dumped and ready to be queried efficiently
[12:05:00] isn’t that what we have, with the query service?
[12:06:19] I mean, storing the entire hierarchy (just QIDs) in each node so that you don't have to iterate
[12:10:42] hm, wait, “regex cache” *also* has spikes https://grafana.wikimedia.org/dashboard/db/wikidata-quality?orgId=1&from=1531570229295&to=1531829429296&refresh=10s&panelId=10&fullscreen
[12:14:35] * Lucas_WMDE is confused
[12:46:19] Lucas_WMDE: I have made a "Property constraint changed" tag that you can use (sadly it will only work for future changes, I don't see any way to tag previous edits)
[12:48:10] pintoch: <3
[12:50:39] https://www.wikidata.org/wiki/Special:RecentChanges?tagfilter=property+constraint+changed&urlversion=2 \o/
[12:51:29] should still be useful for future edits, thanks!
[13:53:02] Where is the 400-character limit for strings and other text properties specified in Wikibase?
[13:55:52] benwbrum1: here, I think https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+/a5089b7addaefd8a81fcf03d7d45463b8eab7f61/repo/includes/ValidatorBuilders.php#181
[13:56:11] (note that some validators override the limit, e.g.
Commons media https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/Wikibase/+/a5089b7addaefd8a81fcf03d7d45463b8eab7f61/repo/includes/ValidatorBuilders.php#201
[13:56:13] )
[14:00:50] Thanks! That's very helpful indeed.
[14:01:56] We're trying to add at least one large text block to a local Wikibase installation, similar to DBpedia's dbo:abstract, and trying to decide whether to expand the limit on a datatype or store the text somewhere else and reference it with a URL.
[14:03:32] If the latter, we might try to use the MediaWiki pages themselves, but it'd add a level of indirection and complicate data entry.
[14:11:06] benwbrum1: If there's something in Wikibase that you have to adapt because it's hardcoded or Wikidata-centered, remember that you can propose that change so that it's available in later versions :)
[14:12:47] Absolutely! We're still in the exploring/planning/prototyping phase of this project, but we're definitely interested in contributing code back if we can.
[14:14:00] Yay :D
[14:17:46] there are various places where we can't store data ourselves because the limit is too low
[14:17:52] so making it more flexible seems sensible
[14:19:25] benwbrum1: here's the ticket we have for increasing the length https://phabricator.wikimedia.org/T154660
[14:44:45] Thanks!
[19:58:52] The "primary sources" tool isn't working.
[20:31:10] hello?
[20:34:39] anyone?
[20:41:11] "isn't working" is quite vague, in what way is it not working?
[20:42:02] although I doubt I can help, I haven't looked at it for a long time. it seemed like it was buggy and unmaintained
[20:54:05] Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://tools.wmflabs.org/wikidata-primary-sources/entities/any?dataset=.
[20:55:49] nikki: that is the error
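(Editor's sketch.) The cutoff behaviour discussed around 11:47 — WBQC follows subclass-of (P279) links locally, but past some threshold it gives up and asks the query service instead — could look roughly like the following. This is an illustrative Python sketch, not WBQC's actual PHP implementation; the function names, the edge limit, and the `query_service_check` helper are all hypothetical:

```python
from collections import deque

# Hypothetical cutoff: follow P279 links breadth-first, but after
# MAX_EDGES links give up and fall back to the query service.
MAX_EDGES = 20  # the "wild guess" value whose tuning is discussed above

def is_subclass_of(start, target, subclass_links, max_edges=MAX_EDGES):
    """Return (answer, used_fallback).

    subclass_links maps an item ID to the list of its direct
    P279 ("subclass of") values.
    """
    seen = {start}
    queue = deque([start])
    edges_followed = 0
    while queue:
        current = queue.popleft()
        if current == target:
            return True, False
        for parent in subclass_links.get(current, []):
            edges_followed += 1
            if edges_followed > max_edges:
                # the hierarchy is too deep/wide to check locally:
                # hand the whole check over to the query service
                return query_service_check(start, target), True
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return False, False

def query_service_check(start, target):
    # Placeholder for a SPARQL ASK query along the lines of
    #   ASK { wd:<start> wdt:P279* wd:<target> }
    raise NotImplementedError
```

This makes the failure mode in the log concrete: if an edit turns a short chain like book → creative work into a longer one, a previously local check starts tripping the limit and generating query-service traffic.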
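(Editor's sketch.) The idea floated at 12:06 — storing the entire hierarchy of QIDs in each node so that a type check needs no iteration — amounts to precomputing the transitive closure of the subclass relation. A rough Python sketch with hypothetical names, assuming an acyclic hierarchy (the cache guard only prevents infinite recursion, it does not handle cycles correctly):

```python
def build_ancestor_sets(subclass_links):
    """Precompute, for every item, the full set of its transitive
    superclasses, so an is-a check becomes a single set lookup.

    subclass_links maps an item ID to its direct P279 values.
    """
    cache = {}

    def ancestors(item):
        if item not in cache:
            cache[item] = set()  # placed first to stop infinite recursion
            for parent in subclass_links.get(item, []):
                cache[item].add(parent)
                cache[item] |= ancestors(parent)
        return cache[item]

    for item in subclass_links:
        ancestors(item)
    return cache
```

With this, "is book a creative work?" is `"creative work" in cache["book"]`, with no cutoff needed; the trade-off is keeping the precomputed sets up to date as the hierarchy is edited.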
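(Editor's sketch.) The 400-character limit discussed at 13:53 lives in Wikibase's ValidatorBuilders.php (linked above). For illustration only, this is the kind of check involved, written in Python with hypothetical names rather than Wikibase's actual PHP validator classes:

```python
class TooLongError(ValueError):
    """Raised when a value exceeds the configured length limit."""

def validate_string_length(value, max_length=400):
    """Hypothetical stand-in for a Wikibase string length validator.

    400 is the default limit mentioned in the log; some datatypes
    (e.g. Commons media) override it with their own value.
    """
    if len(value) > max_length:
        raise TooLongError(
            f"value is {len(value)} characters, limit is {max_length}")
    return value
```

The discussion above (T154660) is about making exactly this `max_length` configurable instead of hardcoded, so installations like benwbrum1's could raise it for large text blocks.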