[01:45:03] looks like the link for property P691 is dead/unreachable.
[01:45:28] https://www.wikidata.org/wiki/Property:P691 - check the examples :)
[01:50:29] it has two formatter URLs, and I have enabled this: https://www.wikidata.org/wiki/MediaWiki_talk:Gadget-AuthorityControl.js which also has the non-working one.
[02:18:15] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1968 bytes in 0.106 second response time
[02:40:24] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1963 bytes in 0.089 second response time
[04:50:25] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1972 bytes in 0.095 second response time
[06:27:25] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1967 bytes in 0.117 second response time
[11:42:55] Hi guys! I created a WordPress plugin which works with Wikidata for my final project at university, and I need some feedback. It only takes 10 minutes: https://contexto.thetwentyseven.co.uk/research/
[11:47:31] Really interesting, thetwentyseven, thank you!
[11:48:08] I would suggest that you leave a message on https://www.wikidata.org/wiki/Wikidata:Project_chat and send another via wikidata@lists.wikimedia.org if you need more feedback
[11:49:53] Thanks abian
[11:52:10] * abian has to leave now, but will finish the survey later
[11:52:38] Good luck, thetwentyseven :)
[12:19:18] um... I got an error about "malformed input"... but the field is now disabled so I can't edit it
[12:19:27] that didn't use to happen
[12:20:16] of course there was nothing wrong with my input, as usual
[12:50:42] this is strange
[13:26:11] hmm, how do I indicate that someone is a member of a musical group? (and that musical group has members) "has member" gives me value constraints
[13:29:03] also, https://www.wikidata.org/wiki/Property:P1953 apparently has the constraints "human" or "musical organisation" but not "band" or "rock band" - aren't these subclasses of musical organisation? should they be?
[13:29:36] https://www.wikidata.org/wiki/Help:Basic_membership_properties - here, all technical for you :D
[13:29:49] oh no.
[13:29:53] also hi sotoh ¬¬
[13:29:56] sotho*
[13:30:04] hi CatQuest <3
[13:30:48] you can probably use "part of"
[13:31:34] for the singer https://www.wikidata.org/wiki/Q50971018 certainly. what about the other way around?
[13:33:41] you can use "has part" P527
[13:33:51] https://www.wikidata.org/wiki/Property:P527
[13:34:26] cheers, thanks
[13:34:43] i don't know if you can add temporal constraints, though :x
[13:38:07] CatQuest: oh, you can. look at the Beatles :3 https://www.wikidata.org/wiki/Q1299
[13:39:06] oh, so start time should be "inception", not "start date"?
[13:39:56] it uses "start time" and "end time"
[13:40:58] it uses inception
[13:41:22] you mean for the band, yes
[13:41:30] yea
[13:41:44] it's a bit unclear when these things aren't the same thing but separate things
[13:42:28] I mean, how is one supposed to know that "start date" isn't to be used for music bands? :/
[13:42:44] i guess you can use it anyway
[13:45:01] hello everyone, I have a question.
[13:45:36] don't ask to ask, just ask :)
[13:45:37] is there any way that you know of to get a list of common properties associated with a value of "instance of"?
[13:45:46] hehe, yes yes, i was writing :)
[13:46:04] .. I am not English, I have no idea what the difference between "subject has role" and "object has role" is ¬¬ :/
[13:46:06] but from what i can see, "start time" refers only directly to properties you add, while for everything else you use "inception"
[13:46:14] like: the most common properties associated with 'instance of human' are: sex, occupation, etc.
[13:46:48] paula_: sort of a "template for this particular thing", in a way? like, for an instrument you'd add these and these statements
[13:46:49] i wonder which are the most common properties associated with instance of photography
[13:47:16] i don't think there is such a thing. but it would be very useful!
[13:47:27] yes :)
[13:48:13] i'm wondering which properties I should use when creating a photography item, and a good place to start is knowing which are the most common already in use
[13:48:50] maybe some query guru could know how to do it.. ?
[13:48:59] You mean "photograph"? https://www.wikidata.org/wiki/Q125191
[13:49:27] SothoTalKer, yes
[13:49:31] sorry for the typo
[13:49:33] typo
[13:50:08] all good.
[13:50:13] that's definitely something I think sounds very useful. right now i think the only way is to generally look up a famous example
[13:51:13] I mean, I'm still learning here. and there are properties that i've never heard of
[13:51:23] sure, same here
[13:52:20] for some kinds of items (like books) there are whole projects to define a data model, but for many others there aren't, and a list of common properties would be a great start
[13:52:48] i can't give you a query sadly, but if you hang around, someone else might :x
[13:53:12] thank you SothoTalKer, i'll be around for a while
[13:54:18] maybe abartov? i've heard he's quite good with queries ;)
[13:58:31] https://www.wikidata.org/wiki/Wikidata:Statistics
[13:58:46] We have negative 3.4% "other P31/P279"? :D
[14:01:38] the question is not to find the global amount, but only those that are used for Q125191
[14:02:11] i.e. get all elements of Q125191 and print them :=
[14:02:37] that's right
[14:02:51] get all the items, and group the properties used in them
[14:03:27] i guess that for a photograph it would be date, height, width, depicts, etc.
[14:03:38] i usually do what CatQuest does: find a very popular item and use that as a reference.
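The "get all the items, and group the properties used in them" idea described above can be written as a SPARQL query against the Wikidata Query Service. A minimal sketch follows; the query shape and endpoint handling are assumptions, not something given in the channel:

```python
# Sketch: count which properties are most often used on instances of a class
# (e.g. Q125191, "photograph") via the Wikidata Query Service.
# Running it requires network access; building the query does not.

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

def common_properties_query(class_qid: str, limit: int = 30) -> str:
    """Build a SPARQL query listing the most-used properties on instances of class_qid."""
    return f"""
    SELECT ?prop ?propLabel (COUNT(DISTINCT ?item) AS ?uses) WHERE {{
      ?item wdt:P31 wd:{class_qid} .
      ?item ?p ?value .
      ?prop wikibase:directClaim ?p .
      SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
    }}
    GROUP BY ?prop ?propLabel
    ORDER BY DESC(?uses)
    LIMIT {limit}
    """

# Usage sketch (requires the `requests` package and network access):
# import requests
# r = requests.get(WDQS_ENDPOINT,
#                  params={"query": common_properties_query("Q125191"), "format": "json"},
#                  headers={"User-Agent": "common-props-sketch/0.1"})
# for row in r.json()["results"]["bindings"]:
#     print(row["propLabel"]["value"], row["uses"]["value"])
```

The `wikibase:directClaim` triple maps each truthy statement predicate back to its property entity, so labels can be attached via the label service.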
[14:03:41] but i'd like to really know :)
[14:04:00] yes, i've done that too, it's a very useful tip
[14:04:16] but maybe this can become a request for the techies :)
[14:06:33] SothoTalKer: I know, this was just an unrelated wtf :)
[14:07:05] hehe :D
[14:07:55] paula_: https://www.wikidata.org/wiki/Wikidata:WikiProject_Visual_arts/Item_structure might be a bit useful too?
[14:08:01] More general than photography, though
[14:13:43] https://tools.wmflabs.org/sqid/#/view?id=Q125191&lang=en - should list the most common properties
[14:19:29] Oh, neat :)
[14:20:35] alright!
[14:20:39] trying!
[14:24:00] i don't seem to be getting any results :(
[14:24:04] maybe a timeout?
[14:24:15] did you get results on that page, SothoTalKer?
[14:26:27] i did
[14:38:14] it worked!
[14:38:20] thank you!
[14:38:36] and thank you reosarevok for the project page
[14:38:42] i'll be sure to check that out too
[14:38:47] regards to all!
[14:41:42] you're welcome!
[15:08:21] https://www.wikidata.org/w/index.php?title=Q7311&type=revision&diff=671573550&oldid=669106521 - 8 references given and it was still the wrong date. You can do better, Wikidata!
[15:11:22] pfft, vandalism: https://www.wikidata.org/w/index.php?title=Q7311&diff=prev&oldid=634086779
[15:54:19] awww, we have dispatch problems again :-(
[15:55:09] https://grafana.wikimedia.org/dashboard/db/wikidata-dispatch
[15:55:28] MisterSynergy: Yes, but I'm on that
[15:55:34] it should be fine again in a bit
[15:56:10] did something fail, or do we just have a high volume of edits?
[15:56:39] high volume, and also many edits to "high impact" items (items that are widely used, so dispatching is more expensive)
[15:57:01] I saw your complaint on a bot operator's talk page about edit rates "way over 30 edits per minute", which really isn't that much
[15:58:28] Well, accounts should always stay under that… and even more so when we already have significant lag
[15:59:01] a single quickstatements batch runs at ~80 per minute already
[15:59:38] btw. can you identify who's doing the "high impact" item editing?
[15:59:38] Which it totally shouldn't…
[16:00:00] Yeah… although there are even more factors that play into this
[16:00:28] That's why I messaged a specific bot operator, although other people were editing even slightly faster
[16:00:28] yeah, it's a complex system, as I already heard, with several potential bottlenecks
[16:01:05] Yeah… and even if we were to remove the bottleneck around dispatching, it would only move to some other part of the infrastructure
[16:01:13] I was, but I stopped for a while :-) (sitting on ~1E5 prepared edits right now)
[16:01:40] :-)
[16:03:24] Btw. there is an issue with our Wikidata policy. According to [[Wikidata:Bots]], one should not edit if the median lag is above 60 (seconds), but it typically is 2-3 minutes and never goes below 1 minute again even when everything goes smoothly. So no bot should be editing Wikidata at any time :-) ...
[16:03:25] [2] https://www.wikidata.org/wiki/Wikidata:Bots
[16:04:09] Yeah, these numbers have changed
[16:04:22] should we change it in the policy as well?
[16:04:37] We need some policy that applies to semi-automated editing as well
[16:04:41] IMO it does not make sense to write a number there which can't be reached
[16:05:01] No surprise that most (or all) operators ignore this parameter
[16:05:24] yeah, this is a good point
[16:05:36] I'll probably write up a ticket about this later today anyway
[16:05:40] Will also mention this
[16:07:08] maybe raise it to 5 minutes/300 seconds, assuming we cannot speed up the dispatching itself...
[16:09:59] can i add a wikidata item that does not have a wikipedia entry yet, but has a VIAF record, etc.?
[16:12:44] identifiers such as VIAF records are typically okay to indicate notability; user-generated content identifiers (facebook and the like) are not sufficient
[16:13:42] he's just missing a wikipedia entry :D
[16:18:17] Can't you get a VIAF record through Wikidata nowadays?
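The lag-parameter discussion above relates to the MediaWiki API's `maxlag` mechanism: a request carrying `maxlag=N` is rejected with a `maxlag` error whenever replication lag exceeds N seconds, which lets bots back off automatically. A minimal retry-loop sketch; the threshold and backoff values are illustrative assumptions, not the numbers from [[Wikidata:Bots]]:

```python
# Sketch: a polite edit loop that respects the MediaWiki `maxlag` parameter.
# When lag is too high, the API returns an error with code "maxlag" instead of
# performing the action, and the client should wait and retry.

API_URL = "https://www.wikidata.org/w/api.php"
MAXLAG_SECONDS = 5  # illustrative threshold, not taken from any policy

def should_retry(response_json: dict) -> bool:
    """True if the API rejected the request because replication lag was too high."""
    return response_json.get("error", {}).get("code") == "maxlag"

# Usage sketch (requires the `requests` package and network access):
# import requests, time
# params = {"action": "query", "meta": "siteinfo",
#           "maxlag": MAXLAG_SECONDS, "format": "json"}
# while True:
#     r = requests.get(API_URL, params=params)
#     if not should_retry(r.json()):
#         break
#     time.sleep(int(r.headers.get("Retry-After", "5")))  # back off before retrying
```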
[16:18:18] * :D
[16:18:51] i'll make one for you, reosarevok :3
[16:22:37] Are you aware of the audio player going outside its boundaries, hoo? https://www.wikidata.org/wiki/Q181#P990
[16:23:04] sjoerddebruin: Yes :S
[16:23:15] It's quite a challenge to get all things right :P
[16:23:41] Well, I think some thumbnail services don't adhere to the size we give them
[16:23:56] 3d and audio seem to partly ignore that
[16:24:36] i suggest you do the easier things first O:-)
[16:25:56] I'll see how easily/nicely we can fix these
[16:26:18] If it gets too tedious/fragile, we might go back to showing links for (some) non-image media types
[16:30:24] reosarevok: AFAIK VIAF maps their identifiers to ours and updates them accordingly if something changed Wikidata-side, but they do not create VIAF entries for missing humans
[16:31:34] ^
[16:37:45] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1965 bytes in 0.109 second response time
[16:40:27] MisterSynergy: what about all those constraint violations? https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P214
[16:41:20] they do get sorted out every 4 to 6 weeks, right?
[16:43:28] because i do not see those getting resolved. =X
[16:48:38] Sorted out every 4 to 6 weeks?
[16:50:26] VIAF merging multiple records into one where applicable.
[16:51:14] https://en.wikipedia.org/wiki/Wikipedia_talk:Authority_control/VIAF
[16:51:57] it's an old talk, and from wikipedia
[16:52:07] Oh yeah, I track my own
[16:52:16] Lots of CiNii separate for some reason
[16:52:32] but there's a chronology of how changes get merged/applied
[16:58:38] https://www.wikidata.org/wiki/Q21645467 has been open for one year for some reason
[17:11:11] the question is whether VIAF still processes the entries for merging. I know they likely need to check all of those manually
[17:20:02] back in November/December I worked a bit with VIAF identifiers in Wikidata (verified ~300 of them). Turned out that ~10% were wrong (but mutually linked on both sides, i.e. VIAF was linking to the wrong Wikidata items as well). I created new items for the wrongly linked VIAF entries and removed the IDs from the old ones. It took around a month to see that VIAF linked to the new items...
[17:20:41] Probably, if there are no unique/single value constraints, VIAF links to the item that links back to VIAF, and crawls ~monthly
[17:21:02] They do not create an entry for items that do not use the VIAF identifier
[17:22:02] [[User:MisterSynergy/P214 for rowers]] summarizes my work, btw.
[17:22:03] [3] https://www.wikidata.org/wiki/User:MisterSynergy/P214_for_rowers
[17:25:00] VIAF basically unifies and backlinks the different libraries' authority information, which in 99% of cases only covers authors
[17:25:42] so, yeah, sports people won't get a VIAF/ISNI/authority record
[17:25:47] correct
[17:26:16] unless they wrote a book ;)
[17:27:09] that's why I was wondering about so many VIAFs in my field of work. many were indeed correct, but a 10% error rate is too high
[17:27:29] it is
[17:27:31] Anyway, most mistakes were not created in Wikidata
[17:28:01] that mutual VIAF+enwiki linking based on the same "name + year of birth" was a bad idea
[17:28:03] it often happens that libraries themselves make erroneous entries
[17:28:38] true, but not surprising given the difficulties of that task
[17:29:45] we now have the power of the hive mind in wikidata, which could be used to sort out most of it.
[17:30:13] i just checked a popular soccer player, and guess what: two VIAF numbers :D
[17:30:24] http://viaf.org/viaf/120389074/ + http://viaf.org/viaf/311425157/
[17:31:45] oh yes
[17:31:53] ISNI is even much worse :-)
[17:32:20] i guess wikidata is the most correct source ;p
[17:33:15] ISNI links to the correct wikidata entry, which is linked in the other VIAF entry ;)
[17:33:36] so this would be a no-brainer for merging.
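When VIAF resolves duplicates like the two IDs above, the obsolete record typically starts redirecting to the surviving one, so a merge can be detected without scraping. A small sketch, assuming viaf.org's usual redirect behaviour (an assumption about the service, not something confirmed in this log):

```python
# Sketch: classify a VIAF ID as active, merged (redirected), or gone, based on
# the HTTP response for http://viaf.org/viaf/<id>/ with redirects disabled.
# The redirect-on-merge behaviour is an assumption about viaf.org.
from typing import Optional

def classify_viaf_response(status_code: int, location: Optional[str]) -> str:
    """Rough status from a non-redirect-following request to a VIAF record URL."""
    if status_code in (301, 302) and location:
        return f"merged/redirected to {location}"
    if status_code == 200:
        return "active"
    if status_code == 404:
        return "deleted or unknown"
    return f"unexpected status {status_code}"

# Usage sketch (requires the `requests` package and network access):
# import requests
# r = requests.head("http://viaf.org/viaf/311425157/", allow_redirects=False)
# print(classify_viaf_response(r.status_code, r.headers.get("Location")))
```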
[17:36:28] now added both VIAFs and the ISNI to our item; let's see whether VIAF merges them within the next month
[17:37:37] do you monitor it?
[17:38:53] not really, at least for that item; I did so for my little project linked above, that's why I know that they do compare their data to Wikidata items monthly
[17:39:45] they do, but apparently they do not merge anymore.
[17:40:37] I don't know; I've seen some quite complex rearrangements on their side
[17:41:22] sometimes they even ignore what we do; I wouldn't be surprised if some person on their side reviewed proposed changes from their crawls
[17:41:57] it would be nice to get some feedback
[17:43:05] well, I'm happy if they also fix most of my Wikidata fixes on their side
[17:43:37] and it's quite cool that I don't even need to leave Wikidata to (indirectly) fix it in VIAF as well
[17:44:01] hehe, yeah
[17:44:34] i would love a reply if they objected to merging some things.
[17:44:55] then it usually means a 'problem' on our side.
[17:47:54] if they put a person in charge who merges only 100 things per day, they would be done within 4 months :D
[18:13:17] sjoerddebruin: how can "award received" be restricted to one point in time? :)
[18:16:37] Hm
[18:16:38] ?
[18:17:56] sjoerddebruin: see here: https://www.wikidata.org/wiki/Q43682
[18:18:40] Multiple statements with the same value but different qualifier values are usually the best solution
[18:19:25] adding it 3 times is allowed and not a constraint violation?
[18:20:18] Yes.
[18:20:55] and this is the preferred way, too? :-p
[18:21:09] Well, what I just said is the preferred way
[18:21:33] okay
[18:23:24] now it just looks ugly because the entries don't get sorted :p
[18:23:53] (stupid humans)
[18:24:46] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1970 bytes in 0.112 second response time
[20:00:52] yay, nice. 3270 results for a birth year of 1908 and 3070 results for a birth year of 1909.
[20:24:46] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1946 bytes in 0.103 second response time
[20:25:28] https://www.youtube.com/watch?v=YKUOB8MN4Kc
[20:36:46] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1969 bytes in 0.100 second response time
[22:33:39] yay, found something where WD was wrong ;p
[22:41:45] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1954 bytes in 0.096 second response time
[22:42:38] OH GOD, THIS IS A #$!& MESS
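For reference, the "multiple statements with different qualifier values" pattern recommended in the 18:18 exchange looks roughly like this in the Wikibase JSON data model (P166 is "award received", P585 is "point in time"; the snak layout is simplified, and the item ID and years below are hypothetical):

```python
# Sketch: one "award received" (P166) statement per win, each qualified with a
# "point in time" (P585). Simplified Wikibase claim JSON; fields like
# calendarmodel/timezone are omitted for readability.

def award_claim(award_qid: str, year: int) -> dict:
    """Build one 'award received' statement qualified with a year-precision date."""
    return {
        "mainsnak": {
            "snaktype": "value",
            "property": "P166",
            "datavalue": {"type": "wikibase-entityid",
                          "value": {"entity-type": "item", "id": award_qid}},
        },
        "qualifiers": {
            "P585": [{
                "snaktype": "value",
                "property": "P585",
                "datavalue": {"type": "time",
                              "value": {"time": f"+{year}-00-00T00:00:00Z",
                                        "precision": 9}},  # 9 = year precision
            }]
        },
        "type": "statement",
        "rank": "normal",
    }

# Three wins of the same (hypothetical) award become three separate statements:
claims = [award_claim("Q123456", y) for y in (1954, 1960, 1973)]
```

Keeping each win as its own statement is what makes the timeline queryable, at the cost of the unsorted display complained about at 18:23.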