[02:05:10] SMalyshev, do you plan to present anything at the data conf this weekend at WMF SF?
[03:05:06] So the personal bests of athletes are set up like [Athlete item] -> ['personal best' statement w/ 'quantity' datatype] -> [quantity] -> ['sports discipline competed in' qualifier], and I want to get the athlete name along with all personal bests and their associated disciplines competed in. This is what I have now: http://tinyurl.com/zqbbmt2
[03:05:11] How can I get it to also show the personal best quantity, along with the name and event labels?
[04:15:44] #wikidata
[04:16:09] #wikipidea foundation.org
[04:16:23] property of peter flores
[04:16:32] el cholo flores
[04:16:40] peter cholo flores
[04:16:51] there can only be one
[04:17:38] all rights copyrights ownership if all wiki
[04:17:45] and data wiki
[04:18:08] also the free wikipidea search engine
[04:18:33] also under under owner ship
[04:19:09]
[04:19:40]
[04:20:23] < and wikipidea foundation.org and the free wiki search engine and all data and info in regards to any use of wiki or ipiki or any form of the word wiki or pedia in a phrase or sentenxe is then classified as violating copyright ownership or registerd domain name and or ip adress of any and all wiki or any web site or search engine or internet www ethernet or intrant ir fiber or wifu or cable dsl of use of online cnet cern or ibm or
[04:25:52] >
[04:26:41] <>
[04:27:05] [[[[[[[[[[]]]]]]]]]]]][[[[[]]]]]]][
[04:27:05] Invalid characters in the link «[[[[[[[[»; these are not permitted: #<>[]|{}
[04:27:08] Invalid characters in the link «[[[»; these are not permitted: #<>[]|{}
[04:27:16] [[[]]][[[[[[[[[[[[[[[[[[[[][[[[[[[[[[[^[[[[[[[]]]]]][
[04:27:17] Invalid characters in the link «[»; these are not permitted: #<>[]|{}
[04:27:19] Invalid characters in the link «[[[[[[[[[[[[[[[[[[»; these are not permitted: #<>[]|{}
[04:27:22] Invalid characters in the link «[[[[[[[[[^[[[[[[[»; these are not permitted: #<>[]|{}
[04:28:10] [[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[inc company by peter cholo flores ]]]]]]]]]]]]]]]]]]]
[04:28:10] Invalid characters in the link «[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[inc company by peter cholo flores »; these are not permitted: #<>[]|{}
[06:13:40] PROBLEM - puppet last run on wdqs1003 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[06:41:40] RECOVERY - puppet last run on wdqs1003 is OK: OK: Puppet is currently enabled, last run 20 seconds ago with 0 failures
[07:04:25] yurik: probably not, I didn't even know about this conf until now :)
[07:05:23] SMalyshev, https://www.eventbrite.com/e/wikipedia-data-design-challenge-2017-tickets-31783891475?aff=eac2
[07:15:55] can sparql do recursive queries?
[07:51:43] amz3, I think yes ... it depends a bit on how you define recursive in this case
[07:51:52] you can nest queries, if that is what you want to do
[07:52:17] often that is a good way to avoid timeouts on the query results
[09:36:08] Hello there :) Do you know how to find the recently merged items?
[09:38:18] Auregann_WMDE: no idea
[09:38:39] I usually see them three days later when they mess up my beautiful constraints pages :p
[09:39:32] Harmonia_Boulot: xD
[10:20:53] Jonas_WMDE: Are you working today? Jens just asked for you.
[13:44:24] Jonas_WMDE: https://gerrit.wikimedia.org/r/339437
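The question at 03:05 about showing the personal best quantity together with its discipline qualifier goes unanswered in the log. A minimal sketch of the usual WDQS pattern for reading a statement's value and its qualifier, assuming P2415 is "personal best" and P2416 is "sports discipline competed in" (the exact property IDs should be verified on wikidata.org):

    # p: reaches the statement node, ps: its main value, pq: a qualifier on that same statement
    SELECT ?athlete ?athleteLabel ?best ?disciplineLabel WHERE {
      ?athlete p:P2415 ?statement .        # statement node of the assumed "personal best" property
      ?statement ps:P2415 ?best .          # the quantity value of that statement
      ?statement pq:P2416 ?discipline .    # the assumed "sports discipline competed in" qualifier
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 100

The recursion question at 07:15 and the nesting suggestion at 07:51 can also be illustrated: SPARQL property paths (e.g. wdt:P279*) cover the transitive-closure kind of "recursion", and a sub-select can be nested so that the outer query joins against a small inner result set. A sketch, using Q16521 (taxon) as an arbitrary example class:

    SELECT ?item ?itemLabel ?class WHERE {
      {
        SELECT DISTINCT ?class WHERE {
          ?class wdt:P279* wd:Q16521 .     # everything that is (transitively) a subclass of taxon
        }
        LIMIT 50
      }
      ?item wdt:P31 ?class .               # outer query joins against the small inner result
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 100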
[14:31:00] Auregann_WMDE: is the endpoint lagging?
[14:31:12] Auregann_WMDE: I'm getting quite a few "server error" messages :s
[16:50:40] PROBLEM - High lag on wdqs1002 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0]
[17:50:42] yurik: https://commons.wikimedia.org/w/index.php?title=GWToolset:Metadata_Mappings/Pharos/Met.json&action=edit <- enabling pretty JSON view/editing is on a per-namespace basis, right? Any idea how to enable it here? :-)
[17:51:56] multichill, I was under the impression that you can only do that from code - there is some hook which returns "json" as your text format
[19:01:44] RECOVERY - High lag on wdqs1002 is OK: OK: Less than 30.00% above the threshold [600.0]
[19:54:13] SMalyshev: Query service is throwing 502's at me :-(
[19:56:29] multichill: 502 when you're doing what?
[19:57:27] Queries
[19:57:49] 502 Bad Gateway, http://tinyurl.com/j3mquh2
[19:57:59] On and off
[19:58:12] LIMIT 10000009?
[19:58:27] that should be an LDF request
[19:59:01] anyway, looks like 1003 is OOM again... let me deploy the patch that bumps memory, maybe it'll make it better
[19:59:23] Oh, that's just to trigger a non-cached request
[20:00:10] Thanks SMalyshev
[20:03:06] multichill: should be ok now
[20:11:41] ty
[20:18:43] PROBLEM - High lag on wdqs1003 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0]
[20:21:44] PROBLEM - High lag on wdqs1003 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0]
[21:19:40] what does the “label gaps on wikidata” thread say, in summary?
[21:19:42] please
[22:09:53] RECOVERY - High lag on wdqs1003 is OK: OK: Less than 30.00% above the threshold [600.0]
[23:14:18] taxon synonyms :/ https://www.wikidata.org/wiki/Q28859833
[23:17:43] oh for crying out loud
[23:18:05] I thought this was going to be something like human vs. Homo sapiens, but a misspelling? :D
[23:19:43] yeah... listed in iNaturalist as a valid species...
[23:19:53] I flagged it there for curation
[23:20:33] but they report it as a valid species... not sure how we should report it... would it be OR to mark it as a synonym only, and not as a valid taxon as well, if one source claims it is valid...?
[23:25:03] apparently most synonyms are also instance of taxon, and their statements aren't deprecated… http://tinyurl.com/jtwf7ls
[23:27:21] I know one user (Succo) removing P31:taxon when they are synonyms... not sure what to do...
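For the synonym/rank discussion at 23:25, a sketch of the kind of check being described: items recorded as a synonym of some taxon that still carry a non-deprecated "instance of: taxon" claim. This assumes P1420 is "taxon synonym" (verify the property ID); Q16521 is the taxon class, and wikibase:rank on the statement node exposes the rank in the WDQS RDF model.

    SELECT ?synonym ?synonymLabel ?accepted ?acceptedLabel WHERE {
      ?accepted wdt:P1420 ?synonym .           # ?accepted lists ?synonym as a taxon synonym (assumed P1420)
      ?synonym p:P31 ?claim .
      ?claim ps:P31 wd:Q16521 .                # instance of: taxon
      ?claim wikibase:rank ?rank .
      FILTER(?rank != wikibase:DeprecatedRank) # the claim has not been deprecated
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 100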