[03:41:05] hi,
[03:43:35] with pywikibot I can access the claims
[03:44:19] For example I can get one of the release versions of a given project like this: page.get()["claims"]["P348"][3].getTarget()
[03:44:20] Buffer 3 is empty.
[03:45:04] I saw a way to set qualifiers, but is there a built-in way to retrieve them?
[03:45:29] There is a way to get the JSON, but as I understand it, that would require me to parse it
[03:45:40] which I'm not comfortable with at all
[03:58:32] (I'd have to make sure that the parser I use isn't affected by the flaws typically found in XML parsers)
[09:11:40] Hi! I'm trying to do a "GROUP BY" in a query, but I just get "Bad aggregate": https://w.wiki/KNH
[09:25:40] in the absence of specific identifier properties like an RFC ID, is there some generic identifier property I can use?
[09:25:57] Also, how can I use SPARQL to get a list of properties and their datatypes?
[09:26:16] https://w.wiki/KNW - this gets properties, but I don't know what to do next
[09:28:54] I tried now also to add (DATATYPE(?property) AS ?propertyDataType)
[09:28:56] but it did not help
[09:29:46] ah, never mind, got it
[09:29:59] https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples#Properties_grouped_by_their_Wikibase_datatype_(Q19798645)_with_number_of_properties
[09:36:02] I guess same as
[09:36:19] sorry, I mean: https://www.wikidata.org/wiki/Property:P2888
[09:36:24] exact match (P2888)
[10:42:30] Hi! I'm trying to do a "GROUP BY" in a query, but I just get "Bad aggregate": https://w.wiki/KNH
[17:27:12] Hi.
[17:27:12] Do we have a workgroup for the nCoV pandemic?
[17:29:42] what would the workgroup do?
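[Editor's note] On the "Bad aggregate" error: WDQS reports this when a SELECTed variable is neither wrapped in an aggregate function nor listed in GROUP BY. A hedged sketch of the required shape (the variables here are illustrative, not taken from the linked query):

```sparql
# Every selected variable must be either aggregated (?count)
# or named in the GROUP BY clause (?property).
SELECT ?property (COUNT(?item) AS ?count) WHERE {
  ?item ?property ?value .
}
GROUP BY ?property
```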
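[Editor's note] On the qualifier question above: pywikibot's Claim objects expose a `qualifiers` attribute (a dict mapping property IDs to lists of Claim objects, each supporting `getTarget()`), so no manual JSON parsing is required. For completeness, reading qualifiers directly from the Wikibase claim JSON is also straightforward. A minimal sketch; the sample claim below is invented for illustration:

```python
import json

# Hypothetical claim in Wikibase JSON shape: a P348 (software version)
# value with a P577 (publication date) qualifier.
sample_claim = json.loads("""
{
  "mainsnak": {"snaktype": "value", "property": "P348",
               "datavalue": {"value": "1.2.3", "type": "string"}},
  "qualifiers": {
    "P577": [
      {"snaktype": "value", "property": "P577",
       "datavalue": {"value": {"time": "+2020-03-01T00:00:00Z"}, "type": "time"}}
    ]
  }
}
""")

def qualifier_values(claim, prop):
    """Return the datavalues of all qualifiers of `prop` on `claim`."""
    return [snak["datavalue"]["value"]
            for snak in claim.get("qualifiers", {}).get(prop, [])
            if snak["snaktype"] == "value"]

print(qualifier_values(sample_claim, "P577"))
```

With pywikibot itself, the equivalent would be along the lines of `claim.qualifiers["P577"][0].getTarget()`.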
[17:30:56] Define a common data scheme for all countries and maybe provide support for ad-hoc contributors
[17:39:24] B13gan: I think there was something about free software and nCoV on lwn.net, but besides that I'm a total newbie with regard to Wikidata
[17:39:33] so I don't know much
[17:39:59] "Data Sharing and Open Source Software Help Combat Covid-19" (Wired)
[17:40:10] https://lwn.net/Articles/814851/#Comments
[17:40:34] btw, is JSON dangerous?
[17:40:40] like XML is?
[17:41:06] Or is it immune to the usual attacks like billion laughs, entity expansion, other types of DoS attacks, etc.?
[17:42:36] Maybe I'd better ask in some Python channel though
[17:43:33] Probably ;)
[17:49:13] Uh, Wikidata
[17:49:16] > Pausing due to database lag: Waiting for all: 31.35 seconds lagged.
[17:50:25] revi: If you need to do any serious job with wb, you need your own instance -_-
[17:50:35] I run a bot on WD :P
[17:51:05] So my own instance wouldn't do much (it is about maintenance of Wikidata stuff)
[17:52:12] ahh. so it's even more painful.
[17:52:30] yeah :P
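[Editor's note] On the JSON-safety question: JSON has no entities, DTDs, or external references, so XML-style billion-laughs expansion has no analogue; a document cannot inflate beyond the bytes you feed the parser. The remaining denial-of-service vectors are sheer input size and deep nesting, which CPython's `json` module cuts off with a RecursionError. A minimal sketch:

```python
import json

# No entity expansion exists in JSON: repeated strings are just
# literal data and parse to exactly what was written.
print(json.loads('{"lol": ["lol", "lol"]}'))

# Pathologically nested input is rejected rather than consuming
# unbounded stack: CPython's parser raises RecursionError.
deep = "[" * 100_000 + "]" * 100_000
try:
    json.loads(deep)
except RecursionError:
    print("deeply nested input rejected")
```

Capping the accepted input size before calling the parser covers the large-payload case.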