[06:08:55] hi
[07:08:37] Krinkle: thanks. i'll ask the team to have a look
[07:30:07] dear wikidatans: i've got a question
[07:31:31] how do i query for the itemLabel in one language
[07:31:43] and print the itemLabel in English?
[07:34:20] you can explicitly select a label using ?item rdfs:label ?label filter (lang(?label) = "langcode") (where langcode is the appropriate language code and ?item and ?label are variable names you can change)
[08:01:39] kopiersperre: i found the examples to be quite helpful, e.g. https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples#Translation_of_labels_and_descriptions
[08:06:25] btw, is there a way to abort a query on query.wikidata.org?
[08:09:47] hm. i guess reloading works well enough.
[08:14:49] ciss: thx
[08:19:34] if i notice wrong associations (e.g. Q43445 female organism assigned to a human), what should i do with those?
[08:22:27] in general, if it doesn't have a proper reference, just fix it. if it has a proper reference (i.e. an external source), check whether the reference actually says that. if it does, mark the statement as deprecated; if it doesn't, correct it
[08:23:15] for female vs. female organism, that's almost certainly just someone accidentally selecting the wrong thing from the search results, because the search results can be rather stupid at times
[08:24:55] ... ha, and a horse marked as instance of human ^^
[08:25:30] i feel like these things could be automated. my guess is someone might have already put such a framework in place?
[08:28:42] we have reports of things that look wrong (e.g. https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P21#One_of lists things which don't look like proper values for genders), but we don't have much automatic fixing
[08:29:15] automated fixing would be a step too far indeed. i just had several horses that were marked as human
[08:29:45] yeah, it can be hard to know what the right way to fix it is
[08:29:46] btw, props for how easy it is to fix these records :)
[08:30:04] quick and clean ui, i like it
[08:45:20] is it possible to query items in commons categories?
[08:50:11] please disregard that last question.
[12:21:51] just to clarify: there's no semantic link between a wikipedia list/category and the items it contains, right? it's just a bunch of "unstructured" content?
[12:22:54] ... although that can't be quite right, because articles list the categories they're part of. so it's just that the data is not provided in a form that can be consumed via wikidata?
[12:29:50] ciss: yes that is correct
[12:39:23] ciss: you can access them from SPARQL using MWAPI: https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual/MWAPI
[12:40:24] Lucas_WMDE: thanks! i've been searching for nearly an hour for that kind of overview :)
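(Editor's note: a minimal sketch of the MWAPI approach linked above, combining the category-members generator with the explicit-label pattern suggested at 07:34:20. The endpoint, category name, and language code are illustrative placeholders; the wikibase:, bd:, and mwapi: prefixes are predefined on query.wikidata.org.)

    # sketch: items in a Wikipedia category, with their English labels
    SELECT ?item ?label WHERE {
      SERVICE wikibase:mwapi {
        bd:serviceParam wikibase:endpoint "en.wikipedia.org" ;   # placeholder wiki
                        wikibase:api "Generator" ;
                        mwapi:generator "categorymembers" ;
                        mwapi:gcmtitle "Category:Lighthouses" ;  # placeholder category
                        mwapi:gcmlimit "max" .
        # map each returned page to its linked Wikidata item
        ?item wikibase:apiOutputItem mwapi:item .
      }
      # explicit label selection instead of the automatic label service
      ?item rdfs:label ?label .
      FILTER(LANG(?label) = "en")
    }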
[13:23:04] so, i'm almost there (thanks to Lucas_WMDE), but there's one detail i don't understand: uncommenting the last line in this query causes a timeout, while the base query returns almost immediately: http://tinyurl.com/y8zydbsg
[13:24:28] i'm running into similar issues when querying images or locations. basically anything i add causes a timeout.
[13:26:48] updated comment: http://tinyurl.com/y8vrhr6b
[13:29:13] my guess (I haven't tried anything) is that it doesn't bind ?item until later, so at that point in the query you're actually requesting that it fetch all labels in the entire database
[13:30:18] my thoughts as well. running "?item wdt:P625 ?location" separately finishes, but i guess the act of ... combining(?) the graphs is too much
[13:33:51] ... but I can't figure out how to make it work :(
[13:34:23] where is lucas when we need him D:
[13:34:46] i can raise gcmlimit up to 57 and the query will finish almost immediately.
[13:34:53] as soon as i hit 58 i get a timeout.
[13:35:00] smells like a bug?
[13:38:04] turning the optimiser off seems to work for http://tinyurl.com/yd4z6qn5 but it crashes my browser tab if I do it with coordinates
[13:38:04] i've updated the query with better error descriptions: http://tinyurl.com/yd4ctodw
[13:42:41] yup, dev tools show a js error in embed.vendor (unknown prefix: mwapi), and the xhr request is loading *tons* of data
[13:47:07] this is querying gigabytes of data within seconds ...
[13:48:38] ... 2.59 GB to be precise. trying to peek inside ...
[13:50:49] appears to be the same data over and over again. my guess would be that it's not the *number* of items, but one specific item at position 58
[13:53:16] here are the first 100K lines of the response: https://gist.github.com/mootari/cf02a70b588f715c3c5ca08b0fdb34f6
[13:57:39] what the ... items and titles appear to be combined rather freely. e.g. i'm seeing http://www.wikidata.org/entity/Q6923721 several times with various titles.
[13:57:48] (various completely unrelated titles)
[13:59:24] looks to me like a combination of the titles from the service results with every result returned by "?item wdt:P625 ?location ."
[13:59:47] you basically see the titles result set repeated over and over again, but each time with a different item
[14:25:03] is this the official bug tracker for all things wikidata? https://phabricator.wikimedia.org/project/view/71/
[14:33:16] just to add to the above, "Mahnmal Gleis 17" is at position 58 of the result set. it is marked as italic on the category page (not sure what that means, probably a section?), and links to https://de.wikipedia.org/wiki/Bahnhof_Berlin-Grunewald#Deportationen
[14:34:00] it's also the first italic entry on the category page. https://de.wikipedia.org/wiki/Kategorie:Holocaustgedenkst%C3%A4tte
[14:35:20] italic usually means a redirect
[14:35:41] and yes, phabricator is the bug tracker
[15:55:41] I'm edit-warring .... https://www.wikidata.org/w/index.php?title=Q32308519&action=history am I completely wrong, or am I going nuts?
[16:22:34] Josve05a: it's not an area I'm familiar with... qualifiers seem to make more sense to me (since e.g. "issue" refers to the thing it's published in; it's not the article itself which has an issue number), but top-level statements appear to be how articles are currently modelled (over 1 million items for scientific articles with an issue statement, only a couple of hundred with issue as a qualifier on the published statement)
[16:24:04] and I think consistency is important, so I think changing it would need a discussion (and a bot to fix the data)
[16:24:33] Yeah, but the property itself shows with an example that it should be used as a qualifier. Just because one (or two) bad boys have gone ahead and created 1M items with bad structure does not change the consensus on the property from when it was created... but... I'll have to digress...
[16:24:43] Bots*
[16:25:00] property examples always use qualifiers
[16:27:02] e.g. https://www.wikidata.org/wiki/Property:P31 also has the examples as qualifiers
[16:27:08] Oh... right... brain fart... but I still think it is wrong. If an article is published elsewhere, it will not be in the same issue and on the same page as in the original publication. /me is on phone and has stepped away
[16:31:42] well, as I said, I agree that qualifiers make more sense
[16:32:09] the problem is that changing just one item doesn't really help anyone; anyone using the data will have to support the existing method, because that's what the overwhelming majority of the data uses
[16:35:56] and changing one item doesn't mean people will stop adding new items with the existing method, hence the need to discuss it with the people who are adding these items in the first place
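(Editor's note: minimal sketches of the two modellings discussed above, assuming the properties referenced in the conversation: P1433 "published in" and P433 "issue". Variable names are arbitrary; the first query matches how most article items are currently modelled, the second the qualifier form shown in the property's own example.)

    # issue as a top-level (direct) statement
    SELECT ?article ?issue WHERE {
      ?article wdt:P433 ?issue .           # P433 = issue
    } LIMIT 10

    # issue as a qualifier on the "published in" statement
    SELECT ?article ?venue ?issue WHERE {
      ?article p:P1433 ?stmt .             # full statement node
      ?stmt ps:P1433 ?venue ;              # statement value: the journal
            pq:P433 ?issue .               # qualifier: the issue number
    } LIMIT 10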
[22:23:06] hello? need help with an error on a bio page
[22:28:31] Need to fix the error since it insults the person's bio page.
[22:35:02] I need help fixing an error on a bio page