[09:58:23] nikki: Busy with the years I see :-)
[09:58:29] yes :)
[09:59:00] they should all have point in time now and I've merged all the duplicates I've found so far
[10:00:18] I still keep finding more unmarked years though
[10:00:30] Great, that sorts them and makes the missing ones/duplicates visible
[10:00:35] :D
[10:00:38] (from pages in smaller languages, not from pokestarfan)
[10:01:03] Yeah, I found a whole series in some weird language
[10:01:11] Most of them without Wikidata items
[10:01:28] That's why I'm thinking about a bot just creating a whole bunch to get it over with
[10:04:41] * nikki nods
[10:04:53] would you only create missing items, or also fill in the data of existing items?
[10:06:09] Probably best to do both, right?
[10:06:55] If the structure is there + point in time, it shouldn't be too hard to have a bot add the missing information
[10:07:49] yeah, that'd be cool
[10:08:08] What year? 3017? :-)
[10:09:10] Maybe also propose that somewhere on wiki, with what fields and links we want?
[10:11:41] I've seen years up to 3001 so far (3000 is the last with a sitelink), so 3017 seems reasonable enough
[10:25:18] Hm... https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-statements?refresh=30m&panelId=4&fullscreen&orgId=1
[10:45:50] sjoerddebruin: Big crap bot run?
[10:46:04] Looks like it.
[10:46:17] Do you have the dashboard with total items?
[10:46:39] https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel?refresh=30m&panelId=3&fullscreen&orgId=1
[10:47:20] Would be interesting to have a dashboard like that for the 10% newest items. That makes it easy to detect large bot creations
[10:48:27] Research Bot is running pretty fast, but that one is creating decent-sized items.
[10:48:49] https://www.wikidata.org/wiki/Q30065448 <- 10 statements seems to be the number for my new items
[10:49:16] Probably a new item bot catching up for some wiki
[10:49:18] It should be bot-created items that have like 1 statement or something.
[10:49:36] Oh wait, the newpages page has a bytes filter
[10:49:45] https://www.wikidata.org/wiki/Special:Contributions/MechQuesterBot
[10:50:40] Hmm, maybe I should set up the new item bot like for nlwp so it's not in huge piles of empty items?
[10:51:18] I can currently handle it.
[10:51:44] Yeah, I know you can for nlwp, but what about enwp?
[10:51:57] Yeah, enwp can use some help.
[10:52:18] nikki: If we take https://www.wikidata.org/wiki/Q25290 for reference. All those claims without the category stuff
[10:52:46] But also backlinks from https://www.wikidata.org/wiki/Q19022 ? And how about backlinks from https://www.wikidata.org/wiki/Q6939 to the decades?
[10:54:17] multichill: this one is the only nlwiki one left in the Q19.... range, any idea? https://www.wikidata.org/wiki/Q19967390
[10:56:07] subclass of "getal" (Dutch for "number")?
[10:58:25] At least https://www.wikidata.org/w/index.php?title=Q19967390&type=revision&diff=502791663&oldid=499885770 now
[10:58:55] Wow, already in the 4M range
[10:59:20] Looks like easier ones are popping up again, right sjoerddebruin?
[10:59:24] multichill: instance of year, part of, follows/followed by, point in time, start/end time all sound fine
[10:59:58] multichill: I have a huge list at the top that I need to take a look at again, feel free to pick some.
[11:02:09] things like common year and common year starting on Sunday would depend on whether that year was one of those, not sure whether you were intending to add ones like that
[11:02:35] do you know which languages you want to add labels/descriptions for?
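A rough sketch of what the proposed year bot could look like in pywikibot, using the statements agreed on above: instance of (P31) year (Q577), point in time (P585), and follows/followed by (P155/P156). The new-item pattern, the English-only label, and the neighbour-year lookup are assumptions for illustration, not anything pinned down in the discussion (which languages get labels/descriptions is still an open question at 11:02):

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()


def create_year_item(year, prev_qid=None, next_qid=None):
    """Create one missing year item with the statements agreed on above:
    instance of (P31) year (Q577), point in time (P585), and
    follows/followed by (P155/P156) when the neighbouring items are known."""
    item = pywikibot.ItemPage(repo)  # no id yet; editLabels creates the item
    item.editLabels({"en": str(year)}, summary="Creating missing year item")

    instance = pywikibot.Claim(repo, "P31")
    instance.setTarget(pywikibot.ItemPage(repo, "Q577"))  # year
    item.addClaim(instance, summary="Adding year structure")

    point_in_time = pywikibot.Claim(repo, "P585")
    point_in_time.setTarget(pywikibot.WbTime(year=year, precision="year"))
    item.addClaim(point_in_time, summary="Adding year structure")

    # The neighbouring years would have to be looked up (or created) elsewhere.
    for prop, qid in (("P155", prev_qid), ("P156", next_qid)):
        if qid:
            claim = pywikibot.Claim(repo, prop)
            claim.setTarget(pywikibot.ItemPage(repo, qid))
            item.addClaim(claim, summary="Adding year structure")
    return item
```

Part of, start/end time, and the common-year items mentioned at 11:02 would need per-year logic on top of this.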
[11:08:30] nikki: https://www.wikidata.org/wiki/Wikidata:Project_chat#Missing_years
[11:08:50] instance of 2017? :)
[11:09:12] ha, paste error
[11:11:08] a bunch of the descriptions on 2017 are uppercased when they shouldn't be
[11:11:23] * nikki has been checking the labels on the item for year and has lowercased tons of them >_<
[11:14:25] Don't know if we have a good reference item
[11:14:55] In pywikibot I would just grab all labels and descriptions for 2017, do the filter and use that to update other items
[11:15:06] If 2017 is incorrect, all edits will have the same mistake
[11:16:20] Other option is to grab multiple items for reference and only do the labels and descriptions that are the same for multiple items, but that might as well be an incorrect bot import
[11:16:48] Like 2017, 1917 and 1817
[11:18:00] yeah, the autoedit script has done that
[11:18:37] I'll think about it and see if I can think of anything
[11:22:23] nikki / sjoerddebruin: https://www.wikidata.org/wiki/Wikidata:Project_chat#Integrate_the_merge_function_into_the_default_UI maybe you can comment? I think enabling merge by default is a good suggestion
[11:24:43] sjoerddebruin: So the number of items without claims went down by about 11,000 in the past year. So say about 1,000 a month. I think you're going even faster now, so next year around this time maybe? :-)
[12:00:42] multichill: I hope faster, my actual goal was October but I guess I need more help then.
[16:43:28] Seems like we recently got a new largest item. https://www.wikidata.org/w/index.php?title=Q30300250&action=history
[16:47:23] Harmonia_Amanda, hi! would you care to do this editprotected request? https://www.wikidata.org/wiki/User_talk:Jitrixis/nameGuzzler.js
[16:47:56] sjoerddebruin, hooow is it so large? :o
[16:48:32] Jhs: probably a lot of authors
[16:48:41] yeah, I see now
[16:49:04] that actually exceeds the maximum page size for a normal page in MediaWiki (at least with standard WMF settings)
[16:49:22] but how the hell can a paper have more than 2000 authors? :s
[17:07:12] too bad my browser crashed when I tried to open the item ^
[17:28:36] sjoerddebruin: Seeing how many items get knocked off https://www.wikidata.org/wiki/Wikidata:Database_reports/without_claims_by_site/nlwiki every day, probably earlier ;-)
[17:32:33] Btw, might be worth having a query of items that are empty and link to a redirect on nlwp?
[17:32:42] yes please
[17:49:00] sjoerddebruin: Running, but seems to take a while
[19:04:33] sjoerddebruin: Oops, ran the wrong one. It's at https://tools.wmflabs.org/multichill/queries/wikidata/redirect_items_enwp.txt; running https://tools.wmflabs.org/multichill/queries/wikidata/redirect_items_nlwp.txt now
[22:07:44] any query experts awake? I need a query to get Norwegians who are footballers and nothing else. So, if they have occupation footballer + [anything], then I don't want them in the result. Is this achievable?
[22:08:11] it is
[22:10:26] http://tinyurl.com/y72rbgxx I think that should work
[22:12:04] nikki, yesss, it does. you're awesome! thanks!
[22:14:31] aw, I was too slow :( but here’s my version: http://tinyurl.com/y9uhzsyo
[22:15:21] (or, slightly faster: http://tinyurl.com/y9p3vsbk)
[22:23:33] thanks WikidataFacts, you're awesome too :D
[22:23:44] awww, thanks :)
[22:23:47] although, I only see numbers in those URLs?
[22:23:58] oh goddammit did noscript fuck it up again
[22:24:01] mysterious
[22:24:02] lol
[22:24:11] this happened a few days ago, was it with you as well?
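Going back to the reference-item approach discussed at 11:14-11:16, a minimal pywikibot sketch of the "same text on multiple reference items" safeguard might look like the following. The Q-ids are placeholders (the log names the years 2017, 1917 and 1817 but not their item ids), and only descriptions are copied here; labels would need per-language numeral handling, so they are left out:

```python
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

# Placeholder ids for the reference years 2017, 1917 and 1817 from the
# discussion above -- these are NOT the real item ids, look them up first.
REFERENCE_QIDS = ["Q0000001", "Q0000002", "Q0000003"]


def shared_descriptions():
    """Keep a (language, description) pair only when every reference item
    carries exactly the same text, so a one-off bad bot import on a single
    item can't propagate (the 11:16 concern)."""
    per_item = []
    for qid in REFERENCE_QIDS:
        item = pywikibot.ItemPage(repo, qid)
        item.get()
        per_item.append(dict(item.descriptions))
    first, *rest = per_item
    return {lang: text for lang, text in first.items()
            if all(other.get(lang) == text for other in rest)}


def fill_descriptions(target_qid):
    """Add the shared descriptions to a year item without overwriting
    anything that is already there."""
    item = pywikibot.ItemPage(repo, target_qid)
    item.get()
    new = {lang: text for lang, text in shared_descriptions().items()
           if lang not in item.descriptions}
    if new:
        item.editDescriptions(new,
                              summary="Adding descriptions from reference year items")
```

This works for year items because their descriptions ("year" and its translations) don't vary from year to year; the labels do, which is why they need separate treatment.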
[22:24:17] or with someone else in this channel?
[22:25:07] Jhs: http://tinyurl.com/yd4mt7wk better?
[22:25:19] wasn't me
[22:25:24] okay
[22:25:52] ah, that was nikki
[22:26:15] I need to remember that noscript does that, though I don’t understand how it happens
[22:28:58] WikidataFacts, it's not working as it should... I get the same number of results with and without the MINUS line
[22:29:21] hm
[22:29:29] but I got the same number of results as nikki’s query…
[22:30:10] I get 2934 in yours and 2594 in nikki's
[22:30:39] okay, perhaps I misread the numbers because the last digit is the same?
[22:30:53] probably, I did a triple-take myself to be sure ;)
[22:31:20] anyways, nikki's is working, so I wouldn't spend too much energy on figuring out what's wrong :)
[22:31:28] ah, no, the FILTER NOT EXISTS works, just not the MINUS: http://tinyurl.com/y8ew8hgx
[22:32:45] that’s why the number was the same, I didn’t check the MINUS version
[23:55:46] WikidataFacts: I see your "dangerous occupations", and raise you a "dangerous citizenships" (from a few months ago). Ethiopia is not what you'd expect it to be. https://wordpress.alphos.fr/2016/09/16/944/sparquickl-2-risque-de-deces/
[23:56:12] (you already saw that ^^ )
[23:56:55] Alphos: oh, right :)
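The queries behind the tinyurls aren't preserved in the log, so the following is a guess at both nikki's working FILTER NOT EXISTS version and the usual way a MINUS variant fails silently: a MINUS group containing only a FILTER binds no variables shared with the outer pattern, so per the SPARQL spec it subtracts nothing, which would match the "same number of results with and without the MINUS line" observation at 22:28. P27/Q20 (citizenship: Norway) and P106/Q937857 (occupation: association football player) are the standard ids for this question; the script itself is just a plain WDQS call:

```python
import requests

ENDPOINT = "https://query.wikidata.org/sparql"

# Working approach: Norwegians whose only listed occupation is footballer.
FILTER_NOT_EXISTS = """
SELECT ?person ?personLabel WHERE {
  ?person wdt:P27 wd:Q20 ;
          wdt:P106 wd:Q937857 .
  FILTER NOT EXISTS {
    ?person wdt:P106 ?other .
    FILTER(?other != wd:Q937857)
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
"""

# Guess at the broken variant: a filter-only MINUS group binds nothing,
# so nothing is subtracted and every Norwegian footballer comes back.
FILTER_ONLY_MINUS = """
SELECT DISTINCT ?person WHERE {
  ?person wdt:P27 wd:Q20 ;
          wdt:P106 wd:Q937857 ;
          wdt:P106 ?occupation .
  MINUS { FILTER(?occupation != wd:Q937857) }
}
"""


def run(query):
    """Send a query to the Wikidata Query Service and return the bindings."""
    response = requests.get(
        ENDPOINT,
        params={"query": query, "format": "json"},
        headers={"User-Agent": "minus-vs-not-exists-example/0.1"},
    )
    response.raise_for_status()
    return response.json()["results"]["bindings"]


if __name__ == "__main__":
    print(len(run(FILTER_NOT_EXISTS)), "with FILTER NOT EXISTS")
    print(len(run(FILTER_ONLY_MINUS)), "with the filter-only MINUS")
```

Repeating the occupation triple inside the MINUS group (so it shares ?person with the outer pattern) would make the MINUS version behave the same as the FILTER NOT EXISTS one.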