[06:19:19] MichaelSchoenitz: hi !
[06:20:06] MichaelSchoenitz: when you have time I'd like to have your input on the latest evolutions of https://www.wikidata.org/wiki/Wikidata:WikiProject_Informatics/FLOSS
[09:51:39] Moving old interwiki links from Wikiversity to wikidata and cleaning them. I hope I don't run into strange things like English Wikinews
[09:51:56] they still use old interwiki and they want it this way
[10:01:43] Amir1: doesn't your bot add labels for Frisian (fy)?
[10:01:55] it does
[10:02:00] Weird. https://www.wikidata.org/w/index.php?title=Q15655434&action=history
[10:02:00] it might take some time :D
[10:02:11] I'll check
[10:02:13] thanks
[10:22:15] Hi, I proposed a new showcase at https://www.wikidata.org/wiki/Wikidata:Showcase_items#Proposed_for_Showcase but I'm unclear what's next. Reading the history I see things happening, but I can't find the associated discussions. Is there another URL I should read?
[10:40:30] dachary: there is not really a clear process, I'm trying to reform it.
[10:44:31] dachary: I'll take a look next week, after my exams…
[10:47:03] MichaelSchoenitz: cool, good luck with that :-)
[10:47:23] sjoerddebruin: ok, I'm glad I did not miss anything :-)
[15:20:14] nikki: do you think bots will think this is weird? https://www.wikidata.org/wiki/Q13136035#P1448
[15:20:14] P1448 Masterwork From Distant Lands - https://phabricator.wikimedia.org/P1448
[16:15:11] addshore: https://grafana-admin.wikimedia.org/dashboard/db/article-placeholder Does that include bots?
[16:15:38] no, it uses the 'user' pageview definition
[16:15:46] which should exclude most bots and spiders
[16:17:03] ok
[16:17:10] do we also have counts with bots?
[16:17:52] agent_type string Agent accessing the pages, can be spider or user
[16:18:23] Not currently recorded in graphite but I left room for it / it's trivial to do
[16:21:15] addshore: Ok
[16:21:20] I would like to have that
[16:21:28] if it's null right now, ok
[16:21:31] but I doubt it is
[16:21:45] split per wiki again?
[16:21:53] yes
[16:22:01] can you make Lydia_WMDE make a ticket? :D
[16:22:16] or, *checks the original ticket*
[16:22:54] "Also it would be useful to differentiate between bot/spiders and real users (especially for the future, where we plan to add placeholders to search engines)." so after a discussion I think we decided to not do it yet, but I can add it!
[16:23:07] Why not?
[16:23:16] Is it null on all wikis, yet?
[16:23:46] * addshore does not know, but I imagine it was assumed it was, thus it didn't matter that much at this stage?
[16:24:04] I need the numbers
[16:24:15] stashbot always gets me whenever it talks of a phabricator paste while everybody else is talking of a wikidata property :D
[16:24:20] if not in grafana, at least in graphite so that we can use them for operation purposes
[16:24:36] * operational
[16:29:42] numbers don't go to grafana, only in graphite! ;)
[16:30:03] yeah
[16:30:07] but graphs :P
[16:30:11] Okay, I'll amend the stuff to track spiders too! ;)
[17:31:26] I want to create a web of Wikidata items that are scientific articles that are sponsored by NIOSH that cite (P2860) other items that are scientific articles that are sponsored by NIOSH. How might I go about doing this?
[17:31:26] P2860 cu_changes deletion on beta - https://phabricator.wikimedia.org/P2860
[17:35:09] Hello, does anyone here know if there's a way of making a (Wikipedia) template that outputs the Wikidata-ID (as in "Qxxxxxxx") of the current page? Sort of like {{PAGENAME}} but with the Wikidata-ID
[17:35:10] [2] https://www.wikidata.org/wiki/Template:PAGENAME
[17:37:17] Metalindustrien: take a look at https://en.wikipedia.org/wiki/Module:Wikidata, you could make a Lua template that does that.
[17:37:42] (this is the second time something like this gets asked in a short time, gonna make a request for a magic word)
[17:37:50] Thanks, I'll take a look :)
[17:38:05] A magic word would add a lot of flexibility, yeah
[17:42:37] Metalindustrien: https://phabricator.wikimedia.org/T140796
[17:42:57] Thanks :)
[18:36:49] hoo: is special:abouttopic cached?
[18:36:52] just curious
[18:39:31] What am I doing wrong with this query? I am trying to get a list of items that have P2880 along with their descriptions.
[18:39:31] https://query.wikidata.org/#SELECT%20%3Fitem%20%3Flabel%0AWHERE%0A%7B%0A%20%20%3Fitem%20wdt%3AP2880%20%3Fdummy0%20.%0A%20%20%09OPTIONAL%20%7B%20%3Fitem%20wdt%3AP31%20%3Fdummy1%20%7D%0A%20%20%09FILTER%28%21bound%28%3Fdummy1%29%29%20.%0A%20%20%3Fitem%20schema%3Adescription%20.%0A%20%20OPTIONAL%20%7B%0A%20%20%20%20%3Fitem%20rdfs%3Alabel%20%3Flabel.%0A%20%20%20%
[18:39:31] 20FILTER%28LANG%28%3Flabel%29%20%3D%20%22en%22%29.%0A%20%20%7D%0A%7D
[18:39:31] P2880 More strange deployment-prep puppet errors for T131946 - https://phabricator.wikimedia.org/P2880
[18:39:33] Ugh.
[18:39:34] https://query.wikidata.org/#SELECT%20%3Fitem%20%3Flabel%0AWHERE%0A%7B%0A%20%20%3Fitem%20wdt%3AP2880%20%3Fdummy0%20.%0A%20%20%09OPTIONAL%20%7B%20%3Fitem%20wdt%3AP31%20%3Fdummy1%20%7D%0A%20%20%09FILTER%28%21bound%28%3Fdummy1%29%29%20.%0A%20%20%3Fitem%20schema%3Adescription%20.%0A%20%20OPTIONAL%20%7B%0A%20%20%20%20%3Fitem%20rdfs%3Alabel%20%3Flabel.%0A%20%20%20%
[18:39:34] 20FILTER%28LANG%28%3Flabel%29%20%3D%20%22en%22%29.%0A%20%20%7D%0A%7D
[18:39:46] ... http://tinyurl.com/hetxp4q
[18:39:58] (Sorry.)
[18:40:03] haha
[18:45:27] harej: you’re missing a variable for the description triple
[18:45:30] is this what you want? http://tinyurl.com/j74fdbe
[18:45:37] Wow, it's *the* WikidataFacts
[18:45:49] Yes, that is what I want. Thank you.
[18:45:49] * WikidataFacts waves
[18:46:11] hello WikidataFacts :)
[18:46:19] hi :)
[18:46:50] * Harmonia_Amanda should go back to her data
[18:47:13] * Harmonia_Amanda isn't very motivated this evening
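(URL-decoding the query pasted at 18:39 shows the problem WikidataFacts points out: the triple "?item schema:description ." has no object variable, which is a syntax error. The sketch below is a minimal way to apply that fix while keeping the rest of the query as pasted; the corrected query actually linked via tinyurl is not preserved in the log and may differ, and the English-only filter on the description is an added assumption.)

    SELECT ?item ?label ?description WHERE {
      ?item wdt:P2880 ?dummy0 .
      OPTIONAL { ?item wdt:P31 ?dummy1 }
      FILTER(!BOUND(?dummy1))
      # the pasted query had "?item schema:description ." with no object variable
      ?item schema:description ?description .
      FILTER(LANG(?description) = "en")    # assumption: English descriptions only
      OPTIONAL {
        ?item rdfs:label ?label .
        FILTER(LANG(?label) = "en")
      }
    }

(Adding ?description to the SELECT clause is also needed for the descriptions to show up in the result columns.)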
[18:51:53] anyone here using the primary sources tool?
[18:55:37] addshore: No, not at all
[18:56:27] I would use it if it were a gadget :P
[19:05:29] hoo: can I get a speedy CR? :)
[19:05:39] sure
[19:05:45] https://gerrit.wikimedia.org/r/#/c/299744/
[19:06:01] hoo: one quick question. How do you do schema changes at the developer level in extensions?
[19:06:11] https://www.mediawiki.org/wiki/Development_policy#Database_patches
[19:06:16] this is not very helpful
[19:06:25] looks like it's just for core
[19:06:31] Amir1: there is a hook! you add your schema changes to it :)
[19:06:39] Amir1: give me 2 secs and I can link you to a patch doing it!
[19:06:49] addshore: thanks
[19:06:52] :)
[19:07:03] Amir1: https://gerrit.wikimedia.org/r/#/c/292347/
[19:07:59] thanks hoo!
[19:08:14] but damn, that was meant to make it into the branch >.>
[19:09:29] yes
[19:09:30] thanks
[19:17:51] Also WikidataFacts, do you have recommendations for making webs with WDQS queries?
[19:20:31] harej: the Wikidata Graph Builder (https://angryloki.github.io/wikidata-graph-builder/) has a WDQS interface, though I find it hard to write the correct query to actually get connections
[19:20:43] that’s the only one I know of
[19:20:52] Yeah, I am not quite sure how the query is supposed to be phrased.
[19:21:50] the weird part is the ?linkTo. You have to make sure, in a way, to get one more result than “normal”
[19:22:46] The non-WDQS-based options seem usable enough, but I don't know how flexible they are.
[19:23:43] I want to make a web of NIOSH-sponsored papers citing other NIOSH-sponsored papers.
[19:23:56] To see how our organization's research builds over time.
[19:25:52] I think I have something that might work…
[19:26:01] now to wait for a few minutes while d3 janks the graph around
[19:26:20] that’s the other annoying part, how it renders the browser basically useless while the graph settles :(
[19:26:56] * Harmonia_Amanda is happy with her last query
[19:27:40] http://tinyurl.com/hdl6p7q
[19:30:56] hoo: addshore: https://gerrit.wikimedia.org/r/#/c/299827/ here's the patch
[19:31:06] check it when you have some time, thanks
[19:38:15] harej: https://tinyurl.com/zaejpl8
[19:38:22] be warned, I’m really not joking about how slow it is
[19:38:44] make sure unsaved work in other tabs doesn’t get lost, and let it settle for five minutes or so
[19:38:49] :D
[19:39:35] (chrome isolates tabs as separate processes)
[19:39:43] every tab?
[19:39:52] it kills off individual tabs rather than the whole browser
[19:39:53] that’s cool
[19:40:08] this is very nice when you have one rogue tab; it dies but it doesn't bring down the whole browser
[19:40:19] woof. I have an i7 processor and even then it's slow
[19:41:12] this is much better: https://tinyurl.com/zuxrcbh – only papers that cite at least one other NIOSH paper
[19:42:27] gets rid of all the unconnected nodes (I think), which you probably don’t care about anyways
[19:45:45] what do you suppose the difference is if you use P859:Q60346 instead of P2880?
[19:45:46] P859 Cassandra G1GC Settings rb1004 - https://phabricator.wikimedia.org/P859
[19:45:46] P2880 More strange deployment-prep puppet errors for T131946 - https://phabricator.wikimedia.org/P2880
[19:45:53] * harej pets stashbot
[19:48:35] Hey, is WikidataFacts somebody?
[19:48:45] I always wonder :)
[19:49:18] WikidataFacts: seems you are a mystery :p
[19:49:54] it's a cyborg
[19:49:58] I’m not anyone official, if that’s what you mean :) just someone who gets retweeted by @wikidata a lot
[19:49:59] Harmonia_Amanda: Yeah, you are correct :)
[19:50:00] half bot, half human
[19:50:02] mutante: really?
[19:50:21] WikidataFacts has a great Twitter account.
[19:50:31] harej: 578 papers with an ID but no sponsor: http://tinyurl.com/jh5h25z
[19:50:37] That's what I see, harej
[19:50:42] WikidataFacts: well, you should have named yourself "sparql-ninja" :p
[19:50:45] He is always on Twitter
[19:50:53] harej: and 1334 vice versa: http://tinyurl.com/go8oupk
[19:51:04] :)
[19:51:25] WikidataFacts: But whoever WikidataFacts is, I think he is doing a good job.
[19:51:30] WikidataFacts: I'm working in both directions on adding P859 to papers with NIOSHTIC IDs and adding NIOSHTIC IDs to sponsored papers.
[19:51:30] P859 Cassandra G1GC Settings rb1004 - https://phabricator.wikimedia.org/P859
[19:51:37] Harmonia_Amanda: that’s the funny part, I had no idea that this is where the journey was going :D
[19:51:58] aw, thanks!
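(The Wikidata Graph Builder's WDQS mode, as discussed above, needs the query to bind one extra variable, ?linkTo, alongside ?item, which is the "one more result than normal" WikidataFacts mentions. The exact queries behind the tinyurl links are not preserved in the log; a rough sketch of the shape they likely take, using P859 (sponsor) with Q60346 for NIOSH and P2860 (cites) as in the conversation, could be:)

    SELECT ?item ?linkTo WHERE {
      ?item wdt:P859 wd:Q60346 .     # a paper sponsored by NIOSH (Q60346)
      ?item wdt:P2860 ?linkTo .      # ...that cites another item
      ?linkTo wdt:P859 wd:Q60346 .   # ...which is also NIOSH-sponsored
    }

(Because the P2860 triple is not optional, only papers that cite at least one other NIOSH paper are returned, presumably what the "much better" variant at 19:41 does to drop the unconnected nodes.)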
[19:52:19] to be honest, for the most complicated queries I still prefer to ask instead of trying myself :s
[19:52:25] even if I'm getting better
[19:52:54] (hence my last query, which made me proud)
[19:52:55] I need to take a class in SPARQL. It's all nonsense to me :(
[19:54:19] I had a SPARQL workshop but I was the only one in the group who had never coded anything, so it wasn't that easy
[19:54:38] and I did a little training on my own afterwards, but still