[08:42:42] Hello there, nice Monday to everybody ^^ If you have something noteworthy to add to the WeekSum, that's the perfect moment :) https://www.wikidata.org/wiki/Wikidata:Status_updates/Next
[09:36:25] LeaAuregann_WMDE : https://twitter.com/alphosWP/status/769093806650224641 but it's a bit old now :p
[09:37:13] (photo gallery of actors who played Hobbits in peter jackson's LoTR trilogy)
[09:38:57] (if the source has to be public, i'll push it to my blog later today)
[09:39:13] Alphos: cool! I'll add it :)
[09:39:29] or wait
[09:39:38] I can source with the tweet, but if you have a blog post it's even better
[09:41:03] [11:39:49] i think it'd be better if i did push it to my blog, to add the subtext that we're missing a lot of cast members on that film series ^^
[11:40:10] (there are more than just these six)
[09:41:43] it can wait until next week ^^
[09:42:58] Alphos: as you wish :)
[09:43:09] thanks ^^
[10:02:05] Thiemo_WMDE: Jonas_WMDE: just checked, and I can't find any browsertest-related failures for patches after https://gerrit.wikimedia.org/r/#/c/307771/ got merged
[10:02:30] so if there is one, please send me the link and I'll have a look
[10:36:16] hi frimelle :)
[10:36:26] hey hoo \o/
[10:36:43] Secretly coding already? :D
[10:36:54] Code reviewing
[10:47:38] Nice!
[10:47:43] :3
[10:51:21] DanielK_WMDE: pinging you because I'm not sure who to ask of the folks that work with wikidata: who among you is involved with hardware capacity planning for wikidata?
[10:51:44] apergos: No one
[10:51:58] There's no specific hardware for Wikidata except WDQS right now
[10:52:15] and these are boxes mostly managed by wmf's search team
[10:52:25] hm ok
[10:53:18] so this makes it an internal ops matter
[10:53:20] good enough, thanks
[10:53:57] see you back in the regular channels
[10:54:00] :)
[10:57:31] "organizing something cool for Wikidata’s 4th anniversary" 🤔
[10:58:19] sjoerddebruin: FYI: I'll probably update the suggester today/ tomorrow
[10:58:33] changed the reminder in my calendar from 5 weeks to 4 weeks now
[10:58:35] so more updates
[10:58:42] Oh, you have reminders. :P
[10:58:56] Yes… the whole process is a little too manual… :/
[10:59:04] I've made some changes to musical groups, wonder how they turn out.
[11:10:27] sjoerddebruin: Yes, first news coming soon ;)
[11:10:59] I was promised professional Muse footage three years ago with #soon. ;)
[11:13:00] hoo: DanielK_WMDE: do you know what the process is for deleting a repository on gerrit?
[11:13:49] no idea
[11:14:30] we don't need https://gerrit.wikimedia.org/r/#/admin/projects/wikidata/browsertests anymore since we've moved the tests to Wikibase.git now
[11:14:44] Tobi_WMDE_SW: https://www.mediawiki.org/wiki/Gerrit/Inactive_projects I only know that
[11:27:54] that's the page indeed...
[12:33:45] What to do with https://www.wikidata.org/wiki/Q26214480#P31?
[12:33:46] P31 Fork of P29 (An Untitled Masterwork) - https://phabricator.wikimedia.org/P31
[12:56:06] * sjoerddebruin marks a date.
[13:38:36] Everyone, do you know someone familiar with the Wikidata API, living not too far from Hamburg, and available on September 18-19? (gosh, I'd like to make a query for that :p)
[13:38:38] frimelle: +2 https://gerrit.wikimedia.org/r/#/c/307124/
[13:38:52] (why are you not in #wikidata-feed?)
[13:48:46] edit: September 17-18
[14:09:45] LeaAuregann_WMDE: A SPARQL endpoint based on Wikidata people with their knowledge and calendars - would be cool but also ultimately creepy ;)
[15:03:12] http://wordpress.alphos.fr/2016/09/05/933/sparquickl-1-hobbits/ mostly in french, but queries and their comments are written in english ^_^
[15:03:57] Thiemo_WMDE: DanielK_WMDE: the failing browsertests all had the same problem.. seems the browser of the integration system sometimes simply fails to load any javascript.
[15:04:13] in that case the non-js version is shown and the test obviously fails
[15:04:58] I've now changed it to use firefox instead of chrome and rebased some of the patches that were failing on top of it.. seems they all pass now. fingers crossed
[15:09:30] moin :)
[15:09:41] Thiemo_WMDE: also I think the $.cookie problem was caused by this.. so I don't know whether https://gerrit.wikimedia.org/r/308566 is still necessary
[15:09:46] aude: moin moin
[15:09:48] :P
[15:09:51] heh
[15:10:35] "moin"… but it's not morning anymore ! :x
[15:11:01] not sure what timezone aude is currently in..
[15:11:06] Tobi_WMDE_SW: I believe this patch is still valid. There is a dependency not declared. Might work without this but having it declared is always better.
[15:11:16] Thiemo_WMDE: ok!
[15:11:29] * aude eating breakfast
[15:12:14] :D
[15:13:20] Alphos: "moin" applies regardless of time of day https://en.wikipedia.org/wiki/Moin
[15:14:17] TIL ^^
[15:14:47] it's always morning somewhere anyway
[15:14:49] :)
[15:16:02] nikki tell that to the british empire :p
[15:16:43] (says the frenchman whose country actually has the most time offsets :D )
[15:26:54] not that timezones are a good indicator anyway, you could always do what china did and insist on the same timezone everywhere :P
[15:27:44] would probably really confuse the people in the pacific but oh well, they'll get used to it :P
[15:50:16] Amir1: do you know what to do about https://gerrit.wikimedia.org/r/#/c/308139/ or do you need more info?
[16:01:39] $ wdtaxonomy Q811979
[16:01:40] /usr/lib/node_modules/wikidata-taxonomy/wdtaxonomy.js:74 .catch( e => { error(2,"SPARQL request failed!") } )
[16:01:56] Did I do something wrong, or is this subclasses tree just too big?
[16:06:21] DanielK_WMDE: IRC had issues
[16:06:37] Okay, that's enough for now. I'll start working on it ASAP
[16:07:30] Nemo_bis : i kinda think it would fetch all instances of classes or subclasses of Q811979, which is quite a lot : http://tinyurl.com/gw2cdwx
[16:08:05] Amir1: let me know if you have questions
[16:08:06] 33097 results in 12 seconds, without labeling
[16:08:18] thanks, sure :)
[16:09:41] Nemo_bis : you could build your tree from there : http://tinyurl.com/jf2e7a8
[16:14:39] I wouldn't expect https://www.wikidata.org/wiki/Q14627938 to be part of the taxonomy
[16:15:25] I'm just debugging https://github.com/alemela/wiki-needs-pictures/issues/19 , I think the Q811979 taxonomy is probably too big for a live query
[16:18:35] Nemo_bis and yet wd:Q14627938 wdt:P31 wd:Q19397522 . wd:Q19397522 wdt:P279 wd:Q204776 . wd:Q204776 wdt:P279 wd:Q811979
[16:18:35] P31 Fork of P29 (An Untitled Masterwork) - https://phabricator.wikimedia.org/P31
[16:18:35] P279 404 and 500 error pages - https://phabricator.wikimedia.org/P279
[16:18:48] ugh stashbot :'(
[16:18:57] ikr >_>
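
The two tinyurl links above are left as they are, but going by Alphos's description ("all instances of classes or subclasses of Q811979") a query of roughly that shape can be run against the public WDQS endpoint. The following Python sketch is illustrative only, not the exact query behind the links; the property path, User-Agent string and result handling are assumptions.

    # Sketch: fetch everything that is an instance of Q811979 or of any of its
    # subclasses from the public Wikidata Query Service (WDQS) endpoint.
    # P31 = "instance of", P279 = "subclass of"; the * walks the whole subclass
    # tree, which is why the result set for Q811979 is so large (~33k items).
    import requests

    WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

    QUERY = """
    SELECT ?item WHERE {
      ?item wdt:P31/wdt:P279* wd:Q811979 .
    }
    """

    response = requests.get(
        WDQS_ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "taxonomy-size-check/0.1 (example script)"},
        timeout=60,
    )
    response.raise_for_status()
    bindings = response.json()["results"]["bindings"]
    print(len(bindings), "items in the Q811979 taxonomy")

A result set of this size is also why, as noted above, the Q811979 taxonomy was considered too big for a live query in wiki-needs-pictures.
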
[16:50:13] SMalyshev: are you taking the day off?
[16:52:28] gehel: I'm here
[16:52:46] SMalyshev: Kool! I did not really want to break WDQS on my own...
[16:53:14] I checked the puppet side, I'm pretty sure it will not break anything on its own, we can deploy it first.
[16:53:47] the scap part, I understand it much less :( but we have a canary...
[16:53:52] gehel: ok, let's check it
[16:53:58] I'll get a coffee and we can get started...
[16:54:30] gehel: looks like I don't have ssh access to the codfw ones though
[16:54:56] SMalyshev: they are not yet deployed
[16:55:04] gehel: ahh ok :)
[16:55:21] I'm waiting for that patch to be merged, so we can install everything in the right place from the start
[16:55:41] ok let's see
[17:01:15] SMalyshev: puppet compiler output: https://puppet-compiler.wmflabs.org/3956/
[17:01:52] the only significant change is the systemd unit:
[17:02:01] -AssertPathExists=/srv/deployment/wdqs/wdqs/wikidata.jnl
[17:02:01] +AssertPathExists=/var/lib/wdqs/wikidata.jnl
[17:02:29] looks ok
[17:02:53] the symlink /srv/deployment/wdqs/wdqs/wikidata.jnl is not managed anymore
[17:03:14] and /etc/wdqs/vars.yaml is created, but unused (as far as I understand scap3)
[17:03:28] should be ok as long as scap manages the symlink
[17:03:50] and if it does not, we are screwed anyway
[17:05:04] ok, I'm merging the puppet part, disabling puppet and --dry-run-ing it first
[17:07:21] ok
[17:10:55] SMalyshev: puppet ok, blazegraph restarted and tests are still passing, I'm running puppet on 1002 as well
[17:12:15] looks ok to me too on wdq1
[18:19:00] Thiemo_WMDE: I noticed you started triaging bugs and set basically every user request to lowest
[18:19:06] And removed Wikidata from https://phabricator.wikimedia.org/T112140 . Why?
[18:19:26] There is more to Wikidata than the folks working from Berlin
[18:21:04] This is not what I did and not what I intended to do. Sorry if something I did triggered this impression.
[18:22:13] T112140 is about pywikibot *using* wikidata. Not about anything happening in the Wikidata project. I may be wrong but this was my impression.
[18:22:14] T112140: Provide a wrapper function in pywikibot around wbparsevalue - https://phabricator.wikimedia.org/T112140
[18:22:43] Sure, but all those tasks are also in the main Wikidata project
[18:22:55] What "all those"?
[18:24:24] Pywikibot Wikidata tasks
[18:24:38] Many other projects that use the Wikimedia Phabricator also use Wikidata stuff somehow. They do not add the tag "Wikidata" to all their tasks that use a Wikidata API somehow. This is of no use and would make the Wikidata tag useless.
[18:25:27] I have not seen Pywikibot doing this. Why should they? A specific "pywikibot-wikidata" tag would make much more sense, or a column on the pywikibot board.
[18:26:04] Didn't you just give the #def of tag?
[18:26:07] The Wikidata tag is for stuff that can be done in the Wikidata project, by staff or volunteers.
[18:26:20] I'm sorry? What "def"?
[18:28:22] And about the triage: I did exactly that, triaging. I skipped many untriaged tasks but I believe after more than two years I have enough information to do a sensible triage.
[18:29:02] I'm a human being after all, so I'm sure I'm wrong in some cases. If you disagree in some cases, feel free to change the priority or ask Lydia.
[18:29:35] Sorry, that came out wrong
[18:29:42] I'm happy to hear you're clearing the backlog
[18:30:19] I'm also trying to help as much as possible, even found some problems that were already fixed.
[18:32:51] There is no "right" way to triage. I would love to set everything to "unbreak now". I really do. Deciding what needs immediate attention and what can wait is hard and always hurts somebody.
I'm sorry for that. But somebody should do it nevertheless, I believe.
[18:32:55] Hope that helps.
[18:33:14] Hunting food now. ;-)
[18:35:13] * sjoerddebruin wonders why the aliases fields have autocomplete.
[18:43:03] DanielK_WMDE: Why do you give a CR-1 on a rebase conflict?
[18:43:09] Those persist across rebases
[18:43:12] you should use V-1
[18:43:20] https://gerrit.wikimedia.org/r/302199
[18:44:50] sjoerddebruin: https://www.wikidata.org/wiki/User:Multichill/Collections_by_number_of_illustrated_paintings Rijksmuseum still at the top after all
[18:44:58] <3
[18:45:11] anyway, got to go
[20:21:52] I'm planning to create items (and properties) on test.wikidata.org for the purpose of verifying that a bot does the right thing. Is that ok?
[20:22:29] "The Test Wikidata is for developers to test their code without breaking all of Wikimedia" hmmm
[20:22:43] Seems okay to me though
[20:44:26] dachary: that's what test.wikidata.org is for, afaik
[20:45:53] thanks for the confirmation. I'm not going to create a zillion items either, but... it's reassuring to know I'm not doing something completely out of the norm.
[21:31:04] so many identifiers https://www.wikidata.org/wiki/Special:Contributions/Josve05a - when will we start scraping all the info on the web? :p
[21:57:37] PROBLEM - puppet last run on wdqs1001 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[22:53:30] is there a way to run https://query.wikidata.org/ against test.wikidata.org instead of www.wikidata.org ?
[22:53:45] dachary: not really
[22:54:43] SMalyshev: ok :-) that saves me time looking for something that does not exist, thank you !
[22:57:20] what would be the easiest way to copy an item from www.wikidata.org to test.wikidata.org ? exporting it to json + importing it maybe ?
[22:59:47] dachary: probably. or just using the API - get item, set item?
[23:00:25] * dachary looking for something like that in pywikibot
[23:03:13] pywikibot or https://github.com/addwiki/wikibase-api
[23:22:07] SMalyshev: I'm writing a pywikibot based bot, I'd rather use the pywikibot API. I can't figure it out, but creating a new item from scratch is probably good enough for test purposes so I won't dig further.
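
A minimal pywikibot sketch of the "create a new item from scratch on test.wikidata.org" approach dachary settled on, assuming a configured user-config.py with an account allowed to edit test.wikidata.org; the label text, edit summary and the use of Q42 as a source example are illustrative placeholders, not part of the discussion above.

    import pywikibot

    # Data repository of test.wikidata.org (the playground for bot testing).
    test_repo = pywikibot.Site("test", "wikidata").data_repository()

    # Create an empty item and give it a label; the item only gets its Q-id
    # once the first edit is saved.
    item = pywikibot.ItemPage(test_repo)
    item.editLabels(labels={"en": "bot sanity-check item"},
                    summary="Creating a throwaway item for bot testing")
    print("Created", item.getID())

    # Reading an existing item from www.wikidata.org, e.g. to mirror some of
    # its data onto the test item, would look roughly like this:
    source_repo = pywikibot.Site("wikidata", "wikidata").data_repository()
    source_item = pywikibot.ItemPage(source_repo, "Q42")
    data = source_item.get()  # dict with 'labels', 'descriptions', 'aliases', 'claims'
    print(sorted(data.keys()))
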