[03:41:25] PROBLEM - Response time of WDQS on einsteinium is CRITICAL: CRITICAL: 11.11% of data above the critical threshold [300000.0]
[03:47:25] RECOVERY - Response time of WDQS on einsteinium is OK: OK: Less than 5.00% above the threshold [120000.0]
[11:36:44] DanielK_WMDE: Panir Masala, 7,90.
[14:51:04] aude: thanks for looking into the backport. i'll poke thiemo about the tests. we did verify... well, apparently not correctly.
[14:53:27] DanielK_WMDE: i would cherry-pick https://gerrit.wikimedia.org/r/#/c/322085/ (with test fixes) into wmf/1.29.0-wmf.3 wikibase
[14:54:15] then i think it's possible to update only wikibase and data-values/numbers
[14:54:30] e.g. composer update --prefer-dist -o wikibase/wikibase
[14:54:47] e.g. composer update --prefer-dist -o data-values/numbers
[14:55:08] addshore can also help
[14:55:18] *waves*
[14:56:11] :)
[14:56:34] i'm not sure i can be around or be much help at the next SWAT
[14:56:57] the evening one i can, but we should get this out sooner once it's ready
[15:04:42] "TypeError: undefined is not an object (evaluating 'transData[ lang ]')"
[16:05:16] SMalyshev: We might need your expert SPARQL help. Any idea if it's possible to pull https://www.wikidata.org/w/index.php?title=Wikidata:Database_reports/Constraint_violations/P650&oldid=401428985#Types_statistics from SPARQL without it timing out?
[16:05:36] Or would we just need to pull in all items and process it locally?
[16:29:43] hoo, aude: can you help me with the bugfix backport? I'm a bit lost.
[16:30:05] Doesn't jenkins run on deployment branches of Wikibase? I'd like to see the tests pass before merging the backport...
[16:30:10] https://gerrit.wikimedia.org/r/#/c/322119/
[16:30:16] addshore: or maybe you know
[16:30:22] *reads up*
[16:30:49] it should do, afaik
[16:30:57] but it doesn't
[16:31:02] not even after i added it manually
[16:31:03] it's still running, nearly done!
[16:31:10] oh? huh.
[16:31:30] doesn't it usually add V-1 when it *starts* running?
[16:31:33] whatever...
[16:31:41] https://usercontent.irccloud-cdn.com/file/xzNbP6I9/
[16:31:51] and nope, it doesn't. https://integration.wikimedia.org/zuul/ is your friend!
[16:32:07] that page confuses me ;)
[16:32:08] there you go, it has a +2 now :)
[16:32:12] cool!
[16:32:14] care to merge?
[16:32:33] *looks*
[16:32:38] addshore: i may need more hand-holding to get this into the build... can you help?
[16:32:46] yup (maybe) ;)
[16:33:11] okay, +2ed, now merging
[16:33:18] excellent, thank you!
[16:33:35] now... in the build... extensions/Wikibase is a submodule, right?
[16:34:07] so... i need to change the commit hash for the submodule now?... where?...
[16:34:14] * DanielK_WMDE hasn't used submodules in ages
[16:34:38] ewww submodules
[16:34:55] so, afaik you don't have to do anything with submodules :0
[16:35:10] you just have to make a new build for the branch for the Wikidata repo
[16:35:34] i have no idea how to do that >_<
[16:35:43] DanielK_WMDE: https://github.com/wikimedia/mediawiki-extensions-Wikidata has a "Manually update a build" section in the readme
[16:36:20] afaik, it should be as easy as cloning the build repo, running what it says, and then pushing a commit to gerrit for the branch
[16:36:33] ok, trying
[16:37:24] DanielK_WMDE: if you manage / get stuck give me a poke :)
[16:38:10] so i need to install npm first...
[16:38:17] yes ;)
[16:38:18] which pulls in node...
[16:38:28] just to run a measly build script.
[16:38:29] ugh
[16:38:56] oh, ASCII block drawing characters!
[16:39:01] haven't seen those in decades!
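For reference on the build step being attempted above, here is a rough, untested sketch of the manual build update as the conversation describes it: install the node toolchain, clone the build repo, run the Gruntfile, and push the result to Gerrit. The clone URL, the nodejs-legacy workaround, and the final commit step are assumptions; the branch name and the grunt invocation are taken from the discussion, and the repo's "Manually update a build" README section remains the authoritative procedure.

    # Sketch only; exact task names and paths may differ from the repo README.
    sudo apt-get install npm
    # Debian/Ubuntu install the interpreter as "nodejs", but the grunt scripts
    # invoke "node"; the nodejs-legacy package (or a symlink) provides that name.
    sudo apt-get install nodejs-legacy   # or: sudo ln -s /usr/bin/nodejs /usr/local/bin/node

    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/Wikidata
    cd Wikidata
    git checkout wmf/1.29.0-wmf.3        # deployment branch mentioned above
    npm install                          # installs grunt-cli locally under node_modules/
    ./node_modules/grunt-cli/bin/grunt   # run the build task defined in the Gruntfile
    # then commit the regenerated build and push it to Gerrit for review on that branch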
[16:39:08] you can probably also just read the Gruntfile and run each command manually
[16:39:27] i don't have grunt, it seems...
[16:40:08] addshore: grunt should come with npm, no?
[16:40:13] if not, where do i get it?
[16:40:24] * DanielK_WMDE has 15 minutes left in the day
[16:41:33] DanielK_WMDE: if you don't manage I can do it in a sec! :)
[16:41:56] i really feel like i should be able to do this :)
[16:42:02] but maybe rushing it isn't a good idea
[16:43:23] > ./node_modules/grunt-cli/bin/grunt
[16:43:25] /usr/bin/env: 'node': No such file or directory
[16:43:28] *sigh*
[16:43:55] this process is clearly designed for people who deal with npm/node/grunt all the time anyway.
[16:44:35] odd how it doesn't find your node install
[16:45:51] addshore: i don't *have* a node install
[16:45:54] well, i guess it just made one
[16:46:36] addshore: i just tried > sudo npm install -g
[16:46:54] it does *something*, but grunt is still not found
[16:47:16] do "npm install -g grunt-cli" ?
[16:47:52] but npm install in the Wikidata repo should install that :/
[16:47:57] but not globally
[16:48:29] addshore: well, do i want it globally? and doesn't that need sudo?
[16:50:31] addshore: ok, i now have global grunt, but it still doesn't find node.
[16:51:11] sudo apt-get install node ?
[16:51:31] i thought npm already pulled that in
[16:51:33] but i'll try
[16:52:26] addshore: already installed
[16:52:30] but it's called nodejs
[16:52:37] ok, giving up. this is bullshit
[16:53:04] addshore: can you update the build? it would be great if we could get this into SWAT in an hour
[16:53:31] oh wait. 19:00 UTC. that's in two hours, right?
[16:53:40] ok, anyway. i gotta run...
[16:54:16] oh, and why the HELL do we require node.js to make a build??
[16:55:11] addshore: this one needs the backport: https://gerrit.wikimedia.org/r/#/c/322092/
[16:55:24] ideally, we exclude the changes to placeholder, if there's a sane way to do that
[16:58:10] DanielK_WMDE: yes, I will try and get all of that done!
[18:05:07] aude: still around? It would be great if you could take a speedy look at https://gerrit.wikimedia.org/r/#/c/322133/ and give it a +1; if so, I'm fine to do it in SWAT! :)
[18:51:21] or hoo?
[18:53:10] Hi
[18:53:16] :)
[18:53:30] afaik the path i made is all good to go
[18:53:52] patch...
[18:53:54] Do I need to take care of SWAT?
[18:54:30] If you can, that would be amazing; if not, I can :)
[18:54:50] need to sync the whole of the Wikidata dir, right? but other than that it is like any other extension?
[18:55:01] It would have to be the later SWAT if I do it
[18:55:16] All of Wikidata is ok
[18:55:26] cool, yeah I can do it then :)
[18:55:32] Ok
[18:55:56] Idk if everything is merged?
[18:56:14] And tests pass
[18:56:21] afaik everything is, except for on the deploy branch of Wikidata; tests pass
[18:56:25] https://gerrit.wikimedia.org/r/#/c/322092/
[18:56:36] Ok
[18:56:40] that has the change in number and the change in wikibase :)
[18:56:49] Great
[18:57:05] [=
[18:58:08] In other news, I am getting Internet service installed at my new apartment tomorrow :)
[18:58:23] No longer will I need to tether
[19:06:27] :O
[19:37:39] Great. https://www.wikidata.org/wiki/Special:Contributions/PLbot
[20:02:47] hi
[20:03:10] hello
[20:03:47] i need help
[20:03:48] hi
[20:04:09] with neon
[20:04:21] ...
[20:04:39] if it's not related to Wikidata, you're in the wrong place
[20:14:02] sjoerddebruin: That looks nice indeed!
[20:14:23] I don't think it's correct. :P
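On the open question from 16:05 about getting the P650 type statistics out of SPARQL: a minimal sketch of one way to try it against the public query service follows. It groups items that have a P650 statement by their instance-of (P31) value, which is roughly what the constraint-violations report tabulates; whether it finishes inside the query service timeout depends on how many items carry the property, so treat it as a starting point rather than a guaranteed answer.

    # Count items with a P650 statement, grouped by their instance-of (P31) value.
    curl -G 'https://query.wikidata.org/sparql' \
      --data-urlencode format=json \
      --data-urlencode query='
        SELECT ?type (COUNT(DISTINCT ?item) AS ?count) WHERE {
          ?item wdt:P650 ?id .
          OPTIONAL { ?item wdt:P31 ?type . }
        }
        GROUP BY ?type
        ORDER BY DESC(?count)'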
[20:15:00] I'm talking about the reports being updated again
[20:15:20] More about the humans with missing claims
[20:58:39] autolist is dead: could you please help me with a SPARQL query?
[20:59:14] sure
[20:59:54] ok, with the WDQ-to-SPARQL converter, i can do it on my own
[21:00:14] but i just can't find the field where i can run actions on my output
[21:00:47] in old autolist i could run e.g. -31:1173 on my results
[21:01:42] you mean, remove all results with P31:Q1173?
[21:03:56] WikidataFacts: This was just an example. There was functionality to mass add/remove statements on your query results
[21:04:34] oh, you mean actually modify statements in Wikidata? not just tweak the results list?
[21:05:01] I don’t know if this is possible in WDQS
[21:06:30] it must be possible in PetScan; if not, it would be quite lame
[21:12:34] You can mass add/remove statements from PetScan
[21:15:09] multichill: then i must be dumb - i just don't see any box where i could do it
[21:23:42] i would just like to get a query of all physicists in enwp with PetScan
[21:24:19] due to clutter i run a union query (Physicists + People), but the query is running and running and never finishes
[21:30:56] is anyone able to help me overcome a timeout in PetScan?
[22:24:42] kopiersperre: what SPARQL query are you using?
[22:26:55] nikki: no SPARQL query at all
[22:26:58] i just would like to figure out the number of physicists in English Wikipedia in a timely manner
[22:29:01] i would like to get a category intersection of physicists and people, to exclude other things
[22:29:26] oh. I can't help then, sorry :/
[22:29:57] don't mind, it's my fault for asking in the wrong channel
[22:30:10] but PetScan is much slower than autolist
[22:30:20] i would really like to get autolist2 back!
[22:33:03] yeah, I'm sad to see autolist gone :(
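For the question that closes the log, a SPARQL alternative to the PetScan category intersection: count items whose occupation (P106) is physicist (Q169470) and that have an English Wikipedia sitelink. This is a sketch under the assumption that statement-based counting is acceptable; it will not match a category-based count exactly, since category membership and P106 statements do not line up one to one.

    # Count items with occupation "physicist" that have an article on English Wikipedia.
    curl -G 'https://query.wikidata.org/sparql' \
      --data-urlencode format=json \
      --data-urlencode query='
        SELECT (COUNT(DISTINCT ?item) AS ?count) WHERE {
          ?item wdt:P106 wd:Q169470 .
          ?article schema:about ?item ;
                   schema:isPartOf <https://en.wikipedia.org/> .
        }'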