[04:57:53] Hi. I noticed that some Freebase imported items don't have all the descriptions that they had in Freebase. I also noticed that those descriptions are the first paragraph of the related Wikipedia articles. So my question is: what is the best approach to access that paragraph in Wikipedia, to build a bot to update this kind of item?
[06:06:08] Hallo.
[06:06:32] This patch is very naive and untested: https://gerrit.wikimedia.org/r/#/c/239318/
[06:06:42] Can anybody check that it's not entirely silly? :)
[06:14:16] aharoni: I agree with Santhosh that the file should be copied into Wikibase instead of re-using core's image
[07:08:57] legoktm: whatever works, I'm not emotionally attached to it :)
[07:09:15] It's high-prio for us, because it blocks all our merges.
[08:13:52] Lydia_WMDE: can you please get anybody to look at https://phabricator.wikimedia.org/T113017 ? It blocks all our merges and it's probably in Wikibase code.
[08:22:51] aharoni: I will look into it
[08:23:02] benestar: thanks
[08:26:14] meh, copying the whole icon set into Wikibase seems a bit weird
[08:28:50] aharoni: that line of code has been there for about a year o_O
[08:29:14] benestar: yes, but it started to be tested only recently
[08:29:25] hmm, ok
[08:29:32] afaik this rule is never used in Wikibase
[08:29:57] we don't have icons as buttons
[08:30:29] remove it then? :)
[08:31:26] I'm not entirely sure about that, but it seems this is the best option
[08:34:18] aharoni: now get someone to merge my patch ;) https://gerrit.wikimedia.org/r/#/c/239330/1
[08:36:17] benestar: thanks. Where does Wikibase even have _toolbars_?
[08:36:51] aharoni: that's what we call all the [edit] things
[08:38:31] benestar: OK, I'd merge it, but it also fails tests.
[08:38:39] yes, some change in core I guess
[08:38:45] Use of ApiResult::setRawMode was deprecated in MediaWiki 1.25. :S
[08:38:47] Today's xkcd is very appropriate.
[08:38:51] * benestar looks into that
[08:45:11] benestar: mmmmm... does this mean that now all Wikibase merges are blocked?
[08:45:28] yep
[08:45:32] but I provided a fix already
[08:45:38] * benestar hopes it actually fixes that
[08:46:03] I just removed the call to setRawMode and that seems to fix it
[08:58:50] !nyan | benestar
[08:58:50] benestar: ~=[,,_,,]:3
[08:58:53] Where are those commits?
[08:59:43] JeroenDeDauw: https://gerrit.wikimedia.org/r/#/c/239334/ ?
[09:00:42] looks legit to me, I almost +2ed it, but you are much better than I am at knowing what it actually does.
[09:00:44] and https://gerrit.wikimedia.org/r/#/c/239330/
[09:03:01] benestar: https://gerrit.wikimedia.org/r/#/c/235627/
[09:22:22] aharoni: do the tests work for you now?
[09:22:59] benestar: checking
[09:31:25] benestar: https://gerrit.wikimedia.org/r/#/c/239312/ :(
[09:31:31] still looks related to Wikibase
[09:32:35] hashar: ^
[09:32:56] https://gerrit.wikimedia.org/r/#/c/239312/ is blocked by failures in Wikidata, on which it depends.
[09:33:10] hmm? I think the merged patches aren't in there yet
[09:33:17] where?
[09:33:23] but WikibaseQuality is also failing *argh*
[09:33:32] who can fix that?
[09:33:47] and how can it be that we are noticing it and you are not? :)
[09:34:48] are you using the WikidataBuild stuff?
[09:35:28] I'm not using anything in particular.
[09:35:35] ContentTranslation has a dependency on Wikidata when it runs tests.
[09:35:43] I don't know the details of how it is configured.
[09:37:07] I fixed something with hashar in this area a couple of weeks ago.
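
[Editor's note: the question at [04:57:53] goes unanswered in this log. The usual way to fetch the first paragraph of a Wikipedia article as plain text is the TextExtracts API (prop=extracts with exintro and explaintext). Below is a minimal, hedged Python sketch; the requests dependency, the function name, and the User-Agent string are illustrative choices, not anything taken from this log.]

    # A sketch of fetching a Wikipedia article's intro paragraph via the
    # TextExtracts API, for a bot that fills in missing item descriptions.
    # Function name and User-Agent are hypothetical placeholders.
    import requests

    def get_intro_paragraph(title, lang="en"):
        """Return the plain-text intro of a Wikipedia article, or ""."""
        resp = requests.get(
            f"https://{lang}.wikipedia.org/w/api.php",
            params={
                "action": "query",
                "prop": "extracts",
                "exintro": 1,      # only the text before the first heading
                "explaintext": 1,  # strip HTML markup
                "titles": title,
                "format": "json",
            },
            headers={"User-Agent": "description-bot-sketch/0.1"},
        )
        resp.raise_for_status()
        # "pages" is keyed by page ID; a single title yields a single entry
        pages = resp.json()["query"]["pages"]
        return next(iter(pages.values())).get("extract", "")

    print(get_intro_paragraph("Douglas Adams"))

[The write side (setting the description on the item) would normally go through a bot framework such as Pywikibot rather than raw API calls.]
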
[09:37:50] aharoni: benestar hello
[09:38:42] aharoni: seems "Use of ApiResult::setRawMode was deprecated in MediaWiki 1.25"
[09:38:44] is one error
[09:39:00] "ApiResult::getIsRawMode was deprecated in MediaWiki 1.25."
[09:39:01] being the other one
[09:39:01] hashar: that is supposed to be fixed in the Wikibase code already
[09:39:12] both come from extensions/Wikidata/extensions/ExternalValidation
[09:39:20] so patches to Wikidata should currently fail as well
[09:39:43] hashar: I'm talking about the latest failure at https://gerrit.wikimedia.org/r/#/c/239312/ .
[09:39:48] It has three failures.
[09:39:59] ah sorry
[09:40:12] there are five on https://integration.wikimedia.org/ci/job/mwext-testextension-zend/8966/testReport/
[09:40:14] :/
[09:40:31] #1 is about the missing file ui-icons_888888_256x240.png - and benestar already fixed this error and removed the reference to this file entirely from the Wikibase repo.
[09:40:50] but maybe the Wikidata repo is not updated from the Wikibase repo yet?
[09:40:55] (just a naive guess)
[09:41:04] oh
[09:41:16] but if you recheck, it still fails? :(
[09:41:43] Well, I can try, but I already did it _twice_ after benestar's fix was merged...
[09:42:20] ah
[09:42:32] if you look at the top of https://integration.wikimedia.org/ci/job/mwext-testextension-zend/8966/consoleFull
[09:42:43] the zuul-cloner clones several repos
[09:42:48] and clones the Wikidata extension
[09:42:51] not the Wikibase one
[09:43:02] so it seems Wikidata needs to be bumped to include whatever Wikibase patch fixed it
[09:43:22] hashar: OK, that's what I suspected. How is it done?
[09:44:47] aharoni hashar: talked about it with daniel and aude
[09:45:01] the Wikibase master is fixed now, but the WikidataBuild isn't updated yet, I guess
[09:45:11] also the WikibaseQuality stuff still needs updating
[09:45:41] aharoni: we make a build once a day currently
[09:45:49] hi aude
[09:46:01] * aude can look at quality and we can make a new build when it is all fixed
[09:46:02] so, two questions:
[09:46:17] 1. can it be made earlier today? because it blocks our merges.
[09:46:33] does that mean the Wikidata build was not passing tests?
[09:46:37] 2. why does it happen that we notice it in our builds, and you don't?
[09:46:39] hashar: probably not
[09:46:42] or do we need to add a job that ensures it is passing?
[09:46:49] aharoni: core broke our stuff overnight
[09:46:57] so we also just noticed it today
[09:47:00] we can make a build anytime, but it's automatically made once per day at minimum
[09:47:12] the whole trouble with extensions depending on each other is that whenever one breaks, that breaks all the others depending on it :\
[09:47:13] anyway, i am looking at quality
[09:47:27] (I am not blaming anyone :D )
[09:47:52] hashar: versioning and releases would fix that ;)
[09:48:14] fun with depending on mediawiki master :)
[09:48:18] apparently the last merged commit for Wikidata passed testextension-zend https://gerrit.wikimedia.org/r/#/c/238705/
[09:48:31] hashar: the breaking change came after that
[09:48:47] aude, hashar - I am also not blaming anyone :)
[09:48:51] :)
[09:49:19] It makes sense that if extension A fails, then extension B, which depends on extension A, also fails.
[09:49:28] yeah
[09:49:43] ideally a breaking change on A should detect that B is going to fail
[09:49:44] what's weirder is when extension A doesn't fail, but extension B does _because of extension A_.
[09:49:57] forcing A to add back compatibility code
[09:50:28] for the deprecation in core, the tests should break until all extensions have updated. But we don't run the tests of all extensions on mw/core patches
[09:51:09] hashar: should we? :)
[09:51:18] sounds like we should (really)
[09:51:32] that's the point of continuous _integration_, isn't it? :)
[10:05:18] so aude, can you make a build now?
[10:05:58] oh, looks like you're doing it right now?..
[10:06:06] it's the automatic one
[10:06:10] https://gerrit.wikimedia.org/r/#/c/239339/ ?
[10:06:29] to unblock you, what we could do is temporarily not run the ExternalValidation tests
[10:06:32] (while we fix it)
[10:06:38] that extension is not deployed yet
[10:09:11] I often say "whatever works" (kinda like Q692208) :)
[10:09:30] if you remember to fix them and re-enable them, I guess it's OK.
[10:11:16] * aude put it in our sprint and will do asap
[10:11:34] we'll remember tomorrow when the build fails again and we just don't merge it
[10:16:23] https://gerrit.wikimedia.org/r/#/c/239342/
[10:29:16] jzerebecki: is https://gerrit.wikimedia.org/r/#/c/239342/ ok for today? (to unblock aharoni)
[10:29:32] * aude looking at external validation but it's non-trivial to fix
[10:29:42] and it's in the sprint now
[10:48:57] aude: can we expect anything today?
[10:52:12] aharoni: new build already merged
[10:52:13] afaik
[10:52:20] oh
[10:52:26] or waiting on jenkins
[10:52:43] merged :)
[10:53:55] aude: and did you disable the WikibaseQuality tests?
[10:54:28] yes (well, there are 3 quality extensions and only one is affected)
[10:54:46] the one that's not deployed yet, etc. but we still want to fix it asap
[11:28:35] how can I get items with no claims on a specific local wiki? Is that possible via a query? (see the editor's note after this chunk of the log)
[12:22:20] reza1615: that is a bit tricky, but should be possible e.g. via https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service
[12:30:37] Have the Wikibase tests been fixed?
[12:30:39] aude: ^
[12:30:46] Greetings from Dresden, btw
[12:32:57] They have been, so nvm
[12:33:40] * aude waves to hoo !
[12:33:52] :)
[12:34:53] aude: I have weird test failures on repo locally... I tried to debug them for an hour on the train, but couldn't get anywhere :/
[12:35:05] ItemMoveTest fails, telling me the page already exists
[12:35:13] * aude raaaaaaaaaaaaaaaaaage at the quality external validation extension
[12:35:23] oh noes
[12:35:25] :)
[12:35:29] is it doing evil stuff?
[12:35:35] it is
[12:36:23] * hoo runs the tests via php phpunit.php --wiki wikidatawiki ../../extensions/Wikibase/repo/
[12:36:34] Would be nice if anyone could check whether it's only my setup
[12:36:39] I am struggling to get CSVs from the DumpConverter script
[12:36:50] so I have some data and can see how the extension is really intended to work
[12:37:00] mh
[12:37:06] https://phabricator.wikimedia.org/T113036
[12:37:08] it's a blocker
[12:37:26] and then we have to remove the use of rawMode in the extension
[12:38:38] hm... they told me it won't take more than 15-20 min.
[12:38:41] AFAIR
[12:39:11] * aude doing something wrong
[12:39:20] I'm trying it on my server now
[12:39:29] Do they have documentation?
[12:39:38] https://github.com/WikidataQuality/DumpConverter
[12:42:39] * aude gives up for now but shall try again soonish
[12:50:23] mh :/
[13:25:31] * hoo will look some more at the failing dump tests
[13:26:46] ok
[13:26:54] jzerebecki: do you know the query?
[13:29:01] reza1615: no, not from memory.
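
[Editor's note: the "items with no claims on a specific wiki" question at [11:28:35] can be answered with the SPARQL query service linked at [12:22:20]. A sketch under assumptions: the public query.wikidata.org endpoint, fa.wikipedia.org as a placeholder for "a specific local wiki", and "no claims" approximated as "no direct-claim (wdt:) statements".]

    # A sketch of the SPARQL approach suggested at [12:22:20]; the wiki URL
    # and the User-Agent are placeholders, not anything from this log.
    import requests

    QUERY = """
    SELECT ?item ?article WHERE {
      ?article schema:about ?item ;
               schema:isPartOf <https://fa.wikipedia.org/> .  # placeholder wiki
      FILTER NOT EXISTS {                                     # no direct claims at all
        ?item ?claim ?value .
        FILTER(STRSTARTS(STR(?claim), "http://www.wikidata.org/prop/direct/"))
      }
    }
    LIMIT 100
    """

    resp = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "no-claims-query-sketch/0.1"},
    )
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["item"]["value"], row["article"]["value"])
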
[13:41:59] * hoo wonders about DumpJsonTest on hhvm: https://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/jobs/80981832
[13:42:12] * hoo doesn't have hhvm on his new notebook, so can't test offhand
[13:43:31] Anyway, I'm leaving for now
[15:46:17] any idea why the Wikidata tests are failing in WikidataPageBanner? https://integration.wikimedia.org/ci/job/mwext-testextension-zend/8946/console
[15:46:23] trying to get some things merged for Monday
[15:47:06] ResourcesTest::testFileExistence with data set #750 ('/mnt/jenkins-workspace/workspace/mwext-testextension-zend/src/extensions/Wikidata/extensions/Wikibase/view/resources/jquery/wikibase/toolbar/themes/default/../../../../../../../../resources/lib/jquery.ui/themes/smoothness/images/ui-icons_888888_256x240.png
[15:47:25] is this doing what I think it is doing and trying to load assets from core?
[15:48:17] jdlrobson: mh, that should have already been fixed
[15:49:30] * jdlrobson tests
[15:49:42] thanks jzerebecki, can you point me at the gerrit link/bug?
[15:50:33] jdlrobson: https://phabricator.wikimedia.org/T113017
[15:51:26] thanks jzerebecki, and yeah, looks like it's working again, phew! :)