[07:07:17] hi
[07:11:27] !admin
[08:05:24] it's an addshore :DD
[08:05:37] indeed
[08:12:30] * sjoerddebruin should start working
[08:14:56] addshore: If you've got some time left, could you help me understand why the tests fail here? https://gerrit.wikimedia.org/r/#/c/235608/
[08:15:05] *looks*
[08:16:28] are you doing some join?
[08:16:55] ignore that, that was a random guess... hmmm Undefined index: formatterURL
[08:18:36] rather than doing isset use array_key_exists
[08:18:45] ?
[08:18:49] addshore: it only happens on sqlite
[08:18:57] so perhaps it isn't a php weirdness
[08:19:03] you mean hhvm weirdness
[08:20:57] it's documented as working in hhvm for array keys... who knows...!
[08:21:12] addshore: i mean, the same test works with mysql
[08:21:18] yeh I know :P
[08:21:48] as I said, you're not magically using / doing any sql joins, are you? ;)
[08:22:15] addshore: I didn't modify any sql stuff in there =o
[08:22:30] just added an additional field to the property_info table which is a simple BLOB
[08:22:32] but are you now using something that does joins? ;)
[08:23:04] me not, the store implementation maybe
[08:23:08] didn't look into that
[08:23:43] if so it won't work on sqlite
[08:24:14] yes the table builder does a join
[08:24:24] array( 'LEFT JOIN',
[08:24:24] array(
[08:24:24] 'pi_property_id = epp_entity_id',
[08:24:24] )
[08:24:24] );
[08:25:38] addshore: but why did it work before? o_O
[08:25:52] where is that called?
[08:26:39] it's possible the previous code path just wasn't triggering it, and you made it :)
[08:26:56] if you look in some other tests you'll see there are some that get skipped on sqlite for this same reason
[08:27:03] so just skip if on a sqlite db here ;)
[08:27:14] (unless you can find a way to avoid the join)..
[08:29:06] addshore: digging into it but perhaps skipping it on sqlite is the best option
[08:29:21] why the hell do we run our tests with sqlite if we don't support it anyway?
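A minimal sketch of the "just skip it on sqlite" suggestion above, written as a Python unittest purely for illustration (the real Wikibase tests are PHPUnit, and `DB_TYPE` plus the test body are hypothetical stand-ins):

```python
import unittest

# Hypothetical: in MediaWiki this would come from the configured database type.
DB_TYPE = "sqlite"

class PropertyInfoTableBuilderTest(unittest.TestCase):
    def setUp(self):
        if DB_TYPE == "sqlite":
            # Joins against the temporary tables used in tests don't work on
            # sqlite, so skip rather than fail (mirroring other skipped tests).
            self.skipTest("LEFT JOIN on temporary tables unsupported on sqlite")

    def test_rebuild(self):
        # Would exercise the join-based table rebuild here.
        self.fail("never reached on sqlite")
```

The skipped test still shows up in the runner's report, which keeps the known limitation visible instead of silently green.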
[08:37:46] benestar: we do, the joins just don't work on temporary tables (which are used in tests)
[08:38:11] ah, ok
[08:38:25] I still don't understand why the joins should cause an issue here...
[08:38:26] but well
[08:38:32] or so I am told ;)
[08:41:40] aargh, still those annoying spinners when you work too fast when adding a statement :(
[08:41:55] addshore: I will let daniel have a look at it :)
[08:49:08] Any idea for https://translatewiki.net/wiki/Thread:Support/About_MediaWiki:Wikibase-api-no-such-sitelink/en ?
[08:52:01] Nemo_bis: I think we don't translate the api messages as they are always returned in English
[08:53:43] but that doesn't make sense because we display them in the ui afaik
[08:53:45] Lydia_WMDE --^
[08:55:16] All API stuff is localised nowadays.
[09:00:31] Not the easiest stuff to translate
[09:39:52] addshore: hmm, so I guess I can't come in before 1, wondering if that's worth it at all
[09:40:02] since I've to leave earlier
[09:40:10] YuviPanda: what time do you leave?
[09:40:17] i think it's still worth it :)
[09:40:38] aude: to SF? I have a week
[09:40:57] aude: so I'd come in tomorrow as well and maybe Monday too
[09:42:05] Lydia_WMDE: daily?! :)
[09:44:31] benestar: there were discussions but i don't remember what the reasons were atm
[09:44:38] benestar: jep, Tobi_WMDE_SW_NA is setting it up
[09:58:45] YuviPanda: hmmm, ok
[09:58:53] * aude might not be here tomorrow, but definitely on monday
[10:28:53] aude: I'm actually coming in. See you in like an hour
[11:07:23] YuviPanda: \o//
[11:27:03] YuviPanda: see you soon! make sure you get on the right train this time!
[11:51:19] DanielK_WMDE: https://www.wikidata.org/wiki/Wikidata:Project_chat#Why_are_links_to_wiki_items_displayed_in_a_foreign_script.3F
[11:51:34] apparently labels get added to plain links on wiki pages as well?
o_O
[12:23:01] Lydia_WMDE: https://phabricator.wikimedia.org/T111338
[12:39:10] addshore: https://phabricator.wikimedia.org/T111322 (see all the linked patches)
[12:39:39] need review, and make sure i didn't miss anything important
[12:40:14] and i am getting random (sometimes) qunit errors
[12:45:14] * aude wonders, when other people run phpunit tests on wikibase, if they all pass?
[12:48:58] benestar: yay, I have phragile installed ;)
[12:49:05] aude: not for me ;)
[12:49:10] looking now
[12:51:56] aude: also have to make them use the new wikibase/data-model-serialization :/
[12:52:13] addshore: even if i remove suggester and quality stuff
[12:52:31] i actually get a lot of failures... some i've been getting for a while, but seems to be more now
[12:52:39] https://gist.github.com/filbertkm/43edcbaa37bfbd89cf5f
[12:52:50] * aude is going to go through these now, one by one
[12:52:59] I haven't run all of the tests in quite a while now
[12:53:16] things that depend on settings, things that use my real database, no idea....
[12:55:17] aude, or tests that jenkins is skipping
[12:56:23] if they don't have one of these @group tags then jenkins will miss them... Wikibase, WikibaseAPI, Purtle, WikibaseClient
[12:57:03] well, I guess they would still be run for the build now, but I remember we had that problem a few times before
[12:58:56] i ran with --group Wikibase
[12:59:16] i normally just point phpunit at a directory, and maybe find more failures that way
[12:59:33] maybe
[12:59:41] * aude making tasks
[12:59:49] aude: will you amend those patches that also need the new version of serialization? :)
[12:59:51] getting annoyed
[13:00:14] addshore: oh noes
[13:00:55] nothing big has changed
[13:01:07] the main break / basically the only break was the change of the interface contracts
[13:01:09] all of the serialization components?
[13:01:11] in services
[13:02:05] services 2.0.0, serialization 2.0.0, internal serialization 2.0.0 (not that anything will use that), and datamodel 4!
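The "@group tags or jenkins misses them" point above can be sketched as a quick audit script: scan PHPUnit test sources for `@group` annotations and flag files that carry none of the groups the CI job selects. This is a hypothetical helper, not an existing tool; the group list is the one quoted in the chat.

```python
import re

# Groups the Jenkins job runs with `phpunit --group ...` (from the chat above).
JENKINS_GROUPS = {"Wikibase", "WikibaseAPI", "Purtle", "WikibaseClient"}
GROUP_RE = re.compile(r"@group\s+(\S+)")

def missed_by_jenkins(source: str) -> bool:
    """True if a test file's docblocks contain none of the CI-selected groups."""
    groups = set(GROUP_RE.findall(source))
    return not (groups & JENKINS_GROUPS)
```

Pointing phpunit at a directory, as aude does, sidesteps the problem entirely, since directory-based runs ignore group filtering.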
[13:03:13] * aude finds this confusing
[13:03:58] i think 2 of them are fine
[13:04:39] but external validation needs a data-model-serialization update too
[13:04:49] yep
[13:04:51] i see
[13:05:05] and hmm, QualityConstraints is failing for some other reason...
[13:05:37] wtf, QualityConstraints said "The requested package wikibase/data-model-services ~1.1"
[13:05:45] almost like it was using the old composer.json....
[13:06:42] addshore: probably because it's pulling WikibaseQuality master
[13:06:57] ahh probably, yes, so there I have to merge Quality first ;)
[13:07:07] cool, so just external validation to fix!
[13:07:12] fixed
[13:08:31] i wonder if it would be possible to remove any of those dependencies from those extensions easily
[13:08:40] addshore: \o/
[13:10:01] addshore: would be nice
[13:11:01] dumb qunit tests now
[13:11:09] no idea why
[13:14:20] me neither :D
[13:16:45] aude, I will just keep giving it +2s until it actually goes through ;)
[13:16:51] ok
[13:18:43] benestar: can you file a bug about the "magic label in wrong language on wikitext pages" thing? seems strange indeed.
[13:19:27] DanielK_WMDE: the issue you found was only introduced by me in the latest patch set :S
[13:19:32] the tests are still failing in https://gerrit.wikimedia.org/r/#/c/235608/
[13:20:55] benestar: what was the issue reported on project chat?
[13:21:25] aude: on the page linked, normal [[Q123]] links got prepended by a label in Russian
[13:21:26] [1] https://www.wikidata.org/wiki/Q123
[13:21:33] oh
[13:21:39] just like we do for special pages, but on a wiki page
[13:21:44] after purging they disappeared
[13:21:54] strange
[13:22:00] Linker strangeness
[13:55:52] perhaps it was hungry and ate the links?
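The conflict above comes from Composer's tilde constraint semantics: `~1.1` means ">=1.1, <2.0", so a package still requiring `wikibase/data-model-services ~1.1` cannot resolve against the new 2.0.0 releases. A simplified illustration of that range check (two-part `~X.Y` constraints only, not Composer's full parser):

```python
def tilde_allows(constraint: str, version: str) -> bool:
    """Simplified Composer-style "~X.Y" check: >=X.Y.0 and <(X+1).0.0."""
    base = [int(p) for p in constraint.lstrip("~").split(".")]
    lower = base + [0] * (3 - len(base))   # ~1.1 -> >= 1.1.0
    upper = [base[0] + 1, 0, 0]            # ~1.1 -> <  2.0.0
    v = [int(p) for p in version.split(".")]
    v = v + [0] * (3 - len(v))
    return lower <= v < upper
```

So `~1.1` accepts any 1.x release from 1.1 up, but rejects 2.0.0 — exactly the failure QualityConstraints hit until its master picked up the new constraint.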
[14:06:08] DanielK_WMDE: nice, there is also a JobQueueGroup::singleton()->get( )->getSize()
[14:06:39] so we could have one that just keeps spitting jobs out until the size of our queue becomes close to the number of clients we have, then start slowing down ;)
[14:07:29] to me the biggest question is when, where and how does the initial / an initial job get fired
[14:10:34] * aude still thinks on page save
[14:11:06] and could be some percentage of saves
[14:11:09] yeh, could do, and we could have a configurable $dispatchRate thing too, like the job runner rate, so it could default to 1
[14:11:14] yep
[14:11:14] for wikidata more like 0.001
[14:11:19] 0.01..
[14:11:19] exactly
[14:11:22] :)
[14:11:44] and the batch size would relate to that
[14:11:44] and if these jobs just wrap the logic we already have, should all work nicely...
[14:11:51] :)
[14:12:01] aude: how long does the dispatch script take to dispatch stuff to 1 wiki currently?
[14:19:24] also aude https://wikitech.wikimedia.org/wiki/Graphite#Record_data
[14:19:51] we could just set a cronjob running somewhere, sending the current dispatch length to graphite
[14:20:41] imo that would be epic
[14:23:48] using graphite would be great
[14:24:10] * aude used to have a tool that used rrd, but would use graphite today
[14:24:32] and how long, can't say
[14:53:33] aude: what's the machine name the dispatch stuff runs on? :P
[14:55:43] addshore: ambiguous :p
[14:57:43] A JohnFLewis has been detected. Everyone hide!
[14:58:36] JeroenDeDauw: you can run but you can't hide!
[15:07:45] jzerebecki: btw, you know of puppetswat, btw?
[15:07:58] (just curious / wondering)
[15:08:33] YuviPanda: yes and I think it is a good thing to have :)
[15:08:48] jzerebecki: ok!
[15:13:01] addshore: it runs on terbium
[15:13:09] =]
[15:13:25] jzerebecki: Online? https://gerrit.wikimedia.org/r/#/c/235733/ succeeded. meeeeerge!
[15:16:27] YuviPanda: this really could make some lovely graphs.....
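The cronjob idea above maps directly onto Graphite's plaintext protocol: one `"<metric path> <value> <timestamp>\n"` line sent to port 2003. A hedged sketch, where the host name, metric path, and the existence of a `get_dispatch_lag()` helper are all hypothetical:

```python
import socket
import time

GRAPHITE_HOST = "graphite.example.org"  # hypothetical host
GRAPHITE_PORT = 2003                     # Graphite's plaintext-protocol port

def format_metric(path, value, timestamp=None):
    """Build one line of Graphite's plaintext protocol."""
    if timestamp is None:
        timestamp = int(time.time())
    return f"{path} {value} {timestamp}\n".encode()

def send_metric(path, value):
    """Push a single datapoint to Graphite (called from a cronjob)."""
    with socket.create_connection((GRAPHITE_HOST, GRAPHITE_PORT), timeout=5) as sock:
        sock.sendall(format_metric(path, value))

# e.g. in the cronjob: send_metric("wikidata.dispatch.lag", get_dispatch_lag())
```

Run every minute or so, this would give exactly the dispatch-length graphs being wished for.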
[15:16:40] https://usercontent.irccloud-cdn.com/file/6ZMBycws/
[15:23:46] https://usercontent.irccloud-cdn.com/file/DDdDiOrb/
[15:31:32] addshore: https://www.mediawiki.org/wiki/User:Aude/Profiling :)
[15:31:37] =]
[15:32:06] * aude will organize this session again this year
[15:32:41] YuviPanda: where did you say the docs are of what class count max min mean p90 etc. actually mean in terms of this data?
[15:33:43] addshore: 'statsite extended counters'
[15:45:33] aude: what methods could I look at for the dispatching stuff?
[15:45:54] *will go and dig himself actually*
[15:48:15] addshore: getPendingChanges
[15:48:41] https://phabricator.wikimedia.org/T109088#1595726
[15:54:05] bah, no profiling for it apparently, aude :P
[15:54:43] nothing for that class at all
[15:54:47] because it's on terbium
[15:54:49] e.g. php 5.3
[15:55:06] it looks like the profiling is set up for hhvm or php, but i guess only for web requests
[15:55:08] * aude hacked around to get these profiling samples
[15:55:23] not for cronjobs or maintenance scripts like these
[15:55:37] on terbium, especially
[16:54:16] Lydia_WMDE: can https://phabricator.wikimedia.org/T111343 be in the sprint?
[16:54:34] the test failures somewhat interfere with my ability to review stuff and might be actual bugs
[17:00:19] I am making progress, I almost have a good representation of wikidata in a database :)
[17:00:28] python2 unicode is a pain :(
[17:01:11] I have a better understanding of wikidata json dumps. Now I'm convinced it's some kind of hypergraph
[17:01:31] for which there is no easy free software solution
[17:01:43] i guess rdbms works most of the time
[17:07:47] http://i.imgur.com/LrHxg27.png this seems like the kind of weird maps we could generate from the extensive wikidata database :)
[17:09:51] if only I had any idea what it's a map of :P
[17:10:35] "The map shows the eye colours of the heads of state."
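The statsite "extended counters" asked about above (count, min, max, mean, p90, ...) are just summary statistics computed over the raw samples in each flush window. A rough sketch of what they report; note this uses a simple nearest-rank percentile, which is an approximation and not necessarily statsite's exact interpolation:

```python
def extended_counters(samples):
    """Summary stats in the style of statsite's extended counters."""
    ordered = sorted(samples)
    n = len(ordered)
    return {
        "count": n,
        "min": ordered[0],
        "max": ordered[-1],
        "mean": sum(ordered) / n,
        # p90: the value below which ~90% of samples fall (nearest-rank approx).
        "p90": ordered[int(0.9 * (n - 1))],
    }
```

So a "p90" of a timing metric means 90% of the profiled calls in that window were at least that fast, which is usually more informative than the mean for spotting slow outliers.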
[17:11:08] ah
[17:11:19] impossible to guess :)
[17:11:54] aude: Lydia_WMDE jzerebecki JeroenDeDauw if you can think of any stats you might like to look at from the profiling, let me know
[17:12:01] and I'll see if I can pull them out
[17:12:19] * aude wants to explore in the dashboard
[17:12:35] wants change dispatcher stats :P
[17:12:35] aude: please do not name branches as valid semver versions as you did in https://github.com/Wikidata-lib/PropertySuggester/pull/147
[17:12:43] JeroenDeDauw: ok
[17:12:46] * addshore is putting a presentation together ;)
[17:12:51] \o/
[17:13:05] aude: certain tools treat those as releases
[17:13:13] evil tools
[17:13:21] no, evil aude :)
[17:14:34] bah
[17:16:28] someone want to merge https://github.com/Wikidata-lib/PropertySuggester/pull/147 ?
[17:16:33] addshore: ^
[17:19:35] how can i delete a tag?
[17:20:15] aude: one already pushed to the public remote?
[17:20:19] yeah
[17:20:28] i thought my pull request was merged :/
[17:20:30] aude: you can do that, but it is really bad practice
[17:20:37] yeah, i know....
[17:20:44] but if it is quick
[17:21:11] aude: https://nathanhoad.net/how-to-delete-a-remote-git-tag
[17:21:33] ok
[17:22:01] * aude wants to retag
[17:22:04] aude: protip: always do a git log before you tag
[17:22:07] and might have to poke packagist
[17:22:10] * aude nods
[17:25:16] addshore: no need to cr-1 a style issue already caught by the ci https://gerrit.wikimedia.org/r/#/c/235627/
[17:25:23] addshore: now when I fix it the -1 stays
[17:25:33] nah, if you fix it the -1 should go
[17:25:44] they only stick through rebases and commit message changes, don't they?
[17:28:45] addshore: will find out
[17:32:26] addshore: meh..
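The branch-naming complaint above is worth unpacking: package tools such as Composer parse anything that looks like a version string as a release, so a branch named `2.0.1` can shadow or masquerade as a real tag on Packagist. A simplified check for that (a rough pattern, not Composer's actual version parser):

```python
import re

# Rough approximation of "would a packaging tool read this branch name as a
# version?" — real parsers (e.g. Composer's) are considerably more permissive.
SEMVER_RE = re.compile(r"^v?\d+\.\d+(\.\d+)?([.-].*)?$")

def looks_like_release(branch_name):
    """True if a branch name could be mistaken for a release version."""
    return bool(SEMVER_RE.match(branch_name))
```

Descriptive branch names like `fix-tests` or `update-serialization` avoid the problem entirely.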
the lookup interfaces and implementations in lib are such a mess exception-wise >_> Trying to fix https://gerrit.wikimedia.org/r/#/c/235627/
[17:32:39] Though I'm finding a lot of things and I don't know which one is causing this
[17:37:54] addshore: you're right - -1 not sticky
[17:38:24] Hey, I'm building the classifier for an anti-vandalism tool. One quick question: Is it technically possible for someone to add a statement and remove another one in the same edit, e.g. add P31:Q5 and remove P21:Q20?
[17:51:22] Amir1: I think that's possible using the api
[17:51:40] So a bot might be doing that?
[17:51:57] yeah
[17:52:26] I'll figure something out to handle situations like this
[17:54:09] thanks multichill
[17:54:12] :)
[18:14:31] Amir1: yes it is :)
[18:15:27] thanks, I'll fix it using the id
[18:15:35] guid probably
[18:52:00] addshore, aude: yes, exactly, the initial trigger job would be scheduled on page save. It would be delayed e.g. for one minute, so the edits done during one minute would only cause one trigger job to be executed.
[18:52:55] But then, maybe that's not even needed. Just run it. The dispatcher itself takes care not to push any changes that are too young.
[18:56:17] Amir1: it is also possible to remove 3 claims, add 2 references, change 1 qualifier, remove 2 sitelinks, add 2 different ones, add 2 badges to another sitelink, alter 6 labels, add 12 aliases and remove 7 descriptions ;)
[18:56:49] oh boy
[18:56:53] all while also changing the precision of 2 coordinates and the month of 6 dates ;)
[18:57:13] basically doing everything in a single edit is possible
[18:57:38] you can take the entity from one state to a totally new state, and everything about the entity can change in that 1 edit
[18:59:20] Amir1: how were you thinking of handling it all? ;)
[18:59:20] Amir1: basically, via the api, you can just replace the entire JSON of an entity. not much magic involved. you just feed it in.
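The GUID-based approach Amir1 settles on can be sketched like this: since one edit may replace the entire entity JSON, the classifier has to diff two revisions by statement GUID rather than assume one change per edit. The entity dicts here follow the Wikibase JSON shape (`claims` mapping property IDs to statement lists, each statement carrying its GUID in `id`), but are simplified illustrations:

```python
def diff_statements(old, new):
    """Diff two entity-JSON revisions by statement GUID.

    Returns (added, removed, changed) GUID lists, so an edit that adds one
    statement and removes another in the same save is reported as both.
    """
    def by_guid(entity):
        return {
            statement["id"]: statement
            for statements in entity.get("claims", {}).values()
            for statement in statements
        }

    old_sts, new_sts = by_guid(old), by_guid(new)
    added = sorted(set(new_sts) - set(old_sts))
    removed = sorted(set(old_sts) - set(new_sts))
    changed = sorted(g for g in set(old_sts) & set(new_sts)
                     if old_sts[g] != new_sts[g])
    return added, removed, changed
```

The same idea extends to labels, descriptions, aliases, and sitelinks, each keyed by language or site ID instead of GUID.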
[18:59:41] I'll try and send you that email I promised this weekend ;)
[19:06:29] addshore: using the guid of claims
[19:09:26] are you going to look at tackling sitelink and label / description / alias vandalism too?
[20:54:49] wow, this went unnoticed for two years https://gerrit.wikimedia.org/r/#/c/235861/
[20:55:12] :O
[20:56:33] haha wat
[20:56:37] benestar|cloud: ---^
[20:56:57] JeroenDeDauw: i must have been rather distracted when i wrote that. forgetting an index, ok, but why did i declare one on a blob field instead?
[20:57:38] Sailed through review tho
[20:57:53] DanielK_WMDE: yeah, declaring one on a blob field is win
[20:58:06] yeah, all tests passed, since nothing tested a REPLACE query against that table.
[20:58:16] until bene wrote a patch adding new info to that table
[21:38:45] https://www.wikidata.org/wiki/User:Ladsgroup/Kian/Possible_mistakes/enHuman
[21:38:48] lots of mistakes
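The missing-index bug above is subtle precisely because REPLACE only replaces when a PRIMARY KEY or UNIQUE index identifies the conflicting row; without one, it silently degrades to a plain INSERT and duplicates pile up. A demonstration of that failure mode using sqlite3's equivalent `INSERT OR REPLACE` (table and column names are loosely modelled on property_info, not the real schema):

```python
import sqlite3

def replace_count(create_sql):
    """Run the same REPLACE twice against a fresh table; return the row count."""
    conn = sqlite3.connect(":memory:")
    conn.execute(create_sql)
    conn.execute("INSERT OR REPLACE INTO pi (pi_property_id, pi_info) VALUES (1, 'old')")
    conn.execute("INSERT OR REPLACE INTO pi (pi_property_id, pi_info) VALUES (1, 'new')")
    return conn.execute("SELECT COUNT(*) FROM pi").fetchone()[0]

# With a unique key, the second REPLACE overwrites the first row.
with_index = replace_count(
    "CREATE TABLE pi (pi_property_id INTEGER PRIMARY KEY, pi_info BLOB)")

# Without one (or with the index on the wrong/blob column), both rows survive.
without_index = replace_count(
    "CREATE TABLE pi (pi_property_id INTEGER, pi_info BLOB)")
```

This also explains why every test passed for two years: until a REPLACE was actually issued twice for the same key, both schemas behave identically.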