[00:42:08] aude: Is that bug you just opened with the version in the build? [00:42:37] (the new build) [00:42:50] hoo: probably is the same [00:43:01] :/ [00:43:03] * aude uses build resources + composer update [00:43:12] can take a look.... probably easy fix [00:43:12] Yeah, that's probably the same then [00:43:40] * aude has no idea what has been merged into master [00:45:37] mh [00:46:25] I suspect 2.2.1 introduced the breakage, not 2.3.0 [00:47:12] 2.2.1 what? [00:47:20] serializers? [00:47:54] PropertySuggester [00:48:03] But I don't see anything obvious [00:48:07] i doubt it [00:48:08] just lots of changes regarding terms [00:48:32] the problem is suggestion of values, not properties [00:48:57] * aude double checks the problem is limited to values [00:49:05] nope everything [00:49:25] oh, mh [00:49:28] in that case [00:50:13] Also travis is broken, but I have no clue why... [00:50:21] even played with it in my own github [00:50:30] Also broken, but no idea why -.- [00:52:13] Ok, can reproduce the breakage on master [00:52:19] even w/o PropertySuggester [00:54:43] think i'm close [00:55:59] Great :) [00:56:34] just trying to understand the new code (undoubtedly better, but different now) [01:04:56] adding tests now [01:05:11] :)) [01:17:46] i still see more issues [01:19:02] Despite your patch (unrelated to that), or same area? [01:20:23] hoo: related, but don't think it invalidates my patch [01:20:27] https://phabricator.wikimedia.org/T104273 [01:20:55] updated https://www.mediawiki.org/wiki/Wikidata_deployment hope I have everything relevant [01:21:18] i think i'll let addshore take a look at the rest [01:21:29] it's not a horrible issue [01:22:03] hoo: also the font style / headings in the site links section change [01:22:16] serif (h2) => sans-serif (h3) [01:22:25] Can you link the gerrit change?
[01:22:46] it's one of benestar|cloud's things [01:23:29] https://gerrit.wikimedia.org/r/#/c/203874/ [01:24:09] Just found it myself [01:24:11] :D [01:24:15] https://gerrit.wikimedia.org/r/#/c/203681/16/repo/includes/content/EntityHandler.php (parser version thing got bumped) [01:24:26] Oh, interesting [01:24:41] That will make the alternate links propagate faster [01:24:51] ah [01:26:42] alright, time to eat [01:27:17] Thanks for all the hints :) [01:27:30] sure [01:27:43] good luck with the deploy [01:28:15] Should merge your patch and get that ready today [01:28:43] if you want (or can let addshore look) [01:28:55] anyway, fooooood :) [02:10:54] * hoo should call it a day [09:45:18] multichill: https://www.wikidata.org/w/index.php?title=Q20113704&curid=21786519&diff=225338140&oldid=224713282 [09:48:07] benestar: jdlrobson: in the case of that method rename I do not think the issue is how we deal with deprecation, simply that none was done in this case. [09:48:13] We could have kept the old name for some time [09:48:27] Or we could simply not have made the rename... [09:58:40] jzerebecki: you've added this to the sprint: https://phabricator.wikimedia.org/T103626 is that something we are still going to do? [10:29:25] when will the Wikibase client be deployed on wikis? [10:37:38] benestar: https://www.mediawiki.org/wiki/Wikidata_deployment [10:39:12] Tobi_WMDE_SW: yes we need to resolve that task [11:57:24] Lydia_WMDE: still in a meeting.. :( [15:55:07] addshore: https://phabricator.wikimedia.org/T104273 [15:55:18] what do you think? [16:06:24] blergh [16:20:26] addshore :P [16:29:35] benestar: you have to put the NyanCat on the unlocked laptop next to you - for great justice! [16:29:45] url?
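The deprecation approach mentioned above ("We could have kept the old name for some time") is the standard soft-rename pattern. A minimal sketch in Python, for illustration only; the real code is PHP, and both method names here are hypothetical:

```python
import warnings

def get_sitelink_list(item):
    """Hypothetical new method name after the rename."""
    return item["sitelinks"]

def get_site_links(item):
    """Deprecated alias kept around for a release cycle so callers can migrate."""
    warnings.warn(
        "get_site_links() is deprecated; use get_sitelink_list()",
        DeprecationWarning,
        stacklevel=2,
    )
    return get_sitelink_list(item)
```

Callers keep working for a release cycle while the warning nudges them toward the new name; the alias is then dropped in the next breaking release instead of breaking everyone at once.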
[16:34:53] benestar: you can also "telnet nyan.danno.us" on the cli ;p [16:35:01] *dakko.us [16:38:22] benestar: hey, have some questions for you about https://phabricator.wikimedia.org/T90115 [16:38:43] benestar: specifically, about deleting properties - what exactly happens when a property is deleted? [16:39:39] SMalyshev: I don't know, I just stumbled upon that query which showed me a property that got deleted [16:40:06] okay then... let's see [16:40:44] SMalyshev: did you execute that query? it still shows me the deleted property [16:41:16] and for oversighted stuff, we will perhaps need to have access to the suppress log? [16:41:17] addshore: http://thebosshoss.com/flames/ [16:41:21] benestar: the query just times out for me [16:41:55] benestar: what is it supposed to do? [16:42:07] SMalyshev: try it again, sometimes it works, sometimes it times out [16:42:16] it finds duplicate labels [16:43:00] benestar: on the other DB, it just returns nothing [16:43:07] benestar: what did it produce for you? [16:44:16] property alias occurrences [16:44:16] wd:P1926 [16:44:16] * [16:44:17] Vaccine Ontology ID [16:44:18] [16:44:18] 2 [16:44:19] wd:P1928 [16:44:23] * [16:44:23] Vaccine Ontology ID [16:44:25] [16:44:27] 2 [16:44:29] ups :S [16:44:31] SMalyshev: see above [16:45:04] benestar: hmm... maybe it just wasn't up-to-date, because on the other db I can't find it... let me see the main db [16:46:16] the main one is slightly behind because of the labs mess compounded by the fact that it ran out of diskspace recently which further messed it up... I probably should just reload the db there from dump because the diskspace fiasco probably missed some updates [16:48:57] speaking of running out of diskspace, it has generated an 8G log again :( I really need to fix logging there [16:50:12] benestar: so that's probably the cause, something broken with updates [16:50:23] Caused by: java.net.UnknownHostException: www.wikidata.org [16:50:50] hmm, ok [16:51:04] grrrr...
something very weird is going on there... [16:51:10] so it's an issue in the update script, that really shouldn't happen for oversighted user-private data [16:51:28] benestar: we don't have any user-private data in the DB AFAIK [16:52:06] benestar: unless you create a property and dump your bank statement there as a description, there's no private data in the db [16:52:06] SMalyshev: sensitive data can be added as labels and that is in the db [16:53:03] benestar: that's right but those are picked up by updates. If updates break, there's no way but to fix the updates - we can't have anything better, since it could break the same way updates could [16:53:48] benestar: but the assumption is if you add something to the wikidata public db, it is public on the internet, so you can't really expect it to be gone [16:54:19] anybody can crawl/sync with wikidata and they can store whatever was there forever [17:21:49] SMalyshev: sure, everyone can crawl wikidata, but we don't want to provide this on Wikimedia servers [17:22:07] this is a problem with the privacy policy and the legal team [17:22:21] benestar: we have the dumps for months back [17:22:28] which are on Wikimedia servers [17:22:59] not to mention the elasticsearch DBs that also sync from the same source [17:23:08] and are also not updated instantly [17:23:34] hmm, I see. So that's a general problem :S how is it handled in dumps? [17:23:51] so I don't think it's realistic to expect that once you delete info from the main Wikidata DB it instantly vanishes from every information source on Wikimedia servers [17:24:05] benestar: I don't think it's handled in any way right now [17:25:00] well, so it shouldn't be a problem specific to blazegraph [17:25:19] removing information that was made public would be very hard unless you are ready to accept some delay [17:25:23] SMalyshev: so will there be regular updates from dumps to blazegraph to fix issues caused by the update script?
[17:26:30] benestar: I'm not sure about this.. The problem is that the dump is 2-3 days behind by the time it's done, so each time we reload from dump we start out 2-3 days behind. Which takes about 2-3 days to catch up. [17:26:53] maybe if we get real HW the timeframe to catch up will be smaller, but it won't be zero [17:27:22] so we _might_ still do it, and of course we'd do it as disaster recovery, but we can't do it too often [17:27:25] I see, so there will just be inconsistencies we have to accept? [17:31:18] addshore: w00t?? the test works on sqlite but fails on mysql? :O [17:31:28] no idea what that's about [17:31:35] working on another patch right now so will look back in a sec [17:31:48] I hope it is nothing serious... >.> [17:34:49] addshore: maybe we just skip that task on sqlite? [17:35:03] nooooo, it should pass everywhere! :O [17:35:08] also it passes on sqlite and fails on mysql [17:35:39] so mysql is broken? [17:36:53] yus [17:37:05] my local install is currently on sqlite, so everything passed for me ;) [17:37:20] but I'll check it out again and test it locally on mysql too, but the failure makes no sense really.... [17:37:50] #somethingoddishappeninghere [17:53:19] well... this all makes little sense ... :D [17:56:27] To the WBQ people: Please notice that I "froze" the state extensions tonight...
if you need/want any further updates in there, please tell me [17:56:34] * state of the [17:57:00] (Please ack if anyone is here) [18:18:30] hoo: replied on https://phabricator.wikimedia.org/T103626 [18:19:48] addshore: It was renamed [18:19:49] :P [18:20:05] Just because they didn't fix that bug on master doesn't mean we don't need to fix jenkins [18:20:07] yus, but the use of the env var is also gone [18:20:13] Yes [18:20:16] but that's a bug [18:20:25] w/o the env var we can't use that in production [18:20:33] so it will need to be added before that hits production [18:20:42] so that 'hack' line that is used everywhere in wikibase should be added back [18:21:07] Yes, it is very much needed [18:21:22] It only doesn't explode in Wikibase, because we don't use phpunit on our maint. scripts [18:21:36] s/in/for/ [18:22:06] do you want to file an issue for adding MW_INSTALL_PATH back then? [18:22:36] Have you removed it from anywhere in Wikibase? [18:22:51] hoo: afaik you're deploying WDQConstraints to test tonight... [18:23:02] Yes, but the v1 branch [18:24:03] no I have not removed it from anywhere ;P [18:24:15] $basePath = getenv( 'MW_INSTALL_PATH' ) !== false ? getenv( 'MW_INSTALL_PATH' ) : __DIR__ . '/../../../..'; is literally everywhere :D [18:24:41] And that is needed [18:24:56] production sets MW_INSTALL_PATH [18:25:06] Can you comment on IRC what we just talked about? [18:25:16] I'll open a bug re fixing master then [18:27:22] hoo: opened a bug already [18:27:36] https://phabricator.wikimedia.org/T104364 [18:27:44] with this other bug as a blocker [18:28:03] Nice :) [18:28:07] Thanks [18:32:51] addshore: grep for MW_PHPUNIT_TEST in TermSqlIndex.php [18:41:07] hoo: patch is up [18:42:51] Awesome :) [19:26:58] benestar: https://phabricator.wikimedia.org/T73170 [19:34:40] addshore: Do we need https://gerrit.wikimedia.org/r/221876 deployed?
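The 'hack' line quoted above (18:24:15) is a plain environment-variable-with-fallback lookup. A sketch of the same idiom in Python, assuming the semantics of the PHP one-liner: production sets MW_INSTALL_PATH, and a dev checkout falls back to a relative path:

```python
import os

def mediawiki_base_path(default="../../../.."):
    # Python rendering of the PHP idiom quoted in the log:
    #   getenv('MW_INSTALL_PATH') !== false ? getenv(...) : __DIR__ . '/../../../..'
    # PHP's getenv() returns false only when the variable is unset, so we
    # fall back to the default only when the variable is missing entirely.
    env = os.environ.get("MW_INSTALL_PATH")
    return env if env is not None else default
```

This is why removing the line breaks production (which relies on the env var) while dev setups, where the relative fallback happens to be right, keep working.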
[19:51:02] benestar: https://github.com/wmde/WikibaseDataModelSerialization/pull/149 is fine with me [20:28:30] is anyone from the HPI team working on Wikidata Quality here? [20:31:15] Lydia_WMDE: yep :) [20:31:40] \o/ [20:31:58] soldag: :D so many nicks... i keep forgetting which ones you all use [20:32:00] hoo: ^ [20:32:06] hi soldag :) [20:32:52] I took the extensions in the state they were in tonight [20:32:55] is that good to go? [20:32:56] soldag: ^ [20:33:09] So anything merged today is not in there [20:34:28] For WikibaseQuality we made a fix that removes the violation table today. This should be included, I think [20:34:33] soldag: I don't need to create the 'wbq_violations' table, right? (As per the email earlier) [20:34:51] Link? [20:34:57] hoo: yes, for v1 this is not required [20:35:20] hoo: https://gerrit.wikimedia.org/r/#/c/221853/ [20:35:22] Do we need to do an actual code update for that, or is it good enough to just not create the table right now? [20:36:11] not creating the tables right now is enough [20:36:11] Ok [20:38:46] schema change applied [20:39:06] hoo: there are more changes not included in the build [20:39:20] Important ones? [20:39:23] no idea [20:39:30] Well, we can just play it safe and update the build [20:40:13] sjoerddebruin: Bot removed the dupes? [20:40:26] multichill: ? [20:40:31] hoo: Quality changes seem not important [20:40:37] You pinged me with a diff [20:40:47] Yeah, the bot was doing its bot stuff. ;) [20:40:53] It's now on the ignore list. [20:41:51] hoo: QualityConstraints are only localisation updates [20:41:57] jzerebecki: which changes do you mean? can you link them?
[20:43:01] soldag, hoo: sorry, didn't look at the branch, it is only https://gerrit.wikimedia.org/r/221853 the schema change [20:43:39] jzerebecki: alright :) [20:43:41] Ok, in that case it should be fine [20:47:52] Ok, extensions are there: https://test.wikidata.org/wiki/Special:Version [20:48:04] But the wbqc-desc message doesn't exist [20:48:29] Should create a bug about that [20:48:33] but nothing to worry about for now [20:49:33] hoo: it was already on beta, there is a task that is already smelly ;) [20:49:59] sol/win 6 [20:50:31] hoo, jzerebecki: we will definitely add a fix soon ;) [20:50:57] hoo, soldag: is anything else happening now, like loading the table, looking at shown constraints, or are we done? [20:53:49] I think we're done with enabling the extensions [20:54:01] Yes, I think we should import the table [20:54:12] hoo, jzerebecki: constraint table should be filled with data. are you on that, hoo? [20:54:34] I'm on that [20:54:41] hoo: nice :) [20:56:25] Running the csv generator thing [20:57:35] soldag: How many rows do we expect in that csv? Do you know that? [20:59:27] hoo: should be 17 if nobody has added more constraints to properties on test ;) [21:01:24] Ok, the script is almost done [21:02:03] yeah, got 17 [21:02:32] hoo: perfect :) [21:09:34] soldag: Ok, rows inserted [21:09:36] Please test [21:10:15] hoo: violating kittens https://test.wikidata.org/wiki/Special:ConstraintReport/Q22 [21:10:24] hoo: should we add the i18n fix now, or is it not necessary to bring this fix to test?
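A sanity check like the "should be 17" row count above could be sketched as follows; the helper and the CSV layout are hypothetical, since the log doesn't show the real generator's output format:

```python
import csv
import io

def constraint_row_count(csv_text, has_header=False):
    # Count non-empty data rows in a constraints CSV before inserting it,
    # optionally skipping a header line. (Illustrative helper only; the
    # real csv generator's format is not shown in the log.)
    rows = [r for r in csv.reader(io.StringIO(csv_text)) if r]
    if has_header and rows:
        return len(rows) - 1
    return len(rows)
```

Comparing the counted rows against the expected number (17 here) before running the insert catches a truncated or stale dump early.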
[21:10:31] hoo: thanks, we will do :) [21:10:48] so works for me [21:11:02] Nice :) [21:11:18] soldag: Please fix that, but I don't think it's worth a backport on its own [21:11:29] heh, nice "s in the coordinates there [21:11:50] nikki: Double escaping all the way [21:12:05] double secure :D [21:13:03] just make sure not to remove both when fixing ;) [21:13:41] nikki, hoo: escaping issues are known and partly fixed, but not merged yet :D [21:13:49] jzerebecki: we will ;D [21:13:57] Ok, make sure to show the change to Chris [21:15:02] soldag: looks pretty cool :) [21:15:21] the tooltip should perhaps not overlap the line but be moved a bit top or down [21:15:28] *up [21:15:43] I wonder if the tooltip would make more sense in the constraint column [21:16:07] (since it's talking about the constraint) [21:16:20] and the "entity id" input field should be prefilled if an id has been entered through the url [21:18:08] Someone should make bugs of that (if they don't yet exist) [21:19:39] but more important than these bugs is to get the other extension deployed and then the master branches [21:19:45] soldag: are you also aware of the escaping issue in the tooltip on https://test.wikidata.org/wiki/Special:ConstraintReport/Q21? (just checking since it's in a different place :)) [21:20:36] thanks for the feedback so far. Feel free to add bugs or feature requests to our phabricator project at https://phabricator.wikimedia.org/project/sprint/profile/1202/ [21:21:08] * benestar will create some tasks [21:21:14] nikki: yes, these are also known, but not yet fixed [21:21:38] ok, good [21:21:44] jzerebecki: yes, that definitely has a higher priority [21:27:28] the min/max seem to be missing from the details when expanding the range constraint too [21:27:32] They should include Wikidata. https://gerrit.wikimedia.org/r/#/c/220970/ [21:30:30] nikki: strange. we will check that [21:30:37] * hoo looks for aude [21:52:23] Lydia_WMDE: in here?
I managed to create the query, it was too simple... [21:52:38] benestar: jep, here [21:52:40] :D [21:52:58] just gives timeouts when ordering by language :S but filtering works! [21:53:50] ups, forgot to filter out items :P [22:06:13] Lydia_WMDE: there is some strange behaviour in the SPARQL endpoint :S [22:06:26] benestar: what is it? [22:06:50] gives me duplicate results although it shouldn't [22:07:01] so I basically say find all ?x that have label ?label and all ?y that have ?label [22:07:07] and then I tell it FILTER (?x != ?y) [22:07:21] but somehow ?x and ?y are still mostly the same [22:08:20] DanielK_WMDE: benestar: jzerebecki: Please have a look at https://gerrit.wikimedia.org/r/222017 [22:08:28] Would like to get that deployed to testwikidata tonight [22:08:49] hoo: "hit that on testwikidata" <- can you give a link? [22:09:12] Nah [22:09:19] Saw it on the fatal logs in fluorine [22:09:29] I need backtraces, that's why I made that an exception [22:09:34] And request URLs [22:11:12] I'm not sure if we should throw an exception [22:11:35] because there are already places in that class which check if $entityId !== null and only then do something [22:11:54] so maybe there are situations where we want to have a parser output for a not yet saved entity? [22:13:05] I can't think of any [22:13:31] entity creation doesn't do that [22:13:55] hoo: we thought about a special page which simulates an item, that would need such a thing [22:14:05] but currently, you're right [22:14:09] does someone else want to have a look at it? [22:14:47] Mh... I could just ignore it if it is null [22:15:00] but I fear we create more magically weird behavior that way [22:16:33] benestar: Mh... let's log a warning and see how often we hit that [22:16:59] hoo: i think we needed id-less entities for tests somewhere... [22:17:24] DanielK_WMDE: PHPUnit tests run through [22:17:37] Did we purposefully create such entities on testwikidata?
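The duplicate results benestar describes are the classic symmetric self-join problem: when ?x and ?y range over the same data, FILTER (?x != ?y) keeps both orientations (a, b) and (b, a) of every match. A small Python model of the join (the label data here is made up), with the usual fix of ordering the pair instead of only excluding equality:

```python
from itertools import product

# Toy label data modelled on the duplicate-label output pasted earlier.
labels = {
    "P1926": "Vaccine Ontology ID",
    "P1928": "Vaccine Ontology ID",
    "P31": "instance of",
}

def pairs_with_shared_label(filter_fn):
    # Model a self-join on ?label followed by a FILTER over (?x, ?y).
    return [
        (x, y)
        for (x, lx), (y, ly) in product(labels.items(), repeat=2)
        if lx == ly and filter_fn(x, y)
    ]

# FILTER(?x != ?y): every duplicate pair shows up twice, once per orientation.
sym = pairs_with_shared_label(lambda x, y: x != y)
# Ordering the pair reports each duplicate exactly once.
asym = pairs_with_shared_label(lambda x, y: x < y)
```

In SPARQL the analogous fix is FILTER (STR(?x) < STR(?y)), which also halves the work the join has to do.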
[22:17:42] maybe we fixed these tests at some point :) [22:17:47] I did [22:17:57] no, it should not be possible to save an entity without an id. [22:18:06] So... throw? [22:18:37] I'm ok with throwing [22:18:53] hoo: well... conceptually, there is no reason for the EntityParserOutputGenerator to require an entity to have an ID, right? [22:19:08] EntityView supports entities without ids [22:19:17] That thing doesn't really have a concept, I think [22:19:20] We could/should verify when saving, and when loading. but for rendering... why is it a requirement? [22:19:21] it just does random stuff [22:19:32] DanielK_WMDE: We need it to generate alternative links [22:19:32] hehe :) [22:19:47] hoo: so skip them if there is no id [22:20:09] i mean, I don't see a big problem with throwing an exception [22:20:23] i just feel it's needlessly restrictive [22:20:36] Worth doing a warning or do we just not care? [22:21:34] i think we shouldn't care *there*. we should probably scream murder if we load an entity from the db and find that it doesn't have an id. [22:21:43] *that* shouldn't happen [22:21:57] yeah, fair enough [22:22:09] we should still show it, if we can, or force the id, if we know it (we probably do in that context), and log a warning [22:23:50] Lydia_WMDE: I split the queries into three and created a task on phabricator [22:23:50] thinking about it, we should also verify that the entity we loaded has the *correct* id after loading. [22:23:51] have a look [22:24:10] DanielK_WMDE: Think we do that already [22:24:22] hoo: once we have the alt links live, i'd love to see them exposed somewhere [22:24:31] throw new BadRevisionException( "Revision $revisionId belongs to $actualEntityId instead of expected $entityId" ); [22:24:39] a link in the sidebar, or when clicking on the (Q123) id next to the title label [22:25:13] hoo: then, how did you end up rendering an entity without an id on testwiki? how did it ever get past that check?
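The compromise the discussion converges on — render id-less entities but skip the parts that genuinely need an id, such as the alternate links — might look like this sketch (the function name is hypothetical; the Special:EntityData URL pattern is the one wikidata.org actually serves):

```python
def alternate_links(entity_id,
                    base="https://www.wikidata.org/wiki/Special:EntityData/"):
    # Skip alternate-link generation for an entity that has no id yet,
    # instead of throwing; callers can still render the rest of the page.
    if entity_id is None:
        return []
    return [f"{base}{entity_id}.{fmt}" for fmt in ("json", "ttl", "rdf")]
```

This keeps rendering permissive, while the "scream murder" id check lives where DanielK_WMDE argues it belongs: at save and load time, not in the renderer.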
[22:25:28] DanielK_WMDE: I have no idea where that came from [22:25:38] that's why I wanted to throw an Exception... [22:25:49] to make sure we get the stacktrace + the request URL [22:25:54] for investigation, log a stack trace [22:26:57] hrmm... didn't we have a function for that? [22:27:26] hoo: just hot patch it ;) [22:27:55] EntityParserOutputGenerator [22:28:02] blergh [22:28:03] wfLogWarning( [22:28:03] "Encountered an Entity without EntityId in EntityParserOutputGenerator.\n" . [22:28:03] wfDebugBacktrace() [22:28:19] benestar: thanks! [22:28:20] meh, that's an array [22:28:44] Lydia_WMDE: fixed some grouping stuff so now the result should be actually usable [22:28:54] cool [22:29:00] too tired to be useful there atm [22:29:07] will look more tomorrow morning [22:29:41] good idea :) [22:29:43] and gn8 Lydia_WMDE ;) [22:32:55] DanielK_WMDE: Core uses Exception::getTraceAsString for this :P [22:41:13] DanielK_WMDE: benestar: https://gerrit.wikimedia.org/r/222017 Amended [22:41:34] Guess that's good for now... if we see these a lot, we can add the trace or whatever [22:41:36] later on [22:42:21] hoo: will merge when jenkins approves [22:42:35] Thanks :) [22:42:52] why does this evil `Entity::getLabels` stuff still exist? [22:43:11] I want to add `Item::getLabels` which returns a `TermList` but that would be a breaking change then :S [22:44:40] gaaaaaah [22:44:53] we will just have to make another breaking release I think... [22:45:03] those methods have been deprecated for such a long time now already [22:46:39] * benestar -> bed [22:46:46] gn8 hoo DanielK_WMDE Lydia_WMDE :) [22:46:53] good night benestar [22:52:38] could someone protect https://www.wikidata.org/wiki/Q22686?
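hoo's "meh, that's an array" above points at the actual bug in the pasted snippet: wfDebugBacktrace() returns a structured array, so concatenating it into the warning string doesn't work, which is why Core formats the trace into a string first (Exception::getTraceAsString). A Python analogue of the string-formatting approach (the function name is made up):

```python
import traceback

def warn_with_trace(message):
    # traceback.format_stack() returns a list of strings (compare
    # wfDebugBacktrace() returning an array in PHP); join it into one
    # string before logging, the way Exception::getTraceAsString() does.
    # Drop the last frame so the trace ends at our caller, not here.
    return message + "\n" + "".join(traceback.format_stack()[:-1])
```

Logging a flat string keeps the warning greppable in the fatal logs, which is exactly what hoo needed the backtraces and request URLs for.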