[00:08:37] Wikibase Quality Constratins no longer works with latest Wikibase.git... [08:55:55] crap [08:56:30] addshore: around? [08:56:34] ya [08:56:49] crap does not sound good .... [08:56:59] Our dumps are apparently incomplete [08:57:02] and I have no clue why [08:57:09] O_o json ones? [08:57:12] is the entity paging broken [08:57:13] yus [08:57:21] hmmmmmm [08:58:41] any indication as to why? [08:59:35] Nothing, no [08:59:41] I recently touched it https://gerrit.wikimedia.org/r/#/c/226509/5 [09:00:12] but doesnt look like that would have broken anything [09:00:26] I'm puzzled [09:00:39] Is this not something to do with the dumps.wikimedia.org outage? [09:00:47] *finds on the mailing list* [09:01:25] http://www.gossamer-threads.com/lists/wiki/wikitech/615028 [09:01:28] I thought so initially [09:01:39] So I deleted the first dump and recreated it after the outage [09:01:47] but still almost 400k entities missing [09:01:58] hmmm [09:02:03] how are you judging that? [09:02:21] I hope your expected count doesnt include redirects ;) [09:02:35] addshore: Looking at the logs to see how many entities were dumped, comparing to SELECT COUNT(*) FROM wb_entity_per_page; [09:03:12] and wb_entity_per_page doesn't contain anything about redirects right? :) [09:03:17] well... thats slightly worrying [09:03:34] it's also smaller (in size) than the previous one [09:03:58] which is a bad sign... unless gzip randomly got better [09:04:18] oh, entity per page has redirects [09:04:38] ok, so after counting them out we have a difference of 3478, which seems sane [09:04:46] but why did the file size decrease [09:04:50] hmmm [09:05:09] well, a decrease in file size could be explained by me breaking the seralization somehow... but I shouldnt have... [09:05:18] The number of entities dumped increased [09:05:41] find the same entity in the 2 dumps and compare the serialization? [09:07:29] Yeah [09:07:36] Looking for a simple one now [09:07:40] Q9725323 [09:08:53] im betting if there is a difference its either hashes or empty keys [09:09:33] or datatypes, I believe they are there... but that probably should make the dump bigger... [09:20:22] got it [09:20:39] what's the difference then? [09:20:44] data types [09:20:48] am preparing a patch [09:21:12] so the datatypes are not appearing? O_o [09:21:15] at all? [09:21:20] yeah [09:21:49] *facepalm... whut [09:22:20] where should I report issues with the constraint reports? (the wikidata:database_reports ones, not the new special page) [09:24:11] nikki: Find the author and use their talk page [09:24:17] or the talk page of the report in question [09:26:17] it's not a particular report, it's https://www.wikidata.org/wiki/Template:Constraint:Source which doesn't seem to do anything [09:27:35] hmm. looking at https://www.wikidata.org/wiki/Property_talk:P143 it links to a report which claims there are no constraints, so maybe the bot updating it doesn't even know it exists [09:27:36] addshore: Do we even have a serializer for that? [09:27:47] for the datatypes stuff? [09:27:51] yeah [09:27:53] its all dont in result builder currently [09:27:56] *done [09:28:01] oh crap [09:28:11] so we would need to invoke that for the dump generation? [09:28:18] yup... [09:28:27] That's going to be slow as hell, I presume [09:28:28] meh [09:28:50] *double checks* [09:28:51] Can't we subclass SnakSerializer? [09:29:12] oh wait no [09:29:19] it uses the entityserializer directly... [09:29:43] okay, its very obvious why the datatypes are not in there... 
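The dump-completeness check described above (entities reported by the dump run versus `SELECT COUNT(*) FROM wb_entity_per_page`, minus redirects) can be scripted. A minimal sketch, assuming a MediaWiki 1.26-era maintenance environment; the script name, option name and the `epp_redirect_target` column used to exclude redirect rows are assumptions to verify against the deployed schema, not part of Wikibase itself:

```php
<?php
// Sketch only: compare the entity count reported by a dumpJson.php run with
// the number of non-redirect rows in wb_entity_per_page.
// Adjust the Maintenance.php path to your setup.

require_once getenv( 'MW_INSTALL_PATH' ) . '/maintenance/Maintenance.php';

class CheckDumpCompleteness extends Maintenance {

	public function __construct() {
		parent::__construct();
		$this->addOption( 'dumped', 'Entity count reported by the dump run', true, true );
	}

	public function execute() {
		$dbr = wfGetDB( DB_SLAVE );

		// Redirect placeholders are skipped by the dumper, so they must not
		// be part of the expected count either (assumed column name).
		$expected = (int)$dbr->selectField(
			'wb_entity_per_page',
			'COUNT(*)',
			array( 'epp_redirect_target' => null ),
			__METHOD__
		);

		$dumped = (int)$this->getOption( 'dumped' );

		$this->output(
			"expected=$expected dumped=$dumped diff=" . ( $expected - $dumped ) . "\n"
		);
	}
}

$maintClass = 'CheckDumpCompleteness';
require_once RUN_MAINTENANCE_IF_MAIN;
```

A small residual difference (like the 3478 mentioned above) is expected, since entities are created and deleted while the dump runs.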
[09:31:11] addshore: is it intended that hashes of snaks are no longer stored in the database? [09:31:27] benestar: ? [09:31:28] all edits show a removal of bytes, even if labels are added, because the hashes get removed [09:31:44] see https://www.wikidata.org/wiki/Wikidata:Contact_the_development_team#Size_calculations_in_history [09:31:59] https://www.wikidata.org/w/index.php?title=Q90315&type=revision&diff=238697598&oldid=214130711 shows -1 byte [09:32:19] >.> I havn't touched anything that writes to the db... [09:32:58] That sounds scary [09:33:08] like UBN! scary [09:33:24] addshore: well, you introduced the new options to not include hashes [09:33:36] benestar: yes but the default was as before [09:33:36] so perhaps they somehow get set for the database storage [09:33:55] hoo: what hash / version of wikibase is currently deployed? [09:33:58] How much of a problem is that? [09:34:02] wmf16 [09:34:16] *goes to see what is in* [09:34:47] so this broke the datatypes in dumps https://github.com/wikimedia/mediawiki-extensions-Wikibase/commit/c6c4a5649ac1752a001bf24d65c58934ecb66c8e [09:35:59] has someone actually confirmed hashes are being lost? *goes to test localy*... [09:36:40] addshore: yes, I tested and the hashes are no longer present in mw.config.get( 'wbEntity' ) [09:36:54] in mw.config.get( 'wbEntity' )... thats no the db though.. [09:37:00] yeah [09:37:31] well, how else should the bytes get removed? [09:37:52] no idea, no longer seirializing empty keys or something [09:38:08] could be anything that change in DataModelSerialization [09:38:18] maybe the empty keys =o [09:38:32] I'm going to double check [09:38:50] but as I said I'm not aware I touched anything that writes to the db.... unless something is doing something dumb [09:41:21] addshore: after thinking more about it, I guess that removing all hashes would've caused more removed bytes [09:41:32] but I cannot check atm what's going on there [09:41:39] yeh, but it could just be a hash from one place, such as mainsnak [09:41:41] im looking now [09:42:06] thanks :) [09:42:09] {"type":"item","id":"Q6","labels":{"en-gb":{"language":"en-gb","value":"adam"}},"descriptions":{"en-gb":{"language":"en-gb","value":"is a [09:42:09] person"}},"aliases":{"en-gb":[{"language":"en-gb","value":"shorl"}]},"claims":{"P2":[{"mainsnak":{"snaktype":"value","property":"P2","hash":"255c75b069dce330a5df677c6aa3c5cc635ca918","datavalue":{"value":"AAA","type":"string"}},"type":"statement","qualifiers":{"P2":[{"snaktype":"value","property":"P2","hash":"1565b258cb9294f24c368576fd8505098020cd48","datava [09:42:09] lue":{"value":"BBB","type":"string"}}]},"qualifiers-order":["P2"],"id":"Q6$932c8286-4e21-ed36-ef65-bca8420cdd0f","rank":"normal","references":[{"hash":"fc689cbea65766d86deddc4a6a2bfed9480c294c","snaks":{"P2":[{"snaktype":"value","property":"P2","hash":"c9f2e3410510828d760b609952c682c9f9648015","datavalue":{"value":"CCC","type":"string"}}]},"snaks-order":["P2 [09:42:09] "]}]}]},"sitelinks":[]} [09:42:12] Lydia_WMDE: https://wikimania2015.wikimedia.org/wiki/Submissions/Pluricentric_languages_and_Wikipedia_-_where_to_draw_the_line..._and_where_not_to [09:42:35] so benestar hoo all hashes are still there on master [09:42:37] checking branch now [09:42:54] the json from the DB for that specific page looks sane a swell [09:42:59] but might be that I'm missing something [09:43:35] hoo: does it have the key "aliases" ? 
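The "get the json for that revision and the one before and diff it" step requested just below can also be done from outside the cluster through the public revisions API. A rough sketch, using the two revision IDs from the Q90315 diff linked above and assuming the pre-formatversion=2 response shape; not how the comparison was actually done on the server:

```php
<?php
// Sketch: fetch the stored JSON of two revisions of the same item via the
// public API and report which top-level keys differ.

function fetchRevisionJson( $revId ) {
	$url = 'https://www.wikidata.org/w/api.php?action=query&prop=revisions'
		. '&rvprop=content&format=json&revids=' . (int)$revId;

	$context = stream_context_create( array( 'http' => array(
		// The WMF APIs ask for a descriptive User-Agent.
		'header' => "User-Agent: dump-diff-sketch/0.1 (example)\r\n",
	) ) );

	$response = json_decode( file_get_contents( $url, false, $context ), true );
	$page = reset( $response['query']['pages'] );

	// The entity JSON is the revision text, keyed '*' in this response format.
	return json_decode( $page['revisions'][0]['*'], true );
}

$old = fetchRevisionJson( 214130711 );
$new = fetchRevisionJson( 238697598 );

foreach ( array( 'labels', 'descriptions', 'aliases', 'claims', 'sitelinks' ) as $key ) {
	$a = isset( $old[$key] ) ? $old[$key] : null;
	$b = isset( $new[$key] ) ? $new[$key] : null;
	$same = json_encode( $a ) === json_encode( $b );
	echo $key . ': ' . ( $same ? 'same' : 'differs' ) . "\n";
}
```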
[09:43:40] because there are no aliases [09:44:24] "aliases":[] [09:44:26] yes [09:44:27] hmm [09:44:55] hoo: can you get the json from the db for that revision and the one before and diff it? [09:46:58] addshore: want to join the daily? :) [09:47:02] Lydia_WMDE: ya! [09:48:02] addshore: coming? [09:48:25] yup [09:48:27] addshore: Got it [09:49:31] what was it? :O [09:50:02] * hoo wtfs [09:50:26] * addshore also wtfs [09:50:42] {"time":"+00000002014-04-24T00:00:00Z","timezone":0,"before":0,"after":0,"precision":11,"calendarmodel":"http:\/\/www.wikidata.org\/entity\/Q1985727"} [09:51:00] ? [09:51:08] became [09:51:09] {"time":"+2014-04-24T00:00:00Z","timezone":0,"before":0,"after":0,"precision":11,"calendarmodel":"http:\/\/www.wikidata.org\/entity\/Q1985727"} [09:51:26] hmmmm, I think thats okay [09:52:06] Thiemo_WMDE knows about that, I guess [09:52:07] or DanielK_WMDE [09:53:26] *goes to find the patch* [09:54:44] hoo: https://github.com/DataValues/Time/commit/84e8846b22424383ba39f55db9dc92bd43a5ebce ? [09:54:53] I don't believe it is anything to worry about [09:55:21] addshore: Ok [09:55:29] I looked at the whole thing and that's the only difference [09:55:34] will comment on Wiki [09:55:45] awesome [09:56:04] right, I'll go come up with something for the dumps hoo ! [09:56:11] unless you want to ;) [09:56:45] You're far more into that, so please go ahead [09:56:53] and make it not suck performance wise [09:57:05] we have to iterate over 18M+ entities there [09:57:27] indeed [09:58:49] Shall we recreate the dump again or is the current one good enough(tm) [09:59:12] good enough(tm)(c)(r) [09:59:32] tbh, this is dumb [09:59:33] it's not my opinion, I'm just improving the syntax! [09:59:56] we shouldnt bother adding the datatpyes in the dump... I mean the proerties are in the dump right, with the datatypes.. [10:00:13] dumb stuff is dumb(tm)(c)(r) [10:00:27] :p [10:00:39] addshore: I think we explicitly added them at some point to make the data as easy to use as possible [10:00:42] and I can see the point [10:13:05] well hoo this is the easiest patch https://gerrit.wikimedia.org/r/#/c/229098/ [10:13:22] trying to fix how we add the datatypes to the serialization might be the next thing I look at.. [10:13:24] all so ugly.. [10:15:10] ewkkkkk [10:15:41] ewkkkkk indeed, I could probably make it mildly more efficent by just having a massive pile of foreach loops... [10:16:10] massive pile of loops... that sounds both fast and sane [10:16:15] Sorry, couldn't resist :P [10:16:19] :P [10:16:26] Can't we just subclass the SnakSerializer? [10:16:45] well, we already have TypedSnak [10:16:53] and TypedSnakSerializer [10:17:12] the issue is there is currently not really a way to use these when serializing a whole entity [10:17:50] I touch on this in my comment at https://github.com/wmde/WikibaseDataModelSerialization/pull/162#issuecomment-126610771 [10:18:43] addshore: dispatch based on an instanceof check? [10:19:15] TypedSnak doesnt extend Snak though, so it cant actually be part of an entity currently [10:20:13] thats basically the reason this thing is not used yet [10:20:56] well, typedsnak probably shouldn't really be part of an entity, ever, right? [10:21:19] Maybe intermediate, though... 
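For the datatype problem being debated here, the alternative to subclassing the serializer is the "massive pile of foreach loops" idea floated a bit further down: post-process the finished serialization and inject a `datatype` field into every snak via `PropertyDataTypeLookup`. A rough sketch of that idea only; the function names are hypothetical, and the actual fix (gerrit 229098 and its follow-ups below) goes through callback-based modifiers copied from ResultBuilder instead:

```php
<?php
// Sketch of the "big loop" approach: walk an already-serialized entity array
// and add a "datatype" to every snak. Illustration only, not the deployed code.

use Wikibase\DataModel\Entity\PropertyId;
use Wikibase\DataModel\Services\Lookup\PropertyDataTypeLookup;

function addDataTypesToSerialization( array $entity, PropertyDataTypeLookup $lookup ) {
	if ( !isset( $entity['claims'] ) ) {
		return $entity;
	}

	foreach ( $entity['claims'] as &$statementGroup ) {
		foreach ( $statementGroup as &$statement ) {
			addDataTypeToSnak( $statement['mainsnak'], $lookup );

			if ( isset( $statement['qualifiers'] ) ) {
				foreach ( $statement['qualifiers'] as &$snakGroup ) {
					foreach ( $snakGroup as &$snak ) {
						addDataTypeToSnak( $snak, $lookup );
					}
				}
			}

			if ( isset( $statement['references'] ) ) {
				foreach ( $statement['references'] as &$reference ) {
					foreach ( $reference['snaks'] as &$snakGroup ) {
						foreach ( $snakGroup as &$snak ) {
							addDataTypeToSnak( $snak, $lookup );
						}
					}
				}
			}
		}
	}

	return $entity;
}

function addDataTypeToSnak( array &$snak, PropertyDataTypeLookup $lookup ) {
	// A deleted property would make the lookup throw; real code needs to
	// handle that case.
	$snak['datatype'] = $lookup->getDataTypeIdForProperty(
		new PropertyId( $snak['property'] )
	);
}
```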
mh [10:22:41] well, if its never part of entity then we have to loop over the entity serialization reseirlaizing every snak using a typed snak and overwriting that part of the array, which is probably even worse than what is currently being done :P [10:23:22] hoo: addshore: can you please see markus' email [10:23:34] Where? [10:23:56] wikidata list [10:23:57] Lydia_WMDE: awesome, will patch that too [10:24:30] thx [10:24:35] addshore: will you reply? [10:24:43] I will do once I patch! [10:24:48] k [10:24:56] ewk :/ [10:25:05] Guess we need to backport and push out a new dump today, then [10:25:07] yikes [10:25:18] yeah [10:25:19] hoo: the fix for that email is easy (its an option) [10:25:32] or should be >.> [10:27:43] I may add some more integration tests for the json dumps stuff >.> [10:27:55] Yeah :S [10:28:05] We only have some basic ones [10:28:14] that test that entities appear in the dumps and such [10:28:23] but not really specific serialization tests [10:31:07] hoo: https://gerrit.wikimedia.org/r/#/c/229099/ [10:31:59] that should be the fix for the email issue [10:32:02] addshore: That was to easy :D [10:32:06] yeh >.> [10:32:31] also https://gerrit.wikimedia.org/r/#/c/229100/ for the branch [10:33:28] Yeah, will merge once we have both patches [10:35:13] yup [10:35:28] well, right now the most efficent thing for the branch for the datatypes would be a big loop [10:35:36] of loops of loops >.> its the only way... [10:35:55] using ResultBuilder would add even more unwanted overhead [10:36:02] addshore: merci :) [10:36:13] or the code in the patch now, which is a CP from ResultBuilder but doesnt come with all the other crap in there [10:36:38] addshore: Do you know how much slower it will be? [10:37:15] well, I did do a benchmark on some of the stuff in ResultBuilder and though slower it was only about 25 to 50% slower, and that was doing more stuff [10:37:35] Ok, that's not nice but... well :/ [10:37:44] Can we open a bug about this, plesae? [10:37:47] ideally we want a proper fix fr the next branch.... [10:38:25] +1 [10:39:59] I think that basically means more discussion about the TypedSnak thing, the Slotty stuff and the other Derived thing [10:40:47] bah, cant CP https://gerrit.wikimedia.org/r/#/c/229098/ to the branch, will have to manually do it in a sec! [10:41:00] k [10:43:08] hoo: https://gerrit.wikimedia.org/r/#/c/229102/ [10:45:34] oh hoo https://integration.wikimedia.org/ci/job/mwext-Wikibase-repo-tests-mysql-hhvm/3363/console the things the fix uses are not in the branch ;) [10:46:50] * hoo sobs [10:47:04] Can we backport those easily? [10:47:11] yeh [10:47:41] well, Ill C&P them into a patch, the patch that introduces them does other stuff too [10:47:43] gimmie a sec [10:55:40] hoo: so https://gerrit.wikimedia.org/r/#/c/229104/ and https://gerrit.wikimedia.org/r/#/c/229102/ (tests just running now) [10:58:40] addshore: PropertyDataTypeLookup [10:58:45] That one is in DM services [10:58:58] yup, will change the namespace [10:59:08] ok [10:59:25] ah, here comes jenkins to complain [10:59:39] :) [11:00:06] welll, removing 6000 lines, adding around like 2000 and only broke the json dumps so far ;) [11:00:51] * addshore really thinks we should split out unit and integration tests sometime soon [11:02:26] Lydia_WMDE: https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf \o/ [11:02:46] \o/ [11:02:50] -> food [11:03:14] Lydia_WMDE: :D [11:03:18] Shall we announce that? [11:18:22] hoo: where will I find that debug log I added yesterday in logstash? 
:) Or do you want to pastbein be a grep of the logs? ;) [11:18:46] I can look at it on fluorine, one second [11:18:57] epic! [11:19:02] uh, nasty [11:19:09] Your backtraces includ arguments [11:19:19] Which means that I have Parser doms dumps in ther [11:19:19] e [11:19:25] >.> [11:20:08] Do you want to fix taht and include a human readable backtrace? [11:20:14] Those are truly unreadable [11:20:30] We can backport that with the other things [11:20:58] HAH, stupid mw functions, which should I use? xD [11:21:16] perhaps wfGetAllCallers instead of backtrace [11:21:49] wfBacktrace( true ) maybe? [11:22:46] yeah, that looks sane [11:24:20] hoo: https://gerrit.wikimedia.org/r/#/c/229108/ [11:25:06] Is that code on master as well? [11:25:14] oh yeh >.> [11:25:15] If so, please fix it there as well or remove the stuff from aster [11:25:26] hoo: https://gerrit.wikimedia.org/r/#/c/229109/ [11:26:03] Ok, I think we have everything for the backport in place now [11:26:06] don't we? [11:26:09] yup! [11:26:20] * addshore will be back in 30 mins [11:26:25] FYI https://phabricator.wikimedia.org/T107865 [11:26:53] yup, I saw! :) [11:45:52] Deploying the backport is going to take a bit... need to wait for codfw to recover [12:33:57] aude: ping [12:34:44] pong [12:34:59] Wikidata is live patched on tin? [12:35:07] + RequestContext::getMain()->getStats()->increment( 'edit.failures.session_loss' ); [12:35:09] by ori afaik [12:35:11] yeah [12:35:17] i just stash and put it back [12:35:33] * aude prefer he not live patch, but whatever [12:35:54] Indeed [12:36:02] and tell us :) [12:36:15] but i know what it is [13:00:40] addshore: dumpJson on the branch fatals [13:00:42] :S [13:00:47] Did you test it? [13:00:47] ... [13:01:44] I thought I did... *goes to look at the branch* [13:02:32] whats the fatal? I'm going to clap myselt if I tested after I reset my branch... [13:03:43] addshore: http://fpaste.org/251375/43869341/ [13:04:09] oh WTF BLERGH [13:04:15] gah! [13:04:59] yeh, so I tested each patch individually... but not them together... [13:05:13] and only when together will they fatal.. ffs.. [13:06:06] right, guess I need to fix this too...... [13:06:22] definitely :/ [13:06:36] this stupid datatype stuff [13:08:03] Why didn't the tests catch this? :S [13:08:21] I dont know, the gate tests on the second patch should have... [13:08:47] and on the branch as well [13:08:53] uhm build, I mean [13:10:00] oh, so JsonDumpGenerator is tested, but doesnt test the options passed to the serializer [13:10:14] and then of course the maint script is not tested itself, which is where the options are passed in [13:14:25] hoo: cant you generate the dumps using the script on the old branch? ;) [13:14:48] not without downgrading Wikidata [13:14:54] and I doubt taht's a good idea [13:15:01] or hacking up multiversion or so [13:15:07] but that's all super scary and dangerous [13:15:36] so annoying as each fix works individually :P [13:16:11] Can't we have the callback just ignore empty objects? [13:17:56] thats not the issue, I just have to do a comparison... if is_array() then $foo['bar'] else $foo->bar [13:18:18] well, why should we ever get non-empty objects in there? [13:18:39] Or do we have all of them as objects in case we pass that option? [13:19:40] or you could just cast to array, not sure how much performance we loose that way, though [13:20:18] hoo this fixes the fatal, but is making me wonder about if this is still right now... 
[13:20:21] https://www.irccloud.com/pastebin/0W8p0bRb/ [13:20:39] that thing wants me to log in [13:21:00] hah really.... [13:21:32] http://pastebin.com/D3MvjjNb [13:21:48] should also need to do a similar check in getCallbackToAddDataTypeToSnaksGroupedByProperty .... [13:24:00] but would have to do some dumb stuff with get-object-vars for that... [13:24:36] :/ [13:24:55] but that would stop the fatals..... but its so ugly [13:25:20] That's why we only apply it on the branch [13:25:22] the other way is dont pass the option, so get arrays, and then loop through it and convert empty arrays to a new stdobject... [13:26:13] mh ok [13:26:26] which do you prefer? [13:26:26] that sounds somewhat scary [13:26:57] Both are equally scary I think... maybe modifying the callbacks is a little less [13:27:05] but not a strong opinion at all [13:28:28] going to go get a drink and think :) [13:29:12] if it is only top level things such as sitelinks, and statements etc that could be stupid empty arrays then the second option sounds best [13:29:33] if hoever we need to also care about empty badges arrays for example I think the first option of modifying callbacks is probably best [13:30:36] Ok, yes [13:30:45] so I'd say modify the callbacks then [13:45:28] hoo: want to verify it? ;) https://gerrit.wikimedia.org/r/#/c/229125/1 [13:47:11] Looks good [13:47:15] but I guess I should test that [13:47:53] yeh, the json dumping has never totally worked locally for me [13:48:08] That sounds good :P [13:48:51] yeh :P I always get the same invalid arg about a property id being null, but I know that means most of the script has worked ;) [13:50:46] [3f45a510] [no req] InvalidArgumentException from line 29 of /var/www/html/mwt/wiki/extensions/Wikibase/vendor/wikibase/data-model/src/Entity/PropertyId.php: $idSerialization must be a string; got NULL [13:50:49] Like that? [13:50:54] oh, interesting, yes [13:51:13] That script used to work for me [13:51:55] oh, the stack is different to what it used to be ;) [13:52:54] so the key is for some reason null... [13:53:20] $value->property or $propertyIdGroupKey, oh I really need to stop looking at this code [13:54:47] if ( !$value->property ) { return $value; } [13:54:52] with that it works for me [13:54:59] in getCallbackToAddDataTypeToSnak ? [13:55:03] yeah [13:55:14] use isset and stuff should be fine [13:56:34] right, that should be it then.... [14:05:14] hoo: im dashing out for a few hours now! Ping me and I should respond though! ;) [14:05:27] addshore: Is the fix ready? [14:05:40] https://gerrit.wikimedia.org/r/#/c/229125/ [14:05:41] afaik [14:06:31] and when I get back im going to write some integration tests for the script for master... [14:11:35] addshore: datatypes broken on the latest version for me [14:12:29] oh ffs, cant look at it now, but can do later! shouldnt be gone for more than 2 hours [14:12:40] this just 100% needs tests.... [14:19:56] addshore: It's even more fun than that [14:20:07] The data structure of the objects is totally different [14:21:06] mh [14:21:17] Seems getElementsMatchingPath doesn't do what it says [14:21:42] Hm, page moves of nlwiki are not processing it seems. [14:21:54] I've also renamed a page on etwiki and it changed instantly. [14:22:26] UpdateRepoOnMove Special:Badtitle/JobSpecification siteId=nlwiki entityId=Q3743857 oldTitle=Simon van der Aa newTitle=Jan Simon van der Aa user=Sjoerddebruin jobReleaseTimestamp=1438697762 (uuid=b0486d307067482384f8e42fd0c30372,timestamp=1438697760) status=delayed [14:22:37] that one? 
[14:22:40] Jup [14:23:28] I hope these are going to unstuck themselves [14:23:58] Can't see why this is a "Badtitle". Or is that not the reason? [14:24:08] no, that's not the reason [14:29:48] sjoerddebruin: Solved now? [14:29:54] jup [14:30:21] Renamed on 15, processed at 28. [14:30:36] With https://phabricator.wikimedia.org/T97909 that would probably not be a problem [14:32:34] All of SerializationModifier doesn't work with objects [14:32:36] * hoo cries [14:35:05] * hoo starts to doubt that we will be able to get this fixed today [14:38:21] addshore: badges have always been serialized as [] so your loop solution might work [14:45:37] I need a break now as well [14:45:56] Would be nice to get a solution, but might be to complicated to fix on the branch [16:40:03] hoo|away: back now [16:40:17] hoo|away: thats dumb that badges stay as [] ... ffs [16:40:53] addshore: Who cares about badges [16:41:06] nobody apparently [16:41:08] Let's try to at least get back to the state we had last week [16:41:23] yeh, will do, I'll ammend my patch for the branch [16:46:36] hoo|away: https://gerrit.wikimedia.org/r/#/c/229125/ runs and has datatypes and objects for empty elements [16:46:50] That sounds to good to be true [16:47:02] ;) [16:47:18] alright, let's try that [16:47:43] going to go patch master this this same thing now [16:47:51] I don't have much more time today [16:47:54] then start some email thread about what to do with this typedsnak stuff [16:48:00] ok [16:48:10] So probably wont be able to backport that properly [16:48:14] maybe aude wants [16:48:20] otherwise I can do early tomorrow [16:50:33] wow, that looks good [16:50:35] awesome [16:50:56] xD [16:57:04] I'll try to get it deployed very quick [16:59:42] :D [17:02:19] hoo: backporting? [17:02:45] yeah [17:02:59] I don't have much time left [17:03:07] I'll push it quickly and then start the dump [17:03:25] please make sure that Wikidata is not going to get downgraded during the train [17:03:36] (assuming you will be around for it) [17:03:38] ok [17:04:07] today is just test* [17:04:25] but was quite surprised to find test.wikidata on wmf9 last week :o when backporting [17:05:54] yeah, that keeps going wrong [17:07:01] :( [17:16:04] Output confirmed good in production [17:16:57] yay [17:17:24] Lydia_WMDE: aude: Shall I delete the dump from yesterday? [17:17:44] * addshore thinks we should [17:18:04] and hoo, yay! [17:18:46] I'll wait for Lydia's reply [17:18:51] will probably be back tonight [17:37:46] I'll leave the dump in place then [17:38:04] should it be removed, drop me an email or ping any op [17:41:17] agree with addshore but can wait for lydia [17:43:09] addshore: maybe see what different extensions/config that kowiki/idwiki have? [17:43:39] legoktm: will do, just got to go and find a different laptop charger as apparnetly my laptop is no longer charging.... [17:44:57] :| [17:46:14] got one! #win! 
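The fatal and its fix, as discussed above: when the dump serializer is told to emit empty containers as objects (so empty maps come out as `{}` rather than `[]`), the datatype callbacks receive `stdClass` instances instead of arrays, and some groups carry no `property` key at all. A sketch of the kind of defensive callback the branch fix (gerrit 229125) converges on; the callback name appears in the log, but the body and the `$lookupDataType` callable are illustrative, not the deployed code:

```php
<?php
// Illustrative sketch: the snak callback must cope with both plain arrays and
// stdClass, and with entries that have no 'property' key.

function getCallbackToAddDataTypeToSnak( $lookupDataType ) {
	return function ( $snak ) use ( $lookupDataType ) {
		if ( is_object( $snak ) ) {
			// Empty containers are stdClass when the "objects for empty
			// containers" option is set; skip anything without a property.
			if ( !isset( $snak->property ) ) {
				return $snak;
			}
			$snak->datatype = $lookupDataType( $snak->property );
		} else {
			if ( !isset( $snak['property'] ) ) {
				return $snak;
			}
			$snak['datatype'] = $lookupDataType( $snak['property'] );
		}

		return $snak;
	};
}

// Why the array/object distinction matters for the dump format:
var_dump( json_encode( array() ) );        // string(2) "[]"
var_dump( json_encode( new stdClass() ) ); // string(2) "{}"
// Badges, as noted above, are a list and intentionally stay as [] either way.
```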
[17:47:19] legoktm: saw something like uselang in one of the templates on kowiki [17:47:45] hm [17:47:46] dm4 1 of x: https://gerrit.wikimedia.org/r/#/c/229166/ so our build starts working again [17:48:10] https://ko.wikipedia.org/w/index.php?title=%ED%8B%80:%EC%9C%84%ED%82%A4%EA%B3%B5%EC%9A%A9%EB%B6%84%EB%A5%98&action=edit [17:48:21] no idea if it's an issue though [17:48:52] something in the lua module [17:49:40] no, probably not the issue [17:52:01] well aude I have a stacktrase with the whole parser object ;) so I should be able to figure out where it is coming from [17:52:36] * aude sees abusefilter [17:52:42] yeh, I also saw that [17:53:46] PPTemplateFrame_Hash: tplframe{\"1\":\"Nanchang Railway Station\"})"]},{"file":"/srv/mediawiki/php-1.26wmf16/includes/parser/Parser.php","line":3427,"function":"expand","class":"PPFrame_Hash","object":"[object] (PPTemplateFrame_Hash: tplframe{\"1\":\"Nanchang Railway Station\"})","type":"->","args":["[object] (PPNode_Hash_Tree: [17:53:46] #if:<template><title>#property:P373 [17:53:50] yeah [18:16:18] sjoerddebruin: Quite a few people with articles on nlwp, but no birth date.... [18:20:47] multichill: Yup, thanks to the bots that only import a link... [18:20:59] Working on reducing backlogs 24/7. :) [18:21:18] But as you know, I can't just focus on a task. [18:23:12] addshore: aude: ack on deleting [18:59:33] Lydia_WMDE: even if the dump is broken and we make a new one? [19:00:12] aude: ? yes let's delete it [19:00:14] ;-) [19:00:16] or? [19:08:29] Lydia_WMDE: https://phabricator.wikimedia.org/T107602#1507403 we are fine with wikidata-query.wikimedia.org instead of query.wikidata.org , right? [19:10:21] I personally don't care too much, though wikidata would be nicer [19:20:40] SMalyshev: jzerebecki: i'd very much prefer query.wikidata.org [19:20:45] * Lydia_WMDE reads ticket [19:21:22] SMalyshev: jzerebecki: should i comment? [19:21:41] Lydia_WMDE: i think you should [19:21:52] k [19:22:08] it just creates more security hurdles to have it under *.wikidata.org [19:22:34] Lydia_WMDE: I'd like that too, but if it's a big problem I'd rather have wikidata-query.wikimedia.org sooner than wait for a month or more [19:22:41] wikidata-query.wikimedia.org is pretty ugly [19:22:54] agree with both of you :) [19:23:02] I think security concerns are mostly imaginary (it's static content nginx, how much harm could it do?) [19:23:31] but maybe I'm missing something there [19:24:16] *nod* [19:24:33] SMalyshev: we want to have CORS enabled for the query service, right? [19:24:44] of course, somebody with access to the server ( (e.g. me) then would possibly be able to see the cookies, unless they are stripped by varnish [19:24:58] jzerebecki: yes I think so. GUI needs it for one [19:25:13] any JS GUI woud need it [19:25:41] I imagine... maybe there's a way around it [19:26:04] JSONP is ugly [19:26:31] so it makes it easier for others to have CORS enabled if there is no downside... [19:26:37] I'm not sure blazegraph does JSONP [19:26:47] ah yes probably not [19:26:58] and adding recoding is more moving parts which is complications [19:29:04] it seems like the topic namespace (from flow, I guess) is showing up in unconnected pages, is that known? [19:29:23] nikki: not known [19:29:46] wonder if this is something to fix on out side or flows [19:31:04] I can go enter it in phabricator anyway [19:31:09] thx! 
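On the unconnected-pages issue just raised: which namespaces a client wiki actually exposes can be listed with the siteinfo API (the same module linked below), which makes it easy to spot ones that should not feed the report, such as Flow's Topic namespace or Draft. A small sketch; the matching is done by canonical name since namespace IDs differ per wiki, and the filter list here is only the two candidates from this discussion:

```php
<?php
// Sketch: list a wiki's namespaces via meta=siteinfo and flag the ones that
// probably should not feed the "unconnected pages" report.

function fetchNamespaces( $apiUrl ) {
	$url = $apiUrl . '?action=query&meta=siteinfo&siprop=namespaces&format=json';
	$context = stream_context_create( array( 'http' => array(
		'header' => "User-Agent: namespace-check-sketch/0.1 (example)\r\n",
	) ) );

	$data = json_decode( file_get_contents( $url, false, $context ), true );
	return $data['query']['namespaces'];
}

$namespaces = fetchNamespaces( 'https://en.wikipedia.org/w/api.php' );

foreach ( $namespaces as $id => $ns ) {
	// The main namespace (0) has no 'canonical' key.
	$canonical = isset( $ns['canonical'] ) ? $ns['canonical'] : '(main)';

	if ( in_array( $canonical, array( 'Topic', 'Draft' ), true ) ) {
		echo "ns $id ($canonical): should probably be excluded\n";
	}
}
```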
[19:38:55] https://phabricator.wikimedia.org/T107927 there you go :) [19:42:02] \o/ [19:43:19] anyway blazegraph won't be getting any cookies just nginx [19:43:28] nikki: are there other namespaces showing up that shouldn't? [19:44:32] I'll have to check [19:45:56] e.g. https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces [19:46:55] https://gerrit.wikimedia.org/r/#/c/229197/ [19:52:18] hm... I can't see anything else that definitely shouldn't be there, although I'm not familiar with all namespaces in all wikis so there might be something still [19:53:00] the only other thing I'm seeing right now is "Draft" in enwiki, which is a bit questionable, but there's arguments both ways for that one [19:57:06] nikki: thought the same [19:59:48] I don't think we want drafts in wikidata... [20:02:32] yeah [20:07:42] added drafts to my patch [20:10:20] Hoo, aude, did everything get back ported properly? :)) [20:11:02] addshore: afaik, yes [20:11:06] * aude didn't do anything [20:11:56] Epic!! :) [21:35:22] !nyan | addshore [21:35:22] addshore: ~=[,,_,,]:3
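Closing note on the test gap lamented throughout the day ("this just 100% needs tests"): a regression check at the dump-output level would have caught both the missing datatypes and the later fatal. A minimal PHPUnit-style sketch of what such a check could assert; the class, test names and fixture helper are hypothetical, not the tests that were eventually added to Wikibase:

```php
<?php
// Hypothetical dump-level regression check: every mainsnak carries a datatype,
// and empty term maps encode as {} rather than []. Not the actual test suite.

class JsonDumpRegressionTest extends PHPUnit_Framework_TestCase {

	public function testMainSnaksHaveDataTypes() {
		foreach ( $this->getDumpedEntities() as $entity ) {
			if ( empty( $entity['claims'] ) ) {
				continue;
			}
			foreach ( $entity['claims'] as $statements ) {
				foreach ( $statements as $statement ) {
					$this->assertArrayHasKey( 'datatype', $statement['mainsnak'] );
				}
			}
		}
	}

	public function testEmptyTermContainersAreObjects() {
		foreach ( $this->getDumpedLines() as $line ) {
			// "labels":[] would mean an empty map was dumped as a list.
			$this->assertNotContains( '"labels":[]', $line );
			$this->assertNotContains( '"descriptions":[]', $line );
		}
	}

	private function getDumpedLines() {
		// Hypothetical fixture helper: run the dump generator against a few
		// fixture entities and return the emitted lines, one entity per line.
		return array();
	}

	private function getDumpedEntities() {
		return array_map( function ( $line ) {
			return json_decode( rtrim( $line, ",\n" ), true );
		}, $this->getDumpedLines() );
	}
}
```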