[02:01:07] how can I write in Kutchi language? This language is not in the menu, what do I do? I also have a Kutchi font
[02:01:47] axat: add the language in your babel box on your userpage
[02:02:09] or add the gadget in your preferences to see all languages
[04:18:28] Hi, I have a Wikidata question
[04:18:55] I created a Wikidata page, but someone did a redirect on the Wikipedia page right after I created the Wikidata entry
[04:19:29] so I wanted to make sure that the two entries are linked. I don't see the Wikidata item on the Wikipedia page's left side. I flushed my caches
[04:20:37] Just wondered what the timeframe is for the Wikidata item to appear, and if it will understand that the page changed name (not my doing)
[04:20:44] https://en.wikipedia.org/wiki/Death_of_Alex_Nieto
[04:20:48] https://www.wikidata.org/wiki/Q24044974
[04:21:12] The Wikipedia page was Alex Nieto until a little while ago, when someone changed it
[11:54:41] hello there!
[11:54:41] I'm having issues with the query.wikidata.org home page:
[11:54:41] - sometimes it seems frozen
[11:54:41] - sometimes it returns an empty body:
[11:54:41] $ curl https://query.wikidata.org
[11:54:42] => curl: (18) transfer closed with 15042 bytes remaining to read
[11:55:01] anyone having related issues?
[11:58:19] i got a broken version here
[11:58:26] https://www.dropbox.com/s/u8me2xbts06el7h/Schermafdruk%202016-05-11%2013.58.25.png?dl=0
[12:12:27] sjoerddebruin: yes, this kind of broken
[12:12:27] ping Lydia_WMDE
[12:13:13] maxlath: hey
[12:13:30] query service issues?
[12:14:07] Lydia_WMDE: yes, it seems that the home page can't load correctly
[12:14:20] the /sparql endpoint seems ok though
[12:14:38] -.-
[12:15:15] gehel: ^ could you have a look maybe?
[12:15:37] I'll check right now...
[12:15:41] thank you!
[12:18:01] of course, it's not completely random: it works when I try to reproduce the issue :P
[12:19:07] maxlath: does the issue happen only while getting the home page? Or also on other resources?
[12:20:32] gehel: it seems to be limited to the query editor
[12:20:54] so a page like this one won't load either https://query.wikidata.org/#PREFIX%20wd%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fentity%2F%3E%0APREFIX%20wdt%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2Fdirect%2F%3E%0A%0ASELECT%20%3Fi%20WHERE%20{%0A%20%20%20%3Fi%20wdt%3AP31%2Fwdt%3AP279*%20wd%3AQ44%20.%0A}
[12:21:45] gehel: oh, guess what, now it's back ^^
[12:22:09] I had to empty the cache again to get the working version
[12:22:29] maxlath: it broke for me... something is fishy...
[12:34:25] Jonas_WMDE: jzerebecki: https://integration.wikimedia.org/ci/view/BrowserTests/view/-Dashboard/job/browsertests-Wikidata-WikidataTests-linux-firefox-sauce/lastBuild/testReport/%28root%29/Creating%20statements/Cancel_statement__outline_example_____press_the_ESC_key_in_the_claim_entity_selector_input_field___/
[12:54:47] Jonas_WMDE: jzerebecki: revert 7a23ebd2356ab71b505e67bcb45672afc04b79c1
[13:28:04] Lydia_WMDE: around?
[13:28:15] aude: She's in a meeting afaik
[13:28:16] aude: kinda yes - wasup?
[13:28:33] i am working on https://gerrit.wikimedia.org/r/#/c/288097/
[13:28:50] but I'm unsure about Serbian and wonder who to ask?
[13:29:01] amir?
[13:29:18] maybe
[13:29:54] Lydia_WMDE, maxlath: I don't understand the issue. I created T134989 to track it. I'll ask for some more help on the subject...
[13:29:55] T134989: WDQS empty response - transfer closed with 15042 bytes remaining to read - https://phabricator.wikimedia.org/T134989
[13:30:03] gehel: thanks a lot
[13:31:48] maybe denny?
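A quick way to narrow down the WDQS symptom reported earlier in the session is to compare the GUI home page with the SPARQL endpoint itself. A minimal sketch; the two URLs come from the log, while the test query is an illustrative assumption:

    # GUI home page: fails intermittently with a truncated transfer
    $ curl -sS https://query.wikidata.org/

    # the /sparql endpoint itself, which maxlath reports as OK
    $ curl -sS -G https://query.wikidata.org/sparql \
        --data-urlencode 'query=SELECT * WHERE { ?s ?p ?o } LIMIT 1'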
[13:31:54] that works too
[13:32:46] Lydia_WMDE, maxlath: it seems that we just got an upgrade on the cache-misc cluster. This might be related to our issue. I'll recheck in a few minutes...
[13:33:13] Lydia_WMDE: Hi
[13:33:22] d3r1ck: hi
[13:34:15] Still doing some investigations on the document. We need to start ranking the triggers :)
[13:34:20] Lydia_WMDE: What do you think?
[13:34:32] sounds good :) do you want to make a first ranking?
[13:34:52] Not yet. I need to investigate more thoroughly before doing the ranking
[13:35:03] ok :)
[13:35:08] But I updated the timeline like we agreed, did you receive a copy of it?
[13:35:18] I mean the proposal
[13:35:21] yes it looks good to me
[13:35:33] Ok fine. That's great.
[13:35:44] Have you got a few minutes so we can PM?
[13:35:53] right now is bad unfortunately
[13:36:15] haha, ok. Let me leave you to concentrate :)
[13:36:22] :)
[13:36:29] Have a nice day L ;D
[13:36:34] you too
[13:37:40] gehel: thanks! playing with a local version meanwhile :)
[13:38:10] maxlath: and thanks for the report!
[13:55:27] maxlath: the issue *should* be resolved. What I understand from bblack is that we had some nginx experiment going on and might have cached some wrong responses. Caches have been purged
[13:56:13] maxlath: I can't reproduce the issue anymore, but as it was random before, I might just be lucky. If you happen to see the problem again, ping me and I'll reopen T134989
[13:56:14] T134989: WDQS empty response - transfer closed with 15042 bytes remaining to read - https://phabricator.wikimedia.org/T134989
[14:00:08] gehel: will do!
[14:00:17] maxlath: thanks!
[14:01:59] gehel: so, it seems I'm back to the state of sjoerddebruin's screenshot now ^^
[14:03:23] maxlath: damn, it would have been too easy. Let me get a coffee and look deeper...
[14:10:05] gehel: so from what I saw in the console, it looks like some of the scripts returned empty, triggering random TypeError / ReferenceError:
[14:10:05] TypeError: a.fn.bootstrapTable is undefined
[14:10:05] bootstrap-table-mobile.min.js:7:807
[14:10:05] TypeError: a.fn.bootstrapTable is undefined
[14:10:05] bootstrap-table-key-events.min.js:7:27
[14:10:06] TypeError: $.fn.bootstrapTable is undefined
[14:10:06] bootstrap-table-cookie.js:136:5
[14:10:07] ReferenceError: mediaWiki is not defined
[14:10:07] vis.js:35671:1
[14:10:08] ReferenceError: mediaWiki is not defined
[14:10:08] wdqs.js:121:1
[14:10:09] TypeError: $.uls.data.getAutonyms is not a function
[14:10:43] Oh, so same symptoms, but not the same issue?
[14:10:45] so indeed, that's probably an issue with the way files are served/cached
[14:12:25] gehel: the first time it was probably the index.html that returned empty, while for the cases with a crashed UI, it seems that one or several scripts returned empty. Looks like a single issue to me: files not being served as expected
[14:14:34] yep, might well be...
[14:48:24] maxlath: caches have been wiped (again). I can't reproduce the issue at the moment, but that's the nature of randomness... Can you let me know if it works / fails for you?
[14:49:02] maxlath: belay that, I can suddenly reproduce the issue again...
[14:59:34] Hi, I'd like to query the WDQS by label. Is that possible? I know I can use the service to get a label if I have the item, but can I get all the items that have a given label?
[15:01:27] theoretically yes, but the query might take too long to run
[15:03:32] cool, that's fine, I can restrict it to a few items to start with by some other relationships. What is the predicate for label?
[15:03:38] gehel: not sure if it has anything to do with the issue above, but when I type a query into the box and try to run it, it reloads the page with an empty query
[15:03:59] rdfs:label
[15:04:22] ah, that isn't just for statements?
[15:05:03] nikki: Do you reproduce the issue consistently? Or only sometimes?
[15:05:43] each label has a language, so I usually have to do something like ?item rdfs:label ?enlabel filter (lang(?enlabel) = "en") to get English ones
[15:06:03] (and since right now it's being weird, I can't actually test selecting specific labels :P)
[15:06:09] gehel: hm, now it's just stopped loading properly entirely :/
[15:06:27] damn, not a good month for WDQS...
[15:06:43] I did notice that the URL had ?query=queryhere where it used to have #queryhere
[15:06:55] ah, so I just tried ?item rdfs:label "Berlin" and got no results. Is that because the query service is acting up?
[15:08:47] hm, it's probably something to do with there being languages associated with it... "Berlin"@en seems to work, not sure how you would do a query that matches labels in any language
[15:13:40] ?item rdfs:label ?label filter (str(?label) = "Berlin") times out so I guess that doesn't help
[15:50:14] gehel: hey, I had a look at the wikidata-query-gui code and made a pull request to decrease the number of JS and CSS file requests from 79 to just 2 https://github.com/wikimedia/wikidata-query-gui/pull/1
[15:50:14] I don't see how it would fix the issue we were discussing, but it might make it less likely, and it should give a loading time boost anyway :)
[15:51:12] maxlath: Thanks!
[15:51:43] maxlath: It's great to have people trying to fix this!
[15:52:59] maxlath: and now I'm going to crash the party... We don't actually accept pull requests on GitHub, we get changes through Gerrit (there is a line about that close to the top of the GitHub page).
[15:57:59] maxlath: and we actually already have something in progress along that line (https://gerrit.wikimedia.org/r/#/c/284709/) and some discussion on https://phabricator.wikimedia.org/T133026
[16:09:47] Basically, before bundling all resources, we want to make sure it does not add complexity to the release / deployment. At the moment, with the bandwidth we have in the team, it seems more important to optimize that bottleneck...
[16:10:15] gehel: oops, I did read that line, but my brain was probably in "gaarg this has got to be optimized" frenzy mode and I started hacking without much thought about possible discussions
[16:10:17] I mean the people bottleneck, not the 100 resources to load on the page...
[16:10:44] gehel: so apologies for that
[16:10:54] maxlath: That's the spirit! Start fixing first! Sadly, in a few cases, it does not work
[16:11:07] yep
[16:11:17] maxlath: I'm the one who should apologize for not being able to accept your contribution...
[16:12:53] maxlath, gehel: https://www.mediawiki.org/wiki/Gerrit_patch_uploader
[16:13:08] maxlath \o/
[16:13:45] frimelle: \o/
[16:13:53] DanielK_WMDE_: I did not know about that one. Thanks!
[16:16:08] gehel: i haven't used it myself. sometimes it times out, i think
[16:16:37] DanielK_WMDE_: To me, it actually seems more complex than git review...
[16:19:59] gehel: DanielK_WMDE_ I agree
[16:20:28] i hate how much git review is pushed on new contributors, I think it's one of the biggest turn-offs..
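Pulling the label discussion above together: a sketch of a lookup that avoids the timeout by restricting the candidate set through another relationship first, as suggested in the log. The wd:/wdt: prefixes are the ones predefined by the query service; "instance of: city" is purely an illustrative restriction:

    SELECT ?item ?label WHERE {
      ?item wdt:P31 wd:Q515 .         # restrict first: instance of (P31) city (Q515)
      ?item rdfs:label ?label .
      FILTER(str(?label) = "Berlin")  # then match the label in any language
    }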
[16:24:16] jzerebecki: https://github.com/wikimedia/data-values-value-view/blob/master/lib/jquery.ui/jquery.ui.suggester.js#L350
[16:24:31] DanielK_WMDE_ gehel: if this is of any use, I created a Gerrit account so I can push those commits somewhere there(?)
[16:25:51] maxlath: you can always create a change, but at this point there is honestly not much chance of it being accepted before we have a good idea of how to automate all that
[16:25:54] maxlath: gerrit uses the git-review workflow. there should be a howto on mediawiki.org somewhere
[17:20:21] addshore: yea a plain git line could replace git review... i'm not sure why we suggest git review honestly. do you just use plain git to push reviews to gerrit?
[17:20:39] i just use plain git
[17:21:13] and I wrote a tiny wrapper around the longer command so I just type "g p" or "g pd" for pushing something or pushing a draft
[17:21:45] I mean, the command is only git push origin HEAD:refs/publish/$branch .....
[17:23:16] I wrote a blog post about it ;) http://addshore.com/2015/12/submitting-a-patch-to-mediawiki-gerrit/ though I don't think I shouted about git-review enough...
[17:28:45] jzerebecki: What did you do about composer creating the static class maps?
[17:29:34] hoo: use the previous version, instead of the RC
[17:30:12] jzerebecki: 1.1.0 is stable now
[17:30:15] but ok, still
[17:30:50] so fast :-O
[17:34:13] hoo: my idea was to test config.platform to describe that the target doesn't have PHP 5.6 https://phabricator.wikimedia.org/T125343#2254777 . but I suspect it will still generate that code and upstream will say won't fix https://github.com/composer/composer/issues/5273 .
[17:34:16] Yeah... I think we really need a real fix by now
[17:35:25] oh hoo is AP now?
[17:35:33] *looks at the calendar*
[17:35:42] addshore: Yes
[17:35:48] but we're a bit behind schedule
[17:35:52] okay!
[17:35:53] still updating Wikibase right now
[17:37:14] if you need any review etc. I'm here for the next 30 or 45 mins
[17:37:45] maxlath: if you're still around you should read the blog post I linked above!
[17:39:36] addshore: thanks, reading it
[17:40:03] and maxlath, if you think it is missing anything let me know so I can update it, but it should get you to the stage of uploading a patch in the easiest way
[17:41:24] I'll make a post about aliases soon ;)
[17:47:41] addshore: nice sum-up! I'm just missing two lines on what `git push origin HEAD:refs/publish/master` exactly does, never used this kind of beast so far ^^
[17:48:13] maxlath: okay *writes a sticky note to add it to the post*
[17:48:31] basically that pushes your commit to gerrit to be published and reviewed
[17:48:43] that is just where gerrit expects things to be pushed to
[17:49:17] using that at the end of a branch you have been working on will push all of those commits at once, so say you have a branch with 2 commits on top of master, it will push those 2 to gerrit
[17:49:20] Lydia_WMDE: https://scontent.ftxl1-1.fna.fbcdn.net/v/t1.0-9/13151414_1039560279454649_1748494216694642219_n.jpg?oh=c94367f0f99e4144b02ba9a5b3ee3fe8&oe=57DC88CC
[17:49:39] and then git push origin HEAD:refs/drafts/master will push it to gerrit, but as a draft, so only you can see it until you publish it
[18:06:35] * aude waves :)
[18:10:42] hi aude :)
[18:10:49] We're currently deploying AP
[18:11:10] * hoo waiting for scap
[18:11:28] \o/
[18:11:31] \o/
[18:12:42] * aude is on my way back to DC
[19:16:10] {{done}}
[19:16:10] You rule, hoo!
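As a side note on the alias wrapper addshore mentions above: a minimal sketch of what such a shortcut could look like in ~/.gitconfig. The alias names and the shell alias are assumptions for illustration, not his actual wrapper; the ref paths are the ones quoted in the log:

    [alias]
        # push the commits on the current branch to Gerrit for review
        p = push origin HEAD:refs/publish/master
        # the same, but as a draft that only the uploader can see
        pd = push origin HEAD:refs/drafts/master

Combined with a shell alias such as alias g=git, this would give the "g p" / "g pd" shortcuts described above.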
[19:16:48] haha
[19:18:58] :)
[19:25:40] DanielK_WMDE_ addshore gehel: https://phabricator.wikimedia.org/T133026#2286446
[19:27:08] maxlath: SMalyshev is probably interested in this conversation. He is the one who will be the most impacted by changes in the build process (or actually the introduction of a build process)
[19:28:09] Lydia_WMDE Aaaaahhh it's deployed!!1!
[19:28:17] :D
[19:28:24] working on the announcement
[19:29:09] maxlath: did you upload it as a draft to gerrit?
[19:30:38] \o/
[19:30:49] gehel, maxlath: the change linked there doesn't seem to exist...
[19:31:10] probably a draft
[19:42:01] SMalyshev: heh, drafts are leaked to phab: https://phabricator.wikimedia.org/rWDQG14c5cce3b28d5d50b4a0cebd08070feee77acc10
[19:44:51] jzerebecki: They've been leaked to Zuul for years…
[19:46:05] jzerebecki: I'm not sure I understand how that script is supposed to work
[19:46:07] James_F: yup, phab is new since april 28th or something :)
[19:46:27] * James_F nods.
[19:46:32] jzerebecki: seeing as it changes index.html, does that mean you can't just check out the code and have it run?
[19:47:32] SMalyshev: yes
[19:47:48] jzerebecki: that doesn't sound good...
[19:48:13] jzerebecki: also not sure how one would deploy this
[19:48:22] SMalyshev: and each time you change a source JS or CSS file you need to run the build script.
[19:48:32] I certainly don't want to deploy all these scripts
[19:48:36] aude: Will you be around for the late SWAT?
[19:48:57] jzerebecki: this looks like quite an impeding setup for development
[19:50:14] If so, https://gerrit.wikimedia.org/r/288254 would be nice to get backported
[19:50:17] SMalyshev: that is the reason why MediaWiki does this on the fly, partially in JS on the client and the other part during the request in PHP
[19:50:19] if not, it's also not urgent
[19:53:05] * Lydia_WMDE high5s frimelle
[19:53:06] :D
[19:53:52] jzerebecki: we thought about somehow doing the build step on merge and committing it into the production branch... but I'm not sure we have it working and don't know how exactly to pull it off with CI
[19:54:26] Lydia_WMDE yaaaay! \o/
[19:57:56] :D
[19:58:03] whoot whooot
[19:58:27] hoo: won't be around
[20:00:48] SMalyshev: you can test the build in a job during the gate-and-submit pipeline and then redo the build in the postmerge pipeline, but this time push the result to the deployment branch. however that is probably the first time this is done at Wikimedia, so it needs a security review. I have the same approach in mind for building other things, like vendor.git.
[20:02:17] * nikki wonders why there's a big gap at the top of article placeholder pages
[20:03:43] jzerebecki: there's also a problem - what if it doesn't merge cleanly to the deploy branch?
[20:03:59] though I guess we could have a script to clean that up...
[20:08:07] SMalyshev: why wouldn't it merge cleanly? the time between preparing the commit in the postmerge job and submitting it would be less than a minute. if that ever happens because you just manually did something, you could just rerun the job. however normally you shouldn't need to manually update the deployment-build-result branch.
[20:08:50] yeah. I guess we just need to write the script carefully. aude did some of that, but it wasn't finished as far as I can see
[20:13:31] SMalyshev: we probably need a production branch for the source in addition to master, then have a build for master and one for production that get automatically pushed from the source.
[20:15:04] aude: Ok, same for me
[20:15:08] not that important, then
[20:15:57] cu o/
[20:17:59] jzerebecki: not sure I got this
[20:18:25] we already have 2 branches, so what's the diff?
[20:20:20] SMalyshev: we currently have master and deployment. deployment is what I named production above. there is nothing yet to store the build result for each.
[20:21:00] jzerebecki: actually we have master and production :) but I assumed production is where the build result is going
[20:22:34] * Lydia_WMDE goes home now and is reachable via hangout and signal
[20:22:36] so if not, I need some info. What I assumed is like this: we have master, which would be for dev. Then we run the build and get another set of files, not the same as dev, and we put that in production
[20:22:49] jzerebecki: alternatively maybe we just need to have a separate deployment repo
[20:24:44] SMalyshev: yes, a separate repo would work. if you automatically push to production from master during postmerge, then everything gets automatically deployed if you also ran scap after that.
[20:27:01] SMalyshev: so if we want to be able to backport fixes and only deploy that change, we need a source branch and a build result branch for production, separate from master and a build result of master.
[20:27:07] well, we don't even have scap as of now
[20:28:34] s/scap/trebuchet/
[20:52:12] Hi. I'm new here. I'm trying to find a schema somewhere for Wikidata. For example, is there a notion of types? Like People, Places, Events. And a set of properties that each type has. Like a Person can have date of birth, place of birth, etc. And where can I browse this schema in more detail?
[20:54:29] keshav57: generally the thematic projects have written down their ontologies somewhere
[20:55:53] Harmonia_Amanda: What about just the general ontology? For the basic non-domain-specific things. Like Person, Places, etc.
[20:56:26] keshav57: you can explore https://tools.wmflabs.org/wikidata-todo/tree.html
[20:58:50] keshav57: https://www.wikidata.org/wiki/Help:Properties
[21:00:06] keshav57: the software doesn't know about types, but people model them using the "instance of" property. No schemas are enforced by the software, but conventions are upheld by the community.
[21:06:20] Thanks guys. Checking out the links gave, and browsing around to get a better sense.
[21:06:32] Checking out the links*
[21:10:32] Is there a way to traverse edges in the opposite direction? For example, find all people who have occupation physicist
[21:18:50] keshav57: you can do a query
[21:19:00] we have a SPARQL endpoint
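For the question above, a minimal sketch of such a reverse traversal against the SPARQL endpoint, using occupation (P106) and physicist (Q169470); the LIMIT is illustrative:

    SELECT ?person WHERE {
      ?person wdt:P106 wd:Q169470 .   # occupation (P106): physicist (Q169470)
    }
    LIMIT 100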