[10:14:27] Lydia_WMDE: hey, around?
[10:14:53] benestar: hey
[10:14:53] jep
[10:15:32] Lydia_WMDE: should I just start working on some refactoring for the media extension?
[10:15:45] or are there some other things that need to be done first?
[10:16:02] * benestar only remembers the identifier stuff which is still blocked on some stuff by Adrian
[10:16:34] benestar: no that sounds good to me :)
[10:16:40] benestar: just note in the ticket what you're tackling
[10:17:13] ok
[10:18:29] \o/
[10:18:44] hoo: did you backport the fix for editing terms already?
[10:19:13] benestar: Yes, the minimal fix is live
[10:19:20] do you still see the problem?
[10:19:26] no, just asking
[10:19:56] Ah ok, because I already saw a report about it still occurring earlier… two would have made me worry
[10:22:28] I'm just testing it now
[10:23:08] seems to work for me :) perhaps the one user had the old code still in their cache
[10:23:12] I tested it on both Wikidata and testwikidata immediately post deploy and got reports about it working
[10:23:21] yeah, told them to try again and maybe clear it
[10:24:38] I should get some coffee and move rooms… cu o/
[10:24:48] * switch
[11:18:50] * yurik wonders if Lydia_WMDE is sick no more
[11:21:28] yurik: She's working today AFAIK, but possibly not around now (dunno)
[11:21:51] * yurik pokes DanielK_WMDE_ with https://phabricator.wikimedia.org/T124569, and thanks hoo
[11:23:10] yurik: maybe I can look at the rfc during the weekend
[11:23:33] aude, thx :)
[11:23:34] as you know, we worked on this sort of thing back at one of the hackathons
[11:23:58] storing on wikidata might be the most controversial part, and a question I will defer to others
[11:24:15] but storing *somewhere* seems good
[11:24:28] yeah, and we keep bumping against this issue on all fronts, so we really ought to solve it :)
[11:24:35] wikidata is for data :)
[11:24:42] commons is for multimedia i guess :)
[11:25:07] aude, i think the wikidata community is the most technically advanced - because most users are bots :)
[11:25:25] which is perfect for data formats :)
[11:25:29] this would be a pretty big change
[11:25:36] aude, why?
[11:25:42] having new content types
[11:25:52] aude, it wouldn't be part of wikibase
[11:25:57] doesn't matter
[11:26:09] the idea is to simply allocate a different namespace to data
[11:26:12] also not sure we want it on Wikidata
[11:26:24] commons seems like a better fit for such rather self-contained things
[11:26:28] we might want data types (e.g. geo) to reference external data from items
[11:26:42] are we sure we want to set up yet another domain?
[11:27:01] * aude defers to Lydia_WMDE etc. on the question of where
[11:27:05] IMO it fits commons somewhat
[11:27:25] commons is not very good because it mostly stores images, whereas "data" in CSV/JSON formats much more closely resembles wikidata's own data
[11:27:43] Which is problematic
[11:27:49] and the community behind commons mostly deals with multimedia uploads as a whole
[11:27:59] because you introduce new data which totally doesn't work like the other data which is similar
[11:28:00] the wikidata community is more targeted towards facts
[11:28:05] and it's both on Wikidata
[11:28:05] * aude remembers some years ago when a Table: namespace was introduced on enwiki
[11:28:13] then got reverted for some reason
[11:28:24] it was still wikitext
[11:28:36] https://en.wikipedia.org/wiki/Wikipedia:Table_namespace :)
[11:28:42] hoo, yes, wikidata's current data is different in a way - it is "facts", not "data blobs"
[11:28:58] aude, thx for the ref, good to know
[11:29:28] hoo, but those facts are closer to data blobs than a video, right? :)
[11:30:08] i really don't think we should introduce a new domain for this, as that is even worse
[11:30:15] * aude put https://en.wikipedia.org/wiki/Template:Climate_in_Middle_East_cities there
[11:30:18] yurik: I tend to disagree… they are closer to PDFs than to the Wikibase data model
[11:31:24] hoo, having json would show something like this - https://meta.wikimedia.org/wiki/Schema:APIRequest
[11:31:35] anyway, best I stay out of the question of *where*, but agree it would be great to have a place for this sort of stuff and handle it in a more structured way
[11:31:36] PDF is a print-ready medium
[11:31:39] content handlers allow that now
[11:32:24] hey Lydia_WMDE , an anonymous user created an empty userpage for you on trwiki, I've requested deletion just to let you know
[11:32:41] yurik: But that would make Wikidata our machine readable, linked and queryable data store. Well, except for the random stuff lying in the Random: namespace
[11:33:05] hoo, why wouldn't the data blobs be machine readable??
[11:33:10] also would that mean all stuff should be CC0? Or would you exempt it?
[11:34:05] They are in a machine readable format, but the data itself is modeled differently in each
[11:34:09] potentially
[11:34:21] food anyone? https://www.lieferheld.de/lieferservices-berlin/restaurant-china-lieferservice-lon-men/406/ @Lydia_WMDE?
[11:34:36] @jzerebecki
[11:35:33] Yay, we pass in PHP7: https://travis-ci.org/mariushoch/mediawiki-extensions-Wikibase/builds/107211668
[11:35:57] hoo, licensing is a very valid point - data blobs do have different licenses depending on their source. But then what if someone imports a large dataset via a bot into multiple wikidata items? Wouldn't that put that import's license on each of the imported statements?
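[Editor's note: a minimal sketch of the kind of machine-readable tabular "data blob" discussed above (CSV/JSON-style data, as opposed to Wikibase statements). All field names here ("license", "headers", "rows") are illustrative assumptions, not an actual Wikimedia page format.]

```python
import json

# Hypothetical tabular data page, inspired by the climate-table example
# mentioned in the discussion; the schema is invented for illustration.
data_page = {
    "description": "Climate in Middle East cities",
    "license": "CC0-1.0",  # licensing was one of the open questions above
    "headers": ["city", "month", "mean_temp_c"],
    "rows": [
        ["Ankara", "Jan", 0.2],
        ["Ankara", "Jul", 23.3],
    ],
}

def validate(blob):
    """Minimal structural check: every row matches the header width."""
    width = len(blob["headers"])
    return all(len(row) == width for row in blob["rows"])

# The blob is machine readable in the plain sense: it round-trips as JSON.
serialized = json.dumps(data_page)
```

This illustrates hoo's point: the blob is in a machine-readable *format*, but unlike Wikibase statements, each blob models its data its own way.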
[11:36:30] yurik: Wikidata is all CC0… let's not get into details here
[11:36:38] @addshore here
[11:36:40] but I don't think we should have an exempted data namespace
[11:36:43] :D
[11:36:51] hoo: YAY
[11:36:54] the Mittagsangebot (lunch special) is the cheapest
[11:37:07] hoo: put that patch on top of the other travis one? ;)
[11:37:30] hoo, i think licensing might be the biggest reason not to use wikidata for this. Regardless, I think we should implement the technology and let the community decide which domain to use - since it will be the community who maintains it
[11:37:36] addshore: Will do… and php7 is surprisingly fast… at first I thought it didn't run any tests at all maybe :P
[11:37:55] hoo: yeh, faster than hhvm, especially for tests (ie one of runs)
[11:38:00] *off
[11:38:15] Jonas_WMDE: Süß-sauer Tofu + Frühlingsrolle vegetarisch (sweet-and-sour tofu + vegetarian spring roll)
[11:38:29] addshore: Yeah, the jit doesn't pay off there
[11:38:36] (the tofu is under Mittagsangebote mit Tofu)
[11:38:56] * hoo is jealous… Frühlingsrollen
[11:39:21] * aude gives hoo a cookie :)
[11:39:27] :)
[11:39:48] Jonas_WMDE: can I have the same as aude but also with 6 Minirollen vegetarisch (vegetarian mini rolls)!!!!
[11:39:55] they are under Vorspeisen (starters)
[11:40:33] you can also order them as an extra (cheaper) but not sure if vegetarian
[11:40:37] * hoo adds Arabic tests for aude :D
[11:41:14] hoo: :)
[11:42:10] jzerebecki: ^
[11:42:13] Thiemo_WMDE: ^
[11:45:12] Jonas_WMDE: I'm fine with either!
[11:47:23] @addshore @aude @Thiemo_WMDE Erwarteter Lieferzeitpunkt: Heute gegen 13:46 Uhr (expected delivery time: today around 13:46)
[11:47:29] nom nom
[11:47:30] ty!
[11:47:45] Jonas_WMDE: ok :)
[11:47:50] how much money do I owe you Jonas_WMDE ?
[11:48:06] what ... :P
[11:54:28] @addshore 8.8 euros (2.8 + 5.5 + 0.5)
[12:16:36] interesting... https://integration.wikimedia.org/ci/job/mwext-Wikibase-repo-tests-mysql-hhvm/8016/console
[12:16:40] hoo --^
[12:17:25] hm… just retry
[12:18:28] all databases are being thrown away post run... but still weird
[12:19:23] wahaha jzerebecki
[12:19:23] cp: error writing '/mnt/jenkins-workspace/workspace/mediawiki-extensions-hhvm/log/LocalSettings.php': No space left on device
[12:19:36] :P
[12:19:58] yeah, that's one of the things that MySQL really doesn't like
[12:20:17] although I thought we had a separate tmpfs for MySQL data?!
[12:31:29] yurik: back in the office and catching up, yes
[12:35:14] HakanIST: thanks a lot.
[12:35:37] welcome
[13:10:53] DanielK_WMDE_: around?
[13:13:28] benestar: hey
[13:13:40] hi, why do we actually have the onContentHandlerForModelID hook?
[13:14:23] can't we just get the entity handler there via the ContentHandler registry of mediawiki?
[13:16:02] benestar: i think the problem was that that doesn't support callbacks. or didn't back then.
[13:16:17] we need to inject stuff into the handler, so registering a class name doesn't work.
[13:16:22] ah, I see. Do you know if it does now?
[13:16:29] i'll check
[13:17:19] it seems it doesn't...
[13:17:47] as an alternative, we can just register a callback on our own and use that in the hook
[13:18:05] that is what I tried first before realizing that in theory core can already do that
[13:18:15] benestar: or fix the issue in core. should just be three lines, plus a test
[13:18:21] i'll happily merge it :)
[13:18:53] the thing is, we perhaps need the callbacks anyways for clients
[13:19:06] what do you mean?
[13:19:20] the client shouldn't instantiate any EntityHandler or EntityContent, ever
[13:19:21] "However, any entity type specific code needed on the client cannot use this mechanism, since the EntityHandler is not (and should not be) present on the client."
[13:19:28] (unless it's also a repo)
[13:19:40] I assume that means that some stuff cannot go into an EntityHandler
[13:19:48] like serializers
[13:19:53] yes, serializers
[13:19:57] anything else?
[13:20:01] can't think of anything else, but there may be other cases
[13:20:19] so if it affects only serializers we indeed don't need any further callbacks
[13:20:46] I agree: since we can't cover everything with EntityHandler, we could use the "array of callbacks" approach for everything.
[13:20:48] however, if there is anything that doesn't fit into the EntityHandler but is needed on the client, we have to introduce something like WikibaseRepo.datatypes.php, right?
[13:20:48] on the other hand:
[13:21:04] serializers are special anyway, since they need to be registered with a stand-alone component.
[13:21:13] agreed
[13:21:42] so if it's just serializers, I'd suggest to build a mechanism for registering them, and use that from EntityHandler on the repo, and use it directly from the client
[13:21:57] hopefully it's the only thing :P
[13:22:11] if there is more stuff we need on the client, we should probably ditch the EntityHandler idea, and go for the array-of-callbacks
[13:22:25] I'll just prepare the patch for core
[13:22:29] it's cleaner anyway
[13:22:56] yes, registering callbacks should always be possible
[13:23:22] but the question of how ContentHandlers are registered is separate from how we handle type-specific service wiring for entities.
[13:35:26] addshore: do you get notified on wdqs icinga errors?
[13:35:30] no
[13:35:43] jzerebecki: see https://phabricator.wikimedia.org/T125975
[13:35:51] I guess that will be blocking it ;)
[13:36:19] yup
[13:37:37] addshore: do you know of any monitoring::graphite_threshold that works? i heard stuff like that is flaky to the extent that it is unusable, but I don't know the details.
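[Editor's note: the "array of callbacks" registration pattern DanielK_WMDE_ and benestar discuss above can be sketched roughly as follows. MediaWiki core is PHP; this Python version is a language-agnostic illustration only, and the names (ContentHandlerRegistry, ItemHandler, register, get_handler) are hypothetical, not actual MediaWiki APIs. The point is that registering a factory callback, rather than a class name, lets dependencies be injected when the handler is built.]

```python
class ContentHandlerRegistry:
    """Maps content model IDs to factory callbacks, not class names."""

    def __init__(self):
        self._factories = {}   # model ID -> zero-argument factory callback
        self._instances = {}   # lazily built handler singletons

    def register(self, model_id, factory):
        """Register a factory callback for a content model ID."""
        if not callable(factory):
            raise TypeError("factory must be callable")
        self._factories[model_id] = factory

    def get_handler(self, model_id):
        """Build (once) and return the handler for the given model ID."""
        if model_id not in self._instances:
            if model_id not in self._factories:
                raise KeyError(f"no handler registered for {model_id!r}")
            self._instances[model_id] = self._factories[model_id]()
        return self._instances[model_id]


class ItemHandler:
    def __init__(self, serializer):
        self.serializer = serializer


# Usage: the closure injects services into the handler, which plain
# class-name registration cannot do.
registry = ContentHandlerRegistry()
registry.register("wikibase-item",
                  lambda: ItemHandler(serializer="item-serializer"))
handler = registry.get_handler("wikibase-item")
```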
[13:37:52] jzerebecki: there is a wdqs-admins contact group you could add yourself to
[13:38:07] jzerebecki: it looks like it is used a lot (by restbase etc)
[13:38:13] yeah I should do that
[13:38:27] Lydia_WMDE: just saw this: https://de.wikipedia.org/wiki/Wikipedia_Diskussion:Projektneuheiten#Neue_Wikiversion_bugt_auf_Wikidata
[13:38:56] I'm not able to confirm this
[13:41:02] Tobi_WMDE_SW_rem: works for me, if i understand the German correctly
[13:41:23] they are trying to edit "Bildlegende" (image caption)?
[13:42:34] aude: they are saying "Imagedescription"
[13:46:13] i don't see any "bildbeschreibung" (image description)
[13:46:33] but "Bild" (image) has descriptions as qualifiers
[13:46:36] DanielK_WMDE_: fyi https://gerrit.wikimedia.org/r/#/c/268666/
[13:47:05] also created a patch that uses that mechanism in Wikibase https://gerrit.wikimedia.org/r/#/c/268667/
[13:47:09] which, btw, really imho should be a multilingual text data type, if we had that
[13:48:55] !merge 268666 | DanielK_WMDE_
[13:48:55] DanielK_WMDE_: merge merge merge MERGEEEEEEEEEEEEE https://gerrit.wikimedia.org/r/#/c/268666/
[13:49:06] lol
[13:51:58] aude Tobi_WMDE_SW_rem: i don't think it is a good idea to add image descriptions as qualifiers on wikidata
[13:52:16] that seems rather useless because the information can't really be accessed in combination with the image, can it?
[13:52:54] benestar: yeah, but that's unrelated to the issue described
[14:36:15] benestar: code looks good, documentation needs updating. see there.
[14:37:38] thanks
[15:02:32] hi
[15:02:55] hi
[16:50:37] hi
[19:55:26] Looking for help finding material I started, that I can't find
[19:57:14] Can I find help here
[19:58:53] yes
[19:58:59] how can we help?
[19:59:26] Guest24667: do you mean that you created something on Wikidata and can't find that?
[20:00:53] I most likely will NEVER donate to this impregnable again..... THERE IS NO FREAKIN HELP TO BE FOUND. My 3 years of donating to this site ends now!
[20:01:49] it's a shame that you don't even let us help you :/
[20:01:50] All my work is nowhere to be found
[20:02:25] It took me quite a while to find this place
[20:03:31] Thechoice: what is your username on Wikidata?
[20:03:52] I am unablethechoice
[20:04:03] Thechoice1
[20:04:50] it seems that you have never edited Wikipedia under your username: https://www.wikidata.org/wiki/Special:Contributions/Thechoice1
[20:04:54] Wikidata*
[20:05:13] but on the English Wikipedia you have some contributions: https://en.wikipedia.org/wiki/Special:Contributions/Thechoice1
[20:05:26] I don't want to lose this spot
[20:05:52] Thank you
[20:06:38] I must choose "English" first
[20:08:12] What am I missing
[20:08:39] are you at https://en.wikipedia.org/wiki/Special:Contributions/Thechoice1 ?
[20:09:02] Thank you, I see it, how did I miss it?
[20:09:28] it happens :) happy editing
[23:24:40] Does WikidataBuilder run on the release branches (e.g. REL1_25)?
[23:28:17] matt_flaschen: Seems the config works since I get this now https://integration.wikimedia.org/ci/job/mediawiki-extensions-qunit/29460/console
[23:28:44] If not, is it acceptable to fix them manually? Trying to figure out a good way to fix T126073.
[23:29:23] Right now, REL1_25 of Wikibase points to 0.14.0, but that doesn't have the QUnit fix. So we could either somehow figure out a commit of data-values/values-view to point to (maybe even setting up a branch for this issue, then a 0.14.0.1), or just fix it manually.
[23:30:00] Hum yes. In composer all the versions are mostly set to dev-master.
[23:30:09] That is in the REL1_25 branch.
[23:30:42] paladox, what is the config you're referring to?
[23:31:30] matt_flaschen: $wmgUseWikidataTest
[23:32:02] matt_flaschen: https://gerrit.wikimedia.org/r/#/c/268828/3/Wikidata.php
[23:33:38] matt_flaschen: no
[23:33:52] manual would be ok
[23:34:50] Thanks, aude. ^ paladox
[23:35:35] matt_flaschen: Do you mean to fix manually without config?
[23:35:50] paladox, yeah. Rather than disable QUnit, you can fix it manually and link to https://github.com/wikimedia/data-values-value-view/commit/7925b64b7256902a2059273de76963d3bc3286a6#diff-5d507d148d067d30c156cbfe393bb6de as the source.
[23:36:17] Or just 7925b64b7256902a2059273de76963d3bc3286a6 . It should auto-link.
[23:36:32] matt_flaschen: How would I fix it manually? Wouldn't that be a big change, since repo, lib and view also need fixing?
[23:38:40] paladox, why does https://gerrit.wikimedia.org/r/#/c/268828/ still fail QUnit if you disabled it?
[23:38:57] matt_flaschen: because of flow.
[23:39:18] matt_flaschen: I am testing here https://integration.wikimedia.org/ci/job/mediawiki-extensions-qunit/29462/console
[23:40:52] paladox, those lines are not the problem. They're trivial.
[23:41:55] matt_flaschen: Ok. Seems when I try to do Depends-On: Iab5a905837d4116e660f2178325a14d4965f5c93 on your patch it says merge failed.
[23:42:32] We should merge the actual Flow QUnit fix, so that at least will get picked up.
[23:42:40] I think we need to force it, but otherwise it will get circular.
[23:45:45] matt_flaschen: Yes, that should work. I'd say you have to +2 for code review and verify, then press publish and submit (not publish and comment), or else the jenkins tests will start.
[23:47:12] Yeah I know. I don't like to force it, but here it's circular, and it's a pain to disable and re-enable things.
[23:49:16] matt_flaschen: Ok, well I've done some testing and your fix does fix it. I found that there is now also another qunit failure in wikidata.
[23:52:15] matt_flaschen: I've found where the error is coming from. It's coming from lib.
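[Editor's note: the "figure out a commit of data-values/values-view to point to" option discussed above can be expressed with composer's inline commit-reference syntax, which pins a branch constraint to an exact commit. A sketch only: it assumes the package exposes a dev branch (dev-master here) that contains commit 7925b64b from the discussion, and whether that is the right branch for REL1_25 is an open question in the log itself.]

```json
{
    "require": {
        "data-values/values-view": "dev-master#7925b64b7256902a2059273de76963d3bc3286a6"
    }
}
```

The alternative raised in the log, cutting a 0.14.0.1 tag from a fix branch, avoids pinning to a commit but requires setting up that branch first.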