[02:01:34] * Esther NP: "Have You Ever Seen The Rain?" by Creedence Clearwater Revival from "Pendulum"
[07:51:33] Thiemo_WMDE: hi
[07:52:33] legoktm: Hi. I wanted to help you, but I closed most of the tabs I had open and will not continue reviewing these changes for now.
[07:52:58] Thiemo_WMDE: did you read my responses? disabling the full sniff is wrong as it suppresses passing sniffs
[07:53:11] the comment sniff for example has lots of checks, most of which do pass
[07:53:44] Which is why I disabled it. It does waaaay too many things that should be split into individual sniffs.
[07:54:06] why?
[07:54:07] Also all these details should be opt-in, not opt-out.
[07:54:32] https://gerrit.wikimedia.org/r/#/c/360196/ - the way you tried to disable the sniff doesn't even work
[07:54:47] Just look at the patches you just uploaded. You had to disable almost the exact same rules in every single repository.
[07:55:04] well yes
[07:55:07] that's on purpose
[07:55:15] a script can't write documentation for people
[07:55:40] What this tells me is that the entire MediaWiki.Commenting.FunctionComment sniff is bad.
[07:56:10] Do you know all of the checks that that sniff does?
[07:56:30] It even complains about "missing" comments in cases where all function parameters already have type hints and a comment does not add anything on top of that.
[07:57:34] A sniff that complains about a "missing" comment on "function foo( ClassName $bar )" will make people add "@param ClassName $bar".
[07:57:41] This is worse than having no comment.
[07:58:56] It does not add anything that was not there before. It wastes people's time when they have to deal with the sniff's complaints and when they have to create and review patches that try to "fix" such non-helpful error messages.
[07:59:07] OK, so no, you didn't read what the sniff does
[07:59:30] Do you want a discussion or not?
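The redundant-docblock case argued about above can be sketched as follows. This is an illustrative fragment only: the class wrapper and method body are invented here, while the signature `foo( ClassName $bar )` and the `@param ClassName $bar` tag are the ones quoted in the log.

```php
<?php
// Illustrative sketch (hypothetical class): the docblock a strict
// commenting sniff would demand here merely restates the type hint
// and parameter name that are already in the signature.
class Example {

	/**
	 * @param ClassName $bar
	 */
	public function foo( ClassName $bar ) {
		// ...
	}
}
```

The `@param` line carries no information beyond what PHP itself already enforces via the type hint, which is the redundancy Thiemo_WMDE objects to.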
[07:59:48] Yes, but it's not really possible to have one if you don't understand what you're talking about
[08:00:10] I will not continue to talk to you if this is all you have to tell me.
[08:01:17] I know what the sniff does, and it does a billion things it should not do.
[08:01:22] How about this. Please explain why you think sniffs that check for invalid @throws tags, duplicate @return tags, and related should be disabled
[08:01:40] At least not as an opt-out option as it is now.
[08:02:13] I think I said everything I wanted to say multiple times now:
[08:02:57] 1. This specific sniff is a black hole. It does a billion things at the same time. This is hard to control, and almost impossible to understand. Instead, most of these things should be individual sniffs.
[08:03:17] 2. Most of the details this sniff complains about should be opt-in, not opt-out.
[08:03:43] 3. Some of the details it complains about are counterproductive, as I tried to explain above.
[08:04:53] Duplicate @return? Cool! Make this an individual sniff and enable it by default. There is nothing wrong with this.
[08:06:41] Here's my POV: These sniffs have been in development for nearly a year now, and feedback has been regularly requested on wikitech-l. As far as I know, you've never made these opinions known until now. There are others who did opine who have differing viewpoints, but that's the direction MW-CS went in. If you want to split stuff up into separate sniffs or whatever, feel free to send a patch and I'll gladly review it. But right now you've just hijacked patches that were functional, broken them, and disabled useful sniffs, which might not be ideal for configurability, but still functional.
[08:07:23] "Hijacked"? I'm sorry?
[08:08:08] What "configurability"?
[08:08:37] "amended without testing or basic sanity checking, causing them to break"? Idk what other word to describe that
[08:08:56] You said "hard to control, and almost impossible to understand"
[08:14:23] Yeah.
Like here: All error messages on all sniffs are named after the sniff. The sniff is A.B.C, the error messages are A.B.C.D.
[08:14:25] Not here.
[08:14:27] MediaWiki.Commenting.FunctionComment.MissingParamComment
[08:14:31] MediaWiki.FunctionComment.Missing.Public
[08:14:45] The package name "Commenting" is missing.
[08:17:02] Which is why I made my mistake in https://gerrit.wikimedia.org/r/#/c/360196/
[08:23:57] I'm sorry about how this feels now, for you and me.
[08:25:44] I love to work with code sniffers. But I think they must be a tool that helps people do their job faster, easier, and with more fun. Not a tool to punish them.
[08:26:40] I would love to reuse many of the individual sniffs in the current MediaWiki.Commenting.FunctionComment package, but I can't because they cannot be reused separately. I cannot mix and match that stuff. All I can do is disable what I find painful.
[08:28:03] Which is what I'm going to do in https://github.com/wmde/WikibaseCodeSniffer/pull/3
[08:33:22] legoktm: Still reading? The 0.8.0 release caused a lot of the pain I'm talking about, and I see 0.9.0 fixed a lot of it. This is very welcome.
[09:02:02] legoktm: I just tried the new 0.9 release on the Wikibase code base and it complains about 3200 MediaWiki.Commenting.FunctionComment.MissingParamComment errors even though the @param docs it complains about are there.
[09:03:26] Oh, I see. It complains because it is built on the assumption that "@param $" must be followed by an actual comment.
[09:05:06] As I tried to explain before, I believe this is bad. In almost all cases the type and the variable name are already enough to understand what a parameter is about. Adding comments like "@param ItemId $referencedItem ID of the referenced item" is bad and worse than having no comment. Such a sniff should not exist as an opt-out feature.
[09:47:00] legoktm: I updated my reasoning in https://github.com/wmde/WikibaseCodeSniffer/pull/3/files (see the XML comments).
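The naming mix-up discussed above (dropping the "Commenting" category segment) can be shown in a minimal PHP_CodeSniffer ruleset.xml sketch. The ruleset name and the rule ref path are assumptions for illustration; the two exclude codes are the ones quoted in the log.

```xml
<?xml version="1.0"?>
<ruleset name="Example">
	<!-- The path to the MediaWiki standard is an assumption for this sketch -->
	<rule ref="./vendor/mediawiki/mediawiki-codesniffer/MediaWiki">
		<!-- Does nothing: the "Commenting" category segment is missing,
		     so this name matches no registered sniff message and phpcs
		     silently excludes nothing -->
		<exclude name="MediaWiki.FunctionComment.Missing.Public" />
		<!-- Works: the full Standard.Category.Sniff.MessageCode form -->
		<exclude name="MediaWiki.Commenting.FunctionComment.MissingParamComment" />
	</rule>
</ruleset>
```

Because phpcs does not warn about exclude names that match nothing, a wrong segment in the dotted code fails silently, which is how the mistake in the Gerrit change above went unnoticed.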
[09:57:39] DanielK_WMDE_: you should have the rights to create properties on https://wikidata.beta.wmflabs.org now
[09:57:50] Lydia_WMDE: ^^
[09:58:16] Tobi_WMDE_SW: thx
[10:44:22] nice nearby feature on the commons app, but more P18 imports are needed https://www.wikidata.org/w/index.php?title=Q3890198&action=history
[10:44:48] not the best to fix such items while on site :)
[11:28:57] I don't want to join, i just want to ask a question, so i don't know how i found this page, but i can't help but feel like the good and evil you are talking about is me and my husband's story. if it is, you obviously know i've been trying to find out if he is cheating and talking to girls online, because he's continuously deleting his phone info. can you please tell me
[11:30:07] I'm not going to share this with anyone, it's for my own conc
[11:30:21] Conscience
[11:30:51] Hello
[11:32:20] Your site says ask questions. i just want to know. you have the data, i deserve to know
[15:54:15] is there any bot harvesting the wikilink labels in articles to populate the aliases on items?
[15:56:56] or even just redirects?
[15:57:59] (for the first strategy: noticing that there are many occurrences of [[University of California, Los Angeles|UCLA]], and therefore adding UCLA as an alias to Q174710)
[15:58:00] [1] https://www.wikidata.org/wiki/University_of_California%2C_Los_Angeles
[16:07:42] pintoch: afaik some bots did that back in 2013/2014, but not recently.
[16:10:52] there was a bot doing that for jawiki about a year ago.
I'm still cleaning up after it :/
[16:12:15] I guess that was more problematic because it was trying to get Japanese labels/aliases from links to *other* languages
[16:12:31] ^
[16:13:21] you can get all these links in DBpedia
[16:13:45] not sure if it's a great idea to automatically add aliases using them
[16:13:48] lots of noise
[16:14:03] but they are definitely useful for NLP tasks
[16:16:11] yeah, it would work if it were done via some tool which asks people whether it looks sensible (someone who isn't going to just accept every suggestion like some people seem to)
[16:16:35] PROBLEM - Host wdqs2002 is DOWN: PING CRITICAL - Packet loss = 100%
[16:23:59] RECOVERY - Host wdqs2002 is UP: PING OK - Packet loss = 0%, RTA = 42.62 ms
[17:10:14] nikki, sjoerddebruin, rom1504: ok, thanks!
[17:10:28] yeah, something semi-supervised would be interesting
[17:23:32] Still weird that I don't get language fallback for all languages in my babel template.
[18:09:03] sjoerddebruin: we can't do that when rendering items, because it would poison the parser cache
[18:09:20] and we figured that different fallback behavior in different
[18:09:28] ...parts of the ui would be confusing
[18:09:36] Ah, bummer.
[18:10:15] I'm working on Frisian items now, wish the items with only Frisian labels showed up in the search suggestions.
[18:10:31] if you think babel-based fallback would be good to have in some specific place, and it won't mess with the cache, and it won't be confusing, let us know!
[18:10:37] search, maybe
[18:11:16] but "search" is also "editing statements referencing entities"
[18:11:32] so, you'd get personalized fallback in the search, but not once you hit "save".
[18:11:42] Ah, yeah.
[18:11:46] we could do that i guess, but i think it's confusing...
[18:12:15] I understand, thanks for explaining though. :)
[18:12:28] you are welcome!
[18:12:38] if you can think of a good place to put this info, go ahead :)
[18:12:42] * DanielK_WMDE_ is going home now
[18:12:55] Yeah, it's about time. Looking forward to your P279 work!
[18:14:01] poke @hoo ;)
[18:14:23] Ugh, wish he was not that busy.
[18:15:02] it's just a config change, anyone can do it if he's too busy
[18:15:06] anyway
[18:15:08] ttfn
[18:44:22] but I *do* get different fallbacks in different parts of the ui :/
[19:04:45] Hi. I want to add a bunch of new properties for tablet technical specifications. For example, see the infobox of https://en.wikipedia.org/wiki/Samsung_Galaxy_Tab_S3
[19:04:45] I found that for things like phones, tablets, and laptops, there are no properties at all. So instead of proposing all those properties one by one, I want to propose the whole bunch of them at once. The properties would be memory, storage, cpu speed, dimensions, weight, ...
[19:05:31] sjoerddebruin: I was just looking at Dutch labels for BC years (http://tinyurl.com/ycl7ssm9), it looks like the ones from sitelinks use "v.Chr." and almost all the others were added by edoderoo's bot with either BC or BCE (not sure how a bot managed to be inconsistent...), should the BC(E) ones be changed to "v.Chr."?
[19:05:53] Yes.
[19:08:43] sirex: we already have properties for dimensions ("height", "width" and "thickness") and weight ("mass")
[19:10:24] https://www.wikidata.org/wiki/Wikidata:Property_proposal/Generic - here I found this: "Check if you can give a similar label and definition as an existing Wikipedia infobox parameter, or if it can be matched to an infobox, to or from which data can be transferred automatically." How can I import data from infoboxes like this one: https://en.wikipedia.org/wiki/Samsung_Galaxy_Tab_S3 for all tablets?
[19:17:31] in general, there's https://tools.wmflabs.org/pltools/harvesttemplates/ but I think the values in that infobox are probably too complicated for it.
you'd probably have to ask on https://www.wikidata.org/wiki/Wikidata:Bot_requests
[19:45:03] Thanks, nikki.
[20:00:22] https://www.wikidata.org/wiki/Wikidata:Bot_requests#P18_imports_from_it.wiki_infoboxes_for_buildings
[22:17:17] Lydia_WMDE: Auregann_WMDE: if you are around, i'm enabling wiktionary site links (non-main, not ns0) now
[22:17:26] err, doing it anyway, but FYI
[22:17:33] aude: cool :)
[22:17:43] around if you need me
[22:17:45] ok
[22:18:05] it needs to be announced when i am done (i can do it if you can't stay up this late)
[22:18:22] hopefully it only takes ~15 min
[22:25:18] ok, wiktionary needs a new table
[22:25:51] * nikki wonders if anyone is planning any bots for importing the sitelinks
[22:31:23] aude: i think lea has everything prepared to do it first thing in the morning
[22:32:38] nikki: not much happening yet here it seems: https://www.wikidata.org/wiki/Wikidata:Wiktionary/Sitelinks
[22:34:06] ok
[22:34:11] i just think people will wonder
[22:34:20] but they did get the announcement before
[22:35:02] ooh, that's today? exciting
[22:35:23] aude: yeah - and it will be done in a few hours, so i think it is fine
[22:35:27] ok
[22:35:32] Jhs: yes! :D
[22:35:54] alright - time to get some sleep for me.
[22:36:03] thanks aude for setting it up
[22:36:09] good night :)
[22:36:47] gnight!
[22:39:00] ok
[22:39:03] now the sites table
[22:39:07] needs to be updated
[23:01:22] Lydia_WMDE: looks like maybe the other projects sidebar doesn't work with namespace settings
[23:01:29] can fix maybe tomorrow
[23:03:12] is this the first wiktionary link added? https://www.wikidata.org/w/index.php?title=Q914807&type=revision&diff=503836649&oldid=492290965
[23:03:21] the sandbox?
[23:03:30] oh, you're the first :)
[23:03:44] i added the sandbox to make sure this isn't breaking anything
[23:05:04] oh, needed to purge the page to get the section
[23:05:40] yeah, for now
[23:05:49] i'm not sure about purging the parser cache for this :/
[23:05:55] at least not right now
[23:06:12] I was refreshing it from time to time being like "I wanna add links!" :P
[23:06:26] and I was doing it wrong all along
[23:06:27] probably ok to need to purge (and not all pages need that)
[23:07:16] http://tinyurl.com/yc64y4gk :D
[23:07:39] :)
[23:08:10] i have to go now but will monitor irc from my phone (and have my laptop with me in case i need to fix something)
[23:08:18] I should be going to bed really
[23:08:29] there’s a wikibase:wikiGroup predicate for sitelinks? :O
[23:08:30] but everything looks fine
[23:08:52] hi WikidataFacts :)
[23:08:55] hi :)
[23:09:23] WikidataFacts: it's even documented :P https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Sitelinks
[23:09:25] oh, I see, it’s on the isPartOf subject, not on the article
[23:09:42] I was wondering why it wasn’t part of the upper code block
[23:09:45] nikki: thanks :)
[23:10:16] aude: hm, on https://www.wikidata.org/wiki/Q5296 when I click edit, I just get sent to Special:SetSiteLink
[23:10:46] looking
[23:11:01] probably caching
[23:11:19] I purged it twice, dunno if there's other caches that might be a problem
[23:11:35] the list of sites is cached for up to an hour
[23:11:57] it works for me, but i suggest checking again in 15-30 min maybe
[23:12:11] i am logged in (btw)
[23:12:21] wouldn't that apply to every item?
(I did manage to add some links to other items)
[23:12:34] hmm
[23:13:07] might depend on the server handling the request
[23:13:24] is my guess
[23:13:28] ah
[23:13:55] after an hour, if it still doesn't work, let me know
[23:14:03] hopefully it won't take near that long
[23:14:37] back later (and watching irc on my phone)
[23:17:30] yay, now https://phabricator.wikimedia.org/T104052 gets to be super annoying
[23:18:22] yeah :/
[23:23:13] I think I'll go to bed, I'll check that page again in the morning and see if it works then
[23:23:39] and then I can go wild linking stuff :P
[23:43:08] aude, i think i might be making trouble for you if you're planning on disabling linking to the main namespace…
[23:43:24] because i'm adding the main pages to the correct item
[23:43:33] and most main pages are in the main namespace
[23:45:39] or perhaps it's good that i do that now before it's disabled
[23:50:00] Jhs: probably is ok
[23:50:23] (Y)