[02:10:04] I'm trying to change the Wikidata entry for the Lojban Wikipedia's main page, but it looks like it's protected or something.
[02:13:48] ksft: Yes
[02:14:02] hoo: how do I get it corrected?
[02:14:15] Are you logged in on Wikidata?
[02:14:56] I am
[02:15:07] In that case it seems you're not autoconfirmed
[02:15:33] I'm not sure I've edited Wikidata before
[02:16:23] ksft: I've changed it for you
[02:16:40] although a bot would have performed that change as well at some point in the next 24 hours
[02:17:04] oh
[02:17:08] didn't know that
[02:22:10] I run a bot that does that once per day for all wikis, and there are other bots that do similar things, but I don't know how often
[02:22:17] My bot resolves redirects within sitelinks
[07:25:43] can someone block https://www.wikidata.org/wiki/Special:Contributions/181.91.41.183
[07:25:52] he changed the label for female
[08:04:23] hi
[08:04:45] Wikibase tests are failing again and blocking merges on CX: https://integration.wikimedia.org/ci/job/mwext-testextension-zend/17453/consoleFull
[08:39:27] and a fix has been merged
[08:44:39] thanks Nikerabbit and legoktm
[08:44:57] aude: ;)
[08:45:04] np :)
[08:45:18] correction: will be merged after tests are completed ;)
[08:45:36] i'll make a new build once it's merged
[08:46:24] aude: speaking of that, is "a build" something you deploy to somewhere?
[08:46:36] to beta
[08:46:49] right
[08:46:58] and it's what Jenkins uses for ContentTranslation
[08:47:09] e.g. so that composer doesn't have to be run
[08:49:20] aude: I see
[08:58:57] aude: is there some place I can check when the build is ready?
[09:00:08] it's https://gerrit.wikimedia.org/r/#/c/260722/ right?
[09:01:46] Nikerabbit: that's it
[09:02:15] we should wait for jenkins, then +2 and wait for merge
[09:24:12] It's not fun to wait for Jenkins, it constantly fails for totally random reasons since yesterday. Nikerabbit, is your fix related to that?
[09:26:05] Thiemo_WMDE: https://phabricator.wikimedia.org/T121291
[09:27:18] Thiemo_WMDE: yes, it is not fun at all, and it happens too often
[09:30:46] Ok, thanks. I will work on this a bit and *disable* one or two expensive PHPCS sniffs. That's still better than the current situation.
[09:36:30] Nikerabbit: new build is merged
[09:43:52] aude: thanks
[09:44:42] oooh *wonders what broke above*
[09:45:13] https://integration.wikimedia.org/ci/job/mwext-testextension-zend/17461/testReport/(root)/ Scribunto takes many minutes for you too?
[09:46:34] Nikerabbit: :(
[09:49:41] so I don't know whether Scribunto belongs to any team anymore... maybe we need a combined effort to make those tests faster?
[10:38:03] * aude fixed enough code style issues for now :/
[11:08:53] hello there! I'm looking for a property to reference the de facto Twitter hashtag of an entity, is there such a thing?
[11:09:35] maxlath: I don't think so.
[11:47:43] maxlath: I also don't think there is
[11:55:01] addshore: sjoerddebruin: so I made a proposal here: https://www.wikidata.org/wiki/Wikidata:Property_proposal/Generic#main_Twitter_hashtag
[12:02:12] The big issue with hashtags is that they are not unique and become largely useless just a few weeks after the hype passes.
[12:02:59] They are not very different from a Google search. Would you add a property for a Google search? I don't think so.
[13:21:27] aude: I think I get cats in the place I might be staying in NY :P
[13:35:30] aude: Around?
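The redirect-resolving bot mentioned at 02:22 is not shown in the log; below is only a minimal sketch, assuming nothing about the real bot's code, of how sitelink redirects can be resolved with standard API modules: action=query with redirects=1 on the client wiki to find the redirect target, then wbsetsitelink on Wikidata to repoint the sitelink. The endpoint URLs, item ID, and token are placeholders, and login/error handling is omitted.

    # Sketch only: resolve a sitelink that points at a redirect and repoint it.
    import requests

    def redirect_target(client_api, title):
        # action=query&redirects=1 reports any redirect it followed.
        data = requests.get(client_api, params={
            "action": "query",
            "titles": title,
            "redirects": 1,
            "format": "json",
        }).json()
        redirects = data.get("query", {}).get("redirects", [])
        return redirects[0]["to"] if redirects else None

    def repoint_sitelink(wikidata_api, item_id, site_id, new_title, token):
        # wbsetsitelink overwrites the existing sitelink for that wiki.
        return requests.post(wikidata_api, data={
            "action": "wbsetsitelink",
            "id": item_id,
            "linksite": site_id,
            "linktitle": new_title,
            "summary": "resolve sitelink redirect",
            "token": token,
            "format": "json",
        }).json()

    # Hypothetical usage (item ID and title are made up):
    # target = redirect_target("https://jbo.wikipedia.org/w/api.php", "Some redirect title")
    # if target:
    #     repoint_sitelink("https://www.wikidata.org/w/api.php",
    #                      "Q123456", "jbowiki", target, token)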
[13:40:51] addshore: https://gerrit.wikimedia.org/r/#/c/260255/3..5/includes/site/MediaWikiSite.php Do you have an opinion there?
[13:41:23] We could use a static variable with a single instance and use the API URL as a parameter
[14:02:27] *looks*
[14:04:11] hi audephone ... at an airport?
[14:07:02] hoo: I wouldn't mind a separate instance for each site
[14:07:17] I don't think there is really a case where LOADS of stuff for LOADS of different sites is normalized?
[14:08:22] Well, you can trick the API into doing that
[14:08:32] but I don't think it happens in the real world
[14:08:57] And for the Wikibase use case of the new class, I don't think it really matters
[14:15:40] hello
[14:15:51] hi HakanIST
[14:16:02] is the bot permission needed for running in one's own user namespace?
[14:16:29] depends on the edit volume
[14:16:57] addshore: but would it also be OK with you to amend it to use one instance to rule them all?
[14:17:09] I really want to get that stuff unstuck...
[14:17:35] how much is high volume?
[14:17:45] is it ok to run hourly or something?
[14:17:53] hoo: one instance of MediaWikiPageNameNormalizer to rule them all? :)
[14:18:01] yeah, doesn't look like that would be that bad
[14:18:22] HakanIST: hourly in your own userspace sounds fine!
[14:19:28] addshore: Yes (tied to MediaWikiSite)... we will have a separate one in Wikibase commons validation (but that's another story)
[14:19:39] +1 for commons validation
[14:30:52] hoo: not yet, but today is omg panic last day before airport :-)
[14:30:52] is there a way to find out the history of a sitelink, as in past removals from items?
[14:31:30] Back in the office in ~1 hour
[14:31:49] :)
[14:32:06] HakanIST: You mean like a history page for a specific sitelink? Sadly we don't have that
[14:32:14] we only have the history for specific items
[14:33:00] We (hopefully sometime soon) will have that history showable within the article history on the clients (Wikipedia, ...) themselves
[14:33:31] it is available in the wikis' recent changes but not in the article history
[14:33:47] it'd be nice to have
[14:33:50] Yes, that's the current state
[14:33:57] Definitely :)
[15:41:03] The tiny things... https://gerrit.wikimedia.org/r/260759
[15:41:11] that one made me go "wtf" yesterday
[15:41:32] easy merge, btw
[15:42:37] That's why it was possible to have media files linked without file endings and fun stuff like that
[15:50:50] Thanks, aude :)
[15:50:59] omg, can't believe it
[15:51:07] thanks for fixing
[15:51:42] I think I found some more "fun" stuff yesterday regarding value parsing, but need to investigate further
[15:51:52] ok :/
[16:13:30] Yikes, purging wb_changes is broken :(
[16:13:39] Will look into that later
[16:16:06] what do you mean broken?
[16:16:29] I dunno, it just doesn't work in production
[16:16:34] https://www.wikidata.org/wiki/Special:DispatchStats
[16:16:39] last successful run was on Oct 31
[16:16:54] was it moved back to terbium?
[16:17:00] Maybe it wasn't
[19:58:28] aude: Around?
[20:14:08] nevermind
[20:19:26] Wow... I would never have thought we could go over 500 edits in a single second oO
[20:19:27] https://phabricator.wikimedia.org/T122336
[20:21:05] 6551 edits in that whole minute... wow
[21:34:26] Does Wikidata have any good tools for find and replace?
[21:34:28] ^ hoo?
[21:35:08] also, who was responsible for >500 edits/second?
[21:35:48] :(
[21:35:49] harej: Find and replace?
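The code-review discussion between 13:41 and 14:19 is about whether the page-name normalizer tied to MediaWikiSite should be one instance per site or a single shared instance that takes the target wiki's API URL as a parameter. The Wikibase code under review is PHP; what follows is only a rough Python illustration of the shared-instance design, with invented class and method names and no error handling.

    # Rough illustration of "one instance to rule them all": the API URL is a
    # per-call parameter, so a single normalizer can serve every wiki.
    import requests

    class PageNameNormalizer:
        def normalize(self, api_url, title):
            data = requests.get(api_url, params={
                "action": "query",
                "titles": title,
                "format": "json",
            }).json()
            # The API reports a "normalized" mapping only when it changed the title.
            normalized = data.get("query", {}).get("normalized", [])
            return normalized[0]["to"] if normalized else title

    # Shared instance, used for two different wikis.
    NORMALIZER = PageNameNormalizer()
    print(NORMALIZER.normalize("https://en.wikipedia.org/w/api.php", "main page"))
    print(NORMALIZER.normalize("https://de.wikipedia.org/w/api.php", "hauptseite"))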
[21:35:56] Not sure, will check in a bit
[21:36:01] I'll be back in 20m or so
[21:36:20] Find and replace as in, replace "instance of Q123" with "instance of Q456"
[21:37:25] Oh, you can just use autolist for that.
[21:38:25] http://tools.wmflabs.org/autolist/index.php?language=en&project=wikipedia&category=&depth=12&wdq=claim%5B31%3A123%5D&pagepile=&statementlist=-P31%3AQ123%0AP31%3AQ456&run=Run&mode_manual=or&mode_cat=or&mode_wdq=not&mode_find=or&chunk_size=10000
[21:54:34] harej: In that second, 546 of 547 edits were from BotNinja
[21:54:45] And what was BotNinja doing?
[21:54:54] By the way, sjoerddebruin answered my question quite well
[21:54:56] Probably adding terms, as usual
[21:56:30] Just checked, the bot was adding labels to taxons
[21:57:06] Of course.
[22:18:15] i am close to edit #4,000 on wikidata!
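Autolist, linked above, is the quick answer for this kind of find and replace. For completeness, here is a hand-rolled sketch of the same edit on a single item using plain Wikibase API modules: wbgetclaims to locate the P31 ("instance of") claims and wbsetclaimvalue to change a matching value from Q123 to Q456. The item ID in the usage comment is a placeholder and edit-token handling is omitted; this is an illustration, not tested production code.

    # Sketch: replace "instance of Q123" with "instance of Q456" on one item.
    import json
    import requests

    API = "https://www.wikidata.org/w/api.php"

    def replace_instance_of(item_id, old_qid, new_qid, token):
        # Fetch the item's P31 (instance of) claims.
        claims = requests.get(API, params={
            "action": "wbgetclaims",
            "entity": item_id,
            "property": "P31",
            "format": "json",
        }).json().get("claims", {}).get("P31", [])

        for claim in claims:
            value = claim["mainsnak"].get("datavalue", {}).get("value", {})
            if value.get("numeric-id") == int(old_qid.lstrip("Q")):
                # Change only the claim's value; qualifiers and references stay.
                requests.post(API, data={
                    "action": "wbsetclaimvalue",
                    "claim": claim["id"],
                    "snaktype": "value",
                    "value": json.dumps({"entity-type": "item",
                                         "numeric-id": int(new_qid.lstrip("Q"))}),
                    "token": token,
                    "format": "json",
                })

    # Hypothetical usage: replace_instance_of("Q98765", "Q123", "Q456", token)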