[01:41:33] hmph... it seems I can't middle click coordinate links in some wikipedias any more, it just treats it like a left click, but it still works in other wikipedias
[01:42:54] does anyone know what changed there? or whether that's been reported somewhere already?
[01:59:01] No sorry I hadn't heard anything about that
[01:59:53] Krenair, is this anything you have heard about?
[02:00:08] no
[02:04:22] also weird, it seems it works in firefox but not chromium
[09:09:26] aude: it is the focus patch - the focus is set to page and then the tab brings the focus to collapse
[10:52:10] Lydia_WMDE: around?
[10:53:18] Hey, I got a bit confused about the wikidata mailing lists, since it seems like both are used (?) - so wikidata-l as well as wikidata@lists.wikimedia
[10:53:40] isn't wikidata-l just a redirect to wikidata now/
[10:53:41] ?
[10:54:11] Yep, right, okay, thanks. That makes sense.
[10:54:32] yeah, https://lists.wikimedia.org/pipermail/wikidata/2015-May/006128.html
[10:54:45] But it isn't the other way, right? So if you're on wikidata-l you won't see the mails coming from wikidata?
[10:55:06] * nikki doesn't know enough to answer that
[10:55:53] Okay, I was just wondering, nothing important ;) anyway, thank you!
[11:29:28] I know, this might sound like another ridiculous question, but did my mail make it through to you on the wikidata list? :)
[11:31:28] frimelle: I guess it's about "Merging items creates redirect now", so yes
[11:33:00] \o/ Thanks!
[11:33:10] Np :)
[12:42:50] *waves a pull request around https://github.com/wmde/WikibaseInternalSerialization/pull/82 *
[13:06:57] moin!
[13:09:12] addshore: DM 4.0 isn't released yet?
[13:09:51] how can you still use DM 4.0 in that pull request and travis succeeds?
[13:10:10] it has been released benestar =o
[13:10:22] morning aude !
[13:11:19] benestar: what was our alternative for getBestStatementPerProperty ? O_o
[13:11:49] but the release looks very strange here: https://github.com/wmde/WikibaseDataModel/releases
[13:12:15] fixed it, that's just because it was tagged but hadn't been given a title on github
[13:12:44] ah, now I also got the mail
[13:13:04] addshore: you should've merged https://github.com/wmde/WikibaseDataModel/pull/518 though before -.-
[13:13:21] we can just merge it after and do another release ;)
[13:13:47] yes, lets do 42.0.0 right away
[13:14:08] ;)
[13:15:12] do we have a task for displaying statements grouped by rank on phabricator?
[13:15:51] not that I know of bre
[13:15:52] benestar:
[13:19:05] addshore: found it: https://phabricator.wikimedia.org/T87327
[13:19:36] awesome, that's the only thing blocking using dm4
[13:20:26] addshore: what is the only thing?
[13:21:00] well, a replacement for getBestStatementPerProperty right?
[13:21:14] addshore: that method isn't used anywhere
[13:21:31] used in TruthyStatementRdfBuilder
[13:21:44] and FullStatementRdfBuilder
[13:22:50] o.O PHPStorm doesn't find those usages ...
[13:23:10] oh, it's in StatementList :
[13:23:12] :P
[13:23:32] * benestar will fix that
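For readers following the getBestStatementPerProperty thread: "best statements" means the preferred-rank statements of a property when any exist, otherwise the normal-rank ones, with deprecated statements never included. A minimal sketch of that selection rule, in Python for illustration only (the names do not mirror the Wikibase DataModel API):

```python
# Illustrative sketch of the "best statements per property" rule discussed
# above: per property, take preferred-rank statements if there are any,
# otherwise the normal-rank ones; deprecated statements are never "best".
# All names and the sample data are made up for the example.
from collections import defaultdict

PREFERRED, NORMAL, DEPRECATED = "preferred", "normal", "deprecated"

def best_statements_per_property(statements):
    """statements: iterable of (property_id, rank, value) tuples."""
    by_property = defaultdict(list)
    for prop, rank, value in statements:
        if rank != DEPRECATED:
            by_property[prop].append((rank, value))
    best = {}
    for prop, ranked in by_property.items():
        preferred = [v for r, v in ranked if r == PREFERRED]
        best[prop] = preferred or [v for r, v in ranked if r == NORMAL]
    return best

claims = [
    ("P1082", PREFERRED, "3 500 000"),  # population, current figure
    ("P1082", NORMAL, "3 400 000"),     # older figure
    ("P17", NORMAL, "Q183"),            # country
    ("P17", DEPRECATED, "Q16957"),      # outdated value
]
print(best_statements_per_property(claims))
# {'P1082': ['3 500 000'], 'P17': ['Q183']}
```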
[13:26:09] =]
[13:30:41] just waiting for packagist to realise 1.5.0 of internal serialization exists >.>
[13:30:50] then I think I have a patch for Wikibase to use DM4 ready
[13:45:34] yeh aude, someone just needs to move the code over :P
[13:45:47] I tried a force push but obviously cant ;)
[13:46:05] apparently jan has permission
[13:46:10] can poke later
[14:11:29] benestar: yay, random failing test in client somewhere for this dm4 patch
[14:11:38] https://integration.wikimedia.org/ci/job/mwext-Wikibase-client-tests-mysql-hhvm/3063/consoleFull
[14:12:00] o.O
[14:12:32] addshore: cannot look at the patch as it seems to be a draft
[14:12:49] benestar: try again ;)
[14:14:27] everything seems to be passing locally just fine ;)
[14:23:05] addshore: trying on my machine now
[14:29:46] benestar: reordered them for you ;) still fails :(
[14:32:02] addshore: thanks :)
[14:32:15] I'm investigating but composer is on strike -.-
[14:33:25] Fatal error: Call to undefined method Mock_IDatabase_820b3b15::startAtomic() in C:\xampp\htdocs\mediawiki\w\extensions\Wikibase\client\includes\store\sql\ConsistentReadConnectionManager.php on line 118
[14:33:28] addshore: --^
[14:33:41] >.>
[14:34:06] something got merged in core ? O_o? or whut
[14:34:12] not sure how that could work on your local machine :P
[14:34:27] I didnt pull core
[14:34:31] me neither
[14:35:33] wait, and core hasnt changed...
[14:35:35] maybe it's unrelated and tests are broken on my machine at all
[14:35:43] well, pull core and run it again
[14:35:54] thats something I fixed about 27 days ago
[14:35:56] * benestar checks on Wikibase master
[14:36:23] also occurs on master
[14:36:32] pull core! ;)
[14:36:35] will do
[14:36:40] but that takes ages :P
[14:38:15] relevant change in Wikibase was https://github.com/wikimedia/mediawiki-extensions-Wikibase/commit/93a4cc197d940f7a9e0c68835afe6924b2fc20c0#diff-6cbea56c7e3d1746e742111290cdf71c and https://github.com/wikimedia/mediawiki/commit/6e283d394f31ce24470006f09271db2f21a7f0e7 in core
[14:42:10] yes, I remember
[14:43:05] now I get strange errors but they are shown by PHPUnit ...
[14:43:22] so not like the random exit of the program on jenkins
[14:43:44] My IRC is totally full of addshore's and benestars
[14:43:49] djeeez
[14:43:52] :D
[14:43:59] benestar: what fails do you get? =o
[14:44:02] / errors
[14:44:11] * nikki dilutes JeroenDeDauw's irc a bit
[14:44:33] Wikibase\Client\Tests\Hooks\BeforePageDisplayHandlerTest::testHandleTitleNotExist_NoWikibaseLinks with data set #0 (array(), array(), true, array(), NULL, true) MWException: ResourceLoader duplicate registration error. Another module has already been registered as jquery.i18n
[14:44:40] perhaps some wrong config
[14:45:11] nah, think you need to pull ULS too
[14:45:17] or disable it ;)
[14:45:19] and 1) Wikibase\Lib\Test\PidLockTest::testPidLock with data set #0 ('mediawiki')strpos(): Empty needle
[14:45:27] yeh benestar I get those!
[15:08:15] aude: fixed that PS at the begining of the chain! :)
[15:08:44] addshore: yay
[15:09:00] will look again when i'm done w/ usage tracking or have a minute
[15:09:53] epic, that was a good spot! :P its stupid, the xml output currently returns empty elements, and the other formats dont :D cant wait to kill it all at the end of the chain!
[15:10:27] aaah
[15:10:40] insanity!
[15:23:59] addshore: https://github.com/wmde/WikibaseDataModel/pull/518 can be merged now ;)
[15:24:42] *waits for tests*
[15:24:45] sure
[15:25:02] addshore: how should DerivedPropertyValueSnaks be serialized?
[15:25:16] the slots just added to the object or as an extra "slots" key?
[15:28:14] benestar: not sure ;)
[15:28:51] I think the slots just added to the object, perhaps
[15:29:07] Still, I dont know how these things will easily fit into the serialization component
[15:29:25] we have the same issue with the TypedSnak thing
[15:30:10] yes, I'll just add an instanceof check for now
[15:30:22] to where?
[15:30:35] we should however improve this, and use a DispatchableSerializer
[15:30:39] SnakSerializer
[15:31:03] mhhhm
[15:31:29] but again thats basically useless as we never serialize individual snaks, only when part of a statement etc
[15:31:46] and we have no way to add typed or slotty snaks (which dont extend snak) into statements etc.
[15:34:08] addshore: true
[15:38:21] and thats why we aren't using typed snaks yet :P
[15:42:54] addshore: what do you think? https://github.com/wmde/WikibaseDataModelSerialization/pull/162
[15:43:18] already added my 1 thought comment ;)
[15:43:47] just saw it :)
[15:43:58] as the derived thing extends snak in theory we can do something like ->addDerivedValuesToEntitySnaks( $entity )
[15:44:31] not sure how tidy that would be in practice
[15:44:54] addshore: you mean before passing it to the serialization?
[15:45:00] I think that's the idea :P
[15:45:13] yes
[15:45:15] another option would be to use an interface SnakTypeAssigner
[15:45:20] or DerivedValuesAssigner
[15:45:45] we should also do the same thing for filtering language codes, sitelinks etc
[15:46:00] potentially even props of an entity :/
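A rough sketch of the dispatching-serializer idea floated above, in Python rather than the actual PHP component: each serializer declares which objects it can handle, and a dispatcher delegates to the first one that claims the object, so a DerivedPropertyValueSnak gets its own serializer instead of an instanceof check inside SnakSerializer. All class names, method names, and the "slot" key below are made up for illustration and do not mirror WikibaseDataModelSerialization's real API.

```python
# Illustrative sketch of the dispatching-serializer pattern discussed above;
# names and data are invented, not the Wikibase serialization component.
class PropertyValueSnak:
    def __init__(self, property_id, value):
        self.property_id, self.value = property_id, value

class DerivedPropertyValueSnak(PropertyValueSnak):
    def __init__(self, property_id, value, derived_values):
        super().__init__(property_id, value)
        self.derived_values = derived_values  # the extra "slots"

class SnakSerializer:
    def is_serializer_for(self, obj):
        return isinstance(obj, PropertyValueSnak)

    def serialize(self, snak):
        return {"snaktype": "value", "property": snak.property_id,
                "datavalue": snak.value}

class DerivedSnakSerializer(SnakSerializer):
    def is_serializer_for(self, obj):
        return isinstance(obj, DerivedPropertyValueSnak)

    def serialize(self, snak):
        serialization = super().serialize(snak)
        # "slots just added to the object", the first option mentioned above
        serialization.update(snak.derived_values)
        return serialization

class DispatchingSerializer:
    def __init__(self, serializers):
        # more specific serializers must come first
        self.serializers = serializers

    def serialize(self, obj):
        for serializer in self.serializers:
            if serializer.is_serializer_for(obj):
                return serializer.serialize(obj)
        raise TypeError(f"No serializer registered for {type(obj).__name__}")

serializer = DispatchingSerializer([DerivedSnakSerializer(), SnakSerializer()])
print(serializer.serialize(DerivedPropertyValueSnak(
    "P625", {"latitude": 52.5, "longitude": 13.4}, {"made-up-slot": "example"})))
```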
[16:10:16] Hello everyone
[16:10:43] I had a question on WDQ can anyone help me please ?
[16:11:30] is there any wikidata expert here ?
[16:12:58] just ask, if someone can help, they will
[16:17:40] HELOOOOOOOO
[16:19:20] OOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO
[16:19:59] HELP HELP HELP HELP HELP HELP HELP HELP
[16:24:34] Guest34928: just ask your question!
[16:25:03] I need help on WDQ (Wikidata Query)
[16:25:19] I know how to query items but I don't know how to retrieve values.
[16:25:29] like names of items, etc
[16:28:24] HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP HELP
[16:32:29] addshore: added some patches on DataModel to get rid of Entity
[16:32:39] just hit merge :P
[16:33:38] Hi
[16:34:25] How can I retrieve values from wikidata ?
[16:36:30] heh JohnFLewis I tried that then realised I dont have ops in this channel ;)
[16:36:51] Guest74556: which values? either the API, Dumps or Queries
[16:37:07] addshore: unlucky you ;)
[16:37:25] https://wdq.wmflabs.org/api?q=claim[31:(TREE[12280][][279])]%20AND%20claim[177:1653]
[16:39:47] Guest74556: then to get values you'd need to do an api call
[16:39:59] eg https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q64|Q77
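To make the wbgetentities pointer above concrete, a minimal sketch of that call in Python; the item IDs and language code are just the examples from the URL, and the `requests` package is assumed to be available.

```python
# Minimal sketch of the wbgetentities call suggested above: fetch two items
# and pull out their English labels plus a statement count.
import requests

resp = requests.get("https://www.wikidata.org/w/api.php", params={
    "action": "wbgetentities",
    "ids": "Q64|Q77",
    "props": "labels|claims",
    "languages": "en",
    "format": "json",
})
entities = resp.json()["entities"]

for qid, entity in entities.items():
    label = entity["labels"]["en"]["value"]
    n_statements = sum(len(group) for group in entity["claims"].values())
    print(f"{qid}: {label} ({n_statements} statements)")
```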
[16:45:36] Or there is SPARQL too :)
[18:20:05] There are quite a lot of dates on Wikidata with precision <= 9 (year) that are in the format 01-01T00-00-00Z. Is it something wanted or is it a bug that should be solved?
[18:21:06] Sorry, "01-01T00:00:00Z"
[18:44:30] Tpt: if it contains no year that is invalid
[18:47:35] sorry, I wanted to say that the timestamp is like 1967-01-01T00:00:00Z
[18:47:49] and not 1967-00-00T00:00:00Z
[18:48:07] as dates created using the UI
[18:57:22] Tpt: 1967-00-00 is invalid, but we accept it and probably correct it in the formatter
[18:58:01] jzerebecki: are you sure? This kind of data has been created by the current Wikidata UI since the beginning.
[19:00:06] In the data model spec "the month, day and time will be set to zero if they are unknown"
[19:00:44] https://www.mediawiki.org/wiki/Wikibase/DataModel#Dates_and_times
[19:05:56] Tpt: no idea who added that, but it is still invalid. and yes month 00 and day 00 are widespread on wikidata. the date and time datatype is a mess. that sentence in the data model doesn't give it some special meaning. precision still determines if there is meaning to e.g. the month.
[19:07:04] I think it has been there since the beginning. I remember seeing it in 2013
[19:07:17] I'll open a bug about it
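For context on the precision thread above: a Wikidata time value carries an explicit precision field, and digits finer than that precision carry no meaning. Roughly what a year-only value looks like, with field names from the documented time datavalue format, shown as a Python literal purely for illustration:

```python
# Roughly what the UI stores for a plain "1967" date (precision 9 = year).
# The month and day digits below the stated precision carry no meaning,
# which is why both +1967-01-01 (UI) and +1967-00-00 (some bots) occur.
year_only = {
    "time": "+1967-01-01T00:00:00Z",
    "timezone": 0,
    "before": 0,
    "after": 0,
    "precision": 9,  # 9 = year, 10 = month, 11 = day
    "calendarmodel": "http://www.wikidata.org/entity/Q1985727",  # proleptic Gregorian
}
```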
[20:32:34] Who broke the lua modules :-(
[20:32:44] ?
[20:32:48] where?
[20:33:15] oh, i see
[20:33:26] EVERYWHERE! :P
[20:33:57] legoktm: ^
[20:34:01] aude: https://www.wikidata.org/wiki/Module:Arguments <- wtf
[20:34:08] Looks like it's parsed as wikitext, not Lua
[20:34:13] i think it has the wrong content model
[20:34:15] yeah
[20:34:40] legoktm may have been poking at it, generally
[20:34:40] https://www.wikidata.org/w/index.php?title=Module:Arguments&action=info <- auw
[20:35:17] https://phabricator.wikimedia.org/T107340
[20:35:30] thanks
[20:35:52] looking
[20:35:57] thanks legoktm
[20:36:30] aude / legoktm https://phabricator.wikimedia.org/T107340 I guess?
[20:36:46] do we know when this broke?
[20:37:23] Last 24 hours?
[20:37:29] legoktm: it affects the main page
[20:37:37] hm
[20:37:39] i didn't notice the issue before but surely someone would have
[20:39:23] http://fpaste.org/249567/2358143/raw/
[20:39:37] fixed for now
[20:39:40] ohhhh
[20:39:41] ok
[20:39:47] I think this was always broken
[20:40:13] and https://gerrit.wikimedia.org/r/#/c/160605/ exposed it
[20:40:18] which just went out in today's train
[20:41:23] thanks legoktm
[20:41:27] Thanks, only https://www.wikidata.org/wiki/Module:RFBOT/testcases seems to be left broken
[20:41:45] https://www.wikidata.org/w/index.php?title=Module:RFBOT/testcases&action=edit <- nice fatals
[20:41:49] Fatality!
[20:42:11] :/
[20:42:12] o.O
[20:42:17] * aude looks
[20:42:39] probably revision/title content model mis-match
[20:43:22] yeah
[20:43:33] title is wikitext when it should be module
[20:43:40] Format text/x-wiki is not supported for content model Scribunto
[20:44:27] fixed
[20:44:43] legoktm: FAIL @ https://www.wikidata.org/wiki/Module:Open_RfAs/doc
[20:44:54] Lots of them at https://www.wikidata.org/w/index.php?title=Category:Scribunto_modules_with_errors
[20:44:55] hmm
[20:45:01] I think my script was too aggressive.
[20:45:23] are /doc pages supposed to have a content model of Scribunto?
[20:45:27] or are they supposed to be wikitext?
[20:46:00] Wikitext
[20:46:20] I think all the Module:/ are wikitext
[20:46:30] But I'm not sure
[20:46:45] Just loop over https://www.wikidata.org/w/index.php?title=Category:Scribunto_modules_with_errors and change all pages with a / in them to wikitext again
[20:47:43] We seem to have 135 modules with a / in the name legoktm
[20:48:26] yeah, it's a little more complicated
[20:48:31] we have to update the revision rows too.
[20:48:33] gah
[20:48:51] Good luck
[20:49:31] so /doc is wikitext
[20:49:49] legoktm: MariaDB [wikidatawiki_p]> SELECT page_title FROM page WHERE page_namespace=828 AND page_title LIKE '%/%' limit 150;
[20:50:19] Some of these are wikitext and some submodules. pffff
[20:50:19] it's only /doc
[20:50:24] sigh.
[20:50:35] I see stuff like /testcases
[20:50:43] hmm.
[20:51:28] Other stuff like https://www.wikidata.org/wiki/Module:Wikidata/Globes should be scribunto
[20:52:40] legoktm: Force an update on all the / pages. Then loop over the category with Scribunto errors; that's probably all wikitext by then
[21:00:26] multichill: does https://people.wikimedia.org/~legoktm/wd_modules.txt look right?
[21:03:39] I think this was always broken and https://gerrit.wikimedia.org/r/#/c/160605/ exposed it <-- correct
[21:04:07] legoktm: I think so
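As a side note on the audit step above: the public API is enough to spot this kind of content-model mismatch, because prop=info reports each page's content model. A rough sketch, not the script legoktm actually ran, using the simplified "/doc is wikitext, everything else is Scribunto" rule from the discussion (the real cleanup was more nuanced):

```python
# Rough audit sketch: list Module-namespace (828) pages on wikidata.org and
# report pages whose content model doesn't match a simple expectation.
# Uses only the public API; prop=info exposes the content model.
import requests

API = "https://www.wikidata.org/w/api.php"

def module_pages():
    """Yield (title, contentmodel) for pages in the Module namespace."""
    params = {
        "action": "query",
        "generator": "allpages",
        "gapnamespace": 828,
        "gaplimit": "max",
        "prop": "info",
        "format": "json",
        "continue": "",
    }
    while True:
        data = requests.get(API, params=params).json()
        for page in data.get("query", {}).get("pages", {}).values():
            yield page["title"], page["contentmodel"]
        if "continue" not in data:
            break
        params.update(data["continue"])

for title, model in module_pages():
    # Simplified rule from the chat: /doc subpages are wikitext,
    # everything else should be Scribunto.
    expected = "wikitext" if title.endswith("/doc") else "Scribunto"
    if model != expected:
        print(f"{title}: has {model}, expected {expected}")
```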
[21:04:57] JohnFLewis: Andy is being very aggressive again at https://www.wikidata.org/wiki/Wikidata:Administrators%27_noticeboard#Disruption_at_Template:Authority_control_properties , maybe you can talk some sense into him?
[21:06:25] multichill: I'm stepping back really. I've told Kolja to avoid calling him a spammer as it's a two-way street as Kolja imho is purposefully aggravating him now.
[21:07:58] I can't look into someone's head
[21:09:19] oh crap, i did https://www.wikidata.org/w/index.php?title=Module_talk:RFBOT/testcases&action=history and now i can't edit it
[21:12:36] jackmcbarn: Dude! Why did you move it to the wrong namespace?
[21:12:53] Now legoktm has to fix it again ;-)
[21:13:01] legoktm: Another nice fatal at https://www.wikidata.org/w/index.php?title=Module_talk:RFBOT/testcases&action=edit
[21:14:09] multichill: i didn't
[21:14:18] i moved it to the *right* namespace
[21:14:21] it's the content model that's wrong
[21:14:50] jackmcbarn: You moved it to the talk namespace, that's not the right one
[21:14:55] yes it is
[21:15:09] module:*/testcases is for Lua testing code, and the wikitext that runs the tests goes at module talk
[21:15:09] It's the talk namespace, not the testcases namespace
[21:15:46] wikitext test cases belong in talk
[21:15:58] jackmcbarn: No, see for example https://en.wikipedia.org/wiki/Template:NRHP_row/testcases
[21:16:08] for templates that's not true, but for modules it is
[21:16:17] look how enwiki does all its module test cases
[21:16:22] Or https://en.wikipedia.org/wiki/Module:Authority_control/sandbox/testcases
[21:16:31] exactly
[21:16:32] https://en.wikipedia.org/wiki/Wikipedia:Lua#Unit_testing
[21:16:41] By convention, unit tests for a module like Module:Bananas are placed in Module:Bananas/testcases
[21:16:41] the enwiki module you linked proves my point
[21:16:46] yes, the Lua test cases are
[21:16:50] the wikitext to run the test cases goes in Module_talk
[21:16:59] No, it's in the module namespace
[21:17:13] look at https://en.wikipedia.org/wiki/Module:Arguments/testcases and https://en.wikipedia.org/wiki/Module_talk:Arguments/testcases
[21:17:18] Oh, you have two?
[21:17:25] they're two parts of the same thing
[21:17:30] lua part in Module and wikitext part in Module_talk
[21:17:44] the page i moved was the wikitext part, so it belongs in Module_talk
[21:18:00] That's confusing!
[21:18:06] that's how modules have always been
[21:18:24] Brrrr, talk page abuse
[21:21:45] jackmcbarn: Thanks for the explanation, I should really dive into the wonderful world of Lua some day.....
[21:21:50] np
[21:22:13] ok so,
[21:22:26] https://people.wikimedia.org/~legoktm/wd_modules_log.txt is all done now
[21:22:30] are things still broken?
[21:22:45] just the testcases page?
[21:23:02] jackmcbarn: do I need to fix /testcases?
[21:23:08] yes, it should be changed to wikitext
[21:23:51] legoktm: Probably should file a new bug: moving module pages with content model Scribunto to another namespace gives a fatal
[21:23:54] I'm going to bed
[21:23:56] later
[21:24:19] multichill: I fixed that, it hasn't been deployed yet :)
[21:24:36] Good job!
[21:27:05] Reading https://docs.google.com/document/d/1pxp3JS72odvUc2PYn3-0gwXY-8WrmJhzbcKm9wnNueM/edit?pli=1
[21:27:09] Wrong window
[22:18:03] legoktm: can you fix that one page?
[22:19:15] gah yeah sorry, got distracted
[22:26:53] jackmcbarn: fixed
[22:27:17] looks good
[22:28:37] ok so, now time to run it on all other projects
[22:29:25] legoktm: we should probably add a sanity test to make sure it looks like lua
[22:29:35] to avoid what happened to that stupid testcases page happening on a mass scale elsewhere
[22:30:08] jackmcbarn: if you can come up with a php snippet, I can stick it in my script :)
[22:30:28] legoktm: can you post the script as it is now btw?
[22:32:02] jackmcbarn: http://fpaste.org/249604/14382091/raw/
[22:32:19] it was originally written for one-off changes, then I added in fixing the entire module namespace :|
[22:41:17] i think i've got it; testing it now...
[22:42:15] jackmcbarn: we could just check that it's valid lua syntax?
[22:42:21] that's what i'm doing; pretty much
[22:42:23] I just remembered we have that checking now.
[22:42:25] okay :D
[22:44:12] legoktm: http://fpaste.org/249606/20984214/ are the 2 changed functions
[22:44:48] (note that those are hardcoded to work for modules; you shouldn't keep those in this for other use)
[22:45:16] yeah, I'll just make this changeContentModel3.php ;)
[22:48:11] running in dry-run mode right now
[22:49:02] i don't see provisions for a dry-run mode in that script
[22:49:14] jackmcbarn: I comment out the $page->doEditContent() part :P
[22:51:00] can you paste the logs of it, if there are any "Not valid Scribunto" messages?
[22:55:42] sure, I just have to spot-check that there are no private wikis in that output >.>
[22:56:02] it's at kkwiki right now
[22:56:35] private wikis have secret information in the names of their modules?
[22:56:38] :P
[22:57:21] who knows
[23:13:02] jackmcbarn: no invalid modules
[23:13:10] in ANY of the wikis?
[23:13:15] yeah
[23:13:16] :D
[23:13:28] * jackmcbarn finds that a bit hard to believe, but ok...
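The sanity check discussed above (refuse to flip a page to the Scribunto content model unless it actually looks like Lua) was done with MediaWiki/Scribunto's own validation. As a rough stand-alone stand-in, one can shell out to the stock Lua compiler, whose -p flag parses the source without writing bytecode. A hedged sketch, assuming `luac` is on the PATH:

```python
# Rough stand-in for the "does this look like valid Lua?" sanity check.
# The real script used Scribunto's own validation; this just runs `luac -p`,
# which parses the file and reports syntax errors without producing output.
import subprocess
import tempfile

def looks_like_lua(source: str) -> bool:
    """Return True if `source` parses as Lua, False on a syntax error."""
    with tempfile.NamedTemporaryFile("w", suffix=".lua") as f:
        f.write(source)
        f.flush()
        result = subprocess.run(
            ["luac", "-p", f.name],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
    return result.returncode == 0

print(looks_like_lua("return { answer = 42 }"))  # True
print(looks_like_lua("== wikitext heading =="))  # False
```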
[23:13:41] A group can have two MusicBrainz ids leading to the exact same content.. what
[23:13:58] hoo: which group?
[23:14:21] Ah, got it... seems one is the canonical one, the other some sort of redirect
[23:14:28] https://www.wikidata.org/wiki/Q323803
[23:14:42] (you need the primary sources tool enabled to see the second id)
[23:14:54] https://musicbrainz.org/artist/411a1f3c-fe92-46bf-84ff-05f41466d4cd https://musicbrainz.org/artist/5de80871-f721-4c7c-b0c5-4ad0d350c232
[23:15:26] The 5d… one seems to be the "right" one given the links even from the 411a… one point to that id
[23:17:36] might want to ask in #musicbrainz
[23:17:53] I'm pretty sure by now
[23:18:07] ok :)
[23:18:14] but still, they could make that a little more obvious :P
[23:19:25] jackmcbarn: ok, running it now
[23:26:41] any wikidata admins around?
[23:27:11] hoo is.
[23:27:53] huh?
[23:27:58] hoo: can you delete https://www.wikidata.org/wiki/Module:IDs/doc and then move https://www.wikidata.org/wiki/Module:IDs/doc/doc to Module:IDs/doc ?
[23:29:30] done
[23:29:54] thanks
[23:30:04] https://it.wikisource.org/wiki/Modulo:Dati/Bettinelli_-_Opere_edite_e_inedite,_Tomo_7,_1799.djvu o.o
[23:39:11] jackmcbarn: aaaand done.
[23:39:40] 435 modules fixed