[01:44:59] hi, is there anyone around?
[08:07:38] good morning
[08:08:24] good morning hashar
[08:14:17] addshore: aude: I am around :]
[08:15:49] for the audience, testwikidatawiki and wikidatawiki are still on 1.28.0-wmf.18
[08:16:03] the dispatch script / jobs got bugged and were no longer processing when on .19
[08:17:43] Morning hashar
[08:18:00] See the ticket, the error isn't just happening on wikibase, but also on other random requests on other wikis
[08:22:42] arghh
[08:22:49] that is the getConfiguration.php thing?
[08:22:53] Yep :/
[08:23:00] well
[08:23:06] let me migrate the wikis to WordPress
[08:23:11] might be more robust
[08:23:16] ;)
[08:23:39] But the number of logs didn't seem worthy of shouting about too much when everyone wanted to go to bed last night ;)
[08:24:22] But yeh, the dispatch script runs every 3 mins, and hit the error every time, hence dispatch lag rockets, alarms sounded and we actually noticed! :)
[08:24:58] also
[08:25:09] seems like Aaron has hot fixed the mw/ext/Wikidata repo
[08:25:19] due to LBFactory being renamed in core
[08:25:27] so most probably the WikibaseClient.git is still broken
[08:26:08] and where have you found the nice stacktrace at https://phabricator.wikimedia.org/T145819#2642193 ? :D
[08:26:21] been hunting for that yesterday but could not find where they were logged
[08:26:21] Exception.log on fluorine!
[08:26:25] oh my god
[08:26:44] I remember we had issues with the dispatch script not logging before... So we made it just log with the rest of the exceptions ;)
[08:26:53] But yeh, it also took me a little time to find..
[08:27:00] I also found a few jobs spurting "This script must be run from the command line"
[08:27:12] Ooh, interesting
[08:27:21] There is another ticket that may be relevant then!
[08:28:05] MWException from line 561 of /srv/mediawiki/php-1.28.0-wmf.19/includes/SiteConfiguration.php: Failed to run getConfiguration.php.
[08:28:16] that one is from CentralAuth
[08:28:16] :(
[08:30:00] hashar: T111441
[08:30:00] T111441: SiteConfiguration::getConfig() does not work in Wikimedia production - https://phabricator.wikimedia.org/T111441
[08:30:52] May be vaguely related!
[08:32:07] SiteConfiguration::getConfig() uses wfShellExec() to call maintenance/getConfiguration.php
[08:32:11] seriously ... :(
[08:32:28] Yeh I know, that's about when I decided to go to bed ;)
[08:32:41] I resign
[08:32:59] I am going to buy land in the middle of France
[08:33:03] and grow Tomatoes instead
[08:33:37] Can I come? :0 I'm near France!
[08:33:49] sure
[08:33:59] will send you the incorporating paperwork. Let me call an accountant :]
[08:34:25] so looks like wfShellExec is passed REQUEST_METHOD
[08:42:02] so from https://logstash.wikimedia.org/goto/259821dc32242eb3fde0cd02755685f6
[08:42:09] that looks like some regression in wmf.19
[08:42:19] yeh
[08:42:40] which hits Wikidata pretty hard
[08:42:49] due to one of the wikidata jobs actively using that codepath
[08:42:49] I didn't go diving through the changes yet to see if I could find out what exactly happened...
[08:43:10] and yeh, wikidata hits it every 3 mins for all dispatch stuff, thus dispatch totally breaks
[08:44:32] did I say that shelling out to run a maintenance script is TERRIBLE?
[08:45:29] Not quite in those words, but yes! ;)
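The failure hashar is ranting about (08:32-08:44) comes from a web request shelling out to a CLI maintenance script. A minimal sketch of that pattern, in Python rather than MediaWiki's actual PHP, with hypothetical argument names; the point is that every caller implicitly depends on the CLI runtime and on the resource limits applied to child processes:

```python
# Hypothetical sketch of the "shell out to a maintenance script" pattern
# described above -- NOT MediaWiki's actual code. The real call goes through
# wfShellExec() with shell resource limits applied, which is why a broken CLI
# environment surfaces as failures in ordinary web requests.
import json
import subprocess

def get_foreign_wiki_config(wiki, settings):
    """Fetch another wiki's settings by invoking its maintenance script."""
    cmd = [
        "php", "maintenance/getConfiguration.php",
        "--wiki", wiki,
        "--format", "json",
        "--settings", " ".join(settings),
    ]
    proc = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
    if proc.returncode != 0:
        # The equivalent of the "Failed to run getConfiguration.php"
        # MWException seen in the exception log.
        raise RuntimeError("Failed to run getConfiguration.php: " + proc.stderr)
    return json.loads(proc.stdout)
```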
[08:46:32] harmonia: isn't it better to wait for a bot to fix the redirect? You now delete sources. https://www.wikidata.org/w/index.php?title=Q5284&type=revision&diff=376722678&oldid=371935730
[08:47:33] sjoerddebruin: uh? normally it shouldn't delete sources
[08:47:47] rah
[08:47:51] petscan always worked like that
[08:47:52] I have stopped it
[08:48:11] ><
[08:48:37] If you aren't running other scripts at the time, you can easily rollback.
[08:48:47] yep, will do so
[08:48:56] easiest way
[08:49:05] You know https://meta.wikimedia.org/wiki/User:Hoo_man/Scripts/Smart_rollback?
[08:49:46] nope
[08:49:53] addshore: got a bunch of them in archive/exec.log-20160916 :)
[08:50:13] oooh
[08:51:00] harmonia: a useful script that rolls back everything on a given contributions page
[08:51:14] hashar: back a bit later!
[08:51:25] sjoerddebruin: how does that work?
[08:53:17] https://www.wikidata.org/w/index.php?title=Special:Contributions&offset=20170101000000&limit=288&contribs=user&target=Harmonia+Amanda&namespace=&tagfilter=OAuth+CID%3A+378 displays all edits that should be reverted
[08:53:33] A link has been added at the top right of the page.
[08:54:08] Then fill in a useful summary, maybe mark as bot edits (if you can) and click "revert everything"
[08:54:32] sjoerddebruin: thank you!
[08:54:40] I'm doing it
[08:55:38] If you add the added code on https://meta.wikimedia.org/wiki/User:Sjoerddebruin/global.js too, it will show up under tools instead (better place imo).
[08:55:48] sjoerddebruin: thank you very much and I'll be wary of petscan now
[08:56:06] No problem. :)
[08:56:20] I thought it only deleted strictly what we asked to delete, so not the property values with qualifiers or sources
[08:56:35] and then i would have corrected those manually
[08:56:57] i don't even know why I thought that ><
[08:57:09] well, no harm done, i think, but still :s
[08:57:33] Well I may have removed some too, but most were "imported from"
[08:58:09] yay, useless edits
[08:58:12] *sigh*
[09:01:45] at least you don't waste your time rolling them back now
[09:02:13] yep, that's a good point
[09:31:00] Oh, what a coincidence. https://twitter.com/maeusehaut/status/776698855949791232
[09:35:31] is there a way to get an inventory of the bots that are actively working on a given set of items (for instance all items that are about software) ?
[10:00:04] harmonia or you could ask rollbot :p
[10:00:26] Alphos: too late
[10:00:31] ^^
[10:00:50] but yeah, I would have asked you and not done this manually
[10:18:07] addshore: and the root cause is: [pid 25957] write(7, ... , 1024 ) = -1 EFBIG (File too large)
[10:39:48] hashar: :D
[10:47:08] ohai tarrow
[10:48:19] oh hiya addshore
[10:52:14] I'm obviously here because I'm trying to get more out of WDQS than I can get at the moment. Am I right in understanding that the best way to find all items which have a particular ancestor is using the blazegraph gas api?
[10:54:39] Umm.. why are people (Succu) removing sources I've added? https://www.wikidata.org/w/index.php?title=Q26884531&curid=28796687&diff=376713164&oldid=376713147
[10:55:30] tarrow: indeed
[10:56:45] Josve05a who needs sources anyway ?! :p
[10:56:52] For example I want to get all species that are animals with http://tinyurl.com/htjym8b. Is this the best way to get it without timing out? should I change gas class?
[10:56:52] citations are for chumps ! :p
[10:58:08] Alphos: You are a chump {{citation missing|reason=due to you being a chump}}
[10:58:08] [2] https://www.wikidata.org/wiki/Template:citation_missing
[10:59:44] ^^
[11:01:13] tarrow http://tinyurl.com/zsmmtd7 SERVICE wikibase:label is slow, rdfs:label goes a bit faster but is somewhat more complicated for fallbacks. since your original query didn't ask for a fallback, i guess that'll do :)
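The GAS traversal tarrow is asking about (10:52, 10:56) looks roughly like the following against the WDQS endpoint. A sketch only, assuming wdt:P171 ("parent taxon") and wd:Q729 ("Animalia") are the intended property and root; the gas:* parameters are Blazegraph's BFS service as exposed by WDQS, and the hidden tinyurl queries are presumably along these lines:

```python
# Sketch of a Blazegraph GAS (BFS) traversal against WDQS: find all taxa
# whose parent-taxon chain reaches Animalia. Assumes P171 = "parent taxon"
# and Q729 = "Animalia". wd:/wdt: prefixes are predefined by WDQS.
import requests

QUERY = """
PREFIX gas: <http://www.bigdata.com/rdf/gas#>
SELECT ?item WHERE {
  SERVICE gas:service {
    gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.BFS" ;
                gas:in wd:Q729 ;                    # start at Animalia
                gas:traversalDirection "Reverse" ;  # follow incoming P171 links
                gas:linkType wdt:P171 ;
                gas:maxIterations 10 ;              # bump this to go deeper
                gas:out ?item .
  }
}
"""

r = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "gas-example/0.1"},
)
r.raise_for_status()
print(len(r.json()["results"]["bindings"]), "results")
```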
[11:01:28] 93658 Results in 13785 ms
[11:01:35] :D
[11:02:39] Alphos: Thanks! Now the question is how high can I push the maxIterations before it times out :P
[11:03:14] Josve05a: probably best to ask succu :) maybe they thought it was a mistake or something
[11:03:44] I'd even be happy to consider skipping the label altogether and resolving them later; is there an api for bulk resolution of Q's to labels?
[11:03:45] nikki: S/he's done the same on 2-3 more items I've done in the last day..
[11:04:20] but wondering if I've missed that we should not have sources on specific properties, or...
[11:05:25] tarrow http://tinyurl.com/jgxzrcf and that's with fallback to the French label, equivalent to SERVICE wikibase:label { bd:serviceParam wikibase:language "en,fr" . }
[11:05:40] 93657 Results in 19925 ms if you were wondering :p
[11:06:28] I can't see anything obvious, but I don't know much about taxonomy stuff anyway
[11:06:40] it's still usually best to ask the person doing the edits, only they know for sure why they're doing something :P
[11:08:07] Alphos: cool! I'll bump maxIterations and see if I get any further
[11:08:44] tarrow gas:maxIterations 12 ; # goes without flinching , 239308 Results in 33804 ms
[11:09:02] (without the fallback, just the english label)
[11:11:02] 14 takes an awful long time :x
[11:11:12] weird that it hasn't timed out yet
[11:11:22] Alphos: thanks, I think it gets better each time you run a query; I guess there is some caching somewhere
[11:11:38] tarrow actually it only gets better after the first time
[11:11:52] ah, so it's always worth running twice
[11:11:55] and only for a few minutes, and only if you don't change anything at all in the query, including whitespace
[11:12:09] the third time should take about the same time as the second ;)
[11:12:33] ah, there isn't any caching of 'popular' vertices in blazegraph then?
[11:12:46] not that i'm aware of
[11:13:09] (but that would be so awesome, it would teach people to write their queries incrementally :x )
[11:14:05] yep, it would be really cool. I don't really have any idea what magic goes on under the blazegraph hood
[11:14:22] sometimes weird bugs, i found
[11:14:29] other than that, no idea either :p
[11:15:29] blazegraph bus are the best bugs ! https://phabricator.wikimedia.org/T143897 https://phabricator.wikimedia.org/T145466
[11:15:36] s/bus/bugs/
[11:20:49] SMalyshev: how long will it take until all items have the number of statements in WDQS?
[11:20:49] Haha, that's excellent. Nothing like an hour ago actually being sometime early next week :P
[11:22:49] tarrow nothing like the united states being a human being either !
[11:24:04] https://www.wikidata.org/wiki/Q10717559 O_O
[11:24:13] i don't even know what to do
[11:24:29] addshore: do you know if there's a bulk api to go from Qid to label?
[11:24:40] (funny thing is, in 1655, louis the XIVth is said to have said "The State is Me", and WDQS confirms that statement :p http://tinyurl.com/hmwd9dy )
[11:27:41] tarrow: I dunno if there are any better ways, but one option would be to use something like "values ?item { wd:Q1 wd:Q2 etc } ?item rdfs:label ?label filter (lang(?label) = "en")" (or whichever language you want)... the main problem is making sure the query stays under the url limit
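nikki's VALUES approach, spelled out as a runnable sketch; the batch size is what keeps the GET request under the URL length limit mentioned above:

```python
# nikki's suggestion as a runnable sketch: resolve a batch of Q-ids to
# English labels in one SPARQL request. Keep batches small enough that the
# query stays under the URL length limit.
import requests

def labels_for(qids):
    values = " ".join("wd:" + q for q in qids)
    query = (
        "SELECT ?item ?label WHERE { "
        f"VALUES ?item {{ {values} }} "
        '?item rdfs:label ?label . FILTER(lang(?label) = "en") }'
    )
    r = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": query, "format": "json"},
        headers={"User-Agent": "label-batch-example/0.1"},
    )
    r.raise_for_status()
    return {
        b["item"]["value"].rsplit("/", 1)[-1]: b["label"]["value"]
        for b in r.json()["results"]["bindings"]
    }

print(labels_for(["Q1", "Q2", "Q42"]))
```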
[11:28:32] harmonia: https://www.wikidata.org/w/index.php?title=Q10717559&action=edit&restore=23120053? :P
[11:29:34] sjoerddebruin: ^^
[11:29:40] ah, yeah. I was deliberately trying not to hammer WDQS too much. I think just wbgetentities is probably best
[11:30:03] how many are you trying to get anyway?
[11:30:20] Well, it will probably be around 600-700k
[11:31:07] ah
[11:32:43] I'm not in charge of any of it, but I would have thought either option would be fine as long as you do the requests sequentially
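And the wbgetentities route tarrow settled on, as a sketch with sequential batches (the API caps ids at 50 per request for non-bot accounts):

```python
# The wbgetentities route discussed above, sketched: fetch labels for a
# large id list in sequential batches of 50 (the per-request limit for
# non-bot accounts).
import requests

API = "https://www.wikidata.org/w/api.php"

def fetch_labels(qids, lang="en", batch=50):
    labels = {}
    for i in range(0, len(qids), batch):
        r = requests.get(API, params={
            "action": "wbgetentities",
            "ids": "|".join(qids[i:i + batch]),
            "props": "labels",
            "languages": lang,
            "format": "json",
        }, headers={"User-Agent": "label-fetch-example/0.1"})
        r.raise_for_status()
        for qid, ent in r.json().get("entities", {}).items():
            label = ent.get("labels", {}).get(lang)
            if label:
                labels[qid] = label["value"]
    return labels
```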
[11:34:07] harmonia it's bad faith on your part, as if you've never heard of that given name ! :p
[11:34:27] Alphos: :p
[11:34:34] "well hello, Mr Världsmästerskapet i fotboll 1930 (gruppspel) Smith !"
[11:34:43] https://www.wikidata.org/wiki/Q211024 ;)
[11:34:48] what a mouthful
[11:34:49] although i'd like to point out it would be better to create a second item for the family name
[11:35:03] sjoerddebruin: Albin! :p
[11:35:09] it would be particularly useful for Mrs Världsmästerskapet i fotboll 1930 (gruppspel) Världsmästerskapet i fotboll 1930 (gruppspel)
[11:35:16] Yeah, we have a spoken version of that article...
[11:35:27] (yes, it's a first name that can be for male or female)
[11:35:33] It's sad we can't include the whole name of https://www.wikidata.org/wiki/Q2732136
[11:36:22] "Wolfe­schlegelstein­hausenberger­dorffvoraltern­waren­gewissenhaft­schaferswessen­schafewaren­wohlgepflege­und­sorgfaltigkeit­beschutzen­von­angreifen­durch­ihrraubgierigfeinde­welche­voraltern­zwolftausend­jahres­vorandieerscheinen­wander­ersteer­dem­enschderraumschiff­gebrauchlicht­als­sein­ursprung­von­kraftgestart­sein­lange­fahrt­hinzwischen­sternartigraum­auf­der­suchenach­diestern­welche
[11:36:22] ­gehabt­bewohnbar­planeten­kreise­drehen­sich­und­wohin­der­neurasse­von­verstandigmen­schlichkeit­konnte­fortplanzen­und­sicher­freuen­anlebens­langlich­freude­und­ruhe­mit­nicht­ein­furcht­vor­angreifen­von­anderer­intelligent­geschopfs­von­hinzwischen­sternartigraum" is a candidate for an item though
[11:38:22] ah
[11:38:36] LlanfairPG !
[11:38:43] it can be problematic in administrative surveys :p
[11:39:21] https://upload.wikimedia.org/wikipedia/commons/a/ae/Cy-Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch.ogg <3
[11:39:36] i especially love the double double l in the middle :)
[11:40:44] sjoerddebruin now i'm wondering if the guy had had a son named Robert'); DROP TABLE students;--
[11:41:02] hello Tpt !
[11:41:07] Alphos: https://xkcd.com/327/
[11:41:16] :)
[11:41:19] Hello harmonia :)
[11:41:20] sjoerddebruin where do you think i got the idea from ? :p
[11:43:27] sjoerddebruin : "well hello, Robert'); DROP TABLE students;-- Wolfe­schlegel­stein­hausen­berger­dorff­welche­vor­altern­waren­gewissen­haft­schafers­wessen­schafe­waren­wohl­gepflege­und­sorg­faltig­keit­be­schutzen­vor­an­greifen­durch­ihr­raub­gierig­feinde­welche­vor­altern­zwolf­hundert­tausend­jah­res­voran­die­er­scheinen­von­der­erste­erde­mensch­der­raum­schiff­genacht­mit­tung­stein­und
[11:43:27] ­sieben­iridium­elek­trisch­motors­ge­brauch­licht­als­sein­ur­sprung­von­kraft­ge­start­sein­lange­fahrt­hin­zwischen­stern­artig­raum­auf­der­suchen­nach­bar­schaft­der­stern­welche­ge­habt­be­wohn­bar­planeten­kreise­drehen­sich­und­wo­hin­der­neue­rasse­von­ver­stand­ig­mensch­lich­keit­konnte­fort­pflanzen­und­sicher­freuen­an­lebens­lang­lich­freude­und­ru­he­mit­nicht­ein­f
[11:43:28] urcht­vor­an­greifen­vor­anderer­intelligent­ge­schopfs­von­hin­zwischen­stern­art­ig­raum, how's the weather in Taumatawhakatangihangakoauauotamateaturipukakapikimaungahoronukupokaiwhenuakitanatahu ?"
[11:43:48] https://www.wikidata.org/wiki/Q26903397
[11:43:57] VIP's labels don't work ;_;
[11:44:26] this looks like a bug
[11:44:32] * harmonia is unkind probably, but she's laughing
[11:44:42] you can create items with such long labels, but you can't add labels...
[11:45:28] Lydia_WMDE: ^
[11:46:08] https://www.wikidata.org/wiki/Q2732136#P734 is also borked
[11:51:24] found the ticket
[11:52:12] RECOVERY - puppet last run on wdqs1001 is OK: OK: Puppet is currently enabled, last run 2 minutes ago with 0 failures
[12:19:09] addshore: so eventually i found it the hard way
[12:19:30] addshore: the ulimit for the file size is 512MBytes (defined in mediawiki-config, and I have better documented it in a change you are a reviewer for)
[12:19:46] addshore: but terbium's hhvm cli cache was 876 Mbytes --> failure
[12:19:56] and most probably a ton of hhvm app servers have the same issue
[12:20:11] I have no idea though why changing the wmf version suddenly causes hhvm to try to write to the file
[12:20:16] when the cache should already be populated
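The root cause hashar describes (12:19-12:20) is easy to reproduce: with a file-size rlimit below the file being written -- the 512 MB shell ulimit versus the 876 MB cli.hhbc.sq3 cache -- the write fails with EFBIG, exactly as in the strace line at 10:18. A sketch (run it in a throwaway process, since the lowered hard limit cannot be raised again; SIGXFSZ is ignored so the failure surfaces as an errno instead of killing the process):

```python
# Reproducing the EFBIG root cause: RLIMIT_FSIZE below the size of a file
# being written makes the write fail with "File too large". By default the
# kernel sends SIGXFSZ instead, so it is ignored here to expose the errno.
import errno
import resource
import signal

signal.signal(signal.SIGXFSZ, signal.SIG_IGN)
resource.setrlimit(resource.RLIMIT_FSIZE, (1024, 1024))  # 1 KiB limit

try:
    with open("/tmp/efbig-demo", "wb") as f:
        f.write(b"x" * 2048)  # past the limit
        f.flush()             # the partial write beyond 1 KiB raises here
except OSError as e:
    assert e.errno == errno.EFBIG
    print("write failed:", e)  # EFBIG: File too large
```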
[12:27:04] Oooh
[12:30:23] hashar: you mean Kbyte, right?
[12:31:02] Tobi_WMDE_SW: the cache file ?
[12:31:05] no it is Mbytes
[12:31:25] on a prod server that just had the "Failed to run getConfiguration.php" error I got:
[12:31:30] -rw-r--r-- 1 www-data www-data 202M Sep 16 12:28 cli.hhbc.sq3
[12:31:30] -rw-r--r-- 1 www-data www-data 2.3G Sep 16 04:10 fcgi.hhbc.sq3
[12:31:35] 202M and 2.3Gbytes
[12:31:51] hashar: oh, I was confused
[12:32:01] it's *1024 :)
[12:32:23] yeah it is rather confusing
[12:39:01] Tobi_WMDE_SW: addshore I am rolling back
[12:39:06] accounts can no longer be created
[12:40:44] hashar: hi
[12:41:06] i saw getConfiguration errors for other stuff, besides wikidata
[12:41:58] hashar: okay!
[12:42:40] hashar: would increasing that limit not be the solution though? or would we want to get to the root cause of the increase?
[12:42:48] at this point no
[12:42:51] it is already 3pm
[12:42:54] i also see https://gerrit.wikimedia.org/r/#/c/310726/ from aaron
[12:42:55] I am leaving in a couple hours
[12:43:00] :)
[12:43:04] * aude just woke up
[12:43:16] so I am just rolling back the whole mess
[12:43:19] aude: the LBFactory stuff is not in the branch afaik
[12:43:23] hashar: okay!
[12:43:23] and hope we will be back in a sane position
[12:43:33] i think the best thing is to cut a new wikibase / wikidata branch on monday
[12:43:58] and then maybe we could update test wikis on monday to use it
[12:44:36] and I think some of the issues might not be specific to wikidata also
[12:44:39] getConfiguration
[12:45:17] i also see
[12:45:18] message Count
[12:45:18] Warning: Failed to run getConfiguration.php. [Called from Wikibase\Client\Hooks\UpdateRepoHookHandlers::doTitleMoveComplete in /srv/mediawiki/php-1.28.0-wmf.19/extensions/Wikidata/extensions/Wikibase/client/includes/Hooks/UpdateRepoHookHandlers.php
[12:45:29] yeh aude the getConfiguration issue is a core / cli issue (that is currently deployed in wmf19). The LBFactory thing was just in tests afaik, not deployed and not in wmf19
[12:45:31] so that's on wikipedias
[12:46:56] yeah LBFactory is a different issue
[12:47:07] i can maybe look more during the weekend
[12:47:21] the getConfiguration
[12:47:22] * aude is moving today + tomorrow + flying to germany tomorrow :)
[12:47:30] I am afraid we might well have the same issue despite rolling back
[12:47:42] and has nothing to do on sunday
[12:48:14] hashar: the same getConfig issue even after rolling back? :/
[12:48:19] that would be lame
[12:50:21] rolled back
[12:50:25] ok
[12:50:39] sjoerddebruin: thx - having a look :)
[12:52:17] https://www.wikidata.org/wiki/Special:DispatchStats is happy
[12:52:29] \o/
[12:52:35] and account creations are going back
[12:52:35] :(
[12:52:50] I should have caught the drop on https://grafana.wikimedia.org/dashboard/db/authentication-metrics
[12:52:52] way earlier
[12:53:20] what's this?
[12:56:44] wow, hashar there should totally be an alarm for that....
[12:56:53] so
[12:56:56] https://grafana.wikimedia.org/dashboard/db/wikidata-dispatch is all happy
[12:56:57] if account creation error >95% then something is probably wrong
[12:57:09] and
[12:57:10] https://grafana.wikimedia.org/dashboard/db/authentication-metrics?panelId=14&fullscreen
[12:57:11] o_O
[12:57:22] which was at 100% account creation error rate and is recovering
[12:57:30] extremely scary
[12:57:31] it was good to get the dispatch lag icinga notice yesterday right away
[12:57:38] very helpful
[12:57:38] yeah
[12:57:42] our monitoring sucks
[12:57:54] but at least we have pretty good metrics coverage
[12:59:54] aude: yeh, I don't quite understand why it wasn't caught on testwikidata first though? as that dispatch script should have also had the issue?
[13:00:14] i don't know if the dispatcher works totally correctly there
[13:00:15] but I guess we don't monitor the dispatch lag or script for testwikidatawiki..
[13:00:35] ahh okay, from this I am guessing not then :D as it wasn't throwing exceptions that I could see!
[13:00:40] might need https://gerrit.wikimedia.org/r/#/c/208655/
[13:01:04] which i suppose we should make a priority to announce and get deployed
[13:01:40] the dispatcher might work but not the change notification jobs perhaps
[13:02:37] also perhaps minor (or maybe not), i did run into https://phabricator.wikimedia.org/T145624
[13:02:43] PHP Notice: Undefined index: rc_new when running ChangeNotification jobs
[13:04:28] aude is that for master or the rolled-back wmf.18 ?
[13:04:44] ah your dev wiki :]
[13:04:58] yes, but who knows if this happens in production
[13:05:05] or will
[13:22:17] it will for sure :]
[13:22:23] meanwhile, I am writing an incident report
[13:36:46] [=
[13:43:14] hashar: thanks
[13:48:08] http://wordpress.alphos.fr/2016/09/16/944/sparquickl-2-risque-de-deces/ i've had a lot of fun writing that query, and the best part is that it works without timing out ^^
[13:48:38] http://tinyurl.com/hfnmjbx ^_^
[15:26:08] DanielK_WMDE: Hey, around?
[15:46:22] https://twitter.com/ReaderMeter/status/776643039804608517 hot diggity ! thanks SMalyshev :)
[15:48:08] Amir1: hey
[15:48:36] DanielK_WMDE: Do you think we need more tests? https://gerrit.wikimedia.org/r/#/c/310580/ I added a lot
[15:48:54] https://gerrit.wikimedia.org/r/#/c/310580/5/client/tests/phpunit/includes/Specials/SpecialEntityUsageTest.php
[15:48:56] line 67
[15:50:13] Amir1: you added more checks, but you didn't cover any additional cases, right?
[15:50:36] Oh, I see
[15:50:38] e.g. passing the item ID via a named param instead of using subpage syntax.
[15:50:51] I thought you meant more asserts
[15:51:09] that's also good, but not what i meant :)
[15:51:23] okay. Working on it
[15:51:25] thanks
[15:51:37] also, a test that includes an aspect modifier, e.g. L.fa
[15:52:21] Amir1: have you looked into action=info integration for repo yet?
[15:52:56] DanielK_WMDE: not yet. I wanted to get this done first, if that's okay
[16:01:07] Amir1: sure, no rush
[16:01:45] action=info on the repo isn't useful without the special page on the clients anyway
[16:02:04] hm... i guess the repo should also get an api module for exposing client subscriptions
[16:02:28] could you file a ticket for that? no need to work on it right away, but it would be good to have it on phab
[16:07:38] DanielK_WMDE: okay, I'll do it right now
[16:09:01] DanielK_WMDE: https://phabricator.wikimedia.org/T145880
[16:14:43] Amir1: thanks :)
[16:15:05] Amir1: i added a comment regarding the check for the special page name in the test
[16:15:08] there's an easy fix
[16:32:41] multichill: 0,4 seconds <3
[16:33:08] sjoerddebruin: Quite an improvement from 100+ seconds :-D
[16:33:37] Now, I can finally search for items that need much improvement.
[16:33:56] Humans with only one claim, etc
[16:36:08] Wonder if Magnus can add an easy filter to PetScan: equal to, less than or more than X statements.
[16:36:23] You can now do that with sparql
[16:36:55] I see 23813828 items with this property so that seems to be quite complete
[16:37:06] Yeah, but for the inexperienced users it would be much nicer.
[16:42:06] But I love how easy things like http://tinyurl.com/jxf326u are now.
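The statement-count filtering sjoerddebruin and multichill are celebrating (16:32-16:42) relies on the per-entity wikibase:statements value that WDQS now exposes; the tinyurl queries are presumably along these lines (a sketch, with a LIMIT so it stays cheap):

```python
# Sketch of a statement-count query: humans with at most one statement,
# i.e. the items that need the most work. Assumes WDQS exposes the count
# via the wikibase:statements predicate on each entity.
import requests

QUERY = """
SELECT ?item ?st WHERE {
  ?item wdt:P31 wd:Q5 ;          # instance of: human
        wikibase:statements ?st .
  FILTER(?st <= 1)
}
LIMIT 100
"""

r = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "statement-count-example/0.1"},
)
r.raise_for_status()
for b in r.json()["results"]["bindings"]:
    print(b["item"]["value"], b["st"]["value"])
```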
[16:44:31] VIAF down?
[16:54:03] DanielK_WMDE: I was afk
[16:54:04] thanks
[16:54:23] Josve05a: is VIAF working for you?
[16:54:36] sjoerddebruin: no
[16:54:37] SMalyshev: Good job, you made us happy :-)
[16:54:43] ugh
[16:57:34] Let's see if OCLC people are watching their Twitters.
[17:09:07] I need someone who knows how to speak Swedish
[17:09:19] like, pleaaaase
[17:09:26] Harmonia_Amanda: be patient
[17:10:18] Harmonia_Amanda: I could ping Josve05a again, if that isn't annoying. :P
[17:10:22] Platonides: actually I don't know if anyone here speaks Swedish
[17:10:38] Harmonia_Amanda : i don't think that "pleaaaase" is a swedish word :p
[17:10:43] Josve05a probably knows
[17:10:58] a good place to find Swedish speakers would be #wikipedia-sv
[17:11:03] * Harmonia_Amanda bats her eyes at Josve05a
[17:11:23] Hejsan :D
[17:11:28] Platonides but that's cheating ! :p
[17:11:32] Platonides: the last time I went here, I waited two weeks before having someone answer me
[17:11:51] Josve05a: do you really write Swedish?
[17:12:05] Well, I am a native Swede...so maybe? :p
[17:12:09] xD
[17:12:11] fantastic!
[17:12:15] damn, we'll have to look elsewhere then :-(
[17:12:15] Josve05a: https://sv.wikipedia.org/wiki/Robert_Anderson_(sk%C3%A5despelare)
[17:12:48] hmm?
[17:12:55] Josve05a: half the filmography is made by another Robert Anderson
[17:13:02] in Spanish "hacerse el sueco" means to play dumb :P
[17:13:03] oh dear
[17:13:06] born in 1920 in Casey Towns
[17:13:22] another film was made by http://www.imdb.com/name/nm0026434/?ref_=ttfc_fc_cl_t17
[17:13:26] born in 1933
[17:13:44] Josve05a: and the imdb link in the Swedish WP is also wrong
[17:13:47] * Josve05a will clean up and write a bit more after dinner
[17:13:55] Josve05a: so I don't even know how to correct all that
[17:14:07] so thank youuuuuuu
[17:14:20] sjoerddebruin: If you need me to do something, just add a ticket on my board https://phabricator.wikimedia.org/tag/user-josve05a/
[17:14:25] :p
[17:14:32] dinner in 5 min
[17:14:44] Josve05a: https://www.wikidata.org/wiki/Q5553706 and https://www.wikidata.org/wiki/Q26884722
[17:15:01] Well, my windows need to be cleaned...
[17:15:05] Josve05a: I think the third Robert J. Anderson doesn't have a Qid
[17:17:35] And my shoulders hurt, but I don't think all that is covered.
[17:26:51] sjoerddebruin: Well, I can assign someone else :p
[17:27:50] :O
[17:45:17] VIAF up, VIAF down, VIAF up...
[18:20:57] Amir1: +2
[18:21:11] YESSSS
[18:21:51] DanielK_WMDE: I found this interesting (?) bug working on this patch: https://phabricator.wikimedia.org/T145885
[18:22:23] 🤔
[18:32:30] doing a #wikidata talk at socratees now!
[18:33:01] <3
[18:58:54] is that bug where when I edit an exact (+-0) quantity and it goes to +-1 already reported and being fixed? it's very annoying
[18:59:21] SMalyshev: changes are upcoming, AFAIK the code is ready. Only announcement and conversion is needed.
[18:59:37] conversion?
[18:59:49] https://phabricator.wikimedia.org/T115269
[19:00:04] for the conversion: https://phabricator.wikimedia.org/T142087
[19:00:25] ah... that's a bigger one. I'd be happy with just not messing up +-0.
[19:14:46] https://twitter.com/OCLC/status/776861680886317056
[20:01:57] I was trying to write an automated script to upload some population data to wikidata
[20:02:34] I found a lot of population data for villages, but the villages don't exist on wikidata. Should I be creating these village pages ? Is it considered notable enough ?
[20:02:57] ^ This is for India
[20:03:42] What is the license of the data?
[20:03:56] And yeah, they are notable but I think that is more important.
[20:04:56] It's census data, publicly available
[20:05:07] Let me check the exact license though
[20:05:23] There have been some problems with population before, that's why I'm asking.
[20:05:36] I see, makes sense. Give me a moment
[20:06:36] Relevant discussion: https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/ValterVBot_12
[20:13:40] sjoerddebruin: I seem to need some help in understanding the licensing. The official site (data.gov.in) says it's released under "National Data Sharing and Accessibility Policy (NDSAP)"
[20:14:03] https://data.gov.in/sites/default/files/NDSAP.pdf is the official documentation for that ... (scroll down for english)
[20:14:32] unluckily chosen filename, btw
[20:19:07] sjoerddebruin: I notice that all the data from https://en.wikipedia.org/wiki/2011_Census_of_India is taken from the same source. Would that imply it's alright to use it ? Or does wikidata have a different policy ?
[20:19:33] Wikidata has a CC0 license, so if the license of the source requires attribution we might have a problem.
[20:19:52] Ah, ok.
[20:19:57] But I can't find anything about attribution
[20:21:21] Ah! https://data.gov.in/sites/default/files/NDSAP_Implementation_Guidelines_2.2.pdf
[20:22:34] Seems like no attribution is needed, according to that document.
[20:22:58] could you tell me which sections you read that in ?
[20:23:24] "Don't impose Terms of Service, attribution requirements, restrictions on dissemination and so on, which act as barriers to public use of data." ?
[20:23:37] Yeah, I think that is the most clear one.
[20:23:40] sjoerddebruin: ^
[20:23:44] Awesome :+1:
[20:24:25] Also, to be clear I *should* add the villages right ? Even if the village has only ~500 people ?
[20:24:43] It is in a trusted source, I don't see a problem with that.
[20:25:34] (y)
[20:25:35] "These data need to be made available in an open format to facilitate use, reuse and redistribute; it should be free from any license or any other mechanism of control."
[20:27:38] Is there a unique identifier for those villages? A property would be great. :)
[20:29:13] What do you mean by unique identifier ? There would be an "instance of" village I guess, and "country" India
[20:30:16] Well, most census data assigns a code to every village
[20:31:10] Ah, yes, census data does have a 4-level system: state-id, district-id, subdistrict-id, village-id
[20:33:57] Would be great to have a property for that, makes new imports easier?
[20:50:24] sjoerddebruin: https://www.wikidata.org/wiki/Wikidata:Property_proposal/census_area_code
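For the import itself, the usual shape is a wbeditentity call with a JSON entity blob. A sketch only, under assumptions: P31 "instance of" / Q532 "village", P17 "country" / Q668 "India", P1082 "population"; a real bot would also need login, a CSRF token, rate limiting, duplicate checks, and a reference per statement pointing at the census source discussed above:

```python
# Sketch of creating a village item via wbeditentity. Assumed ids:
# P31=instance of, Q532=village, P17=country, Q668=India, P1082=population.
# Assumes unbounded quantity support (the +-0/+-1 conversion discussed at
# 18:58-19:00 above); a production bot needs auth, refs, and rate limiting.
import json
import requests

API = "https://www.wikidata.org/w/api.php"

def village_entity(name, population):
    def statement(prop, datavalue):
        return {"mainsnak": {"snaktype": "value", "property": prop,
                             "datavalue": datavalue},
                "type": "statement", "rank": "normal"}
    return {
        "labels": {"en": {"language": "en", "value": name}},
        "descriptions": {"en": {"language": "en", "value": "village in India"}},
        "claims": [
            statement("P31", {"type": "wikibase-entityid",
                              "value": {"entity-type": "item", "numeric-id": 532}}),
            statement("P17", {"type": "wikibase-entityid",
                              "value": {"entity-type": "item", "numeric-id": 668}}),
            statement("P1082", {"type": "quantity",
                                "value": {"amount": f"+{population}", "unit": "1"}}),
        ],
    }

def create_item(session, csrf_token, entity):
    r = session.post(API, data={
        "action": "wbeditentity", "new": "item",
        "data": json.dumps(entity), "token": csrf_token,
        "format": "json", "bot": 1,
    })
    r.raise_for_status()
    return r.json()
```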
[21:09:59] * aude waves
[21:13:49] * Matthew_ waves at aude
[21:23:22] sajoerddebruin, are you there
[21:25:32] sjoerddebruin*
[21:25:37] ?
[21:29:09] Hey, can you please explain this edit by hazardbot https://www.wikidata.org/w/index.php?title=Wikidata:Project_chat&oldid=376836745
[21:29:49] your message was not signed
[21:30:01] and it's very confusing to repost an archived section, btw
[21:30:25] isn't that what you told me to do
[21:30:34] that's what I understood
[21:30:47] I said pointing to the old discussion, aka linking to it.
[21:31:37] ooh now it makes more sense, I thought I should copy it
[21:32:58] sjoerddebruin, thanks for your patience (:
[22:09:45] is there a gadget that gives a list of entities linked to the current one by a given property ? if not, who would want one ? :)
[22:11:38] so for instance, if you're on wd:Q, and you want a list of all entities that have wdt:P wd:Q
[22:12:58] and while we're at it, if you're on [[Property:P]], and you want a list of all entities that have wdt:P
[22:12:58] Invalid characters in the link «Property:P»; the following are not permitted: #<>[]|{}
[22:13:04] Alphos, I think you can do that with sparql
[22:13:08] but I don't know how
[22:13:16] GhassanMas i know you can do that with sparql, and i even know how
[22:13:39] the question is, would anyone like it as an on-wikidata gadget, if it doesn't already exist ^^
[22:18:31] Alphos: sparql access from lua... theoretically possible, but getting caching/purging right is going to be tricky. Also, queries take too long to be done during rendering.
[22:18:35] but it would be very cool :)
[22:19:33] we are working on this problem for support for automatic list generation for use on wikipedia. but it's a tough nut
[22:20:16] DanielK_WMDE i'm thinking of a js gadget, and i can think of two ways to implement it, although one of them needs checking
[22:21:37] first way is to actually perform an ajax request on wdqs and display its results, that's the one that needs a bit of checking
[22:22:03] second way is plainly and simply to give a link to wdqs/embed.html# ^^
[22:24:57] first problem with the first way is that i'm still unsure where to perform direct queries to the api ^^' haven't really looked into it, is all ; second problem is to check how www.wikidata.org and the domain for the wdqs api handle CORS
[22:25:25] but the second way is really problem-free, to the best of my knowledge :)
[22:27:52] (or well, you know, an iframe :p kinda why it's named embed.html, innit ? ^^ )
[22:30:51] Like EasyQuery?
[22:35:09] pretty much
[22:35:22] except the position of the gadget wouldn't be the same
[22:35:57] easyquery tries to see if there are identical statements to ones that are already loaded on your page
[22:36:22] my gadget would try to see if there are related pages, with the twist that they must be related by a given property
[22:37:22] (and since one could potentially use any property, it'd be pretty hard to put an icon in the statements the gadget isn't gonna use :D )
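Alphos's second, "problem-free" option above is plain URL construction: the WDQS embed page takes the URL-encoded SPARQL query in the fragment, so the gadget would just be generating links like these (a sketch; the gadget itself would be JavaScript, but the URL shape is the same):

```python
# Building a query.wikidata.org/embed.html link (suitable for an <a> or an
# <iframe>) for "entities linked to Q via property P" -- the reverse-property
# listing the gadget discussion above is about.
from urllib.parse import quote

def embed_url(prop, qid):
    query = f"SELECT ?entity WHERE {{ ?entity wdt:{prop} wd:{qid} . }}"
    return "https://query.wikidata.org/embed.html#" + quote(query, safe="")

# e.g. everything whose "instance of" (P31) is "human" (Q5):
print(embed_url("P31", "Q5"))
```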