[03:25:07] hi loles
[07:44:02] ping
[07:45:20] i have a list of wiki articles. is there an easy way to get the wikidata ids?
[07:45:44] the corresponding wikidata ids
[07:48:56] manojkmohan: https://en.wikipedia.org/w/api.php?action=query&utf8&prop=pageprops&titles=Wikidata
[07:54:21] Thanks HakanIST
[08:54:48] query.wikidata.org is giving me 502 bad gateway errors when I try to run queries :(
[08:56:26] indeed
[09:01:56] nikki: already poked SMalyshev on-wiki
[09:02:00] not sure what is going on
[09:02:12] thanks :)
[09:27:00] nikki: rom1504: https://phabricator.wikimedia.org/T134238
[09:34:16] yikes
[10:00:57] Thiemo_WMDE: addshore: hello. Does the purtle library need to be tested on Zend 5.3? It fails https://gerrit.wikimedia.org/r/#/c/286407/
[10:01:50] hashar: i'm guessing no!
[10:02:41] addshore: neat. I am dropping it https://gerrit.wikimedia.org/r/286617
[10:54:38] Thiemo_WMDE: https://phabricator.wikimedia.org/T124786 <- can this be closed?
[10:55:40] addshore: hashar: I uploaded a patch for this yesterday.
[10:58:30] Lydia_WMDE: I will check, but I think the checkboxes in that ticket are up to date. There is more to do.
[11:03:58] addshore: https://phabricator.wikimedia.org/T124418 <- do i read the initially linked graph right that this recently got even worse?
[11:04:11] addshore: and do you know what we need to do to fix it?
[11:04:18] ooooh
[11:05:11] well, per https://grafana.wikimedia.org/dashboard/db/wikipageupdater-calls I don't know if it is wikidata
[11:05:56] addshore: hmmm ok
[11:06:15] but bblack has just posted a graph there with no context :p
[11:06:38] yeah
[11:06:48] ok so i will ask in the ticket
[11:14:08] How can i make a query for random wikidata items where statements like instance of or subclass of and also-known-as exist? Also where the corresponding English wikipedia article exists
[11:15:02] Would i be able to make that from query.wikidata.org?
[12:02:28] GhassanMas: Yes, should be possible. You would have to search how to get a random result but it should be reasonable to do
[12:02:59] *if it's possible and how to get a random result in SPARQL
[12:05:20] Yeah frimelle, I am searching...
[12:07:57] GhassanMas: http://stackoverflow.com/questions/5677340/how-to-select-random-dbpedia-nodes-from-sparql
[12:09:35] So getting a "random" selection is kind of possible. Writing the query shouldn't be too hard with the existing examples at https://www.mediawiki.org/wiki/Wikibase/Indexing/SPARQL_Query_Examples
[12:11:51] GhassanMas: even easier solution for the random thing: ...} ORDER BY RAND() LIMIT 100
[12:13:13] Suppose i made a query about items that have an "instance of or subclass of" statement, then limit the result to 100
[12:13:43] Based on what would the first 100 results be?
[12:14:38] Oh i got you, you mean if i used RAND() the result would be completely random
[12:14:43] ?
[12:34:53] if you use random at the end of your query (where you'd use e.g. LIMIT, too) you could get 100 random items, yes.
[12:35:45] Alright, thanks frimelle
[12:36:07] Very welcome :)
[12:42:16] Hello.
[12:42:31] Lydia_WMDE: we have this information scheduled in Deployments for today:
[12:42:32] 283227 deployment-prep: keyholder shinken monitoring
[12:42:39] er, not the right one
[12:42:44] May 3: Wikiversity gets access to the data on Wikidata (so far only has access to sitelinks)
[12:42:59] Would you know if a change has been prepared for that?
[12:44:56] Dereckson: Not aware of a change for that
[12:45:17] but I'm also not through all of my mails yet
[12:45:45] Such a change would be welcome for the 17:00 UTC morning SWAT.
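
The pageprops lookup HakanIST points to above (07:48) can be scripted directly. A minimal Python sketch, assuming the requests library; batching beyond 50 titles and error handling are left out:

    # Map English Wikipedia titles to their Wikidata IDs via prop=pageprops.
    import requests

    def wikidata_ids(titles):
        """Return a {title: Q-id} dict for a list of enwiki article titles."""
        resp = requests.get(
            "https://en.wikipedia.org/w/api.php",
            params={
                "action": "query",
                "prop": "pageprops",
                "ppprop": "wikibase_item",   # only fetch the Wikidata ID
                "titles": "|".join(titles),  # the API accepts up to 50 titles per request
                "redirects": 1,
                "format": "json",
            },
        ).json()
        return {
            page["title"]: page.get("pageprops", {}).get("wikibase_item")
            for page in resp["query"]["pages"].values()
        }

    print(wikidata_ids(["Wikidata", "Berlin"]))  # e.g. {'Wikidata': 'Q2013', 'Berlin': 'Q64'}
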
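Putting frimelle's ORDER BY RAND() tip together with GhassanMas's requirements (an instance of/subclass of statement, an English alias, and an English Wikipedia article), the query might look roughly like the sketch below. It is untested against the live endpoint, and a query this broad may well time out on the full dataset:

    # Query the WDQS SPARQL endpoint for 100 "randomly" ordered items.
    import requests

    QUERY = """
    SELECT DISTINCT ?item ?article WHERE {
      ?item (wdt:P31|wdt:P279) ?class ;   # instance of / subclass of
            skos:altLabel ?alias .        # has an alias ("also known as")
      FILTER(LANG(?alias) = "en")
      ?article schema:about ?item ;       # corresponding English Wikipedia article
               schema:isPartOf <https://en.wikipedia.org/> .
    } ORDER BY RAND() LIMIT 100
    """

    resp = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY, "format": "json"},
    )
    for row in resp.json()["results"]["bindings"]:
        print(row["item"]["value"], row["article"]["value"])
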
[12:46:17] Yeah, I can have a look
[12:49:43] it's a hoo!
[12:49:51] are we deploying a new branch today?
[12:50:32] not according to the calendar nor https://www.mediawiki.org/wiki/Wikidata_deployment
[12:50:38] but let's ask Lydia_WMDE
[12:50:46] we didn't deploy last week afaik
[12:51:27] oh wow https://www.mediawiki.org/wiki/MediaWiki_1.28/Roadmap
[12:51:31] we're on 1.28 now
[12:54:40] I kind of missed that :P
[12:54:56] I'm fine with deploying this week, but not sure how much I can be around
[12:54:57] i'll ask lydia and jan in a minute
[12:55:02] i can be around
[12:55:12] Will be traveling to Berlin tomorrow and conference on Thursday
[12:55:21] :)
[12:57:41] aude: Do you know whether we have a ticket for data for Wikiversity?
[12:57:45] I can't find one
[12:57:51] and Lydia is not responding right now
[12:58:43] we have a meeting now
[12:58:50] ah right
[12:58:56] I keep forgetting about that when remote
[12:58:59] it's Tuesday
[12:59:00] for wikiversity, i can do it if you want
[12:59:14] we need to remove arbitraryaccess.dblist
[12:59:28] and then just have arbitrary access be the default, including for wikiversity
[13:00:02] aude: Ok, sounds good
[13:00:08] We don't cover beta Wikiversity, do we?
[13:00:10] i don't see a ticket
[13:00:19] assume we do
[13:00:36] if i understand what beta wikiversity is
[13:00:53] oh https://beta.wikiversity.org/wiki/Main_Page
[13:01:06] that's like mul wikisource
[13:01:13] I thought we left it out
[13:01:17] but seems we didn't
[13:01:28] It should get data access in user language, then
[13:04:55] does it have site links on wikidata?
[13:05:05] looks like it
[13:05:43] but not sure it's possible to enter them on wikidata, but it has wikibase
[13:07:00] linkItem is active there
[13:07:06] (the client widget)
[13:07:13] and I suppose the API also supports it
[13:07:15] but not sure
[13:09:08] not sure what the site id is
[13:09:32] oO https://www.wikidata.org/wiki/Q23685749
[13:09:39] Look at the json
[13:09:48] goddammit -.-
[13:10:01] oh
[13:10:15] but not supported in the UI?
[13:10:17] but that's the only site link we have
[13:10:19] at all
[13:10:26] ok
[13:10:32] MariaDB [wikidatawiki_p]> SELECT * FROM wb_items_per_site WHERE ips_site_id = 'betawikiversity';
[13:10:37] hm
[13:10:49] I guess we accidentally enabled it at some point
[13:10:56] but not in the UI
[13:17:42] hoo: we are branching today
[13:17:56] i'll generally be around this week
[13:18:14] Nice :)
[13:24:29] aude: Make sure to delete the 1.28 branches to avoid confusion/problems next week
[13:25:03] doing
[13:48:17] aude: What's the progress on moving term search into elastic?
[13:49:01] there's a lot of refactoring needed of TermIndex etc.
[13:49:35] and TermSearchInteractor etc., which uses TermIndexEntry, which is a bit specific to TermSqlIndex
[13:49:59] hm :/
[13:50:34] given that's the case, i don't know how independent i can make the elastic stuff from wikibase
[13:50:42] if i need to implement TermIndex
[13:51:11] and TermIndex includes write stuff, like clear() the whole thing
[13:51:24] so maybe we want a new thing
[13:51:36] Yeah, make a more narrow interface
[13:51:43] think we need to
[13:51:48] and then have both the SQL and the elastic one implement that
[13:51:50] * aude doesn't want to clear elastic :o
[13:51:58] and inject it into Wikibase via a callback?
[13:52:02] hehe :D
[13:52:04] maybe
[13:52:59] aude: What do we want to do with betawv now? Disable Wikibase or properly enable it? Ask Lydia?
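
The narrow interface aude and hoo settle on above (13:51–13:52) might look roughly like this sketch. Wikibase itself is PHP and all names here are invented; the Python is only to show the shape of the design: a read-only search interface, two backends, injected via a callback so the elastic code can live outside core:

    from abc import ABC, abstractmethod
    from typing import Callable, List

    class TermSearcher(ABC):
        """Narrow, read-only term search; no write operations like clear()."""
        @abstractmethod
        def search(self, text: str, language: str, limit: int = 10) -> List[str]:
            """Return matching entity IDs."""

    class SqlTermSearcher(TermSearcher):
        def search(self, text, language, limit=10):
            return []  # would query the terms SQL table

    class ElasticTermSearcher(TermSearcher):
        def search(self, text, language, limit=10):
            return []  # would query the Elasticsearch index

    # The callback injection point: core code only knows the factory type.
    term_searcher_factory: Callable[[], TermSearcher] = ElasticTermSearcher
    searcher = term_searcher_factory()
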
[13:53:21] not sure
[14:05:46] Hi
[14:05:55] I have a local installation of Wikibase
[14:06:06] and I would like to import the properties from wikidata
[14:06:28] I have tried with various dumps and plugins
[14:06:45] but I do not find an easy way
[14:08:03] what is a feasible way to do this? The only dump of the properties I have found is an RDF one, and I have found no importer for this format
[14:14:12] aga_: There are also other dump formats
[14:14:23] We also have JSON dumps and XML dumps (containing the json as well)
[14:14:54] I think aude's https://github.com/filbertkm/WikibaseImport can do that
[14:18:04] my scripts don't yet work with a dump, though
[14:18:37] They use the API?
[14:19:29] yeah
[14:22:49] Lydia_WMDE: around?
[14:22:51] I had tried with the script, but I was getting an error with php maintenance/importEntities.php --all-properties
[14:22:56] hmmm
[14:23:23] I have managed to start importing directly the rdf dump of the properties
[14:23:46] wikidata-properties.nt with the ImportProperties script from the wikibase extension
[14:24:21] what kind of error?
[14:24:30] The error I was obtaining with WikibaseImport was:
[14:24:30] > php maintenance/importEntities.php --all-properties Catchable fatal error: Argument 1 passed to Wikibase\Import\PropertyIdLister::extractContinuation() must be of the type array, null given, called in C:\wamp\www\mediawiki\extensions\WikibaseImport\src\PropertyIdLister.php on line 26 and defined in C:\wamp\www\mediawiki\extensions\WikibaseImport\src\PropertyIdLister.php on line 56 Call Stack: 0.0005 307000 1. {main}() C
[14:24:37] oh
[14:24:56] jzerebecki: rl-ping
[14:26:08] the script works for me for properties but sure there must be an issue
[14:26:24] like a problem with the api request
[14:28:19] importing the RDF dump is not working. any ideas how i could solve the api request issue?
[14:33:31] aga_: We have nothing to turn RDF back into json
[14:33:46] And we don't even have anything to import our json dumps
[14:34:09] Ok, i will try to debug WikibaseImport
[14:34:10] You could only use the (gigantic) XML dumps and import them, but I would suggest to wait for aude to have a look at her script
[14:34:28] problem is the script works for me
[14:35:00] Maybe one request timed out?
[14:35:03] Just try again?
[14:35:07] it uses the $wgWBImportSourceApi setting, but the default is wikidata for that
[14:39:34] aude: around now
[14:39:57] Lydia_WMDE: there's something called beta.wikiversity (https://beta.wikiversity.org/wiki/Main_Page)
[14:40:07] yeah
[14:40:16] think it's similar to mul.wikisource
[14:40:26] it doesn't work in the UI to add site links to it
[14:40:32] right
[14:40:35] let's deactivate it
[14:40:38] ok
[14:40:51] ok
[14:40:57] there exists one item with a site link to it (added in the api?)
[14:42:10] i'd say remove it
[14:42:19] if it isn't working it shouldn't be there
[14:42:36] can we have a ticket for supporting it
[14:42:43] like mul.wikisource
[14:42:48] so we don't forget
[14:43:44] aude: are you running the script from windows or linux?
[14:43:51] aude: exists already
[14:44:19] aude: Did you update the sites table for jamwiki yet?
[14:44:24] I am running from windows. I will try to analyze the packets sent to wikidata to try to find something
[14:44:55] Ok, no
[14:47:48] hoo: the script is broken :(
[14:47:58] More than usual?
[14:48:03] Or do you mean the usual stuff
[14:48:26] With it sometimes bringing the tables in an invalid state, requiring manual emptying?
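
For context on the fatal error aga_ pastes above (14:24): PropertyIdLister::extractContinuation() received null instead of a continuation array, i.e. an API response without the expected 'continue' block. A hedged Python sketch of the same job — paging through all property IDs while tolerating a missing continuation — might look like this; the real script may use different API parameters:

    # List every property ID on wikidata.org via list=allpages in the
    # Property namespace (120), following API continuation defensively.
    import requests

    def all_property_ids():
        params = {
            "action": "query",
            "list": "allpages",
            "apnamespace": 120,   # Property: namespace on wikidata.org
            "aplimit": "max",
            "continue": "",       # opt in to the new continuation style
            "format": "json",
        }
        while True:
            resp = requests.get("https://www.wikidata.org/w/api.php", params=params).json()
            for page in resp["query"]["allpages"]:
                yield page["title"].split(":", 1)[1]  # "Property:P31" -> "P31"
            cont = resp.get("continue")  # absent on the last batch, so don't crash
            if cont is None:
                break
            params.update(cont)

    print(next(all_property_ids()))
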
[14:48:28] we can wait until tomorrow / thursday
[14:48:39] or backport or hack around it
[14:49:25] https://gerrit.wikimedia.org/r/#/c/286473/
[14:49:51] ah ok
[14:50:22] Lydia_WMDE: link?
[14:51:07] https://phabricator.wikimedia.org/T54971 ?
[14:52:40] aude: yeah
[14:56:16] aude: I do not find anything strange
[14:56:33] do you think you can fix the script?
[15:02:14] aga_: i could probably add more checks + validation in the script to better know what is wrong
[15:02:41] i'd also like to add support for importing from json dumps
[15:03:09] just right now i don't have time to look at it more
[15:25:33] hoo: how long are you in berlin?
[15:27:40] aude: Until Saturday only
[15:28:04] oh
[15:28:24] i'm wondering who wants to take care of the article placeholder deployment?
[15:29:57] That's Wednesday only, right?
[15:30:41] yes
[15:33:19] I can take that
[15:33:39] Should have time all afternoon/evening
[15:34:36] ok, great
[15:34:55] Could do it after morning SWAT
[15:35:52] ok
[17:02:40] Is sparql down for pywikibot?
[17:12:41] tobias47n9e__: there are issues and we are working on it
[17:12:53] sparql / query service in general
[17:14:09] aude: Thanks!
[17:44:25] I miss wikidata item creator :(
[17:45:59] you can do the same with PetScan? just not so simple
[17:46:17] that's the problem :3
[17:48:19] Yup lol
[17:48:38] Finally learned it, a bit more complex than what it used to be
[19:42:12] Lydia_WMDE: we deployed new code on test.wikidata
[19:42:58] aude: \o/
[19:42:58] aude: will put in some time testing it tomorrow morning
[19:43:01] i'm seeing some issues with authority control, but then the gadget code there is a bit old and probably just needs to be updated
[19:43:01] we might also need to purge the parser cache again
[19:43:01] ok?
[19:43:06] *nod*
[19:43:06] ok
[19:43:25] ok
[19:43:48] omg.... my password is not valid!?!
[19:44:43] :D
[19:44:46] too short?
[19:45:48] it's definitely long enough and doesn't involve common words
[19:46:00] think i found a bug
[19:46:18] https://phabricator.wikimedia.org/F3963505
[19:46:19] wtf
[19:46:27] must be at least 8 characters
[19:46:31] must be at least 1 character
[19:46:33] $4
[19:46:56] mine is definitely more than 8 characters
[19:47:20] -.-
[20:15:43] hi, https://meta.wikimedia.org/wiki/Special:OAuthConsumerRegistration/propose says "
[20:15:46] The action you have requested is limited to users in one of the groups: Autoconfirmed users, Confirmed users.
[20:16:51] ", but the documentation on bots strongly suggests using OAuth, such as the one here: https://www.mediawiki.org/wiki/Special:MyLanguage/Manual:Pywikibot/user-config.py
[20:17:35] I'm not even sure if using password-based authentication is good, I'm in what seems to be a corner case
[20:18:02] I don't want to have to use javascript, which means that I can't properly edit through a browser
[20:18:36] so I'm trying to make pywikibot do it instead. right now, I can read pages with it (the content is json-like)
[20:19:12] I haven't tried editing yet, and I don't want to try it on the public wikidata; I'd prefer to use the test one first
[20:19:41] Still, I'd prefer to authenticate with it, to have the edits made under my user
[20:20:32] Hi, I was wondering if there is documentation and/or best practices for the ideal way to enter Authority Control information into Wikidata
[20:21:19] I was going to update a bunch of subjects but I want to be consistent with best practices -- if best practices exist.
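
On the pywikibot question above (20:15–20:19): password-based login works without OAuth and without javascript, and can be pointed at test.wikidata first, as the user prefers. A minimal sketch; 'ExampleUser' and Q42 are placeholders:

    # user-config.py (read by pywikibot; ExampleUser is a placeholder):
    #   family = 'wikidata'
    #   mylang = 'test'
    #   usernames['wikidata']['test'] = 'ExampleUser'
    # pywikibot prompts for the password on first login.
    import pywikibot

    site = pywikibot.Site('test', 'wikidata')   # test.wikidata.org
    repo = site.data_repository()
    item = pywikibot.ItemPage(repo, 'Q42')      # an arbitrary example ID
    data = item.get()                           # the json-like content mentioned above
    print(data['labels'].get('en'))
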
[20:28:03] Hi, is there a way to change the datatype of a property?
[20:29:30] for articles regarding physics, the infoboxes use the datatype math rather than string
[20:31:30] https://www.wikidata.org/wiki/Property_talk:P416 that has only 47 https://query.wikidata.org/#PREFIX%20wdt%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2Fdirect%2F%3E%0APREFIX%20wikibase%3A%20%3Chttp%3A%2F%2Fwikiba.se%2Fontology%23%3E%0A%0ASELECT%20%3Fp%20%3FpLabel%20%3Fw%20%3FwLabel%20WHERE%20{%0A%20%20%20%3Fp%20wdt%3AP416%20%3Fw%20.%0A%20%20%20SERVICE%20wikibase%3Alabel%20{%0A%20%20%20%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22%20.%0A%
[20:31:30] P416 . - https://phabricator.wikimedia.org/P416
[20:31:52] thank you stashbot
[21:56:40] RECOVERY - WDQS SPARQL on wdqs1001 is OK: HTTP OK: HTTP/1.1 200 OK - 15318 bytes in 0.010 second response time
[21:57:59] RECOVERY - WDQS HTTP on wdqs1001 is OK: HTTP OK: HTTP/1.1 200 OK - 15318 bytes in 0.002 second response time
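
For readability, the long query.wikidata.org link above (20:31) URL-decodes to the SPARQL below; the URL in the log is cut off, so the final closing braces are an assumption:

    # Decoded with urllib.parse.unquote from the link's URL fragment;
    # the closing braces are assumed since the URL is truncated.
    P416_USAGE_QUERY = """
    PREFIX wdt: <http://www.wikidata.org/prop/direct/>
    PREFIX wikibase: <http://wikiba.se/ontology#>

    SELECT ?p ?pLabel ?w ?wLabel WHERE {
       ?p wdt:P416 ?w .
       SERVICE wikibase:label {
        bd:serviceParam wikibase:language "en" .
       }
    }
    """
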