[01:34:27] Hi! Is there an operators IRC channel?
[01:35:24] gangleri|home: #wikimedia-ops
[01:35:41] Thanks!
[09:14:45] Lydia_WMDE: The statements section header is still displayed in English.
[10:42:57] Lydia_WMDE: btw one of the properties that's on the identifiers page right now uses the url type (https://www.wikidata.org/wiki/Property:P1421)
[10:44:12] I also tried diffing the list of string properties against the list of properties on the identifiers page; there seem to be around 250 which have ID, identifier or code in the name that aren't on the list of identifiers, even though most of them probably should be
[10:44:44] I added some to the list of identifiers, but I don't have time at the moment to go through all of them
[16:24:00] * hoo waves to aude
[16:24:17] We need to branch today if everything important is in
[16:25:31] hoo: ok
[16:26:08] Do you have anything important in mind? If not, I'll just branch in a bit
[16:27:27] as long as identifiers etc. are in a deployable state
[16:27:58] They should be
[16:27:58] they should still be strings
[16:28:03] afaik
[16:28:05] yeah
[16:28:12] nobody wrote the parser, yet
[16:28:16] ok
[16:28:19] eh, formatter
[16:29:07] We could start converting properties on test this week
[16:35:00] hoo: maybe we want https://gerrit.wikimedia.org/r/#/c/262741/ and the corresponding patch in wikibase to go in
[16:36:45] Mh... if you want to merge the GeoData one
[16:36:52] https://gerrit.wikimedia.org/r/#/c/262845/
[16:36:54] yeah
[16:36:57] we really shouldn't have the repo get out of sync
[16:37:14] it won't break (fatal), but will cause secondary data loss (geo data)
[16:37:27] i don't want the geodata one to get merged, say, this week and get deployed next week
[16:37:29] w/o the wikibase one
[16:38:01] +1
[16:38:54] gave +2 to max's patch
[16:38:58] Nice :)
[16:39:21] Are there other extensions that bind against GeoData?
[16:39:27] i don't think so
[16:39:37] Ok :)
[16:39:43] Will merge your patch, then
[16:39:58] * aude checks
[16:40:36] ok, have to update mobile frontend
[16:40:39] should be easy
[16:41:29] :)
[16:44:50] Seems MF follows the standard deployment cycle
[16:44:58] so will +2 as well
[16:45:36] thanks
[16:46:17] I'm creating new property suggester data and will update later on, btw
[16:47:12] ok :)
[17:05:17] hello
[17:05:51] hoo: i can put https://gerrit.wikimedia.org/r/#/c/263354/ in for SWAT on wednesday (along with other stuff we have)
[17:06:12] I've a problem with wikibase-api php
[17:06:44] it occurs when I try to run my bot code on wikidata
[17:07:50] aude: Would be nice, because I won't make it in time for the early SWAT (also not tomorrow and probably not on Thursday)
[17:08:05] HakanIST: What exactly do you struggle with?
[17:08:16] hoo http://aq.si/wikidata/test.php
[17:08:23] it's the very basics
[17:08:42] code: http://aq.si/wikidata/test.phps
[17:09:10] new \Wikibase\Api\WikibaseFactory( $api );
[17:09:22] that apparently needs a Deserializer as second parameter
[17:09:59] https://www.wikidata.org/wiki/Wikidata:Creating_a_bot#Example_1:_Basic_example
[17:10:08] this is the example I go by
[17:10:35] that's not fully up to date, then
[17:10:54] Or the version of the component you have is not, but it's more likely that the wiki is outdated
[17:14:15] if only I knew the version of the relevant component
[17:14:56] been stuck at this point for a long time :/
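
The error HakanIST describes matches a signature change in the addwiki wikibase-api library: newer releases of WikibaseFactory take a data value deserializer and serializer in addition to the API object. A minimal sketch of the updated construction, assuming a Composer install of the addwiki and DataValues packages (the data value class map may need adjusting to the versions actually installed):

    <?php
    // Sketch: constructing WikibaseFactory with the deserializer/serializer
    // arguments newer library versions expect.
    require_once __DIR__ . '/vendor/autoload.php';

    use Mediawiki\Api\MediawikiApi;
    use Wikibase\Api\WikibaseFactory;
    use DataValues\Deserializers\DataValueDeserializer;
    use DataValues\Serializers\DataValueSerializer;

    $api = new MediawikiApi( 'https://www.wikidata.org/w/api.php' );

    $wikibaseFactory = new WikibaseFactory(
        $api,
        new DataValueDeserializer( array(
            'unknown' => 'DataValues\UnknownValue',
            'string' => 'DataValues\StringValue',
            'boolean' => 'DataValues\BooleanValue',
            'number' => 'DataValues\NumberValue',
            'monolingualtext' => 'DataValues\MonolingualTextValue',
            'multilingualtext' => 'DataValues\MultilingualTextValue',
            'quantity' => 'DataValues\QuantityValue',
            'time' => 'DataValues\TimeValue',
            'globecoordinate' => 'DataValues\Geo\Values\GlobeCoordinateValue',
            'wikibase-entityid' => 'Wikibase\DataModel\Entity\EntityIdValue',
        ) ),
        new DataValueSerializer()
    );

    // Quick smoke test: fetch an item revision.
    $revision = $wikibaseFactory->newRevisionGetter()->getFromId( 'Q42' );

As for not knowing which version of the component is installed: Composer records the exact version of every installed package in composer.lock, which is the quickest place to check.
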
[18:12:45] hey guys !
[18:12:45] trying to test a new bot (aimed at more precise rollbacking in case other bots go awry), so i'd need to find a few accounts (at least 2 or 3) on test.wikidata (i'm not testing on wikidata itself, are you mad ?! :-p ) who have edited pages in the main namespace and at least one other namespace, and for each namespace i need 3 pages: one created and only edited by the account, one last edited by the account, and one last edited by any other account ^^'
[18:13:09] (i know, this may seem absurd, but i assure you, it's not :D )
[19:13:31] Lydia_WMDE: Under which project should I file wikidata/wikibase api stuff?
[19:15:53] wikidata, and (if the api is in the repo, which almost all are) mediawiki-extensions-WikibaseRepository
[19:16:08] there's also some general API project that can be used on top of that
[19:16:36] (answering as Lydia is still traveling AFAIK)
[19:17:35] https://phabricator.wikimedia.org/T123208 ?
[19:18:25] Thanks for answering hoo :-)
[19:19:05] yeah, that looks good
[19:19:57] I could do a Lydia impression, just sit on my knees and nod the whole day.
[19:20:36] (no offense taken, btw) <3
[19:21:29] Ran into that the other day. Already fixed the client side so it resumed at https://www.wikidata.org/w/index.php?title=Q22010361&action=history :-)
[19:22:59] multichill: Which API module is that?
[19:23:31] Let me check
[19:24:38] hoo: wbcreateclaim
[19:27:48] hoo: Any idea why at https://www.wikidata.org/w/api.php?action=help&recursivesubmodules=1#wbcreateclaim summary, baserevid and bot give "(no description)"?
[19:28:12] Because there just isn't one
[19:28:25] I fixed that a bit ago, so it should be fine after the deploy on Wednesday or so
[19:28:31] I'll be back in a bit
[20:52:25] hoo: Something else. How difficult do you think it would be to index strings in the search engine? Like SK-A-23 on https://www.wikidata.org/wiki/Q17319674
[20:52:51] What kind of search?
[20:53:02] https://www.wikidata.org/wiki/Q17319674?action=cirrusdump <- this search
[20:55:29] Shouldn't be too hard, but not sure what aude's plans for the search in that regard are in detail
[20:57:12] Because currently it's really hard to find a painting by inventory number or catalog code, since it isn't indexed
[20:57:43] Maybe a separate identifier table would be smarter for that
[20:57:48] like the wb_items_per_site one
[20:58:29] Pushing ids into the "fulltext representation" might not work very well
[21:00:38] multichill: indexing identifiers wouldn't be too difficult
[21:01:03] providing another syntax like incategory: would be more difficult, but still possible
[21:01:28] could also just make them into "sites" and then use the existing infrastructure
[21:01:46] Pushing identifiers into a table sounds convenient too :-)
[21:02:23] As a user I would just like to type "SK-A-23
[21:02:36] in the search and find https://www.wikidata.org/wiki/Q17319674 among the results
[21:02:40] identifiers being language-agnostic makes any solution pretty easy
[21:02:55] multichill: should be doable
[21:07:59] Maybe a nice follow-up story after the identifiers get deployed, aude?
[21:09:34] * nikki finds https://phabricator.wikimedia.org/T99899 - is that the same thing?
[21:11:53] multichill: yeah :)
[21:12:00] That looks closely related, nikki.
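
For reference on the wbcreateclaim module discussed above, here is a rough sketch of a call through the addwiki MediawikiApi client, showing the summary, bot and (optional) baserevid parameters whose help descriptions were missing. The item, property and value are illustrative only (a string property such as inventory number is used here, echoing the SK-A-23 example), and the credentials are placeholders:

    <?php
    // Rough sketch of a wbcreateclaim call via the addwiki MediawikiApi
    // client; the entity, property and value below are illustrative only.
    require_once __DIR__ . '/vendor/autoload.php';

    use Mediawiki\Api\ApiUser;
    use Mediawiki\Api\MediawikiApi;
    use Mediawiki\Api\SimpleRequest;

    $api = new MediawikiApi( 'https://test.wikidata.org/w/api.php' );
    $api->login( new ApiUser( 'ExampleBot', 'secret' ) ); // placeholder credentials

    $api->postRequest( new SimpleRequest( 'wbcreateclaim', array(
        'entity'   => 'Q42',                     // target item (illustrative)
        'property' => 'P217',                    // e.g. inventory number
        'snaktype' => 'value',
        'value'    => json_encode( 'SK-A-23' ),  // string values are passed as JSON
        'summary'  => 'adding inventory number', // edit summary
        'bot'      => 1,                         // mark the edit as a bot edit
        // 'baserevid' => 123,                   // optional: base revision for conflict detection
        'token'    => $api->getToken(),          // csrf token
    ) ) );

Depending on its version, the WikibaseFactory from the earlier sketch may also expose a higher-level service for creating statements, which avoids hand-building the value JSON.
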
[21:36:02] jzerebecki: If you find the time, it would be nice if you could make sure we have Wikibase in place for the (upcoming) tests in WikimediaBadges
[21:38:12] I'll have some food, but am on IRC from my phone
[21:38:18] so reachable in emergencies
[21:39:07] hoo|away: (upcoming) tests?
[21:40:33] For the sidebar hook handler ;)
[21:42:43] hoo|away: yeah, i know, but i still don't understand what you are asking, jan
[21:42:48] (but as long as he understands....)
[21:44:00] Well, I need Wikibase installed, as my tests will need its classes etc.
[21:44:27] Even if I mock everything, I still need the definitions
[21:44:36] oh, in jenkins
[21:44:53] Yeah
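
To illustrate hoo's point about mocking (all class names below are hypothetical stand-ins, not the actual WikimediaBadges or Wikibase code): even when every Wikibase collaborator is replaced by a mock, the mocks are generated from the real class definitions and the handler type-hints against them, which is why the Jenkins job needs Wikibase checked out. A sketch in the PHPUnit style of the time:

    <?php
    // Illustrative sketch only; the handler and the Wikibase service it
    // depends on are hypothetical stand-ins for the sidebar hook handler
    // discussed above.

    // The handler type-hints against a Wikibase class, so that definition
    // has to be loadable wherever this code runs, including on the CI slave.
    class SidebarHookHandler {
        private $badgeLookup;

        public function __construct( \Wikibase\Client\SomeBadgeLookup $badgeLookup ) {
            $this->badgeLookup = $badgeLookup;
        }

        public function addToSidebar( array &$sidebar, $itemId ) {
            foreach ( $this->badgeLookup->getBadges( $itemId ) as $badge ) {
                $sidebar[] = $badge;
            }
        }
    }

    class SidebarHookHandlerTest extends PHPUnit_Framework_TestCase {

        public function testAddsBadgesToSidebar() {
            // The mock is built from the (here: hypothetical) Wikibase class,
            // so Wikibase still has to be installed even though no real
            // lookup is ever used in the test.
            $lookup = $this->getMockBuilder( 'Wikibase\Client\SomeBadgeLookup' )
                ->disableOriginalConstructor()
                ->getMock();
            $lookup->expects( $this->any() )
                ->method( 'getBadges' )
                ->will( $this->returnValue( array( 'badge-goodarticle' ) ) );

            $sidebar = array();
            $handler = new SidebarHookHandler( $lookup );
            $handler->addToSidebar( $sidebar, 'Q42' );

            $this->assertSame( array( 'badge-goodarticle' ), $sidebar );
        }
    }
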