[00:18:17] aude: Won't stay around much longer, will you still upload a patch now, or shall we create a UBN! bug and make sure things are handled tomorrow
[00:31:15] hoo: i'll make a patch or else create a task
[00:31:38] aude: Was just about to submit my task
[00:32:54] ok
[00:33:20] https://phabricator.wikimedia.org/T123447
[00:34:15] Made it very clear in the ticket that we *need* to resolve this
[00:34:34] Removing the definition just from lib/WikibaseLib.datatypes.php should work, btw
[00:34:39] I briefly tested that.
[00:35:37] It might be possible to use the WikibaseRepoDataTypes and WikibaseClientDataTypes hooks, but I doubt that's a good idea
[00:36:00] I'm sure you'll figure something out. Good night o/
[00:36:11] something will work
[00:36:56] Yeah :)
[00:37:40] i'd like something that generally works for these things
[00:37:51] Would be nice
[00:37:51] but if it's too complicated, then a small hack can work
[00:38:02] the current way is flexible only for adding things
[00:38:19] or well, it's only supposed to be used for that
[00:38:49] just where the data types are registered, they could check against the setting
[00:38:54] or something like that maybe
[00:39:13] Well, not nice... that thing is in lib
[00:39:22] and settings are bound to repo and client, thus dependency hell :(
[00:39:39] The callers could have the setting and do array_diff
[00:40:55] DataTypeDefinitions are instantiated in repo and client
[00:43:56] Yeah
[00:46:31] Anyway, good night :)
[03:39:41] addshore: "addwiki/mediawiki-api-base": "0.1.2",
[03:39:43] >_>
[03:39:51] Since we do not want any bugfixes
[03:39:55] oh no, that'd be bad!
[12:24:29] Anyone around who could help me out with some SPARQL?
[12:26:25] I'm trying to figure out why I get vastly different responses from two very similar queries.
[14:20:44] tarrow: you should just ask your question
[14:21:17] rom1504: sure! Basically I want to know why these two queries give vastly different results:
[14:21:25] http://tinyurl.com/hnahhls
[14:21:29] does what I expect
[14:21:42] and http://tinyurl.com/jfpdvdh does not
[14:22:16] the only difference is a random letter in the URI. But I think the logic should still remain the same
[14:22:51] I.e. I should get back whatever is in the VALUES statement if it meets the criteria in the FILTER and nothing otherwise
[14:30:15] tarrow: in the wrong entity query (second one), ?s is not found, and I'm not entirely sure but I think the endpoint interprets that as UNDEF (see https://www.w3.org/TR/2012/WD-sparql11-query-20120724/#inline-data) which just means "just limit on the other values of VALUES, not this one"
[14:30:26] which means you are not limiting by ?s
[14:30:39] only by ?rev
[14:30:53] sure, that makes sense, but in the first query the ID is also not in the datastore
[14:31:21] it is a fictional long Q that also isn't there
[14:31:27] and indeed if you remove the ?s from values, you get the same result http://tinyurl.com/jyutjx3
[14:31:33] ah
[14:32:06] so the first one is the unexpected one, let's see
[14:32:33] well, I am surprised it is taken as UNDEF
[14:33:14] tarrow: maybe http://www.wikidata.org/entity/Q1q345234524514643243431 cannot be a wikidata uri?
[14:33:27] while the first one can be valid, but just doesn't exist
[14:33:40] yeah look
[14:33:45] http://www.wikidata.org/entity/Q1q345234524514643243431 go there
[14:33:46] I suppose this could be the case, but where in the triple store does it 'know' that?
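
(A minimal sketch of the UNDEF behaviour rom1504 describes above: a VALUES entry the engine treats as UNDEF places no constraint on its variable, so only ?rev restricts the match. The revision number below is invented, and the sketch assumes WDQS exposes schema:version directly on entities, as the linked queries appear to.)

    # UNDEF in VALUES constrains nothing; an unparsable URI seems to
    # behave the same way, so this matches any ?s with that version.
    PREFIX schema: <http://schema.org/>
    SELECT ?s ?rev WHERE {
      VALUES ( ?s ?rev ) {
        ( UNDEF 287872166 )
      }
      ?s schema:version ?rev .
    }
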
[14:33:48] bad request
[14:34:07] https://www.wikidata.org/wiki/Q1345234524514643243431 -> This entity does not exist.
[14:34:31] ah, so you think the query engine makes a request to that URI and doesn't just look it up in stored data?
[14:34:47] tarrow: well this is a query service specific to wikidata, I wouldn't be surprised if they do checks specific to wikidata
[14:35:12] no, I think the query engine knows what wikidata uris are valid
[14:35:42] or maybe it does do a request to some uri service, I don't know
[14:35:49] That makes sense; I am trying to use it on my own wikibase installation. I want to make it so I have the same behaviour as on query.wikidata.org
[14:37:00] that query is part of the code included in the tools that keep query.wikidata.org up to date
[14:37:56] in my case it always behaves like the second link for any entity not in the datastore
[14:40:22] I'm not sure what the objective of that query is
[14:40:27] checking that the uri exists?
[14:41:28] what about http://tinyurl.com/htqhzl5 ?
[14:48:56] tarrow: how convenient is it to install wikibase? I've considered doing that but it seems like there are many moving parts and it's not so easy to have it working
[15:12:04] rom1504: It's not too bad. I have installed it locally before but this copy was actually rolled by someone else: librarybase.wmflabs.org
[15:13:15] the objective is to see if that statement is in the data store and, if so, what version it is. If it doesn't exist or if it is older than the one in the query then we know we need to put new data into the triple store
[15:30:34] wouldn't a simple query like http://tinyurl.com/zp3pnta do that?
[15:39:13] we want a list of all those entities that don't exist or have a revision below the one given. We need to have an optional so that it can return those which don't exist (and we can then check they are unbound). If we just do a hard comparison in the filter we can only compare one entity and revision per query, whereas, the way the code was written, you can pass a list of entities and revs into the Values block.
[15:39:55] That said I think it could be done more slowly using your code but I'd like to figure out what is going on so I can just use the same code as query.wikidata.org
[16:28:30] JeroenDeDauw: 0.1.2, where was that? :p
[16:28:49] Bug fixes suck, am I right? ;)
[16:58:15] aude: is anything else needed for https://phabricator.wikimedia.org/T123447 ?
[16:58:40] i think we're good
[16:58:45] we're deploying now
[16:59:18] of course, please take a look around test.wikidata and test2.wikipedia to see if you find any other issues
[17:03:14] i'll be back for the train
[17:06:19] k
[17:22:37] hey guys! when editing an entity through the api with clear=1, and sending the same content the entity had immediately prior to the edit attempt, what happens?
[17:23:16] is the edit still considered successful?
[17:55:46] aude: Around?
[18:29:06] hoo: ?
[18:29:16] aude: hi :)
[18:29:27] Thanks for taking care of the identifiers thing
[18:29:49] sure
[18:30:04] i can't think of anything else we need to do before deployment
[18:30:08] Regarding the other projects sidebar... have we ever talked about cache invalidation?
[18:30:28] Yeah, I think it's fine for now
[18:30:29] hoo: which cache?
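
(A sketch of the staleness check tarrow describes at 15:39 above: return each entity that is missing from the store or stored with an older revision than expected. The entity IDs and revision numbers are invented; as before, it assumes schema:version on entities.)

    PREFIX schema: <http://schema.org/>
    SELECT ?s ?expected ?actual WHERE {
      VALUES ( ?s ?expected ) {
        ( <http://www.wikidata.org/entity/Q64> 287000000 )
        ( <http://www.wikidata.org/entity/Q42> 287000001 )
      }
      OPTIONAL { ?s schema:version ?actual . }
      # keep rows where the entity is absent (?actual unbound) or stale
      FILTER ( !BOUND(?actual) || ?actual < ?expected )
    }
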
[18:30:54] AFAIR they end up in the parser cache as extension data
[18:31:05] they do
[18:31:17] i suppose we could attach a cache key
[18:31:39] for the extension data
[18:31:59] To evict that, we would need to add another usage for each and every page which is linked with a Wikidata item
[18:32:06] that would be horrible
[18:32:19] ugh
[18:32:21] yeah
[18:32:32] i'm not sure touching extension data would work nicely
[18:34:35] hm
[18:34:47] Guess we need to find some kind of solution
[18:35:16] or just document the restriction
[18:35:21] Lydia_WMDE: ^
[18:36:40] maybe a bot or script could purge the relevant items / pages
[18:37:07] For the initial deployment or long term?
[18:37:10] initial
[18:37:19] yeah, that sounds ok to me
[18:37:28] this is a one time thing, i hope
[18:37:30] but only as a one-off
[18:38:46] * aude needs to eat and be back for deployment time
[18:39:08] Anything important?
[18:39:16] no
[18:39:27] I'll probably be around (unless I get so tired that I fall asleep early :P)
[18:39:34] afterwards, i'll do some stuff with the search index
[18:39:39] I'll try to reach out to Lydia about this
[18:39:42] same stuff we did for geodata
[18:39:44] k
[19:13:14] hoo: around?
[19:13:20] or any admin or property creator?
[19:13:41] aude: yes?
[19:14:09] can you look on https://www.wikidata.org/wiki/Special:NewProperty and make sure that 'external-identifier' is not (yet) a choice of data type?
[19:14:29] soonish it will be, but it's not ready yet for deployment
[19:15:59] aude: I still don't see it
[19:16:20] good
[19:16:21] thanks
[19:16:33] yw
[19:16:46] * aude lazy to login to my staff account and probably forgot the password :/
[19:18:02] :O
[19:20:45] what are the definitive docs on the wikibase data model?
[19:21:26] I'm trying to figure out what https://phabricator.wikimedia.org/T123392 should generate - i.e. what should happen if precision is "null"
[19:21:39] see https://gerrit.wikimedia.org/r/#/c/263680/
[19:22:32] SMalyshev: probably https://www.mediawiki.org/wiki/Wikibase/DataModel
[19:22:54] and i don't think this question is covered
[19:23:03] so DataModel says: a precision (decimal, representing degrees of distance, defaults to 0, 9 digits after the dot and three before, unsigned, used to save the precision of the representation)
[19:23:14] null is not a valid value there, and default is 0
[19:23:33] which looks like converting null to 0 is correct
[19:23:48] since 0 is the default
[19:35:10] SMalyshev: do you have a list of all snaks that have a null precision?
[19:35:15] (I could go and fix them all ;))
[19:35:29] addshore: no, there are more than 100K of them
[19:35:33] =o
[19:35:35] ewww
[19:36:11] sounds like something in a validator somewhere might need poking (unless they are all really old coords)
[19:36:35] addshore: wait, no, I'm wrong. Just 33K
[19:36:43] addshore: http://tinyurl.com/hlsmk2n
[19:37:35] also http://tinyurl.com/hvhsull - 33644 statements
[19:37:59] can't easily check if they are old :(
[19:38:21] I could write a bot that just fixes all of them, but no guarantee it won't happen again
[19:38:36] also, not sure what the bot would fix it to - same issue :)
[19:39:43] yeh, well, as null is not documented as correct i guess the next best thing is 0!
[19:40:01] SMalyshev: *sigh* the code has a different default when precision is null https://github.com/DataValues/Geo/blob/master/src/Formatters/GeoCoordinateFormatter.php#L130
[19:40:35] addshore: null is not mentioned in the docs?
[19:40:50] jzerebecki: ugh. then the docs need to be updated? But I can use 1/3600 too
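
(The tinyurl links above aren't expanded in the log; as a guess, a query for coordinate values lacking a stored precision might look like the sketch below. The predicate names come from the Wikibase RDF dump format; the exact shape of the original queries is unverified.)

    # Count complex coordinate value nodes that carry a latitude but no
    # wikibase:geoPrecision triple (i.e. precision null/omitted).
    PREFIX wikibase: <http://wikiba.se/ontology#>
    SELECT (COUNT(?v) AS ?missing) WHERE {
      ?v wikibase:geoLatitude ?lat .
      FILTER NOT EXISTS { ?v wikibase:geoPrecision ?p }
    }
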
[19:41:24] addshore: 0 is a bad default. nobody can measure this with precision 0 in reality.
[19:41:31] addshore: i think they are old
[19:41:34] yeh, so the default should be 1/3600!
[19:41:43] but 2) looking at the code, think null is still allowed
[19:41:59] if we fix 2)
[19:42:07] aude: it checks for $precision <= 0 so null would fall there
[19:42:08] only things that are defined can have precision 0
[19:42:12] SMalyshev: ah
[19:42:12] and be converted to 1/3600
[19:42:52] * SMalyshev also doesn't like that it is an ad-hoc expression and not a constant
[19:43:10] and of course not documented :)
[19:43:13] i was looking at GlobeCoordinateValue
[19:43:19] addshore: no, the precision should be null or something equivalent so we can distinguish between a user-entered explicit precision and the case where the user didn't enter one and the system guessed a reasonable one, like for quantities: https://phabricator.wikimedia.org/T115269
[19:43:48] suppose we allow it there because of the bad values
[19:43:49] jzerebecki: so is null precision valid or not?
[19:43:55] but maybe validate now for new coordinates
[19:44:09] if it's valid, why does the formatter convert it to 1/3600?
[19:45:43] jzerebecki: also, if it is valid, what does it mean and how should it be presented on export to RDF? We need some literal value or blank node there
[19:45:57] I don't see an explicit validator for geostuff anywhere? :P thus everything is valid? ;)
[19:46:14] SMalyshev: currently one part of the code says yes, and that when it is formatted for indexing or conversion then 1/3600 is applied as precision; however I didn't check if other code parts change that and I don't know if there is special formatting for display to humans...
[19:46:25] as long as it is valid per https://github.com/DataValues/Geo/blob/master/src/Values/GlobeCoordinateValue.php
[19:46:26] ok, we validate that precision is a number
[19:46:44] @param float|int|null $precision in degrees, e.g. 0.01.
[19:46:53] null is thus considered valid
[19:46:56] https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/repo/includes/ValidatorBuilders.php#L276
[19:46:57] ah, ok, so null is valid there
[19:47:10] then we need to update the data model docs to say so
[19:47:12] https://github.com/wikimedia/mediawiki-extensions-Wikibase/blob/master/repo/includes/Validators/NumberValidator.php#L23
[19:47:29] ahh okay, but then wikibase uses NumberValidator :P
[19:47:31] i think null is allowed in GlobeCoordinateValue so we don't choke on old values
[19:47:35] already in the database
[19:47:48] but new ones need to at least be a number (and not null)
[19:48:02] https://github.com/DataValues/Geo/blob/master/src/Values/GlobeCoordinateValue.php#L60 is still OK with null
[19:48:19] SMalyshev: as i say
[19:48:28] also https://github.com/DataValues/Geo/blob/master/src/Values/GlobeCoordinateValue.php#L211
[19:48:36] explicitly uses null
[19:48:39] we need some backwards compatibility for the bad data being read from the database
[19:49:01] aude: looks like newFromArray also generates null
[19:49:15] SMalyshev: that's probably invoked in unserializing
[19:49:19] so you can create new values which have nulls
[19:49:37] SMalyshev: yeah, but any new statements need to pass through validators
[19:50:02] aha, I see
[19:50:18] aude: is it mandatory for all code paths? e.g. api, bots, etc.?
[19:50:23] i think a bot could clean these up (like we did for globes)
[19:50:27] SMalyshev: i think so
[19:50:34] if not, then it's a bug
[19:50:56] ok, that's good
[19:51:16] so we're back to what to do with it in RDF then...
[19:55:10] I would assume null -> 0 and then assume 0 -> 1/3600 (as done in the formatter)
[19:56:03] well, not sure. I understand 1/3600 is just used for formatting purposes (since you have to have something) but not sure if it's the actual precision value
[19:57:33] well, the true meaning is there is no precision, which is rather useless ;)
[19:57:56] right...
[19:57:59] addshore: no, it is very helpful to know that a human explicitly didn't set a precision
[19:58:21] jzerebecki: well, then we should be doing the same with 0?
[19:58:32] (when formatting)
[19:59:11] however we don't yet actually make proper use of that distinction, see https://phabricator.wikimedia.org/T106928
[19:59:48] so in the future we need to represent "auto precision" in complex rdf values
[20:00:56] question: do we do that now and assume null and 0 to mean "auto precision" even tho other parts of our code don't fully handle it that way?
[20:01:16] jzerebecki: well, the formatter seems to think 0 is auto-precision
[20:01:32] not sure about other parts of the code
[20:03:46] yea but for e.g. the equator a latitude of 0 with precision 0 would actually be correct, not auto precision...
[20:05:37] jzerebecki: ok, so if we say 0 is not the right thing... what do we put there?
[20:05:46] jzerebecki: maybe -1
[20:05:47] ?
[20:06:10] or some other negative number
[20:07:37] note that null precision survives the editing roundtrip
[20:07:50] so I think the code treats null the same as 0 basically
[20:08:07] ah, wait, strike that
[20:08:13] SMalyshev: that is probably not semantic enough. a new type (or however that is called) that only means auto precision?
[20:08:15] cache :(
[20:08:44] jzerebecki: RDF/OWL really frowns on mixing literals and URIs in the same property
[20:09:05] which means if we have a property whose value is supposed to be literal, it should be literal
[20:09:23] so null is converted to "precision": 0.00027777777777778 on edit
[20:09:30] I assume it's 1/3600
[20:09:43] yes it is
[20:10:06] so should we just use that?
[20:10:47] SMalyshev: we need to represent auto precision at some point, why not do it now?
[20:11:53] jzerebecki: well, two things: a) not clear how and b) suppose we have geo-search which somehow can use precision (long way ahead for it, but suppose) - what would we use there?
[20:12:03] how does RDF/OWL then represent no value and other complex types (e.g. nullable/optional types like everything boxed in java)?
[20:12:39] jzerebecki: no value right now looks like this: https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Novalue
[20:12:56] SMalyshev: for search whatever precision gets used on conversion
[20:13:11] jzerebecki: basically it's "instance of a class saying "this has no value"". rather clunky construct IMHO but that's what Markus said people use
[20:14:05] I'm not sure if we want to create the same thing for precision...
[20:16:08] jzerebecki: when we export to RDF though we have no control over what happens next, so we need to ensure the result is useful. Also, RDF does not need to round-trip - so the question is whether we need to know it was null originally or whether we can substitute
[20:22:24] SMalyshev: for indexing the simple value should be used, which should have conversion applied, which means rounding. for the complex one we need to at least say when auto precision is in effect. not sure if we should additionally explicitly say for each coordinate with auto precision what the guessed precision is.
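
(For reference, the "no value" construct linked above is a per-property class membership. A sketch of how it can be matched, using the prefixes from the dump-format page; the P40 example is the editor's, not from the discussion.)

    # Items explicitly marked as having "no value" for P40 (child):
    # "no value" is expressed as the subject being an instance of wdno:P40.
    PREFIX wd:   <http://www.wikidata.org/entity/>
    PREFIX wdt:  <http://www.wikidata.org/prop/direct/>
    PREFIX wdno: <http://www.wikidata.org/prop/novalue/>
    SELECT ?item WHERE {
      ?item wdt:P31 wd:Q5 .   # humans
      ?item a wdno:P40 .      # with an explicit "no value" child statement
    } LIMIT 10
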
[20:23:17] if we always mention the guessed precision, then we could represent it as two values: one for the precision, the other saying whether the precision is auto or not.
[20:23:26] jzerebecki: do we really need to say it? I.e. as a consumer of RDF, what would be the use of "auto-precision"?
[20:23:52] jzerebecki: you don't know how our code treats auto-precision - i.e. the magic 1/3600 is not visible to the RDF user
[20:24:23] so how do they know that's what we mean by "auto-precision"?
[20:26:04] for a consumer it means a statement with auto precision is less carefully measured than one with explicit precision.
[20:27:15] jzerebecki: ok, but this is not very practical - i.e. if I need to match these coordinates, I have to use something... Also, the current null-precision ones - I'm not sure they are less accurately measured. Many of them have pretty precise coordinates
[20:27:54] or they may be fake precision
[20:28:21] i.e. by putting a pin on a map and then not applying the map's precision
[20:29:06] jzerebecki: sometimes it is - e.g. https://www.wikidata.org/wiki/Q62295 has null precision but I have no idea what it means
[20:29:57] Ew, the new grey bar makes everything cluttered.
[20:30:25] It's too dark a grey for this kind of stuff
[20:31:08] Lydia_WMDE: ^^
[20:31:12] jzerebecki: so, what we could do is drop precision altogether from RDF
[20:31:16] when it is null
[20:31:27] jzerebecki: but then it would be very hard to locate such entries
[20:31:56] well, by very hard I mean "the query might be too slow because no-value may not be indexed"
[20:32:06] so not too hard, but harder
[20:32:36] SMalyshev: but that makes it less useful to consumers. if we explicitly mention the precision, regardless of whether it is auto or manual, that is more useful. but the downside is that to differentiate these we then always need to mention in another statement whether the precision is auto or manual.
[20:32:43] but it is an option. Depends on whether we intend to fix those or intend to legitimize null as a value
[20:33:30] jzerebecki: I do not think "auto" is useful to an RDF consumer. If I have a geo-search engine which can do precision, I'd have to give it a number
[20:34:05] so I couldn't load our RDF then, I'd have to do pre-processing of values to replace "auto" with some value. But that value would be different from what our code is using.
[20:34:44] because our value is not documented and ad-hoc
[20:35:10] SMalyshev: but that is what I said to give it. the auto/manual statement would be a separate statement. basically this comes down to the question of whether we want to represent all wikibase information faithfully in rdf or reduce it in some ways.
[20:36:14] jzerebecki: I think we want to reduce.
[20:36:20] the downside of doing it faithfully is having two triples instead of one for precision: one literal precision and one auto|manual.
[20:36:44] SMalyshev: is there anything else we decided to reduce already?
[20:36:56] jzerebecki: datetimes?
[20:38:30] jzerebecki: we clean up broken dates (such as 0 month/day, etc.)
[20:39:33] also convert the calendar Julian->Gregorian
[20:39:33] SMalyshev: yes but we don't omit information. we should clean up geo too.
[20:39:43] and conversion is fine, too
[20:40:47] jzerebecki: well, we could add an auto-precision class to signal that the value is using derived precision, but I'd wait until somebody really asks for that
[20:40:54] SMalyshev: um, is the complex wikibase:timeValue pre-conversion or post-conversion?
[20:41:12] jzerebecki: $valueWriter->say( RdfVocabulary::NS_ONTOLOGY, 'timeValue' );
[20:41:12] $this->sayDateLiteral( $valueWriter, $value );
[20:41:44] jzerebecki: looks like post. you can not have xsd:datetime values in Julian :)
[20:42:23] jzerebecki: I think we had an open ticket for adding the original form too, but not sure what happened to that
[20:42:41] but since we have xsd:datetime we had to convert
[20:42:52] ok
[20:43:19] SMalyshev: what do we gain by waiting for someone to ask for it?
[20:43:40] so, going back to precision, I see two viable options (if we reject 0): a) drop precision when it's null b) use 1/3600
[20:43:51] jzerebecki: we'd know whether anybody actually needs it for anything
[20:44:20] jzerebecki: that said, adding "a wikibase:geoAutoPrecision" is not a big deal
[20:44:47] I can do that right now if that's what you think is a good way
[20:46:20] SMalyshev: i would say (b) (as in the auto precision the formatter applies) and add the a wikibase:geoAutoPrecision|a wikibase:geoManualPrecision
[20:47:07] jzerebecki: I don't want to add ManualPrecision since it'd be on 99.99% of the entries (which means no use to select for it) and I'd have to drop it from WDQS base anyway
[20:47:50] since it's not selective there would be no penalty for just filtering on non-existence of geoAutoPrecision
[20:47:55] SMalyshev: i assume if the data were correct it would be mostly auto precision... so that may change in the future
[20:48:22] jzerebecki: wait, what do you mean by auto precision then? I meant it's when precision is null
[20:48:34] but i'm fine with having one be implicit
[20:49:46] ok
[20:52:41] sjoerddebruin: which grey bar do you mean?
[20:53:01] In the statements section. https://www.dropbox.com/s/lqwpbe0i8dlr4fw/Schermafdruk%202016-01-13%2021.52.55.png?dl=0
[20:54:05] sjoerddebruin: ok gotcha. thanks. will talk to jonas to get this a bit lighter. for me it was considerably lighter iirc. will check again
[20:54:12] SMalyshev: I think currently 0 is also like auto precision, but as Wikibase doesn't model auto precision well yet (the UI automatically selects a precision even if it was previously manually set, which never results in null nor 0) probably most geo precisions in the db are incorrect
[20:54:37] Lydia_WMDE: If it avoids wrong edits, I'm okay with it, but this way it doesn't seem right.
[20:54:55] agreed
[20:55:02] jzerebecki: I see. Check out https://gerrit.wikimedia.org/r/#/c/263680/
[20:55:10] * Lydia_WMDE waves to jzerebecki
[20:55:20] jzerebecki: will you be in the planning tomorrow?
[20:55:26] or send me your input?
[20:55:51] Lydia_WMDE: btw, did you see my comments about the images on items
[20:55:54] Lydia_WMDE: is it about user visible features only?
[20:56:13] sjoerddebruin: in the ticket? yes. thank you!
[20:56:22] Okay, great.
[20:56:24] jzerebecki: that plus big behind-the-scenes things
[20:56:29] * jzerebecki waves to Lydia_WMDE
[20:57:02] If you need more comments from the community, maybe include it in the status update. Because of the huge impact this doesn't seem like a developer-only thing.
[20:57:09] Lydia_WMDE: the meeting is at 8am :( but will try to join
[20:57:21] aude: \o/
[20:57:47] hm... seeing a lot of serialization exceptions from testwikidata
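
(A sketch of how the wikibase:geoAutoPrecision marker agreed on above could be consumed once emitted, with manual precision left implicit as SMalyshev suggests; the class name follows the discussion and is not confirmed against the final patch.)

    # Coordinate values whose precision was auto-derived (null in the
    # source data) carry the marker class; manual ones do not.
    PREFIX wikibase: <http://wikiba.se/ontology#>
    SELECT ?v ?precision WHERE {
      ?v a wikibase:geoAutoPrecision ;       # proposed marker for derived precision
         wikibase:geoPrecision ?precision .  # the guessed value, e.g. 1/3600
    }
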
[20:58:03] hoo: oh really?
[20:58:07] sjoerddebruin: *nod* my impression is we're good at the moment with it. or do you expect considerable pushback? in that case i can do it
[20:58:08] but could also be someone "fuzzing" with a bot
[20:58:20] i checked the logs a bit ago
[20:58:38] Lydia_WMDE: Hm, yeah, we're not Commons or dewiki.
[20:59:05] aude: Well, 6... but that is a lot given the few edits test usually receives (like 50 a day)
[21:00:30] sjoerddebruin: ;-)
[21:00:38] hoo: looks like a bot experimenting
[21:00:55] sjoerddebruin: anyway i can add it to the next summary
[21:00:55] Probably
[21:00:57] giving a malformed snaklist
[21:01:05] We should still catch these in the api module
[21:01:12] the exceptions, I mean
[21:01:21] Lydia_WMDE: don't know when somebody is going to work on it, but more comments are still a good idea I think.
[21:01:46] we will do quarterly planning tomorrow. i hope we can get it into the quarter
[21:01:49] hoo: sure
[21:01:52] Also wondering if there is an easy way to get pageviews for just a single page.
[21:02:13] via the page view api it should work. but i don't know the exact calls
[21:02:24] Still waiting for a good UI. ;)
[21:02:26] sjoerddebruin: http://blog.wikimedia.org/2015/12/14/pageview-data-easily-accessible/
[21:02:28] ack
[21:02:35] aude: that's for comparing pages
[21:02:41] thx aude :)
[21:03:27] Just like last time, I just want to know the visits of https://www.wikidata.org/wiki/Special:Nearby, to see the impact of having it in the navigation menu
[21:04:09] sjoerddebruin: i see
[21:04:23] https://grafana.wikimedia.org/dashboard/db/wikidata-top-page-views
[21:04:26] i see nearby there
[21:04:51] Yeah, also got that page last time but I can't get it working
[21:05:21] :/
[21:05:30] it's not so user-friendly
[21:05:58] addshore: maybe you can help sjoerddebruin with a pageview api call?
[21:06:08] For pages and items, I wish we could just have graphs on "action=info"
[21:06:26] legoktm has been working on that
[21:06:35] let me see what the state is
[21:06:36] one sec
[21:07:33] https://phabricator.wikimedia.org/T43327 I think?
[21:08:01] hah! exactly
[21:08:05] sjoerddebruin: yeh that should be coming soon!
[21:08:27] I'm getting more and more experience with the Phabricator search... ;)
[21:08:34] What are you looking for? :)
[21:08:43] \o/
[21:08:43] Special:Nearby page views.
[21:11:08] addshore: Debugging on test, running into some rights issues. Can you flag my bot https://test.wikidata.org/wiki/User:BotMultichill ? Or flag me so I can do it myself
[21:12:56] addshore: i can request something like https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/all-agents/Berlin/daily/20151020/20151130
[21:13:10] what would the project be for wikidata, instead of "en.wikipedia"?
[21:13:24] {language code}.{project name},
[21:14:08] aude: Or can you help me on test?
[21:14:27] multichill: looking
[21:14:38] done
[21:16:12] Thanks aude!
[21:16:17] :)
[21:16:20] made you admin also
[21:17:01] or if you want to be bureaucrat, think we can do that
[21:18:40] aude: on pageviews it is wikidata
[21:18:56] jzerebecki: thanks
[21:19:06] e.g. https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/wikidata/all-access/all-agents/Special:Nearby/daily/20151020/20151130
[21:19:16] sjoerddebruin: ^^
[21:19:31] yep https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/wikidata/all-access/all-agents/Special:Nearby/daily/20160101/20160113 :)
[21:28:03] Lydia_WMDE: where do I find the last quarterly planning results?
[21:28:14] jzerebecki: sec
[21:30:06] jzerebecki: Nice, it's something. \o/
[21:30:46] Will save it, check later if I can do something cool with it
[21:30:50] Going to bed now, bye
[21:33:50] cya
[21:40:32] wb_changes purging works again :)
[21:41:07] hoo: \o/
[21:43:33] In other news: Labs can't keep up with that :P
[21:44:48] Hi. Does anybody know why this quantity-type statement is wrong in QuickStatements ("LAST P2048 51 S143 Q22026414"), please? It doesn't add its reference.
[21:54:48] aude: Thanks, close to finding it at https://phabricator.wikimedia.org/T104522 . Maybe someone else manages to find the bug
[22:01:56] multichill: interesting
[22:02:10] seems related to https://phabricator.wikimedia.org/T121395 but there's probably more to the problem than that
[22:12:59] aude: It's somewhere in pywikibot that the revisionid gets lost. That's at least what it looks like
[22:26:40] multichill: ah, i see