[00:00:52] But I can has no api!
[00:26:04] addshore: ah... well, just stuff all the things in the script so it supports them all?
[00:26:25] addshore: if you have an idea on how to improve the situation I'd love to hear it
[00:26:33] Seems like an inherent stuffs tho
[00:27:06] But in theory they could be mapped to different ids :/
[00:27:27] And when people have customised ones... Well just damn that xD
[08:09:51] Nothing yet. :) https://www.wikidata.org/w/index.php?diff=300000000
[08:43:14] :O
[08:44:24] We're just at https://www.wikidata.org/w/index.php?diff=298000000
[10:10:27] nikki: https://www.wikidata.org/w/index.php?title=Q21943977&action=history :(
[10:13:42] ew
[10:14:38] I wish I knew why people did things like that
[10:15:17] Don't understand the form, get errors when they fill in nothing, enter gibberish.
[10:15:30] Another question is why some people fill in the same for all three.
[10:15:44] Main reason is people don't read.
[10:17:17] Now I just need to go back in time to the 5th of January to tell nikki to patrol.
[10:17:39] well, sure, but what were they trying to do in the first place?
[10:18:42] Well, we just install some spyware on their computers to find out. ;)
[10:18:55] (to our Google readers, this is a joke)
[10:22:02] I haven't been able to do much lately :/ been busy and also haven't been able to use my main computer much (and rech doesn't work on my laptop any more for some reason)
[10:22:54] There have been some OAuth changes, maybe clear your cache and cookies?
[10:23:16] did that bot with the ipv6 address ever stop or get blocked?
[10:23:21] it hasn't worked for months :/
[10:23:42] I dunno why, it loads the page, it just doesn't load any of the edits
[10:24:23] I've marked some edits of the ipv6 thing again this morning. :(
[10:24:52] I'm confused. https://www.wikidata.org/wiki/Q21970502
[10:28:51] And this is also always raising questions... https://www.wikidata.org/w/index.php?title=Q21976289&action=history
[10:57:30] ?
[10:58:34] such IP edits :P
[11:23:02] aude: Menü 10
[11:26:43] DanielK_WMDE: ok
[11:42:24] nom nom
[11:45:12] :(
[11:58:45] Hi there, I was wondering about an index for looking up wikipedia page names
[11:59:08] I am not able to find one on the replica db for quickly doing a lookup
[13:18:30] mdupont: try wb_items_per_site
[13:19:06] yes, I have that DanielK_WMDE but it seems to be very slow to query, I ended up just doing an api call
[13:21:25] that's strange, it should have an index for each field.
[13:21:28] PRIMARY KEY (`ips_row_id`),
[13:21:29] UNIQUE KEY `wb_ips_item_site_page` (`ips_site_id`,`ips_site_page`),
[13:21:30] KEY `wb_ips_site_page` (`ips_site_page`),
[13:21:32] KEY `wb_ips_item_id` (`ips_item_id`)
[13:21:35] Maybe something is missing on the replica?
[13:22:16] let me check
[13:25:53] show indexes from wb_items_per_site;
[13:25:53] Empty set (0.01 sec)
[13:26:05] i did not find any indexes on the page table in enwiki either
[13:26:17] starting to wonder what the use of these replicas could be
[13:26:47] DanielK_WMDE, gotta go to work, see you later
[13:28:11] mdupont: it's probably a view, so there are no indexes directly on the view...
[13:28:25] ...but it would use the indexes internally. usually.
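For reference, the API fallback mdupont settled on can be as simple as the following sketch, which uses the standard wbgetentities module to resolve a sitelink to an item ID (the helper name is made up; error handling is minimal):

```python
import requests

def item_id_for_page(site, title):
    """Return the Wikidata item ID for a sitelink, or None if there is none."""
    resp = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbgetentities",
            "sites": site,    # e.g. "enwiki"
            "titles": title,  # e.g. "Wikibooks"
            "props": "info",  # only the entity ID and metadata are needed
            "format": "json",
        },
        timeout=10,
    )
    entities = resp.json().get("entities", {})
    for entity_id in entities:
        if not entity_id.startswith("-"):  # key "-1" means no entity was found
            return entity_id
    return None

print(item_id_for_page("enwiki", "Wikibooks"))  # prints a Q-id
```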
[13:28:41] but sometimes the query planner gets confused about that
[13:28:48] ok
[13:28:58] well I cannot EXPLAIN the query
[13:31:05] select * from wb_items_per_site where ips_site_page="Wikibooks"; that is very fast, good enough
[13:31:08] thanks,
[13:31:23] Thank you, DanielK_WMDE
[13:32:39] here is a list of cats I want to port over from wp https://www.wikidata.org/wiki/User:Mdupont/Todo working on https://en.wikipedia.org/wiki/User:Mdupont/Open_content
[13:32:46] DanielK_WMDE: https://phabricator.wikimedia.org/T125352 Data prefetching seems to be something that Gabriel's dependency system doesn't really take into account, does it?
[13:32:48] mdupont: yea, explain doesn't work well with views.
[13:33:13] hoo: quite possibly. ask him about it.
[13:41:13] hi
[13:41:40] i would like to get the wikidata ids for a number of movies (text list)
[13:41:43] how to do this?
[13:42:07] Do you have the titles or Wikipedia page names or what exactly do you have?
[13:42:33] i have the movie titles
[13:42:38] but there are some disambigs
[13:43:24] You probably need movie titles and at least years if you want to get it right
[13:43:41] you can use https://www.wikidata.org/wiki/Property:P1476
[13:43:51] and use sparql
[13:44:19] benestar: but where should I enter the titles?
[13:44:22] in autolist?
[13:46:25] something like this: http://tinyurl.com/gp95qjh
[13:46:58] no idea whether autolist can do that too
[13:47:38] how do I automate that?
[13:47:46] it's 100 Italian film titles
[13:47:51] oh, Italian
[13:47:58] are those the original titles?
[13:48:09] the title property only has original titles
[13:48:46] yes, original
[13:49:40] then it'll work
[13:50:12] you can then also do something like this: http://tinyurl.com/hcou7de
[13:50:19] an alternative would be to go via imdb
[13:52:50] sorry, I'm not there yet: where does the title list go now?
[13:53:55] many thanks for that review DanielK_WMDE :)
[13:56:15] benestar: I've created a pagepile (http://tools.wmflabs.org/pagepile/api.php?id=2148&action=get_data&format=html) but it only got 85 out of 100
[13:58:56] kopiersperre: where can I see the query you used?
[13:59:05] no query
[13:59:29] benestar: "create a new pile > manual list"
[14:00:16] i don't know how that tool works so maybe someone else can help you with that
[14:02:12] benestar: please give me 5 additional minutes of your time and tell me how to do this with wikidata query
[14:04:37] with wikidata query you can do a query for every title using a script or add them all to one query
[14:04:55] like http://tinyurl.com/j2g2lcd
[14:04:58] that should work for you
[14:05:13] just add the list into the brackets where "titel"@it, "another title"@it is currently
[14:05:25] (note that @it is needed to define the language you are looking for)
[14:05:40] wow, I didn't know that syntax even existed
[14:05:48] but you're looking for it only anyways, right?
[14:05:50] was always using filter with lang
[14:05:58] hoo: the language syntax? ^^
[14:06:06] with @ yes
[14:06:24] well, it doesn't actually match if you omit it
[14:06:42] well, you can match by str() and lang()
[14:06:50] true
[14:07:09] kopiersperre: does that work for you? :)
[14:09:21] benestar: yes
[14:09:48] benestar: but i'm not sure what the query does in case of ambiguity
[14:10:14] it only creates exact matches for the film titles
[14:10:43] but if two films have exactly the same title, they both match
[14:11:19] It's case sensitive, even
[14:20:01] not perfect, but thanks!
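Written out, the VALUES pattern benestar is describing looks like the following sketch against the public query service (the two Italian titles are stand-ins; as noted above, matching on P1476 is exact and case-sensitive, and identically titled films all come back):

```python
import requests

QUERY = """
SELECT ?film ?title WHERE {
  VALUES ?title { "La dolce vita"@it "Ladri di biciclette"@it }
  ?film wdt:P1476 ?title .
}
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    timeout=30,
)
for row in resp.json()["results"]["bindings"]:
    # ?film comes back as a full entity URI, e.g. http://www.wikidata.org/entity/Q...
    print(row["film"]["value"], "->", row["title"]["value"])
```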
[14:27:14] benestar|cloud: hey, we were going to look into your search api stuff... I'm a bit confused about how the individual parts tie together...
[14:38:21] DanielK_WMDE: what exactly are you confused about?
[14:46:41] aude, if I use this new Depends-On thing in a commit in core and then 3 different extensions it would still be possible to +2 them all and then have them all merge at once right?
[14:49:50] addshore, aude > Depends-On is indeed metadata for Zuul, which understands it
[14:50:15] epic, right, this will be my first time using it I think!
[14:51:31] http://docs.openstack.org/infra/zuul/gating.html#multiple-changes explains the details for multiple changes
[14:52:35] The only thing to know is if you have an infinite cycle, Zuul will detect that, abort, but won't report anything on Gerrit.
[15:07:28] https://www.wikidata.org/w/index.php?title=Q194105&type=revision&diff=298693650&oldid=296927243
[15:52:13] benestar|cloud: naw, i think i got it now.
[15:53:48] DanielK_WMDE: hmm, ok
[15:53:49] tell me if you have any questions ;)
[15:55:20] benestar: where are the unit tests? ;)
[15:56:37] for the api module? QuerySearchEntitiesTest
[15:57:49] benestar: no, for all the js code in MF of course :P
[15:57:57] never mind. i just gave a +2
[15:58:44] DanielK_WMDE: we should then also merge the config change
[15:59:50] benestar: not right away. talking to katie about it.
[16:00:10] 3:52 PM The only thing to know is if you have an infinite cycle, Zuul will detect that, abort, but won't report anything on Gerrit.
[16:00:11] ahhhh
[16:24:29] DanielK_WMDE: the config change doesn't actually *need* the MF patch
[16:24:42] benestar: i know
[16:24:46] the basic functionality to recognize that option has already been there for about half a year :P
[16:25:14] benestar: when deploying a config change, you want people around to check that it actually does what we want.
[16:25:28] sure :)
[16:25:30] so we'll only deploy it once the code is deployed, too.
[16:25:48] because otherwise, we can't really check that it works, right?
[16:26:12] benestar: i made a separate ticket for the config patch. we'll track it in the sprint, katie will look into it.
[16:27:48] just saw it, looks good
[16:55:08] Lydia_WMDE: so with the UI tracking I think we should do (which would likely be trivial to implement for someone who knows the JS) should I go and make a ticket?
[16:55:19] or come and tell Jonas_WMDE all of the things now ;)
[16:55:43] addshore: ticket would be <3
[17:20:46] Lydia_WMDE: https://phabricator.wikimedia.org/T125404
[17:21:07] addshore: thanks!
[17:21:32] it will likely make sense to have some sub tasks once people look over it ;)
[17:21:44] I may look at doing some of the PHP bits
[17:21:50] sweet
[17:29:34] addshore: ui tracking? =o
[17:29:42] =o ya
[17:30:38] evil data collection =ooo
[17:30:50] ^^
[17:30:56] hah, there is no data about people, or that links it to people at all ;)
[17:30:59] just counts
[17:31:07] just kidding ;)
[17:49:17] Ehm, are others also seeing just the homepage and no sidebar etc?
[17:49:36] correction: missing a stylesheet?
[17:49:47] looks fine to me
[17:49:51] logged in and out
[17:50:28] "Our servers are currently experiencing a technical problem. This is probably temporary and should be fixed soon."
[17:50:41] When I want to purge
[17:51:17] Rest is fine, just the homepage. Logging in and out didn't help
[17:52:23] also looks fine for me :/
[17:52:40] Error only says "PHP fatal error" at the bottom...
[17:53:45] Ah, not the only one. Going to #wikimedia-operations then.
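For anyone unfamiliar with it: the Depends-On line Zuul reads is just a footer in the Gerrit commit message pointing at the Change-Id of the other patch, so the changes are gated and merged together. A made-up example (both IDs are placeholders):

```
Add UI tracking hooks to core

The matching extension patches must land in the same Zuul
queue; see T125404.

Depends-On: I7a2b9040c1e64ad2b8f0cc2f0a2e6d0b4c1d2e3f
Change-Id: I0d1c2b3a49586774655443322110ffeeddccbbaa
```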
[17:54:04] i see stuff in the logs :(
[17:54:29] Now it sends me to https://donate.wikimedia.org, is this a sign?
[17:54:53] maybe hoo fixed it just now?
[17:55:06] Nope
[17:55:24] :(
[17:59:29] sjoerddebruin: still broken?
[17:59:43] not obvious what the problem is
[17:59:56] Yep, it says "PHP fatal error" on purge.
[18:00:25] purging which page?
[18:00:31] probably any
[18:00:38] I guess it fails to insert jobs
[18:01:28] nice
[18:02:35] sjoerddebruin: Looks good again
[18:04:09] hoo|away: sorry, no difference here...
[18:07:15] sjoerddebruin: oh
[18:07:20] only the main page
[18:08:20] sjoerddebruin: try again
[18:08:52] Still no css and purge still broken.
[18:09:03] :/
[18:09:10] i saw the fatal error
[18:09:43] think it might be caching but not sure
[18:10:09] cp1055 miss(0), cp3040 hit(2), cp3031 frontend hit(1)
[18:12:59] sjoerddebruin: can you try again?
[18:13:08] and only the main page or other ones?
[18:13:19] (i think they just got cached in a broken state)
[18:14:17] It's only the homepage, no difference.
[18:15:06] can you paste the url? just https://www.wikidata.org/wiki/Wikidata:Main_Page ?
[18:15:34] aude: DanielK_WMDE I expect my comment here is probably meant for you :p https://phabricator.wikimedia.org/T125392
[18:15:47] https://www.wikidata.org/wiki/Wikidata:Main_Page?action=purge still broken for me
[18:16:38] addshore: hm... i don't think this is sufficient...
[18:16:49] addshore: a lot of logged in requests (bypassing cache)
[18:17:11] DanielK_WMDE: what more do you want? ;)
[18:17:53] aude: https://www.wikidata.org/wiki/Wikidata:Main_Page
[18:17:56] addshore: logged in users vs anons. "pass" doesn't imply "logged in".
[18:18:05] sjoerddebruin: :/
[18:18:11] addshore: added a comment.
[18:18:17] i also tried https://www.wikidata.org/wiki/Wikidata:Main_Page?action=purge in firefox
[18:18:31] addshore: i guess if we can get that query just for the main namespace...
[18:18:45] ahh, so you want it just for items? :)
[18:19:03] #cando
[18:19:26] addshore: i'd be happy to have it for everything, but for non-article pages, the cache may be bypassed for different reasons, skewing the results
[18:19:59] addshore: I think pass vs hit+miss on the item namespace will give us useful info
[18:20:19] more like pass / total, really
[18:20:35] ...and if we could then exclude crawlers...
[18:25:15] DanielK_WMDE: I talked to Gabriel: https://phabricator.wikimedia.org/T124737#1986666
[18:25:37] That also means we should do prefetching via usage tracking
[18:26:14] sjoerddebruin: append something to that url, e.g. https://www.wikidata.org/wiki/Wikidata:Main_Page?action=purge&whatever will work
[18:26:28] hoo: nope, still missing stylesheet
[18:27:03] you might have had bad luck and the fatal got cached
[18:27:04] "Did not parse stylesheet at 'https://www.wikidata.org/w/load.php?debug=false&lang=nl&modules=ext.echo.badgeicons%7Cext.echo.styles.badge%7Cext.gadget.AuthorityControl%2CCommonsMedia%2CDuplicateReferences%2CFindRedirectsForAliases%2CHotCat%2CMark_as_patrolled%2CMerge%2CMove%2CPopupsFix%2CPrimarySources%2CProtectionIndicators%2CRequestDeletion%2CRfDHelper%2CSiteIdToInterwiki%2CautoEdit%2Cimagelinks%2ClabelLister%2CmarkAdmins%7Cext.tmh.t
[18:27:05] humbnail.styles%7Cext.uls.nojs%7Cext.visualEditor.desktopArticleTarget.noscript%7Cext.wikimediaBadges%7Cmediawiki.legacy.commonPrint%2Cshared%7Cmediawiki.page.gallery.styles%7Cmediawiki.raggett%2CsectionAnchor%7Cmediawiki.skinning.interface%7Cskins.vector.styles%7Cwikibase.client.init&only=styles&skin=vector' because non CSS MIME types are not allowed in strict mode."
[18:27:10] although it shouldn't be cached for too long
[18:27:31] hoo: on terbium, with curl, i get the error page
[18:28:15] tried to purge the url
[18:29:50] heh, looking at logstash, the error page /srv/mediawiki/errorpages/hhvm-fatal-error.php has errors? O_o xD
[18:29:50] aude: works for me
[18:30:03] addshore: that should be gone
[18:30:07] hoo: even on terbium?
[18:30:08] aude: which host were you polling?
[18:30:18] I was testing mw1111 randomly
[18:30:40] curl -i "https://www.wikidata.org/wiki/Wikidata:Main_Page?action=purge"
[18:30:41] mw1234 seems good as well
[18:30:45] oh
[18:30:49] that goes through varnish
[18:30:49] just that from terbium
[18:30:56] then tried with -X PURGE
[18:31:06] and then still getting the error
[18:31:23] i don't know if i can do -X PURGE
[18:31:38] you shouldn't be able to
[18:31:44] oh
[18:31:48] * aude can on beta
[18:38:43] DanielK_WMDE: for just NS0 the ratio for miss:hit:pass is 1996823:304905:7405
[18:38:52] running it ignoring spiders now
[18:38:56] aude: the problem is that due to hhvm failing so badly during that request, the request returned 200 OK and no private cache header
[18:39:03] thus varnish caches it like a static asset
[18:39:09] even for logged in users
[18:39:41] addshore: wikidata has 100x more readers than editors? huh.
[18:39:52] addshore: for what period of time is that though?
[18:40:04] addshore: also, please put the numbers on the ticket
[18:40:08] spiders?
[18:40:10] 1 day, the 31st of jan 2016
[18:40:27] DanielK_WMDE: I'll add these to the ticket once the last query runs! :)
[18:41:29] uhh... only 7400 logged in page views per day?
[18:41:37] that's... low.
[18:41:43] even for a sunday.
[18:42:24] hoo: :(
[18:42:39] addshore: are we sure these flags mean what we think they mean? Who should we ask to make sure?
[18:43:13] DanielK_WMDE: we could double check with ops, but the descriptions I mentioned in the ticket are from the varnish docs
[18:46:12] addshore: yea, but it doesn't say anything about logged in users, right?
[18:49:01] DanielK_WMDE: no, there is no way to determine that
[18:49:46] DanielK_WMDE: apparently removing spiders takes the miss:hit:pass ratio from 1996823:304905:7405 to 234885:14140:6445
[18:50:19] what does pass mean in this context?
[18:52:22] hoo: logged in users?
[18:52:31] * aude not sure
[18:52:45] addshore: quite a difference. and... 1000 page views by a logged in crawler?!
[18:52:48] hm... that should be mostly misses on the backend varnishes
[18:53:02] Are these back- or frontend varnishes?
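The miss:hit:pass counts above come out of the analytics request logs; here is a hedged sketch of the kind of aggregation involved, assuming the wmf.webrequest Hive table with cache_status, namespace_id and agent_type fields (the exact schema and partitioning may differ):

```python
import subprocess

# Assumed schema: wmf.webrequest with cache_status ("hit"/"miss"/"pass"),
# namespace_id, and agent_type ("user" vs. "spider") fields, partitioned by date.
HIVE_QUERY = """
SELECT cache_status, COUNT(*) AS requests
FROM wmf.webrequest
WHERE year = 2016 AND month = 1 AND day = 31
  AND uri_host = 'www.wikidata.org'
  AND namespace_id = 0         -- items only (NS0)
  AND agent_type = 'user'      -- exclude self-identified spiders
GROUP BY cache_status
"""

# Run via the Hive CLI on a host with access to the analytics cluster.
subprocess.run(["hive", "-e", HIVE_QUERY], check=True)
```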
[18:53:11] front, i hope
[18:53:16] at least, that's what i'm interested in
[18:53:27] also front I hope, otherwise the webrequest data and page view data would be incomplete ;)
[18:53:57] ah ok
[18:54:06] yes, these take their data from frontend varnishes
[18:55:18] But the webrequest data knows whether the user is logged in or out
[18:55:24] so why look at misses vs. pass
[18:55:27] Stylesheet is back
[18:55:35] purge is working again
[18:55:42] \o/
[18:55:49] sjoerddebruin: yeah, a workaround has been deployed
[18:56:10] hoo: no it doesn't :/ Not in the schema, also just checked with analytics
[18:58:20] addshore: Ok, but logged in users are somewhere between these misses and probably also some passes
[18:58:29] not sure where we even do pass
[18:59:41] so when I view Q42 now I make a "cache_status":"pass"
[19:00:44] interesting
[19:00:55] although only the last backend varnish passes
[19:01:18] Yeh, the full header is "x_cache":"cp1055 miss+chfp(0), cp3013 pass+chfp(0), cp3010 frontend pass+chfp(0)" but it's recorded as pass
[19:02:25] addshore: But you only get a pass if you (yourself) visit that page for the second time
[19:02:41] *will go and check* ;)
[19:03:01] hoo: logged in should be a subset of pass, right?
[19:03:12] afaik, any cookie -> pass
[19:03:15] of pass and miss, yes
[19:03:18] no
[19:03:25] o_O
[19:03:39] do https://www.wikidata.org/wiki/Special:Random logged in
[19:03:47] you'll almost certainly get miss
[19:03:49] hoo: if logged in views may also be counted as misses, then we are screwed
[19:04:04] i'm confused. the foundation has to have a way to count logged in page views, no?
[19:04:05] they are
[19:04:27] you only get a pass if you visit the very same page twice
[19:04:32] hoo: i'd assume that Special:Random sends nocache, and thus is a pass, not a miss
[19:04:43] so if I visit Q42 I get a miss, the second time a pass (using one account)
[19:04:52] if I then switch accounts, I also get a miss first and then a pass
[19:05:10] DanielK_WMDE: I'm not talking about the special page itself, but the page it redirects you to
[19:05:13] sorry for the confusion
[19:05:14] hoo: huh? that sounds strange. why?...
[19:05:34] because we have a hit then in the backend
[19:05:38] but we decide to ignore it
[19:05:41] thus we pass the request
[19:05:41] so "pass" means "i have a cached version, but i won't give it to you"?
[19:05:43] I guess
[19:05:47] yeah
[19:05:55] that makes no sense
[19:06:08] the output from app servers that is generated for logged in users should not ever be cached
[19:06:19] it contains personalized info
[19:06:32] hoo: DanielK_WMDE yes, so as a logged in user first I generate a miss, then I generate a pass
[19:06:40] Yes, but varnish probably still records it briefly
[19:06:47] o_O
[19:06:50] wtf? why would it?
[19:06:56] dunno
[19:07:04] addshore: that means the stats are completely useless to us :(
[19:07:11] yes, that's certain
[19:07:13] so all of the passes will be logged in users, and also lots of the misses will be :P
[19:07:23] bleh
[19:07:34] addshore: please ask the analytics folks, there's gotta be a way
[19:07:42] I'll ask again ;)
[19:07:45] thanks
[19:21:01] addshore: bad news. i see the uls icon even when not logged in
[19:21:17] DanielK_WMDE: how about a gadget?
[19:21:49] addshore: default gadgets are enabled for anons.
[19:21:55] other gadgets aren't for all users...
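To make the hit/miss/pass classification concrete: the recorded cache_status follows the last, frontend hop of the X-Cache header, so a request like the one quoted above counts as a pass even though the first varnish missed. A small parsing helper (my own sketch, checked against the two header values quoted in this log):

```python
def frontend_cache_status(x_cache):
    """Extract the frontend varnish verdict from an X-Cache header value."""
    last_hop = x_cache.split(",")[-1].strip()   # e.g. "cp3010 frontend pass+chfp(0)"
    verdict = last_hop.split()[-1]              # "pass+chfp(0)"
    return verdict.split("+")[0].split("(")[0]  # "pass"

assert frontend_cache_status(
    "cp1055 miss+chfp(0), cp3013 pass+chfp(0), cp3010 frontend pass+chfp(0)"
) == "pass"
assert frontend_cache_status(
    "cp1055 miss(0), cp3040 hit(2), cp3031 frontend hit(1)"
) == "hit"
```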
[19:21:58] bah
[19:22:00] I'll come up with something
[19:22:14] :)
[19:23:25] addshore: if you do hack up your own, just load an icon or something, so we can grab the number from the request counters
[19:23:37] yup
[19:23:45] in the mw.config, there is wgUserId etc.
[19:25:08] https://upload.wikimedia.org/wikipedia/commons/d/db/Symbol_list_class.svg any ideas what this is for?
[19:25:11] addshore: but that resource would need to have must-revalidate set...
[19:25:17] it's complicated :)
[19:26:44] oh, that's the label lister gadget, bah, I thought this second account had clean user preferences
[19:27:12] Ah, yes.
[19:27:26] Still some work needed to set the beta version of that live. :(
[21:02:07] anybody know why, when I try to add a value to an identifier property on test.wikidata.org - e.g. InstNum - I get "It is not possible to define a new value for a deleted property."?
[21:13:14] aude: any ideas ^?
[21:20:26] SMalyshev: ugh
[21:20:36] does that happen for all properties or only for a certain subset?
[21:20:46] hoo: for all I tried so far
[21:21:00] that sounds a little like the property info has gone corrupt
[21:21:11] there are three here: https://test.wikidata.org/wiki/Special:ListProperties/external-id
[21:21:21] oh, of course
[21:21:31] we disabled the external identifier datatype
[21:21:47] so that's expected for now
[21:22:16] Guess these were created between us deploying the code for identifiers and (temporarily) disabling them
[21:22:39] hoo: I created a new one and the same happens
[21:22:51] https://test.wikidata.org/wiki/Property:P708
[21:22:55] wait, you can create new ones?
[21:23:03] * hoo rages
[21:23:08] hoo: why not?
[21:23:23] it's test wiki
[21:23:24] because it shouldn't be enabled yet
[21:23:32] hmm... dunno :)
[21:23:43] I fear we have the same on the real one, then
[21:24:12] I wouldn't try to mess up the real one, but on test it appears broken
[21:24:15] ok, we don't
[21:24:34] on wikidata.org the type doesn't show up in the special page
[21:24:36] on test it does
[21:25:47] hm... I actually only disabled it on the real wikidata
[21:26:03] Did I intend that?
[21:26:09] heh :)
[21:26:31] Well, seems that was intended
[21:26:41] weird
[21:27:34] so let's see why it fails
[21:30:28] all of them are correctly referenced in the property info
[21:35:11] wow, same happens locally
[21:39:05] SMalyshev: Ah ok, the code in question is not even merged on master yet
[21:39:10] I was not up to date on that end
[21:39:26] you should be able to set it via the api
[21:39:31] but the UI is not up with that
[21:40:11] ah, ok. So I just wait until the rest of the code is deployed on test?
[21:40:21] Probably two more weeks, yes
[21:40:39] Lydia just said in 2 weeks it goes live on wikidata?
[21:40:47] did she?
[21:40:57] It's not in master yet and I'm about to create a branch
[21:41:02] and we don't plan to branch next week
[21:41:18] https://lists.wikimedia.org/pipermail/wikidata/2016-February/008068.html
[21:44:17] new datatype for identifiers: we'll bring it to wikidata.org on the 16th
[21:44:17] hm, ok
[21:44:17] that's why I wanted to check if everything is working nicely (rdf, bots, etc.), but if it's not done yet then I'll wait
[21:44:17] guess we branch next week, then
[21:44:17] ok, sounds good
[21:44:17] or someone comes around to merging the code in question really fast
[21:44:18] addshore: Can you make a list under https://www.wikidata.org/wiki/User:Addshore/Identifiers of all the string properties that are *not* going to be converted?
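Since the UI path is blocked, the API route hoo points to would look roughly like this: a sketch assuming an already-authenticated session against test.wikidata.org, the test property P708 from above, and a made-up target item and identifier value:

```python
import json
import requests

API = "https://test.wikidata.org/w/api.php"
session = requests.Session()  # assumed: already logged in to test.wikidata.org

# Editing modules need a CSRF token first.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

# Identifier (string-valued) claims take a JSON-encoded string as the value.
resp = session.post(API, data={
    "action": "wbcreateclaim",
    "entity": "Q42",                     # made-up target item
    "property": "P708",                  # the external-id test property above
    "snaktype": "value",
    "value": json.dumps("EXAMPLE-123"),  # made-up identifier value
    "token": token,
    "format": "json",
})
print(resp.json())
```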
[21:44:18] +1
[21:48:21] I like how Wikidata has become this index of ID numbers
[21:48:30] and Wikidata items in turn have their own ID numbers as well. The ID number to end all ID numbers.
[21:48:48] Know the Wikidata item number, know the other ID numbers as well.
[21:52:01] hoo: we can deploy without the frontend changes. it would behave as it does now with the widget: after editing, the id would not be linked
[21:52:20] DanielK_WMDE: No, right now it can't be edited at all
[21:52:25] we don't do fallbacks in the frontend
[21:52:35] the DV is just unknown
[21:52:42] which makes the UI assume the property doesn't exist
[21:52:52] (which is stupid)
[21:53:02] meh.. i thought i tested that... must have been wishful thinking.
[21:53:15] I thought so as well
[21:53:27] then I realized I didn't create a single external identifier locally
[21:53:38] question: with the new identifier datatype, does this mean identifiers will be presented differently?
[21:53:39] I just tested the blacklisting :P
[21:54:09] harej: Not directly related to that, but yes, that will happen really soon
[21:54:23] how will it look different?