[00:04:44] 10MediaWiki-extensions-WikibaseClient, 10Wikidata, 7Tracking: Allow accessing data from an item not connected to the current page - arbitrary access (tracking) - https://phabricator.wikimedia.org/T49930#1072150 (10aude) @greg there are some more issues (e.g. T89002) that we need to resolve for commons, since... [08:01:12] https://www.wikidata.org/wiki/Q19316447 <-- somebody has made an item for "restaurant" in Russian and similar languages [08:01:49] https://www.wikidata.org/w/index.php?title=Q19316447&action=history <-- they've moved site links from the restaurant item [08:20:01] 10Wikidata, 10Continuous-Integration, 10§ Wikidata-Sprint-2015-02-03, 10§ Wikidata-Sprint-2015-02-25: fix the qunit tests for wikidata: mwext-Wikibase-qunit - https://phabricator.wikimedia.org/T74184#1072953 (10adrianheine) The focusing tests fail. We should either try to detect if they could possibly pass... [08:54:57] 10Wikidata, 10Continuous-Integration, 10§ Wikidata-Sprint-2015-02-03, 10§ Wikidata-Sprint-2015-02-25: fix the qunit tests for wikidata: mwext-Wikibase-qunit - https://phabricator.wikimedia.org/T74184#1072985 (10Tobi_WMDE_SW) Regarding the focus tests, IIRC we talked about removing them quite a few times. Th... [10:18:42] [13WikidataBrowserTests] 15WMDE-Fisch pushed 1 new commit to 06master: 02http://git.io/xkid [10:18:42] 13WikidataBrowserTests/06master 146885cf6 15WMDE-Fisch: updated gems excluding mediawiki_selenium 1.0.0 [10:19:23] [13WikidataBrowserTests] 15WMDE-Fisch pushed 1 new commit to 06sitelink_fixmes: 02http://git.io/xkPf [10:19:23] 13WikidataBrowserTests/06sitelink_fixmes 143ebabd0 15WMDE-Fisch: refactored sitelinks to use indexed_properties [10:30:57] [13WikidataBrowserTests] 15tobijat comment on pull request #51 1464c9938: What I meant is that the WD element is missing a set method to put text into an input field. That's why we used the send_keys. 02http://git.io/xkMN [10:32:36] [13WikidataBrowserTests] 15tobijat closed pull request #51: added test not allowing two sitelinks to the same site T52362 (06master...06sitelinks_T52362) 02http://git.io/AyxJ [10:34:51] [13WikidataBrowserTests] 15WMDE-Fisch comment on pull request #51 1464c9938: I avoid all this in #52 by refactoring the usage of @browser.element(css....
to have dynamic elements with the help of indexed properties 02http://git.io/xkym [10:35:57] [13WikidataBrowserTests] 15WMDE-Fisch 04force-pushed 06sitelink_fixmes from 143ebabd0 to 14da6ff46: 02http://git.io/AFia [10:35:58] 13WikidataBrowserTests/06sitelink_fixmes 14ed7e088 15WMDE-Fisch: get rid of fixmes and refactored sitelink creation [10:35:58] 13WikidataBrowserTests/06sitelink_fixmes 14da6ff46 15WMDE-Fisch: refactored sitelinks to use indexed_properties [11:05:28] (03CR) 10Aude: "it probably makes sense to review this together with the follow up" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/192459 (https://phabricator.wikimedia.org/T89956) (owner: 10Aude) [11:08:07] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Add mechanism for registering new entity types, to work with things like EntityViewFactory - https://phabricator.wikimedia.org/T77985#1073126 (10JanZerebecki) 5Invalid>3Open [11:08:08] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Support new types of Entities in Wikibase Repository - https://phabricator.wikimedia.org/T75496#1073127 (10JanZerebecki) [11:08:58] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Create extension mechanism for Wikibase Repository for new entity types - https://phabricator.wikimedia.org/T76021#1073128 (10JanZerebecki) [11:10:04] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Add mechanism for registering new entity types, to work with things like EntityViewFactory - https://phabricator.wikimedia.org/T77985#833341 (10JanZerebecki) On the right there is a button Merge Duplicates In. [11:21:48] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10Possible-Tech-Projects, 7Need-volunteer: add a new datatype for geoshapes - https://phabricator.wikimedia.org/T57549#1073147 (10Lydia_Pintscher) [11:22:11] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10Possible-Tech-Projects, 7Need-volunteer: add a new datatype for formulas - https://phabricator.wikimedia.org/T67397#1073155 (10Lydia_Pintscher) [11:23:27] [13WikidataBrowserTests] 15WMDE-Fisch created 06time_T88542 (+1 new commit): 02http://git.io/xkh6 [11:23:27] 13WikidataBrowserTests/06time_T88542 14d25d7d8 15WMDE-Fisch: initial statments_time feature commit [11:58:14] 10Wikidata, 7Tracking: Wikidata Browsertests (tracking) - https://phabricator.wikimedia.org/T88541#1073208 (10WMDE-Fisch) [11:58:15] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Browsertests to check that only one sitelink per target site can exist - https://phabricator.wikimedia.org/T52362#1073207 (10WMDE-Fisch) 5Open>3Resolved [11:58:18] 10Wikidata, 10Analytics: active user statistics that have less lag than wikistats - https://phabricator.wikimedia.org/T88121#1073209 (10JanZerebecki) The previous query was a bogus result, because the data is usually only kept 2 days in that table. 
[11:59:55] (03CR) 10Daniel Kinzler: [C: 032] Allow getting the user's language and splitting the ParserCache from Lua [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/189324 (owner: 10Hoo man) [12:00:44] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10§ Wikidata-Sprint-2015-02-25: Browsertests for URL value input - https://phabricator.wikimedia.org/T88545#1073211 (10WMDE-Fisch) a:3WMDE-Fisch [12:02:59] (03Merged) 10jenkins-bot: Allow getting the user's language and splitting the ParserCache from Lua [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/189324 (owner: 10Hoo man) [12:09:49] (03CR) 10Daniel Kinzler: [C: 032] Introduce PropertyParserFunctionIntegrationTest (031 comment) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/192859 (owner: 10Hoo man) [12:12:29] (03Merged) 10jenkins-bot: Introduce PropertyParserFunctionIntegrationTest [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/192859 (owner: 10Hoo man) [13:10:09] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Add mechanism for registering new entity types, to work with things like EntityViewFactory - https://phabricator.wikimedia.org/T77985#1073285 (10GPHemsley) [13:12:57] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Create extension mechanism for Wikibase Repository - https://phabricator.wikimedia.org/T76021#1073296 (10JeroenDeDauw) [13:13:38] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 5Patch-For-Review: Introduce wikibase.view.ViewFactory - https://phabricator.wikimedia.org/T90720#1073307 (10Tobi_WMDE_SW) [13:18:03] 10Wikidata, 10Analytics: active user statistics that have less lag than wikistats - https://phabricator.wikimedia.org/T88121#1073309 (10JanZerebecki) $ date -d '-30days' --iso 2015-01-28 $ date -d '-1days' --iso 2015-02-26 [wikidatawiki]> select count(*) as c, rc_user_text from recentchanges where rc_timestamp... [13:27:23] 10Wikidata, 10Analytics: active user statistics that have less lag than wikistats - https://phabricator.wikimedia.org/T88121#1073317 (10Lydia_Pintscher) 5Open>3Resolved a:3Lydia_Pintscher Sweet. Thanks! [13:32:02] 10Wikidata, 10Analytics, 10§ Wikidata-Sprint-2015-02-25: active user statistics that have less lag than wikistats - https://phabricator.wikimedia.org/T88121#1073320 (10JanZerebecki) [13:43:20] 10Wikidata, 10Continuous-Integration, 10MediaWiki-ResourceLoader, 10MediaWiki-Vagrant, and 2 others: qunit test broken without explicitly setting $wgResourceLoaderMaxQueryLength - https://phabricator.wikimedia.org/T90453#1073327 (10Krinkle) This change ([Ic416def846f361425c46f7b](https://gerrit.wikimedia.o... [13:54:03] (03CR) 10WMDE-Fisch: [C: 031 V: 031] "could not check on IE though..." [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/192545 (https://phabricator.wikimedia.org/T90395) (owner: 10Aude) [13:54:48] (03CR) 10WMDE-Fisch: [C: 031 V: 031] "could not check on IE though..." [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/192301 (https://phabricator.wikimedia.org/T90426) (owner: 10Aude) [13:55:38] (03CR) 10WMDE-Fisch: [C: 031 V: 031] "could not check on IE though..." [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/192459 (https://phabricator.wikimedia.org/T89956) (owner: 10Aude) [14:07:09] 10Wikidata, 10Continuous-Integration, 10MediaWiki-ResourceLoader, 10MediaWiki-Vagrant, and 2 others: qunit test broken without explicitly setting $wgResourceLoaderMaxQueryLength - https://phabricator.wikimedia.org/T90453#1073364 (10JanZerebecki) Test run without that setting: https://integration.wikimedia.... 
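[editor's note] For readers hitting the qunit breakage tracked in T90453 above: per the task title, the workaround is to set $wgResourceLoaderMaxQueryLength explicitly. A minimal sketch of what that looks like in LocalSettings.php; the value shown is an assumption (the usual MediaWiki default of the era), not taken from this log:

    // LocalSettings.php: set the ResourceLoader batch-request length cap
    // explicitly so local qunit runs match CI behavior (value assumed).
    $wgResourceLoaderMaxQueryLength = 2000;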
[14:16:43] 10Wikidata: Port wikibase templates to Mustache - https://phabricator.wikimedia.org/T91067#1073386 (10daniel) 3NEW [14:33:01] so, authority control does not link values in a reference? [14:33:55] no, it does [14:38:11] actually seems it depends [14:50:44] 10Wikidata, 10Datasets-General-or-Unknown, 6operations: Wikidata dumps contain old-style serialization. - https://phabricator.wikimedia.org/T74348#1073417 (10ArielGlenn) OK, I no longer feel as stupid. The number of items with the 'entity' format is small in comparison to the total number of qualities, we... [14:51:23] 10Wikidata, 10Datasets-General-or-Unknown, 6operations: Wikidata dumps contain old-style serialization. - https://phabricator.wikimedia.org/T74348#1073418 (10ArielGlenn) Um, "with this format" means new redirects are dumped with {"entity" ... etc. [15:03:14] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: Data Type URL should accept mail links - https://phabricator.wikimedia.org/T91069#1073423 (10WMDE-Fisch) 3NEW [15:09:55] [13WikidataBrowserTests] 15WMDE-Fisch created 06url_T88545 (+1 new commit): 02http://git.io/xLWU [15:09:56] 13WikidataBrowserTests/06url_T88545 1472a52c9 15WMDE-Fisch: initial statments_url feature commit [15:11:37] dear colleagues, is there a way to keep the 'references' part of a statement 'closed' upon page load? 'add reference' just takes so much screen space. i feel guilty enough for a lack of references without this :) [15:11:56] Nothing built in, no [15:12:03] might be that there is a gadget, but I doubt it [15:14:09] [13WikidataBrowserTests] 15WMDE-Fisch 04force-pushed 06time_T88542 from 14d25d7d8 to 1459800c8: 02http://git.io/xLlF [15:14:10] 13WikidataBrowserTests/06time_T88542 1459800c8 15WMDE-Fisch: initial statments_time feature commit [15:14:54] [13WikidataBrowserTests] 15WMDE-Fisch 04force-pushed 06url_T88545 from 1472a52c9 to 14e6b6aaf: 02http://git.io/xL8g [15:14:54] 13WikidataBrowserTests/06url_T88545 14e6b6aaf 15WMDE-Fisch: initial statments_url feature commit [15:16:29] [13WikidataBrowserTests] 15WMDE-Fisch opened pull request #53: initial tests for URL data type on statements T88545 (06master...06url_T88545) 02http://git.io/xL4L [15:18:00] [13WikidataBrowserTests] 15WMDE-Fisch opened pull request #54: initial tests for Time data type on statements T88542 (06master...06time_T88542) 02http://git.io/xLBt [15:30:39] 10Wikidata: brainstorming for Wikimania 2015 talk ideas for Wikidata - https://phabricator.wikimedia.org/T87334#1073536 (10Lydia_Pintscher) 5Open>3Resolved a:3Lydia_Pintscher https://wikimania2015.wikimedia.org/wiki/Category:Wikidata [15:41:19] * edward merged Q7509539 and Q2282186 [15:50:54] hoo: thnx [16:00:47] 10Wikidata, 10MediaWiki-API: Update Wikidata for ApiResult rewrite - https://phabricator.wikimedia.org/T91073#1073637 (10bd808) 3NEW [16:01:49] 10Wikidata, 10MediaWiki-API: Update Wikidata for ApiResult rewrite - https://phabricator.wikimedia.org/T91073#1073652 (10bd808) See also: * {T76728} * {T57371} * {T33629} [16:06:11] it's a pity that we are going to hit 200m edits in the next few days, but because of missing dumps we can't even answer basic questions like how many of those were made by bots, etc. [16:06:41] dennyvrandecic: tool labs can answer these [16:06:51] yay! [16:06:54] can you ask it? [16:07:26] I'd just like to know how many of the 200m are done by bots [16:07:52] No. of bot edits? Which time span? Users that have the bot flag? Or users that contain the string "bot"?
[16:08:11] two time spans: from start to today, the last month [16:08:18] hoo: is there some convenient way you know to find all properties used in a statement (e.g. main snak, reference, qualifiers) in js? [16:08:57] * aude thinks a method like that should be somewhere in wikibase itself, or one of the components [16:09:04] re what are bots, whatever is likelier to be correct ... :-/ probably a union of those two [16:09:28] aude: Not something I'm aware of... look how authority control does it [16:09:39] hoo: that's what i am hacking on :P [16:09:40] it just iterates over everything AFAIR [16:09:41] :P [16:09:46] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 7Need-volunteer: suggest existing items when creating a new one to prevent duplication - https://phabricator.wikimedia.org/T41763#1073673 (10Lydia_Pintscher) 5Open>3declined a:3Lydia_Pintscher [16:09:52] dennyvrandecic: Ok, mh [16:09:54] it actually doesn't find referenced properties if they are only on references or qualifiers [16:10:11] hoo: thanks! [16:10:11] only if they happen to be used in a main snak, then they also get linked in a reference or qualifier :o [16:10:34] have a fix for references and will do qualifiers, but instinct tells me this doesn't belong in a gadget [16:12:07] aude: Our data model doesn't have that knowledge in any single point [16:12:14] hm [16:12:28] i also don't find such a convenience method in php [16:12:51] In PHP look for where we add the internal links [16:13:05] i did [16:13:06] pagelinks, I mean [16:14:00] entity->getStatements()->getAllSnaks [16:15:26] Oh right, yeah [16:15:33] that iterates over everything as well [16:15:43] and then you have to iterate over all Snaks again [16:15:48] bad runtime, I guess [16:15:50] yeah [16:16:38] JS data model doesn't have that [16:16:53] it could, if it's efficient enough etc [16:16:56] it has a wb.datamodel.StatementList.getPropertyIds but that only takes main snaks into account [16:17:05] would at least be better there than in a gadget [16:17:17] Probably [16:17:25] shouldn't be very hard to implement [16:17:32] analogous to the PHP implementation [16:17:34] anyway, i am fixing the gadget now but can make a task for that [16:17:46] and can be prioritized accordingly [16:18:06] For the gadget on Wikidata it might be faster to iterate over all and filter by href [16:18:26] maybe [16:18:30] hmm... I am thinking of a submission to Wikimania on "What can Google do for you?"
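[editor's note] The 16:08-16:18 exchange above is about collecting every property used anywhere in a statement, i.e. in main snaks, qualifiers, and references. A minimal PHP sketch of the convenience method aude is asking for, built on the StatementList::getAllSnaks() call quoted at 16:14:00; treat the exact class and method signatures as assumptions based on the names in this log, not verified Wikibase DataModel API:

    /**
     * Collect the distinct IDs of all properties used in any snak of a
     * statement list, including qualifier and reference snaks.
     */
    function getUsedPropertyIds( StatementList $statements ) {
        $propertyIds = array();
        // getAllSnaks() walks main snaks, qualifiers and references;
        // as noted above it iterates over everything, hence the
        // "bad runtime" concern for repeated calls.
        foreach ( $statements->getAllSnaks() as $snak ) {
            // Deduplicate using the serialized ID (e.g. "P31") as a key.
            $propertyIds[ $snak->getPropertyId()->getSerialization() ] = true;
        }
        return array_keys( $propertyIds );
    }

This mirrors what wb.datamodel.StatementList.getPropertyIds would need to do on the JS side to also cover qualifier and reference snaks.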
[16:18:45] but I cannot figure out which track would be appropriate [16:18:54] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10§ Wikidata-Sprint-2015-02-25: Browsertests for URL value input - https://phabricator.wikimedia.org/T88545#1073682 (10WMDE-Fisch) https://github.com/wmde/WikidataBrowserTests/pull/53 [16:19:14] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10§ Wikidata-Sprint-2015-02-25, 5Patch-For-Review: Browsertests for URL value input - https://phabricator.wikimedia.org/T88545#1073683 (10WMDE-Fisch) [16:19:19] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata: JSON should (optionally) contain full URIs for referenced external entities - https://phabricator.wikimedia.org/T73992#1073685 (10Lydia_Pintscher) [16:19:20] 10Wikidata, 7Need-volunteer, 7Tracking: selfcontained projects around Wikidata (tracking) - https://phabricator.wikimedia.org/T90870#1073684 (10Lydia_Pintscher) [16:19:30] dennyvrandecic: Tomorrow is the last day for submissions [16:19:36] I know [16:19:44] that's why I should decide on the track today :) [16:19:47] dennyvrandecic: sad there is no "open data" track [16:19:57] yeah, that would help [16:19:59] not at all perfect, but maybe technology [16:20:06] since data is sort of related [16:20:18] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10§ Wikidata-Sprint-2015-02-25: Browsertests for time value input - https://phabricator.wikimedia.org/T88542#1073686 (10WMDE-Fisch) https://github.com/wmde/WikidataBrowserTests/pull/54 [16:20:24] and google is known for GSOC, code-in etc [16:20:40] 10MediaWiki-extensions-WikibaseRepository, 10Wikidata, 10§ Wikidata-Sprint-2015-02-25, 5Patch-For-Review: Browsertests for time value input - https://phabricator.wikimedia.org/T88542#1073687 (10WMDE-Fisch) [16:20:53] yeah, I guess [16:21:08] if it's about open data maybe "Legal & Free Culture" could also fit? [16:21:40] it is also about how to have a conversation with the community [16:21:43] could work [16:21:46] yeah [16:21:49] if we have a dataset and how to offer it to the community [16:21:51] maybe what hoo says [16:22:07] what's the best way to do it; this is why I was thinking of WikiCulture and Community [16:22:29] could also work [16:22:56] Maybe annotate that you're ok with them changing it, if they see fit? [16:23:05] yes, definitely [16:23:33] ok, I think I will take community and tell them to change it if it doesn't fit [16:24:43] dennyvrandecic: MariaDB [wikidatawiki_p]> SELECT COUNT(*) AS nr_edits FROM recentchanges LEFT JOIN user_groups ON ug_user = rc_user AND ug_group = 'bot' WHERE ug_group IS NULL; [16:24:48] 2293534 [16:24:52] Last 31 days [16:25:03] Only users that are not in the bot group [16:25:13] for the longer period I guess we need a better check for 'is bot' [16:25:26] Oh, I'm stupid [16:25:33] recentchanges has that information directly :S [16:25:36] nice. what are the total edits in that timeframe? [16:25:38] woot? [16:26:33] MariaDB [wikidatawiki_p]> SELECT COUNT(*) AS nr_edits FROM recentchanges WHERE rc_type = 0; [16:26:33] 7561005 [16:26:40] MariaDB [wikidatawiki_p]> SELECT COUNT(*) AS nr_edits FROM recentchanges WHERE rc_type = 0 AND rc_bot = 0; [16:26:40] 2144832 [16:26:53] those are exact numbers (even taking flood accounts into account) [16:26:57] or the flood flag [16:27:07] so 7.5 M edits in the last month, of those 2.1M made by humans? [16:27:20] Yep :) [16:27:25] that's a great number! [16:27:43] any way to get the ratio for all time?
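[editor's note] A quick check of the ratio from the two counts just quoted: 2144832 non-bot edits out of 7561005 total is 2144832 / 7561005 ≈ 0.28, so roughly 28% human and 72% bot edits in the 31-day window, consistent with the "about 70% bots on wikidata" figure given a few lines below.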
[16:27:54] i have numbers from last summer, and I could extrapolate [16:28:01] but fresh numbers would be great [16:28:01] Yeah, sure... just not this straightforward and exact [16:28:08] doesn't have to be exact [16:28:30] i am happy with precision +-5% :) [16:29:00] oh, and just for fun, could you get the above numbers for the last 30 days from enwp? [16:29:23] dennyvrandecic: Maybe... got to leave in 10 min to catch a train :S [16:29:31] sure :) [16:30:38] dennyvrandecic: Do we care about deleted edits? [16:30:54] whatever is easier [16:31:07] In that case: We don't [16:31:38] 4104368 edits on enwiki; 3650218 non-bot edits on enwiki [16:31:41] last 31 days [16:32:50] great [16:33:28] so about 10-15% bots on enwiki, about 70% bots on wikidata in the last 30 days [16:33:59] https://tools.wmflabs.org/hoo/wikidata-nonbot-edits.txt [16:34:06] results for wikidata will appear there (hopefully) [16:34:24] query: SELECT COUNT(*) FROM revision INNER JOIN page ON page_id = rev_page WHERE NOT EXISTS(SELECT 1 FROM user_groups WHERE ug_user = rev_user AND ug_group = 'bot') AND NOT EXISTS(SELECT 1 FROM user_former_groups WHERE ufg_user = rev_user AND ufg_group = 'bot') AND page_namespace = 0; [16:34:30] got to go... see you :) [16:37:08] thank you! [16:47:35] 10Wikidata, 10§ Wikidata-Sprint-2015-02-03, 10§ Wikidata-Sprint-2015-02-25: Adjust AuthorityControl Gadget to snakview name changes - https://phabricator.wikimedia.org/T87858#1073727 (10aude) ideally this fix is on top of fix for T89259 [16:58:44] 10Wikidata, 10§ Wikidata-Sprint-2015-02-25: Automated Wikidata build process - https://phabricator.wikimedia.org/T90567#1073743 (10JanZerebecki) There is something that pushes to gerrit from jenkins@gallium, which is not an option for Wikidata as we don't want to run composer there. Quick grepping revealed nothing... [19:24:34] if anyone is curious https://wikimania2015.wikimedia.org/wiki/Submissions/What_can_Google_do_for_you%3F [19:31:04] * Lydia_WMDE would love some names under https://wikimania2015.wikimedia.org/wiki/Submissions/State_of_Wikidata_-_giving_more_people_more_access_to_more_knowledge_one_edit_at_a_time and https://wikimania2015.wikimedia.org/wiki/Submissions/Ask_Us_Anything_About_Wikidata [19:31:50] I am not going [19:32:57] I'm not able to go, sorry [19:35:13] you can still vote! [19:35:31] that's totally ok since you will want to watch the videos later, no? ;-) [19:36:31] I have not even seen my presentation in London [19:36:44] never learned whether it became available online [19:38:53] dennyvrandecic: interesting talk [19:39:20] GerardM-: thanks [19:45:10] Lydia_WMDE: i should add my name [19:45:25] :) [19:45:34] else, they might schedule my osm workshop at the same time [19:46:19] https://wikimania2015.wikimedia.org/wiki/Submissions/OpenStreetMap_workshop :) [20:04:38] 10Wikibase-DataModel-Serialization: Empty JSON maps serialized as empty lists in XML dumps - https://phabricator.wikimedia.org/T91117#1074553 (10mkroetzsch) 3NEW [20:17:32] 10Wikidata, 6Language-Engineering, 7Upstream: dvwiki link (font) causes page to crash in chrome 40.* and doesn't render in firefox - https://phabricator.wikimedia.org/T88478#1074587 (10aude) [21:39:08] aude: ReferencedEntitiesFinder? [21:39:43] aude: hm, that takes a snak list, though.... [21:41:01] oh, JS! Sorry :D [21:44:28] DanielK_WMDE: yeah :) [21:45:14] Entity::getAllSnaks would help (if in js), although it might not be very efficient [22:00:14] does anybody know if there's an internal testing db for wikidata with production data?
I tried to use the labs one, but it turns out it doesn't have the text table... so not useful [22:00:15] no, it would be nicer to directly collect the entity ids while traversing the structure [22:01:20] SMalyshev: try implementing an EntityRevisionLookup that asks the live site via the API [22:01:50] if we did the injection right, that should Just Work (tm) [22:02:27] narrow interfaces FTW :) [22:02:38] DanielK_WMDE: where would that be? from the backtraces I'm getting from the wiki I don't see any mention of EntityRevisionLookup (or, for that matter, of any wikidata files at all...) [22:03:06] https://gist.github.com/smalyshev/835689dac58b50e6f51f [22:03:50] SMalyshev: running a full mediawiki with that would be tricky. but it should be possible to rig the dump script to use the custom lookup [22:04:21] SMalyshev: running mediawiki against the replicated databases on labs will not work anyway [22:04:49] too many things are not replicated. some tables are missing columns, too. [22:05:14] hmm... what's the point of having a replicated db if one can't use it? [22:05:24] you can use it for querying [22:05:27] not for running a wiki [22:05:43] people build analysis tools on top of these databases [22:05:47] is there any way to test a wiki with actual production data? [22:05:56] no [22:06:01] I see [22:06:07] you'd need to copy the database [22:06:17] or parts of it [22:06:25] that's how the beta site works [22:06:28] that sounds like it's going to take a lot of time... [22:07:00] we *should* have a vagrant bundle with a bit of magic that would pull in decent test data [22:07:08] but i think the second part doesn't exist [22:07:34] ok... I guess I'll submit a ticket about it and just create a huge instance on labs and load a dump there [22:08:48] SMalyshev: try to just import a small dump. https://phabricator.wikimedia.org/T90148 [22:09:06] note that you'll need to enable entity imports, they are disabled by default. [22:09:23] allowEntityImport=true [22:11:23] SMalyshev: you can of course install your patched code on an app server on the live cluster that is not in rotation. i believe people do that, but it's risky, you could mess up the live db. I have never done it. [22:11:49] but i think there are boxes set aside for this somewhere. wmf folks would know [22:13:14] SMalyshev: DumpJson::initServices has this: $revisionLookup = WikibaseRepo::getDefaultInstance()->getStore()->getEntityRevisionLookup( 'uncached' ); [22:13:26] just provide an implementation that pulls from the API [22:20:04] DanielK_WMDE: I'm not sure that would be a good solution - it will probably take me some time to implement, and it would hit the server for every entity, of which we have 16 million [22:20:36] DanielK_WMDE: where is the small dump you are talking about? [22:24:27] not done yet. https://phabricator.wikimedia.org/T90148 [22:24:44] but you can make one from Special:Export. should work for a hundred items or so [22:24:47] ah ok :) [22:25:46] DanielK_WMDE: a hundred is not much... we'd need something much bigger pretty soon [22:29:08] SMalyshev: yea... poke us about the ticket next week [22:34:34] ok, I will :) [22:37:14] 10Wikidata, 10Wikimedia-Labs-General: Need a way to test with data set reasonably close to production - https://phabricator.wikimedia.org/T91131#1074849 (10Smalyshev) 3NEW [22:42:56] DanielK_WMDE: is there a description somewhere of what each of the wikidata dumps contains? [22:43:01] I mean XML dumps
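[editor's note] The approach DanielK_WMDE suggests at 22:01:20 and 22:13:26, an EntityRevisionLookup that pulls entities from the live site, might look roughly like the sketch below. The interface shape, the EntityRevision constructor, and the deserializer wiring are assumptions inferred from the class names quoted in this log, not verified Wikibase signatures; the Special:EntityData URL format is real and serves the current entity revision as JSON:

    /**
     * Sketch of an EntityRevisionLookup backed by live wikidata.org, for
     * rigging the dump script (DumpJson::initServices) to production data.
     */
    class ApiEntityRevisionLookup implements EntityRevisionLookup {

        private $deserializer;

        public function __construct( Deserializer $entityDeserializer ) {
            $this->deserializer = $entityDeserializer;
        }

        public function getEntityRevision( EntityId $id, $revisionId = 0 ) {
            // Special:EntityData returns {"entities": {"Q42": {..., "lastrevid": N}}}.
            $url = 'https://www.wikidata.org/wiki/Special:EntityData/'
                . $id->getSerialization() . '.json';
            $data = json_decode( file_get_contents( $url ), true );
            $record = $data['entities'][ $id->getSerialization() ];
            return new EntityRevision(
                $this->deserializer->deserialize( $record ),
                $record['lastrevid']
            );
        }

        public function getLatestRevisionId( EntityId $id ) {
            // Good enough for a test rig; one extra HTTP round trip per call.
            return $this->getEntityRevision( $id )->getRevisionId();
        }
    }

Wiring it in would mean swapping it for the getEntityRevisionLookup( 'uncached' ) result in DumpJson::initServices quoted above; as SMalyshev notes at 22:20:04, this costs one request per entity, so it only suits small test runs.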