[00:27:07] Is there a way I can get template + module copied from enwiki?
[04:01:56] I want to add an item but the 'subsets' already exist and they are publications describing the item
[04:02:23] The item I want to add is Ovarian remnant syndrome
[04:03:49] (...fifty people in the chat room and no one is conscious?)
[04:04:39] Well, if you see this and know the answer or can fix it, leave a note on my talk page on the enWikipedia. My username is Barbara (WVS)
[09:41:53] Am I thinking properly here...? https://www.wikidata.org/wiki/Wikidata_talk:WikiProject_Taxonomy#Deprecated_status
[09:42:14] sjoerddebruin: Mind looking at my comments and seeing if I'm making any sense...?
[09:42:40] Ugh, taxa...
[09:43:46] Yeah, well, this is more about Wikidata's function/use of ranks than it really is about taxa
[09:44:03] Yeah, I would love ranks instead of this mess.
[11:56:55] https://www.wikidata.org/wiki/Q5705
[11:57:30] The description in Catalan is "State of Western Europe"
[11:59:00] In case a very brave admin wants to fix it...
[12:01:03] Heh. Well, that's clearly not the state item, that is https://www.wikidata.org/wiki/Q138837
[12:01:37] Was the previous one okay? https://www.wikidata.org/w/index.php?title=Q5705&type=revision&diff=584976900&oldid=584064771
[12:01:38] So there's no argument that it should not be changed even if you support independence :)
[12:01:48] sjoerddebruin: hah
[12:02:09] That's "Western Mediterranean country constituted as a region of Spain" or whatnot
[12:02:15] So... eh, a bit better I guess? :p
[12:02:45] Yeah, a bit more right :)
[12:03:18] done
[12:03:18] Whether it is a country is debatable, but at least it specifies it's an item for the autonomous community
[12:03:32] Should I leave a note for the editor?
[12:04:04] Not necessary, I think
[12:04:34] Thanks, sjoerddebruin, reosarevok :)
[12:04:50] np
[12:06:03] ;)
[12:06:16] * reosarevok wonders about P1336 in https://www.wikidata.org/wiki/Q138837
[12:06:43] The territory isn't claimed by the Autonomous Community of Catalonia, is it? It's claimed by Spain and by this item itself
[12:06:52] * reosarevok shrugs
[12:07:06] Not sure how to do that, I guess linking to itself would be odd
[12:13:23] There are discussions about whether Q138837 even exists
[12:13:58] But I don't like politics :)
[12:14:45] Description says "unilaterally declared republic at the Iberian peninsula"
[12:15:09] However, it wasn't declared clearly... or it simply wasn't declared
[12:15:28] https://www.theguardian.com/world/2017/oct/10/catalan-government-suspends-declaration-of-independence
[12:17:08] https://www.thesun.co.uk/news/3970067/catalonia-independence-referendum-latest-result-spain-map/
[12:17:38] Well, the "government in exile" talks of itself as a "government in exile", so there's definitely something they claim to govern. I guess this isn't so much "does this definitely exist" as "are there enough claims that this exists" - Wikipedia/Wikidata doesn't take decisions like those, just documents what others are saying :)
[12:18:08] So as long as there is a claim it exists, we should store it (but not on the autonomous community article)
[12:23:02] Sure, the point is... what is the claim? And what sources support it?
[14:48:29] Does someone know the status of the Wikidata lists project (aka Wikidata Phase 3)? Was it discontinued?
[14:49:23] I read https://www.wikidata.org/wiki/Wikidata:WikiProject_Lists but it is not very clear to me (my English is not very good)
[14:50:04] It still seems to be in the planning stage
[14:53:57] Is there some page or phab task about it that is more up to date?
[14:55:37] I think embedded queries will probably give more results, danilo
[14:56:26] And we have lists like https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Collection/Rijksmuseum_Twenthe :-)
[14:58:23] I didn't know about the embedded queries, is there a doc about it?
[15:02:31] Oh, it is made by ListeriaBot, I already know about that
[15:04:01] I was wondering if it would have a way to automatically generate and update lists without bots
[16:39:30] Hi, I want to query all top-level domains containing two parts, e.g. co.uk. Any idea?
[16:41:23] biddy: I’d probably use the public suffix list for that https://publicsuffix.org/
[16:42:21] So
[16:42:53] Thanks, so this would also be a nice property
[16:42:58] there don’t seem to be any examples of this on Wikidata http://tinyurl.com/ycl8u5sv
[16:43:12] e.g. our TLD for the United Kingdom is just .uk https://www.wikidata.org/wiki/Q145#P78
[16:43:25] (which is correct, too, afaik)
[16:44:32] But all domains in .uk are given out as co.uk; my plan is to extract the domain/host
[16:44:53] biddy: given foo.co.uk you want to extract “foo”
[16:44:57] ?
[16:45:31] (btw, my query was incorrect, here’s a better version: http://tinyurl.com/y9fofrdw)
[16:45:51] But thanks a lot for the suggestions, from what I know Wikidata looks pretty cool. Thanks, community!
[16:48:06] Thanks for the regex query, all new but exciting!
[16:48:39] no problem :)
[16:48:53] but if I understood your problem correctly, the public suffix list is definitely what you want, I think
[16:50:28] May I also ask a question about the relevance of data? I'm wondering about your opinion...
[16:50:52] sure
[16:55:48] Is this relevant for Wikidata?
[17:07:07] multichill: are you aware of http://www.divaantwerp.be/nl/collection/overzicht?
[17:07:29] Not sure if they have paintings though.
[17:13:50] It seems like they don't have a stable URL yet though. The ID differs in English and Dutch.
[18:56:16] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1952 bytes in 0.090 second response time
[19:04:49] Hmmm
[19:05:03] That looks odd. https://grafana.wikimedia.org/dashboard/db/wikidata-dispatch?refresh=1m&orgId=1&from=now-1h&to=now
[19:05:56] yeah
[19:07:22] https://grafana.wikimedia.org/dashboard/db/mediawiki-graphite-alerts?panelId=2&fullscreen&orgId=1
[19:07:41] oh dear
[19:08:33] Did a bot get really busy?
[19:08:55] from the three graphs below “dispatch pending changes”, it looks like the dispatch jobs don’t actually run
[19:09:12] not sure why that would cause the zigzag in the dispatch lag and pending changes
[19:09:17] There are some problems with ORES, but how does that affect dispatch...
[19:10:54] Hmm. https://grafana.wikimedia.org/dashboard/db/wikidata-edits?refresh=1m&orgId=1&from=now-3h&to=now
[19:12:33] Something odd going on with logging?
[19:18:27] #0 /srv/mediawiki/php-1.31.0-wmf.7/extensions/Wikidata/extensions/Wikibase/repo/includes/Store/Sql/SqlChangeDispatchCoordinator.php(529): Wikimedia\Rdbms\LBFactory->waitForReplication(array)
[19:18:29] Replag?
[19:18:58] 28093: Dispatcher exited with 255
[19:19:16] [455bc1bb8b0dd58a51232a74] [no req] Wikimedia\Rdbms\DBReplicationWaitError from line 368 of /srv/mediawiki/php-1.31.0-wmf.7/includes/libs/rdbms/lbfactory/LBFactory.php: Could not wait for replica DBs to catch up to db1070
[19:19:25] hoo: ^
[19:19:39] I don't see any replag... currently at least
[19:19:44] meh
[19:19:50] looking (briefly)
[19:20:05] "host": "db1100",
[19:20:06] "lag": 1944.3836581707
[19:20:13] That's a lot of lag
[19:20:22] Is that all MediaWiki jobs?
[19:20:24] https://dbtree.wikimedia.org/ doesn't agree
[19:20:44] But I trust mw more :P
[19:21:04] MW uses pt-heartbeat
[19:21:06] * Reedy looks at ganglia
[19:21:07] Something doesn't look good on that db server https://grafana.wikimedia.org/dashboard/db/server-board?refresh=1m&orgId=1&var-server=db1100&var-network=eth0&from=now-3h&to=now
[19:21:48] shit
[19:21:52] replication's broken there
[19:22:03] Want me to find a DBA?
[19:22:58] Yeah… wonder why it didn't page yet
[19:23:15] We can set the weight of that server to 0 for now, I guess
[19:23:21] it's "only" vslow
[19:26:54] Editing is also slower.
[19:27:45] that shouldn't be related
[19:28:01] Not just on my side. https://twitter.com/AndreasP_RV/status/932329069747859457
[19:28:19] for bots, yes (they are stopped by maxlag)
[19:28:28] but for everyone else, this should just be fine
[19:28:35] (but there might be some other issue)
[19:28:57] * hoo goes to edit the sandbox
[19:29:37] damn, you're right
[19:29:45] It takes a little over 10s
[19:30:03] so I assume we hit a wait for replication with the default 10s timeout somewhere
[19:30:06] should be easy to find
[19:41:02] Server is being depooled.
[19:42:12] The problem is mostly per design in MediaWiki::preOutputCommit
[19:42:25] I guess this is how it's supposed to be :/
[19:50:04] Hey there, the saving process takes a little longer than normal, server overloaded?
[19:50:29] Some database server was having a hiccup, it's being worked on, Crazy1880_.
[19:51:21] we're good again
[19:52:24] Thanks sjoerddebruin. The search service would then probably have to be updated again, because some numbers do not fit. (Maintenance lists)
[19:52:40] What do you mean?
[19:54:23] I have a search that has been showing the same result for over an hour even though the articles have been updated.
[19:55:04] Is it powered by the query service?
[19:55:41] partly (query service and Cirrus)
[19:56:15] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1948 bytes in 0.117 second response time
[19:57:22] I don't see much lag on the query service. Not sure about Cirrus.
[19:59:13] I'll check back tomorrow morning. Thanks sjoerddebruin. And Cirrus at German Wikipedia is busy, too. Have a nice day, see ya
[19:59:24] Bye!
[20:11:23] in WDQS, is there a simple way to sort items by their order of creation?
[20:26:08] Envlh: not an *easy* way, I don’t think (we only have the date of the last edit as a predicate)
[20:26:32] but I suppose you can SUBSTR() the numeric part of the Q-ID out of the URI, xsd:integer() that, and then sort by that
[20:26:49] yeah, it's what I did :)
[20:26:50] ORDER BY DESC(xsd:integer(SUBSTR(STR(?item), 33)))
[20:26:55] ok :)
[20:27:21] (you could use STRLEN(STR(wd)) instead of 33, just in case the prefix ever changes)
[20:27:30] but it's not straightforward ^^
[20:27:32] (plus or minus one, if I recall correctly)
[20:28:06] hmm, how so?
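A minimal sketch of a complete WDQS query built around the ORDER BY expression above; the example class (house cat, wd:Q146) and the LIMIT are arbitrary illustration choices, not part of the discussion. Newer items get higher Q-numbers, so sorting on the numeric part of the entity IRI approximates creation order.

```sparql
# Sort items by (approximate) order of creation, newest first, by
# extracting the numeric part of the Q-ID from the entity IRI.
# Position 33 skips the 31-character prefix "http://www.wikidata.org/entity/"
# plus the leading "Q"; SPARQL's SUBSTR is 1-based.
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q146 .   # example class for illustration: house cat
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
ORDER BY DESC(xsd:integer(SUBSTR(STR(?item), 33)))
LIMIT 50
```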
[20:28:12] oh, sorry, wd:
[20:28:13] not wd
[20:28:24] (which expands to http://www.wikidata.org/entity/)
[20:28:30] oh ok, nice
[20:37:20] aaand it's +2 :)
[20:38:19] (with +1, you still have the Q character)
[20:39:46] ah, of course :D
[21:07:23] sjoerddebruin: No, don't know it. The only thing I did in Belgium was the Flemish project with Sandra
[21:07:59] But they need to provide stable URLs first. Seems like even the creator URLs are connected with objects...
[21:09:05] Stable-ish ;-)
[21:09:13] They don't seem to have paintings
[21:10:23] sjoerddebruin: Anyway, if you ever find a nice collection. Notes are at https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Location
[21:10:29] I know, I know.
[21:11:01] Did you notice that noclaims dropped to 0.6%? ;)
[21:23:08] Nice! When are we going to hit 0%?
[21:24:13] So how is the new Da Vinci sale reflected in the data? :)
[21:24:16] Maybe if I get fired or something.
[21:25:50] We're now on 10 Jan 2013.
[21:48:10] So, only 3 months left?
[21:48:32] Well, that's the bottom item. Still a lot of challenges on the whole page.
[21:49:22] We'll see, nn
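Coming back to the earlier top-level-domain question (around 16:42): the queries exchanged there were only shared as shortened links, so the following is a reconstructed sketch of that kind of regex query. It assumes a two-part suffix such as co.uk would show up in the English label of an item used as a value of P78 (top-level Internet domain); as noted in the chat, Wikidata generally records only the single-part ccTLD (e.g. .uk), so this is expected to return few or no results.

```sparql
# Look for items used as top-level Internet domain (P78) values whose
# English label contains two dot-separated parts, e.g. ".co.uk".
SELECT DISTINCT ?tld ?tldLabel WHERE {
  ?country wdt:P78 ?tld .
  ?tld rdfs:label ?tldLabel .
  FILTER(LANG(?tldLabel) = "en")
  FILTER(REGEX(STR(?tldLabel), "^\\.[a-z]+\\.[a-z]+$"))
}
```

For actually splitting hostnames like foo.co.uk outside of Wikidata, the public suffix list mentioned in the chat remains the more robust tool, since it enumerates which multi-part suffixes exist.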