[01:26:15] I have two bot proposals up. Comments are appreciated. https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/Citationgraph_bot https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/NIOSH_bot
[15:30:52] Technical Advice IRC meeting starting in 30 minutes in channel #wikimedia-tech, hosts: @addshore & @Christoph_Jauera_(WMDE) - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[18:48:29] Almost all objects on my watchlist are marked as "very likely have problems". Is there a known bug (new - everything worked fine until today), or is it more likely a problem with my account/preferences?
[18:55:40] addshore: hi! so what is the status of wmf.8 on wikidata - is it going to happen?
[19:25:36] Mirer: in what language?
[19:32:58] German
[19:33:58] it stays the same after changing to English ...
[19:39:52] One example where nothing strange happened is "Dan Fouts" (Q1159065). Even after updating, it is marked.
[19:41:56] Strange ... looks like my edits are the reason for the changes. Henry Jordan (Q28771259) was one of the few that weren't marked. After updating, it is marked too.
[19:42:15] Could somebody have marked my changes as "suspicious"?
[19:57:42] {"mainsnak": {"snaktype": "value","property": "P373","hash": "bb5854eac678fa3c5850e3c6d040214e9f33ec78","datavalue": {"value": "Lepidoptera","type": "string"}},"type": "statement","id": "q28319$6212B332-2450-46E2-98C1-58FAE9B14831","rank": "normal"}
[19:58:00] this is the JSON part of a Commons category (P373) statement
[19:58:14] I want to add some Commons categories on Wikidata for butterflies I uploaded via a script
[19:58:24] but does someone know how I generate those hashes?
[19:59:47] or can I omit them, and does Wikidata generate them?
[20:06:00] Rudolphous: Wikidata generates them, and unfortunately it's not easy to generate them externally (see https://phabricator.wikimedia.org/T167759), but I do not think you need them when editing; they will be generated by the code
[20:09:44] thanks!
[20:17:51] PROBLEM - High lag on wdqs2002 is CRITICAL: CRITICAL: 36.67% of data above the critical threshold [1800.0]
[20:20:51] RECOVERY - High lag on wdqs2002 is OK: OK: Less than 30.00% above the threshold [600.0]
[20:44:36] does anyone here know about editions?
[20:44:56] Q3331189
[21:05:25] * addshore taps out
[21:05:36] SMalyshev: wikidata.org will be on .8 with the regular train now :)
[21:16:01] addshore: great, thanks!
[21:46:02] I cannot test my bot on Wikidata: "Direct editing is disabled in namespace"
[21:46:16] I use this page: https://www.wikidata.org/wiki/Wikidata:Sandbox
[21:46:26] can my rights be changed, or is there another page for testing?
[21:47:30] I use RudolphousBot for this, btw
[22:25:33] SMalyshev: do you know how many Blazegraph edges are generated per Wikidata statement? Or rather, how things on Wikidata translate over to Blazegraph?
[22:26:55] yannf: an edition is a specific published version of a book. As originally intended, you have "work" to describe the abstract notion of a given work, and "edition" to note the first edition, second edition, etc. When citing a book, it's generally preferred that you cite the edition, since you are citing a particular edition of the book, not the abstract notion of it
[22:27:34] and "book" is used to describe actual physical books
[22:28:37] harej, I know that
[22:28:44] but nvm, I found out
[22:28:49] okay
[22:29:01] the answer is P1433
[22:30:02] wikidata list just doesn't work with properties, right?
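Regarding the snak-hash question at 19:57-20:06 above: a minimal sketch of what such a hash-less edit could look like with pywikibot. The item ID Q28319 and the value "Lepidoptera" come from the JSON pasted in the log; the edit summary is illustrative, and this is not RudolphousBot's actual code.

    import pywikibot

    # Connect to the Wikidata repository.
    site = pywikibot.Site("wikidata", "wikidata")
    repo = site.data_repository()

    # Q28319 is the item from the pasted snak; P373 is "Commons category".
    item = pywikibot.ItemPage(repo, "Q28319")
    claim = pywikibot.Claim(repo, "P373")
    claim.setTarget("Lepidoptera")

    # No hash is supplied anywhere: the server computes snak hashes itself
    # when the claim is saved, as noted in the 20:06:00 message above.
    item.addClaim(claim, summary="Add Commons category (P373)")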
[22:50:04] harej: Do you want the RDF, or the actual internal storage within Blazegraph (which has indexes on top, not just the triples we dump into it)?
[22:51:00] the internal storage within Blazegraph - anything that has a consequence on how much RAM and disk space Blazegraph uses
[23:08:27] harej, my question was about collections of several small works
[23:09:12] i.e. how to list them in the larger work? and how to link them to the collection?
[23:09:34] there are two properties, P361 and P1433
[23:09:43] harej: https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format But for internal usage… no idea, really
[23:09:44] which confused me
[23:10:10] I could tell how much space we use now, but I'm not sure that's very helpful
[23:15:22] That number combined with the number of RDF triples would be useful, as well as the amount of RAM being used, if available
[23:16:15] 3908741036 triples
[23:17:03] $ ll -h /srv/wdqs/wikidata.jnl
[23:17:03] -rwxrwxrwx 1 smalyshev wikidev 310G Nov 15 23:16 /srv/wdqs/wikidata.jnl
[23:18:25] blazegr+ 24071 250 29.0 54421612 38352372 ? Ssl Nov14 4441:42 java -server …
[23:18:31] that's the service
[23:18:39] blazegr+ 24288 12.6 1.5 9008604 2037880 ? Ssl Nov14 224:45 java …
[23:18:43] that's the updater
[23:18:49] harej: ^
[23:19:06] the service has some more triples (categories), not "just" WD
[23:19:24] Still, that's very useful. Thank you
[23:19:28] so the above numbers are slightly off for a Wikibase-only use case
[23:19:50] (Which number there is the RAM usage?)
[23:20:09] It's USER PID %CPU %MEM VSZ RSS
[23:21:27] So, 29% and 1.5% of what, exactly?
[23:22:13] the box has 128 GB of RAM and probably 24 cores, let me check
[23:22:43] 16 cores (32 virtual, w/ HT)
[23:23:40] This is all very useful. Thank you.
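A quick sanity check of the ps numbers quoted above, as a sketch: it assumes ps reports RSS in KiB (the Linux default) and takes the 128 GB RAM figure from the log at face value.

    # RSS is resident memory in KiB; %MEM is that RSS as a share of physical RAM.
    total_ram_kib = 128 * 1024 ** 2      # 128 GB box, per the 23:22:13 message

    service_rss_kib = 38352372           # RSS column of the Blazegraph service
    updater_rss_kib = 2037880            # RSS column of the updater

    for name, rss in [("service", service_rss_kib), ("updater", updater_rss_kib)]:
        gib = rss / 1024 ** 2
        pct = 100 * rss / total_ram_kib
        print(f"{name}: {gib:.1f} GiB resident ({pct:.1f}% of RAM)")

    # Output: service: 36.6 GiB resident (28.6% of RAM)  ~ the 29.0 in %MEM
    #         updater:  1.9 GiB resident  (1.5% of RAM)  ~ the  1.5 in %MEM

So the service was holding roughly 37 GiB of the box's RAM resident, the updater about 2 GiB, consistent with the %MEM column in the pasted ps lines.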