[05:20:27] I'm so sick of cleaning up after badly done mass-edits
[05:21:47] especially when the person who did it won't do it themselves
[09:45:45] nikki: who won't clean up?
[09:45:56] they should be barred from doing mass edits if they aren't going to clean up.....
[09:46:00] IMO
[09:46:00] :p
[14:25:37] addshore: I'm currently cleaning up a load of descriptions Mr. Ibrahem added with their bot last year. I pointed out four months ago that there's still lots to clean up and haven't had a response... also they did change some of them last year, but some of the changes still weren't right
[14:28:16] but they're hardly the only person who does things like that... in June someone added a load of bad P31 statements with QuickStatements and has since left Wikidata, so that was another thing I ended up cleaning up myself
[14:29:29] and even Magnus has bots which re-add bad data after we remove it
[14:35:04] there was another person last year around the same time who was mass-adding bad descriptions... oh, and the person who eventually did get blocked for a variety of reasons who was...
[14:36:08] I guess if we had automatic descriptions, that would help stop people from wanting to mass-add descriptions in languages they don't speak
[14:40:47] it would also be nice to have an easy way for bots to keep track of things they've already done, so if someone removes or changes a value, the bot doesn't re-add it
[14:41:42] as far as I know, if someone wants that, they would have to implement their own tracking from scratch
[14:59:49] Lucas_WMDE: do you have any idea why https://tinyurl.com/y836uhl6 doesn't work?
(and more to the point, how I can make it work :P)
[15:00:21] it seems to be the count that fails, but I don't know why
[15:00:54] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @Thiemo_WMDE & @CFisch_WMDE - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:01:53] nikki: COUNT(1) works
[15:01:56] o_O
[15:02:07] no idea why
[15:02:11] huh
[15:02:14] thanks
[15:02:41] COUNT("wtf BlazeGraph") also works, for that matter, if you want to abuse it for a comment ;)
[15:03:00] but ?item, ?desc or * don't work
[15:11:45] hah
[15:12:36] > it would also be nice to have an easy way for bots to keep track of things they've already done, so if someone removes or changes a value, the bot doesn't re-add it
[15:12:47] yup, nothing like that exists, but, hmm, that is an interesting idea
[15:18:26] "bot-go-away: true" :D
[15:50:54] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @Thiemo_WMDE & @CFisch_WMDE - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[17:06:02] nikki: how about I guide you through inviting me to that channel then? :P
[17:06:14] sure
[17:06:23] yurik_: I saw you have an OSM wikibase working?
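The COUNT workaround discussed above can be sketched as follows. This is a minimal illustration, not official guidance: the chat only establishes that on the Blazegraph-backed query service, COUNT over a constant such as `COUNT(1)` succeeded where `COUNT(?item)` and `COUNT(*)` did not for this particular query. The helper function and the example triple pattern are hypothetical.

```python
# Sketch of the workaround: build the SPARQL aggregate with a constant
# inside COUNT, since COUNT(1) reportedly worked where COUNT(?item) and
# COUNT(*) failed on this Blazegraph instance.

def count_query(pattern: str, count_expr: str = "1") -> str:
    """Build a SPARQL COUNT query; count_expr defaults to the constant 1."""
    return (
        "SELECT (COUNT(%s) AS ?count) WHERE {\n"
        "  %s\n"
        "}" % (count_expr, pattern)
    )

# hypothetical triple pattern, for illustration only
query = count_query("?item wdt:P31 wd:Q5 .")
```

As noted later in the chat, any constant works (even a string literal), which can double as an in-query comment.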
:D
[17:07:06] addshore, not yet, but working on it
[17:07:14] nikki: something like /invite addshore #channel
[17:07:43] * nikki did that
[17:09:44] addshore, ah, sorry, confused it with my current project - yes, the OSM wikibase has been up for about 2 months now
[17:09:57] i'm working on importing all that data into Sophox (wdqs)
[17:10:06] * CFisch_WMDE randomly drops a randomly found question from Discourse: https://discourse-mediawiki.wmflabs.org/t/how-to-upgrade-mediawiki-after-a-dockerized-wikibase-run/938
[17:10:30] (it's about Wikibase updates in a dockerized Wikibase)
[17:11:33] CFisch_WMDE: oooh
[17:15:16] nikki: has there ever been any discussion about identifying statements that probably just don't need a reference?
[17:16:35] not that I'm aware of
[17:16:36] addshore, do you know how to configure namespaces for the wdqs export?
[17:18:36] the wdqs export? as in dumpRdf? or something else?
[17:18:41] we do now have https://www.wikidata.org/wiki/Property:P5852, https://www.wikidata.org/wiki/Property:P3452 and https://www.wikidata.org/wiki/Property:P887 which can be used for some things where "normal" references aren't needed
[17:19:33] nikki: thanks
[17:26:09] for some things there's nothing that really works though... I'd like to replace "imported from Wikipedia" for external ids, but there's often nothing I can replace it with
[17:26:27] i kind of feel like we need something
[17:28:43] I don't like using "stated in" as a self-reference for external ids unless it actually claims that the id belongs to the Wikidata item, because how does anyone verify that?
[18:36:49] nikki: any idea which item has the most statements? :P
[18:37:22] or rather, the item with the most statements with different entity values
[18:37:22] heh
[18:37:35] the article with a zillion authors?
[18:38:09] as long as the statements are for entities :D
[18:38:09] I tend to avoid large items, my computer doesn't like them :P
[18:38:17] * addshore looks at long pages
[18:38:22] yeah, that's a good place to look
[18:38:42] it gets grumpy enough just loading country items
[18:39:15] https://www.wikidata.org/wiki/Q30407207 is the biggest, it loads pretty fast for me now actually, a few seconds
[18:40:33] your computer is probably less temperamental than mine :P
[18:40:48] I really need to get a new hard drive and do a new install
[18:41:06] addshore: loading, yes, but rendering once you scroll down is slow on my PC.
[18:41:49] yupp
[18:41:59] well, at least it loads, I remember when it didn't :P
[20:13:29] jar:file:///tmp/jetty-localhost-9999-blazegraph-service-0.3.1-SNAPSHOT.war-_bigdata-any-1469619187860440480.dir/webapp/WEB-INF/lib/bigdata-runtime-2.1.5-SNAPSHOT.jar!/com/bigdata/rdf/sail/sparql/ast/SyntaxTreeBuilderConstants.class, jar:file:///tmp/jetty-localhost-9999-blazegraph-service-0.3.1-SNAPSHOT.war-_bigdata-any-1469619187860440480.dir/webapp/WEB-INF/lib/sparql-grammar-2.1.5-SNAPSHOT.jar!/com/bigdata/rdf/sail/sparql/ast/SyntaxTreeBuilderConstants.
[20:13:32] class
[20:13:39] oops, sorry
[20:14:05] Don't worry :)
[20:15:44] Sometimes I configure my terminal as read-only to avoid this, it happens to me too often
[21:56:55] addshore: coincidentally, today someone else was annoyed at Magnus's bots re-adding bad data - https://www.wikidata.org/wiki/Topic:Uodrphi2rhc499ng - and the impression I get there is that he's not going to do anything about it :/
[21:57:49] I thought the bot respected rollbacks?
[21:58:29] I have no idea how it works, but it shouldn't even be re-adding the data in the first place
[21:58:47] it was added from mix'n'match, so it should already know that it added that data already
[22:00:30] and as far as I can tell, mix'n'match even keeps a record of the actions people do
[22:03:06] I'm glad I'm not the only one who gets frustrated at it re-adding bad data though... I sometimes wonder why I bother trying to fix things when mix'n'match thinks it knows better
[22:04:09] * abian shares nikki's frustration
[22:04:21] yay, sort of
[22:04:27] Maybe an abuse filter could prevent these re-edits?
[22:04:51] I'm not sure how
[22:05:00] I've never seen one automatically detect edit wars, so I'm not sure either
[22:07:02] I don't even understand what the sync is for
[22:07:39] In case you add IDs manually on Wikidata (?)
[22:09:07] but that's the opposite direction
[22:30:29] nikki: what are the edits? mainly claims or descriptions or?
[22:30:55] external ids
[22:31:18] it's almost like you need another database full of data that shouldn't be re-added :P
[22:32:22] haha pretty much :P
[22:32:51] Or look at page history...
[22:33:13] addshore: Do you know what's wrong with ?
[22:34:35] *looks*
[22:34:44] hah, no
[22:34:49] :)
[22:35:07] i rebased it, let's see what it does now
[23:00:31] Happy birthday, sjoerddebruin :)
[23:00:45] abian: <3
[23:09:33] I think today is the birthday of another Wikimedian on this channel, but I'm not sure why I have that data nor whether that Wikimedian made the date public :/
[23:10:15] addshore: -1 again :S
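The "don't re-add what a human removed" idea raised twice above (and the joking "bot-go-away: true" flag) could look something like this. The chat confirms nothing like it exists, so everything here is a hypothetical sketch: the class name, storage, and the `(item, property, value)` key are all assumptions, and a real bot (e.g. one built on Pywikibot) would need to persist the set between runs.

```python
# Hypothetical sketch: a ledger of claims a bot has already added, so that
# a later absence of the claim is treated as a deliberate human removal
# rather than a gap the bot should refill (avoiding the re-add loop
# complained about in the chat).

class EditLedger:
    """Remembers (item, property, value) triples the bot has added before."""

    def __init__(self):
        self._added = set()

    def record_addition(self, item: str, prop: str, value: str) -> None:
        """Call this whenever the bot writes a new claim."""
        self._added.add((item, prop, value))

    def may_add(self, item: str, prop: str, value: str) -> bool:
        """False if we added this exact claim before: assume any absence
        now is someone's rollback or removal, and do not re-add it."""
        return (item, prop, value) not in self._added


ledger = EditLedger()
ledger.record_addition("Q42", "P214", "113230702")  # hypothetical claim

# On a later run the claim is gone from the item, but the ledger knows the
# bot added it once, so it skips re-adding instead of edit-warring:
assert not ledger.may_add("Q42", "P214", "113230702")
assert ledger.may_add("Q42", "P214", "999")  # genuinely new value is fine
```

The design choice is that removal is a veto, not an error: the ledger never distinguishes "removed by a human" from "lost some other way", which is deliberately conservative for exactly the mix'n'match-style re-sync problem described above.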