[17:19:58] just tried to create items with PetScan, but nothing happened - has anyone else observed this problem?
[17:20:32] I don't have much experience with PetScan yet
[17:34:10] it's similar to AutoList, if you're familiar with that
[20:28:20] hello! I want to get an idea of the active users on Wikidata by querying for all users who made an edit in the recent past - which table should I use for edits, revision or wb_changes?
[22:47:51] Hi, can I ask a technical question?
[22:48:09] I'm looking to derive a list of interwikis between two wikis (enwiki and nlwiki)
[22:48:28] but I can't quite figure out the easiest way to do this
[22:49:04] I tried using the Wikidata API, but crawling over all entities would take something like a month
[22:49:36] downloading the JSON dump from Wikidata also caused some issues, since its size is above 100GB
[22:49:50] I'd think there would be an easier way to achieve this
[22:50:01] basvb: We have gzipped and bzip2 versions of the dumps
[22:50:05] using those should be fine
[22:50:22] they're at about 4 and 6 GiB each
[22:50:30] https://dumps.wikimedia.org/wikidatawiki/entities/
[22:50:31] and then leaving them zipped while running over them in Python?
[22:50:37] yeah
[22:51:01] yes, those were the ones I was trying to unpack
[22:51:29] thanks for the tip, I'll see how that works
[22:51:34] hey basvb :)
[22:51:39] Hi Sjoerd
[22:51:50] Shouldn't you be in bed? :O
[22:52:16] naah, I've shifted my rhythm a bit
[22:52:23] just working on my thesis for a while longer
[22:52:37] another thesis meeting on Tuesday, so I need to finish some things
[22:53:56] ahhh
[23:04:10] thanks a lot hoo, I have it working - it won't take weeks but rather hours
[23:04:53] That sounds reasonable :)
[23:05:54] yep, hopefully the last big bump in the road to my full model
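The approach discussed above (iterating over the compressed entities dump in Python without unpacking it) can be sketched roughly as follows. This is a minimal sketch, not basvb's actual script: the function name and the `dump_path` argument are placeholders, and it assumes the standard layout of the Wikidata JSON entities dump, where the file is one large JSON array with a single entity per line.

```python
import bz2
import json


def interwiki_pairs(dump_path, wiki_a="enwiki", wiki_b="nlwiki"):
    """Stream a bzip2-compressed Wikidata JSON entities dump and yield
    (title on wiki_a, title on wiki_b) for every item that has a
    sitelink on both wikis.

    The dump is a JSON array with one entity per line, so we can parse
    it line by line instead of loading ~100 GB of uncompressed JSON.
    """
    with bz2.open(dump_path, mode="rt", encoding="utf-8") as fh:
        for line in fh:
            # Strip the trailing comma that separates entities,
            # and skip the "[" / "]" lines wrapping the array.
            line = line.strip().rstrip(",")
            if not line or line in ("[", "]"):
                continue
            entity = json.loads(line)
            sitelinks = entity.get("sitelinks", {})
            if wiki_a in sitelinks and wiki_b in sitelinks:
                yield sitelinks[wiki_a]["title"], sitelinks[wiki_b]["title"]
```

A run over a full dump would then look like `for en, nl in interwiki_pairs("wikidata-latest-all.json.bz2"): ...` (hypothetical filename). Because `bz2.open` decompresses on the fly and the generator never holds more than one entity in memory, this finishes in hours rather than the weeks a per-entity API crawl would take.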