[00:08:56] so basically, I'm using it to grab data from this page - https://www.snpedia.com/index.php/Rs2235544
[00:09:01] I'm able to grab all the things like orientation, stabilized, chromosome, etc - but how can I get the data inside the little table at the top using that API?
[00:09:06] like, here's an example of what I'm currently doing: https://bots.snpedia.com/api.php?action=askargs&conditions=Rs28942072&printouts=Orientation|StabilizedOrientation&format=json
[00:10:34] FoxT:
[19:32:39] eskimo: Take https://bots.snpedia.com/api.php?action=askargs&conditions=Rs28942072&printouts=-Rsnum&format=json and loop through the results
[19:34:17] FoxT: eek, but adding an extra 3 requests for each request is gonna be bad
[19:51:23] eskimo: You could use the ask API module and get it all at once, but then your code will have to sort out the resulting mess: https://bots.snpedia.com/api.php?action=ask&query=[[Rs28942072]]%20OR%20[[Rsnum::Rs28942072]]|?Orientation|?StabilizedOrientation|?Genotype|?Magnitude|?Summary&format=json
[19:54:00] oh hey, that's pretty good
[19:54:17] is there a way to do stuff in like super super bulk?
[19:54:30] rather than killing the wiki with, say, 10k requests in a loop
[19:55:10] You could probably query for a category
[19:59:39] w0y_someone: AFAIK, no. If at all possible I avoid stuff like {{#show: {{PAGENAME}} | ... }}
[20:02:05] FoxT: so could I do like, [[Category: Is a snp]]
[20:02:08] and get all that data back?
[20:05:30] I guess so. Well, you will run into limits at some point, I guess. Add an offset to get the next pages.
[20:09:17] Well, my goal is to get it in the least amount of time/requests possible
[20:09:36] I don't wanna make my script take forever, and obviously don't wanna pound their servers too hard
[20:13:27] You can up the number of returned rows with the limit= parameter, but there is an upper threshold to that as well, beyond which you will not get additional values.
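(The combined `ask` request suggested above can be built in a script instead of by hand. A minimal Python sketch - the endpoint, the `[[Rsid]] OR [[Rsnum::Rsid]]` condition, and the printout names come from the URLs in the log; the helper name and defaults are my own:)

```python
from urllib.parse import urlencode

API = "https://bots.snpedia.com/api.php"

def build_ask_url(rsid, printouts=("Orientation", "StabilizedOrientation",
                                   "Genotype", "Magnitude", "Summary")):
    """Build one SMW `ask` request that matches a page either by name or
    by its Rsnum property and returns several printouts at once,
    instead of issuing one askargs request per property."""
    condition = "[[{0}]] OR [[Rsnum::{0}]]".format(rsid)
    query = condition + "".join("|?" + p for p in printouts)
    return API + "?" + urlencode({"action": "ask",
                                  "query": query,
                                  "format": "json"})

# One request for all five printouts of a single SNP:
url = build_ask_url("Rs28942072")
```

(Your code still has to "sort out the resulting mess", as eskimo says: the results come back keyed by page name, with each printout as a list.)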
I don't know if there is a way to find that threshold other than asking them.
[20:14:53] I'd have to experiment a bit to see if it's practical
[20:15:09] thanks for the help though. definitely much better than the ghetto way I was planning on trying lol
[20:15:13] Or you could assume that they know what they are doing, set some limit of 10000 or so, and then let them throttle you down to whatever they are comfortable with. The result contains a query-continue-offset you can use to find the correct continuation offset.
[20:15:41] yeah, I've used that offset for another portion of the script already
[20:15:50] pretty convenient
[20:15:58] I hate when APIs make you try to keep track yourself
[20:17:45] "They" is Mike Cariaso, by the way. He's a nice guy, so just send him a mail if unsure what load the server can handle.
[20:18:46] oh, you know him?
[20:19:21] SNPedia is one of the earliest SMW showcases, and he has been to some of the conferences.
[20:19:39] cool cool
[20:19:46] it's a great resource
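(The limit/offset pattern described above - ask for a big page, then follow the server's query-continue-offset until it stops sending one - could look roughly like this in Python. The `query-continue-offset` key and the `limit=`/`offset=` query parts are from the log; `fetch_json` is a hypothetical callable you'd supply, e.g. a thin wrapper over your HTTP client, injected here so the paging logic is testable without hitting the wiki:)

```python
from urllib.parse import urlencode

API = "https://bots.snpedia.com/api.php"

def iter_ask_results(fetch_json, condition, limit=500):
    """Yield successive result pages of an SMW `ask` query, following
    the server's continuation offset until none is returned.

    fetch_json: callable taking a URL and returning the parsed JSON
    response (injected so this generator can be tested offline)."""
    offset = 0
    while True:
        url = API + "?" + urlencode({
            "action": "ask",
            "query": "{}|limit={}|offset={}".format(condition, limit, offset),
            "format": "json",
        })
        data = fetch_json(url)
        yield data.get("query", {}).get("results", {})
        # SMW signals that more rows exist with a continuation offset;
        # when it is absent, we have everything.
        offset = data.get("query-continue-offset")
        if not offset:
            break
```

(Usage would be something like `for page in iter_ask_results(my_fetch, "[[Category:Is a snp]]|?Magnitude"): ...` - the server throttles the effective limit down to whatever it is comfortable with, and the loop simply follows along.)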