[03:28:08] PROBLEM - WDQS SPARQL on wdqs1006 is CRITICAL: CRITICAL - Socket timeout after 10 seconds https://wikitech.wikimedia.org/wiki/Wikidata_query_service/Runbook
[03:32:13] Lucas: many thanks. I'll investigate further tomorrow.
[03:43:30] RECOVERY - WDQS SPARQL on wdqs1006 is OK: HTTP OK: HTTP/1.1 200 OK - 689 bytes in 1.068 second response time https://wikitech.wikimedia.org/wiki/Wikidata_query_service/Runbook
[23:22:14] I am having a horrible time attempting to automate data entry into my Wikibase installation.
[23:22:34] I am able to get the information into CSV format from PowerShell.
[23:23:00] But I cannot seem to figure out how the developers imagined people who don't develop for MediaWiki would dynamically load data.
[23:23:15] QuickStatements is a no-go because I wasn't able to install it via Docker.
[23:23:28] And I cannot seem to find any info on installing it outside of Docker.
[23:32:22] Hey all, can anyone give me some advice on how to automate the import of a CSV file full of data?
[23:32:30] Into Wikibase.
[23:33:44] I chose this platform because it looked like there were many avenues of data integration, but after spending the time to create the Wikibase installation, it seems most of the data integration options are irrelevant, undocumented, or impossible to use.
[23:33:58] Any advice would be greatly appreciated.
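
Below is a minimal sketch of one way to automate a CSV import like the one asked about above, using only the MediaWiki Action API that every Wikibase installation ships with (no QuickStatements or Docker required). The API endpoint URL, the bot-password credentials, the CSV file name, and its column names ("label", "description") are assumptions for illustration and must be adapted to the local installation; it is not a drop-in solution.

```python
#!/usr/bin/env python3
"""Sketch: create one Wikibase item per CSV row via the MediaWiki Action API.

Assumptions (adjust for your installation):
  - the wiki's API endpoint is http://localhost/w/api.php
  - a bot password was created via Special:BotPasswords
  - items.csv has "label" and "description" columns
"""
import csv
import json
import requests

API = "http://localhost/w/api.php"   # assumed endpoint
BOT_USER = "ImportBot@csv-import"    # assumed bot-password login name
BOT_PASS = "bot-password-here"       # assumed bot-password secret

session = requests.Session()

# 1. Fetch a login token, then log in with the bot password.
login_token = session.get(API, params={
    "action": "query", "meta": "tokens", "type": "login", "format": "json",
}).json()["query"]["tokens"]["logintoken"]

session.post(API, data={
    "action": "login", "lgname": BOT_USER, "lgpassword": BOT_PASS,
    "lgtoken": login_token, "format": "json",
})

# 2. Fetch a CSRF token for the subsequent edits.
csrf_token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

# 3. Create one new item per CSV row with an English label and description.
with open("items.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        entity = {
            "labels": {"en": {"language": "en", "value": row["label"]}},
            "descriptions": {"en": {"language": "en", "value": row["description"]}},
        }
        resp = session.post(API, data={
            "action": "wbeditentity", "new": "item",
            "data": json.dumps(entity),
            "token": csrf_token, "format": "json",
        }).json()
        print(resp.get("entity", {}).get("id"), row["label"])
```

Statements (claims) can be added to the same wbeditentity payload or with follow-up wbcreateclaim calls; higher-level libraries such as Pywikibot or WikibaseIntegrator wrap these same API calls if hand-rolled requests become unwieldy.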