[05:05:34] OpenStreetMappers + Wikidata - please help document and fix - https://wiki.openstreetmap.org/wiki/Wikidata_RDF_database
[05:05:42] CC: aude ^
[09:52:25] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1949 bytes in 0.322 second response time
[10:00:27] Hello, I'm looking for help with processing Excel data on corporations and countries: complex financial data spanning 10 years. I'll share all the data for this one task!
[10:19:04] PROBLEM - High lag on wdqs1001 is CRITICAL: CRITICAL: 31.03% of data above the critical threshold [1800.0]
[10:21:06] PROBLEM - High lag on wdqs1001 is CRITICAL: CRITICAL: 37.93% of data above the critical threshold [1800.0]
[10:25:25] PROBLEM - High lag on wdqs1002 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [1800.0]
[10:28:23] hi folks! I wonder if there's a way to import Wikidata properties along with their corresponding PID to my local MediaWiki? (e.g. instance of should be imported as P31 instead of P2)
[10:29:25] PROBLEM - High lag on wdqs2002 is CRITICAL: CRITICAL: 31.03% of data above the critical threshold [1800.0]
[10:31:14] PROBLEM - High lag on wdqs1003 is CRITICAL: CRITICAL: 30.00% of data above the critical threshold [1800.0]
[10:31:25] PROBLEM - High lag on wdqs2001 is CRITICAL: CRITICAL: 40.00% of data above the critical threshold [1800.0]
[10:33:25] PROBLEM - High lag on wdqs2003 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [1800.0]
[10:35:25] PROBLEM - High lag on wdqs2003 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [1800.0]
[10:50:26] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1946 bytes in 0.130 second response time
[11:30:31] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1931 bytes in 0.175 second response time
[12:35:02] ACKNOWLEDGEMENT - High lag on wdqs1001 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0] Gehel updater is catching up after high edit rate
[12:35:03] ACKNOWLEDGEMENT - High lag on wdqs1002 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0] Gehel updater is catching up after high edit rate
[12:35:03] ACKNOWLEDGEMENT - High lag on wdqs1003 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0] Gehel updater is catching up after high edit rate
[12:35:04] ACKNOWLEDGEMENT - High lag on wdqs2001 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0] Gehel updater is catching up after high edit rate
[12:35:04] ACKNOWLEDGEMENT - High lag on wdqs2002 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0] Gehel updater is catching up after high edit rate
[12:35:05] ACKNOWLEDGEMENT - High lag on wdqs2003 is CRITICAL: CRITICAL: 96.67% of data above the critical threshold [1800.0] Gehel updater is catching up after high edit rate
[12:56:42] RECOVERY - High lag on wdqs2003 is OK: OK: Less than 30.00% above the threshold [600.0]
[13:09:41] RECOVERY - High lag on wdqs2002 is OK: OK: Less than 30.00% above the threshold [600.0]
[13:11:41] RECOVERY - High lag on wdqs2001 is OK: OK: Less than 30.00% above the threshold [600.0]
[13:19:22] RECOVERY - High lag on wdqs1003 is OK: OK: Less than 30.00% above the threshold [600.0]
[13:42:41] RECOVERY - High lag on wdqs1002 is OK: OK: Less than 30.00% above the threshold [600.0]
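On the 10:28 question above about importing Wikidata properties into a local MediaWiki with their original PIDs: stock Wikibase numbers new entities from an incrementing counter, so a fresh wiki hands out P1, P2, ... regardless of the source ID; keeping P31 as P31 means creating properties in ascending ID order (filling any gaps with placeholders) or using an import tool such as the WikibaseImport extension. A minimal Python sketch of the fetch side only, using the standard Special:EntityData endpoint; fetch_property is an illustrative name, not part of any API:

```python
import requests

def fetch_property(pid):
    """Fetch the full JSON definition of a Wikidata property, e.g. 'P31'."""
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{pid}.json"
    data = requests.get(url, timeout=30).json()
    return data["entities"][pid]

prop = fetch_property("P31")
print(prop["labels"]["en"]["value"])  # -> "instance of"
print(prop["datatype"])               # -> "wikibase-item"
```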
[13:56:12] RECOVERY - High lag on wdqs1001 is OK: OK: Less than 30.00% above the threshold [600.0]
[17:22:33] SMalyshev, is there an easy way for me to change the munger so that it converts %20 -> '_'?
[17:22:56] (sitelinks don't use '_' atm)
[17:24:41] well yes, add code to the Munger class that does the conversion :)
[17:26:41] SMalyshev, can you think of an easy query that would do the same? https://wiki.openstreetmap.org/wiki/Wikidata_RDF_database
[17:27:07] e.g. -- I need to find OSM objects whose wikidata tag doesn't match the Wikipedia sitelink
[17:31:44] hmm, matching sitelinks may not work until the encoding is fixed
[17:32:07] also, does "wikipedia" always mean English Wikipedia?
[17:32:12] nope
[17:32:27] hmm
[17:32:33] the "wikipedia" tag is already language-converted to a sitelink format
[17:32:41] except that my conversion uses _
[17:33:15] well, you could do string replacement I guess, but there are other encoding differences
[17:33:41] or you wait a couple of weeks until the encoding is fixed :)
[17:35:12] heh, I guess that's the best approach... Lydia_WMDE, do we need to wait one month for https://phabricator.wikimedia.org/T131960 ?
[21:25:10] ... https://www.wikidata.org/w/index.php?title=Q10738&curid=12181&diff=490887597&oldid=490747848
[21:31:42] sjoerddebruin: That's weird
[21:32:02] Very weird.
[21:32:41] He's adding thousands of them.
[21:32:54] sjoerddebruin: https://www.wikidata.org/wiki/Wikidata:Requests_for_deletions#Q29975888_.26_Q29975869
[21:32:58] Same user, it seems
[21:33:28] What a mess. https://www.wikidata.org/w/index.php?title=Q12775&action=history
[21:33:48] https://www.wikidata.org/w/index.php?title=Q772916&type=revision&diff=68333859&oldid=68333857
[21:33:53] It had an "imported from"
[21:34:22] Still, it's useless.
[21:34:39] Oh wait, that one was wrong
[21:34:47] Yeah, backfilling "imported from" is plain incorrect
[21:35:05] Going to tell him to stop now.
[21:36:01] Left him a note
[21:36:11] Oh.
[21:49:57] sjoerddebruin: July 2013. Still going steady it seems :-)
[21:53:24] multichill: didn't really have time the last few weeks. Tomorrow again. :D
[23:12:10] multichill: regarding https://www.wikidata.org/wiki/Wikidata:Requests_for_deletions#Q29975888_.26_Q29975869 (and pretty much all contributions by https://www.wikidata.org/wiki/Special:Contributions/PokestarFan ), allow me to remind you of RollBot :-p
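A minimal sketch of the %20 -> '_' conversion asked about at 17:22, written in Python for illustration rather than in the updater's actual Java Munger class; to_underscore_form is an illustrative name. As noted at 17:33, percent-decoding plus a space swap only covers the simple case, and other encoding differences remain until T131960 lands:

```python
from urllib.parse import unquote

def to_underscore_form(title: str) -> str:
    """Decode percent-escapes such as %20, then use '_' for spaces,
    matching the underscore convention of the OSM 'wikipedia' tags."""
    return unquote(title).replace(" ", "_")

assert to_underscore_form("Statue%20of%20Liberty") == "Statue_of_Liberty"
```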
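And a hedged sketch of the 17:26 query: find OSM objects whose wikidata tag points at an item with no sitelink matching the object's wikipedia tag. The osmt: tag predicates and the endpoint are assumptions based on the linked Wikidata_RDF_database page, not confirmed by the log; the REPLACE papers over the %20-vs-'_' sitelink encoding discussed above, and per the 17:31 caveat it still misses other encoding differences:

```python
import requests

# Placeholder endpoint; the real one is documented on the linked OSM wiki page.
ENDPOINT = "https://example.org/sparql"

QUERY = """
PREFIX osmt: <https://wiki.openstreetmap.org/wiki/Key:>
PREFIX schema: <http://schema.org/>
SELECT ?osm ?wd ?wpTag WHERE {
  ?osm osmt:wikidata ?wd ;
       osmt:wikipedia ?wpTag .
  # Mismatch = the item has no sitelink equal to the OSM tag once the
  # %20-vs-'_' encoding difference is normalized away.
  FILTER NOT EXISTS {
    ?sitelink schema:about ?wd .
    FILTER (REPLACE(STR(?sitelink), "%20", "_") = STR(?wpTag))
  }
}
LIMIT 100
"""

resp = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"}, timeout=60)
for row in resp.json()["results"]["bindings"]:
    print(row["osm"]["value"], row["wd"]["value"], row["wpTag"]["value"])
```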