hell yeah this looks sick!! I rolled my own speedtest container that sends the results to homeassistant but this seems like a great solution. I will have to try it - I hope I can still send the results to homeassistant.
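In case it helps anyone, the Home Assistant side of a setup like mine boils down to a POST against HA's REST API. Rough, untested sketch below; the host, token, sensor names and numbers are all placeholders, and it assumes a long-lived access token plus Node 18+ for the built-in fetch:

```typescript
// Rough sketch: push a speedtest result into Home Assistant as sensor states.
// Assumes HA's REST API and a long-lived access token; host, token and
// entity names are placeholders, and the numbers at the bottom are made up.

const HA_URL = "http://homeassistant.local:8123";
const HA_TOKEN = process.env.HA_TOKEN ?? "";

interface SpeedtestResult {
  downloadMbps: number;
  uploadMbps: number;
  pingMs: number;
}

async function pushToHomeAssistant(result: SpeedtestResult): Promise<void> {
  const sensors: [string, number, string][] = [
    ["sensor.speedtest_download", result.downloadMbps, "Mbit/s"],
    ["sensor.speedtest_upload", result.uploadMbps, "Mbit/s"],
    ["sensor.speedtest_ping", result.pingMs, "ms"],
  ];

  for (const [entityId, value, unit] of sensors) {
    // POST /api/states/<entity_id> creates or updates the sensor in HA.
    const res = await fetch(`${HA_URL}/api/states/${entityId}`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${HA_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        state: value.toFixed(1),
        attributes: { unit_of_measurement: unit },
      }),
    });
    if (!res.ok) {
      throw new Error(`HA rejected ${entityId}: ${res.status}`);
    }
  }
}

// Example run with hardcoded values; in the container these would come from the speedtest.
pushToHomeAssistant({ downloadMbps: 123.4, uploadMbps: 45.6, pingMs: 12 }).catch((err) =>
  console.error(err),
);
```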
project is here https://github.com/jaypyles/Scraperr
Thank YOU for being the MVP that let me know it was broken! 🤣 🔥
And I’ve fixed it now :D Sorry for the delay and thx for reporting <3
Yes it is me, the author 🤗
I have not had much time lately to work on it, one day I’d love to integrate some more cool graphics and visualizations for the network.
If you’re willing to post your code somewhere or send it to me somehow, I might have to find some time to integrate it on lemmyverse. You’re also welcome to submit a PR if you have the inclination: https://github.com/tgxn/lemmy-explorer
Yeah it is, docker stopped loading the redis DB 🤣
https://github.com/tgxn/lemmy-explorer/pull/189
Looks like that’s fixed it: https://develop.lemmyverse.net/
will be in prod shortly :)
you can find some more data for your instance here too https://lemmyverse.net/instance/feddit.org
:)
Did you graph these with a JS library? I’d love to improve the community stats page with some more cool graphs like these.
I had a crack at it on these pages, but didn’t dive into specific community info
https://lemmyverse.net/inspect
https://lemmyverse.net/inspect/versions
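If it was something like Chart.js, I’m picturing roughly this for a per-community graph (rough sketch only, not something the explorer does today; the canvas id and data are made up):

```typescript
// Rough sketch of a per-community activity chart with Chart.js.
// The canvas id and the data below are placeholders, not real explorer data.
import Chart from "chart.js/auto";

const canvas = document.getElementById("community-activity") as HTMLCanvasElement;

new Chart(canvas, {
  type: "line",
  data: {
    // Placeholder data: weekly active users for one community.
    labels: ["Wk 1", "Wk 2", "Wk 3", "Wk 4", "Wk 5", "Wk 6"],
    datasets: [
      {
        label: "Weekly active users",
        data: [120, 150, 180, 160, 210, 240],
        tension: 0.3, // slight curve between points
        fill: false,
      },
    ],
  },
  options: {
    responsive: true,
    plugins: {
      legend: { position: "top" },
      title: { display: true, text: "Community activity over time" },
    },
  },
});
```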
I’m here now, thanks for the ping :)
It sometimes does that when the crawler hasn’t had time to fetch a large proportion of the instances in the last 4 hours. The data should be pretty recent still. I’ll check on it now :)
LMAO they really are inept 🤦‍♂️
No problems mate :) As long as I’m still paying for hosting I’ll attempt to keep it updated.
I sometimes go awol and work on other stuff, pinging me here usually works 😊
I’m also glad that people still use the site and find it valuable
I have now accepted the PR from @sunaurus@lemm.ee :) The changes are rolling out and the crawlers are crawling hard 24/7 again.
Sorry for the couple-week delay, hit me up here next time :D
Right, I’ve calmed down and had a read. I did notice the instance count reducing. 😅 It makes sense that 0.19 instances can’t be crawled. I’ll see about the PR tonight
Thanks for pointing it out to me, I wouldn’t have known about this otherwise 🤣
… I can just update it. I’ll check the PR today…
It still works, no? What’s wrong with it 😆
I’ve moved a couple of domains to dnssec and it’s great, simple DNS.
yeah, plus you just copy and paste the docker-compose and you don’t even need to know what’s under the hood.