it would be really great to have a lemmy client (or a feature in an existing client) that allows batch downloading of a user-specified list of communities.
this would allow a user to download all the content for the day or week while on wifi, then leave the internet connection behind and slowly & carefully read a selection of material (text posts, comment discussions, and even images like memes).
one benefit is that nobody can see what users are actually reading, because everything was already loaded and the device is disconnected from the internet entirely. performance is also better, since there is no network latency on each request to the servers.
You should be able to do this already with an RSS reader that also downloads the content it links to. Most of the feeds you would need have RSS links you can use.
that's cool. are you sure it works?
how do u do it?
I don’t see why it wouldn’t. You’d just need to research a good RSS app for it. When you look at any community or feed there’s a wifi-looking symbol, and that’s the RSS link for what you’re seeing on your screen.
Was flym one of the ones you tried?
You would also need to check whether the RSS feed actually includes the comments.
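A quick way to check is to just parse a community feed and see what each entry actually contains. Here is a rough sketch in Python using feedparser; the instance, community name, and sort parameter in the URL are only examples of what the RSS icon on a community page typically points to, so double-check the link on your own instance:

```python
import feedparser

# Example feed URL for a community; instance, community name and sort
# parameter are placeholders, use whatever the RSS icon gives you.
FEED_URL = "https://lemmy.ml/feeds/c/asklemmy.xml?sort=Active"

feed = feedparser.parse(FEED_URL)

for entry in feed.entries:
    print(entry.title)
    print(entry.link)
    # Inspect what the feed actually carries per post. If there is no
    # comment text in the summary, comments would have to be fetched
    # some other way (or not at all).
    print(entry.get("summary", "")[:200])
    print("---")
```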
Interesting request; it makes sense in some use cases.
Seems like a lot of upload bandwidth from instances for stuff that may never be viewed. Plus, what would you get? A bunch of titles with links you can’t click? A large number of user comments that normally aren’t loaded because they aren’t viewed most of the time? Seems like a bad idea.
If you limit it to text, it might stay light usage-wise. I guess OP is thinking about comment-focused communities where you can just read the text and don’t need to visit external links.
There are libraries out there that can be used to fetch a snapshot of a webpage. Would be cool to use them when generating the offline snapshot so links to articles can be viewed offline too.
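As a very rough sketch of the idea in Python (plain requests only, so it just grabs the raw HTML; a real archiving library would also pull in images and styles, and the URL is only an example):

```python
import pathlib
import requests

def save_snapshot(url: str, out_dir: str = "snapshots") -> pathlib.Path:
    """Download the raw HTML of a page so it can be read offline.

    Note: this only grabs the HTML itself; images and CSS referenced by
    the page would need to be fetched separately (or by a dedicated
    archiving library).
    """
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    # Derive a simple filename from the last part of the URL path.
    name = url.rstrip("/").split("/")[-1] or "index"
    path = out / f"{name}.html"
    path.write_text(resp.text, encoding="utf-8")
    return path

# Example (hypothetical article link pulled from a post):
# save_snapshot("https://example.com/some-article")
```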
a way more radical approach would be to distribute the database openly as a torrent. everyone seeds it and everyone downloads it (or parts of it), then the client just plugs into the local data.
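that last step would basically be the client reading local data. a tiny sketch of it, assuming the snapshot ends up as a SQLite file; the filename, table and column names here are made up for illustration, not Lemmy's actual schema:

```python
import sqlite3

# Hypothetical local snapshot downloaded via the torrent.
conn = sqlite3.connect("lemmy_snapshot.sqlite")
conn.row_factory = sqlite3.Row

# "post" table with community/title/body/published columns is assumed.
rows = conn.execute(
    "SELECT community, title, body FROM post ORDER BY published DESC LIMIT 20"
)
for row in rows:
    print(f"[{row['community']}] {row['title']}")
    print((row["body"] or "")[:200])
    print("---")

conn.close()
```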