- cross-posted to:
- linux@sh.itjust.works
- hackernews@derp.foo
Please create a comment or react with an emoji there.
(IMO, they should’ve limited comments and gone with a reaction count there; it looks like a mess right now.)
Wouldn’t that hit GPU performance? I’m okay with it as long as it doesn’t become a personal data farming tool.
I do want it to be a personal data farming tool, as long as I’m the only consumer of the data.
A personal assistant is only as good as its knowledge of you. In an ideal world, your AI would be customized to you, but only for your use.
I think this sort of processing addition would empower running ML locally; having to send data out for processing on (other people’s) servers is one of the biggest obstacles to adoption by privacy-minded folk.
That’s the issue I can see with it. Sure, it does local processing for my own stuff, but I’d be miffed if they were uploading everything to their servers, effectively using my system resources to build out their AI capabilities. Granted, this would be how they make their money (if something is free, then you’re the product), so it’s a tricky balance.