• 0 Posts
  • 39 Comments
Joined 1 year ago
Cake day: February 14th, 2024





  • This is my use case exactly.

    I do a lot of analysis locally, and this is more than enough for my experiments and research. 64 to 96 GB of VRAM is exactly the window I need.

    Plus this will replace GH Copilot for me. It’ll run voice models. I have diffusion model experiments (not just image models) I plan to run that are totally inaccessible to me locally. I’ve got workloads that run for 2 or 3 days at 100% CPU/GPU, and dealing with those in the cloud is annoying.

    This basically frees me from paying for any cloud services in my personal life for the foreseeable future. I’m trying to localize as much as I can.

    I’ve got tons of ideas I’m free to try out risk-free on this machine, and it’s the most affordable “entry level” solution I’ve seen.







  • I don’t think you’re dramatic.

    For me it’s about reclaiming my right to participate in online discourse on my own terms. In a way, the fediverse is freedom.

    I plan on trying to self-host a Mastodon instance for myself and rebuild my blogs.

    I don’t think I’d ever want to self-host a Lemmy instance; I like staying anonymous to a reasonable degree. But I like that Lemmy lets me have a voice without acquiescing to Reddit’s enshittification.

    Also, the community is largely great. I love the memes, I love the comments and discussions, and I like that I’m learning things from fellow internet users again.