

He’s getting 10,000 more next week
I don’t mind that it’s slower; I’d rather wait than burn money on cloud machines billed at multiple dollars per hour.
I’ve never locked up an A100 that long; I’ve used them for full work days and was glad I wasn’t paying directly.
Question about how shared VRAM works
Do I need to specify the split in the BIOS, with that amount dedicated at runtime, or can VRAM be allocated dynamically as the workload needs it?
On macOS you don’t really have to think about this, so I’m wondering how this compares.
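For what it’s worth, on Linux with the amdgpu driver you can see both pools: the BIOS carve-out is reported as dedicated VRAM, while GTT is system RAM the driver can map for the GPU dynamically at runtime. A minimal sketch of reading those counters (assuming Linux, the amdgpu driver, and that your GPU is `card0`; on other setups the paths will differ):

```python
from pathlib import Path

def read_mib(path):
    """Read an amdgpu sysfs byte counter and convert to MiB; None if absent."""
    p = Path(path)
    if not p.exists():
        return None
    return int(p.read_text()) // (1024 * 1024)

# Dedicated carve-out set in the BIOS (the UMA frame buffer):
vram = read_mib("/sys/class/drm/card0/device/mem_info_vram_total")
# GTT: system memory the driver can hand to the GPU on demand at runtime:
gtt = read_mib("/sys/class/drm/card0/device/mem_info_gtt_total")
print(f"dedicated VRAM: {vram} MiB, dynamic GTT: {gtt} MiB")
```

So in practice it’s both: a fixed dedicated slice from the BIOS plus a dynamic GTT pool, rather than the fully transparent unified memory you get on macOS.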
This is my use case exactly.
I do a lot of analysis locally, this is more than enough for my experiments and research. 64 to 96gb VRAM is exactly the window I need. There are analyses I’ve had to let run for 2 or 3 days and dealing with that on the cloud is annoying.
Plus this will replace GH Copilot for me. It’ll run voice models. I have diffusion model experiments (not just image models) that I plan to run but that are totally inaccessible to me locally right now.
This basically frees me from paying for any cloud stuff in my personal life for the foreseeable future. I’m trying to localize as much as I can.
I’ve got tons of ideas I’m free to try out risk free on this machine, and it’s the most affordable “entry level” solution I’ve seen.
Welcome!
The fediverse situation has improved a lot since the first API migration; hope you enjoy it here!
I’m happy to say I’m officially degoogled in my personal life/computer.
I still have legacy accounts tied to Gmail that I can’t migrate, though.
Hugging Face already reproduced DeepSeek R1 (called Open R1) and open sourced the entire pipeline.
The model weights and research paper are open, which is the accepted terminology nowadays.
It would be nice to have the training corpus and RLHF data too.
To a lot of people yes, but I think they’re more worried there’s nothing that proves you’re a real person, if that makes sense.
I see it as a liability more than anything. Your data is forever.
I don’t think you’re dramatic.
For me it’s about reclaiming my right to participate in online discourse on my own terms. In a way the fediverse is freedom.
I plan on trying to self-host a Mastodon instance for myself and rebuild my blogs.
I don’t think I’d ever want to self-host a Lemmy instance; I like being anonymous to a reasonable degree. But I like that Lemmy lets me have a voice without acquiescing to Reddit’s enshittification.
Also the community is largely great. I love the memes, I love the comments and discussions, and I like that I learn things from fellow internet users again.
They’ve stuffed it into every page in Google Cloud and Workspace.
I tried it in sheets one time and it literally made up fake data for my analysis. Never trusting that again.
This makes sense to me.
All these people had to pick an instance and now probably understand how to do that.
That’s the biggest hurdle to signing up for lemmy or mastodon.
Gotta collect them allll… FEDIVERSE!
It’s a great conversation starter
This genus named genius game is sending pain to these previous devious data devourers
I don’t know if Lemmy is for everyone, but I’m very happy to see the fediverse gaining more traction.
Every user that onboards to Pixelfed will now understand how to set up Mastodon.
Bluesky is basically trying the exact same thing and expecting a different result.
It could be cool to get a blue check mark for hosting your own domain (excluding the free domains)
It would be more expensive than bot armies are willing to deal with.
Yeah they were pure evil for that. I don’t know if evil is strong enough a word for causing generations of pain and conflict on purpose.
So what happens to the situation of having to pay AdWords rent on your own brand name?
Do we still get that with AI search, or is Google going to give up its golden goose of ad revenue? I’m guessing the ad words feed into the background RAG that influences what the model says.