Features I can think of:
- a system for stricter content moderation, especially something that would automatically delete NSFW/NSFL posts,
- no direct messaging,
- some kind of tool for moderators to efficiently review content,
- multi-layered access to an account to allow for parental control,
- a time-management tool that is not client-based, with session duration instead calculated server-side from interactions.
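A minimal sketch of that last idea, assuming interactions arrive as server-side timestamps (the gap threshold and function name are illustrative, not from any existing fediverse software):

```python
from datetime import datetime, timedelta

# Hypothetical: any gap between interactions longer than this
# is treated as a break rather than active use.
SESSION_GAP = timedelta(minutes=5)

def total_session_time(timestamps):
    """Estimate time spent from interaction timestamps alone,
    so a modified client cannot lie about session length."""
    timestamps = sorted(timestamps)
    total = timedelta()
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap <= SESSION_GAP:  # count only plausibly continuous use
            total += gap
    return total
```

Because only interactions are counted, passive scrolling between them is invisible; the gap threshold trades off over- and under-counting.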


I don’t think that what you are envisioning and the fediverse are necessarily a good fit. Every fediverse instance can potentially network with every other instance operating on the same protocol, and every instance you add puts more potential bad actors within reach.
There is no tool that can automatically remove everything. There is also the Scunthorpe problem: naive filters block innocent text that merely contains a flagged substring. And there aren’t enough moderators in the world willing to do this job safely for children without expecting remuneration for their services. Then you need to add in the cross-cultural differences in what constitutes NSF-anything. Maybe in a few years you can train a model to do a decent job with this.
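The Scunthorpe problem is easy to demonstrate with a toy substring filter (the blocklist here is illustrative only):

```python
# Naive substring filtering: block any post containing a listed word.
BLOCKLIST = ["cunt"]  # illustrative entry

def is_blocked(text: str) -> bool:
    lowered = text.lower()
    return any(word in lowered for word in BLOCKLIST)

# The English town name "Scunthorpe" contains the blocked substring,
# so a perfectly innocent post gets deleted: a false positive.
print(is_blocked("Greetings from Scunthorpe!"))  # True
```

Word-boundary matching reduces these false positives but then misses deliberate obfuscation, which is why automated filtering alone is not enough.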
The protocol can probably be adapted to fit most of your requirements. But the fediverse is held together by donations, sweat, and duct tape. It’s having a hard enough time attracting adults; I don’t think a kids’ version is in the works. Plus, there are now real legal hurdles, such as Australia’s minimum-age law for social media.
Personally, I wouldn’t want my kids to use social networks until they are 15-16. Before that I’d try to keep them in services and settings where I’m the moderator. Only after having not just the birds-and-the-bees talk but also the talks about grooming, nudes, and bullying can you slowly release them into the wild. And at that age they will not want to sit at the kids’ table any more.