Off-and-on trying out an account over at @[email protected] due to scraping bots bogging down lemmy.today to the point of near-unusability.

  • 16 Posts
  • 1.09K Comments
Joined 2 years ago
Cake day: October 4th, 2023

  • You could also just only use Macs.

    I actually don’t know what the current requirement is. Back in the day, Apple used to build some of the OS — like QuickDraw — into the ROMs, so unless you had a physical Mac, not just a purchased copy of MacOS, you couldn’t legally run MacOS: doing so meant infringing on the copyrighted ROM contents. Apple obviously doesn’t care about this most of the time, but I imagine that if it became institutionalized at places that make real money, they might.

    But I don’t know if that’s still the case today. I’m vaguely recalling that there was some period where part of Apple’s EULA for MacOS prohibited running MacOS on non-Apple hardware, which would have been a different method of trying to tie it to the hardware.

    searches

    This is from 2019, and it sounds like at that point, Apple was leveraging the EULAs.

    https://discussions.apple.com/thread/250646417?sortBy=rank

    Posted on Sep 20, 2019 5:05 AM

    The widely held consensus is that it is only legal to run virtual copies of macOS on a genuine Apple made Apple Mac computer.

    There are numerous packages to do this but as above they all have to be done on a genuine Apple Mac.

    • VMware Fusion - this allows creating VMs that run as windows within a normal Mac environment. You can therefore have a virtual Mac running inside a Mac. This is useful to either run simultaneously different versions of macOS or to run a test environment inside your production environment. A lot of people are going to use this approach to run an older version of macOS which supports 32bit apps as macOS Catalina will not support old 32bit apps.
    • VMware ESXi aka vSphere - this is a different approach known as a ‘bare metal’ approach. With this you use a special VMware environment and then inside that create and run virtual machines. So on a Mac you could create one or more virtual Mac but these would run inside ESXi and not inside a Mac environment. It is more commonly used in enterprise situations and hence less applicable to Mac users.
    • Parallels Desktop - this works in the same way as VMware Fusion but is written by Parallels instead.
    • VirtualBox - this works in the same way as VMware Fusion and Parallels Desktop. Unlike those it is free of charge. Ostensibly it is ‘owned’ by Oracle. It works but at least with regards to running virtual copies of macOS is still vastly inferior to VMware Fusion and Parallels Desktop. (You get what you pay for.)

    Last time I checked Apple’s terms you could do the following.

    • Run a virtualised copy of macOS on a genuine Apple made Mac for the purposes of doing software development
    • Run a virtualised copy of macOS on a genuine Apple made Mac for the purposes of testing
    • Run a virtualised copy of macOS on a genuine Apple made Mac for the purposes of being a server
    • Run a virtualised copy of macOS on a genuine Apple made Mac for personal non-commercial use

    No. Apple spells this out very clearly in the License Agreement for macOS. Must be installed on Apple branded hardware.

    They switched to ARM in 2020, so unless their legal position changed around ARM, I’d guess that they’re probably still relying on the EULA restrictions. That being said, EULAs have also been thrown out for various reasons, so…shrugs

    goes looking for the actual license text.

    Yeah, this is Tahoe’s EULA, the most-recent release:

    https://www.apple.com/legal/sla/docs/macOSTahoe.pdf

    Page 2 (of 895 pages):

    For individual purchases, they allow installation only on Apple-branded hardware. For Mac App Store purchases, they allow up to two virtual instances of MacOS to be executed on Apple-branded hardware that is also running the OS, and only under certain conditions (like for software development). And for volume purchase contracts, they say that the terms are whatever the purchaser negotiated. I’m assuming that there’s no chance that Apple is going to grant some “go use it as much as you want whenever you want to do CI tests or builds for open-source projects targeting MacOS” license.

    So for the general case, the EULA prohibits running MacOS on non-Apple hardware anywhere.



  • Yes. For a single change. Like having an editor with 2-minute save lag, pushing a commit using a program running on cassette tapes, or playing chess over snail-mail. It’s 2026 for Pete’s sake, and we won’t tolerate this behavior!

    Now of course, in some Perfect World, GitHub could have a local runner with all the bells and whistles. Or maybe something that would allow me to quickly check for progress upon the push, or even something like a “scratch commit”, i.e. a way that I could testbed different runs without polluting the history of both Git and Action runs.

    For the love of all that is holy, don’t let GitHub Actions manage your logic. Keep your scripts under your own damn control and just make the Actions call them!

    I don’t use GitHub Actions and am not familiar with it, but if you’re using it for continuous integration or build stuff, I’d think that it’s probably a good idea to have that decoupled from GitHub anyway, unless you want to be unable to do development without an Internet connection and access to GitHub.

    I mean, I’d wager that someone out there has already built some kind of system to do this for git projects. If you need some kind of isolated, reproducible environment, maybe Podman or similar, and just have some framework to run it?
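    As a rough sketch of that decoupling idea (file name and the specific commands are hypothetical, not from the thread): put the actual CI logic in a script that lives in the repo, so it runs the same locally as it does in whatever hosted CI you point at it.

    ```shell
    #!/bin/sh
    # ci.sh -- hypothetical example for a Rust project: all the build/test
    # logic lives here, versioned with the project and runnable offline,
    # with no dependency on GitHub or its Actions syntax.
    set -eu

    cargo fmt --check            # fail if formatting is off
    cargo clippy -- -D warnings  # treat lints as errors
    cargo test                   # run the test suite
    ```

    The hosted-CI side then shrinks to a single step that just invokes `./ci.sh`, so switching CI providers, or losing your Internet connection, doesn’t cost you your checks.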

    like macOS builds that would be quite hard to get otherwise

    Does Rust not do cross-compilation?

    searches

    It looks like it can.

    https://rust-lang.github.io/rustup/cross-compilation.html
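    For what it’s worth, the basic rustup flow looks like this (the Apple target triple is just one example; actually linking for it typically needs Apple’s SDK, which is exactly where the MacOS case gets awkward):

    ```shell
    # Install the standard library for an Apple target (example triple).
    rustup target add x86_64-apple-darwin

    # Compile for that target. Compilation itself works from Linux, but
    # linking generally needs Apple's SDK and linker, which you can't
    # freely redistribute.
    cargo build --target x86_64-apple-darwin
    ```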

    I guess maybe MacOS CI might be a pain to do locally on a non-MacOS machine. You can’t just freely redistribute MacOS.

    goes looking

    Maybe this?

    https://www.darlinghq.org/

    Darling is a translation layer that lets you run macOS software on Linux

    That sounds a lot like Wine

    And it is! Wine lets you run Windows software on Linux, and Darling does the same for macOS software.

    As long as that’s sufficient, I’d think that you could maybe run MacOS CI in Darling in Podman? Podman can run on Linux, MacOS, Windows, and BSD, and if you can run Darling in Podman, I’d think that you’d be able to run MacOS stuff on whatever.



  • There was some similar project that the UK was going to do, running an HVDC submarine line between the UK and Africa.

    searches

    https://en.wikipedia.org/wiki/Xlinks_Morocco–UK_Power_Project

    The Xlinks Morocco-UK Power Project is a proposal to create 11.5 GW of renewable generation, 22.5 GWh of battery storage and a 3.6 GW high-voltage direct current interconnector to carry solar and wind-generated electricity from Morocco to the United Kingdom.[1][2][3][4] Morocco has been hailed as a potential key power generator for Europe as the continent looks to reduce reliance on fossil fuels.[5]

    If built, the 4,000 km (2,500 miles) cable would be the world’s longest undersea power cable, and would supply up to 8% of the UK’s electricity consumption.[6][7][8] The project was projected to be operational within a decade.[9][10] The proposal was rejected by the UK government in June 2025.




  • I think another major factor for Linux gaming beyond Valve was a large shift by game developers to using widely-used game engines. A lot of the platform portability work happened at that level, so was spread across many games. Writing games that could run on both personal computers and personal-computer-like consoles with less porting work became a goal. And today, some games also have releases on mobile platforms.

    When I started using Linux in the late 1990s, the situation was wildly different on that front.


  • Context:

    https://en.wikipedia.org/wiki/Ultra-mobile_PC

    An ultra-mobile PC,[1] or ultra-mobile personal computer (UMPC), is a miniature version of a pen computer, a class of laptop whose specifications were launched by Microsoft and Intel in Spring 2006. Sony had already made a first attempt in this direction in 2004 with its Vaio U series, which was only sold in Asia. UMPCs are generally smaller than subnotebooks, have a TFT display measuring (diagonally) about 12.7 to 17.8 centimetres (5.0 to 7.0 in), are operated like tablet PCs using a touchscreen or a stylus, and can also have a physical keyboard.


  • considers

    I’ve been in a couple conversation threads about this topic before on here. I’m more optimistic.

    I think that the Internet has definitely democratized information in many ways. I mean, if you have an Internet connection, you have access to a huge amount of information. Your voice has an enormous potential reach. A lot of stuff for which one would have had to buy expensive reference works or spend a lot of time digging is now readily available to anyone with Internet access.

    I think that the big issue wasn’t that people became less critical, but that one stopped having experts filter what one saw. In, say, 1996, most of what I read had passed through the hands of some sort of professional or professionals specialized in writing. For newspapers or magazines, maybe it was a journalist and their editor. For books, an author and their editor and maybe a typesetter.

    Like, in 1996, I mostly didn’t get to actually see the writing of Average Joe. In 2026, I do, and Average Joe plays a larger role in directly setting the conversation. That is democratization. Average Joe of 2026 didn’t, maybe, become a better journalist than the professional journalist of 1996. But…I think that it’s very plausible that he’s a better journalist than Average Joe of 1996.

    Would it have been reasonable to expect Average Joe of 2026 to, in addition to all the other things he does, also be better at journalism than a journalist of 1996? That seems like a high bar to set.

    And we’re also living in a very immature environment as our current media goes. I am not sold that this is the end game.

    There’s a quote from Future Shock — written in 1970, but I think that we can steal the general idea for today:

    It has been observed, for example, that if the last 50,000 years of man’s existence were divided into lifetimes of approximately sixty-two years each, there have been about 800 such lifetimes. Of these 800, fully 650 were spent in caves.

    Only during the last seventy lifetimes has it been possible to communicate effectively from one lifetime to another—as writing made it possible to do. Only during the last six lifetimes did masses of men ever see a printed word. Only during the last four has it been possible to measure time with any precision. Only in the last two has anyone anywhere used an electric motor. And the overwhelming majority of all the material goods we use in daily life today have been developed within the present, the 800th, lifetime.

    That’s just to drive home how extremely rapidly the environment in which we all live has shifted compared to the past. In that quote, Alvin Toffler was pointing out how incredibly quickly things had changed when it had only been six lifetimes since the public as a whole had seen printed text. But in 2026, we live in a world where it has only been a quarter of a lifetime, less for most, since much of the global population of humanity became intimately linked by near-instant, inexpensive mass communication.

    I think that it would be awfully unexpected and surprising if we would have immediately figured out conventions and social structures and technical solutions to every deficiency for such a new environment. Social media is a very new thing in the human experience at this scale. I think that it is very probable that humanity will — partly by trial-and-error, getting some scrapes and bruises along the way — develop practices to smooth over rough spots and address problems.

    Consider, say, the early motorcar, which had no seatbelts, windscreen, roof, or suspension; was driven on road infrastructure designed for horse-drawn carts traveling maybe ten miles an hour; and lacked a muffler, an electric starter, electric headlights and other lighting, an instrument panel, and all that. It probably had a lot of very glaring problems as a form of transportation to the people who saw it. An awful lot of those problems have been solved over time. I think that it would be very surprising if electronic mass communication available to everyone doesn’t do something similar.


  • I don’t know if I can count this as mine, but I certainly didn’t disagree with predictions of others around 1990 or so that the smart home would be the future. The idea was that you’d have a central home computer and it would interface with all sorts of other systems and basically control the house.

    While there are various systems for home automation, things like Home Assistant or OpenHAB, and some people use them, and I’ve myself used some of the technologies that were expected to be part of this, like X10 for device control over power circuits, the vision of a heavily-automated, centrally-controlled home never became the norm. I think that the most-widely-deployed piece of home automation that has shown up since then is maybe the smart thermostat, which isn’t generally hooked into some central home computer.




  • tal@lemmy.today to Comic Strips@lemmy.world, *Permanently Deleted* (edited 6 days ago)

    I mean, essentially all users on the Threadiverse, Reddit, etc are pseudonymous. Not exactly the same thing as anonymity, but pretty close.

    EDIT: And I kinda think that the Penny Arcade strip most-likely actually meant pseudonymity, since actually doing anonymity is fairly uncommon. 4chan.org does anonymity. Slashdot.org optionally does it. My guess is that he was probably thinking of some multiplayer game, if not a video game forum, given that Penny Arcade subject matter is video games, and those tend to be pseudonymous, where a user selects and uses some kind of persistent handle visible to others.



  • I don’t know what a Halo battleship is (like…a spaceship in the Halo series?), but it’s basically an amphibious assault ship — can deploy amphibious craft and aircraft — with a deck gun, cruise missiles, SAM array, CIWS, and torpedoes, so kinda an agglomeration of multiple modern-day real-world ship types. Yeah, and then you can either have AI control with you giving orders, or directly control the vehicles yourself.

    There have been a couple games in the line. Carrier Command, a very old game, which I’ve never played. Hostile Waters: Antaeus Rising, which is a spiritual successor and is oriented around a single-player campaign. Carrier Command 2, which is really principally a multi-player game, but can be played single-player if you can manage the workload and handle all the roles concurrently; I play it single-player. I like both, though I wish that the later games had a more-sophisticated single-player setup. Not a lot of “fleet command” games out there.

    But in this context, it’s one of the games I can think of, like Race the Sun or some older games (Avara, Spectre, Star Fox, AV-8B Harrier Assault/Flying Nightmares), that use untextured polygons as a major element of the game’s graphics. Rez wasn’t fully untextured, but it made a lot of use of untextured polygons and wireframe. Just saying that one can make a decent 3D game, and one that has an attractive aesthetic, without spending memory on textures at all.


  • I think that there should be realistic video games. Not all video games, certainly, but I don’t think that we should avoid ever trying to make video games with a high level of graphical realism.

    I don’t particularly have any issue specific to violence. Like, I don’t particularly subscribe to past concerns over the years in various countries that no realistic violence should be portrayed in video games, and humans should be replaced by zombies or blood should be green or whatever.

    Whether or not specifically the Grand Theft Auto series should use realistic characters or stick with the more-cartoony representations that it used in the past is, I think, a harder question. I don’t have a hard opinion on it, though personally I enjoyed and played through Grand Theft Auto 3 and never bothered to get through the more-realistic, gritty Grand Theft Auto 5. Certainly I think that it’s quite possible to make very good games that are not photorealistic. And given the current RAM shortages, if there’s ever been a good time to pull back a bit on photorealistic graphics in order to reduce RAM requirements, it’s now.

    Yesterday, I was playing Carrier Command 2. That uses mostly untextured polygons for its graphics, and it’s a perfectly fine game. I have many other, more-photorealistic games available, and the hardware to run them, but that happened to be more appealing.

    EDIT: I just opened it, and with it running, it increased the VRAM usage on my video card by 1.1 GB. Not very VRAM-dependent. And it is pretty, at least in my eyes.