

Why would you need to take your stuff home?
How is it a retcon? Giga- has been part of the metric system as the prefix for 10^9 since 1960, and I don’t think anyone in the fledgling computer industry was talking about giga- or mega- anything at that time. Mega- has been the metric prefix for 10^6 since 1873, over 60 years before Claude Shannon even came up with the concept of a digital computer.
If anything, the use of mega- and giga- to mean powers of 1024 is a retcon over previous usage.
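The gap between the two conventions is easy to see with a quick calculation (a sketch in Python; the 500 GB drive size is just an illustrative figure):

```python
# Drive makers use decimal (SI) prefixes: 1 GB = 10**9 bytes.
marketed_bytes = 500 * 10**9

# Operating systems often report binary units instead: 1 GiB = 2**30 bytes.
reported_gib = marketed_bytes / 2**30

print(f"A '500 GB' drive shows up as {reported_gib:.2f} GiB")  # ~465.66 GiB
```

That ~7% gap is the whole argument: no storage is "missing", the two sides are just counting in different bases.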
The power of not employing minimum-wage 16 year olds who don’t give a shit.
These are enterprise drives, they aren’t going to contain anything pirated. They are probably going to one of those cloud providers you don’t want to upload your data to.
I’d want to be able to lose two drives in an array before I lose all my shit. So RAID 6 for me.
Repeat after me: RAID is not a backup solution, RAID is a high-availability solution.
The point of RAID is not to safeguard your data; you need proper backups for that (3-2-1 rule of backups: 3 copies of the data on 2 different storage media, with 1 copy off-site). RAID will not protect your data from deletion due to user error, malware, OS bugs, or anything like that.
The point of RAID is so everyone can keep working if there is a hardware failure. It’s there to prevent downtime.
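The redundancy behind that availability guarantee is just parity math. Here is a minimal sketch of RAID 5-style single parity in Python (RAID 6 adds a second, independent parity so two simultaneous drive failures survive; the drive contents below are made up):

```python
from functools import reduce

def xor_blocks(blocks):
    """Byte-wise XOR across equal-length blocks."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

# One stripe across three data "drives"; the parity block goes on a fourth.
data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(data)

# Drive 1 fails: rebuild its block from the survivors plus the parity.
rebuilt = xor_blocks([data[0], data[2], parity])
assert rebuilt == data[1]
```

Note that parity can only rebuild whatever the array currently holds: a deleted or corrupted file gets faithfully "rebuilt" too, which is exactly why RAID is availability, not backup.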
That’s not a solution at all. First of all, depending on the country, you will need a gambling license. This is a PITA as gambling laws differ per country. In my country gambling is heavily regulated and you would need to check IDs and keep track of how much a person gambles. You have a duty of care, and if you notice a person’s gambling habits are becoming problematic you have to refuse them.
You can put money and source code in escrow for this exact eventuality.
I don’t think it’s a token gesture. Microsoft is a software company; they want to sell software, not hardware. They don’t really care about Xbox other than as a means to sell more software and Game Pass subscriptions. Selling their games on as many platforms as possible is a logical move for them.
The Sony PlayStation DualSense is USB-C and a great controller.
It’s $99 a year. I wish my hobbies were that cheap.
Which is a complete non-issue. It’s $99 / year, basically a symbolic amount just high enough to prevent spammers from making a billion accounts.
I have no problems with this. Notarizing your app is trivial and takes just a few minutes. As a user I want to know who actually produced an app and ensure it wasn’t tampered with.
Don’t let Apple tell you they invented it.
Why always the knee-jerk anti-apple reaction even if they do something good?
FYI: Apple isn’t telling anyone they invented this. In fact, they didn’t even tell anyone about this feature and declined to comment after it was discovered and people started asking questions.
Yep. The best people will leave first because they have options. It’s called the Dead Sea effect.
I mean, you have to explicitly give permission before apps can access the camera.
How can an app turn on the camera without your consent?
And yet, I’ve never run into RAM problems on iPhones, either as a user or as a developer. On iOS an app can use almost all the RAM if needed, as long as it’s running in the foreground. Android, by contrast, is much stingier with RAM, especially with Java/Kotlin apps. There are hard limits on how much RAM you can actually use, and it’s a small fraction of the total amount. The actual limit is set by the manufacturer and differs per device; Android itself only guarantees a minimum of 16 MB per app.
The reason is probably that Android is much more lenient about letting stuff run in the background, so it needs to limit per-app memory usage.
Those apps also use more RAM than an equivalent iOS app, simply because they run on a garbage-collected runtime. With a GC there is a trade-off between performance and memory usage. A GC always wastes some memory, because memory isn’t freed immediately once it’s no longer in use; it’s only freed when the GC runs. If you run the GC very often you waste little RAM at the cost of performance (all the CPU cycles used by the GC); if you run it at large intervals you waste a lot of RAM (because you let a lot of ‘garbage’ accumulate before cleaning it up). In general, to achieve performance similar to non-GC’d code you need to tune the GC so it uses about 4 times as much RAM. The actual overhead depends on how Google tuned the GC in ART combined with the behavior of specific apps.
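The "garbage accumulates until the collector runs" behavior can be demonstrated in any GC’d runtime. A sketch using CPython’s cycle collector (ART’s GC works quite differently; this only illustrates the general trade-off):

```python
import gc

class Node:
    def __init__(self):
        self.ref = self  # cycle: reference counting alone can't reclaim this

gc.disable()  # simulate a GC that runs at large intervals
for _ in range(10_000):
    Node()    # every instance immediately becomes unreachable garbage

# All that memory is still held: nothing is freed until the collector runs.
freed = gc.collect()  # returns the number of unreachable objects found
gc.enable()
print(f"collector reclaimed {freed} objects in one pass")
```

Running the collector more often would keep `freed` small per pass (less wasted RAM) at the cost of spending CPU cycles on collection more frequently.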
Note that this only applies to apps running in ART; many system components, like the web browser, are written in C++ and don’t suffer from this inefficiency. But it does mean Android uses more RAM than iOS while at the same time giving apps less RAM to actually use.
It basically comes down to different architectural choices made by Google and Apple.
It’s not hard to target the older models, with iOS it’s mostly just a few small tweaks.
It depends what you are doing. Targeting the iPhone 7’s GPU can be quite a PITA.
Upgrade your dinosaur of a phone.
A $1 million dinner? Does that involve buying the restaurant?